Magic line for visualising attention #69

Open
cipri-tom opened this issue Feb 25, 2018 · 0 comments
Hello,

Thank you for releasing your code! It is a great contribution and I can see it helps a lot of people (including me!).

I have a question regarding the visualize_attention() method, specifically this line:

attention_orig = np.convolve(attention_orig, [0.199547,0.200226,0.200454,0.200226,0.199547], mode='same')

I believe its purpose is to highlight the important part and drive the rest to zero, similar to a softmax. I'm curious how you arrived at the numbers for the filter. Any insight would be greatly appreciated!
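
To make the question concrete, here is a small self-contained sketch of what that line computes on a toy input. It is only an illustration: `attention_orig` here is a made-up 1-D array of attention weights standing in for the real one in visualize_attention(), and the kernel values are copied verbatim from the line above.

```python
import numpy as np

# Toy stand-in for attention_orig: a 1-D array of attention weights
# (hypothetical values, purely for illustration).
attention_orig = np.array([0.05, 0.10, 0.60, 0.15, 0.05, 0.03, 0.02])

# The 5-tap filter from visualize_attention(); the taps are all close
# to 0.2 and sum to exactly 1.
kernel = np.array([0.199547, 0.200226, 0.200454, 0.200226, 0.199547])
print(kernel.sum())  # ~1.0

# mode='same' keeps the output the same length as the input, so each
# output element is (up to zero-padding at the edges) a weighted
# combination of the element and its four nearest neighbours.
smoothed = np.convolve(attention_orig, kernel, mode='same')
print(smoothed)
```

For what it's worth, since the five taps sum to 1 the operation preserves the total attention mass, so I'd still love to know how the individual values were chosen.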

Thanks!
