
Attention用于NLP的一些小结 #733

Open
xiang578 opened this issue Jan 11, 2020 · 0 comments

Part I: Background. As is customary, this section should explain what Attention is; typing it all out is tiring, so just look at this figure: the one from Neural Machine Translation by Jointly Learning to Align and Translate. This figure is quite famous. It comes from the first paper to apply Attention in NLP, for machine translation, where at each translation step
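The paragraph above refers to Bahdanau-style (additive) attention from that paper: at each decoding step, the decoder scores every encoder hidden state against its current state, normalizes the scores with a softmax, and takes the weighted sum as a context vector. A minimal NumPy sketch, where all parameter names and dimensions are illustrative rather than taken from the original post:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(s_prev, h, W_a, U_a, v_a):
    """Bahdanau-style additive attention (illustrative sketch).

    s_prev: previous decoder state, shape (d,)
    h:      encoder hidden states,  shape (T, d)
    W_a, U_a: projection matrices,  shape (d_attn, d)
    v_a:    scoring vector,         shape (d_attn,)
    """
    # e_t = v_a . tanh(W_a s_prev + U_a h_t), one score per source position t
    scores = np.tanh(s_prev @ W_a.T + h @ U_a.T) @ v_a   # (T,)
    alpha = softmax(scores)    # attention weights over the source sentence
    context = alpha @ h        # weighted sum of encoder states, shape (d,)
    return context, alpha

# toy usage with random parameters
rng = np.random.default_rng(0)
T, d, d_attn = 5, 4, 3
h = rng.normal(size=(T, d))
s_prev = rng.normal(size=d)
W_a = rng.normal(size=(d_attn, d))
U_a = rng.normal(size=(d_attn, d))
v_a = rng.normal(size=d_attn)
context, alpha = additive_attention(s_prev, h, W_a, U_a, v_a)
```

The weights `alpha` sum to 1 and play the role of a soft alignment: each decoding step can "look at" different source words, which is what the figure in the paper visualizes.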

via Pocket https://ift.tt/35KmwCV



January 11, 2020 at 05:46PM
