
deep%20learning/Attention/ #3

Open
utterances-bot opened this issue Oct 2, 2022 · 1 comment
Comments

@utterances-bot

A Summary of Basic Attention Concepts - Jio's Paper Exploration (지오의 논문 탐방)

Table of Contents: What is Attention? / Self-Attention / Multi-Head Attention / Transformers (a. Encoder, b. Decoder)

https://jio0728.github.io/deep%20learning/Attention/
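The topics listed above (Attention, Self-Attention, Multi-Head Attention) all build on scaled dot-product attention, softmax(QK^T / √d_k)·V. A minimal pure-Python sketch of that operation (function and variable names are illustrative, not from the linked post):

```python
import math

def softmax(xs):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for lists of row vectors."""
    d_k = len(Q[0])
    out = []
    for q in Q:
        # similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        w = softmax(scores)
        # output is the attention-weighted average of the value vectors
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

# self-attention on 3 toy token vectors: Q = K = V = X
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
Y = scaled_dot_product_attention(X, X, X)
```

Since the weights for each query sum to 1, every output row is a convex combination of the value vectors; multi-head attention simply runs several such maps in parallel on learned projections of the input and concatenates the results.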


ozzaney commented Oct 2, 2022

I think you explained how attention works in a really intuitive way!! Looking forward to the transformer encoder and decoder parts as well!!
