https://blog.philip-huang.tech/?page=blog_nlp_pytorch_self-attention
- tags: nlp pytorch self-attention
- date: 2021/03/24
Input Preparation

We prepared two sentences for this experiment:
sentences = ['hello attention', 'have a nice day']
First, we build the vocabulary and the corresponding one-hot encoding for each word:
# Collect the words from both sentences and de-duplicate them.
vocabs = ' '.join(sentences).split()
vocabs = list(set(vocabs))

# Create a one-hot vector for each word in the vocabulary.
one_hots = []
vocab_dict = {}
for i, vocab in enumerate(vocabs):
    one_hots.append([0] * len(vocabs))
    one_hots[i][i] = 1

# Map every word to its one-hot vector.
for vocab, one_hot in zip(vocabs, one_hots):
    vocab_dict[vocab] = one_hot

print(vocab_dict)  # inspect the word-to-one-hot mapping
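With the vocabulary in place, a natural next step is to pack each sentence into a tensor of one-hot rows that a self-attention module can consume. The following is a minimal sketch of that step, continuing from the block above; the `inputs` list and the printed shapes are illustrative additions, not part of the original post:

import torch

# Hypothetical continuation: turn each sentence into a
# (sequence_length, vocab_size) float tensor of one-hot rows,
# using `sentences` and `vocab_dict` from the block above.
inputs = []
for sentence in sentences:
    rows = [vocab_dict[word] for word in sentence.split()]
    inputs.append(torch.tensor(rows, dtype=torch.float32))

for sentence, tensor in zip(sentences, inputs):
    print(sentence, tensor.shape)
# 'hello attention'  -> torch.Size([2, 6])
# 'have a nice day'  -> torch.Size([4, 6])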