
Creation of sequential embeddings #8

Open

cchapin0927 opened this issue May 10, 2020 · 3 comments

Comments

@cchapin0927

I am trying to repeat the experiment but using a different dataset. I was able to prepare the data in the preprocessing files, but I am having trouble creating sequential embeddings from rank_lstm.py to use in relation_rank_lstm.py. How did you save the hidden state of the LSTM in the correct format?
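Not the author's code, but a minimal sketch of the general idea: run the trained LSTM over each stock's feature sequence, keep only the final hidden state, and save the stacked result with `np.save`. The toy NumPy LSTM, the dimensions, and the file name here are all illustrative assumptions, not the repo's actual implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_last_hidden(seq, W, U, b, units):
    """Run a single-layer LSTM over seq (T, features) and return the
    final hidden state (units,). Gates are stacked as [i, f, o, g]."""
    h = np.zeros(units)
    c = np.zeros(units)
    for x_t in seq:
        z = W @ x_t + U @ h + b          # all four gates at once, (4*units,)
        i = sigmoid(z[:units])           # input gate
        f = sigmoid(z[units:2 * units])  # forget gate
        o = sigmoid(z[2 * units:3 * units])  # output gate
        g = np.tanh(z[3 * units:])       # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

rng = np.random.default_rng(0)
# Assumed toy dimensions; substitute your dataset's values.
num_stocks, seq_len, feats, units = 5, 16, 5, 64
W = rng.normal(scale=0.1, size=(4 * units, feats))
U = rng.normal(scale=0.1, size=(4 * units, units))
b = np.zeros(4 * units)

# One embedding row per stock: the LSTM's last hidden state.
data = rng.normal(size=(num_stocks, seq_len, feats))
embeddings = np.stack([lstm_last_hidden(s, W, U, b, units) for s in data])
np.save("sequential_embeddings.npy", embeddings)  # shape (num_stocks, units)
```

In the real pipeline the weights come from the trained rank_lstm.py model rather than random initialization; the point is only that the saved array should have one row per stock, with the hidden-state width as the embedding dimension.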

@fulifeng
Owner

According to the released sequential embeddings and the default parameters, it should be easy to figure out the correct dimensions.

@PeterNjeim

PeterNjeim commented Jan 18, 2021

I'm lost as to how to generate the sequential embeddings. Can any of you guide me or share some general tips on how to accomplish this?

Like the OP, I've successfully preprocessed my data by modifying the eod.py file, but after that, I just don't know how to generate the sequential embedding files.

@fffanrrr

> According to the released sequential embeddings and the default parameters, it should be easy to figure out the correct dimensions.

Excuse me, are the pretrained sequence embeddings obtained by running rank_lstm.py? Thanks!
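One way to follow the owner's hint about dimensions is to compare the shape of a generated embedding file against the released one before handing it to relation_rank_lstm.py. A hedged sketch; the file name and the (num_stocks, units) shape are assumptions to be replaced with the actual values:

```python
import numpy as np

# Assumed dimensions; match these to the released embedding files.
num_stocks, units = 5, 64

# Save a toy embedding matrix, then reload it and verify its shape
# before feeding it to relation_rank_lstm.py.
emb = np.zeros((num_stocks, units), dtype=np.float32)
np.save("sequential_embeddings.npy", emb)

loaded = np.load("sequential_embeddings.npy")
assert loaded.shape == (num_stocks, units), loaded.shape
```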
