Running a Local Environment fails #526
@epwalsh, Matt said that you might be able to help.
Hey @nitishgupta, I would try just running the model image by itself to debug. Instructions for running the model image by itself are here: https://github.com/allenai/allennlp-demo/tree/master/api#building
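For reference, the linked README's approach boils down to building and running the API image on its own; a hedged sketch (the image tag, build context, and port below are assumptions, not taken from the repo):

```shell
# Hedged sketch of running the model image by itself, per the linked README.
# The image tag, build context, and port are assumptions -- check the api/
# directory of allennlp-demo for the actual Dockerfile and instructions.
docker build -t allennlp-demo-api ./api
docker run --rm -p 8000:8000 allennlp-demo-api
# Any startup traceback from the model server now prints straight to the terminal.
```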
Can you access the UI if you do that?
No, but at least you could easily see the error messages from the server (if that's where the issue is).
I guess, what I don't know is how to actually run the demo during development now. Since the split of the UI and the API, there are no longer instructions for how to do local development (running both the UI and the API together, and having them talk to each other). Or are they somewhere and I just missed them?
Did you see step #3 here? https://github.com/allenai/allennlp-demo#running-a-local-environment
@epwalsh - Inside the Then I follow the steps in Local Development, but when I run the
Output from
Ah, yeah, to fix that error you should install the
But there isn't a
That's a good point 😏 Try setting the PYTHONPATH env variable then:
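The exact command suggested here was lost in extraction; a hedged sketch of how `PYTHONPATH` makes an uninstalled package importable (the package name and paths below are illustrative, not the demo's actual layout):

```shell
# Hedged sketch: make a package in a repo checkout importable without installing it.
# "allennlp_demo" and /tmp/demo_repo are illustrative names, not the real layout.
mkdir -p /tmp/demo_repo/allennlp_demo
echo "GREETING = 'hello'" > /tmp/demo_repo/allennlp_demo/__init__.py

# With PYTHONPATH pointing at the repo root, Python can resolve the package:
PYTHONPATH=/tmp/demo_repo python3 -c "import allennlp_demo; print(allennlp_demo.GREETING)"
```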
Great, I didn't know about the So the error now is --
BTW, when I ran the UI + API for other models, I faced the same issue. I am guessing the predictor not being accessible is a common issue across the other models I tried.
Related -- I've faced a similar issue when using
Do you have
@epwalsh, sorry for being dense, this is me not really understanding docker-compose, and remembering things incorrectly. But, I just tried running the commands that you linked exactly as-is, and I see the same error that @nitishgupta is reporting. I tried accessing the UI from both port 3000 (as @nitishgupta did, I think) and from port 8080 (as the instructions say to do), and I get an error in both cases. From port 3000 it's a 404, and from port 8080 it's a 502.
Okay, so with bidaf-elmo the issue was an incorrect predictor name in
I killed the
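For context on this failure mode: the demo looks predictors up by name in a registry, so a name that doesn't match what was registered fails at lookup time. A minimal self-contained sketch of that pattern (the names and registry code below are illustrative, not AllenNLP's actual API):

```python
# Illustrative sketch of a name-based registry, the pattern behind
# "incorrect predictor name" errors. Not AllenNLP's real implementation.
PREDICTORS = {}

def register(name):
    """Decorator that records a class in the registry under `name`."""
    def decorator(cls):
        PREDICTORS[name] = cls
        return cls
    return decorator

@register("machine-comprehension")
class BidafPredictor:
    pass

def get_predictor(name):
    """Look a predictor up by name; a typo'd name raises KeyError."""
    try:
        return PREDICTORS[name]
    except KeyError:
        raise KeyError(
            f"{name!r} is not a registered predictor; known names: {sorted(PREDICTORS)}"
        )

print(get_predictor("machine-comprehension").__name__)  # BidafPredictor
```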
@nitishgupta looks like this is due to a mismatch between the version of
@matt-gardner I'll take a look to see if I can get that running.
Okay; there is no Dockerfile in
When I run the
Does it point to the fact that the model is getting loaded fine, and that the issue in the demo is somewhere else, which I think you'll be looking into with the
I ran the command
MODEL=bidaf_elmo docker-compose up --build
and according to the logs the demo started correctly. When I try to get predictions I get the
"Something went wrong. Please try again"
error. The Chrome console shows the error:
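One way to see the underlying error rather than the UI's generic "Something went wrong" message is to POST to the API directly; a hedged sketch (the endpoint path, port, and JSON fields below are assumptions about the demo's API, not taken from the repo):

```shell
# Hedged sketch: bypass the UI and query the API directly so the raw server
# error is visible. Endpoint path, port, and request fields are assumptions.
curl -s -X POST http://localhost:8080/api/bidaf-elmo/predict \
  -H 'Content-Type: application/json' \
  -d '{"passage": "Some passage text.", "question": "Some question?"}'
```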