-
I am just starting my journey with DJL. I want to use custom HuggingFace QA models, so I followed the directions in the Usage section of http://djl.ai/extensions/tokenizers/. I converted the model and built the following criteria:

```java
Criteria<QAInput, String> criteria = Criteria.builder()
        .setTypes(QAInput.class, String.class)
        .optModelPath(Paths.get("model/nlp/question_answer/ai/djl/huggingface/pytorch/deepset/bert-base-cased-squad2/0.0.1/bert-base-cased-squad2.zip"))
        .optProgress(new ProgressBar())
        .build();
```

a call to
The file path in
I'm happy to provide any further details and would be grateful for any tips. Thanks in advance,
Replies: 4 comments
-
The DJL modules are fairly à la carte, and it seems like you are missing some dependencies. I think you are missing the PyTorch engine. You probably also want the PyTorch model zoo, where you can find the translator. You can also try the HuggingFace model zoo, which has models that can be downloaded more easily.
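For reference, the PyTorch engine dependency mentioned here looks roughly like this in Maven (a sketch; it assumes versions are managed by the DJL BOM, so no `<version>` is shown, matching the POM excerpt below):

```xml
<dependency>
    <groupId>ai.djl.pytorch</groupId>
    <artifactId>pytorch-engine</artifactId>
    <scope>runtime</scope>
</dependency>
```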
-
I think I have all of those dependencies:

```xml
<dependency>
    <groupId>ai.djl</groupId>
    <artifactId>api</artifactId>
</dependency>
<dependency>
    <groupId>ai.djl.huggingface</groupId>
    <artifactId>tokenizers</artifactId>
</dependency>
<dependency>
    <groupId>ai.djl.pytorch</groupId>
    <artifactId>pytorch-model-zoo</artifactId>
</dependency>
<dependency>
    <groupId>ai.djl.pytorch</groupId>
    <artifactId>pytorch-jni</artifactId>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>ai.djl.pytorch</groupId>
    <artifactId>pytorch-native-cpu</artifactId>
    <classifier>osx-x86_64</classifier>
    <scope>runtime</scope>
</dependency>
```

I believe I was able to load and use prepackaged models, but I don't understand how to load a model from a zip file as created here. Is it supposed to look inside the zip file I provided and interpret its contents? Please advise.
-
You do need to specify a TranslatorFactory if your model is in a zip file. You can either:

1. Use `ai.djl.huggingface.translator.QuestionAnsweringTranslatorFactory`.
2. If your model is in a directory, use `DeferredTranslatorFactory`; it will read the configurations from serving.properties.
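As an illustration of the serving.properties route (a sketch, not from the original thread — it assumes `translatorFactory` is the key that `DeferredTranslatorFactory` reads, with the factory class named in this thread as the value):

```properties
# serving.properties placed alongside the model artifacts (sketch)
translatorFactory=ai.djl.huggingface.translator.QuestionAnsweringTranslatorFactory
```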
-
Thank you so much @frankfliu - both of those options worked! I made a PR #2511 so the docs reflect this advice. I was poking around to see how the logging might help users know more about what is working than it currently shows. The only difference when it works is that it executes just a few lines past the point where the other run threw the exception, which logs
It might be cool to log what it found in that zip file, to increase users' confidence that they're close. I see that just a couple of lines later it gets the model path and parses the serving properties (djl/api/src/main/java/ai/djl/repository/zoo/BaseModelLoader.java, lines 101 to 108 at ce57306).
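To make the suggestion concrete, here is a sketch of what such logging could look like using only `java.util.zip` (the class and method names are hypothetical, not part of DJL):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

/** Hypothetical helper sketching the kind of archive logging suggested above. */
public final class ZipModelInspector {

    /** Returns the entry names found in a model archive, e.g. to log them. */
    public static List<String> listEntries(Path zipPath) throws IOException {
        List<String> names = new ArrayList<>();
        try (ZipFile zip = new ZipFile(zipPath.toFile())) {
            zip.stream().forEach(entry -> names.add(entry.getName()));
        }
        return names;
    }

    /** Reads serving.properties from the archive; empty Properties if absent. */
    public static Properties readServingProperties(Path zipPath) throws IOException {
        Properties props = new Properties();
        try (ZipFile zip = new ZipFile(zipPath.toFile())) {
            ZipEntry entry = zip.getEntry("serving.properties");
            if (entry != null) {
                try (InputStream in = zip.getInputStream(entry)) {
                    props.load(in);
                }
            }
        }
        return props;
    }
}
```

A model loader could log `listEntries(...)` when it fails to find what it expects, so users can see at a glance whether the archive layout is what DJL wants.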