[Bug]: RuntimeError: Internal: could not parse: ModelProto #833
Comments
Duplicate of #704
@Ye-D, mind taking a look?
This example is only tested with the LLaMA-7B from EasyLM, not the one from the transformers library.
Hello, how should I handle this problem? Is it caused by importing the wrong library?
It may be a version mismatch; try the approach in this answer: #782 (comment)
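Aside from version mismatches, the `could not parse: ModelProto` error in the issue title is typically raised by sentencepiece when the bytes handed to it are not a valid serialized model. A minimal stdlib-only diagnostic sketch (file paths are hypothetical) that checks for two common culprits, a git-lfs pointer file downloaded in place of the real weights and a truncated download:

```python
# Diagnostic sketch (assumptions: the failing file is the sentencepiece
# tokenizer/model file; paths below are placeholders, not from this repo).
from pathlib import Path


def looks_like_lfs_pointer(path: str) -> bool:
    """True if the file is a git-lfs pointer stub instead of model bytes."""
    head = Path(path).read_bytes()[:64]
    return head.startswith(b"version https://git-lfs.github.com/spec/v1")


def check_model_file(path: str) -> str:
    """Classify a model file as 'missing', 'lfs-pointer',
    'suspiciously-small', or 'ok'."""
    p = Path(path)
    if not p.exists():
        return "missing"
    if looks_like_lfs_pointer(path):
        return "lfs-pointer"  # run `git lfs pull` to fetch the real weights
    if p.stat().st_size < 1024:
        return "suspiciously-small"  # likely a truncated download
    return "ok"
```

If this reports `lfs-pointer`, re-downloading the weights with `git lfs pull` usually resolves the parse error.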
Hi @seeronline, due to security concerns, please do not post random download links without an explanation.
I tried #782 and replaced LLaMAConfigurator with LLaMAConfig, but the following error occurred:
python flax_llama7b.py --model_path /home/lenovo/Documents/.vscode/llama_7b2 --config ./3pc.json
During handling of the above exception, another exception occurred: Traceback (most recent call last):
@Ye-D, could you help analyze this? Thanks!
I noticed that an error is raised when I use checkpoint_dir: python convert_hf_to_easylm.py \
After using EasyLM (https://github.com/young-geng/EasyLM/tree/08_31_2023) to convert the HF model to an EasyLM one, the model loads successfully. However, the program does not run to completion. Could this be due to the jaxlib version?
(py311xie) (base) lenovo@lenovo-07:~/Documents/.vscode/spu/examples/python/ml/flax_llama7b$ python flax_llama7b.py --model_path /home/lenovo/Documents/.vscode/llama_7b --config ./3pc.json
Run on CPU
Run on SPU
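To check whether a jax/jaxlib version mismatch is plausible, the installed versions can be read without importing the libraries themselves. A minimal stdlib sketch (the package names to query are assumptions about this environment):

```python
# Sketch: report installed package versions so they can be compared against
# the versions pinned by the example's requirements file.
from importlib import metadata


def installed_version(pkg: str):
    """Return the installed version string of `pkg`, or None if absent."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None


for name in ("jax", "jaxlib", "flax"):  # assumed relevant packages
    print(name, installed_version(name))
```

Posting this output alongside the logs would let maintainers rule the jaxlib version in or out.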
Here are the logs from running nodectl.py:
Issue Type
Build/Install
Modules Involved
SPU runtime, SPU compiler
Have you reproduced the bug with SPU HEAD?
Yes
Have you searched existing issues?
Yes
SPU Version
0.9.2b0
OS Platform and Distribution
Linux
Python Version
3.11
Compiler Version
gcc
Current Behavior?
I was trying to reproduce the "Flax Llama-7B Example with Puma" in examples/python/ml/flax_llama7b. However, I failed to load the flax-llama7b-EasyLM model.
Standalone code to reproduce the issue
Relevant log output
No response