[Usability]: Error 'Config' object has no attribute 'define_bool_state' when running LLaMa #704
Comments
@Ye-D can you take a look? Thanks
It seems that the flax or jax versions are incompatible with EasyLM.
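For reference, this AttributeError usually appears when the installed flax still calls jax.config.define_bool_state after newer JAX releases removed it. A minimal sketch (no SPU involved) to check what is installed:

```python
# Minimal sketch: report the installed jax/flax versions and whether the
# jax Config object still exposes define_bool_state (newer jax removed it,
# which triggers the AttributeError above when an older flax calls it).
import jax
import flax

print("jax version:", jax.__version__)
print("flax version:", flax.__version__)
print("jax.config has define_bool_state:",
      hasattr(jax.config, "define_bool_state"))
```

If the attribute is missing, aligning the jax/flax versions with those pinned by EasyLM's requirements should make the error go away.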
@Ye-D Thanks for your reply. At this point, there are still some additional questions confusing me.
First, running the model in plaintext means you do not need SPU; it just tests whether your Python environment satisfies the EasyLM requirements. Secondly, yes, you should re-build SPU.
In a conda-created virtual environment, the dependency problem seems to be resolved.
Terminal output:
--model_path should be assigned the dir-to-flax-llama7b-EasyLM directory, not the .msgpack file.
Sorry, I may have misunderstood some of the earlier steps in the tutorial.
If I only have the model's directory path, how is the model file itself located? Is the name of the .msgpack file fixed here?
Name the model flax_model.msgpack, put it in a directory, and point the path at that directory.
Changing model_path = parser.model_path to model_path = args.model_path in flax_llama7b.py seems to solve the problem. Then I got the following output. Does the script here fetch the tokenizer from Hugging Face or from the model files? The error is as follows:
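For context, the fix above matches the standard argparse pattern: parse_args() returns a Namespace, so the parsed value lives on args rather than on the parser. A minimal sketch (only the --model_path flag mirrors the script; the rest is illustrative):

```python
# Minimal sketch of the argparse pattern behind the fix above: the parsed
# value is an attribute of the Namespace returned by parse_args(), not of
# the ArgumentParser itself.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--model_path", required=True,
                    help="directory containing flax_model.msgpack")
args = parser.parse_args()

model_path = args.model_path  # correct: read from args, not parser
```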
OK, I misunderstood. I did not realize that the converted .msgpack file is only one part of the LLaMA model files. The .msgpack file needs to be moved into the model directory so that tokenizer.model can be found. This issue can be closed now.
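To summarize the thread, a quick hedged check of the expected layout (the file names flax_model.msgpack and tokenizer.model come from the comments above; the directory path is a placeholder):

```python
# Minimal sketch: verify that the directory passed as --model_path contains
# both the converted checkpoint and the tokenizer, per the comments above.
from pathlib import Path

model_dir = Path("/path/to/flax-llama7b-EasyLM")  # placeholder path
for name in ("flax_model.msgpack", "tokenizer.model"):
    status = "found" if (model_dir / name).exists() else "MISSING"
    print(f"{name}: {status}")
```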
Issue Type
Usability
Modules Involved
Documentation/Tutorial/Example
Have you reproduced the bug with SPU HEAD?
Yes
Have you searched existing issues?
Yes
SPU Version
spu 0.9.0
OS Platform and Distribution
Linux Ubuntu 20.04 LTS
Python Version
3.8.5
Compiler Version
GCC 11.2
Current Behavior?
Following CONTRIBUTING.md#build and issue #393, I built SPU from source as a Python package, then followed the instructions here to run SPU inference for LLaMA and got an error.
In one terminal, execute the following under ~/spu/
and bring up the nodes.
Then execute the following in another terminal:
P.S. The EasyLM author recently updated the convert_hf_to_easylm.py script, so the code fields that the SPU documentation says need to be modified no longer exist.
Standalone code to reproduce the issue
Relevant log output