Hi everyone, when I try to load an adapter with any Qwen model I get an error. For example, with Qwen2-VL 2B I get the following: AttributeError: 'Qwen2Model' object has no attribute 'model' rank=0
  File "/opt/conda/lib/python3.11/site-packages/text_generation_server/server.py", line 268, in serve_inner
    model = get_model_with_lora_adapters(
  File "/opt/conda/lib/python3.11/site-packages/text_generation_server/models/__init__.py", line 1350, in get_model_with_lora_adapters
    target_to_layer = build_layer_weight_lookup(model.model)
  File "/opt/conda/lib/python3.11/site-packages/text_generation_server/utils/adapter.py", line 301, in build_layer_weight_lookup
    m = model.text_model.model
  File "/opt/conda/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1729, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
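For context, the failure mode in the traceback can be illustrated with a minimal sketch (the class names below are hypothetical stand-ins, not TGI's actual classes): the lookup code assumes a nested `model.text_model.model` attribute path, and the Qwen module hierarchy does not expose it, so the attribute access raises AttributeError.

```python
class TextBackbone:
    """Stand-in for Qwen2Model: the decoder stack lives directly on this
    object, with no nested `.model` attribute."""
    def __init__(self):
        self.layers = ["decoder_layer_0"]


class VLWrapper:
    """Stand-in for the Qwen2-VL wrapper: it exposes `.text_model`,
    but `.text_model.model` does not exist."""
    def __init__(self):
        self.text_model = TextBackbone()


model = VLWrapper()
try:
    # Mirrors `m = model.text_model.model` in build_layer_weight_lookup.
    m = model.text_model.model
except AttributeError as e:
    print(e)  # 'TextBackbone' object has no attribute 'model'
```

So the layer-weight lookup would need a branch that handles the Qwen attribute layout before adapters can be mapped onto these models.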
By the way, is adding adapter support for Qwen models on the roadmap?
Motivation
This would make it easier to know which models support adapters. It would also be great to have this information in the documentation, since it would save time not having to try every possible model. I know that LoRA support is still under development, but it is a really great feature to have.
Your contribution
I'm happy to help with anything I can.