qwenvl_oai is not compatible with gpt4o multimodal features #390

Open
jameslian87v5 opened this issue Nov 8, 2024 · 1 comment
Comments

@jameslian87v5

In principle, the oai model type should support multimodal input, but it does not. I then used qwen_oai, assuming it would still support multimodal features, and only after being stuck on this for a long time did I discover that it is not supported.

@tuhahaha
Collaborator

Hello. If you want to use the OAI interface of a VL model, please set 'model_type': 'qwenvl_oai' in the llm parameters; qwenvl_oai can call VL models. Could you clarify what specific functionality you mean by "gpt4o multimodal features"?
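
For reference, a minimal sketch of the configuration suggested above, assuming the Qwen-Agent `Assistant` interface and an OpenAI-compatible endpoint; the model name, server URL, API key, and image URL below are placeholders, not values from this issue:

```python
from qwen_agent.agents import Assistant

# llm config for a vision-language model served over an OpenAI-compatible API.
# 'model_type': 'qwenvl_oai' is the setting named in the reply above;
# the other values are placeholders for illustration only.
llm_cfg = {
    'model': 'qwen-vl-max',                      # placeholder VL model name
    'model_type': 'qwenvl_oai',                  # VL-capable OAI adapter
    'model_server': 'http://localhost:8000/v1',  # placeholder OpenAI-compatible endpoint
    'api_key': 'EMPTY',                          # placeholder key
}

bot = Assistant(llm=llm_cfg)

# Multimodal message: an image plus a text instruction.
messages = [{
    'role': 'user',
    'content': [
        {'image': 'https://example.com/demo.jpg'},  # placeholder image URL
        {'text': 'Describe this image.'},
    ],
}]

# bot.run streams the growing list of response messages.
for response in bot.run(messages=messages):
    pass
print(response)
```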
