
[Usage]: openbmb-MiniCPM-Llama3-V-2_5 cannot be used via vLLM's OpenAI-compatible API server #8

Closed
renjingneng opened this issue Jul 25, 2024 · 3 comments


@renjingneng

Your current environment

The error output is as follows:

```
INFO: 172.27.100.187:35858 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/middleware/cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/entrypoints/openai/api_server.py", line 130, in create_chat_completion
    generator = await openai_serving_chat.create_chat_completion(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/entrypoints/openai/serving_chat.py", line 197, in create_chat_completion
    return await self.chat_completion_full_generator(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/entrypoints/openai/serving_chat.py", line 448, in chat_completion_full_generator
    async for res in result_generator:
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 772, in generate
    async for output in self._process_request(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 888, in _process_request
    raise e
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 884, in _process_request
    async for request_output in stream:
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 93, in __anext__
    raise result
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 46, in _log_task_completion
    return_value = task.result()
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 637, in run_engine_loop
    result = task.result()
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 565, in engine_step
    await self.engine.add_request_async(**new_request)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 323, in add_request_async
    processed_inputs = await self.process_model_inputs_async(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 305, in process_model_inputs_async
    return self.input_processor(llm_inputs)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/inputs/registry.py", line 202, in process_input
    return processor(InputContext(model_config), inputs)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/model_executor/models/minicpmv.py", line 366, in input_processor_for_minicpmv
    + text_chunks[1]
IndexError: list index out of range

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/middleware/cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/entrypoints/openai/api_server.py", line 130, in create_chat_completion
    generator = await openai_serving_chat.create_chat_completion(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/entrypoints/openai/serving_chat.py", line 197, in create_chat_completion
    return await self.chat_completion_full_generator(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/entrypoints/openai/serving_chat.py", line 448, in chat_completion_full_generator
    async for res in result_generator:
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 772, in generate
    async for output in self._process_request(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 873, in _process_request
    stream = await self.add_request(
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 676, in add_request
    self.start_background_loop()
  File "/root/anaconda3/envs/MiniCPMV/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 516, in start_background_loop
    raise AsyncEngineDeadError(
vllm.engine.async_llm_engine.AsyncEngineDeadError: Background loop has errored already.
```

How would you like to use vllm

No response
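
The IndexError at the bottom of the first traceback comes from input_processor_for_minicpmv indexing the second chunk of the prompt after splitting it on the image placeholder. A minimal sketch of that failure mode follows; the placeholder string and the split-and-join logic are assumptions inferred from the traceback, not the exact vLLM source:

```python
# Hypothetical reduction of input_processor_for_minicpmv's prompt handling.
# If the templated prompt contains no image placeholder, the split yields a
# single chunk and text_chunks[1] raises IndexError, as in the traceback.
prompt = "Describe this image."                    # no placeholder present
text_chunks = prompt.split("(<image>./</image>)")  # placeholder string is an assumption
print(len(text_chunks))                            # 1 -- splitting on an absent tag

try:
    new_prompt = text_chunks[0] + "<image_tokens>" + text_chunks[1]
except IndexError as err:
    print("IndexError:", err)                      # list index out of range
```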

@renjingneng
Author

Additional info:
vllm serve /path/to/llava-hf_llava-v1.6-mistral-7b-hf --port 8010 --trust-remote-code works fine.
vllm serve /path/to/openbmb-MiniCPM-Llama3-V-2_5 --port 8010 --trust-remote-code fails with the error above.
openbmb-MiniCPM-Llama3-V-2_5 was downloaded from Hugging Face.
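
For reference, a request of roughly this shape against the OpenAI-compatible endpoint triggers the 500 above. The host, API key, and image URL are hypothetical; by default vLLM serves the model under the path it was started with:

```python
from openai import OpenAI

# Client pointed at the vLLM server started with `vllm serve ... --port 8010`.
client = OpenAI(base_url="http://localhost:8010/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="/path/to/openbmb-MiniCPM-Llama3-V-2_5",  # model name defaults to the served path
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What is in this image?"},
            {"type": "image_url", "image_url": {"url": "https://example.com/cat.jpg"}},  # hypothetical URL
        ],
    }],
)
print(response.choices[0].message.content)
```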

@HwwwwwwwH
Collaborator

HwwwwwwwH commented Jul 26, 2024

This should already be fixed; I have also synced the fix into this repository's code.
For details, see this PR: vllm-project#6787

@renjingneng
Author

It works now, but the model returned a very long response; setting <|eot_id|> as a stop token fixed that.
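
With the OpenAI-compatible API, that stop token can be passed per request through the standard stop parameter (client setup as in the earlier sketch; model path is illustrative). <|eot_id|> is Llama 3's end-of-turn token, so stopping on it prevents the model from running past the end of its answer:

```python
response = client.chat.completions.create(
    model="/path/to/openbmb-MiniCPM-Llama3-V-2_5",
    messages=[{"role": "user", "content": "Hello"}],
    stop=["<|eot_id|>"],  # terminate generation at Llama 3's end-of-turn token
)
print(response.choices[0].message.content)
```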
