
Cannot use functions when deploying local model #374

Open

choiszt opened this issue Oct 10, 2024 · 0 comments

choiszt commented Oct 10, 2024

I am currently deploying the LLaVA series of models on top of the qwen-agent framework. However, while trying the basic demo I found that function calls never return anything: the functions do not appear to be passed through as input, so the `_call_llm` method never returns any function entry points or outputs.
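
For reference, this is roughly how I am invoking the model, following the standard qwen-agent function-calling example. The model name and endpoint below are placeholders for my local LLaVA deployment:

```python
from qwen_agent.llm import get_chat_model

# Placeholder config for my locally served model (OpenAI-compatible endpoint).
llm = get_chat_model({
    'model': 'llava-v1.5-7b',                     # local LLaVA checkpoint (placeholder)
    'model_server': 'http://localhost:8000/v1',   # local inference server (placeholder)
    'api_key': 'EMPTY',
})

functions = [{
    'name': 'get_current_weather',
    'description': 'Get the current weather in a given location.',
    'parameters': {
        'type': 'object',
        'properties': {'location': {'type': 'string'}},
        'required': ['location'],
    },
}]

messages = [{'role': 'user', 'content': "What's the weather like in Boston?"}]

# Stream the response; keep only the final chunk.
for responses in llm.chat(messages=messages, functions=functions, stream=True):
    pass
print(responses)  # never contains a function_call entry
```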

After investigating the issue, I found that the `_chat_with_functions` method does not seem to handle function calls at all. Below is the relevant section of the code:
```python
def _chat_with_functions(
    self,
    messages: List[Message],
    functions: List[Dict],
    stream: bool,
    delta_stream: bool,
    generate_cfg: dict,
    lang: Literal['en', 'zh'],
) -> Union[List[Message], Iterator[List[Message]]]:
    if delta_stream:
        raise NotImplementedError('Please use stream=True with delta_stream=False, because delta_stream=True'
                                  ' is not implemented for function calling due to some technical reasons.')
    generate_cfg = copy.deepcopy(generate_cfg)
    for k in ['parallel_function_calls', 'function_choice']:
        if k in generate_cfg:
            del generate_cfg[k]
    return self._continue_assistant_response(messages, generate_cfg=generate_cfg, stream=stream)
```
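
Note that the `functions` argument is accepted here but never referenced in the body, so the schemas are silently dropped before the request reaches the backend. As a stopgap I am considering prompt-based function calling: serializing the schemas into the system message myself and parsing a JSON tool call out of the reply. A rough sketch of that idea follows; the prompt wording and the two helpers are my own, not part of qwen-agent:

```python
import json
from typing import Dict, List, Optional


def inject_functions(messages: List[dict], functions: List[Dict]) -> List[dict]:
    """Prepend a system message that describes the available functions."""
    tool_text = '\n'.join(json.dumps(f, ensure_ascii=False) for f in functions)
    system = {
        'role': 'system',
        'content': ('You may call one of the following functions by replying with '
                    'a single JSON object {"name": ..., "arguments": {...}}:\n' + tool_text),
    }
    return [system] + messages


def parse_function_call(text: str) -> Optional[dict]:
    """Return {'name': ..., 'arguments': ...} if the reply is a JSON tool call, else None."""
    try:
        obj = json.loads(text.strip())
    except json.JSONDecodeError:
        return None
    if isinstance(obj, dict) and 'name' in obj and 'arguments' in obj:
        return obj
    return None
```
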
Are there any solutions?
