-
Hey, any updates?
-
@noshila I think we can integrate this model with the LangChain environment, since functionary can be deployed as an OpenAI-compatible service. Any library, program, or software that uses the OpenAI API can then switch to functionary as an alternative.
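For what it's worth, a minimal sketch of what that switch could look like with the vanilla ChatOpenAI class, assuming a functionary server is already running at a local OpenAI-compatible endpoint (the base URL, API key, and model name below are placeholders, not tested values):

```python
# Sketch only: repoint LangChain's OpenAI client at a functionary server.
# Assumes a functionary OpenAI-compatible server is running locally;
# base_url, api_key, and model are placeholder values.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",     # hypothetical local endpoint
    api_key="functionary",                   # dummy key; local servers typically ignore it
    model="meetkai/functionary-small-v3.1",  # assumed model name
)

print(llm.invoke("Hello!").content)
```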
-
I tried it with vLLM, using the LangChain class ChatOpenAI, but got:

BadRequestError: Error code: 400 - {'object': 'error', 'message': "FunctionaryTokenizer.apply_chat_template() missing 1 required positional argument: 'tools'", 'type': 'BadRequestError', 'param': None, 'code': 400}

Then I tried passing the tools via model_kwargs as "tools", but I get:

BadRequestError: Error code: 400 - {'object': 'error', 'message': "[{'type': 'extra_forbidden', 'loc': ('body', 'tools'), 'msg': 'Extra inputs are not permitted',

Not sure if this can be fixed using the vanilla LangChain ChatOpenAI class, so you might need to develop a dedicated LangChain class for this model. LangChain is kind of esoteric when it comes to different model prompt templates, especially tool usage; I don't understand it at all. It would be cool if it worked the same way as in the transformers library, but of course that is nothing you can fix.
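For reference, the standard route for tools with the vanilla class is bind_tools, which sends the schemas in the OpenAI-style tools field of the request body rather than through model_kwargs; whether the functionary vLLM server accepts that depends on the server version. A hedged sketch, where the endpoint, model name, and tool are hypothetical:

```python
# Sketch of the standard LangChain tool-binding path; whether the
# functionary vLLM server accepts the resulting `tools` field depends
# on the server version. Endpoint, model name, and tool are made up.
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city}"

llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",     # hypothetical functionary endpoint
    api_key="functionary",                   # dummy key for a local server
    model="meetkai/functionary-small-v3.1",
)

# bind_tools attaches the tool schemas to the request body's `tools` field
response = llm.bind_tools([get_weather]).invoke("What's the weather in Paris?")
print(response.tool_calls)
```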
-
I have added functionary v3.1 small to Ollama. You should be able to use the Ollama integration in LangChain to run the model and execute LangChain tools. Here is the model link: https://ollama.com/dwightfoster03/functionary-small-v3.1. I have not tested it with LangChain yet, but it should work.
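An untested sketch of what that could look like through the langchain-ollama package, assuming a local Ollama instance with the model above already pulled, and a made-up example tool:

```python
# Sketch only: run the Ollama-hosted functionary model through LangChain.
# Assumes `ollama pull dwightfoster03/functionary-small-v3.1` has been run
# and the langchain-ollama package is installed. The tool is hypothetical.
from langchain_ollama import ChatOllama
from langchain_core.tools import tool

@tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

llm = ChatOllama(model="dwightfoster03/functionary-small-v3.1")

# ChatOllama supports tool binding; untested with this model, as noted above
response = llm.bind_tools([add]).invoke("What is 2 + 3?")
print(response.tool_calls)
```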
-
Is it possible to integrate this model with the LangChain environment? Can it use LangChain tools?