Using LiteLLM with other models #705
-
My work has set up LiteLLM with access to a bunch of different models, but I'm having trouble with the Anthropic ones.
It seems there are settings that can be adjusted on the LiteLLM server, but I don't have access to it. I've been playing around with the Fabric source code to see if I could stop it from sending those parameters, without success so far, so if someone knows a way I'd be most grateful. I know Fabric has direct support for the Anthropic API, so perhaps I could get an access token directly, but I think my work wants to use LiteLLM to track usage, so they may not want to give me one. Any suggestions?
-
Hi @chriswarkentin, I'm the LiteLLM maintainer. If you set this in your litellm config.yaml, it will resolve the problem for your team:
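The inline snippet from this reply didn't survive the page export; based on the drop_params docs linked below, the setting in question is almost certainly the `drop_params` flag. A minimal sketch of the config.yaml:

```yaml
litellm_settings:
  # Silently drop any OpenAI-style params the target provider
  # (here, Anthropic) does not support, instead of returning an error.
  drop_params: true
```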
You can also specify which params to drop: https://docs.litellm.ai/docs/completion/drop_params#specify-params-to-drop
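If dropping everything unsupported is too broad, the linked section documents a per-model `additional_drop_params` field. A sketch of that variant; the model alias and the dropped param here are placeholders, not values from this thread:

```yaml
model_list:
  - model_name: claude-3-opus            # placeholder alias exposed by the proxy
    litellm_params:
      model: anthropic/claude-3-opus-20240229
      # Drop only these specific params for this model,
      # rather than all unsupported ones.
      additional_drop_params: ["frequency_penalty"]
```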
Would it be easier if you could control this client-side? @chriswarkentin