max_tokens arg doesn't work in stanford_town #1675
Comments
Yeah, I get what you mean. But the STAction comes from the official MetaGPT example, which does `from metagpt.config2 import config`, i.e. it imports the global config to control the LLM. As a result, the LLM outputs a very large number of tokens even though I set max_tokens = 50, so I recommend you fix this bug.
Sorry, I misunderstood. I'll fix it.
iorisa pushed a commit to iorisa/MetaGPT that referenced this issue on Jan 22, 2025.
Bug description
The max_tokens arg doesn't work in STAction._run_gpt35_max_tokens(). The self.llm module actually uses self.llm.config rather than config.llm, even though the two are of the same class type (their ids differ).
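The "same class type, different id" distinction above is the crux of the bug, and can be demonstrated in plain Python (the class name below is a stand-in, not MetaGPT's actual class):

```python
class LLMConfig:
    """Stand-in for MetaGPT's LLM config class; for illustration only."""
    def __init__(self, max_tokens: int = 4096):
        self.max_tokens = max_tokens

a = LLMConfig()  # e.g. the global config.llm
b = LLMConfig()  # e.g. the instance's self.llm.config

assert type(a) is type(b)  # same class type...
assert id(a) != id(b)      # ...but two distinct objects

a.max_tokens = 50          # mutating one object
print(b.max_tokens)        # does not affect the other: prints 4096
```

This is why setting `max_tokens` on the global config has no effect: the LLM instance reads its limit from a different object of the same type.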
Bug fix
Simply replace config.llm with self.llm.config, and max_tokens works.
But since self.llm.config and config.llm have the same class type, they should arguably be the same object (same id); something may be wrong in how these objects are initialized.
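The fix described above can be sketched as follows. Only the `config.llm` → `self.llm.config` swap and the names `STAction` and `_run_gpt35_max_tokens` come from the issue; the surrounding class layout is an assumption for illustration, not MetaGPT's actual code:

```python
# Minimal sketch of the bug and fix reported in this issue.
# The class structure is hypothetical; only the config.llm ->
# self.llm.config replacement is taken from the issue itself.

class LLMConfig:
    def __init__(self, max_tokens: int = 4096):
        self.max_tokens = max_tokens

class GlobalConfig:
    def __init__(self):
        self.llm = LLMConfig()  # global LLM settings

# Stands in for `from metagpt.config2 import config`.
config = GlobalConfig()

class LLM:
    def __init__(self, llm_config: LLMConfig):
        # The instance config is a *separate* object from config.llm,
        # and it is the one the client actually honors.
        self.config = llm_config

class STAction:
    def __init__(self, llm: LLM):
        self.llm = llm

    def _run_gpt35_max_tokens(self, prompt: str, max_tokens: int = 50) -> int:
        # Buggy version mutated the global config, which self.llm ignores:
        #   config.llm.max_tokens = max_tokens
        # Fix: set the limit on the config the LLM instance actually reads.
        self.llm.config.max_tokens = max_tokens
        # Placeholder for the real completion call; returns the effective limit.
        return self.llm.config.max_tokens

action = STAction(LLM(LLMConfig()))
print(action._run_gpt35_max_tokens("hello", max_tokens=50))  # prints 50
print(config.llm.max_tokens)  # global config untouched: prints 4096
```

The last line also illustrates the author's closing point: with the fix applied, the global `config.llm` and the instance's `self.llm.config` silently diverge, which suggests the two should probably be the same object in the first place.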