
OpenAI new models don't work #1682

Open
cranyy opened this issue Feb 3, 2025 · 2 comments

Comments


cranyy commented Feb 3, 2025

The new models give a max_temperature unknown error and cannot be used.

seehi (Contributor) commented Feb 12, 2025

temperature?

cranyy (Author) commented Feb 12, 2025

My bad, I meant tokens. Here is the traceback and my config:

File "E:\MetaStocky\env\Lib\site-packages\openai\_base_client.py", line 1625, in _request
   raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}

llm:
  api_type: "openai" # or azure / ollama / groq etc.
  model: "o3-mini" # or gpt-3.5-turbo
  base_url: "https://api.openai.com/v1" # or forward url / other llm url
  api_key: "sk-proj-
