
o1 Model fails when making tool calls #761

Open
mtessar opened this issue Jan 24, 2025 · 11 comments · Fixed by #764
Assignees
Labels
bug Something isn't working

Comments

@mtessar

mtessar commented Jan 24, 2025

Thanks for integrating the o1 model. I can use o1 agents without tools, but when I try to use tools I get an unsupported-parameter error, because `parallel_tool_calls` is set by the Pydantic AI framework.

Here is some code that shows the agent working without tools and produces a BadRequestError when using tools:

agent = Agent(
    'openai:o1',
    system_prompt=(
        'You are a helpful assistant. Today is 2024-12-12'
    ),
)
result = agent.run_sync("What is today's date?")
print(result)
"""
Today is January 24, 2025.
"""

@agent.tool_plain
def get_joke() -> str:
    return 'Why did the chicken cross the road? To get to the other side!'

result = agent.run_sync("Call the `get_joke` function")
"""
---------------------------------------------------------------------------
BadRequestError                           Traceback (most recent call last)
<ipython-input-12-512d4a647163> in <cell line: 0>()
     19     return 'Why did the chicken cross the road? To get to the other side!'
     20 
---> 21 result = agent.run_sync("Call the `get_joke` function")
     22 

10 frames
/usr/local/lib/python3.11/dist-packages/openai/_base_client.py in _request(self, cast_to, options, stream, stream_cls, retries_taken)
   1642 
   1643             log.debug("Re-raising status error")
-> 1644             raise self._make_status_error_from_response(err.response) from None
   1645 
   1646         return await self._process_response(

BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'parallel_tool_calls' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'parallel_tool_calls', 'code': 'unsupported_parameter'}}
"""

This code works for openai:gpt-4o but not o1. Thanks!
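The failure can be illustrated with a purely local sketch: a hypothetical filter that drops request parameters the o1 family rejects before the request is sent. The helper name `strip_unsupported_params` and the parameter set are assumptions for illustration, not part of pydantic-ai's API:

```python
# Hypothetical workaround sketch: drop request parameters that o1-family
# models reject before the request goes out. Not pydantic-ai API.
UNSUPPORTED_O1_PARAMS = {'parallel_tool_calls'}

def strip_unsupported_params(model_name: str, request_kwargs: dict) -> dict:
    """Return a copy of request_kwargs without parameters o1 models reject."""
    if model_name.startswith('o1'):
        return {k: v for k, v in request_kwargs.items()
                if k not in UNSUPPORTED_O1_PARAMS}
    return dict(request_kwargs)
```

A fix presumably along these lines, conditionally omitting `parallel_tool_calls` for o1 models, is what #764 implements.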

@sydney-runkle sydney-runkle added the bug Something isn't working label Jan 24, 2025
@sydney-runkle
Member

@mtessar,

Thanks for the report. I'll work on a fix for this today 👍

@sydney-runkle sydney-runkle self-assigned this Jan 24, 2025
@mtessar
Author

mtessar commented Jan 24, 2025

Pretty sure you are the one who deserves the thanks for adding this support @sydney-runkle

@sydney-runkle
Member

sydney-runkle commented Jan 24, 2025

@mtessar, could you please confirm your agent works on #764? I'm getting an awfully long delay in responses, but no error 👍

@mtessar
Author

mtessar commented Jan 24, 2025

@sydney-runkle I will check

@mtessar
Author

mtessar commented Jan 24, 2025

I couldn't figure out how to install the branch with pip, but I temporarily made the same edit locally and it worked (it was also very slow for me :) )

@izzyacademy
Contributor

izzyacademy commented Jan 24, 2025

@mtessar you can use the following steps to build and install the package from the git branch:

# Install the tools we will use to build the library
pip install hatch hatchling

# Clone the repository
git clone [email protected]:pydantic/pydantic-ai.git

# Navigate to the repository
cd pydantic-ai/

# Check out the specific branch (main shown here; substitute your branch)
git checkout main

# Build the pydantic packages
hatch build

# Navigate to the folder containing your example programs
# Install the *.whl files from the dist folder
pip install ~/Code/pydantic-ai/*/dist/pydantic_*-0.0.19-py3-none-any.whl

Make sure to run `pip uninstall` on the pydantic-ai, pydantic-ai-slim and pydantic-graph packages before installing the *.whl packages

@mtessar
Author

mtessar commented Jan 24, 2025

I was able to get pydantic-ai installed with the instructions above, but pydantic-ai-slim and pydantic-graph are still the pip packages

pydantic-ai @ file:///Users/mtessar/src/pydantic-ai/dist/pydantic_ai-0.0.20-py3-none-any.whl#sha256=52553460e55e194d179c486fd38b60e357609b9b55782af8e9afa1cacd62fbaf
pydantic-ai-slim==0.0.20
pydantic-graph==0.0.20

Sorry!
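A quick way to spot a mixed install like this is to query the installed versions with the standard library. A small sketch; `installed_versions` is an illustrative helper, and the three distribution names are the ones from this thread:

```python
from importlib import metadata

def installed_versions(packages):
    """Map each distribution name to its installed version, or None if absent."""
    versions = {}
    for name in packages:
        try:
            versions[name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            versions[name] = None
    return versions

# e.g. installed_versions(['pydantic-ai', 'pydantic-ai-slim', 'pydantic-graph']);
# a disagreement between the three version strings indicates a partial upgrade.
```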

@elmehalawi

This doesn't seem to work for o1-mini. I get the following error:

openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'developer' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

I'm setting up my model like this:

open_ai_model = OpenAIModel("o1-mini", system_prompt_role="developer")

scheduler_agent = Agent(
    open_ai_model,
    deps_type=Dependencies,
    result_type=str,
    system_prompt=system_prompt,
    retries=4,
)
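The error says o1-mini rejects the `developer` role outright. A purely local sketch of a role fallback; the helper name and the mapping are assumptions for illustration, not pydantic-ai API:

```python
# Hypothetical sketch: downgrade the system-prompt role for models that
# reject 'developer'. Names and mapping are illustrative only.
ROLE_FALLBACKS = {'o1-mini': 'user', 'o1-preview': 'user'}

def system_prompt_role_for(model_name: str, requested: str = 'developer') -> str:
    return ROLE_FALLBACKS.get(model_name, requested)
```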

@Kludex
Member

Kludex commented Feb 25, 2025

@elmehalawi This is a different issue: #974

@drdavella

@Kludex even when accounting for the system prompt role issue in #974, this still does not work. Here's a minimal example:

from pydantic import BaseModel
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

model = OpenAIModel(
    model_name='o1-mini',
    system_prompt_role='user',
)

class YesNoAnswer(BaseModel):
    answer: str

agent = Agent(
    model=model,
    system_prompt='You are a helpful assistant. Provide a yes-no answer to the question.',
    result_type=YesNoAnswer,
)

print(agent.run_sync('Is the sky blue?'))

With the following result:

 File "/Users/username/miniconda3/envs/pydantic-ai/lib/python3.13/site-packages/openai/_base_client.py", line 1651, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'tools is not supported in this model. For a list of supported models, refer to https://platform.openai.com/docs/guides/function-calling#models-supporting-function-calling.', 'type': 'invalid_request_error', 'param': None, 'code': None}}

@Kludex Kludex reopened this Feb 26, 2025
@Kludex
Member

Kludex commented Feb 26, 2025

If you remove the `result_type`, it works. 🤔
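This suggests `result_type` makes pydantic-ai request structured output via tool calls, which o1-mini rejects. One workaround is to drop `result_type` and validate the plain-text reply yourself. A minimal sketch, using a dataclass in place of the thread's pydantic model; the parsing heuristic is illustrative only:

```python
from dataclasses import dataclass

@dataclass
class YesNoAnswer:
    answer: str

def parse_yes_no(text: str) -> YesNoAnswer:
    # Naive mapping from free text to a structured answer (illustrative only).
    return YesNoAnswer(answer='yes' if 'yes' in text.lower() else 'no')
```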

@Kludex Kludex assigned Kludex and unassigned sydney-runkle Feb 26, 2025