o1 Model fails when making tool calls #761
Comments
Thanks for the report. I'll work on a fix for this today 👍
Pretty sure you are the one who deserves the thanks for adding this support @sydney-runkle
@sydney-runkle I will check
I couldn't figure out how to get the branch installed with pip, but I did temporarily edit the line you changed and it worked (and was also very slow for me :) )
@mtessar you can use the following steps to build and install the package from the git branch:

# Install the tools we will use to build the library
pip install hatch hatchling
# Clone the repository
git clone [email protected]:pydantic/pydantic-ai.git
# Navigate to the repository
cd pydantic-ai/
# Checkout the specific branch
git checkout main  # or your specific branch
# Build the pydantic packages
hatch build
# Navigate to the folder containing your example programs
# Install the *.whl files from the dist folder
pip install ~/Code/pydantic-ai/*/dist/pydantic_*-0.0.19-py3-none-any.whl

Make sure to run pip uninstall on the pydantic-ai, pydantic-ai-slim and pydantic-graph packages first, before installing the *.whl packages.
I was able to get pydantic-ai installed with the instructions above, but pydantic-ai-slim and pydantic-graph are still the PyPI packages:

pydantic-ai @ file:///Users/mtessar/src/pydantic-ai/dist/pydantic_ai-0.0.20-py3-none-any.whl#sha256=52553460e55e194d179c486fd38b60e357609b9b55782af8e9afa1cacd62fbaf

Sorry!
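If building wheels locally proves awkward, pip can usually install straight from the git branch instead; this is a sketch, assuming the subpackages live in the pydantic_ai_slim and pydantic_graph directories of the monorepo (adjust the branch name to the one you need):

```bash
# Sketch: install all three packages from a git branch so none of them
# resolve to the PyPI releases. The subdirectory names are assumptions
# based on the monorepo layout.
pip uninstall -y pydantic-ai pydantic-ai-slim pydantic-graph
pip install "pydantic-graph @ git+https://github.com/pydantic/pydantic-ai.git@main#subdirectory=pydantic_graph"
pip install "pydantic-ai-slim @ git+https://github.com/pydantic/pydantic-ai.git@main#subdirectory=pydantic_ai_slim"
pip install "pydantic-ai @ git+https://github.com/pydantic/pydantic-ai.git@main"
```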
This doesn't seem to work for o1-mini. I get the following error:
I'm setting up my model like this:

open_ai_model = OpenAIModel("o1-mini", system_prompt_role="developer")
scheduler_agent = Agent(
    open_ai_model,
    deps_type=Dependencies,
    result_type=str,
    system_prompt=system_prompt,
    retries=4,
)
@elmehalawi This is a different issue: #974
@Kludex even when accounting for the system call issue in #974 this still does not work. Here's a minimal example:

from pydantic import BaseModel
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

model = OpenAIModel(
    model_name='o1-mini',
    system_prompt_role='user',
)

class YesNoAnswer(BaseModel):
    answer: str

agent = Agent(
    model=model,
    system_prompt='You are a helpful assistant. Provide a yes-no answer to the question.',
    result_type=YesNoAnswer,
)

print(agent.run_sync('Is the sky blue?'))

With the following result:
If you remove the …
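The comment above appears truncated. Assuming it refers to removing result_type (pydantic-ai uses a tool call to produce structured results, and o1-mini rejects tool calls), this is a minimal sketch of that workaround, not a confirmed fix from the maintainers:

```python
# Minimal sketch of the assumed workaround: drop result_type so the agent
# returns plain text and no result tool (and hence no tool call) is needed.
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

model = OpenAIModel(
    model_name='o1-mini',
    system_prompt_role='user',
)

agent = Agent(
    model=model,
    system_prompt='You are a helpful assistant. Provide a yes-no answer to the question.',
    # no result_type: the run returns a plain string
)

print(agent.run_sync('Is the sky blue?'))
```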
Thanks for integrating the o1 model. I can use o1 agents without tools, but when I try to use tools, I get an unsupported-parameter error because of the parallel_tool_calls parameter set by the Pydantic AI framework.
Here is some code that shows the agent working without tools and producing a BadRequestError when using tools:
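(The original snippet was not captured in this thread. The following is a hypothetical minimal sketch of an agent with a single tool that would exercise the failing code path, not the reporter's exact code.)

```python
# Hypothetical reproduction sketch (not the reporter's original code):
# registering a tool makes pydantic-ai send tool definitions along with
# the parallel_tool_calls parameter that o1 rejects.
import random

from pydantic_ai import Agent

agent = Agent(
    'openai:o1',  # the same agent works with 'openai:gpt-4o'
    system_prompt='Use the roll_die tool to answer the question.',
)

@agent.tool_plain
def roll_die() -> str:
    """Roll a six-sided die and return the result."""
    return str(random.randint(1, 6))

print(agent.run_sync('Roll the die for me.'))
```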
This code works for openai:gpt-4o but not o1. Thanks!
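For context on the shape of a possible fix (a hypothetical sketch, not the actual pydantic-ai patch): the OpenAI Python client accepts a NOT_GIVEN sentinel, so a request can simply omit parallel_tool_calls unless the caller explicitly configures it, which keeps o1 from ever seeing the unsupported parameter.

```python
# Hypothetical sketch of the kind of change that avoids the error; the real
# fix lives inside pydantic-ai and may look different. The idea: only forward
# parallel_tool_calls when it was explicitly set.
from openai import NOT_GIVEN, AsyncOpenAI


async def create_completion(client: AsyncOpenAI, messages, tools=None, parallel_tool_calls=None):
    return await client.chat.completions.create(
        model='o1',
        messages=messages,
        tools=tools if tools else NOT_GIVEN,
        # o1 rejects this parameter, so pass the sentinel instead of a default value
        parallel_tool_calls=parallel_tool_calls if parallel_tool_calls is not None else NOT_GIVEN,
    )
```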