Cannot call CompleteChatAsync with options after ToolChatMessage in messages of type List<ChatMessage> #218
Comments
Thank you for reaching out, @taihuy! This snippet is from our function calling example linked below, correct? I just ran the example using the latest version of the library, and it works as expected. Did you make any modifications to the code? If you could share an end-to-end repro, that would be very helpful!
Hi @joseharriaga, thanks very much for your quick answer. I really appreciate it. It is even more helpful when we are working with this kind of innovative technology while it is still in beta.
I got the following warning: "'Azure.AI.OpenAI.AzureChatCompletionOptionsExtensions.AddDataSource(OpenAI.Chat.ChatCompletionOptions, Azure.AI.OpenAI.Chat.AzureChatDataSource)' is for evaluation purposes only and is subject to change or removal in future updates." However, I need both tools and the Azure Search service in my bot. Is there any way I can have both?
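For reference, here is a minimal sketch of what combining the two might look like. It assumes an existing chatClient and messages list, and the tool name, endpoint, index, and key are all placeholders, not values from this issue; AddDataSource is the evaluation-only extension from the warning above, and as noted later in the thread the service may still reject or ignore the combination.

using Azure.AI.OpenAI.Chat;
using OpenAI.Chat;

// Hedged sketch: register both a function tool and an Azure Search data source
// on the same ChatCompletionOptions. Expect the evaluation-only compiler
// warning for AddDataSource unless it is suppressed.
ChatCompletionOptions options = new()
{
    Tools =
    {
        ChatTool.CreateFunctionTool(
            functionName: "get_order_status",                       // placeholder tool
            functionDescription: "Look up the status of an order.")
    }
};

options.AddDataSource(new AzureSearchChatDataSource
{
    Endpoint = new Uri("https://<your-search-resource>.search.windows.net"), // placeholder
    IndexName = "<your-index>",                                              // placeholder
    Authentication = DataSourceAuthentication.FromApiKey("<your-key>"),      // placeholder
});

ChatCompletion completion = await chatClient.CompleteChatAsync(messages, options);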
@joseharriaga I have the same issue. I noticed that it's related to how the content of assistant messages is serialized. I tried it with the following test code snippet:

var messages = new List<ChatMessage>
{
new AssistantChatMessage(toolCalls: [], content: null),
new AssistantChatMessage(toolCalls: [], content: string.Empty),
new AssistantChatMessage(toolCalls: [], content: "test")
};
var result = await client.CompleteChatAsync(messages);

This is what the request looks like:

{
"messages":[
{
"role":"assistant"
},
{
"role":"assistant",
"content":[
{
"type":"text",
"text":""
}
]
},
{
"role":"assistant",
"content":"test"
}
],
"model":"gpt-4o"
}
It looks like Azure OpenAI with the data source feature works only when the assistant message has a plain, non-empty string content, as in the last message above.
Thank you for providing more context! Unfortunately, the Azure OpenAI service does not support function calling in combination with data sources. You can find more information about it here:
@joseharriaga It's not supported, but in this case the request should still be successful, and either function calling or data sources should be ignored. But taking into account the serialization aspect of assistant messages that I shared above, instead of ignoring the function calling/data source configuration, it throws an exception (the server returns 400 Bad Request).
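One possible workaround, not taken from this thread but sketched from the serialization behavior described above, is to always give the tool-calling assistant message a non-empty string content before adding it to the history, so the serialized payload always carries a plain "content" field. Here, completion is assumed to be the ChatCompletion returned by the previous CompleteChatAsync call, and the placeholder text is hypothetical; whether this actually avoids the 400 depends on the service behavior discussed above.

// Hedged sketch: ensure the assistant message added to the history always has
// non-empty string content, even when the model responded with tool calls only.
string assistantText = completion.Content.Count > 0
    ? completion.Content[0].Text
    : "(calling tools)"; // placeholder text, not from the thread

messages.Add(new AssistantChatMessage(
    toolCalls: completion.ToolCalls,
    content: assistantText));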
@dmytrostruk Ah! We merged and released a change where an empty string is represented as
@joseharriaga Great news, thanks a lot!
Confirm this is not an issue with the OpenAI Python Library
Confirm this is not an issue with the underlying OpenAI API
Confirm this is not an issue with Azure OpenAI
Describe the bug
I really like the idea of this code snippet, but it seems that client.CompleteChat(messages, options); throws an exception from the server without any specific error details (only 400 Bad Request is returned) if we call this method after the tool calls were processed in the previous loop iteration. If we remove the options and only pass the messages, CompleteChat(messages), then the call after the tool calls works fine.
In some forums, I have seen the suggestion to call CompleteChat with options for the first request, then run the tools, and then make the final CompleteChat call without options. In that case, how can we include the whole history? For example, when the bot needs a parameter from the user and has to ask a question before it can run the function further?
In other words, how can we process tool calls as a chain (not in parallel), so that the output of the first tool can be the input for the next tool? A sketch of such a loop is shown below.
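For reference, here is a hedged sketch of a chained tool-calling loop that keeps the full history in messages and reuses the same options on every call, loosely following the library's function calling example. client is assumed to be an existing ChatClient, and getWeatherTool and ExecuteTool are placeholders introduced only for illustration.

using OpenAI.Chat;

// Hedged sketch of a sequential (chained) tool-calling loop.
List<ChatMessage> messages = [new UserChatMessage("What should I wear in Paris today?")];

ChatCompletionOptions options = new() { Tools = { getWeatherTool } };

bool requiresAction;
do
{
    requiresAction = false;
    ChatCompletion completion = await client.CompleteChatAsync(messages, options);

    switch (completion.FinishReason)
    {
        case ChatFinishReason.ToolCalls:
            // Keep the assistant message with its tool calls in the history.
            messages.Add(new AssistantChatMessage(completion));

            foreach (ChatToolCall toolCall in completion.ToolCalls)
            {
                // ExecuteTool is a hypothetical helper; its output becomes part
                // of the history and can feed the next tool call in the
                // following loop iteration.
                string toolResult = ExecuteTool(toolCall);
                messages.Add(new ToolChatMessage(toolCall.Id, toolResult));
            }

            requiresAction = true; // ask the model again with the updated history
            break;

        case ChatFinishReason.Stop:
            messages.Add(new AssistantChatMessage(completion));
            break;
    }
} while (requiresAction);

Because the full messages list (user messages, assistant tool calls, and tool results) is passed to every CompleteChatAsync call, the whole conversation history is preserved even when the model asks the user a clarifying question between tool calls.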
To Reproduce
Just run the code snippet and make sure that the model runs at least one tool. You will see that it throws an exception.
The error I got:
Code snippets
No response
OS
Windows
.NET version
8.0.6
Library version
2.0.0-beta.2