Is your feature request related to a problem? Please describe.
With agents, we do not necessarily need responses instantaneously. Supporting batch/async processing functionality could help reduce costs, by at least half.
Describe the solution you'd like
Not fully fleshed out yet, but say we want OpenAI chat completions: instead of calling the API directly, we could use the Batch API option (https://platform.openai.com/docs/guides/batch). Any parallel, non-dependent tasks could carry on while this pipeline waits for the response. Once the response is available, the follow-up tasks continue.
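A rough sketch of what this could look like, assuming the OpenAI Python SDK. The `build_batch_line` helper and the `"task-1"` / model names are hypothetical illustrations; the JSONL request shape and the `files.create` / `batches.create` / `batches.retrieve` calls follow OpenAI's documented Batch API:

```python
import json

def build_batch_line(custom_id, model, messages):
    """Build one JSONL line for an OpenAI Batch API input file.

    Each line targets /v1/chat/completions; `custom_id` lets the
    pipeline match results back to the originating agent task.
    """
    return json.dumps({
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {"model": model, "messages": messages},
    })

# Hypothetical submission flow (requires the `openai` package and an API key):
# from openai import OpenAI
# client = OpenAI()
# with open("requests.jsonl", "w") as f:
#     f.write(build_batch_line("task-1", "gpt-4o-mini",
#                              [{"role": "user", "content": "Hello"}]) + "\n")
# batch_file = client.files.create(file=open("requests.jsonl", "rb"),
#                                  purpose="batch")
# batch = client.batches.create(input_file_id=batch_file.id,
#                               endpoint="/v1/chat/completions",
#                               completion_window="24h")
# # Non-dependent tasks run here; later, poll client.batches.retrieve(batch.id)
# # until status == "completed", then download the output file and dispatch
# # each result to its follow-up task via custom_id.
```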
Is this possible with the current options? From looking at the code in a few places, I do not see it implemented.
Describe alternatives you've considered
This is a functionality addition, a nice-to-have rather than a must-have, which may be why it is not a feature yet.
Additional context
Support for the Batch API options described at https://platform.openai.com/docs/guides/batch.