My config2.yaml:
api_type: "openai" # or azure / ollama / open_llm etc. Check LLMType for more options
model: "gpt-4o" # or gpt-3.5-turbo-1106 / gpt-4-1106-preview
base_url: "https://api.agicto.cn/v1"
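For context, these lines sit under the `llm` key in config2.yaml; the full block looks roughly like the sketch below (the `api_key` value is a placeholder, not my real key):

```yaml
llm:
  api_type: "openai"                      # OpenAI-compatible endpoint
  model: "gpt-4o"
  base_url: "https://api.agicto.cn/v1"    # third-party proxy, not api.openai.com
  api_key: "sk-REDACTED"                  # placeholder
```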
When I execute the following code, I run into a problem:
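The script essentially follows the single-agent pattern from the MetaGPT agent tutorial: a custom Action whose `run` awaits `self._aask`, and a Role in "by_order" react mode (which is why `_act_by_order` shows up in the traceback) whose `_act` awaits `todo.run`. The sketch below is an approximation with placeholder class names and prompt text, not the exact test1.py:

```python
import asyncio

import fire

from metagpt.actions import Action
from metagpt.logs import logger
from metagpt.roles import Role
from metagpt.schema import Message


class SimpleWriteCode(Action):
    """Placeholder action: builds a prompt and asks the LLM (this is where _aask is awaited)."""

    name: str = "SimpleWriteCode"
    PROMPT_TEMPLATE: str = "Write a Python function that can {instruction}. Return only the code."

    async def run(self, instruction: str) -> str:
        prompt = self.PROMPT_TEMPLATE.format(instruction=instruction)
        rsp = await self._aask(prompt)  # -> llm.aask -> acompletion_text -> _achat_completion_stream
        return rsp


class SimpleCoder(Role):
    """Placeholder role that runs its single action in order."""

    name: str = "Alice"
    profile: str = "SimpleCoder"

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.set_actions([SimpleWriteCode()])
        self._set_react_mode(react_mode="by_order")  # matches _act_by_order in the traceback

    async def _act(self) -> Message:
        todo = self.rc.todo
        msg = self.get_memories(k=1)[0]  # latest message, i.e. the user instruction
        result = await todo.run(msg.content)
        return Message(content=result, role=self.profile, cause_by=type(todo))


def main(msg: str = "calculate the sum of a list"):
    role = SimpleCoder()
    result = asyncio.run(role.run(msg))
    logger.info(result)


if __name__ == "__main__":
    fire.Fire(main)
```

The problem: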
2025-02-20 20:13:45.205 | WARNING | metagpt.utils.common:wrapper:649 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
Traceback (most recent call last):
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/utils/common.py", line 640, in wrapper
return await func(self, *args, **kwargs)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/roles/role.py", line 550, in run
rsp = await self.react()
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/roles/role.py", line 519, in react
rsp = await self._act_by_order()
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/roles/role.py", line 473, in _act_by_order
rsp = await self._act()
File "/home/ubuntu/metagpt_test/test1.py", line 89, in _act
result = await todo.run(msg.content)
File "/home/ubuntu/metagpt_test/test1.py", line 30, in run
rsp = await self._aask(prompt)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/actions/action.py", line 93, in _aask
return await self.llm.aask(prompt, system_msgs)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/provider/base_llm.py", line 150, in aask
rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
File "/home/ubuntu/venv1/lib/python3.10/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/tenacity/_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
File "/usr/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/home/ubuntu/venv1/lib/python3.10/site-packages/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/provider/openai_api.py", line 141, in acompletion_text
return await self._achat_completion_stream(messages, timeout=timeout)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/provider/openai_api.py", line 94, in _achat_completion_stream
usage = CompletionUsage(**chunk.usage)
TypeError: openai.types.completion_usage.CompletionUsage() argument after ** must be a mapping, not NoneType
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ubuntu/metagpt_test/test1.py", line 105, in <module>
fire.Fire(main)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/fire/core.py", line 466, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/home/ubuntu/venv1/lib/python3.10/site-packages/fire/core.py", line 681, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
File "/home/ubuntu/metagpt_test/test1.py", line 100, in main
result = asyncio.run(role.run(msg))
File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
return loop.run_until_complete(main)
File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
return future.result()
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/utils/common.py", line 662, in wrapper
raise Exception(format_trackback_info(limit=None))
Exception: Traceback (most recent call last):
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/utils/common.py", line 640, in wrapper
return await func(self, *args, **kwargs)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/roles/role.py", line 550, in run
rsp = await self.react()
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/roles/role.py", line 519, in react
rsp = await self._act_by_order()
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/roles/role.py", line 473, in _act_by_order
rsp = await self._act()
File "/home/ubuntu/metagpt_test/test1.py", line 89, in _act
result = await todo.run(msg.content)
File "/home/ubuntu/metagpt_test/test1.py", line 30, in run
rsp = await self._aask(prompt)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/actions/action.py", line 93, in _aask
return await self.llm.aask(prompt, system_msgs)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/provider/base_llm.py", line 150, in aask
rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
File "/home/ubuntu/venv1/lib/python3.10/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/tenacity/_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
File "/usr/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/home/ubuntu/venv1/lib/python3.10/site-packages/tenacity/_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/provider/openai_api.py", line 141, in acompletion_text
return await self._achat_completion_stream(messages, timeout=timeout)
File "/home/ubuntu/venv1/lib/python3.10/site-packages/metagpt/provider/openai_api.py", line 94, in _achat_completion_stream
usage = CompletionUsage(**chunk.usage)
TypeError: openai.types.completion_usage.CompletionUsage() argument after ** must be a mapping, not NoneType
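From the traceback, the crash happens in metagpt/provider/openai_api.py inside `_achat_completion_stream`, at `usage = CompletionUsage(**chunk.usage)`: the streamed chunk coming back from api.agicto.cn apparently carries `usage` as null, so the `**` unpack receives a NoneType. As a stopgap I can guard that line in my installed copy; a rough sketch of such a local edit (not an upstream fix, and the zeroed fallback is my own assumption):

```python
# Local edit inside metagpt/provider/openai_api.py, _achat_completion_stream.
# CompletionUsage is already used by that module; the import is repeated here only for clarity.
from openai.types import CompletionUsage

# Original line that crashes when the provider streams "usage": null:
#   usage = CompletionUsage(**chunk.usage)
if chunk.usage:
    usage = CompletionUsage(**chunk.usage)
else:
    # Fall back to zeroed usage so token accounting degrades instead of raising.
    usage = CompletionUsage(prompt_tokens=0, completion_tokens=0, total_tokens=0)
```

But editing site-packages is obviously just a stopgap.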
How can I fix the bug?