[Bug] error in streaming mode if llm call tool does not require parameters #3134

Open
wusskk opened this issue May 9, 2025 · 2 comments
Labels
bug Something isn't working


@wusskk

wusskk commented May 9, 2025

Description

When communicating with qwen3-32b in streaming mode and the model needs to call a tool that takes no arguments, the model returns the tool-call arguments as an empty string ''. Once the tool call completes and the conversation is sent back to the model, the request fails with an error (see the payload sketch after the list below).
I think three conditions are needed to trigger the problem:

  1. the model itself (the issue does not occur with the deepseek-chat API, but it does with a locally deployed qwen3-32b)
  2. streaming (not triggered when stream=False)
  3. a tool that does not require parameters
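
A minimal sketch of the failing vs. expected payload, written as Python dict literals that mirror the assistant message in the debug log below (the second form reflects what OpenAI-compatible servers expect, since arguments must be a JSON-encoded string):

# what the locally deployed qwen3-32b streams back for a no-argument tool call
{"id": "chatcmpl-tool-...", "type": "function", "function": {"name": "get_number", "arguments": ""}}

# what OpenAI-compatible servers expect: an empty JSON object, still encoded as a string
{"id": "chatcmpl-tool-...", "type": "function", "function": {"name": "get_number", "arguments": "{}"}}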

Steps to Reproduce

  1. select qwen3-32b as the model
  2. use stream=True
  3. select a tool that does not require parameters

Agent Configuration (if applicable)

Minimum Implementation Code:

import random

from agno.agent import Agent
from agno.models.openai.like import OpenAILike
from agno.tools import tool


@tool(show_result=True)
def get_number() -> str:
    """Get the number."""
    # In a real implementation, this would call a number API
    random_number = random.randint(1, 100)

    return f"The number is {random_number}."


model = OpenAILike(
    id="qwen3-32b", base_url="http://127.0.0.1:8080, api_key="sk-xxxxx"
)

agent = Agent(model=model, tools=[get_number], debug_mode=True)
agent.print_response("Give me a random number.", stream=True)

Expected Behavior

The tool is called and its result is included in the streamed response.

Actual Behavior

PS D:\Code\agent\agno_bug> & D:/Code/agent/agno_bug/.venv/Scripts/python.exe d:/Code/agent/agno_bug/main.py
DEBUG ******************************************* Agent ID: d70c8231-afc4-4770-8ff7-812fc329179d *******************************************
DEBUG ****************************************** Session ID: 4a93145c-b17e-4f0a-aace-08780e3fe147 ******************************************
DEBUG *************************************** Agent Run Start: 4d03b13d-f073-4f03-b938-ffa1b1a242e9 ****************************************
DEBUG Processing tools for model
DEBUG Added tool get_number
DEBUG ---------------------------------------------------- OpenAI Response Stream Start ----------------------------------------------------
DEBUG ---------------------------------------------------------- Model: qwen3-32b ----------------------------------------------------------
DEBUG ================================================================ user ================================================================
DEBUG Give me a random number.
DEBUG messages in stream: [{'role': 'user', 'content': 'Give me a random number.'}]
DEBUG ============================================================= assistant ==============================================================
DEBUG <think>
      Okay, the user is asking for a random number. Let me check the tools available. There's a function called get_number with no parameters required.   
      Since the user wants a random number, I should call that function. But wait, the function's description just says "Get the number." Does it generate
      a random one or return a fixed value? The parameters are empty, so maybe it's designed to return a random number by default. I'll proceed to call   
      get_number without any arguments. I need to make sure the tool call is correctly formatted in JSON within the XML tags. Alright, that should do it. 
      </think>


DEBUG Tool Calls:                                                                                                                                         
        - ID: 'chatcmpl-tool-1077bda4db6941d1936130b9f9d57102'                                                                                            
          Name: 'get_number'                                                                                                                              
DEBUG *************************************************************  METRICS  **************************************************************              
DEBUG * Tokens:                      input=139, output=144, total=283                                                                                     
DEBUG * Time:                        3.3514s                                                                                                              
DEBUG * Tokens per second:           42.9673 tokens/s                                                                                                     
DEBUG * Time to first token:         0.9840s                                                                                                              
DEBUG *************************************************************  METRICS  **************************************************************              
DEBUG Getting function get_number                                                                                                                         
DEBUG Running: get_number()                                                                                                                               
DEBUG ================================================================ tool ================================================================              
DEBUG Tool call Id: chatcmpl-tool-1077bda4db6941d1936130b9f9d57102
DEBUG The number is 67.                                                                                                                                   
DEBUG ***********************************************************  TOOL METRICS  ***********************************************************              
DEBUG * Time:                        0.0022s                                                                                                              
DEBUG ***********************************************************  TOOL METRICS  ***********************************************************              
DEBUG messages in stream: [{'role': 'user', 'content': 'Give me a random number.'}, {'role': 'assistant', 'content': '<think>\nOkay, the user is asking   
      for a random number. Let me check the tools available. There\'s a function called get_number with no parameters required. Since the user wants a    
      random number, I should call that function. But wait, the function\'s description just says "Get the number." Does it generate a random one or      
      return a fixed value? The parameters are empty, so maybe it\'s designed to return a random number by default. I\'ll proceed to call get_number      
      without any arguments. I need to make sure the tool call is correctly formatted in JSON within the XML tags. Alright, that should do                
      it.\n</think>\n\n', 'tool_calls': [{'id': 'chatcmpl-tool-1077bda4db6941d1936130b9f9d57102', 'type': 'function', 'function': {'name': 'get_number',  
      'arguments': ''}}]}, {'role': 'tool', 'content': 'The number is 67.', 'tool_call_id': 'chatcmpl-tool-1077bda4db6941d1936130b9f9d57102'}]            
ERROR    API status error from OpenAI API: Error code: 400 - {'object': 'error', 'message': 'Expecting value: line 1 column 1 (char 0)', 'type':
         'BadRequestError', 'param': None, 'code': 400}
▰▱▱▱▱▱▱ Thinking...
┏━ Message ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃                                                                                                                                                        ┃
┃ Give me a random number.                                                                                                                               ┃
┃                                                                                                                                                        ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
┏━ Tool Calls ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃                                                                                                                                                        ┃
┃ • get_number()                                                                                                                                         ┃
┃                                                                                                                                                        ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
┏━ Response (3.4s) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃                                                                                                                                                        ┃
┃ <think>                                                                                                                                                ┃
┃ Okay, the user is asking for a random number. Let me check the tools available. There's a function called get_number with no parameters required.      ┃
┃ Since the user wants a random number, I should call that function. But wait, the function's description just says "Get the number." Does it generate a ┃
┃ random one or return a fixed value? The parameters are empty, so maybe it's designed to return a random number by default. I'll proceed to call        ┃
┃ get_number without any arguments. I need to make sure the tool call is correctly formatted in JSON within the XML tags. Alright, that should do it.    ┃
┃ </think>                                                                                                                                               ┃
┃                                                                                                                                                        ┃
┃ The number is 67.                                                                                                                                      ┃
┃                                                                                                                                                        ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
Traceback (most recent call last):
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\agno\models\openai\chat.py", line 455, in invoke_stream
    yield from self.get_client().chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\openai\_utils\_utils.py", line 287, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\openai\resources\chat\completions\completions.py", line 925, in create
    return self._post(
           ^^^^^^^^^^^
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\openai\_base_client.py", line 1239, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\openai\_base_client.py", line 1034, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': 'Expecting value: line 1 column 1 (char 0)', 'type': 'BadRequestError', 'param': None, 'code': 400}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "d:\Code\agent\agno_bug\main.py", line 24, in <module>
    agent.print_response("Give me a random number.", stream=True)
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\agno\agent\agent.py", line 4545, in print_response
    for resp in self.run(
                ^^^^^^^^^
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\agno\agent\agent.py", line 655, in _run
    for model_response_chunk in self.model.response_stream(messages=run_messages.messages):
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\agno\models\base.py", line 520, in response_stream
    yield from self.process_response_stream(
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\agno\models\base.py", line 492, in process_response_stream
    for response_delta in self.invoke_stream(messages=messages):
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\agno\models\openai\chat.py", line 490, in invoke_stream
    raise ModelProviderError(
agno.exceptions.ModelProviderError: Unknown model error

Screenshots or Logs (if applicable)

[screenshot]

Environment

  • OS: Windows 11
  • Agno Version: v1.4.5
  • External Dependency Versions: openai-v1.78.0
  • Additional Environment Details: Python 3.12.8

Possible Solutions (optional)

# add code to _format_messages in agno.models.openai.chat.OpenAIChat
if "tool_calls" in message_dict:
    for tool in message_dict["tool_calls"]:
        if tool["type"] == "function":
            if not tool["function"]["arguments"]:
                tool["function"]["arguments"] = '{}'

[screenshot]
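
For illustration, here is a standalone sketch of the same normalization applied to a plain list of OpenAI-style message dicts before they are sent to the server. The message shape mirrors the debug log above rather than agno's internal representation, and the helper name is made up for this example:

def normalize_empty_tool_arguments(messages: list[dict]) -> list[dict]:
    """Replace empty tool-call argument strings with '{}' so they parse as valid JSON."""
    for message in messages:
        for tool_call in message.get("tool_calls") or []:
            if tool_call.get("type") == "function" and not tool_call["function"]["arguments"]:
                tool_call["function"]["arguments"] = "{}"
    return messages


# Example: the assistant message from the debug log above, with an empty arguments string.
messages = [
    {"role": "user", "content": "Give me a random number."},
    {
        "role": "assistant",
        "content": "",
        "tool_calls": [
            {
                "id": "chatcmpl-tool-1077bda4db6941d1936130b9f9d57102",
                "type": "function",
                "function": {"name": "get_number", "arguments": ""},
            }
        ],
    },
]
normalize_empty_tool_arguments(messages)
assert messages[1]["tool_calls"][0]["function"]["arguments"] == "{}"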

Additional Context

A similar problem has come up in pydantic-ai, and their maintainers take the position that it should be fixed in the agent framework.

@wusskk wusskk added the bug Something isn't working label May 9, 2025
@ysolanky ysolanky self-assigned this May 20, 2025
@ysolanky
Contributor

Hello @wusskk! It looks like what is happening here is an incompatibility between the qwen3-32b model and the OpenAI spec for function calling.

OpenAI expects the arguments for a tool call with no params to be {}, whereas qwen3-32b returns '', resulting in the error. Your suggestion works, but it is tied to just qwen3-32b. I think a better solution would be a separate Qwen model class, if this behaviour is consistent across all Qwen models. Can you please share how you are hosting this model?
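
As a side note, the "Expecting value: line 1 column 1 (char 0)" message in the traceback is exactly what Python's json module raises when asked to parse an empty string, which is consistent with the server attempting to json.loads the empty arguments field. A minimal illustration (not the server's actual code):

import json

json.loads("{}")  # returns {} -- a valid (empty) arguments object
json.loads("")    # raises json.JSONDecodeError: Expecting value: line 1 column 1 (char 0)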

@wusskk
Author

wusskk commented May 21, 2025

Hi @ysolanky, I am using qwen3-32b, but through an API provided by a third party. As far as I know they deploy it with vLLM, though I don't know the exact launch command.
I have since switched to other API providers (SiliconFlow and Aliyun) and tested with the qwen3 series models there; the issue did not occur. I therefore suspect it is related to the deployment method.
Also, the vLLM-deployed qwen3 only hits this issue when streaming output is enabled; it runs normally with streaming disabled.
