
Is it possible to provide a detailed step-by-step guide on how to use this together with DeepSeek? Thanks #165


Open
tonywang78 opened this issue Apr 27, 2025 · 4 comments

Comments

@tonywang78

I tried to set it up to use DeepSeek, but it seems it cannot invoke the tools correctly; the right sidebar displays nothing.
Here are the error messages:

2025-04-27 21:14:19,784 - ERROR - Error processing stream: cannot access local variable 'complete_native_tool_calls' where it is not associated with a value
Traceback (most recent call last):
File "C:\Users\tonyw\Documents\GitHub\suna\backend\agentpress\response_processor.py", line 406, in process_streaming_response
if config.native_tool_calling and complete_native_tool_calls:
^^^^^^^^^^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'complete_native_tool_calls' where it is not associated with a value
2025-04-27 21:14:20,638 - INFO - Successfully added message to thread 73b689db-6c25-4051-b83f-49f40736e05c
2025-04-27 21:14:20,648 - INFO - Stream generator main loop cancelled for b1ced9b7-0833-4227-bb1e-f669393ee8ab
2025-04-27 21:14:20,649 - ERROR - Error in listener for b1ced9b7-0833-4227-bb1e-f669393ee8ab: Connection closed by server.
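
For context, this UnboundLocalError happens whenever a local variable is only assigned inside a branch that never runs and is then read afterwards. A minimal sketch of the same pattern, with purely illustrative names (not Suna's actual code):

```python
def process_stream(chunks, native_tool_calling=True):
    # The variable is only bound when a tool-call chunk actually arrives.
    for chunk in chunks:
        if chunk.get("type") == "tool_call":
            complete_native_tool_calls = [chunk]

    # On a stream with no tool calls the name was never bound, so this line
    # raises: UnboundLocalError: cannot access local variable
    # 'complete_native_tool_calls' where it is not associated with a value
    if native_tool_calling and complete_native_tool_calls:
        return complete_native_tool_calls
    return []

# A text-only stream triggers the error:
process_stream([{"type": "text", "content": "hi"}])
```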

@sahariarpku

How did you connect DeepSeek?

@tonywang78
Author

Thanks for the reply, here is what I have done:

  1. agent/api.py: add the DeepSeek model mapping.
"deepseek-chat": "openai/deepseek-chat",
"deepseek-reasoning": "openai/deepseek-reasoning",
  2. agent/run.py: switch the tool-calling mode:
                xml_tool_calling=False,
                native_tool_calling=True,
  3. chat-input.tsx: add the DeepSeek options and change the default state to deepseek-chat (it seems the current version doesn't provide a model switch button yet, or I missed it somewhere):
    { id: "deepseek-chat", label: "Deepseek v3" },
    { id: "deepseek-reasoning", label: "deepseek r1" }

  4. Update the .env file with openai_api_base, openai_api_key and model_to_use (a quick way to sanity-check these values is sketched below).
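
For the values in step 4, it can help to confirm the base URL and API key work outside Suna first. Here is a minimal sketch using the standard openai Python client against DeepSeek's OpenAI-compatible endpoint; the base URL and model name are my assumptions about a typical DeepSeek setup, not something taken from the repo:

```python
import os
from openai import OpenAI

# Assumed: DeepSeek exposes an OpenAI-compatible API at this base URL,
# and OPENAI_API_KEY holds your DeepSeek key (same values as in .env).
client = OpenAI(
    base_url=os.environ.get("OPENAI_API_BASE", "https://api.deepseek.com"),
    api_key=os.environ["OPENAI_API_KEY"],
)

# "deepseek-chat" is the upstream model name, i.e. the part after "openai/"
# in the mapping added above.
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(response.choices[0].message.content)
```

If this script fails, the problem is with the endpoint or key rather than with Suna's tool-calling path.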

@sahariarpku

I've added the initialization of complete_native_tool_calls as an empty list at the beginning of the method; this should fix the error you're seeing.
The error was occurring because the condition if config.native_tool_calling and complete_native_tool_calls: read the variable before any code path had assigned it. The initialization now sits alongside the other stream-state variables:

# ... existing code ...
accumulated_content = ""
tool_calls_buffer = {}
current_xml_content = ""
xml_chunks_buffer = []
pending_tool_executions = []
yielded_tool_indices = set() # Stores indices of tools whose status has been yielded
tool_index = 0
xml_tool_call_count = 0
finish_reason = None
last_assistant_message_object = None # Store the final saved assistant message object
tool_result_message_objects = {} # tool_index -> full saved message object
has_printed_thinking_prefix = False # Flag for printing thinking prefix only once
complete_native_tool_calls = [] # Store complete native tool calls for processing
# ... existing code ...
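
With that line in place, the check at response_processor.py line 406 (if config.native_tool_calling and complete_native_tool_calls:) should simply short-circuit to false on streams that never produce a native tool call, instead of raising the UnboundLocalError.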

@sahariarpku

Also, make sure uvicorn and fastapi are installed: pip3 install uvicorn fastapi
