Integration of chat llm behavior #112
base: ros2
Conversation
@mamaheux @philippewarren Can you have a final look at this version? I would merge it into the ros2 branch and continue improving it later in the summer.
Some comments from the first review are still unaddressed; they were hidden in the "Expand hidden conversation" section.
```python
def load_prompts_into_history(self, prompts: dict) -> bool:
    """Load prompts from a dict"""
    if isinstance(prompts, list):
```
Is it a dict or a list? Is the type annotation wrong?
Oops, list it is. Will fix.
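For reference, a minimal sketch of the corrected annotation (the class and body here are hypothetical, not the actual `chat_node.py` implementation):

```python
from typing import Any


class PromptHistory:
    """Hypothetical container standing in for the real chat node state."""

    def __init__(self) -> None:
        self._history: list[dict[str, Any]] = []

    def load_prompts_into_history(self, prompts: list) -> bool:
        """Load prompts from a list of message dicts."""
        if not isinstance(prompts, list):
            return False
        self._history.extend(prompts)
        return True
```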
```diff
 RCLCPP_WARN(rclcpp::get_logger(NODE_NAME),
     fmt::format("Volume cannot be lower than 0. Will decrese by {0} instead.", baseStatusMsg->volume).c_str());
 }
-uint8_t volume = std::max<uint8_t>(baseStatusMsg->volume - amount, 0);
+uint8_t volume = baseStatusMsg->volume - amount;
```
Do we want a similar handling in volume_up with the maximum volume?
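To illustrate, a Python sketch of the symmetric clamping being suggested (`MAX_VOLUME` and the function names are assumptions, not values from the real node):

```python
MAX_VOLUME = 100  # assumed maximum; the real limit lives in the base node


def clamped_volume_down(current: int, amount: int) -> int:
    # Clamp at 0 instead of letting unsigned arithmetic wrap around.
    return max(current - amount, 0)


def clamped_volume_up(current: int, amount: int) -> int:
    # Symmetric handling: clamp at the maximum instead of overshooting.
    return min(current + amount, MAX_VOLUME)
```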
```diff
-def _remove_think_tags(partial_message: str) -> str:
+def _remove_think_tags(self, partial_message: str) -> str:
     # TODO better handling of <think></think> tags over multiple partial messages.
     return re.sub(r"<think>.*</think>", "", partial_message, flags=re.DOTALL)
```

```diff
-def _replace_enumeration_characters(partial_message: str) -> str:
+def _replace_enumeration_characters(self, partial_message: str) -> str:
```
Or @staticmethod?
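Note that the TODO about tags spanning multiple partial messages pulls the other way: handling that case needs state between calls, which `@staticmethod` cannot hold. A possible stateful approach (class name and API are hypothetical, just a sketch):

```python
class ThinkTagFilter:
    """Strips <think>...</think> spans that may cross partial-message boundaries."""

    def __init__(self) -> None:
        self._inside_think = False  # carries over between partial messages

    def filter(self, partial_message: str) -> str:
        out = []
        remaining = partial_message
        while remaining:
            if self._inside_think:
                end = remaining.find("</think>")
                if end == -1:
                    break  # whole remainder is inside a think block; drop it
                remaining = remaining[end + len("</think>"):]
                self._inside_think = False
            else:
                start = remaining.find("<think>")
                if start == -1:
                    out.append(remaining)
                    break
                out.append(remaining[:start])
                remaining = remaining[start + len("<think>"):]
                self._inside_think = True
        return "".join(out)
```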
```cpp
RCLCPP_WARN(
    rclcpp::get_logger(NODE_NAME),
    fmt::format(
        "Volume cannot be higher than {0}. Will increased by {1} instead.",
```
"Volume cannot be higher than {0}. Will increased by {1} instead.", | |
"Volume cannot be higher than {0}. Will increase by {1} instead.", |
fmt::format("Volume cannot be lower than 0. Will decrese by {0} instead.", baseStatusMsg->volume).c_str()); | ||
RCLCPP_WARN( | ||
rclcpp::get_logger(NODE_NAME), | ||
fmt::format("Volume cannot be lower than 0. Will decrese by {0} instead.", amount).c_str()); |
fmt::format("Volume cannot be lower than 0. Will decrese by {0} instead.", amount).c_str()); | |
fmt::format("Volume cannot be lower than 0. Will decrease by {0} instead.", amount).c_str()); |
```sh
# However, if you have an old version of openai library and you see error messages, install it manually
pip3 install openai
```
We need to investigate this further to confirm the cause: an old openai version did not seem to be the problem.
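One quick check while investigating (a debugging step, not part of the PR) is to log which version the node actually imports at runtime:

```python
import openai

# Confirms which openai package the node actually imports at runtime.
print(openai.__version__)
```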
The `chat_node.py` is designed to be used in conjunction with a speech-to-text (STT) module and a text-to-speech (TTS) module. Listening or speaking is performed using an external HBBA node which controls desires, strategies and filters. A demo application can be found in the [demos/chatbot](../../demos/chatbot/README.md) folder.

### Requirements
Should we include these in the setup script?
```python
self._transcript_sub = self.create_subscription(
    Transcript,
    "speech_to_text/transcript",
    self._on_transcript_received_cb,
    1,
    callback_group=self._subscriber_callback_group_transcript,
)
```
It should be an HbbaSubscriber to prevent the ChatNode from processing transcriptions when only the STT is enabled.
Yes, will do.
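As a rough illustration of the gating behavior being asked for (this is not the real hbba_lite `HbbaSubscriber` API; `Bool`, `String`, and the `chat/filter_state` topic are stand-ins for the actual filter-state and `Transcript` interfaces):

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import Bool, String


class GatedTranscriptNode(Node):
    """Sketch: drop transcripts while the HBBA filter is disabled."""

    def __init__(self) -> None:
        super().__init__("gated_transcript_node")
        self._enabled = False  # stay inert until the filter state enables us
        self.create_subscription(Bool, "chat/filter_state", self._on_filter_state, 1)
        self.create_subscription(String, "speech_to_text/transcript", self._on_transcript, 1)

    def _on_filter_state(self, msg: Bool) -> None:
        self._enabled = msg.data

    def _on_transcript(self, msg: String) -> None:
        if not self._enabled:
            return  # filter off: ignore the transcript entirely
        self.get_logger().info(f"Transcript accepted: {msg.data}")
```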
```cpp
: Strategy<ChatDesire>(
      utility,
      {{"sound", 1}},
      {{"talk/filter_state", FilterConfiguration::onOff(FilterConfiguration::DefaultState::DISABLED)},
```
Add the chat node transcript filter (see the previous comment). It should be enabled by default.
@mamaheux, @philippewarren,
I need your advice: the chat behavior node blocks after a service call and never receives "Talk Done".
Flow:
Transcript -> Process LLM (stream call to OpenAI/ollama) -> Call tools -> Send Talk message -> Talk Done (not received) -> Send Chat Done
When no tool is called, it works.
```
[INFO] [1744814421.762334852] [chat_node]: Transcript received: volume de 10
[INFO] [1744814428.478169379] [chat_node]: Processing...
[INFO] [1744814431.747961959] [chat_node]: Calling external service: volume_up with arguments: {"amount":10}
[INFO] [1744814436.772234745] [chat_node]: Service call result: behavior_srvs.srv.ChatToolsFunctionCall_Response(ok=True, result='{"status": "Volume increased"}')
send_request_and_process_response done
send_request_and_process_response done
[INFO] [1744814437.194302523] [chat_node]: Processing done!
[INFO] [1744814437.196909401] [chat_node]: Sending talk message: Le volume a été augmenté de 10.
---> Waiting for Talk Done that never occurs...
```
`send_request_and_process_response` is called recursively when there is a tool function call (the API is called a second time to send the tool result).
I tried calling the service from a different thread, with no success. I also used the MultiThreadedExecutor and called spin_until_future_complete, with the same result. I tried different callback groups, same result.
Any idea?
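One common cause of this pattern in rclpy: the callback that would deliver "Talk Done" shares a mutually exclusive callback group (often the default one) with the callback that is blocked waiting for it, so the executor can never run it. Below is a minimal, self-contained sketch of the arrangement that does unblock; topic names and message types are placeholders, not the real chat node interfaces:

```python
import threading

import rclpy
from rclpy.callback_groups import MutuallyExclusiveCallbackGroup
from rclpy.executors import MultiThreadedExecutor
from rclpy.node import Node
from std_msgs.msg import Empty, String


class TalkDoneWaiter(Node):
    """Sketch: block in one callback group while another stays runnable."""

    def __init__(self) -> None:
        super().__init__("talk_done_waiter")
        self._talk_done = threading.Event()
        # Distinct groups: "talk done" can be delivered even while the
        # transcript callback below is blocked waiting for it.
        transcript_group = MutuallyExclusiveCallbackGroup()
        done_group = MutuallyExclusiveCallbackGroup()
        self.create_subscription(String, "transcript", self._on_transcript, 1,
                                 callback_group=transcript_group)
        self.create_subscription(Empty, "talk_done", self._on_talk_done, 1,
                                 callback_group=done_group)
        self._talk_pub = self.create_publisher(String, "talk", 1)

    def _on_transcript(self, msg: String) -> None:
        self._talk_done.clear()
        self._talk_pub.publish(String(data=f"Echo: {msg.data}"))
        # Blocking here is safe only because _on_talk_done runs in another
        # group on a MultiThreadedExecutor; in the same group this deadlocks.
        if not self._talk_done.wait(timeout=30.0):
            self.get_logger().warn("Timed out waiting for talk done")

    def _on_talk_done(self, _msg: Empty) -> None:
        self._talk_done.set()


def main() -> None:
    rclpy.init()
    node = TalkDoneWaiter()
    executor = MultiThreadedExecutor()
    executor.add_node(node)
    executor.spin()


if __name__ == "__main__":
    main()
```

If the groups are already separate, it may be worth checking whether the recursive `send_request_and_process_response` call re-enters a group that the blocked wait is still holding, since the second API round-trip happens while the first callback has not returned.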