Telegram Bot: Conversation with Assist (and others like OpenAI)

:face_with_monocle: About this blueprint

Ok, so… Home Assistant has recently made huge improvements in the implementation of conversation agents that allow us (users) to chat with Home Assistant and even to execute actions from text messages!

But, what if we could talk to the conversation agents directly from Telegram? THIS IS POSSIBLE WITH THIS BLUEPRINT! :confetti_ball:

This blueprint uses the Telegram Bot to chat with a Conversation Agent. The result is that you can chat with the default Home Assistant Assist or with the OpenAI Conversation Agent to respond to messages.

Technical Details
  • Type of blueprint: AUTOMATION

  • Minimum Home Assistant version: 2024.6.0

Limitations

For now, it responds to all received text messages, and only text messages. It doesn’t respond to commands, images, voice messages…

:gear: Configuration

Requirements

Input fields

  • Conversation Agent :asterisk:: REQUIRED Home Assistant conversation agent to respond to messages.

  • Response options:

    • Reply in private?: Whether the answer should be sent in private or group chat. This option has no effect when talking directly to the bot in a private chat.
  • Filter: Respond to messages coming only from certain users or chats.

    • Filter user_id: Filter messages received from these user IDs.
    • Filter chat_id: Filter messages received from these chat IDs.
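As a rough sketch of the idea behind these filters, here is a hypothetical standalone automation (not the blueprint itself) that reacts to the `telegram_text` event fired by the Telegram Bot integration and only proceeds for allowed IDs. The user and chat IDs are placeholders:

```yaml
# Hypothetical automation illustrating the blueprint's filtering idea:
# react to incoming Telegram text, but only from allowed user/chat IDs.
automation:
  - alias: "Telegram text filtered by sender"
    trigger:
      - platform: event
        event_type: telegram_text   # fired by the Telegram Bot integration
    condition:
      # Placeholder IDs: replace with the user/chat IDs you want to allow
      - condition: template
        value_template: >
          {{ trigger.event.data.user_id in [123456789] or
             trigger.event.data.chat_id in [-987654321] }}
    action:
      - service: telegram_bot.send_message
        data:
          target: "{{ trigger.event.data.chat_id }}"
          message: "Message received!"
```

In the blueprint, the same kind of condition is driven by the input fields above instead of hard-coded lists.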

:pushpin: Related blueprints

:heavy_plus_sign: Other blueprints

:arrow_down: How to get this blueprint?

Click the badge to import this Blueprint:


Or you can also:

  • Copy-paste the content from this link to a new file inside your configuration’s blueprints/automation folder.

Changelog

  • 2024-08-16: First release.

Thank you for this, it works great! I love that you included exclusion and inclusion lists. I am piping it into the main Telegram channel for my HA; I have multiple bots running and multiple users using it. Everything just worked right out of the box, with no conflicts with other bots.


Thanks @skynet01 for your feedback! I appreciate knowing how this blueprint can be useful in different environments :upside_down_face:

Sorry for my ignorance, but I don’t get how this is supposed to work. I wish to chat with my bot through Telegram and get a response from my conversation agent. Is that what this blueprint does?
Because if I chat with the bot, I don’t get any answer back.

I got it to work; it only works using polling, not with webhooks.
It looks like this way there’s no chat history; every new line is like a new chat. Is there no way of keeping the chat history like the Assist frontend in the dashboard does?

This is so awesome, thank you!


Hi @Herian !

I’m glad you made it work. It’s interesting what you mention about “polling” vs “webhooks”. I think nothing should change in the blueprint to support one or the other. My Telegram Bot is using “polling”. I didn’t try the webhook option and cannot tell if it works for me. I’ll investigate.

About the chat history, the conversation.process action used in this blueprint has an attribute called conversation_id that is used to provide the history. It is configured in the blueprint, so chats should be context-aware. You can try it by saying “From now on your name will be…”, and then asking for its name. Other parameters that might influence the response are the length of the history and whether there is a maximum age for messages. These parameters depend on the conversation integration, and there is no way to control them from the conversation.process action, and thus from this blueprint.
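For reference, a minimal sketch of what such a conversation.process call looks like in automation YAML. The agent_id is a placeholder you would replace with your own agent, and the conversation_id pattern is an illustration of keeping one context per Telegram chat:

```yaml
# Minimal sketch of calling conversation.process from an automation action.
# conversation_id keeps follow-up messages in the same chat context.
service: conversation.process
data:
  text: "{{ trigger.event.data.text }}"
  agent_id: conversation.openai_conversation   # placeholder: your agent
  conversation_id: "telegram_{{ trigger.event.data.chat_id }}"
response_variable: agent_response
```

The agent's reply is then available in `agent_response` and can be sent back with telegram_bot.send_message.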

I hope it helps!

Thank you @marc-romu
For the webhook issue, I think the problem has something to do with SSL or server config. In the ‘default’ mode it looks like the bot is only able to receive but is not sending any data (that’s what the Node-RED Telegram node docs say: node-red-contrib-telegrambot/WEBHOOK.md at master · windkh/node-red-contrib-telegrambot (github.com)).

About the history, I can confirm that it is not working. You mentioned the length of the history; I’ve done many tests, so the chat is filled with lots of useless messages. I think that could be the cause. Is there a way to delete the history and start over?

Hi @Herian !

Which conversation agent are you using? I don’t think I can do much from the blueprint, because it just links the telegram bot with the conversation agent, but I’ll try to reproduce your problem.

I checked it with the OpenAI conversation agent and it works fine.

I’m using the Extended OpenAI Conversation agent, so I think it works like the OpenAI one. As I said, I think I’ve reached the history length limit. Do you know a way to clear it and start over?

Hi @Herian,

I don’t know how conversation agents are set up, but I guess they would take the last X messages as the history. So, conversation agents should remember the last information and forget about the old one. At least this is my experience with the OpenAI conversation agent in Home Assistant.

I hope it helps!

This is what ChatGPT suggested to me, but I don’t get why you got it working with history while in my case it does not work.

The blueprint you provided is designed to facilitate conversations between a Telegram bot and Home Assistant via a conversation agent, such as OpenAI’s model. It includes functionality for filtering messages based on user or chat IDs and allows sending responses either privately or publicly.

Here’s the key part related to the conversation memory:

```yaml
variables:
  conversation_id: telegram_{{ telegram_chat_id }}
```

The conversation_id is dynamically generated based on the chat ID, ensuring a unique ID for each conversation. However, OpenAI’s GPT models (like the one you’re using) do not maintain memory of previous interactions unless explicitly managed on the backend.

The blueprint is using the conversation_id correctly to try and maintain a session, but it relies on how the conversation agent handles this ID. In this case, OpenAI’s GPT-based systems generally treat each request independently unless explicitly designed to store or retrieve previous conversation contexts, which Home Assistant does not natively do for OpenAI models.

To resolve the issue, you may need to implement a separate mechanism for storing and retrieving conversation history (e.g., by using a database or Home Assistant’s persistent states) or switch to a conversation agent that supports memory across sessions. ​

I’m using the Extended OpenAI Conversation agent instead of the core OpenAI one. Maybe that’s the issue?

I forgot to mention that if I use the Assist frontend in the dashboard, it correctly maintains the conversation history, unless I close the session and start over.

I also saw in the HA conversation logs that the token count keeps increasing with every interaction. It could be useful if you could set a limit somehow, like keeping only the last X lines of dialogue or something like that. Otherwise the cost will skyrocket really soon.

Hi @Herian ,

This blueprint only takes the received text messages from Telegram and processes them using the conversation.process action (Conversation - Home Assistant). I cannot add extra functionalities to the blueprint that cannot be handled by this action. The only way to handle chat history is by defining the conversation_id, which I did. How this ID is used to create the context for the conversation agent depends on the configuration of the agent, not the blueprint.

I asked my agent via Telegram which was the first message of the conversation, and it returned a message from yesterday, so the agent has history and is truncating the chat. Additionally, I checked my OpenAI API and I didn’t experience any significant increase in the token consumption per message in the past two months. Keep in mind that the more entities you expose to the agent, the more tokens you’ll need.

I’d suggest you try the official OpenAI integration. It appears that the unofficial extended integration is handling history differently, or it might have some issue with the conversation.process action.

I’m sorry, I think I cannot help you more. This blueprint seems to work as expected.

With the official OpenAI integration the history works correctly, not with the extended one.
Thank you for your investigation and the time you spent on it.

Unfortunately, it looks like the extended one is not maintained anymore. I prefer that one because you can specify tools and functions, with better results and fewer tokens used.

Thank you again @marc-romu

The extended one has a fork that fixes the memory issue. Just look at the issues on GitHub; the memory issue links to a fork with the memory implementation.

Great, now it works! Thank you!
Just a little bit off topic: I see a lot of forks of the extended one, but I’m not a GitHub expert. How can I check what’s different from the main branch?

In the GitHub repo, click on the commit link (it says something like “This branch is X commits ahead…”) and it will show the file differences right there. I think only one file was changed, so you can just edit it yourself.