[Custom Component] extended_openai_conversation: Let's control entities via ChatGPT

Does this component work for you guys? I lost the settings button in the UI. And when reinstalling the custom component I lost my settings too :man_facepalming:

Edit: ah, my bad. The config is under the integration's Configure button.

Has anyone tried to ask for weekday or for a random word?
I don't know if it's ChatGPT or the integration, but if I ask which day of the week it is today, I get a different (wrong) answer every time. If I ask for a random word, it always answers with the same one.

Works with my config:

I'm using gpt-4o-mini with this system prompt:

You are an assistant controlling a Home Assistant instance.
Current time: {{ now() }}
Your operating instructions are:
1. When I want to know the current state of something, use the information in these entity state tables in your answer:

{% set ns = namespace(ents=[]) -%}
{%- for e in exposed_entities -%}
  {%- set ns.ents = ns.ents + [{
    "entity_id": e.entity_id,
    "name": e.name,
    "state": e.state,
    "area": (area_name(e.entity_id) or ""),
    "aliases": e.aliases,
  }] -%}
{%- endfor -%}
{%- set sorted_list = (ns.ents) | sort(attribute="area") -%}
{% for area, entities in sorted_list|groupby("area") %}
{% if not area %}
Entities without configured area:
{%- else %}
Entities in: {{ area }}
{%- endif %}
```csv
entity_id;name;state;aliases
{%- for e in entities %}
{{ e.entity_id }};{{ e.name }};{{ e.state }};{{e.aliases | join('/')}}
{%- endfor %}
```
{%- endfor -%}

2. If I ask about an area, reply by telling me the current states of the entities in that area.
3. If I ask you to control something, you may use one of the tools available to you.
4. If I query or converse about something unrelated to the house, just reply to the best of your abilities using what you know is true.

Your replies should be casual, short and concise, mainly sticking to "this: that" type of responses. Don't repeat part of my question in your answer. Use short forms of the entity names.

Examples:
User: "Turn on the lights in the office"
Assistant (calls service light.turn_on): "Office lights on"

User: "Whats going on in the garage?"
Assistant (uses information from the provited entity tables): "Lights on
Motion detected
Garage door closed,
Temperature: 20Ā°"

User: "How long has the TV been on for?"
Assistant (uses function get_history): "24 minutes"

User: "Is it going to rain all day?"
Assistant: "No rain between 1 and 3 this afternoon" or "No rain after 4"

It's funny because the first random word is always the same: serendipity.
Try restarting the chat, and you will always get the same one.

I'm experiencing poor intent recognition with the same prompt over the last 2 or 3 days. It says it cannot find a sensor (one that doesn't exist, since weather is not a sensor), or it tries to call a service when it shouldn't (it tried to call a sensor service instead of reading the sensor).
Never had this problem with the same prompt/configuration before.
In the log I see a new thing called 'reasoning token'. Was that always there and I didn't notice, or is OpenAI changing something?

I found that OpenAI has launched 2 new reasoning models; that's why we have reasoning tokens now.
I wonder if they are better for our use case, and if the problems I had with gpt-4o-mini recently are somehow related.

Ha! You're right, it gives me the same word again. Weird. But then again, maybe not. I don't know how these things work.
But I recognize the behavior you describe from my earlier config with gpt-3.5.
Did you try my system prompt? It seems to work well for me since gpt-4*.

Did you by any chance try the new models?

No, but they seem expensive :slight_smile:

This is amazing and works great in text Assist and the Android companion app. But…

Like several people above, I'm looking for a way to hold a conversation using a Wyoming satellite. What I need is a way to distinguish whether the conversation has ended and the command has been executed, or whether the system asked for additional data.

I'm using this fork of Wyoming Satellite, and it has an event that returns data depending on whether an intent was recognized. An automation triggered by this event simulates wake word recognition.

Can anything similar work with this component?

P.S. Sorry for my English. It's late and I'm not a native speaker :slight_smile:

Hi everyone. I have some questions about programming or giving instructions. ChatGPT won't call services, even if I ask it to do so.
For example, I want it to set the climate to some temperature, but it cannot use the service climate.set_temperature. How can I get ChatGPT to use that?
The same goes for a cat feeder, where it needs to put a number in, and so on.

Second: when I create an automation that a text phrase should trigger, it doesn't work when I say the "trigger word". Do I need anything more?

Third: how can I ask it to trigger an automation? It always tells me that it started it, but nothing happens. And again here: when I tell it which service it needs to call, it ignores it.

Perhaps someone can help me with these "easy" tasks.

And just to be clear: all those functions in here go into the configuration YAML under functions, right? Because I think mine are not working.

It would be easier to understand your problem if you shared your code here.

Hi there,

So the service climate.set_temperature won't work, since it's not implemented in the Nabu API; I saw an issue on GitHub.

But the question of how I can use ChatGPT to start automations is still open.
I don't have any code yet, since I don't know how to do or program that. If I tell it or write to execute or start the automation, it doesn't work; it thinks it started it, but nothing happened.

About the functions: I have everything in function.yaml, and configuration.yaml references it… But I think that's perhaps the wrong place to put the functions for ChatGPT?

functions:
  #### Shopping Cart
  - spec:
      name: add_item_to_list
      description: Add item to a list
      parameters:
        type: object
        properties:
          item:
            type: string
            description: The item to be added to the list
          list:
            type: string
            description: The entity id of the list to update
            enum:
              - todo.shopping_list
              - todo.to_do
        required:
          - item
          - list
    function:
      type: script
      sequence:
        - service: todo.add_item
          data:
            item: '{{ item }}'
          target:
            entity_id: '{{ list }}'

  - spec:
      name: remove_item_from_list
      description: Check an item off a list
      parameters:
        type: object
        properties:
          item:
            type: string
            description: The item to be removed from the list
          list:
            type: string
            description: The entity id of the list to update
            enum:
              - todo.shopping_list
              - todo.to_do
        required:
          - item
          - list
    function:
      type: script
      sequence:
        - service: todo.update_item
          data:
            item: '{{ item }}'
            status: 'completed'
          target:
            entity_id: '{{ list }}'

  - spec:
      name: get_items_from_list
      description: Read back items from a list
      parameters:
        type: object
        properties:
          list:
            type: string
            description: The entity id of the list to update
            enum:
              - todo.shopping_list
              - todo.to_do
        required:
          - list
    function:
      type: script
      sequence:
        - service: todo.get_items
          data:
            status: 'needs_action'
          target:
            entity_id: '{{ list }}'
          response_variable: _function_result

You should put these functions into the conversation agent's configuration, in its functions section. You don't need to create an automation. The AI should call the service specified in the function.
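
As a rough, untested sketch in the same format as your list functions (climate.living_room is just a placeholder entity ID), a thermostat function could look like this:

```yaml
- spec:
    name: set_climate_temperature
    description: Set the target temperature of a climate entity
    parameters:
      type: object
      properties:
        temperature:
          type: number
          description: The target temperature in degrees
        entity_id:
          type: string
          description: The climate entity to control
          enum:
            - climate.living_room
      required:
        - temperature
        - entity_id
  function:
    type: script
    sequence:
      # Calls the standard Home Assistant climate.set_temperature service
      - service: climate.set_temperature
        data:
          temperature: '{{ temperature }}'
        target:
          entity_id: '{{ entity_id }}'
```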

Can you show me where, or guide me there?
Under "Configuration" on the Extended OpenAI Conversation integration, I don't have the function option.

The AI is not able to call the function to start an automation. That's weird. It thinks it did, but it doesn't work.
The same goes for an automation that uses a "conversation" trigger. That does not work either, as in this example (the trigger phrases are German for "Feed the cats" / "Give the cats food"):

trigger: conversation
command:
  - Füttere die Katzen
  - Gib den Katzen zu essen

When I put the following in the instructions, it does not work either:
"5. If I ask you to start an automation, please use the automation.trigger service."

What integration are you using? I'm using the Extended OpenAI Conversation integration, and the function section is right below the temperature slider.

The poster's picture shows the built-in OpenAI Conversation integration, not the custom one.

Hi!

When I ask ChatGPT to carry out an action and there may be a choice to make in a particular situation, I would like it to ask me the question.

How can I do this in scripts?

Thank you guys. I used the wrong OpenAI conversation integration… now I have all the functions!

Does anyone have an idea about the automation problem I have?

So… honest question: is anyone able to get this to work in a stable and consistent way? I have been tweaking things for months, but honestly, it seems completely random whether things are going to work or not. Sometimes the system sets the colour of a light, sometimes it says the device doesn't support colour; sometimes it dims a dimmable light, sometimes it doesn't. Even with the exact same command in the span of the same session. This seems like a fun toy, but nothing you can actually use at this point. Definitely not the creator's fault; it comes down to the AI models just not being able to do this reliably, it seems.
If anyone has been able to get this to a point of being stable, what was your process?

Hi!
Perhaps it's hard for OpenAI to initiate the event that will launch the automation.

I think it's better to put all the actions of the automation in a script, put a call to the script in the automation, and let OpenAI simply call the script.
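
A minimal sketch of that idea, using the function format shown earlier in the thread (script.feed_the_cats and number.cat_feeder_portions are made-up names, so adjust them to your own setup):

```yaml
# In configuration.yaml / scripts.yaml: the script holds the actual actions
script:
  feed_the_cats:
    alias: Feed the cats
    sequence:
      - service: number.set_value   # placeholder action for a feeder
        target:
          entity_id: number.cat_feeder_portions
        data:
          value: 1

# In the integration's functions section: expose the script as a tool
- spec:
    name: feed_the_cats
    description: Dispense one portion of cat food
    parameters:
      type: object
      properties: {}
  function:
    type: script
    sequence:
      - service: script.feed_the_cats
```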