[Custom Component] DISCONTINUED ChatGPT v1.1.1

This is so great! If you don’t mind, please share some use cases so we can gather inspiration. My first thought is to send a message when a family member is leaving work, with their ETA!

What is it about the automation that you don’t understand? Maybe I can help.

OMG Thanks so much for this!!! It works really well.

Here is a funny example I made. Every time the relative humidity goes above 75%, it tells a joke about the fact that it may be raining soon!

alias: Rain is coming
trigger:
  - type: humidity
    platform: device
    device_id: <<your fav weather station>>
    entity_id: sensor.<<put your relative humidity from your fav station>>
    domain: sensor
    above: 75
condition: []
action:
  - parallel:
      - alias: call chatgpt
        sequence:
          - delay:
              hours: 0
              minutes: 0
              seconds: 0
              milliseconds: 100
          - service: chatgpt.chat
            data:
              messages:
                - role: user
                  content: >-
                    explain in a short 3 line story that the humidity is above
                    75% and that it means it may rain soon. Use humour.  
              callback_id: "{{this.context.id}}"
      - alias: call tts
        sequence:
          - wait_for_trigger:
              - platform: event
                event_type: return_value
                event_data:
                  callback_id: "{{this.context.id}}"
            timeout:
              hours: 0
              minutes: 0
              seconds: 30
              milliseconds: 0
            continue_on_timeout: false
          - service: tts.google_translate_say
            data:
              language: en
              entity_id: media_player.kitchen_display
              message: "{{wait.trigger.event.data.content | trim | replace('\"','')}}"
mode: single


I just released v1.1.0 which makes it possible to configure the response event type. The default is still return_value for backwards compatibility.

@teitelbot This should be easier to work with now if you look at the example with 2 automations.


Besides the calendar reminders that are in the example, I also have a separate reminder for bedtime. It includes the current time which is surprisingly helpful for someone who loses track of time constantly.

Oh and also I get custom messages when my laundry is done.


Is it possible to give the content some context, like “give me a professional weather report. For your information, the current temperature is <<sensor.weather.temperature>> etc. etc…”? Something like that?

@jeroen.nijssen


  - parallel:
      - alias: call chatgpt
        sequence:
          - delay:
              hours: 0
              minutes: 0
              seconds: 0
              milliseconds: 100
          - service: chatgpt.chat
            data:
              messages:
                - role: user
                  content: "{{message.replace('$style', styles | random)}}"
              callback_id: "{{this.context.id}}"
      - alias: call tts
        sequence:
          - wait_for_trigger:
              - platform: event
                event_type: return_value
                event_data:
                  callback_id: "{{this.context.id}}"
            timeout:
              hours: 0
              minutes: 0
              seconds: 60
              milliseconds: 0
            continue_on_timeout: false
          - service: tts.google_translate_say
            data:
              language: en-au
              entity_id: media_player.living_room
              message: "{{wait.trigger.event.data.content | trim | replace('\"','')}}"


variables:
  styles:
    - happy
    - upbeat
    - motivational
    - funny
    - epic
    - dark humor
    - polite
  message: >-
    Give me a $style weather forecast for today based on:
    Condition: {{states("weather.knmi_spijkenisse")}}.
    Temperature min: {{state_attr("weather.knmi_spijkenisse","forecast")[1].templow|round}} degrees.
    Temperature max: {{state_attr("weather.knmi_spijkenisse","forecast")[1].temperature|round}} degrees.
    Percent chance of sun: {{state_attr("weather.knmi_spijkenisse","forecast")[1].sun_chance|round}}.
    Percent chance of rain: {{state_attr("weather.knmi_spijkenisse","forecast")[1].precipitation_probability|round}}.
    {% if state_attr("weather.knmi_spijkenisse","forecast")[1].wind_bearing != "null" -%}
    Wind bearing: {{state_attr("weather.knmi_spijkenisse","forecast")[1].wind_bearing|round}}.
    {%- else -%}
    Wind bearing: 0.
    {%- endif %}
    Wind speed: {{state_attr("weather.knmi_spijkenisse","forecast")[1].wind_speed|round}} km/h.
    Visibility: {{state_attr("weather.knmi_spijkenisse","visibility")|round}} km.
    Start with good morning/day/afternoon/evening; it is now {{now().strftime('%H:%M')}}.

Gives me 7 different styles randomly based on the context provided by knmi for today’s forecast :wink:

Thanks go to @jjbankert for the random idea :grin:


Wow, thanks!! :smiley: Exactly what I needed.

@jeroen.nijssen Looking at your name, I figured it would :grin:

Great work here! Awesome addition to HA

Hey all,

this project will no longer be maintained. The responding services feature in Home Assistant’s 2023.7 release enables me to use the OpenAI Conversation integration for all my personal use cases.

Thanks to everyone who responded with issues, pull requests and on the Home Assistant community thread.

I’ll show here how you can modify Example 2: Single Automation to use the new native feature. Overall it’s much simpler than before, which is nice. The only tricky part is figuring out the agent_id: go to Developer Tools → Services, pick conversation.process, select your OpenAI conversation agent from the Agent drop-down in UI mode, and then switch to YAML mode to see the id.

alias: Say With ChatGPT
trigger:
  - platform: event
    event_type: your_trigger_event_type
action:
  - service: conversation.process
    data:
      agent_id: <agent_id goes here>
      text: Write a happy, one line, spoken language reminder for the 'cleaning' calendar event.
    response_variable: chatgpt
  - service: notify.mobile_app_<your_device_id_here>
    data:
      message: TTS
      data:
        tts_text: "{{chatgpt.response.speech.plain.speech | trim | replace('\"','')}}"

I just keep getting “Sorry, I didn’t understand” responses from the AI when using your suggestion here.
I have set up the OpenAI integration with the default settings. Have I missed something?

Sounds like you maybe haven’t configured the correct conversation agent?

Also, for my use cases it works fine to replace the system message (the “prompt template” in the OpenAI config), which is sent with every request, with a single space (' '). That saves tokens.

I had your old app working great; now I’m trying to use your new example to speak through my Google Mini with the built-in integration. It runs with no errors, but it doesn’t say anything. Any ideas?

Thanks

alias: Say With ChatGPT
trigger: []
action:
  - service: conversation.process
    data:
      agent_id: (hidden)
      text: Write a happy, one line, spoken language reminder to feed the dogs.
    response_variable: chatgpt
  - service: tts.google_translate_say
    data:
      language: en
      entity_id: media_player.living_room_speaker
      message: TTS
      data:
        tts_text: "{{chatgpt.response.speech.plain.speech | trim | replace('\"','')}}"

Are you running 2023.7.2 (or higher)? I ran into nasty issues with 2023.7.0 that were fixed in .2.

Also, a general tip is to run both steps in the developer tools to see whether the separate actions work as intended.

Thanks. I haven’t used the developer tools before, so I’m not sure what to look at in there. But I did grab the traces of the automation running. Also, this is on Yellow hardware.

[Screenshot: trace of the automation run, 2023-07-16]


Looks like the chatgpt part is working and the data is passed to the next action, so that’s great!

So I guess there’s an issue with the TTS action parameters. See if you can get the TTS to work in the Services tab of the developer tools, then use those settings in the automation (but with the ChatGPT output). If you get stuck, googling for a tutorial should give more detailed guides for the dev tools.
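For example, here is a minimal tts.google_translate_say call you can paste into Developer Tools → Services (the speaker entity is taken from the automation above; adjust it to your own):

```yaml
service: tts.google_translate_say
data:
  entity_id: media_player.living_room_speaker
  message: Hello from the developer tools
```

If this plays on the speaker, swap the message for the ChatGPT template in the automation.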


Thanks for the tip. Easy when you know where to look/test!

For anyone else with this issue, here is the working code:

alias: Say With ChatGPT
trigger: []
action:
  - service: conversation.process
    data:
      agent_id: (hidden)
      text: Write a random happy, one line, spoken language reminder to feed the dogs.
    response_variable: chatgpt
  - service: tts.google_translate_say
    data:
      entity_id: media_player.living_room_speaker
      message: "{{chatgpt.response.speech.plain.speech | trim | replace('\"','')}}"

Question for you, @jjbankert

I’m writing an app for Hubitat to use chatGPT like you did here. I have it connecting but only getting gibberish back, like it’s quoting parts of a book. Wondering if you could look at this code snippet and see if I’m hitting the wrong endpoint or missing something. Thanks for looking!

    def message = "write a random happy reminder to feed the dogs"
    def apiUrl = "https://api.openai.com/v1/completions"
    def headers = [
        "Authorization": "Bearer ${apiKey}",
        "Content-Type": "application/json"
    ]
    def requestBody = [
        model: "davinci",
        prompt: message,
        max_tokens: 150,
        top_p: 1,
        temperature: 1,
        stop: "\n"
    ]

    httpPostJson(uri: apiUrl, body: requestBody, headers: headers) { response ->
        // ... handle response ...
    }

A bit late but hopefully useful to someone.

The “davinci” model is GPT-3.0, iirc, so you might want to use a newer one. Older GPT models generate text that is grammatically correct but nonsense content-wise.

Maybe also check the temperature value. I got GPT-3.5 to return random PDF texts when I cranked the temperature.
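For reference, the newer chat models are served from the chat completions endpoint (https://api.openai.com/v1/chat/completions), which takes a messages array instead of prompt. A sketch of the request body, with gpt-3.5-turbo as just one current model option:

```json
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {"role": "user", "content": "write a random happy reminder to feed the dogs"}
  ],
  "max_tokens": 150,
  "temperature": 0.7
}
```

The generated text then comes back under choices[0].message.content rather than choices[0].text.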