ChatGPT Integration for Home Assistant: Enhance Your Smart Home with AI

Could you share your automation on this please?

Does it just take the output from any question? Or does it know to only display the content when something is playing?

Here is the code:

automation:
  - id: 'update_media_info'
    alias: Update Media Info
    trigger:
      - platform: state
        entity_id: media_player.marantz_sacd30n
        attribute: media_album_name
    condition: []
    action:
      - service: input_text.set_value
        data_template:
          entity_id: input_text.gpt_input
          value: >-
            Tell me about {{ state_attr('media_player.marantz_sacd30n', 'media_artist') }}'s
            album {{ state_attr('media_player.marantz_sacd30n', 'media_album_name') }}
    initial_state: true

So far I’m only using this for media info, so there is nothing distinguishing where a question came from. I’m not sure the simple input/output model can tell which process/automation asked the question and redirect the answer to a specific destination. It might be possible with an ask-a-question / block-further-questions / wait-for-answer / store-the-answer-wherever-appropriate / unblock sort of flow.
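One way to approximate that ask/block/wait/store flow is a queued script. This is only a sketch: it assumes the integration's sensor reports the state `response_received` (as used later in this thread), and that `question` and `destination` are passed in as script variables when calling it.

```yaml
# Sketch of the ask/block/wait/store flow described above.
# Entity names follow this thread; adjust to your setup.
script:
  ask_gpt_and_store:
    mode: queued  # queues a second question until the first one finishes
    sequence:
      - service: input_text.set_value
        data:
          entity_id: input_text.gpt_input
          value: "{{ question }}"
      # Wait for the response sensor to report an answer (give up after 30 s)
      - wait_template: "{{ is_state('sensor.hassio_openai_response', 'response_received') }}"
        timeout: "00:00:30"
      # Store the answer wherever the caller asked for it
      - service: input_text.set_value
        data:
          entity_id: "{{ destination }}"
          value: "{{ state_attr('sensor.hassio_openai_response', 'response_text') }}"
```

Each caller would invoke `script.ask_gpt_and_store` with its own `question` and `destination`, so answers end up routed per requester.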
BTW, I found you can also use the song title to get more detailed info about any specific song. You can get a pretty interesting answer this way!


How can I use this as a voice assistant. I have a mic and speaker using browser mod.


After working with the UI for a while, then checking entities, services, the documentation, and this thread, then going back and repeating, I determined that the only proper docs are on your GitHub. Please improve the docs.

Hmm, I just tried to put this together but couldn’t get it to work. I followed all the steps via the GitHub link. Do you need a premium account for the API key to work? It’s not generating a response.

Do you have credits in your OpenAI account?

Looks like my free credits just expired.

Thanks, working perfectly in the Lovelace UI!
Any chance of getting an automation working with the Telegram bot? I just can’t update the input_text.gpt_input entity …
Here is my code:

alias: Telegram-ChatGPT
trigger:
  - platform: event
    event_type: telegram_command
    event_data:
      command: /chatgpt
action:
  - service: input_text.set_value
    data_template:
      entity_id: input_text.gpt_input
            value: "{{ trigger.event.data.text }}"
  - delay: "00:00:05"
  - service: telegram_bot.send_message
    data_template:
      message: "{{ state_attr('sensor.hassio_openai_response', 'response_text') }}"
mode: single

It gets triggered, but doesn’t fill in the text after the /chatgpt command; input_text.gpt_input doesn’t update.
I’ve tried nearly everything for the trigger.event.data.text part.
ChatGPT itself said I should try trigger.message.text.split(' ', 1)[1], but that still doesn’t work. No idea what the correct trigger data is.
Maybe someone knows the correct way?

The Part of sending the response_text is working fine standalone!
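For comparison, here is a version with consistent indentation (the `value:` line above is over-indented) that uses the `args` list the `telegram_command` event provides for the text after the command. This is a sketch only; check the event payload in your own logs before relying on it:

```yaml
alias: Telegram-ChatGPT
trigger:
  - platform: event
    event_type: telegram_command
    event_data:
      command: /chatgpt
action:
  - service: input_text.set_value
    data_template:
      entity_id: input_text.gpt_input
      # args holds the words after "/chatgpt"; join them back into one string
      value: "{{ trigger.event.data.args | join(' ') }}"
  - delay: "00:00:05"
  - service: telegram_bot.send_message
    data_template:
      message: "{{ state_attr('sensor.hassio_openai_response', 'response_text') }}"
mode: single
```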

Why is your indentation like that? (Also you shouldn’t really be asking ChatGPT about automations, as the pinned warning notes)

Just want to mention that some improvements have been made: there’s a PR that adds a service call and support for the gpt-3.5-turbo and gpt-4 models.
See Multiple enhancements by Olen · Pull Request #6 · Hassassistant/openai_response · GitHub
and openai_response/custom_components/openai_response at gpt-3.5-turbo · Olen/openai_response · GitHub


That’s huge!
Is there a way to expose our sensors (read-only) to this instance of OpenAI? Or will there be?

Recently I came across a strange error in my log file related to this integration (caused by a very long artist name in a query about album info):

2023-04-22 09:44:27.281 WARNING (MainThread) [homeassistant.components.input_text] Invalid value: Tell me about Bohren & der Club of Gore / Christoph Clöser / Morten Gass's album Piano Nights in less than 200 words (length range 0 - 100)

Upon investigation it seems there is a limit of 100 characters for the query that can be sent to ChatGPT. Is this a limitation of the API or of the integration? Any way to overcome this?
It shouldn’t be a problem with input_text itself (as the error might imply), as that should hold up to 255 characters…
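Worth noting: an `input_text` helper defaults to a `max` of 100 characters unless configured otherwise, which matches the "length range 0 - 100" in that error; 255 is the largest `max` it accepts, not the default. Raising the limit in the helper definition should avoid the warning. A sketch (helper name taken from this thread):

```yaml
input_text:
  gpt_input:
    name: GPT Input
    max: 255  # default is 100, which triggers the "length range 0 - 100" error
```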

Still getting this error using gpt-3.5-turbo model.

This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?

I’m using the latest version downloaded this morning

It seems that the main branch doesn’t contain those enhancements yet. You need to check that issue or PR for the new code.

Yeah I downloaded a revision from earlier and it’s working. Time to have a play around.

So I found that storing the response in a text input wasn’t going to work, as the maximum length of a state is 255 characters. My workaround, which lets me store responses for different things, is to use MQTT.

My example stores the ChatGPT response for a notification that my dishwasher has finished. We use standard automations with the mqtt.publish service call, and an MQTT sensor is required to store the response.

MQTT Sensor

The topic can be anything; the sensor state will be the value we send under the “state” attribute, and the response will be the ChatGPT response text.

mqtt:
  sensor:
    - name: ChatGPT Response Dishwasher
      state_topic: "homeassistant/chatgpt/response/dishwasher"
      value_template: "{{ value_json.state }}"
      json_attributes_topic: "homeassistant/chatgpt/response/dishwasher"
      json_attributes_template: >
        {"response":"{{ value_json.response }}"}

Automation - Action

Calls the ChatGPT service, then checks whether the sensor state is “response_received”. If so, it publishes the response to the MQTT topic with a custom JSON payload; if not, a custom fallback text is published instead.

The notification then takes the value from the MQTT sensor’s “response” attribute. Note that the state attribute is only there to provide a state for the MQTT sensor and isn’t otherwise used, though you could use it to perform another check if required; this can all be customised.

action:
  - service: openai_response.openai_input
    data:
      model: gpt-3.5-turbo
      prompt: >-
        In 30 words or less and in a random style tell me that my dishwasher
        has finished
  - if:
      - condition: state
        entity_id: sensor.hassio_openai_response
        state: response_received
    then:
      - service: mqtt.publish
        data:
          topic: homeassistant/chatgpt/response/dishwasher
          payload: |-
            {
              "state": "ok",
              "response": "{{  state_attr('sensor.hassio_openai_response','response_text') }}"
            }
          retain: true
    else:
      - service: mqtt.publish
        data:
          topic: homeassistant/chatgpt/response/dishwasher
          payload: |-
            {
              "state": "error",
              "response": "Dishwasher has finished."
            }
          retain: true
  - service: notify.mobile_app_iphone
    data:
      data:
        group: whitegoods-notification-group
      title: Dishwasher Finished
      message: >-
        {{  state_attr('sensor.chatgpt_response_dishwasher','response') }}
        That run cost: £{{
        (((((int(states('sensor.mss310_dishwasher_energy')) -
        (int(states('input_number.dishwasher_energy_base'))))) / 1000)) *
        (float(states('sensor.octopusgo_current_price')) / 100) | round(3))
        | round(2) }} and used: {{
        ((int(states('sensor.mss310_dishwasher_energy')) -
        (int(states('input_number.dishwasher_energy_base')))) / 1000) }} kWh
        of Energy

You can set up multiple MQTT sensors for different requirements, and this method lets you store more than the 255-character limit of the input_text helper.

Hope this helps.

EDIT

I noticed that whilst MQTT will take a long response, artist information, for example, wouldn’t parse because of " characters and line returns.

As such, I’ve added this to some publish service calls to replace " with nothing:

service: mqtt.publish
data:
  topic: homeassistant/chatgpt/response/artistinfo
  payload: |-
    {
      "state": "ok",
      "response": "{{  state_attr('sensor.hassio_openai_response','response_text') | regex_replace(find='"', replace='', ignorecase=true) }}"
    }
  retain: true
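For completeness, the matching MQTT sensor for that topic would follow the same pattern as the dishwasher one above (a sketch; name and topic are just examples):

```yaml
mqtt:
  sensor:
    - name: ChatGPT Response Artist Info
      state_topic: "homeassistant/chatgpt/response/artistinfo"
      value_template: "{{ value_json.state }}"
      json_attributes_topic: "homeassistant/chatgpt/response/artistinfo"
      json_attributes_template: >
        {"response":"{{ value_json.response }}"}
```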

Happy to announce my updated version.
No API key needed (including for GPT-4). Same functionality as this integration.

https://community.home-assistant.io/t/openai-gpt4-integration-no-api-key-needed

What is the difference between this and the OpenAi integration?

This is a sensor that creates a response based on an input text, instead of an integration that goes in the conversation button.


I’m wondering what is possible with the new ChatGPT plug-in architecture. I can imagine a plug-in running locally, grabbing entity names from chat and translating them into actionable REST API calls.
I could also ground GPT to behave like an assistant for the home, and with a pre-prompt inform ChatGPT what entities I have, or use the plug-in to enumerate selected entities to GPT. Have you thought about this?