ChatGPT Integration for Home Assistant: Enhance Your Smart Home with AI

That’s huge!
Is there a way to expose our sensors (read-only) to this instance of OpenAI? Or will there be?

Recently I came across a strange error in my log file related to this integration (just because of a very long artist name in a query about album info):

2023-04-22 09:44:27.281 WARNING (MainThread) [homeassistant.components.input_text] Invalid value: Tell me about Bohren & der Club of Gore / Christoph Clöser / Morten Gass's album Piano Nights in less than 200 words (length range 0 - 100)

Upon investigation it seems that there is a limit of 100 characters for a query that can be sent to ChatGPT. Is this a limitation of the API or of the integration? Any way to overcome this?
It shouldn’t be a problem with input_text itself (as the error might imply), since that should hold up to 255 characters…
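For what it’s worth, the 100-character cap looks like the default max of the input_text helper rather than an OpenAI limit, so raising it in configuration should be possible. A minimal sketch (the helper name chatgpt_query is just illustrative):

input_text:
  chatgpt_query:
    name: ChatGPT Query
    # max defaults to 100; 255 is the most a Home Assistant state can hold
    max: 255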

Still getting this error using the gpt-3.5-turbo model.

This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?

I’m using the latest version downloaded this morning

It seems that the main branch doesn’t yet contain those enhancements. You need to check that issue or PR for the new code.

Yeah I downloaded a revision from earlier and it’s working. Time to have a play around.

So I found that storing the response in a text input wasn’t going to work, as the maximum length of a state is 255 characters. My workaround, which allows me to store responses for different things, is to use MQTT.

My example stores the ChatGPT response for a notification that my dishwasher has finished. We will use a standard automation with the mqtt.publish service call. An MQTT sensor is required to store the response.

MQTT Sensor

The topic can be anything; the sensor state will be the value we send under the “state” key, and the response will be the ChatGPT response text.

mqtt:
  sensor:
    - name: ChatGPT Response Dishwasher
      state_topic: "homeassistant/chatgpt/response/dishwasher"
      value_template: "{{ value_json.state }}"
      json_attributes_topic: "homeassistant/chatgpt/response/dishwasher"
      json_attributes_template: >
        {"response":"{{ value_json.response }}"}

Automation - Action

This calls the ChatGPT service, then checks whether the sensor state is “response_received”. If so, it publishes the response to the MQTT topic with a custom JSON payload; if the sensor state isn’t “response_received”, a custom text is published instead.

The notification then takes the value from the MQTT sensor attribute “response”. Note that the “state” key is only there to provide a state for the MQTT sensor and isn’t otherwise used, though you could use it to perform another check if required - this can all be customised.

action:
  - service: openai_response.openai_input
    data:
      model: gpt-3.5-turbo
      prompt: >-
        In 30 words or less and in a random style tell me that my dishwasher
        has finished
  - if:
      - condition: state
        entity_id: sensor.hassio_openai_response
        state: response_received
    then:
      - service: mqtt.publish
        data:
          topic: homeassistant/chatgpt/response/dishwasher
          payload: |-
            {
              "state": "ok",
              "response": "{{  state_attr('sensor.hassio_openai_response','response_text') }}"
            }
          retain: true
    else:
      - service: mqtt.publish
        data:
          topic: homeassistant/chatgpt/response/dishwasher
          payload: |-
            {
              "state": "error",
              "response": "Dishwasher has finished."
            }
          retain: true
  - service: notify.mobile_app_iphone
    data:
      data:
        group: whitegoods-notification-group
      title: Dishwasher Finished
      message: >-
        {{ state_attr('sensor.chatgpt_response_dishwasher','response') }}
        That run cost: £{{
        (((int(states('sensor.mss310_dishwasher_energy')) -
        int(states('input_number.dishwasher_energy_base'))) / 1000) *
        (float(states('sensor.octopusgo_current_price')) / 100) | round(3))
        | round(2) }} and used: {{
        ((int(states('sensor.mss310_dishwasher_energy')) -
        int(states('input_number.dishwasher_energy_base'))) / 1000) }} kWh
        of energy

You can set up multiple MQTT sensors for different requirements, and this method allows you to store more than the 255-character limit of the input_text helper.
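For example, a second sensor for the artist-info responses used further down could follow the same pattern on its own topic (the sensor name here is just a suggestion):

mqtt:
  sensor:
    - name: ChatGPT Response Artist Info
      state_topic: "homeassistant/chatgpt/response/artistinfo"
      value_template: "{{ value_json.state }}"
      json_attributes_topic: "homeassistant/chatgpt/response/artistinfo"
      json_attributes_template: >
        {"response":"{{ value_json.response }}"}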

Hope this helps.

EDIT

I noticed that whilst MQTT will take a long response, artist information, for example, wouldn’t parse when it contained " characters or line returns.

As such, I have added this to some publish service calls to replace " with nothing.

service: mqtt.publish
data:
  topic: homeassistant/chatgpt/response/artistinfo
  payload: |-
    {
      "state": "ok",
      "response": "{{  state_attr('sensor.hassio_openai_response','response_text') | regex_replace(find='"', replace='', ignorecase=true) }}"
    }
  retain: true
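If the line returns also break parsing, chaining a plain replace filter on top should cover both cases - a sketch I haven’t tested extensively:

service: mqtt.publish
data:
  topic: homeassistant/chatgpt/response/artistinfo
  payload: |-
    {
      "state": "ok",
      "response": "{{ state_attr('sensor.hassio_openai_response','response_text') | regex_replace(find='"', replace='', ignorecase=true) | replace('\n', ' ') }}"
    }
  retain: true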

Happy to announce my updated version.
No API key needed (including GPT-4). Same functionality as this integration.

https://community.home-assistant.io/t/openai-gpt4-integration-no-api-key-needed

What is the difference between this and the OpenAI integration?

This is a sensor that creates a response based on an input text, instead of an integration that goes in the conversation button.


I’m wondering what is possible with the new ChatGPT plug-in architecture. I can imagine a plug-in running locally, grabbing entity names from the chat and translating them into actionable REST API calls.
I could also ground GPT to behave like an assistant for the home, and with a pre-prompt I could inform ChatGPT what entities I have, or use the plug-in to enumerate selected entities to GPT. Have you thought about this?
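As a rough illustration of the pre-prompt idea, reusing the openai_response.openai_input service from the automation above (the entity IDs below are made up):

service: openai_response.openai_input
data:
  model: gpt-3.5-turbo
  prompt: >-
    You are an assistant for my home. These are some of my entities and
    their current states:
    {% for e in ['light.kitchen', 'climate.living_room', 'cover.garage_door'] %}
    - {{ e }}: {{ states(e) }}
    {% endfor %}
    In 30 words or less, tell me anything I should act on.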