[Custom Component] extended_openai_conversation: Let's control entities via ChatGPT

Ah no, I thought it only worked with the Spotify integration.
I tried to ask Assist to play an artist but it doesn’t work. Do I need to install Music Assistant for that?

Yes, you need to. It uses `mass.search`, which is provided by the Music Assistant (BETA) integration. I chose to use this since Music Assistant supports various providers such as YouTube Music, Spotify, Tidal, Plex music, etc.
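For reference, a search function built on that service might be sketched like this. This is untested, and the `mass.search` field names (`name`, `limit`) and the use of `response_variable` on a script step are assumptions — check Music Assistant’s services.yaml for the actual schema before using it:

```yaml
- spec:
    name: search_music
    description: Search the music library via Music Assistant.
    parameters:
      type: object
      properties:
        name:
          type: string
          description: The search term, e.g. an artist or album name.
      required:
      - name
  function:
    type: script
    sequence:
      # "name" and "limit" are assumed parameter names; verify against
      # Music Assistant's services.yaml for the mass.search service.
      - service: mass.search
        data:
          name: "{{ name }}"
          limit: 5
        response_variable: search_result
```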


Can anyone help me make a get_history function that uses the REST API to get history from Home Assistant? I want to make this function for people who don’t use SQLite as the DB backend, but I’m having trouble making it work.

- spec:
    name: get_history
    description: Retrieve historical data of specified entities in KST.
    parameters:
      type: object
      properties:
        timestamp:
          type: string
          optional: true
          description: Start of the history period in YYYY-MM-DDThh:mm:ss format in KST. Defaults to 1 day before the time of the request.
        filter_entity_id:
          type: string
          description: Comma-separated entity IDs to filter.
        end_time:
          type: string
          optional: true
description: End of the history period in YYYY-MM-DDThh:mm:ss format in KST. Defaults to 1 day after the start time.
        minimal_response:
          type: boolean
          optional: true
          description: Return minimal response for faster processing.
        no_attributes:
          type: boolean
          optional: true
          description: Skip returning attributes for faster processing.
        significant_changes_only:
          type: boolean
          optional: true
          description: Return only significant state changes.
      required:
      - filter_entity_id
  function:
    type: rest
    resource_template: >-
      {% set base_url = 'http://homeassistant.local:8123/api/history/period' %}{% set timestamp_path = '/' + (timestamp | as_timestamp - 9 * 3600) | timestamp_custom('%Y-%m-%dT%H:%M:%S+09:00', False) if timestamp is defined else '' %}{% set query_string = 'filter_entity_id=' + filter_entity_id %}{% if end_time is defined %}{% set query_string = query_string + '&end_time=' + (end_time | as_timestamp - 9 * 3600) | timestamp_custom('%Y-%m-%dT%H:%M:%S+09:00', False) %}{% endif %}{% if minimal_response is defined and minimal_response %}{% set query_string = query_string + '&minimal_response' %}{% endif %}{% if no_attributes is defined and no_attributes %}{% set query_string = query_string + '&no_attributes' %}{% endif %}{% if significant_changes_only is defined and significant_changes_only %}{% set query_string = query_string + '&significant_changes_only' %}{% endif %}{{ base_url }}{{ timestamp_path }}?{{ query_string }}
    headers:
Authorization: "Bearer [TOKEN]"
      content-type: "application/json"
    value_template: >-
      {% if value_json is defined %}
        {{ value_json }}
      {% else %}
        "No data returned or invalid response"
      {% endif %}

It might be pretty similar to what @sayanova shared above.

Since I haven’t set up a Squeezebox player, I don’t have an example that actually works.
You can try to make it on your own, or give me an example of the service call syntax that plays an album, and then I can give you a blueprint of what the function looks like.

I would be very happy if you could help me.
This is the service call he provided to play a specific album:

service: squeezebox.call_method
data:
  command: playlist
  parameters:
    - loadtracks
    - album.titlesearch=<ALBUM NAME>
target:
  entity_id: media_player.000000

Thank you

OK, I installed Music Assistant and it works well; I can play Spotify music through Music Assistant.
But if I ask OpenAI, nothing happens:

The Logbook is empty.
Only Music Assistant media_player entities are exposed to the voice assistant.

@stban1983 You mean OpenAI says it will play music on a specific media player, but nothing actually plays? I don’t speak French, but I think your conversation dialog says so. Is that right?

Yes, that’s right, sorry. I asked “Play Metallica in the living room” and nothing happened.

I have only implemented searching via Music Assistant (BETA). It can grab the music name, media type, and URI. But if you want to play media on a Music Assistant media_player, I think you should use the mass.play_media service, not the typical one. I haven’t done anything with that service yet, since I have ordered Sonos speakers to replace my old dumb speakers.

I’m going to make the function, but it might take more than a week. Sorry about that.

If you can’t bear the delay, you can contribute using this services.yaml, which contains an explanation of mass.play_media. Unfortunately, I don’t have any speakers to test with right now :frowning:

Ah OK, I thought it would play music. But no problem! :slight_smile:

@stban1983

I just made the function, but I have no way to debug it.

Can you check this function?

- spec:
    name: play_music
    description: Play music.
    parameters:
      type: object
      properties:
        media_player_id:
          type: array
          items:
            type: string
          description: List of media player Entity IDs to play
        media_id:
          type: string
          description: Media identifier (URI)
        media_type:
          type: string
          optional: true
          enum: ["artist", "album", "playlist", "track", "radio"]
          description: Will be auto-determined if omitted
        enqueue:
          type: string
          enum: ["play", "replace", "next", "replace_next", "add"]
          optional: true
          description: How to enqueue the media
        announce:
          type: boolean
          optional: true
          description: Whether to make an announcement
        artist:
          type: string
          optional: true
          description: Specify the artist
        album:
          type: string
          optional: true
          description: Specify the album
        radio_mode:
          type: boolean
          optional: true
          description: Specify if it is a radio mode.
      required:
        - media_player_id
        - media_id
  function:
    type: script
    sequence:
      - service: mass.play_media
        target:
          entity_id: "{{ media_player_id }}"
        data: >
          {
            media_id: "{{ media_id }}"
            media_type: "{{ media_type }}"
            {% if enqueue is defined and enqueue %}
            enqueue: "{{ enqueue }}"
            {% endif %}
            {% if announce is defined and announce %}
            announce: "{{ announce }}"
            {% endif %}
            {% if artist is defined and artist %}
            artist: "{{ artist }}"
            {% endif %}
            {% if album is defined and album %}
            album: "{{ album }}"
            {% endif %}
            {% if radio_mode is defined and radio_mode %}
            radio_mode: "{{ radio_mode }}"
            {% endif %}
          }

Whooha, fast :slight_smile:
I’ll try, with the Mass player ID (I don’t know where it’s generated…)

I can’t figure out the problem without logs. Can you search for them in /config/home-assistant.log?

Sure:

2023-12-15 16:11:07.140 ERROR (MainThread) [custom_components.extended_openai_conversation.helpers] extended_openai_conversation: Error executing script. Error for call_service at pos 1: Error rendering data template: Result is not a Dictionary
2023-12-15 16:11:07.141 ERROR (MainThread) [homeassistant.components.assist_pipeline.pipeline] Unexpected error during intent recognition
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 943, in recognize_intent
    conversation_result = await conversation.async_converse(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/__init__.py", line 467, in async_converse
    result = await agent.async_process(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 157, in async_process
    response = await self.query(user_input, messages, exposed_entities, 0)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 280, in query
    message = await self.execute_function_call(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/extended_openai_conversation/__init__.py", line 321, in execute_function
    result = await function_executor.execute(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/extended_openai_conversation/helpers.py", line 267, in execute
    result = await script.async_run(
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 1578, in async_run
    return await asyncio.shield(run.async_run())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 420, in async_run
    await self._async_step(log_exceptions=False)
  File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 470, in _async_step
    self._handle_exception(
  File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 493, in _handle_exception
    raise exception
  File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 468, in _async_step
    await getattr(self, handler)()
  File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 675, in _async_call_service_step
    params = service.async_prepare_call_from_config(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 362, in async_prepare_call_from_config
    raise HomeAssistantError(
homeassistant.exceptions.HomeAssistantError: Error rendering data template: Result is not a Dictionary

You should give me the function data that OpenAI called with. It should be there, just above.

Maybe this:

{'role': 'user', 'content': 'joue metallica dans le salon'}]
2023-12-15 17:07:27.900 INFO (MainThread) [custom_components.extended_openai_conversation] Response {
  "choices": [
    {
      "finish_reason": "function_call",
      "index": 0,
      "message": {
        "content": null,
        "function_call": {
          "arguments": "{\n \"media_player_id\": [\"media_player.salon_2\"],\n \"media_id\": \"metallica\"\n}",
          "name": "play_music"
        },
        "role": "assistant"
      }
    }
  ],
  "created": 1702656446,
  "id": "chatcmpl-8W50gTzCMIvW1nmzFcsD0xPqEDZjA",
  "model": "gpt-3.5-turbo-0613",
  "object": "chat.completion",
  "system_fingerprint": null,
  "usage": {
    "completion_tokens": 31,
    "prompt_tokens": 3044,
    "total_tokens": 3075
  }
}
2023-12-15 17:07:27.936 INFO (MainThread) [custom_components.extended_openai_conversation.helpers] extended_openai_conversation: Running [extended_openai_conversation] function
2023-12-15 17:07:27.936 INFO (MainThread) [custom_components.extended_openai_conversation.helpers] extended_openai_conversation: Executing step call service
2023-12-15 17:07:27.940 WARNING (MainThread) [homeassistant.helpers.template] Template variable warning: 'media_type' is undefined when rendering '{
media_id: "{{ media_id }}"
media_type: "{{ media_type }}"
{% if enqueue is defined and enqueue %}
enqueue: "{{ enqueue }}"
{% endif %}
{% if announce is defined and announce %}
announce: "{{ announce }}"
{% endif %}
{% if artist is defined and artist %}
artist: "{{ artist }}"
{% endif %}
{% if album is defined and album %}
album: "{{ album }}"
{% endif %}
{% if radio_mode is defined and radio_mode %}
radio_mode: "{{ radio_mode }}"
{% endif %}
}'
2023-12-15 17:07:27.941 ERROR (MainThread) [custom_components.extended_openai_conversation.helpers] extended_openai_conversation: Error executing script. Error for call_service at pos 1: Error rendering data template: Result is not a Dictionary
2023-12-15 17:07:27.942 ERROR (MainThread) [homeassistant.components.assist_pipeline.pipeline] Unexpected error during intent recognition
(traceback identical to the one above)

We can see that the media_id is wrong.

Strengthen the media_id description. As I said, I don’t have media players to test with right now, so it’s better for you to try multiple times on your own. The description should be a very precise and clear statement.
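Separately, the “Result is not a Dictionary” error itself comes from the data template: the rendered text looks like JSON but has no commas between keys, and `media_type` is referenced even when it is undefined. One way to sketch a template that renders an actual dictionary is to build it up with `dict()` in Jinja (untested, same caveat about having no speakers to verify with):

```yaml
  function:
    type: script
    sequence:
      - service: mass.play_media
        target:
          entity_id: "{{ media_player_id }}"
        # Build the dict incrementally so optional keys are only
        # included when GPT actually passed them.
        data: >-
          {% set d = {'media_id': media_id} %}
          {% if media_type is defined %}{% set d = dict(d, media_type=media_type) %}{% endif %}
          {% if enqueue is defined %}{% set d = dict(d, enqueue=enqueue) %}{% endif %}
          {% if announce is defined %}{% set d = dict(d, announce=announce) %}{% endif %}
          {% if artist is defined %}{% set d = dict(d, artist=artist) %}{% endif %}
          {% if album is defined %}{% set d = dict(d, album=album) %}{% endif %}
          {% if radio_mode is defined %}{% set d = dict(d, radio_mode=radio_mode) %}{% endif %}
          {{ d }}
```

Since the template’s final expression is a dict, Home Assistant’s template renderer can hand the service an actual dictionary instead of a malformed JSON-like string.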

Do you have plans to make one with Google’s Gemini API?


@meni123
Although I haven’t tested it, it should look like the example below.
The parameters (“query”) defined in the spec are passed to the function.

- spec:
    name: play_music
    description: play music.
    parameters:
      type: object
      properties:
        query:
          type: string
          description: search query
      required:
      - query
  function:
    type: script
    sequence:
    - service: squeezebox.call_method
      data:
        command: playlist
        parameters:
          - loadtracks
          - "album.titlesearch={{query}}"
      target:
        entity_id: media_player.000000

If you have multiple media players and want to select which one plays, you can have the entity_id passed from GPT, like below.

- spec:
    name: play_music
    description: play music.
    parameters:
      type: object
      properties:
        query:
          type: string
          description: search query
        entity_id:
          type: string
          description: The entity id of media player
      required:
      - query
      - entity_id
  function:
    type: script
    sequence:
    - service: squeezebox.call_method
      data:
        command: playlist
        parameters:
          - loadtracks
          - "album.titlesearch={{query}}"
      target:
        entity_id: "{{entity_id}}"

You have to let GPT know what media players you have by exposing the entities.

@sayanova
The history function is released in 0.0.9-beta3.
Here’s an example of a “get_history” function.

- spec:
    name: get_history
    description: Retrieve historical data of specified entities.
    parameters:
      type: object
      properties:
        entity_ids:
          type: array
          items:
            type: string
            description: The entity id to filter.
        start_time:
          type: string
          description: Start of the history period in "%Y-%m-%dT%H:%M:%S%z".
        end_time:
          type: string
          description: End of the history period in "%Y-%m-%dT%H:%M:%S%z".
      required:
      - entity_ids
  function:
    type: composite
    sequence:
      - type: native
        name: get_history
        response_variable: history_result
      - type: template
        value_template: >-
          {% set ns = namespace(result = [], list = []) %}
          {% for item_list in history_result %}
              {% set ns.list = [] %}
              {% for item in item_list %}
                  {% set last_changed = item.last_changed | as_timestamp | timestamp_local if item.last_changed else None %}
                  {% set new_item = dict(item, last_changed=last_changed) %}
                  {% set ns.list = ns.list + [new_item] %}
              {% endfor %}
              {% set ns.result = ns.result + [ns.list] %}
          {% endfor %}
          {{ ns.result }}

@bloody2k
Is the Gemini API available now?
Since I have a few more features to build to make the current integration stable, I’m not thinking about new integrations yet.