2023.7: Responding services

Hello, thank you for your work!
After this upgrade my HA is lightning fast. It was pretty average previously, taking a couple of seconds to load a page, but now every page opens the moment I click.

4 Likes

The very last example automation in the Hyperion documentation should restore the functionality you are used to: https://www.home-assistant.io/integrations/hyperion/#examples

IMHO the Hyperion PR should be reverted. It adds nothing but confusion.

Also, once enabled, the switch.[instance]_component_led_device does nothing for my Hyperion install.

If I turn it on in Home Assistant it instantly turns off.

I can toggle the switch on the Hyperion Dashboard.

1 Like

I had the same. I wasn't sure if it was the update or a firmware upgrade on the Envoy. I deleted the integration, installed the custom integration, and was able to log in with my email and Enphase password. Make sure to restart Home Assistant after deleting the core integration; somehow it was still cached and was interfering with the custom integration. The custom integration gave me an entity for lifetime production with the exact same name, so I didn't have to change anything in the energy dashboard.

Yes, it is time for the energy dashboard to change!

I initially got excited about service responses, but then realized the limitation of the implementation :frowning: The only way to consume a response is within the script or automation that called the service. But sometimes you need to store the response for further processing. If the response is short, we could use, for example, input_text.set_value to store it as an entity state (see the sketch below). But this brings a limitation that has been highlighted many times on this forum: an entity state can only hold 255 characters. I understand this limitation and that we should store larger chunks of text as entity attributes, but then we are missing the ability to set the value of an attribute from a script or automation. The only way to set an attribute is to use a trigger-based template sensor, but with this implementation we are missing such a trigger to use.
Why do we not have a set_attribute service available for custom entities (e.g. the possibility to add a custom attribute to input_text)?
Why can't we define a callback_event for a called service, which would allow a trigger-based template sensor to store such a response in an attribute?
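
For illustration, a minimal sketch of the input_text.set_value approach mentioned above, reusing the conversation.process call shown later in this thread. The script name, the input_text.openai_summary helper (configured with max: 255) and the truncation are all assumptions, not a recommended pattern:

script:
  store_openai_response:
    alias: Store a truncated OpenAI response
    sequence:
      - service: conversation.process
        data:
          agent_id: 9794fb3fee4a1e2220e2f47a524fce93  # agent_id from the example later in this thread
          text: Tell me about Pink Floyd's album Animals in less than 100 words
        response_variable: openai_test_response
      # input_text.set_value stores the response as a state, so it is capped
      # at 255 characters, hence the truncation below
      - service: input_text.set_value
        target:
          entity_id: input_text.openai_summary  # hypothetical helper, created with max: 255
        data:
          value: "{{ openai_test_response.response.speech.plain.speech[:255] }}"

Anything beyond 255 characters is simply lost, which is exactly the limitation described above.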

3 Likes

This new energy self-sufficiency gauge isn't a lot of use to me; it doesn't seem to take into account exporting of energy.

I've been 100% self-sufficient for some time, but because I choose to import during the cheap rate and export loads more during peak, it concludes I'm not self-sufficient.

Can you provide a few examples where the response must be stored (as opposed to processed immediately)?

It's not an economic indicator, it's a self-sufficiency indicator, i.e. how much you depend on the grid.

Anything you import will bring it down, cheap, free or expensive.

I don't think it should be removed, it should just be restructured to be friendlier. There should be two lights: one for colors and effects, one for USB capture. Both should automatically turn the LEDs on/off. The lights themselves should be mutually exclusive, i.e. when the colors and effects light is on, USB capture should be off and vice versa (while also allowing both to be off at the same time). This is essentially what I do in my template switches to make it easier for myself.
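
A minimal sketch of that mutually exclusive pattern as template switches. The entity IDs used here (switch.hyperion_component_led_device, switch.hyperion_component_usb_capture) are assumptions that will differ per install, and the behaviour is only an approximation of what the post describes:

switch:
  - platform: template
    switches:
      hyperion_colors_effects:
        # "on" when the LED output is active and USB capture is not
        value_template: >-
          {{ is_state('switch.hyperion_component_led_device', 'on')
             and is_state('switch.hyperion_component_usb_capture', 'off') }}
        turn_on:
          - service: switch.turn_off
            target:
              entity_id: switch.hyperion_component_usb_capture
          - service: switch.turn_on
            target:
              entity_id: switch.hyperion_component_led_device
        turn_off:
          - service: switch.turn_off
            target:
              entity_id: switch.hyperion_component_led_device
      hyperion_usb_capture:
        # "on" when both the LED output and USB capture are active
        value_template: >-
          {{ is_state('switch.hyperion_component_led_device', 'on')
             and is_state('switch.hyperion_component_usb_capture', 'on') }}
        turn_on:
          - service: switch.turn_on
            target:
              entity_id: switch.hyperion_component_usb_capture
          - service: switch.turn_on
            target:
              entity_id: switch.hyperion_component_led_device
        turn_off:
          - service: switch.turn_off
            target:
              entity_id: switch.hyperion_component_led_device

Turning on hyperion_colors_effects switches USB capture off first, and turning on hyperion_usb_capture enables both components, so the two switches never report "on" at the same time.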

You can… just make a template sensor triggered by a custom event and pass the data to its attributes.

What about for people like me who don't use USB capture?

I use a screen grabber as Hyperion runs on my TV box.

Same principle, if it works the same way as USB capture, i.e. a third light.

Is there a way of turning the gauge off please or adjusting how it calculates?

No. If you want a different calculation you will have to create your own template sensor and gauge.
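
For example, a minimal sketch of a template sensor that credits exports against imports, plus a gauge card pointing at it. The source sensor names (sensor.grid_import_daily, sensor.grid_export_daily, sensor.home_consumption_daily) are assumptions, and the formula is just one possible definition of "net" self-sufficiency:

template:
  - sensor:
      - name: Net self-sufficiency
        unit_of_measurement: "%"
        state: >-
          {% set imported = states('sensor.grid_import_daily') | float(0) %}
          {% set exported = states('sensor.grid_export_daily') | float(0) %}
          {% set consumed = states('sensor.home_consumption_daily') | float(0) %}
          {% if consumed > 0 %}
            {% set pct = 100 * (1 - (imported - exported) / consumed) %}
            {{ [0, [pct, 100] | min] | max | round(0) }}
          {% else %}
            0
          {% endif %}

# Dashboard gauge card (added via the raw card editor, not configuration.yaml)
type: gauge
entity: sensor.net_self_sufficiency
min: 0
max: 100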

Custom weather state descriptions, meteoalerts, local community feeds, etc., provided not by an integration but by REST sensors or scraping (local garbage collection).
The case I'm working on right now: asking OpenAI for descriptive information about the currently playing artist/album/track, to be displayed in a custom media dashboard. BTW Taras, this is the same case we discussed here; I'm now trying to move it to a native HA solution, as @jjbankert's ChatGPT integration is being discontinued.

You can still do this though, same principle as in the blog post for the calendar event template sensor.

Sorry @petro, I do not get it… I tried, but I do not understand how to use variable data from the service call to set an attribute… For the state it would work, but it's way longer than 255 characters.

So here is the service call I make:

service: conversation.process
data:
  agent_id: 9794fb3fee4a1e2220e2f47a524fce93
  text: Tell me about Pink Floyd's album Animals in less than 100 words
response_variable: openai_test_response

and the response received:

response:
  speech:
    plain:
      speech: >-
        Pink Floyd's album Animals, released in 1977, is a concept album that
        critiques society through the lens of animal metaphors. Divided into
        three tracks, "Pigs on the Wing" bookends the album, while "Dogs," "Pigs
        (Three Different Ones)," and "Sheep" form the core. The album explores
        themes of power, greed, and conformity, with each animal representing
        different aspects of society. The music combines progressive rock with
        elements of blues and psychedelic rock, featuring intricate guitar work,
        atmospheric keyboards, and thought-provoking lyrics. Animals is regarded
        as one of Pink Floyd's most politically charged and musically ambitious
        albums.
      extra_data: null
  card: {}
  language: en-GB
  response_type: action_done
  data:
    targets: []
    success: []
    failed: []
conversation_id: 01H4TW6ZBB0ZF7P797EMM2DDXF

What comes next?

after,

service: conversation.process
data:
  agent_id: 9194fb3fee4a1e2220e2f47a524fce92
  text: Tell me about Pink Floyd's album Animals in less than 100 words
response_variable: openai_test_response

add a custom event

- event: openai_response
  event_data:
    response: "{{ openai_test_response.response.speech.plain.speech  }}"

Then make a template sensor

template:
- trigger:
  - platform: event
    event_type: openai_response
  sensor:
  - name: Response
    device_class: timestamp
    state: "{{ now() }}"
    attributes:
      response: "{{ trigger.event.data.response }}"
4 Likes

There are currently two service calls that produce a response_variable. How are you planning to use them for the applications you listed? Iā€™m interested to learn how you will be employing this new functionality.


Regarding the example you posted, what initiates this service call?

service: conversation.process
data:
  agent_id: 9194fb3fee4a1e2220e2f47a524fce92
  text: Tell me about Pink Floyd's album Animals in less than 100 words
response_variable: openai_test_response