2023.7: Responding services

The very last example automation in the Hyperion documentation should restore the functionality you are used to: https://www.home-assistant.io/integrations/hyperion/#examples

IMHO the Hyperion PR should be reverted. It adds nothing but confusion.

Also, once enabled, the switch.[instance]_component_led_device does nothing for my Hyperion install.

If I turn it on in Home Assistant it instantly turns off.

I can toggle the switch on the Hyperion Dashboard.


I had the same issue; I wasn’t sure if it was the update or a firmware upgrade on the Envoy. I deleted the integration, installed the custom integration, and was able to log in with my email and Enphase password. Make sure to restart Home Assistant after deleting the core integration; somehow it was still cached and was interfering with the custom integration. The custom integration gave me an entity for the lifetime production with the exact same name, so I didn’t have to change anything in the energy dashboard.

yes, it is time for the energy dashboard to change!

While I initially got excited about service responses, I then realized a limitation of the implementation :frowning: The only way to consume the response is within the script or automation that called the service. But sometimes you need to store the response for further processing. If the response is short, we could use, for example, input_text.set_value to store it as an entity state. But this runs into a limitation that has been highlighted many times in this forum: an entity state can store only 255 characters. I understand this limitation, and that we should store larger chunks of text as entity attributes, but then again we are missing the ability to set the value of an attribute from a script or automation. The only way to set an attribute is to use a trigger-based template sensor, but with this implementation we are missing a suitable trigger to use.
Why do we not have a set_attribute service available for custom entities (e.g. to make it possible to add a custom attribute to an input_text)?
Why can’t we define a callback event for a called service, which would allow a trigger-based template sensor to store such a response in an attribute?
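For short responses, the input_text workaround mentioned above might look like this (a minimal sketch; the input_text.openai_summary helper and the short_response variable are assumptions, and the stored value is truncated to the 255-character entity-state limit discussed above):

```yaml
# Inside a script/automation sequence, after a service call that
# returned a response_variable named short_response:
- service: input_text.set_value
  target:
    entity_id: input_text.openai_summary  # hypothetical helper
  data:
    # Truncate to fit the 255-character entity-state limit
    value: "{{ (short_response | string) | truncate(255, true, '') }}"
```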


This new energy self-sufficiency gauge isn’t a lot of use to me; it doesn’t seem to take exporting of energy into account.

I’ve been 100% self-sufficient for some time, but because I choose to import during the cheap rate and export loads more during peak, it concludes I’m not self-sufficient.

Can you provide a few examples where the response must be stored (as opposed to processed immediately)?

It’s not an economic indicator, it’s a self-sufficiency indicator, i.e. how much you depend on the grid.

Anything you import will bring it down, cheap, free or expensive.

I don’t think it should be removed; it should just be restructured to be friendlier. There should be two lights: one for colors and effects, one for USB capture. Both should automatically turn the LEDs on/off. The lights themselves should be mutually exclusive, i.e. when the colors-and-effects light is on, USB capture should be off and vice versa (while also allowing both to be off at the same time). This is essentially what I do in my template switches to make it easier for myself.
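As a rough illustration of the mutually exclusive template-switch approach described above (a sketch only; the switch.hyperion_component_* entity IDs are assumptions based on my own setup):

```yaml
switch:
  - platform: template
    switches:
      hyperion_usb_capture:
        value_template: "{{ is_state('switch.hyperion_component_usb_capture', 'on') }}"
        turn_on:
          # Turning USB capture on forces the colors/effects side off first,
          # keeping the two mutually exclusive
          - service: switch.turn_off
            target:
              entity_id: switch.hyperion_component_led_device  # hypothetical
          - service: switch.turn_on
            target:
              entity_id: switch.hyperion_component_usb_capture
        turn_off:
          - service: switch.turn_off
            target:
              entity_id: switch.hyperion_component_usb_capture
```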

You can… just make a custom event template sensor and pass the data to the attributes.

What about for people like me who don’t use USB capture?

I use a screen grabber as Hyperion runs on my TV box.

Same principle, if it works the same way as USB capture, i.e. a third light.

Is there a way of turning the gauge off, please, or of adjusting how it calculates?

No. If you want a different calculation you will have to create your own template sensor and gauge.
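For example, a custom self-sufficiency sensor that only counts imports against consumption might look like this (a sketch only; the sensor.home_consumption and sensor.grid_import entity IDs are assumptions you would replace with your own energy sensors):

```yaml
template:
  - sensor:
      - name: "Custom Self Sufficiency"
        unit_of_measurement: "%"
        state: >
          {% set consumed = states('sensor.home_consumption') | float(0) %}
          {% set imported = states('sensor.grid_import') | float(0) %}
          {% if consumed > 0 %}
            {{ (100 * (consumed - imported) / consumed) | round(1) }}
          {% else %}
            0
          {% endif %}
```

A gauge card pointed at this sensor then replaces the built-in calculation.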

Custom weather state descriptions, meteo alerts, local community feeds, etc., provided not by an integration but by REST sensors or scraping (local garbage collection).
A case I’m working on right now: asking OpenAI for descriptive information about the currently playing artist/album/track, to be displayed in a custom media dashboard. BTW Taras, this is the same case we discussed here; I’m now trying to move it to a native HA solution, as the @jjbankert ChatGPT integration is being discontinued.

You can still do this though, same principle as in the blog post for the calendar event template sensor.

Sorry @petro, I do not get it… I tried, but I do not understand how to use the variable data from the service call to set an attribute… For a state it would work, but it is way longer than 255 characters.

So here is the service call I make:

service: conversation.process
data:
  agent_id: 9794fb3fee4a1e2220e2f47a524fce93
  text: Tell me about Pink Floyd's album Animals in less than 100 words
response_variable: openai_test_response

and the response received:

response:
  speech:
    plain:
      speech: >-
        Pink Floyd's album Animals, released in 1977, is a concept album that
        critiques society through the lens of animal metaphors. Divided into
        three tracks, "Pigs on the Wing" bookends the album, while "Dogs," "Pigs
        (Three Different Ones)," and "Sheep" form the core. The album explores
        themes of power, greed, and conformity, with each animal representing
        different aspects of society. The music combines progressive rock with
        elements of blues and psychedelic rock, featuring intricate guitar work,
        atmospheric keyboards, and thought-provoking lyrics. Animals is regarded
        as one of Pink Floyd's most politically charged and musically ambitious
      extra_data: null
  card: {}
  language: en-GB
  response_type: action_done
  data:
    targets: []
    success: []
    failed: []
conversation_id: 01H4TW6ZBB0ZF7P797EMM2DDXF

What comes next?


service: conversation.process
data:
  agent_id: 9194fb3fee4a1e2220e2f47a524fce92
  text: Tell me about Pink Floyd's album Animals in less than 100 words
response_variable: openai_test_response

add a custom event

- event: openai_response
  event_data:
    response: "{{ openai_test_response.response.speech.plain.speech }}"

Then make a template sensor

- trigger:
    - platform: event
      event_type: openai_response
  sensor:
    - name: Response
      device_class: timestamp
      state: "{{ now() }}"
      attributes:
        response: "{{ trigger.event.data.response }}"
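Once that trigger-based template sensor exists, the stored response can be read from its attribute anywhere, with no 255-character limit (sensor.response follows from the name above; the Markdown card is just one place to use it):

```yaml
type: markdown
content: "{{ state_attr('sensor.response', 'response') }}"
```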

There are currently two service calls that produce a response_variable. How are you planning to use them for the applications you listed? I’m interested to learn how you will be employing this new functionality.

Regarding the example you posted, what initiates this service call?

service: conversation.process
data:
  agent_id: 9194fb3fee4a1e2220e2f47a524fce92
  text: Tell me about Pink Floyd's album Animals in less than 100 words
response_variable: openai_test_response

I’m loving this release! I’ve been playing around with ChatGPT responses today and it’s working great for me, as is the calendar.list_events service.

I have run into one limitation though: the calendar.list_events service won’t return a response if multiple calendar entities are passed to it.

Failed to call service calendar.list_events. Service call requested response data but matched more than one entity

I’d love to be able to query multiple calendars at once to get a combined agenda for my day; for example, my work calendar, personal calendar, and the one I share with my partner. Is this something on the roadmap for this service, or is it only possible to support one calendar at a time?
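Until multi-entity responses are supported, one workaround is to call the service once per calendar and merge the results in a script (a sketch only; the calendar.work and calendar.personal entity IDs and the 24-hour window are assumptions):

```yaml
script:
  combined_agenda:
    sequence:
      - service: calendar.list_events
        target:
          entity_id: calendar.work  # hypothetical calendar
        data:
          duration:
            hours: 24
        response_variable: work_events
      - service: calendar.list_events
        target:
          entity_id: calendar.personal  # hypothetical calendar
        data:
          duration:
            hours: 24
        response_variable: personal_events
      - variables:
          # Merge both event lists and sort them into one agenda
          agenda: >-
            {{ (work_events.events + personal_events.events)
               | sort(attribute='start') }}
```

The merged agenda variable can then be passed on to a notification, a TTS announcement, or an event for a trigger-based template sensor, as shown earlier in the thread.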