EMHASS: An Energy Management for Home Assistant

What about a Coral TPU?

What do you mean? A TPU can be interesting IF you have a TPU-capable device, but that applies to maybe 0.001% of Home Assistant users, so it's not a viable option.

I look forward to the heat pump modelling.

It's currently summer here, so I see a strong correlation between solar PV production, low electricity cost, high temperature and demand for cooling. The sunniest :sun_with_face: times of the day are also the hottest, with the highest demand for cooling and the cheapest energy.

Inside EMHASS I calculate def_total_hrs_hvac as the time the external temperature is above a comfort threshold.

EMHASS then schedules my HVAC for cooling for times that are close to optimal from a comfort perspective.
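As an illustration only (not the poster's actual code), the idea behind def_total_hrs_hvac — counting the forecast hours where the outdoor temperature sits above a comfort threshold — can be sketched in a few lines of shell, with made-up hourly temperatures and a made-up 26-degree threshold:

```shell
# Illustrative sketch only: count forecast hours where the outdoor
# temperature exceeds a comfort threshold (all values are made up)
temps="22 24 27 29 31 30 28 25"
threshold=26
count=0
for t in $temps; do
  if [ "$t" -gt "$threshold" ]; then
    count=$((count + 1))
  fi
done
echo "def_total_hrs_hvac = $count hours"
```

The resulting count would then serve as the total number of hours the deferrable HVAC load should be scheduled to run.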

This method of scheduling doesn't work for heating in the winter, as the correlation isn't there. So for the winter case I take my heating out of deferrable load scheduling, let the thermostat control the call for heat, and report the power consumption in EMHASS as power_load_no_var_loads.

EMHASS always looks so cool… then I try to set it up myself and can't even get past the initial setup… :grin:
I could not find any way to link EMHASS to my dynamic energy price entity… and that's definitely basic information it needs to do its thing.

But one of these days I will figure it out… perhaps… :crossed_fingers:

Thanks for your reply.

Unfortunately I cannot get it to run.

I'm using a Synology NAS and its Docker package.
This does not seem to allow make and so on. Therefore, I edited config_emhass.yaml and secrets_emhass.yaml on Ubuntu and then ran "make".

Then I took the resulting tar file, uploaded it to the NAS and added it as an image in the Docker GUI.
When I now try to run it (also via the GUI, setting the paths, timezone and LOCAL_COSTFUN), it does not work, and the docker log only shows some Docker logos and a story.

Any idea how I could get this to run on Synology?

Thanks

Hi Per,

I'm trying your query with EMHASS, but I'm getting an error, as your list has 24 entries and EMHASS seems to expect 48. May I ask how you added the Tibber data?


Hi Jens,

I'm sorry, but I don't understand your question. I add 24 entries like you said, and that is what EMHASS expects. Can you explain in more detail what problem you have?

Cheers, Per

Sure, I added your shell command, which indeed returns the correct values in a list of 24.

post_tibber_forecast: "curl -i -H \"Content-Type: application/json\" -X POST -d '{
		  \"prod_price_forecast\":[0.1998, 0.1812, 0.1717, 0.1634, 0.1641, 0.191, 0.2261, 0.2655, 0.287, 0.2807, 0.2738, 0.262, 0.252, 0.2445, 0.2485, 0.256, 0.2625, 0.2763, 0.2763, 0.2751, 0.262, 0.2524, 0.252, 0.2422],
		  \"load_cost_forecast\":[0.1915, 0.2021, 0.2141, 0.21, 0.2094, 0.2094, 0.2146, 0.2242, 0.2232, 0.2254, 0.221, 0.2264, 0.2192, 0.191, 0.1769, 0.1717, 0.1717, 0.1741, 0.1772, 0.1691, 0.1717, 0.1717, 0.168, 0.1671]
          }' http://localhost:5000/action/dayahead-optim"

But I get the following error from EMHASS, as it seems to expect 48.

 ERROR in utils: ERROR: The passed data is either not a list or the length is not correct, length should be 48
[2023-01-13 19:12:43,303] ERROR in utils: Passed type is <class 'list'> and length is 48

I might be wrong, but I suspect that you need to adjust one of the configuration parameters in EMHASS called

optimization_time_step

to 60. The default is 30 minutes, which is why EMHASS expects 48 values.
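The expected list length is simply the 24-hour optimization horizon divided by the time step, which is quick to sanity-check:

```shell
# Expected number of forecast values = optimization horizon / time step
horizon_minutes=$((24 * 60))
values_30=$((horizon_minutes / 30))   # default 30-minute step
values_60=$((horizon_minutes / 60))   # hourly step
echo "30-minute steps: $values_30 values"
echo "60-minute steps: $values_60 values"
```

So with optimization_time_step set to 60, a list of 24 hourly prices should line up.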

I actually no longer use data from Tibber, since I've found their integration too unstable to rely on for EMHASS. I use data directly from Nordpool via the custom-components/nordpool integration on GitHub, which pulls energy prices into Home Assistant. I add the following under sensor: in my sensors.yaml file, which is included from my configuration.yaml file.

# Nordpool
- platform: nordpool
  VAT: True
  price_in_cents: false
  region: "SE3"
  precision: 3
  additional_costs: "{{0.0|float}}"

I use the data from Nordpool to create two template sensors that reflect the total price for consumption and production, including network charges for transport over the grid. Note that these sensors offer the next 24 forecast values starting from the next hour, combining today's prices with the next-day forecast. This allows me to optimize production at 21:57 every evening, which I've found to be more powerful than simply sending tomorrow's 24 forecast values right before midnight.

I've also changed my EMHASS configuration so that my energy storage aims to be depleted (0.15) instead of 0.60 at the end of each cycle. With the price variations we typically see over a 24 h period I find this more rational, but you can also choose to optimize right before midnight if you find that it works better for you.

- name: "Electricity consumption price 24h forecast"
  unique_id: electricity_consumption_price_24h_forecast
  state: >-
    {% set data = state_attr('sensor.nordpool_kwh_se3_sek_3_10_025', 'raw_today') | map(attribute='value') | list %}
    {% set values = namespace(all=[]) %}
    {% for i in range(now().hour+1,data | length) %}
      {% set v = ((data[i] | float(0) + 1.25*0.0875 + 0.75) | round(4)) %}
      {% set values.all = values.all + [ v ] %}
    {% endfor %}
    {% set data = state_attr('sensor.nordpool_kwh_se3_sek_3_10_025', 'raw_tomorrow') | map(attribute='value') | list %}
    {% for i in range(data | length) %}
      {% set v = ((data[i] | float(0) + 1.25*0.0875 + 0.75) | round(4)) %}
      {% set values.all = values.all + [ v ] %}
    {% endfor %} {{ (values.all)[:24] }}
  availability: >
    {{states('sensor.nordpool_kwh_se3_sek_3_10_025') not in ['unknown','unavailable']}}

- name: "Electricity production price 24h forecast"
  unique_id: electricity_production_price_24h_forecast
  state: >-
    {% set data = state_attr('sensor.nordpool_kwh_se3_sek_3_10_025', 'raw_today') | map(attribute='value') | list %}
    {% set values = namespace(all=[]) %}
    {% for i in range(now().hour+1,data | length) %}
      {% set v = ((data[i] | float(0)/1.25 + 0.0725 + 0.6*0.555) | round(4)) %}
      {% set values.all = values.all + [ v ] %}
    {% endfor %}
    {% set data = state_attr('sensor.nordpool_kwh_se3_sek_3_10_025', 'raw_tomorrow') | map(attribute='value') | list %}
    {% for i in range(data | length) %}
      {% set v = ((data[i] | float(0)/1.25 + 0.0725 + 0.6*0.555) | round(4)) %}
      {% set values.all = values.all + [ v ] %}
    {% endfor %} {{ (values.all)[:24] }}
  availability: >
    {{states('sensor.nordpool_kwh_se3_sek_3_10_025') not in ['unknown','unavailable']}}
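To make the arithmetic in the consumption template above concrete, here is a quick check of the formula with a made-up spot price of 0.20 SEK/kWh (the 1.25*0.0875 and 0.75 constants come from the template; the spot price is just an example value):

```shell
# Sanity check of the consumption-price formula from the template above,
# using a made-up spot price of 0.20 SEK/kWh
spot=0.20
awk -v s="$spot" 'BEGIN { printf "total = %.4f\n", s + 1.25*0.0875 + 0.75 }'
```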

My shell_commands.yaml file, included via

shell_command: !include shell_commands.yaml

in the configuration.yaml file, has an entry as below:

24h_ahead_optim:
  'curl -i -H ''Content-Type: application/json'' -X POST -d ''{"load_cost_forecast":{{(
  states.sensor.electricity_consumption_price_24h_forecast.state)
  }},"prod_price_forecast":{{(
  states.sensor.electricity_production_price_24h_forecast.state)}}}'' http://localhost:5000/action/dayahead-optim'

I optimize using an automation as below:

alias: EMHASS 24h-ahead optimization
description: ""
trigger:
  - platform: time
    at: "21:57:00"
condition: []
action:
  - service: shell_command.24h_ahead_optim
    data: {}
mode: single

and publish using:

alias: EMHASS publish data
description: ""
trigger:
  - platform: time_pattern
    minutes: "0"
    seconds: "0"
condition: []
action:
  - service: shell_command.publish_data
    data: {}
mode: single

I hope this sorts out your problem!

Cheers, Per


I don't have a Synology, so I don't know. If I understand correctly it is a special Linux distro, so things may be different. What is your system architecture?
Otherwise, to avoid the need for the make command, you can build your image directly with the commands inside the Makefile and see if that works.
Are you able to run any other Docker images on that system? Do a first try with hello-world.

It is a NAS and it runs Synology DSM, which is a Linux-kernel-based OS, but from what I know there is not a lot known about it publicly.
I can run any containers found on Docker Hub without problems, and I can also add other repositories via URL.


After an image is downloaded, I can run it via the GUI and while doing so define things like port, TZ and so on.

Doing anything like "make" can only be done either via ssh or on a different machine, then putting the generated image (I'm not even sure what to call this; I guess the resulting tar is what needs to be uploaded) onto the NAS and running it from there.
Would it be possible to add the EMHASS repository, pull the standard image, and afterwards add the edited config files?

Currently I have Node-Red, ESPHome, Home Assistant and zwavejs running as containers.

Thanks

I'm really happy with this plan from EMHASS today.

Unusually, today I have very little variation between maximum and minimum prices during the day, so EMHASS isn't trying to madly charge my battery to 100%, and there's a nice balance between charging my EV (green) and running my air conditioning (red).

Hi @davidusb,

Firstly, many thanks for making this addon!

I'm eager to get involved, but I'm having a little difficulty and could do with some assistance.

I'm running the latest versions of Home Assistant and EMHASS on a Raspberry Pi 4.

Home Assistant 2023.1.4
Supervisor 2022.12.1
Operating System 9.4
Frontend 20230110.0 - latest
EMHASS: 0.2.23

EMHASS is configured like so

hass_url: http://10.0.0.15:8123
long_lived_token: [redacted]
costfun: profit
optimization_time_step: 30
historic_days_to_retrieve: 2
method_ts_round: nearest
set_total_pv_sell: false
lp_solver: COIN_CMD
lp_solver_path: /usr/bin/cbc
sensor_power_photovoltaics: sensor.pv_power_w
sensor_power_load_no_var_loads: sensor.power_load_no_var_loads

In Home Assistant, I have created the pv_power_w and power_load_no_var_loads sensors and verified they have valid data. I can also retrieve this data via a call to the API like this

http://10.0.0.15:8123/api/history/period/2023-01-14?filter_entity_id=sensor.power_load_no_var_loads&minimal_response&significant_changes_only

The EMHASS web server is up and running and I can see the default data that comes with EMHASS. However, when running either optimisation I receive the following error.

[2023-01-14 15:46:52,268] INFO in web_server: EMHASS server online, serving index.html...
[2023-01-14 15:47:08,683] INFO in command_line: Setting up needed data
[2023-01-14 15:47:08,762] INFO in retrieve_hass: Retrieve hass get data method initiated...
[2023-01-14 15:47:08,766] ERROR in app: Exception on /action/perfect-optim [POST]
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/dist-packages/pandas/core/indexes/base.py", line 3803, in get_loc
    return self._engine.get_loc(casted_key)
  File "pandas/_libs/index.pyx", line 138, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/index.pyx", line 165, in pandas._libs.index.IndexEngine.get_loc
  File "pandas/_libs/hashtable_class_helper.pxi", line 5745, in pandas._libs.hashtable.PyObjectHashTable.get_item
  File "pandas/_libs/hashtable_class_helper.pxi", line 5753, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'sensor.power_load_no_var_loads'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/dist-packages/flask/app.py", line 2525, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.9/dist-packages/flask/app.py", line 1822, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.9/dist-packages/flask/app.py", line 1820, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.9/dist-packages/flask/app.py", line 1796, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "/usr/local/lib/python3.9/dist-packages/emhass/web_server.py", line 134, in action_call
    input_data_dict = set_input_data_dict(config_path, str(config_path.parent), costfun,
  File "/usr/local/lib/python3.9/dist-packages/emhass/command_line.py", line 68, in set_input_data_dict
    rh.prepare_data(retrieve_hass_conf['var_load'], load_negative = retrieve_hass_conf['load_negative'],
  File "/usr/local/lib/python3.9/dist-packages/emhass/retrieve_hass.py", line 177, in prepare_data
    self.df_final[var_load+'_positive'] = self.df_final[var_load]
  File "/usr/local/lib/python3.9/dist-packages/pandas/core/frame.py", line 3804, in __getitem__
    indexer = self.columns.get_loc(key)
  File "/usr/local/lib/python3.9/dist-packages/pandas/core/indexes/base.py", line 3805, in get_loc
    raise KeyError(key) from err
KeyError: 'sensor.power_load_no_var_loads'

Any thoughts on where I could be going wrong?

Cheers

Hi, everything seems OK to me. What may happen sometimes is that you just need to wait for the history recorder to record enough data for the power_load_no_var_loads sensor. So try again when you have at least 48 h of data on that sensor.


That was the exact issue that I stumbled into as well in the beginning :wink:
I pulled my hair out for a bit before realising.

Thanks guys, I did see comments about this, so I waited a while for the data to come in. However, I'm still getting the same error with 9 days' worth of data in the sensor now.

Why can't we publish prices into EMHASS the same way as the no-var-loads and PV sensors, with date and time attached, instead of the way it's done today? It's kind of irritating when experimenting that all the values end up in the wrong place because they don't include date and time.


Hi @davidusb do you have any idea about this?

I once more tried loading the tar file and running the container, both via CLI and GUI.
Unfortunately, neither variant is working.

Would it be possible to get a compiled tar file of EMHASS? Maybe the make part is the issue on my side.

Thank you very much!

Hi Tom. Are you able to get a history plot of those 9 days for sensor.power_load_no_var_loads?

Hi. OK, just forget about the tar file. If you have Docker installed and are already running some containers, then the docker build command should work just fine. This will build your image, and then you can just docker run that image.

So ssh into your system and then go to the emhass repository and do something like this:

docker build -t emhass/core:v0.3.21 -f Dockerfile .

Then run this image as:

docker run -it --restart always -p 5000:5000 -e "LOCAL_COSTFUN=profit" -v $(pwd)/config_emhass.yaml:/app/config_emhass.yaml -v $(pwd)/secrets_emhass.yaml:/app/secrets_emhass.yaml --name DockerEMHASS emhass/core:v0.3.21

If that doesn't work, then what are the error messages?

As a last resort I can push a built image to Docker Hub, but it will be architecture dependent, so it is not the best option.