Honeywell CH/DHW via RF - evohome, sundial, hometronics, chronotherm

Which zones are in multiroom mode?

Here is my config for Proxmox passthrough:
[screenshot: Proxmox USB passthrough settings]

When booting the VM, it can happen that HA misses the device, and a restart of HA is required before it finds it.

The device I specify:

serial_port: /dev/ttyUSB0

04 (Bedroom) and 05 (Hall Landing)

I had the USB3 checkmark on. I turned it off as in your screenshot, and now I see data slowly coming in.
For the serial port I use:
serial_port: /dev/serial/by-id/usb-Texas_Instruments_TUSB3410_Boot_Device_TUSB3410-if00-port0
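
For reference, a minimal sketch of how that path sits under ramses_cc in configuration.yaml (all other options omitted):

ramses_cc:
  serial_port: /dev/serial/by-id/usb-Texas_Instruments_TUSB3410_Boot_Device_TUSB3410-if00-port0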

@Lloyd Hmmm… Something is going on, isn’t there? I presumed it worked well before 0.20.x?

I see two issues - one would normally make up for the other, but both features are broken in your system… Can’t see why, yet…

I need a packet log please - the answer isn’t in your ramses_cc file.

Yes, everything was working fine before 0.20.x. And everything apart from those two temperatures appears to be working now. I’ll email a packet log shortly.

OK, here is your answer:

  1. version 0.20.x of ramses_rf is much more efficient with the number of packets it sends per unit time - previously, it sent far too many
  2. because of 1., the system does not send an RQ to learn the zone temp; it simply leverages the temperature packet of the periodic sync_cycle set (see below), which is sent every 3-5 minutes:
2022-07-28T10:29:14.453783 || CTL:123456 |            |  I | system_sync      |      || {'remaining_seconds': 176.0, '_next_sync': '10:32:10'}
2022-07-28T10:29:14.478621 || CTL:123456 |            |  I | setpoint         | [..] || [{'zone_idx': '00', 'setpoint': 5.0}, {'zone_idx': '01', 'setpoint': 5.0}, {'zone_idx': '02', 'setpoint': 5.0}, {'zone_idx': '03', 'setpoint': 5.0}, {'zone_idx': '04', 'setpoint': 5.0}, {'zone_idx': '05', 'setpoint': 5.0}, {'zone_idx': '06', 'setpoint': 5.0}, {'zone_idx': '07', 'setpoint': 5.0}]
2022-07-28T10:29:14.496550 || CTL:123456 |            |  I | temperature      | [..] || [{'zone_idx': '00', 'temperature': 22.12}, {'zone_idx': '01', 'temperature': 19.89}, {'zone_idx': '02', 'temperature': 19.91}, {'zone_idx': '03', 'temperature': 19.49}, {'zone_idx': '06', 'temperature': 20.59}, {'zone_idx': '07', 'temperature': 20.17}]
  3. You’ll see that this packet does not include a temperature for zones 4 & 5.

So, I now understand the problem… the issue now is creating a fix.

That’s certainly a bit odd. Good luck with finding a fix for it.

Not odd at all - everything is working exactly as designed.

Hello fellow Evohome fans,

I have been using Evohome for many years via the cloud integration to HA, but I am now considering the move to this custom integration. Before doing so, I would like to know whether the following is possible:

  1. Hot water (DHW) boiler demand status (cloud solution provides this already), and

  2. Heating (HTG) boiler demand status (not available in cloud solution)

My query relates specifically to “boiler” demand, rather than zone demand through a thermostat or TRV. Effectively, I want to know when the boiler is actually running, i.e. when the BDR91 switches on, so I can report boiler runtime hours.

You haven’t said if your boiler is controlled by an opentherm bridge (modulated), or a BDR91 relay (TPI).

And I should make clear that there are multiple ways to implement CH/DHW, because you can have two additional relays on top of the appliance-control relay / OTB.

It is even possible to design a system that does not have an appliance control relay by using a DHW relay and a heating relay. I am not sure if this is what you mean when you say DHW and HTG?

Or do you have a combi boiler? In which case you probably won’t get the data you want without having an opentherm bridge…

So, more details might get you a more detailed answer…

The third complicating factor is: there is a distinction between a call for heat, and the pump running, and the flame being on.

In any case, with this integration you can get the demand for any relay, plus additional data on top of that, depending on your setup:
FC: heat source demand (for CH or DHW)
FA: heating relay
F9: DHW relay

Others are welcome to step in and share some of their sensor templates, but for example:
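
A minimal sketch of the kind of template sensor meant here, assuming ramses_cc exposes a heat-demand sensor for the relay in question; the entity ID below is hypothetical, so check Developer Tools → States for the actual name on your system:

template:
  - binary_sensor:
      - name: "Heat source demand"
        # hypothetical entity ID for the FC (heat source) demand sensor
        state: "{{ states('sensor.13_123456_relay_demand') | float(0) > 0 }}"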

Hi David, thank you for your reply and apologies for not describing my setup.

My configuration is a conventional S-plan with two-port valves for heating and hot water control. I have two BDR91 relays, one to control the valve for heating and the other to control the hot water valve for the storage cylinder. I am not using a “boiler relay” or “appliance control” relay, as Evohome now calls it in their latest firmware. Both heating and hot water demand are through the two-port valves (switched by BDR91 relays).

That said, I think you’ve answered my question anyhow.

I appreciate this is not 100% accurate, as it’s not a definitive way to know whether the boiler is running, the flame is burning, the pump is running, etc., but all I want to know is whether the green LED on the BDR91 is ON or OFF. So if either green LED is illuminated, I assume the boiler is running. Of course I will template this sensor logic in HA, but I just want to confirm these conditions are reported, which I think they are! :grinning:
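
A sketch of that logic, assuming ramses_cc creates an on/off entity for each BDR91; all entity IDs below are hypothetical placeholders to be replaced with the real ones:

template:
  - binary_sensor:
      - name: "Boiler running"
        # hypothetical entity IDs for the heating and DHW BDR91 relays
        state: >
          {{ is_state('binary_sensor.13_111111_active', 'on')
             or is_state('binary_sensor.13_222222_active', 'on') }}

sensor:
  - platform: history_stats
    name: "Boiler runtime today"
    entity_id: binary_sensor.boiler_running
    state: "on"
    type: time
    start: "{{ now().replace(hour=0, minute=0, second=0, microsecond=0) }}"
    end: "{{ now() }}"

The history_stats part gives a daily runtime figure in hours from the combined binary_sensor.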

Thank you for your contributions to HA.

I am marking 0.20.15 as a regular release.

Note: it has breaking changes from 0.19.x, 0.18.x - please do a backup before you migrate.

I have pre-released 0.20.18 - it includes some minor changes to configuration.yaml too.

Please report any bugs here.

I’ve updated, but I can’t see where I’m going wrong with the YAML config. Below is what I have, but I get this error. If I comment out (#) the orphans_hvac: line, it works?

Invalid config for [ramses_cc]: [Match('^[0-9]{2}:[0-9]{6}$', msg=None)] for dictionary value @ data['ramses_cc']['orphans_hvac']. Got ['32:172534', '32:168240', '32:172534', '32:123456', '30:079129']. (See /config/configuration.yaml, line 49).
  ramses_cc:
    orphans_hvac: [32:172534, 32:168240, 32:172534, 32:123456, 30:079129] 
    serial_port: /dev/ttyACM2
    packet_log: /config/packet_logs
    known_list:    
        10:051349: # Appliance Control
        01:169176: # main_controller
        18:135447: # Config Sensor
        04:190691: # Harry's Room
        04:198483: # Bedroom
        04:198487: # Spare Room
        04:112546: # Living Room 1
        04:198485: # Living Room 2
        04:038015: # Utility Room
        04:090189: # Kitchen
        32:166025: {class: CO2}
        32:123456: {class: CO2, faked: true}
        30:079129: {class: FAN}
        32:168240: {class: HUM}
        32:172534: {class: REM}
#       32:172522: {class: SWI}
        03:000001: # kitchen
        03:000002: # Living Room 1
        03:000004: # Harry's Room
        03:000005: # Master Bedroom
        03:000006: # Utility Room 
        03:000007: # Spare Room    
    ramses_rf:
      enforce_known_list: true
      enable_eavesdrop: false
    01:169176: #Evohome
      system:
        appliance_control: 10:051349 #OTB

This one is easy (I think) - you have 32:172534 twice in the list.
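
For clarity, the corrected line with the duplicate removed would read:

    orphans_hvac: [32:172534, 32:168240, 32:123456, 30:079129]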

Bugs: 0, Non-bugs: 1


Thanks! Sometimes you can’t see the woods for the trees.

Are these anything to worry about?

Logger: ramses_rf.processor
Source: runner.py:119
First occurred: 14:04:20 (51 occurrences)
Last logged: 14:14:01

RQ --- 18:135447 30:079129 --:------ 22F2 001 00 < Invalid code for 30:079129 (FAN) to Rx: 22F2
RQ --- 18:135447 30:079129 --:------ 22F4 001 00 < Invalid code for 30:079129 (FAN) to Rx: 22F4
RQ --- 18:135447 30:079129 --:------ 22F8 001 00 < Invalid code for 30:079129 (FAN) to Rx: 22F8
RQ --- 18:135447 30:079129 --:------ 313E 001 00 < Invalid code for 30:079129 (FAN) to Rx: 313E
RQ --- 18:135447 30:079129 --:------ 3222 001 00 < Invalid code for 30:079129 (FAN) to Rx: 3222

No… ignore them - I will remove them presently - do send me a packet log, though.

I have just migrated to 0.20.15 and initially had trouble with the following errors:

2022-08-07 22:10:12.048 ERROR (MainThread) [homeassistant.components.binary_sensor] Error while setting up ramses_cc platform for binary_sensor
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/helpers/entity_platform.py", line 281, in _async_setup_platform
    await asyncio.shield(task)
  File "/config/custom_components/ramses_cc/binary_sensor.py", line 58, in async_setup_platform
    new_sensors += [
  File "/config/custom_components/ramses_cc/binary_sensor.py", line 59, in <listcomp>
    entity_factory(broker, dev, k, **v)
  File "/config/custom_components/ramses_cc/binary_sensor.py", line 45, in entity_factory
    migrate_to_ramses_rf(hass, "binary_sensor", f"{device.id}-{attr}")
  File "/config/custom_components/ramses_cc/helpers.py", line 30, in migrate_to_ramses_rf
    if hass.states.get(entity_id).state == STATE_UNAVAILABLE:  # HACK
AttributeError: 'NoneType' object has no attribute 'state'
2022-08-07 22:10:12.051 ERROR (MainThread) [homeassistant.components.sensor] Error while setting up ramses_cc platform for sensor
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/helpers/entity_platform.py", line 281, in _async_setup_platform
    await asyncio.shield(task)
  File "/config/custom_components/ramses_cc/sensor.py", line 91, in async_setup_platform
    new_sensors = [
  File "/config/custom_components/ramses_cc/sensor.py", line 92, in <listcomp>
    entity_factory(broker, device, k, **v)
  File "/config/custom_components/ramses_cc/sensor.py", line 83, in entity_factory
    migrate_to_ramses_rf(hass, "sensor", f"{device.id}-{attr}")
  File "/config/custom_components/ramses_cc/helpers.py", line 30, in migrate_to_ramses_rf
    if hass.states.get(entity_id).state == STATE_UNAVAILABLE:  # HACK
AttributeError: 'NoneType' object has no attribute 'state'

This resulted in sensors and binary_sensors never appearing, though the climate entities did appear.

I’ve made the following small local change to ramses_cc/helpers.py which appears to have resolved the problem. Change line 30 from:

if hass.states.get(entity_id).state == STATE_UNAVAILABLE:  # HACK

to

if hass.states.get(entity_id) is not None and hass.states.get(entity_id).state == STATE_UNAVAILABLE:  # HACK

The above hack seems to have worked, but I’ll keep my eyes on it.

I am sorry - I had thought I’d fixed this - I have just pushed an update, 0.20.15a…


I’m getting a configuration.yaml error on 0.20.15a (basically a new install; I did not run this config on an older version).

“Got multiple values for keyword argument ‘orphans_hvac’”.

Logger: homeassistant.setup
Source: custom_components/ramses_cc/__init__.py:252
Integration: ramses_cc ([documentation](https://github.com/zxdavb/ramses_cc), [issues](https://github.com/zxdavb/ramses_cc/issues))
First occurred: 11:10:47 (1 occurrences)
Last logged: 11:10:47

Error during setup of component ramses_cc

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/setup.py", line 235, in _async_setup_component
    result = await task
  File "/config/custom_components/ramses_cc/__init__.py", line 92, in async_setup
    await broker.create_client()  # start with a merged/cached, config, null schema
  File "/config/custom_components/ramses_cc/__init__.py", line 252, in create_client
    self.client = Gateway(
TypeError: ramses_rf.gateway.Gateway() got multiple values for keyword argument 'orphans_hvac'

My configuration.yaml:

ramses_cc:
  serial_port: /dev/ttyUSB1
  orphans_hvac: [32:134446, 32:132403, 37:005302, 37:005608, 37:171685]

  packet_log:
    file_name: packet.log
    rotate_backups: 28

#  known_list:
#    32:134446: {class: FAN}               # WTW HRC400 confirmed
#    32:132403: {class: HVC}               # zone valve upstairs area
#    37:005302: {class: CO2}               # CO2 woonkamer
#    37:005608: {class: CO2}               # CO2 slaapkamer
#    37:171685: {class: DIS}               # RF15 display

  ramses_rf:
    enforce_known_list: true
    enable_eavesdrop: false

(And, yes, I’ve checked that there is no other ramses_cc part in the YAML). Removing the ramses_rf part does not help.

Also, packet.log shows an older version of ramses_rf running:

2022-08-08T11:12:56.632837 # ramses_rf 0.20.14

I removed ramses_cc from HACS and installed again, but that also does not help :frowning: