Ha!
I'm happy to see someone reply to this. I ended up solving it in a slightly different way and forgot to post my solution.
I also found the Ontario Energy Board site and ended up scraping it too, just in a different way.
I wanted three sensors that tell me the off-peak, mid-peak and on-peak prices at all times so I can use that information for my utility_meter. With the new Energy dashboard, I think I also need the three separate price sensors, although I'm not sure when it calculates the daily total.
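For reference, the utility_meter side of this is just a meter with one tariff per rate period, roughly like the sketch below (sensor.home_energy is only a placeholder for whatever consumption sensor you track):
utility_meter:
  daily_energy:
    source: sensor.home_energy   # placeholder for your own consumption sensor
    cycle: daily
    tariffs:
      - offpeak
      - midpeak
      - onpeak
Each tariff then gets its own sensor (sensor.daily_energy_offpeak and so on), and those are what get paired with the price sensors below.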
Regardless, maybe you can take a look at my code and let me know if it makes sense to you.
To have all the prices available at all times, I scrape them from the legend on the Alectra website, which is still visible even without using JavaScript:
# This finds the legend to the left of the clock and reports all three prices at once.
- platform: scrape
  resource: https://alectrautilities.com/time-use-pricing
  name: Alectra Scrape Pricing
  select: ".peak-legend"
  scan_interval: 3600 # Request every hour
This gives me the following value:
$0.082 per kWh - Off-Peak $0.113 per kWh - Mid-Peak $0.170 per kWh - On-Peak
I then map each of these values to a separate sensor. I was thinking of using one sensor with 3 attributes, but I'm not sure the Energy dashboard supports attributes for pricing. Someone correct me if I'm wrong.
- platform: template
  sensors:
    energy_price_offpeak:
      friendly_name: "Off-peak Price"
      value_template: "{{ states('sensor.alectra_scrape_pricing') | regex_findall_index(find='[0]....', index=0, ignorecase=False) }}"
      icon_template: mdi:speedometer-slow
      unit_of_measurement: "$"
    energy_price_midpeak:
      friendly_name: "Mid-peak Price"
      value_template: "{{ states('sensor.alectra_scrape_pricing') | regex_findall_index(find='[0]....', index=1, ignorecase=False) }}"
      icon_template: mdi:speedometer-medium
      unit_of_measurement: "$"
    energy_price_onpeak:
      friendly_name: "On-peak Price"
      value_template: "{{ states('sensor.alectra_scrape_pricing') | regex_findall_index(find='[0]....', index=2, ignorecase=False) }}"
      icon_template: mdi:speedometer
      unit_of_measurement: "$"
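If you want to sanity-check the regex before wiring it up, you can paste something like this into Developer Tools → Template (the string is just the example legend text from above; '[0]....' grabs a literal 0 plus the next four characters):
{# Renders 0.113 - swap index for 0 (off-peak) or 2 (on-peak) #}
{{ '$0.082 per kWh - Off-Peak $0.113 per kWh - Mid-Peak $0.170 per kWh - On-Peak'
   | regex_findall_index(find='[0]....', index=1, ignorecase=False) }}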
Now I want to know which peak period is currently active - this changes depending on the day of the week and on holidays. It could be hard-coded into my utility_meter, but it can also change on a whim, so I prefer scraping it from the Ontario Energy Board website:
# Scrapes the currently active peak period, in case of holidays or seasonal changes.
- platform: scrape
  resource: https://www.oeb.ca/rates-and-your-bill/electricity-rates
  name: Alectra Scrape Current Peak
  select: '.off-peakactive, .mid-peakactive, .on-peakactive'
  value_template: '{{ value | regex_findall_index(find="Mid-peak|On-peak|Off-peak", index=0, ignorecase=True) }}'
  scan_interval: 3600
This will return the current peak name, as you also found out! So the value will be either Off-peak, Mid-peak or On-peak.
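Since the utility_meter tariff doesn't switch itself, the scraped peak sensor can also drive it. This is only a rough sketch against the hypothetical daily_energy meter from the earlier sketch - depending on your HA version the call is either the utility_meter.select_tariff service or select.select_option on the select entity the meter creates - and it assumes the tariff names are just the sensor state lowercased with the hyphen removed:
automation:
  - alias: "Follow Alectra peak period"
    trigger:
      - platform: state
        entity_id: sensor.alectra_scrape_current_peak
    condition:
      # Ignore unknown/unavailable states from a failed scrape
      - condition: template
        value_template: "{{ trigger.to_state.state in ['Off-peak', 'Mid-peak', 'On-peak'] }}"
    action:
      - service: utility_meter.select_tariff
        target:
          entity_id: utility_meter.daily_energy
        data:
          # "Off-peak" becomes "offpeak", matching the tariff names in the sketch above
          tariff: "{{ states('sensor.alectra_scrape_current_peak') | lower | replace('-', '') }}"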
Once I have my three pricing sensors and know what the current peak is, I combine them all into one sensor:
# Picks whichever of the three price sensors matches the current peak. This block goes
# under the same "- platform: template" / "sensors:" section as the price sensors above.
energy_price_current:
  friendly_name: "Current peak Price"
  value_template: >-
    {% if is_state('sensor.alectra_scrape_current_peak', 'Off-peak') %}
      {{ states('sensor.energy_price_offpeak') }}
    {% elif is_state('sensor.alectra_scrape_current_peak', 'Mid-peak') %}
      {{ states('sensor.energy_price_midpeak') }}
    {% elif is_state('sensor.alectra_scrape_current_peak', 'On-peak') %}
      {{ states('sensor.energy_price_onpeak') }}
    {% else %}
      0
    {% endif %}
  icon_template: >-
    {% if is_state('sensor.alectra_scrape_current_peak', 'Off-peak') %}
      mdi:speedometer-slow
    {% elif is_state('sensor.alectra_scrape_current_peak', 'Mid-peak') %}
      mdi:speedometer-medium
    {% elif is_state('sensor.alectra_scrape_current_peak', 'On-peak') %}
      mdi:speedometer
    {% else %}
      mdi:exclamation-thick
    {% endif %}
  unit_of_measurement: "$"
I think this last sensor is where I could make use of your proposed sensor - I didn't know I could extract the pricing the way you did. Do you think there's anything else I can do to reduce the amount of scraping I'm doing?
Here is the final result:
Thanks for the reply @kdubb!
If anyone else is looking at this, feel free to message me if you need help.