I’m thinking of writing a 7-day summary that uses mdi icons and the publicly available BoM FTP forecast data. It’s a lot easier to parse than the HTML.
It’s still on my list of things to do but don’t hold your breath. It’s a bit of a low priority compared to the many other projects I have on the go at the moment. I expect it will be a few months before I get to it.
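For anyone wanting to experiment in the meantime, one way to pull the FTP data into Home Assistant is a command_line sensor. This is only a sketch: the product file name below is a placeholder, not a real BoM product ID, and you would still need templating to cut the result down (entity states are capped at 255 characters).

```yaml
  - platform: command_line
    name: BoM Forecast Raw
    # Placeholder path: substitute the actual IDxxxxx product file for
    # your forecast area on the BoM anonymous FTP server.
    command: "curl -s ftp://ftp.bom.gov.au/anon/gen/fwo/IDXXXXX.xml"
    # Forecasts change slowly, so poll infrequently.
    scan_interval: 3600
```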
Code:
  - platform: rest
    resource: http://timetableapi.ptv.vic.gov.au/v3
    name: API
    value_template: "{{ value_json['departures'][0]['estimated_departure_utc'] }}"
Then I add this code to split the time value.
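The splitting step might look something like the template sensor below. This is a sketch, not the poster's actual code: the entity id sensor.api comes from the rest sensor above, and I'm assuming the API returns an ISO-format UTC timestamp.

```yaml
  - platform: template
    sensors:
      next_departure_time:
        friendly_name: "Next Departure"
        # Convert the UTC ISO timestamp from sensor.api into a local
        # HH:MM string using the built-in as_timestamp / timestamp_custom
        # template helpers.
        value_template: >-
          {{ as_timestamp(states('sensor.api')) | timestamp_custom('%H:%M') }}
```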
I know I am re-opening an older thread, but I have a couple of questions about the PTV setup
Currently I have got it working using the API, and it shows me the departure time from my local station in both directions (using a different rest sensor for each direction).
I have a couple of questions about using the json filter, I would like to include both the scheduled time as well as the estimated time.
Also, my link collects the next 5 departures, so I would like to somehow display all five; any advice on how to configure the JSON template to display them would be appreciated…
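Not an authoritative answer, but one approach is a second rest sensor for the scheduled time, plus a template that joins the first five departures into one string. The sensor names are made up, and the resource URL is just the base shown earlier; you would use your full departures query:

```yaml
  - platform: rest
    resource: http://timetableapi.ptv.vic.gov.au/v3
    name: Scheduled Departure
    # scheduled_departure_utc sits alongside estimated_departure_utc
    # in each entry of the departures list.
    value_template: "{{ value_json['departures'][0]['scheduled_departure_utc'] }}"

  - platform: rest
    resource: http://timetableapi.ptv.vic.gov.au/v3
    name: Next Five Departures
    # Slice the first five departures and join their timestamps.
    # Keep an eye on length: entity states are capped at 255 characters.
    value_template: >-
      {{ value_json['departures'][:5]
         | map(attribute='estimated_departure_utc')
         | join(', ') }}
```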
Same here, stopped working 2 days ago.
Gives the following error:
scrape: Error on device update!
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/homeassistant/helpers/entity_platform.py", line 251, in _async_add_entity
    await entity.async_device_update(warning=False)
  File "/usr/local/lib/python3.6/site-packages/homeassistant/helpers/entity.py", line 353, in async_device_update
    yield from self.hass.async_add_job(self.update)
  File "/usr/local/lib/python3.6/concurrent/futures/thread.py", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.6/site-packages/homeassistant/components/sensor/scrape.py", line 120, in update
    value = raw_data.select(self._select)[0].text
IndexError: list index out of range
I added a few more scrape sensors from other sources and they work perfectly, but not the ones from bom.gov.au. I am just using 5 sensors to get basic data like min/max temperature and chance of rain; it worked for half a year and just stopped a few days ago.
The same has happened to me… It just stopped. I thought it was maybe Pi-hole, but it isn’t even trying to connect. I wonder if it has been blocked from scraping…
I tried rolling my hassio install back to a couple of previous snapshots where the scrape sensor definitely worked, with no luck. All the other scrape sensors are working fine. I also found this on the BoM website:
The use of any material on the Bureau websites obtained through use of automated or manual techniques including any form of scraping or hacking is prohibited.
So if you have a permanent public IP address, it probably won’t work again. Could anyone with this problem change their public IP address to verify that? I have disabled my scrape sensors and will re-enable them in a week to check again.
It’s probably a good idea to modify the sensor to scrape less often; it captures data every 30 seconds, which most of us don’t need. Every 10–20 minutes should be enough.
Hopefully scan_interval works fine with that type of sensor; I will test it if it starts working again for me, e.g.:
  - platform: scrape
    resource: http://www.bom.gov.au/nsw/forecasts/sydney.shtml
    name: Temperature Max
    select: '.max'
    unit_of_measurement: '°C'
    scan_interval: 900