Weather forecast for Australia using BOM website

Found http://timetableapi.ptv.vic.gov.au/swagger/ui/index

When I use this page (after entering my devID and devKey) I get 403 errors for the session key that their own website creates. Do you get any errors?

Essentially, you can then just copy and paste the URL that the webpage generates and feed it into a REST sensor.


Code:

  - platform: rest
    resource: http://timetableapi.ptv.vic.gov.au/v3
    name: API
    value_template: "{{ value_json['departures'][0]['estimated_departure_utc'] }}"
Then I add this code to split the time value.
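(The splitting code didn't make it into the post; a sketch of what it might look like as a template sensor, assuming the REST sensor above ends up as sensor.api and that its state is a UTC timestamp string:)

  - platform: template
    sensors:
      next_departure:
        friendly_name: Next departure
        value_template: >-
          {{ as_timestamp(states('sensor.api')) | timestamp_custom('%H:%M') }}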

Thanks Gillrishi.

How did you create the session key?

I’m using the PTV website (and a few PHP scripts from GitHub) but am getting 403 errors saying the session key is incorrect.

Thanks for the help. PTV fixed my dev id and key, and so now it’s working.

I’m parsing the value in a single sensor. Pasting it here in case it helps someone else.

  - platform: rest
    resource: http://timetableapi.ptv.vic.gov.au/v3/departures/route_type/0/stop/.....
    name: Departure_1
    value_template: "{{ as_timestamp(value_json['departures'][0]['estimated_departure_utc']) | timestamp_custom('%H:%M')}}"

I know I am re-opening an older thread, but I have a couple of questions about the PTV setup.
Currently I have it working using the API, and it shows me the departure times from my local station in both directions (using a different REST sensor for each direction).

I have a couple of questions about the JSON filtering: I would like to include both the scheduled time and the estimated time.

Also, my link collects the next 5 departures, so I would like to somehow display all five; any advice on how to configure the templates to do that would be appreciated.
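Not a definitive answer, but one pattern that might work, sketched below. The sensor names are made up, and I'm assuming the response carries both a scheduled_departure_utc and an estimated_departure_utc field per departure (the thread only confirms the latter), so check against your own API output. The idea is to keep the whole departures list as an attribute on the REST sensor, then index into it with template sensors:

  - platform: rest
    resource: http://timetableapi.ptv.vic.gov.au/v3/departures/route_type/0/stop/.....
    name: Departures
    json_attributes:
      - departures
    value_template: >-
      {{ as_timestamp(value_json['departures'][0]['estimated_departure_utc'])
         | timestamp_custom('%H:%M') }}

  - platform: template
    sensors:
      departure_1_scheduled:
        value_template: >-
          {{ as_timestamp(state_attr('sensor.departures', 'departures')[0]['scheduled_departure_utc'])
             | timestamp_custom('%H:%M') }}
      departure_2:
        value_template: >-
          {{ as_timestamp(state_attr('sensor.departures', 'departures')[1]['estimated_departure_utc'])
             | timestamp_custom('%H:%M') }}

Repeating the template pattern with indexes 0 through 4 would give you all five departures.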

Thanks all…

Has the BOM scrape method stopped working for some people in the last day or two (or ever!)?

Mine just stopped working, and I’m wondering if they changed their code so it breaks for everyone, or if my IP has been identified as a ‘scraper’.


My BOM sensor is working… hasn’t missed a beat.

Same here, stopped working 2 days ago.
Gives the following error:

    scrape: Error on device update!
    Traceback (most recent call last):
      File "/usr/local/lib/python3.6/site-packages/homeassistant/helpers/entity_platform.py", line 251, in _async_add_entity
        await entity.async_device_update(warning=False)
      File "/usr/local/lib/python3.6/site-packages/homeassistant/helpers/entity.py", line 353, in async_device_update
        yield from self.hass.async_add_job(self.update)
      File "/usr/local/lib/python3.6/concurrent/futures/thread.py", line 56, in run
        result = self.fn(*self.args, **self.kwargs)
      File "/usr/local/lib/python3.6/site-packages/homeassistant/components/sensor/scrape.py", line 120, in update
        value = raw_data.select(self._select)[0].text
    IndexError: list index out of range

I get lots of scrape errors per day. Seems to be working though.

I added a few more scrape sensors from other sources and they work perfectly, but not the ones for bom.gov.au. I am just using 5 sensors to get basic data like min/max temperature and chance of rain; it worked for half a year and just stopped working a few days ago.

Wondering if they can blacklist my IP address

The same has happened to me… it just stopped. I thought it was maybe Pi-hole, but it isn’t even trying to connect. I wonder if it has been blocked from scraping…

I tried rolling back my Hass.io to a couple of previous snapshots where the scrape sensor definitely worked, with no luck. All the other scrape sensors are working fine. I also found this on the BOM website:

The use of any material on the Bureau websites obtained through use of automated or manual techniques including any form of scraping or hacking is prohibited.

So if you have a permanent public IP address, it probably won’t work again. Does anyone with the problem have a chance to change their public IP address to verify that? I have disabled my scrape sensors and will re-enable them in a week to check again.

It’s probably a good idea to modify the sensor to scrape less often; it captures data every 30 seconds by default, which most of us don’t need. Every 10–20 minutes should be enough.

Hopefully scan_interval works fine with that type of sensor; I’ll test it if it starts working again for me, e.g.:

  - platform: scrape
    resource: http://www.bom.gov.au/nsw/forecasts/sydney.shtml
    name: Temperature Max
    select: '.max'
    unit_of_measurement: '°C'
    scan_interval: 900

We really need a set of sensors that use their free public FTP resource.


And it’s gone. Now not working for me either.

I don’t think it’s banning by IP address, since I can still browse to it from my PC. It’s probably a WAF blocking on certain other request details, like the user agent.

This is the sort of data available from the freely available public resource. Seven days of forecast data all nicely wrapped up in xml tags:

ftp://ftp.bom.gov.au/anon/gen/fwo/IDT13600.xml

To find your forecast product ID use this search page: http://reg.bom.gov.au/catalogue/anon-ftp.shtml

I’ve been trying to work out how to scrape this but am not having any success. Anyone else want to have a go?
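For anyone else having a go, here's a rough Python sketch of reading one of those product files. The element and attribute names (forecast-period, air_temperature_maximum, precis) are my assumptions from eyeballing a sample file, not a documented schema, so verify them against the product you actually download:

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen  # urlopen can fetch ftp:// URLs too


def parse_forecast(xml_text):
    """Return a list of (start time, max temp, precis) tuples, one per day.

    Assumes each day lives in a <forecast-period> block containing an
    <element type="air_temperature_maximum"> and a <text type="precis">
    child -- check these names against your own product file.
    """
    root = ET.fromstring(xml_text)
    days = []
    for period in root.iter("forecast-period"):
        temp = period.findtext("element[@type='air_temperature_maximum']")
        precis = period.findtext("text[@type='precis']")
        if temp is not None:  # the first period is often today, with no max
            days.append((period.get("start-time-local"), temp, precis))
    return days


# Live fetch (IDT13600 is the example product above; substitute your own):
#   data = urlopen("ftp://ftp.bom.gov.au/anon/gen/fwo/IDT13600.xml").read()
#   for day in parse_forecast(data):
#       print(day)
```

From there each tuple could feed a template sensor, or the whole thing could run as a command_line sensor.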

Terms of use:

All products available via the anonymous FTP service are subject to the default terms of the Bureau’s copyright notice: you may download, use and copy that material for personal use, or use within your organisation but you may not supply that material to any other person or use it for any commercial purpose. Users intending to publish Bureau data should do so as Registered Users.


Mine is still working fine…

If you are blocked I suggest disabling scraping for a week or so and then re-enable with a longer scan interval, 2 or 3 hours.

I don’t see an option in the docs to change the scan interval…

It’s documented ‘centrally’ because it applies to every component that polls. Put the scan_interval option on each instance of your scrape sensor.