Bin / Waste Collection

That sucks.

I had a look at their website; there is a “get alerts” option. What does the alert look like?

Could you set up a server to receive the alert and parse it?

These get sent in the form of a non-standard formatted email for exceptions, so they can’t really be used, especially since nearly every time they mess up the month or year of the exception in their email :face_with_raised_eyebrow:
Regular bin is every Friday for me, recycling is every other Friday, so it was easy to set up a recurring item on GCal and then manually update the exceptions :slight_smile:

My community uses an app called My Waste that seems to serve a number of communities: My Waste by Municipal Media Inc. https://itunes.apple.com/us/app/my-waste/id493621739?mt=8

This looks interesting. I might have a look at my local council again and see if I can adapt a similar approach. From memory you have to put in your address and it gives you the next date for each bin.

Cool, do it and post it here when you work it out for your council.

A tip: work out the POST URL for your address and use that as your scrape/curl URL.

Yeah, that’s the thing. Last time I checked, the URL didn’t load the data if requested directly. On the previous screen you have 3 drop-down menus: suburb, street, and street number. Then you submit and it gives you the info you need.

I need to look into it some more. Maybe there is some extra data submitted or something else.

I’ve just finished doing a bit of investigation into my council and their website. They seem to be using Zen Cart, as there are options for additional services for each bin (missed collection, extra collection, replace bin, repair bin, etc.).

The issue I’ve come across is that a cookie is stored after selecting your address, and this cookie seems to expire after a short period of time. Maybe an hour?

So to get any information, I need to fill out the form on the previous page, submit that form, and then I can scrape what I need.

Now, I could probably use the referer URL in the headers, which contains the values of the form, but I have no idea how to work with that.

Can you provide the URL and a real address (not yours)?

Go here and you can pick any address. Every address in the council area is listed in a series of drop-down menus. Pick a suburb and a drop-down of street names appears; pick a street and then you get street numbers.

https://shellharbourwaste.net.au/index.php?main_page=vrp_advanced_search&menu=main

Take note of the URL as you pick different options though, it changes as you work your way through the different drop down menus. From the looks of it, the URL after you have picked everything is then passed on and used to get the details for waste collection.
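As a sketch of how those parameters accumulate, the URL can be built up step by step. The parameter names `suburbs_id`, `street` and `product_id` come from the site’s own URLs; the helper function itself is just an illustration:

```python
from urllib.parse import urlencode

BASE = "https://shellharbourwaste.net.au/index.php"

def search_url(suburbs_id=None, street=None, product_id=None):
    """Build the advanced-search URL, appending each parameter as it is chosen."""
    params = {"main_page": "vrp_advanced_search"}
    if suburbs_id is not None:
        params["suburbs_id"] = suburbs_id
    if street is not None:
        params["street"] = street  # urlencode turns "AITKEN CL" into AITKEN+CL
    if product_id is not None:
        params["product_id"] = product_id
    return BASE + "?" + urlencode(params)

print(search_url(suburbs_id=1, street="AITKEN CL", product_id=265))
```

`urlencode` also takes care of the `+` encoding for spaces in street names, which matches the URLs the site generates.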

Steps to get the Scrape URL

  1. Open Chrome
  2. Press F12 to activate developer console
  3. Click Console in the developer console
  4. Navigate to https://shellharbourwaste.net.au/index.php?main_page=vrp_advanced_search&menu=main
  5. Select your address - you will see POST URLs start to appear
    For my example I picked 1 AITKEN CL ALBION PARK

(Ignoring the errors) these are the URLs that came up for me:

Navigated to https://shellharbourwaste.net.au/index.php?main_page=vrp_advanced_search&suburbs_id=1

Navigated to https://shellharbourwaste.net.au/index.php?main_page=vrp_advanced_search&suburbs_id=1&street=AITKEN+CL

Navigated to https://shellharbourwaste.net.au/index.php?main_page=vrp_advanced_search&suburbs_id=1&street=AITKEN+CL&product_id=265

Navigated to https://shellharbourwaste.net.au/index.php?main_page=product_vrp_info&cPath=53_1&products_id=265

I then went to https://shellharbourwaste.net.au/index.php?main_page=vrp_advanced_search&suburbs_id=1&street=AITKEN+CL&product_id=265 directly and it created this URL for me - which should be scrapeable:

https://shellharbourwaste.net.au/index.php?main_page=product_vrp_info&cPath=53_1&products_id=265&zenid=ucdpdh1q8jcvsh5egcihq7a7t0

You could try the same - if the zenid expires (maybe it never does?) it might be an issue.


Yeah, that’s as far as I’ve got. The zenid expires and the page is blank under the address. Even if I click on the link with the zenid above, it’s still blank.

BUT… if I click on the link above that, it has the form pre-filled and has the next button and clicking on that works.

Is there a way to load up a pre-filled form in the scraper and have it click on Next, and then it can scrape what I need?

On further investigation it submits a form - have a look at

or

@cjsimmons try this script - it’s Python (run it from bash).

I tried everything!! Only the below worked:

#!/usr/bin/env python3

import requests
import time

s = requests.Session()
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.87 Safari/537.36'
}

# First request: hitting the search URL sets the zenid session cookie
ra = s.get('https://shellharbourwaste.net.au/index.php?main_page=vrp_advanced_search&suburbs_id=1&street=AITKEN+CL&product_id=265', headers=headers)
zencode = ra.cookies['zenid']
print(zencode)
print('https://shellharbourwaste.net.au/index.php?zenid=' + zencode + '&main_page=vrp_advanced_search_result&suburbs_id=1&street=AITKEN+CL&product_id=265')
time.sleep(5)

# Second request: fetch the results page using that fresh zenid
r = s.get('https://shellharbourwaste.net.au/index.php?zenid=' + zencode + '&main_page=vrp_advanced_search_result&suburbs_id=1&street=AITKEN+CL&product_id=265', headers=headers)

print(r.text)
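Once you have `r.text`, you could pull the collection dates out of the HTML. The real page structure isn’t shown here, so this is just a hedged sketch that regex-matches `dd/mm/yyyy` dates (the `sample` string is made up); adapt it once you see the actual markup:

```python
import re
from datetime import datetime

def extract_dates(html):
    """Return sorted datetime objects for every dd/mm/yyyy date found in the page."""
    found = re.findall(r"\b(\d{1,2}/\d{1,2}/\d{4})\b", html)
    dates = []
    for text in found:
        try:
            dates.append(datetime.strptime(text, "%d/%m/%Y"))
        except ValueError:
            pass  # skip near-misses like 31/02/2018
    return sorted(dates)

# Hypothetical snippet of what the results page might contain
sample = "<td>Next collection: 13/07/2018</td><td>Recycling: 20/07/2018</td>"
print([d.strftime("%d/%m/%Y") for d in extract_dates(sample)])
```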

Nice to see that there are a lot of roads to Rome.

I did take a different approach, but some of the things I use might be helpful/inspiring to others here.

My approach:
We only get a piece of paper with the dates and the bins collected.
A few years ago I put that in a CSV file, because the only thing that changes is the dates.
That CSV file I imported into a Google calendar.
After I started with HA I also got that calendar into HA.

Now the part that can be helpful:
I use an automation (in my case an AppDaemon app) to look at the next time a waste bin is collected.
On the day that it must go out, I let TTS speak a message every hour until someone flips an input boolean telling HA that the waste is done.
Of course you can also use any kind of notify for that.

I put that on a dashboard and now we never forget it any more.


You are perfectly correct, of course. Something like a Google calendar where you can set a recurring appointment (a fortnightly cycle in my case) will be much quicker than hacking my local council’s API. The effort involved, for the low number of people using Home Assistant within my city (pop. 400,000), is possibly not worth it, except for the learning. And I might do it for that: the council has an Android and iOS reminder app that I have done a network trace on, and a nice JSON response is generated, so it is eminently doable.

However, until we get a worldwide rubbish API (like we have for weather), it is really only worth it for the intellectual exercise of learning Python requests.

Along with the other Aussies in this thread, I’m thinking ours would be pretty simple.
We have a set day based on location.
Red (Rubbish) is weekly; Yellow (Recycling) and Green (Garden) alternate weeks.

My council uses a bit of JS on their website to display it, based on whether the week number is odd (yellow) or even (green).

Without knowing what I’m doing, I figure a Python script that sets the value of a sensor using a calculation on the week number would be the easiest?
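The week-number calc could be as simple as this (a sketch; the colour-to-parity mapping is an assumption to swap around for your own schedule):

```python
from datetime import date

def bin_this_week(today=None):
    """Return which alternating bin goes out, based on ISO week parity."""
    today = today or date.today()
    week = today.isocalendar()[1]
    # Assumed mapping: odd weeks -> yellow, even weeks -> green; swap to suit
    return "Yellow (Recycling)" if week % 2 else "Green (Garden)"

print(bin_this_week(date(2018, 7, 9)))  # 2018-07-09 is ISO week 28 (even)
```

A command_line sensor could then call this script and expose the result in HA.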

KablammoNick,

My council publishes a pdf calendar. It’s pretty basic.
I use an automation and flip the value of an input_select.

- alias: UpdateBins
  trigger:
    platform: time
    at: '03:00:00'
  condition:
    condition: time
    weekday:
      - sat
  action:
    - service: input_select.select_option
      entity_id: input_select.bins
      data_template:
        option: >
          {% if is_state('input_select.bins', 'Recycling') %}
            Green Waste
          {% else %}
            Recycling
          {% endif %}


I’ve done something similar. It’s not as simple as just toggling alternate weeks, because if your bin is collected on certain public holidays, the date moves, and to be honest, at least in the UK it’s not always obvious how they reach the decisions as to where to move them :smiley:

I used Node-Red to create a json sensor for HA.

I have a flow which:

  • Walks the council website (as everyone else is doing) as best as I can to get to the bin details page
  • Extracts the data from the resulting page
  • Works out collection type and days
  • If the data isn’t there (the council website didn’t work for the best part of 5 weeks after a schedule change!), defaults in a guess based on the alternate-weeks schedule but highlights in the result that it’s a guess
  • Caches the results

The sensor flow then:

  • First checks if the data in the cache exists and is less than 24 hrs old
  • If it is, it returns the result from the cache, to save hitting the council website many times an hour/day for no real reason
  • If it isn’t, it’ll run the flow to get the bin information and cache it

It looks like: [flow screenshot]

And in my HA Panel, it shows as: [dashboard screenshot]
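For anyone without Node-RED, the 24-hour cache logic described above could be sketched in plain Python (`fetch_bin_data` and the cache file path are hypothetical stand-ins for the council-website flow):

```python
import json
import os
import time

CACHE_FILE = "bin_cache.json"  # hypothetical cache location
MAX_AGE = 24 * 60 * 60         # 24 hours, in seconds

def get_bin_data(fetch_bin_data):
    """Return cached bin data if it is fresh, otherwise re-fetch and re-cache it."""
    if os.path.exists(CACHE_FILE):
        age = time.time() - os.path.getmtime(CACHE_FILE)
        if age < MAX_AGE:
            with open(CACHE_FILE) as f:
                return json.load(f)  # cache hit: skip the council website
    data = fetch_bin_data()          # cache miss: hit the council website
    with open(CACHE_FILE, "w") as f:
        json.dump(data, f)
    return data

# Demo with a stand-in fetcher; a real one would scrape the council site
if os.path.exists(CACHE_FILE):
    os.remove(CACHE_FILE)
print(get_bin_data(lambda: {"next": "Friday", "type": "Recycling"}))
```

The guess-fallback from the flow above would slot in around `fetch_bin_data()`: catch the failure, return the alternate-weeks estimate, and flag it as a guess.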


My council has an iOS/Android app that alerts you on bin day and what bins are to go out, and another one to remind you to bring them in. Clicking the ‘refresh collections’ button and sniffing the traffic reveals an API-type call to a website which sends your address and gets a nice JSON-formatted message back. BUT I have not figured out the authentication yet.


I need to learn Node-RED. I do all that working out in my head and script it.

Node-RED might take longer, but it’s so readable.