Bin / Waste Collection

Has anyone tried this for Hounslow council in London? https://www.hounslow.gov.uk/homepage/86/recycling_and_waste_collection_day_finder#collectionday

Thanks!

1 Like

Hi,

I recognised the structure of the JSON file, so your municipality also uses https://2go-mobile.com/referenties? This is also used in, among others:

  • Ede
  • Renkum
  • Veenendaal
  • Wageningen
  • Almere
  • Daarle
  • Daarlerveen
  • Haarle
  • Hellendoorn
  • Nijverdal
  • Almelo
  • Borne
  • Enschede
  • Haaksbergen
  • Hengelo
  • Hof van Twente
  • Losser
  • Oldenzaal
  • Wierden
  • Coevorden / Zweeloo
  • Emmen
  • Hoogeveen

Did you find a way to parse it?

I currently have a very clumsy way of getting the dates (I’m a complete noob):

- platform: command_line
  name: "Fetch date"
  # First pickup date and type for each of the first five waste streams
  value_template: >-
    {{ value_json.dataList[0].pickupDates[0] }},{{ value_json.dataList[0]._pickupTypeText }} ;
    {{ value_json.dataList[1].pickupDates[0] }},{{ value_json.dataList[1]._pickupTypeText }} ;
    {{ value_json.dataList[2].pickupDates[0] }},{{ value_json.dataList[2]._pickupTypeText }} ;
    {{ value_json.dataList[3].pickupDates[0] }},{{ value_json.dataList[3]._pickupTypeText }} ;
    {{ value_json.dataList[4].pickupDates[0] }},{{ value_json.dataList[4]._pickupTypeText }}
  command: >-
    curl 'https://wasteapi.2go-mobile.com/api/GetCalendar'
    -H 'origin: https://afvalportaal.2go-mobile.com'
    -H 'accept-encoding: gzip, deflate, br'
    -H 'accept-language: nl-NL,nl;q=0.9,en-US;q=0.8,en;q=0.7'
    -H 'user-agent: Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36'
    -H 'content-type: application/json;charset=UTF-8'
    -H 'accept: application/json, text/plain, */*'
    -H 'referer: https://afvalportaal.2go-mobile.com/modules/REDACTED/kalender/calendar.html?REDACTED'
    -H 'authority: wasteapi.2go-mobile.com'
    -H 'dnt: 1'
    --data-binary '{"companyCode":"REDACTED","startDate":"2019-02-17","endDate":"2022-01-09","uniqueAddressID":"REDACTED"}'
    --compressed
  scan_interval: 14400

If there is anyone with a better solution, please let me know.
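
A RESTful sensor might be a cleaner way to do the same thing, since it can send the POST itself and template the response directly. A sketch, untested, with the payload mirroring the curl call above:

- platform: rest
  resource: 'https://wasteapi.2go-mobile.com/api/GetCalendar'
  method: POST
  headers:
    Content-Type: 'application/json;charset=UTF-8'
  payload: '{"companyCode":"REDACTED","startDate":"2019-02-17","endDate":"2022-01-09","uniqueAddressID":"REDACTED"}'
  name: 'Next pickup'
  # State is just the first waste stream's next date and type
  value_template: '{{ value_json.dataList[0].pickupDates[0] }},{{ value_json.dataList[0]._pickupTypeText }}'
  scan_interval: 14400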

June 1st 2019:
Found a custom component made by @yniezink: https://github.com/yniezink/twentemilieu. You will have to adapt this to your municipality.

August 9th 2019:
A custom component for Ede, Renkum, Veenendaal and Wageningen can be found here: https://github.com/Cadsters/acv-hass-component

Can anyone help me figure out whether I can use my local service?
It is available here: https://varde-sb.renoweb.dk/Legacy/selvbetjening/mit_affald.aspx

From there you enter an address in the field and search. The data I need is then in the field “Der er tilmeldt følgende materiel” (“The following equipment is registered”).

Now, I know zero about coding, but I pressed F12 in Chrome and figured out where the data I need comes from; this seems to be the endpoint:
https://varde-sb.renoweb.dk/Legacy/JService.asmx/GetAffaldsplanMateriel_mitAffald

Otherwise, it is actually pretty simple. General waste is picked up on Mondays in odd weeks, so the next pickup is on Monday 25/02 and then every 14 days from there. Recycling is every 4th week, also next on Monday 25/02.

Maybe it is easier to just make a dumb script of some sort for this?
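
If you do go the dumb-script route, a template sensor can compute the next date without touching the council site at all. A sketch, assuming the Monday 25/02/2019 anchor and the 14-day cycle described above:

- platform: template
  sensors:
    next_regular_pickup:
      friendly_name: 'Next regular pickup'
      value_template: >-
        {% set anchor = as_timestamp('2019-02-25 07:00:00') %}
        {% set period = 14 * 86400 %}
        {# Python-style modulo keeps this positive even with the anchor in the past #}
        {% set wait = (anchor - as_timestamp(now())) % period %}
        In {{ (wait / 86400) | round(0, 'ceil') | int }} days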

You need to figure out what your address ID is, then POST this as JSON to the URL,
e.g.

Posting
{"adrid":26275}

to the URL:
https://varde-sb.renoweb.dk/Legacy/JService.asmx/GetAffaldsplanTakst_mitAffald

would return:

 {
   "d": "{\"list\":[{\"id\":135274,\"navn\":\"140 L cont 14 dg helårsbolig\",\"modul\":{\"id\":1,\"name\":\"Dagrenovation\",\"shortname\":\"\"},\"antal\":\"1\",\"totalpris\":\"1.017,50 kr.\"},{\"id\":21757,\"navn\":\"Genbrugsordning bolig\",\"modul\":{\"id\":2,\"name\":\"Genbrug\",\"shortname\":\"\"},\"antal\":\"1\",\"totalpris\":\"1.480,00 kr.\"},{\"id\":94546,\"navn\":\"240 L genbrug 4. uge\",\"modul\":{\"id\":2,\"name\":\"Genbrug\",\"shortname\":\"\"},\"antal\":\"1\",\"totalpris\":\"0,00 kr.\"}],\"status\":{\"id\":0,\"status\":\"Ok\",\"msg\":\"\"}}"
 }

There’s potentially a few other calls made as well, but the language makes it a little difficult for me to understand :smiley:

It might be easier to use something like Node-RED; otherwise you could use wget/curl to call it from a shell script and parse it however needed.

Appreciate, though, that if you know zero about coding, this might be difficult to achieve :frowning:

1 Like

I tried doing it with various tools. I used the Postman app on Windows, but when I do a POST to https://varde-sb.renoweb.dk/Legacy/JService.asmx/GetAffaldsplanTakst_mitAffald with:
key: adrid
value: 26275

I get a generic error message returned.

Seems harder than I thought :smiley:

It has to be the actual JSON message in the body/payload of the post - you can’t break it into parameters.
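
In Postman that means choosing a raw JSON body rather than key/value form parameters. For reference, a command_line sensor in the style of the earlier example could send it like this. A sketch, using the example adrid from above; the double decode is needed because the useful payload sits as a JSON string inside the "d" field:

- platform: command_line
  name: 'Renoweb takst'
  # Send the JSON document itself as the request body
  command: >-
    curl -s 'https://varde-sb.renoweb.dk/Legacy/JService.asmx/GetAffaldsplanTakst_mitAffald'
    -H 'Content-Type: application/json' --data '{"adrid":26275}'
  # Decode twice: the response's "d" field is itself a JSON-encoded string
  value_template: "{{ (value_json.d | from_json).list[0].navn }}"
  scan_interval: 86400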

This is what I’ve created for my local council. I’ve put my code on GitHub

They use a system called iShare from Astun Technology.

1 Like

I’ve got the trash pickup dates (Sun and Thurs) and recycling dates (Wed) set up in Google Calendar.
I’m trying to set something like this up in Lovelace but have no idea where to begin (I am new to HA).
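
A minimal starting point might be a Lovelace entities card listing the calendar entities that the Google Calendar integration creates. A sketch; the entity IDs are hypothetical and depend on how your calendars are named:

type: entities
title: Bin collections
entities:
  - calendar.trash_pickup   # hypothetical entity IDs
  - calendar.recycling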

Can anyone give me a hand with my council’s one?

https://mycouncil.northampton.digital/binFinder.html

Looks like it’s getting the data from a Java applet… the postcode to use is NN3 2HB.

Looks like you can post to this URL: https://mycouncil.northampton.digital/BinRoundFinder?postcode=NN3%202HB

Here is the result.

{"result":"success","rounds":"single","day":"Monday","type":"brown","date":"201904010630","url":"07","oldDay":"Monday","oldWeek":"2","newWeek":"2"}
1 Like

Wonderful, I will work with those values to make a sensor :slight_smile: Thanks!
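
A REST sensor built on those values might look like this (a sketch, untested, based on the JSON above):

- platform: rest
  resource: 'https://mycouncil.northampton.digital/BinRoundFinder?postcode=NN3%202HB'
  name: 'Bin round'
  # State is the collection day; keep the bin type and week as attributes
  value_template: '{{ value_json.day }}'
  json_attributes:
    - type
    - date
    - newWeek
  scan_interval: 43200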

1 Like

Thanks for bringing up this idea.

I’ve just created a similar thing for my local council (Wakefield) using simple web scraping in Node-RED. The final values - the next bin collection for waste, recycling, and garden waste - are sent to HA via MQTT. It’s been running for a couple of weeks now and seems to be working well.

I’d like to sort the sensors out in terms of date, but for now I’m pretty happy with this.
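
For the HA side, MQTT sensors along these lines pick the values up. A sketch; the topics are hypothetical and depend on what your Node-RED flow publishes:

- platform: mqtt
  name: 'Waste Collection'
  state_topic: 'home/bins/waste'        # hypothetical topics
- platform: mqtt
  name: 'Recycling Collection'
  state_topic: 'home/bins/recycling'
- platform: mqtt
  name: 'Garden Waste Collection'
  state_topic: 'home/bins/garden'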

1 Like

Nice work. Get a Pushbullet notification on that bad boy! :wink:

I like the idea of a Pushbullet (or similar) notification. That could work well, as the webpage shows “Today” when a specific bin is collected that day.

Cheers.

Hi, I couldn’t get scrape to work so went for another solution: I used a sensor to get the week number. My garden and recycling collections are on alternate weeks - odd weeks for one, even for the other - so it was pretty simple.
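
A sketch of that idea, assuming garden waste on even ISO weeks and recycling on odd weeks (swap to match your own schedule):

- platform: template
  sensors:
    bin_this_week:
      friendly_name: 'Bin this week'
      value_template: >-
        {% if (now().strftime('%V') | int) % 2 == 0 %}
          Garden waste
        {% else %}
          Recycling
        {% endif %}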

Cheers

I have been trying this with the Chiltern District Council site, and have hit a problem with session cookies. If I search at https://isa.chiltern.gov.uk/jointwastecalendar/ for a postcode, it returns a list of addresses and then I can select mine to get the bin data.

If I sniff the requests, I can see that I can get directly to the bin page e.g. https://isa.chiltern.gov.uk/jointwastecalendar/calendar.asp?uprn=100080536405

However, if I try this without visiting the first page and creating a session cookie, I get a 500 internal server error.

Is there any way to create a scrape sensor that fires off two requests, separated by a pause? Or perhaps two scrape sensors that don’t update automatically, but only update manually with an automation to fire them in order once a day?

I’m not sure how you’d do that in HA, but in Node-RED you could create a flow that performs:

first web-scrape -> wait for reply -> delay XX seconds after reply -> perform second web-scrape -> send result to HA (via MQTT in my case)

Thanks. Not currently using Node-RED, but will look into it.

Managed to get this working using my own bodge, with a little help from https://www.experts-exchange.com/questions/26271031/PHP-screen-scraping-from-page-that-requires-you-to-be-loged-in.html.

In short, I have written a PHP script hosted on my own server that uses cURL to fire off the two requests, one after the other. The first request sets the session cookie, and the second collects the data for my address. I then use the PHP Simple HTML DOM parser to extract the information that I need, create an array, and format it as JSON:

<?php
include('simple_html_dom.php');

error_reporting(E_ALL);

// READ THE FIRST PAGE TO SET THE COOKIE
$baseurl = 'https://isa.chiltern.gov.uk/jointwastecalendar/';

// GET THE ACTUAL DATA
$nexturl = 'https://isa.chiltern.gov.uk/jointwastecalendar/calendar.asp?uprn=xxxxx';

// SET UP OUR CURL ENVIRONMENT
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $baseurl);
curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookie.txt');
curl_setopt($ch, CURLOPT_COOKIEJAR,  'cookie.txt');
curl_setopt($ch, CURLOPT_FAILONERROR, TRUE);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);

// CALL THE FIRST PAGE
$htm = curl_exec($ch);
$err = curl_errno($ch);
$inf = curl_getinfo($ch);
if ($htm === FALSE)
{
    echo "\nCURL GET FAIL: $baseurl CURL_ERRNO=$err ";
    var_dump($inf);
    die();
}

// WAIT A RESPECTABLE PERIOD OF TIME
sleep(1);

// NOW ON TO THE NEXT PAGE
curl_setopt($ch, CURLOPT_URL, $nexturl);
curl_setopt($ch, CURLOPT_POST, FALSE);
curl_setopt($ch, CURLOPT_POSTFIELDS, '');

$xyz = curl_exec($ch);
$err = curl_errno($ch);
$inf = curl_getinfo($ch);
if ($xyz === FALSE)
{
    echo "\nCURL 2ND GET FAIL: $posturl CURL_ERRNO=$err ";
    var_dump($inf);
}

// PARSE DATA
$html = str_get_html($xyz);

$bins = [
    'rubbish'   => $html->find('td', 5)->plaintext,
    'recycling' => $html->find('td', 8)->plaintext,
    'paper'     => $html->find('td', 11)->plaintext,
    'food'      => $html->find('td', 14)->plaintext,
    'garden'    => $html->find('td', 17)->plaintext
];

header('Content-Type: application/json');
echo json_encode($bins);

?>

The data is then pulled into Home Assistant using a REST sensor:

- platform: rest
  resource: 'https://mywebhost/scrape.php'
  name: 'Garden Waste'
  value_template: '{{ value_json.garden }}'
  scan_interval: 3600

I am sure I could achieve the same using Node-RED or a Python script in HA itself, but as I am more familiar with PHP, this works for me.

2 Likes

A few changes to make it more user- and server-friendly: I updated the PHP script to calculate the number of days to the next collection:

<?php
include('simple_html_dom.php');

error_reporting(E_ALL);

// READ THE FIRST PAGE TO SET THE COOKIE
$baseurl = 'https://isa.chiltern.gov.uk/jointwastecalendar/';

// GET THE ACTUAL DATA
$nexturl = 'https://isa.chiltern.gov.uk/jointwastecalendar/calendar.asp?uprn=12345678';

// SET UP OUR CURL ENVIRONMENT
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $baseurl);
curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookie.txt');
curl_setopt($ch, CURLOPT_COOKIEJAR,  'cookie.txt');
curl_setopt($ch, CURLOPT_FAILONERROR, TRUE);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);

// CALL THE FIRST PAGE
$htm = curl_exec($ch);
$err = curl_errno($ch);
$inf = curl_getinfo($ch);
if ($htm === FALSE)
{
    echo "\nCURL GET FAIL: $baseurl CURL_ERRNO=$err ";
    var_dump($inf);
    die();
}

// WAIT A RESPECTABLE PERIOD OF TIME
sleep(1);

// NOW ON TO THE NEXT PAGE
curl_setopt($ch, CURLOPT_URL, $nexturl);
curl_setopt($ch, CURLOPT_POST, FALSE);
curl_setopt($ch, CURLOPT_POSTFIELDS, '');

$xyz = curl_exec($ch);
$err = curl_errno($ch);
$inf = curl_getinfo($ch);
if ($xyz === FALSE)
{
    echo "\nCURL 2ND GET FAIL: $posturl CURL_ERRNO=$err ";
    var_dump($inf);
}

// PARSE DATA
$html = str_get_html($xyz);

$bins = [
    'rubbish'   => $html->find('td', 5)->plaintext,
    'recycling' => $html->find('td', 8)->plaintext,
    'paper'     => $html->find('td', 11)->plaintext,
    'food'      => $html->find('td', 14)->plaintext,
    'garden'    => $html->find('td', 17)->plaintext
];

date_default_timezone_set ('Europe/London');
$now = new DateTime();

// Convert each "... dd/mm/yyyy" cell into a relative day count
foreach ($bins as $key => &$field) {
    $parts = explode(' ', $field);   // the date is the last word in the cell
    $field = \DateTime::createFromFormat('d/m/Y', end($parts));
    $field = "In " . $field->diff($now)->format("%a") . " days";
    if ($field == "In 0 days") {
        $field = "Today";
    }
    if ($field == "In 1 days") {
        $field = "Tomorrow";
    }
}
unset($field); // break the reference left over from the loop

header('Content-Type: application/json');
echo json_encode($bins);

?>

I then used template sensors so that all the bin dates can be parsed from a single server call:

- platform: rest
  name: bins
  resource: 'my-server-script.php'
  value_template: 'OK'
  json_attributes:
    - rubbish
    - recycling
    - food
    - garden
    - paper
- platform: template
  sensors:
    rubbish:
      friendly_name: 'General Rubbish'
      icon_template: 'mdi:trash-can'
      value_template: '{{ states.sensor.bins.attributes["rubbish"] }}'
    garden:
      friendly_name: 'Garden Waste'
      icon_template: 'mdi:tree'
      value_template: '{{ states.sensor.bins.attributes["garden"] }}'
    recycling:
      friendly_name: 'Recycling'
      icon_template: 'mdi:recycle'
      value_template: '{{ states.sensor.bins.attributes["recycling"] }}'
    paper:
      friendly_name: 'Paper and Cardboard'
      icon_template: 'mdi:package-variant'
      value_template: '{{ states.sensor.bins.attributes["paper"] }}'
    food:
      friendly_name: 'Food Waste'
      icon_template: 'mdi:food'
      value_template: '{{ states.sensor.bins.attributes["food"] }}'

This results in:

1 Like