You can limit the output of the scrape sensor to the 255-character state limit by using a truncate filter in the value_template for your sensor. It’s not the best option, but at least the filter will put an ellipsis ("…") at the end of the text so it’s obvious that the value has been truncated.
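A rough sketch of what that could look like (the resource URL and selector here are just placeholders, not from your setup):

sensor:
  - platform: scrape
    resource: https://www.example.com/departures   # placeholder URL
    select: ".DepartingIn"                         # placeholder CSS selector
    name: Departing In
    # Jinja's truncate filter keeps the state within 255 characters and
    # appends "..." so a cut-off value is obvious.
    value_template: "{{ value | truncate(255) }}"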
Thanks perrin7 for your reply.
I already looked at your post above and followed your instructions.
I use select: '.DepartingIn'
Error:
Error on device update!
Traceback (most recent call last):
  File "/srv/homeassistant/lib/python3.6/site-packages/homeassistant/helpers/entity_component.py", line 217, in async_add_entity
    yield from entity.async_device_update(warning=False)
  File "/srv/homeassistant/lib/python3.6/site-packages/homeassistant/helpers/entity.py", line 306, in async_device_update
    yield from self.hass.async_add_job(self.update)
  File "/usr/local/lib/python3.6/asyncio/futures.py", line 331, in __iter__
    yield self  # This tells Task to wait for completion.
  File "/usr/local/lib/python3.6/asyncio/tasks.py", line 244, in _wakeup
    future.result()
  File "/usr/local/lib/python3.6/asyncio/futures.py", line 244, in result
    raise self._exception
  File "/usr/local/lib/python3.6/concurrent/futures/thread.py", line 55, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/srv/homeassistant/lib/python3.6/site-packages/homeassistant/components/sensor/scrape.py", line 120, in update
    value = raw_data.select(self._select)[0].text
IndexError: list index out of range
2017-12-29 15:23:07 WARNING (MainThread) [homeassis
I tried to get it working for myself, but I think that website doesn’t work with the simple scraper. It loads its content dynamically and updates every minute, so it might be too complex for the simple scraper built into hass. If you do get it working, please post here; I’d like to use it as well.
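For the record, that IndexError means raw_data.select(self._select) came back empty, i.e. the CSS selector matched nothing in the HTML that Home Assistant actually downloaded. A quick way to check that outside of hass is a few lines of Python against the raw page (the URL below is a placeholder for the site you’re scraping):

import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/departures"  # placeholder: the page you are scraping
# This is the same raw HTML the scrape sensor sees; no JavaScript is executed.
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

matches = soup.select(".DepartingIn")  # the selector from the config above
print(len(matches))                    # 0 here reproduces the IndexError
if matches:
    print(matches[0].text)

If that prints 0, the element is only injected by JavaScript after the page loads, and the plain scrape sensor will never see it.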
2018-02-10 22:48:03 ERROR (SyncWorker_19) [homeassistant.components.sensor.rest] Error fetching data: <PreparedRequest [GET]>
2018-02-10 22:48:03 ERROR (MainThread) [homeassistant.helpers.entity] Update for sensor.melbourne_maximum fails
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/homeassistant/helpers/entity.py", line 201, in async_update_ha_state
    yield from self.async_device_update()
  File "/usr/lib/python3.6/site-packages/homeassistant/helpers/entity.py", line 308, in async_device_update
    yield from self.hass.async_add_job(self.update)
  File "/usr/lib/python3.6/asyncio/futures.py", line 332, in __iter__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.6/asyncio/tasks.py", line 250, in _wakeup
    future.result()
  File "/usr/lib/python3.6/asyncio/futures.py", line 245, in result
    raise self._exception
  File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/lib/python3.6/site-packages/homeassistant/components/sensor/scrape.py", line 115, in update
    raw_data = BeautifulSoup(self.rest.data, 'html.parser')
  File "/usr/lib/python3.6/site-packages/bs4/__init__.py", line 192, in __init__
    elif len(markup) <= 256 and (
TypeError: object of type 'NoneType' has no len()
Odd thing is, it appears as though the sensor is actually working and I am getting the forecasted maximum temp … perhaps it is my OCD, but the constant error in the logs bugs me!
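In case it helps: that TypeError just means self.rest.data was None on that particular update, i.e. the GET in the “Error fetching data” line above failed, so BeautifulSoup had nothing to parse; the sensor simply keeps its last good value, which is why it still appears to work. A rough sketch to check whether the page fails intermittently (the URL is a placeholder for whatever page your scrape sensor fetches):

import time
import requests

url = "https://www.example.com/forecast"  # placeholder: the page the scrape sensor fetches

# Poll a few times and report any request that fails or comes back empty,
# which is what leads to the intermittent NoneType error in hass.
for attempt in range(5):
    try:
        resp = requests.get(url, timeout=10)
        print(attempt, resp.status_code, len(resp.text))
    except requests.RequestException as err:
        print(attempt, "request failed:", err)
    time.sleep(60)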
I use the Dark Sky component for Min/Max Temps as well as cloud cover in Melbourne and find it to be just as accurate as BOM, and much easier to set up than scraping.
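For anyone following along, a minimal Dark Sky setup along those lines might look roughly like this; the API key is a placeholder and the monitored_conditions names are from memory, so check them against the component docs for your HA version:

sensor:
  - platform: darksky
    api_key: YOUR_DARK_SKY_API_KEY   # placeholder
    monitored_conditions:
      - temperature_max              # forecast maximum (name may differ between versions)
      - temperature_min              # forecast minimum
      - cloud_cover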
I might give Dark Sky a go then. I was steering clear of it, as I had heard that it is not that accurate for Melbourne weather, but if you say you are actually using it, and it is OK, then it will cost me nothing to give it a go.
I would still be interested to know why this error is occurring though, as I am bound to use the scrape sensor for other stuff over my home automation journey.
Thanks @icaman004 … looks like Dark Sky has gotten me what I need. Still not sure why I was getting the weird errors with the scrape sensor, but I will concentrate on other things for now and worry about scraping when I need to worry about it.
I’m thinking of writing a 7-day summary that uses mdi icons and the publicly available BoM FTP forecast data. It’s a lot easier to parse than the HTML. For example:
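Something roughly like this, as a sketch only; the product ID is my guess at the Victorian précis forecast file, and the element/attribute names are from memory of the BoM XML schema, so double-check both against the FTP index:

import urllib.request
import xml.etree.ElementTree as ET

# Assumed product ID for the Victorian precis forecast; check
# ftp://ftp.bom.gov.au/anon/gen/fwo/ for the right file for your state.
URL = "ftp://ftp.bom.gov.au/anon/gen/fwo/IDV10753.xml"

with urllib.request.urlopen(URL) as resp:
    root = ET.fromstring(resp.read())

# Walk the forecast periods for one location and pull out the
# maximum temperature and the precis text for each day.
for area in root.iter("area"):
    if area.get("description") != "Melbourne":
        continue
    for period in area.iter("forecast-period"):
        date = period.get("start-time-local", "")[:10]
        temp_max = precis = None
        for child in period:
            if child.get("type") == "air_temperature_maximum":
                temp_max = child.text
            elif child.get("type") == "precis":
                precis = child.text
        print(date, temp_max, precis)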
It’s still on my list of things to do but don’t hold your breath. It’s a bit of a low priority compared to the many other projects I have on the go at the moment. I expect it will be a few months before I get to it.