I need help getting jellyfish-safe Beaches shown in the Frontend

Hi there,

first of all - thank you for taking a look at my problem :wink:

second … i have a basic understanding of coding … as i am slightly older … i grew up with Turbo Pascal and also did some HTML/PHP coding.

third : what i want to do

living on Malta and having long summers, swimming is a pleasure here. I try to get to the Beach each day and swim some rounds. But … there are others swimming too … so there is a Website that shows “Safe Beaches” for Malta & Gozo - https://www.malteseislandsweather.com/

It would be nice to have a card in the Frontend that shows these Beaches, and as an extra i would like to select one Beach (that i frequent often) to be highlighted in this list - one look at the Dashboard and i know it's safe to swim.

fourth: what i did so far

  • i gave the builtin Scraper a try - the 255-character error was the result
  • i installed multiscraper with the same result
  • i discovered “attributes”, but as the Beach-list is dynamic … i could not find a way to tell how many “attributes” should be scraped. Also, having the Beaches in a Dropdown is kind of counterproductive - no “knowing with one look if it's safe or not” - it would be faster to just call up the Website and scroll down.

– at this point i was very frustrated - i couldn't understand why this is so hard. I remember my early days of Website-parsing with PHP and did some quick research:

  • Cargo & htmlq installation on the underlying Debian 11 … parsed the website and put all Beaches in a text file, each Beach on one line … no HTML … just the names. Copied the file into the HA config folder - cron job runs every 2 hours

  • it took me 10 minutes to do that and i was very happy … then i just googled “Textfile shown in HA” - only to be pointed to this place :smiley:
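For reference, the parsing step can be sketched roughly like this - i used htmlq, but the same idea is shown here with plain grep/sed so it runs on a bare Debian without Cargo. The HTML snippet is invented; the real tags on malteseislandsweather.com have to be looked up first:

```shell
#!/bin/sh
# Sketch of the beach-extraction step with plain grep/sed instead of
# htmlq. The HTML below is an ASSUMPTION standing in for the real page;
# in practice you would fetch it first, e.g.:
#   curl -s https://www.malteseislandsweather.com/ > /tmp/beaches.html
cat > /tmp/beaches.html <<'EOF'
<ul class="safe-beaches">
  <li>Sliema</li>
  <li>Golden Bay</li>
  <li>Mellieha Bay</li>
</ul>
EOF

# Keep only the <li> items and strip the tags: one beach name per line.
grep -o '<li>[^<]*</li>' /tmp/beaches.html | sed 's/<[^>]*>//g' > /tmp/Beach.txt
cat /tmp/Beach.txt
```

A crontab line like `0 */2 * * * /path/to/beaches.sh` then refreshes the file every 2 hours.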

i could not display the content of a text file on the Frontend.

i am sure that in the end it should be possible to show the Beaches on one card, but i am afraid that my coding skills are too old and too low … anybody here who can help me get this done?

any help is very appreciated

have a nice day


I have not done scraping, but I have a bunch of command line sensors that are looking at syslog etc., so I can tell you this (and it may be of some help): the command line sensors (as well as, most likely, the sensors you are using) can only handle up to 255 characters, so you have to make sure the value is truncated. Note the example command line sensor below - I am using “| cut -c-166” at the end of the line to truncate the result to the 166 characters returned to HA, as I don’t need the rest - but you could use something like “| cut -c-255” in your code to resolve the first problem.

  - sensor:
      # Beginning 15 characters has when transmitted, in this format: Jul 11 20:27:53
      # Character 166 (the last one) is a comma after the actual time of the measurement 
      #  and shows in this format: 1657585620000, (which translates to 20:27:00 - or every minute they are sending)
      command: "grep 'baromrelin' /share/syslog | tail -1 | cut -c-166"
      scan_interval: 30
      command_timeout: 5
      value_template: >
        {% set time = strptime(now().year ~ ' ' ~ (value | regex_findall('(.+)kruse-pi') | first).strip(), '%Y %b %d %H:%M:%S') | as_local %}
        {% set str = time.strftime('%a %-m/%-d %-I:%M:%S%p') ~ ' (1 min)' %}
        {% set indx = str.find("M (") %}
        {% set oldstr = str[indx-1:indx+1] %}
        {{ str.replace(oldstr,oldstr.lower()) }}

The above sensor simply takes an entry in the syslog that shows when a certain kind of data is sent, strips out the time the data was sent, and shows this (in the red rectangle) in my dashboard:
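To see the truncation in isolation, here is a standalone sketch - nothing HA-specific, just the cut idiom from the sensor above:

```shell
# A 400-character line, cut down so at most 255 characters would
# reach Home Assistant's state machine.
long_line=$(printf 'x%.0s' $(seq 1 400))    # 400 x's in a row
printf '%s\n' "$long_line" | cut -c-255 | wc -c
# wc -c counts the trailing newline too: 256 (255 characters + newline)
```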

For the second issue (being unable to determine what information is sent back), I would suggest you look through your logs (or just browse the web site manually) and try to get a copy of the entire string that is typically returned. Then, in your scraping, you could control the size of each piece of data before it is returned to HA. Also, to get more details within your browser (Chrome, for example), open the three-dot menu, then select “More tools >” and “Developer Tools”. That shows you the “guts” (for lack of a better term) of the code behind the web page, which may be useful: if you can find the part of the data that is returned, you can work out what to strip out of it. (I am by no means an expert in this area, but I am very interested as well, as I plan on using some scraping sensors in the future…)

Hope that helps!


Greetings …

it helped a lot … thank you … at the moment i struggle with the command line …
i am using a Debian Lite 64-bit as base … on that i have CasaOS for container management, and there i have my HA Container. As far as i understand … everything uses that Debian Lite as its base OS!?

When i ssh into the Raspberry Pi … i can log in and execute all commands:

grep/head/cut etc. … i get my file cut at the position i choose …

but when i put

  - sensor:
      command: "sudo grep -w Sliema Beach.txt"
      name: SafeBeaches

in my configuration.yaml

the HA Log has this entry :

Logger: homeassistant.components.command_line.utils
Source: components/command_line/utils.py:54
Integration: Command Line (documentation, issues)
First occurred: August 15, 2023 at 01:34:57 (4441 occurrences)
Last logged: 14:35:02
Command failed (with return code 127): sudo grep -w Sliema Beach.txt

127 is command not found.
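(To reproduce that in any shell - 127 is what the shell returns when it cannot find the executable at all, which is exactly what happens if `sudo` is not present inside the container:)

```shell
# A command that does not exist anywhere on PATH fails with 127
# before anything else gets a chance to run.
sh -c 'no-such-command-xyz' 2>/dev/null || echo "exit code: $?"
# prints: exit code: 127
```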

Exactly the same command runs like a charm in the terminal.

is my assumption wrong that all apps (CasaOS / HA Container) use the Debian Lite as their OS - is my HA Container missing all the basic UNIX commands?
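For what it's worth, here is a sketch of how the sensor might look once the file is reachable from inside the container - this assumes Beach.txt is copied into the mapped /config folder and that plain `grep` exists in the container image (while `sudo` usually does not). A guess based on the 127, not a verified fix:

```yaml
command_line:
  - sensor:
      # /config is the folder mapped into the HA container.
      # No sudo - the container runs the command as its own user.
      command: "grep -w Sliema /config/Beach.txt"
      name: SafeBeaches
```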

thx :wink: