Python script to add a scraped value from an HTML page

I have started to dip my toes into very simple Python code. I have managed to write a script that can find the sea temperature on a website and print the result as a string. How do I make a script that displays this value as an entity in HA?

At this stage I want to keep it as simple as possible, just getting the value once or twice a day and displaying it in HA.

Thank you

Didn’t the scrape sensor work? Or what is the reason for building it yourself?
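For context, the scrape sensor needs no Python at all, just configuration. A minimal sketch for this page, assuming the same `td.temp` cell the script below targets (the selector and update interval are guesses, check them against the actual page):

```yaml
sensor:
  - platform: scrape
    resource: http://worldseatemp.com/en/Spain/Almunecar
    name: Sea temperature Almunecar
    # CSS selector; assumed from the <td class="temp"> cells in the script below
    select: "td.temp"
    unit_of_measurement: "°C"
    # Roughly twice a day, in seconds
    scan_interval: 43200
```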

Oh, I wasn’t aware of the scrape sensor. But it is also for learning purposes.

Let’s say I would like to write a Python script and place it in /config/python_scripts (which I did). How do I call it from hass and pass the result to an entity, e.g. sensor.seatemp?
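A script in /config/python_scripts shows up as a service named after the file, e.g. `python_script.sea_temp` for `sea_temp.py`, so an automation can call it on a schedule. A sketch (the alias and trigger times are just examples):

```yaml
automation:
  - alias: "Update sea temperature"
    trigger:
      # Run twice a day
      - platform: time
        at: "08:00:00"
      - platform: time
        at: "20:00:00"
    action:
      - service: python_script.sea_temp
```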

My script:

import urllib.request
from bs4 import BeautifulSoup

content = urllib.request.urlopen('http://worldseatemp.com/en/Spain/Almunecar')
html = content.read()
soup = BeautifulSoup(html, 'html.parser')

# Today's temperature is the first "temp" cell, yesterday's the third
temp = soup.find_all("td", {"class": "temp"})
today = temp[0].string
yesterday = temp[2].string

icon = "mdi:coolant-temperature"

# Set the sensor state: hass.states.set(entity_id, state, attributes)
hass.states.set('sensor.seatemp', today, {
    'friendly_name': 'Vattentemp',
    'icon': icon,
})

I am stumbling already on line 1, the import, with the following error:

  File "/usr/src/homeassistant/homeassistant/components/python_script/__init__.py", line 205, in execute
    exec(compiled.code, restricted_globals)
  File "sea_temp.py", line 1, in <module>
ImportError: __import__ not found

I would love to be able to accomplish this or something similar, both for the actual result and for the learning experience. I have read through the Home Assistant documentation and looked at the Python section of the forum, but have not yet found anything that helps me.
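The ImportError above is expected: the python_script integration runs scripts in a restricted sandbox that blocks all `import` statements (only the injected `hass`, `data`, and `logger` objects are available). A script that needs urllib and BeautifulSoup has to run outside that sandbox, e.g. as a standalone script invoked by a command_line sensor. A sketch, with the parsing split into a testable function; the HTML structure is assumed from the original script:

```python
import urllib.request
from bs4 import BeautifulSoup

URL = 'http://worldseatemp.com/en/Spain/Almunecar'


def extract_temps(html):
    """Return (today, yesterday) from the <td class="temp"> cells,
    assuming today's value is the first cell and yesterday's the third."""
    soup = BeautifulSoup(html, 'html.parser')
    cells = soup.find_all('td', {'class': 'temp'})
    return cells[0].string, cells[2].string


def main():
    html = urllib.request.urlopen(URL).read()
    today, _yesterday = extract_temps(html)
    # A command_line sensor uses whatever the script prints as the state
    print(today)
```

In a real script you would call `main()` under an `if __name__ == '__main__':` guard and point a command_line sensor's `command:` at it, with `scan_interval` set to twice a day.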