At the moment, restarting HA is a common thing for me.
On every restart, the last_changed timestamp gets updated.
Inspired by the topic Last changed/Last updated, I made a script to recover the real last_changed timestamp.
I need this for:
- monitoring my old heating control with a sensor: if a temperature doesn't change for a few minutes, the control has crashed and I need a notification.
- monitoring how long the PlayStation is on, and sending my son a message when his time is over.
- …
When I'm fiddling with Home Assistant, it could get cold at home and my son could play forever.
This only works with the recorder on MySQL, but maybe someone can adapt it for SQLite or another database.
May I ask what the script reports, for which entity, and how?
I am trying to use your script, since I have MySQL, to update my location: I have two device trackers and need the most reliable one to update my location and show it in HA as a single entity.
As you can see in the command_line sensor above, the script is called with one entity_id.
It then fetches the most recent records for that entity and compares the states.
I have implemented this script using the built-in SQLite database (although I don't modify the format or check for UTC):
# todo: error handling, think about utc
import sys
from sqlalchemy import create_engine
import pandas as pd
# define your home assistant db location below:
db = "/home/amadeus/.homeassistant/home-assistant_v2.db"
# specify entity_id as first argument to this script
# entity_id = "sensor.attic_office_temperature"
entity_id = sys.argv[1]
# connect using SQLAlchemy
engine = create_engine(f"sqlite:///{db}")
conn = engine.connect()
# how many restart events can we expect?
num_restarts = 100
# build the query
query = f"""
SELECT
    state,
    last_changed
FROM states
WHERE
    domain = 'sensor' AND
    entity_id = '{entity_id}' AND
    state != 'unknown'
ORDER BY last_updated DESC
LIMIT {num_restarts}
"""
# execute the query and load into a pandas dataframe:
df = pd.read_sql_query(query, conn)
conn.close()
# compare each state to the next-newer row (shift + backfill),
# find the first row whose state differs,
# and take its last_changed value
real_last_changed = pd.to_datetime(df.last_changed[df.state.ne(df.state.shift().bfill())].iloc[0])
print(real_last_changed)
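To make the shift/bfill comparison easier to follow, here is a minimal sketch on hand-made data (the states and timestamps are invented for illustration):

```python
import pandas as pd

# Toy query result, newest first, as the ORDER BY ... DESC returns it.
df = pd.DataFrame({
    "state": ["on", "on", "off"],
    "last_changed": [
        "2023-05-01 10:00:00",  # restart re-write, state unchanged
        "2023-05-01 08:00:00",  # state became "on" here
        "2023-05-01 07:30:00",  # previous "off" state
    ],
})

# Flag rows whose state differs from the next-newer row; restart
# re-writes of the current state are never flagged.
changed = df.state.ne(df.state.shift().bfill())
print(changed.tolist())                  # [False, False, True]
print(df.last_changed[changed].iloc[0])  # 2023-05-01 07:30:00
```

The restart rows at the top repeat the current state, so the mask skips them; the first flagged row marks the boundary to the previous state.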
For my use case (an automation to switch off the heater if the sensor goes stale), I may just create a sensor whose value is the number of hours since the last change (accounting for restarts) to make the comparison easier.
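That staleness comparison can be sketched like this (timestamps invented; in practice real_last_changed would come from the script above and now from pd.Timestamp.now()):

```python
import pandas as pd

# Hypothetical values for illustration.
real_last_changed = pd.Timestamp("2023-05-01 08:00:00")
now = pd.Timestamp("2023-05-01 12:30:00")

# Hours since the last genuine change; compare against a threshold.
hours_stale = (now - real_last_changed).total_seconds() / 3600
print(hours_stale)  # 4.5
if hours_stale > 2:
    print("sensor is stale, switch off the heater")
```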
Hi! I tried to do it, but I get an error. This is my configuration:
entity_id: switch.irrigation_pump
My db file is at config/home-assistant_v2.db.
I created a scripts folder inside /config and created a file named real_last_change.py.
real_last_change.py:
# todo: error handling, think about utc
import sys
from sqlalchemy import create_engine
import pandas as pd
# define your home assistant db location below:
db = "/config/home-assistant_v2.db"
# specify entity_id as first argument to this script
entity_id = "switch.irrigation_pump"
# entity_id = sys.argv[1]
# connect using SQLAlchemy
engine = create_engine(f"sqlite:///{db}")
conn = engine.connect()
# how many restart events can we expect?
num_restarts = 100
# build the query
query = f"""
SELECT
    state,
    last_changed
FROM states
WHERE
    domain = 'switch' AND
    entity_id = '{entity_id}' AND
    state != 'unknown'
ORDER BY last_updated DESC
LIMIT {num_restarts}
"""
# execute the query and load into a pandas dataframe:
df = pd.read_sql_query(query, conn)
conn.close()
# compare each state to the next-newer row (shift + backfill),
# find the first row whose state differs,
# and take its last_changed value
real_last_changed = pd.to_datetime(df.last_changed[df.state.ne(df.state.shift().bfill())].iloc[0])
print(real_last_changed)
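One possible cause of the error: if every fetched row has the same state (plausible for a switch that stayed on), or the query returns nothing, the boolean mask selects no rows and .iloc[0] raises an IndexError. A guarded sketch, using invented data that reproduces that case:

```python
import pandas as pd

# Simulated query result where the state never changes within the
# fetched rows -- the unguarded one-liner raises IndexError here.
df = pd.DataFrame({
    "state": ["on", "on", "on"],
    "last_changed": [
        "2023-05-01 10:00:00",
        "2023-05-01 09:00:00",
        "2023-05-01 08:00:00",
    ],
})

changed = df.state.ne(df.state.shift().bfill())
matches = df.last_changed[changed]
if matches.empty:
    # no change within the fetched rows: fall back to the oldest record
    real_last_changed = pd.to_datetime(df.last_changed.iloc[-1])
else:
    real_last_changed = pd.to_datetime(matches.iloc[0])
print(real_last_changed)  # 2023-05-01 08:00:00
```

Raising num_restarts (or logging when the fallback triggers) would also help diagnose whether the query simply isn't finding enough history.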