I’m puzzled by the results I’m getting from as_timestamp when it is fed datetimes spanning tonight’s change to BST. The docs say that it returns the number of seconds since 1/1/1970 UTC, so I had thought I could display that as local time with timestamp_custom and local=True. From experimenting, as_timestamp gives a different result if it is fed a TZ-aware datetime string (i.e. a naive datetime string is assumed to be in local time by default). I have a REST sensor that returns non-TZ-aware datetimes, so I’m looking for an elegant way of getting the conversion right. This is what I’ve got:
Some of the raw input from REST:
[{'EventType': 'LowWater', 'DateTime': '2025-03-30T00:57:00', 'IsApproximateTime': False, 'Height': 0.3726451733974183, 'IsApproximateHeight': False, 'Filtered': False, 'Date': '2025-03-30T00:00:00'}, {'EventType': 'HighWater', 'DateTime': '2025-03-30T06:19:00', 'IsApproximateTime': False, 'Height': 8.551698547503939, 'IsApproximateHeight': False, 'Filtered': False, 'Date': '2025-03-30T00:00:00'}]
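(For context, events in the templates below is just that list pulled off the sensor, along the lines of the following; the sensor and attribute names here are made up for illustration:)
{% set events = state_attr('sensor.tide_times', 'events') %}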
Looking at one just after the change to BST:
{{events[n].DateTime|as_timestamp}} 1743311940.0
{{(events[n].DateTime~'Z')|as_timestamp}} 1743315540.0
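To make the difference concrete, formatting both results back to local time with timestamp_custom shows the hour’s offset (the times in the comments are what I work out by hand for Europe/London):
{{ 1743311940 | timestamp_custom('%H:%M', local=True) }}  {# 06:19 – the naive string taken as local time, echoed straight back #}
{{ 1743315540 | timestamp_custom('%H:%M', local=True) }}  {# 07:19 – the same string taken as UTC, shifted to BST #}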
I had been processing the raw data to get a list of timestamps like this:
{{ events|map(attribute='DateTime')|map('as_timestamp',0)|map('int')|list }}
Which is clearly giving the “wrong” results, in that if I try to display them using the local=True option in timestamp_custom, I get the raw time rather than the BST time.
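Something like this is how I’ve been checking them (a rough sketch rather than my exact dashboard template):
{{ events | map(attribute='DateTime') | map('as_timestamp', 0) | map('timestamp_custom', '%H:%M', True) | list }}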
I’ve got this workaround, but it strikes me as inelegant; surely there is a filter or a map that will avoid having to loop through the entire list?
{% set ns = namespace(out=[]) %}
{%- for tide in events %}
{%- set ns.out = ns.out + [(tide.DateTime~'Z')|as_timestamp] %}
{%- endfor %}
{{ns.out}}
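The kind of one-liner I was hoping for is something along these lines, but I don’t know whether regex_replace (or some other string filter) can really be mapped over the list like this to tack the ‘Z’ on, so treat it as an untested sketch:
{{ events | map(attribute='DateTime') | map('regex_replace', '$', 'Z') | map('as_timestamp', 0) | map('int') | list }}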
As ever, I’m grateful to the wonderful community for any guidance.