REST sensor - getting JSON data from an endless stream (JSON output that keeps updating)

Hey, I need some help with a sensor. For this setup I am creating a REST sensor, but it's not like a normal JSON call that just gives one response when you request it; in this case, when I request the URL, output is constantly being created.

For example, if I open the URL in Chrome, it downloads a file that keeps filling up with data, in my case new events…

How can I get those events as attributes? Of course, each new event should overwrite the current one.

This is the command:

curl -i --digest -u admin:xxx http://192.168.0.70/ISAPI/Event/notification/alertStream

I created a REST sensor like this. The scan interval is wrong; actually it doesn't need a scan interval at all, it only needs to be called once, so maybe I do need to set the scan interval very high.

  - platform: rest
    name: Hikvision Doorstatus
    authentication: digest
    username: !secret hikvision_username
    password: !secret hikvision_password
    scan_interval: 10000
    resource: !secret hikvision_stream_sensor
    value_template: Hikvision Doorstatus   
    json_attributes:
      - dateTime
      - eventType
      - AccessControllerEvent

The sensor is created, but it has no attributes.
I do see this in the log file, though:
2021-07-28 21:29:28 WARNING (MainThread) [homeassistant.components.sensor] Platform rest not ready yet; Retrying in background in 30 seconds

This is an example of what I get when I run the command in PuTTY or Chrome; you can see it's a file being built up with new data every xx seconds…


Content-Length: 314

--MIME_boundary
Content-Type: application/json; charset="UTF-8"
Content-Length: 223

{
	"ipAddress":	"192.168.0.70",
	"portNo":	80,
	"protocol":	"HTTP",
	"dateTime":	"Wed, 28 Jul 2021 21:35:34 GMT",
	"activePostCount":	1,
	"eventType":	"heartBeat",
	"eventState":	"active",
	"eventDescription":	"heartBeat"
}
--MIME_boundary
Content-Type: application/json; charset="UTF-8"
Content-Length: 223

{
	"ipAddress":	"192.168.0.70",
	"portNo":	80,
	"protocol":	"HTTP",
	"dateTime":	"Wed, 28 Jul 2021 21:35:45 GMT",
	"activePostCount":	2,
	"eventType":	"heartBeat",
	"eventState":	"active",
	"eventDescription":	"heartBeat"
}
--MIME_boundary
Content-Type: application/json; charset="UTF-8"
Content-Length: 223

{
	"ipAddress":	"192.168.0.70",
	"portNo":	80,
	"protocol":	"HTTP",
	"dateTime":	"Wed, 28 Jul 2021 21:35:56 GMT",
	"activePostCount":	3,
	"eventType":	"heartBeat",
	"eventState":	"active",
	"eventDescription":	"heartBeat"
}
--MIME_boundary
Content-Type: application/json; charset="UTF-8"
Content-Length: 223

{
	"ipAddress":	"192.168.0.70",
	"portNo":	80,
	"protocol":	"HTTP",
	"dateTime":	"Wed, 28 Jul 2021 21:36:07 GMT",
	"activePostCount":	4,
	"eventType":	"heartBeat",
	"eventState":	"active",
	"eventDescription":	"heartBeat"
}
--MIME_boundary
Content-Type: application/json; charset="UTF-8"
Content-Length: 223

{
	"ipAddress":	"192.168.0.70",
	"portNo":	80,
	"protocol":	"HTTP",
	"dateTime":	"Wed, 28 Jul 2021 21:36:18 GMT",
	"activePostCount":	5,
	"eventType":	"heartBeat",
	"eventState":	"active",
	"eventDescription":	"heartBeat"
}

It is obviously not suitable for a REST sensor.

Yeah, that's what I was already thinking :slight_smile:
What can I use instead?

I was trying to create a shell command on HA startup that runs the curl and saves the output to a file on disk,
and then use the File sensor to read that file. But it seems curl, with output to a file, doesn't save instantly… it's not fast enough; it only seems to write about every minute or so.
Obviously, for an events sensor, that's too late :slight_smile:

curl -i --digest -u admin:xxx http://192.168.0.70/ISAPI/Event/notification/alertStream -o /config/hikvision_stream.txt

I found this Record data stream from curl to text file - Stack Overflow which suggests that curl is not much good for streams. There are two valid suggestions: writing a script (a Node one is provided) or using the -N option to curl.
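If you go the script route, something like this could be a starting point instead of the Node one. This is only a rough, untested sketch: it assumes the Python requests library and the --MIME_boundary layout shown in your output above, and the URL/credentials are just the ones from your curl command.

import json
import requests
from requests.auth import HTTPDigestAuth

URL = "http://192.168.0.70/ISAPI/Event/notification/alertStream"

def read_events(user="admin", password="xxx"):
    # stream=True keeps the connection open and hands us data as it arrives,
    # instead of waiting for a response that will never finish.
    with requests.get(URL, auth=HTTPDigestAuth(user, password),
                      stream=True, timeout=(10, None)) as resp:
        resp.raise_for_status()
        buffer = b""
        for chunk in resp.iter_content(chunk_size=1024):
            buffer += chunk
            # Every part is delimited by the MIME boundary; once the next
            # boundary shows up, the part before it is complete.
            while b"--MIME_boundary" in buffer:
                part, _, buffer = buffer.partition(b"--MIME_boundary")
                start, end = part.find(b"{"), part.rfind(b"}")
                if start != -1 and end != -1:
                    try:
                        yield json.loads(part[start:end + 1])
                    except ValueError:
                        pass  # incomplete or non-JSON part, skip it

if __name__ == "__main__":
    for event in read_events():
        print(event.get("dateTime"), event.get("eventType"))

One caveat with splitting on the boundary like this: a part is only parsed once the next boundary arrives, so in the worst case an event only shows up when the following heartbeat (roughly every 10 seconds in your output) comes in.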

That's indeed interesting, thanks for sharing!! I'll have a look into it.

Keep in mind that the File sensor only reads the last line of a file at each update.
So it will only work if, somehow, you manage to write a single line per JSON “event”.

Yeah, a File sensor is not the way to go.
Going to test that Node script; I don't think I can use curl here then, even with the -N option.
It's too slow that way: first I need to curl => save to file => read the file… that's not a good approach.

When you say curl takes too long: it is never going to finish if you feed it an endless stream.

I think your whole approach is fundamentally flawed; curl is the wrong tool, IMHO.

Yes, going to test that Node script, hopefully it works.
Or maybe something in Python, that must be possible.
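For the Python route, a rough follow-up to the sketch above could be to push each parsed event straight into Home Assistant over its REST API, so it shows up as a sensor with exactly the attributes listed earlier. Again just an untested sketch: the URL, the long-lived access token and the entity name are placeholders you would have to fill in yourself.

import requests

HA_URL = "http://127.0.0.1:8123"            # assumption: HA reachable locally
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"   # created under your HA user profile
ENTITY_ID = "sensor.hikvision_doorstatus"   # placeholder entity name

def push_event(event):
    # Each POST overwrites the previous state and attributes,
    # so the newest event always replaces the current one.
    requests.post(
        f"{HA_URL}/api/states/{ENTITY_ID}",
        headers={"Authorization": f"Bearer {HA_TOKEN}"},
        json={
            "state": event.get("eventType", "unknown"),
            "attributes": {
                "dateTime": event.get("dateTime"),
                "eventType": event.get("eventType"),
                "AccessControllerEvent": event.get("AccessControllerEvent"),
            },
        },
        timeout=10,
    )

# combined with the read_events() generator from the earlier sketch:
# for event in read_events():
#     push_event(event)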