Snapshots from a Nest camera so I can do object detection / camera_view has to be set to live to see on dashboard

Hey Everyone,

This one is puzzling me a bit.
I got a new Nest outdoor security camera and set up the integration with Home Assistant. My plan was to use this with something like Deepstack to do object detection, but when I try to send a snapshot (using the snapshot service) to my Deepstack API, I just get a black picture.
After more research I discovered in the Nest Home Assistant docs (Nest - Home Assistant) that my camera (outdoor battery) doesn’t support snapshots.
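
For context, the snapshot call I was trying is roughly this (the entity name and file path are just placeholders for my setup):

service: camera.snapshot
target:
  entity_id: camera.front_yard              # placeholder entity name for the Nest cam
data:
  filename: /config/www/nest/snapshot.jpg   # file I then hand to Deepstack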

However, surely there is still some way to capture my “live image”.
If I add a picture card to my dashboard and set the camera_view to live, then I can see the live stream on my dashboard. However, if I leave it on auto, it’s just the black picture. I was wondering if there is some flag or option I can use to snapshot the live view rather than whatever auto is showing, if that makes sense.

This is what I see when I leave camera_view as auto

But when I switch to camera_view live then I get the correct image
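
For reference, the card configuration that gives me the correct image looks roughly like this (the entity name is a placeholder):

type: picture-entity
entity: camera.front_yard   # placeholder entity name
camera_view: live           # live opens the WebRTC stream; auto falls back to the (black) still image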

Reading this, it looks like a live stream has to be activated before it will send anything, which makes sense, and the Nest integration with HA does seem to handle that fine. The problem I see at the moment is that there doesn’t appear to be any way to do this step before it takes a snapshot of the feed.

It’s just annoying that I can still see it fine if I have the correct option set in the picture card. It would be awesome if I could set the same option in the snapshot feature.

I was just trying to get this working myself, thanks for the details.

I think this is where we need to move from the nest integration supporting WebRTC to Home Assistant itself supporting that stream type.

Hi, I’m the nest integration author who added webrtc support.

The docs say pretty clearly what is supported today based on what I’ve been able to add, as you have read. I would not suggest using auto as it reopens a WebRTC stream every few seconds.

WebRTC works totally differently from other stream types; it is currently client side and doesn’t work in the core. I continue to look for ways to improve Home Assistant support for Nest, I just don’t have any promises yet.

Like… I’d make it work with snapshot if it were easy, right? Sorry this is “annoying” for you, but perhaps another way to look at this is to be super excited that WebRTC cameras work in Home Assistant at all :) as I am. (I’m confused, though: the docs say it’s not supported, so I’m not sure why you’re asking how to support it?)

Presumably you’ve seen the docs about media player support and snapshots for events? Big improvements there in the next release, though I haven’t fully documented how to call the media APIs to get snapshots yet.

So, not sure if you were replying to me or to the author, but for what it’s worth I meant no disrespect by any of my comments. The work you’ve done on the nest integration has been top notch, and it’s totally not your fault Google hasn’t bothered to implement some of these features correctly. You are doing the best you can with the tools you are being given.

My comment is more about the fact that, for instance, I can call /api/camera_proxy/camera.front_door for an image, but all it hands me is a black square (because it’s not supported). That’s a really inelegant way to fail. It should throw an error like

WebRTC format not supported

when you call that API, or some such, not silently fail and do something confusing. That’s a problem (I assume) in the way HA Core handles camera feeds. Since it doesn’t KNOW what WebRTC is, it can’t handle the error correctly or, even better, initialize the feed and grab an image.

My major frustration is that none of Google’s roadmaps are documented anywhere we can see, so we buy these products and basically just have to hope they do a reasonable job of implementing the API. What’s even worse is that there is obviously more of the API they could expose and they just choose not to, like with the floodlights. There is just zero ability to control the lights from anything other than their platform.

I’m replying to the OP.

The Nest API support for WebRTC is fine, in my opinion. We decided not to implement native webrtc support in home assistant at the current time, given nest is the only user of native webrtc cameras at the moment.

Edit: If you’re making purchasing decisions on the new cameras based on the APIs, I think they pretty clearly articulate what is supported and what is not at the time of purchase. Support for snapshots can be implemented just fine using the WebRTC APIs they have provided. I’m exploring doing this some day down the road with an add-on; it just needs to be implemented by someone excited about working on it. No need to be frustrated, someone just has to write the code.

So, for what it’s worth, Nest isn’t the only one doing WebRTC; they are just the biggest.

https://antmedia.io/

Just to name 2 other platforms that are implementing it as the primary way to share camera feeds…not to mention the possibility that users could DIY their own security cameras using webcams with a half proper implementation. It just feels like this is something that would be beneficial to the idea of an open, local smart home. More choices, yeah?

As far as the communication from Google goes, I agree that they do a good job of explaining what is supported currently. My gripe is the lack of communication about what they are working towards supporting in the future. For example, I have 2 Nest floodlights. I bought them understanding I would have limited support with the API as it exists today. What I didn’t know was that I couldn’t control WHICH events turn the floodlight on, only ALL events or NO events. If Google were to add support for turning the floodlights on and off, I could have other sensors or even other cameras handle that instead of relying on Google to implement this feature in a more useful way… but I have no way to know whether that will happen. That means I either get to wait however long it takes them to implement it, if ever, or jump platforms and make compromises in some other way.

If there are other camera platforms integrating with Home Assistant with WebRTC, I’d be happy to help the integration authors make them work with WebRTC. To be clear, I’m not trying to say WebRTC isn’t important, just that there aren’t any other integrations using it yet.

Hi, check out Nest - Home Assistant, which describes the new media source APIs to get media for events if supported by the camera.

Saw that, but it seems to be broken for me for some reason? It gets ~2/3 of the available ‘people’ videos, but then skips the others. It’s puzzling, I haven’t figured out the pattern yet.

So, has 2022.3 made a first step towards getting these thumbnails working? Since it now uses the live feed to generate thumbnails? Or did I read the related pull request wrong?

What I really would like is to have a service I could call to request a snapshot and have it become the active snapshot for the camera entity. Then I could display the camera entity in a Lovelace dashboard and the just-captured thumbnail would show. Is this possible?

I tried using the camera.snapshot service but it results in that black image with the gradient that the OP posted. I’m using a Nest Doorbell Battery.
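
Something like the following is the pattern I had in mind, i.e. a local_file camera pointing at the saved snapshot, which a picture card could then display. It’s only a sketch, the names and paths are made up, and it still depends on camera.snapshot actually producing an image (which it doesn’t for this camera):

# configuration.yaml: a camera entity that always shows the most recently saved snapshot
camera:
  - platform: local_file
    name: nest_last_snapshot                       # made-up name
    file_path: /config/www/nest/last_snapshot.jpg  # file camera.snapshot would write to

# Lovelace card displaying that entity
type: picture-entity
entity: camera.nest_last_snapshot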

Looking for this exact thing for a Nest doorbell camera. I’d love to get a snapshot, as a live feed won’t work well for the battery even when plugged in.

You can see what the API supports here: Nest - Home Assistant, or click through to the link on the Nest site. (The Nest Doorbell Battery does not have snapshot support.)

Is it possible to get the picture glance card to show the clip/preview instead? I tried to point the Image Location at the folder where the clips are stored but it didn’t work.

Google Nest - Home Assistant has a section that describes the WebRTC cameras and recommendations on how to use the Picture Glance card if you want to preview the stream:

WebRTC: These devices support direct browser to camera communication and a super low latency stream. A Picture Glance Card can show the live stream in the grid with the Camera View set to live (not recommended for battery-powered cameras). Camera services like stream recording are not supported.

Does that address what you need or are you specifically trying to watch a clip instead of the actual live feed?

As for showing the last event, it likely needs a custom card, but it is definitely possible (it’s able to render in the media browser, for example).

Thank you for the response! I don’t think showing the live feed will work for me as it’s the nest doorbell battery. I have it wired but IDK if that’s enough to keep the battery going. Let me look into some custom cards to see if I can pull the latest clip in a folder and display it till a new clip is created. That’s what I’m looking for since snapshot is not available. I’d also be okay with just displaying the last person thumbnail.

# Service call (e.g. from an automation): record a 20-second clip to a local file
service: camera.record
target:
  entity_id: camera.living_room
data:
  filename: /config/www/nest/living_room/living_person_detected.mp4
  duration: 20

# Lovelace picture-glance card: shows the saved snapshot and opens the recorded clip on tap
type: picture-glance
title: Kitchen
entities: []
image: http://HASSipADDRESS/local/nest/living_room/snapshot.jpg
camera_view: auto   # only takes effect when a camera_image is also set
tap_action:
  action: url
  url_path: http://HASSipADDRESS/local/nest/living_room/living_person_detected.mp4

Did you manage to get it to work?

No, I ditched the doorbell, and am ditching nest entirely in the next year. All of my use cases are basically not supported by their API.

I feel like Nest devs need a magic kick to get the bare minimum we need:

  • clip previews (CameraClipPreview) for most of the cams (at least when wired to power)
  • Nest Cam Doorbell (wired) SDM API support. It’s been 3-4 months and they still don’t have it.