Support for Casting cameras through Google Assistant

Google Assistant has special support for the camera device type, allowing users to cast camera feeds to Cast-enabled devices. Ideally, HASS would expose each camera's stream in one of the supported streaming protocols; that, however, might require some extra work.

As a minimum, we could add support for casting still images using the camera_proxy API. The challenge, however, is that Chromecast discovers the media type from the file extension, so the current endpoint won't work.


My suggestion is to change the API (or add to it) the ability to pass an extension (.jpg) here. Adding support for casting a picture from a camera would then be trivial (I already have a working POC).
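As a rough sketch of what I mean (the helper and URL pattern below are hypothetical, not actual HASS code): the endpoint could accept an optional .jpg suffix and strip it before resolving the entity, so Chromecast gets an extension to sniff while HASS still sees the plain entity_id.

```python
import re

# Hypothetical sketch: accept an optional ".jpg" suffix on a
# /api/camera_proxy/<entity_id> style path. Non-greedy matching ensures
# the suffix is stripped from the entity_id when present.
PROXY_URL = re.compile(r"^/api/camera_proxy/(?P<entity_id>[\w.]+?)(?:\.jpg)?$")

def parse_proxy_path(path):
    """Return the camera entity_id, ignoring an optional .jpg suffix."""
    match = PROXY_URL.match(path)
    return match.group("entity_id") if match else None
```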

Why does it matter? It would enable 'Show me who's at the door', 'Show me my living room', etc., for all cameras (notably old phones converted to IP cams).

Additional information:

class CameraStreamTrait(_Trait):
    """Trait to offer basic camera stream as image functionality."""

    # Google's command for this trait; HASS normally keeps this in a
    # named constant, but the literal value is used here for clarity.
    commands = ['action.devices.commands.GetCameraStream']

    @staticmethod
    def supported(domain, features):
        """Test if state is supported."""
        return domain in (camera.DOMAIN,)

    def sync_attributes(self):
        """Return CameraStream attributes for a sync request."""
        return {
            'cameraStreamNeedAuthToken': False,
            'cameraStreamSupportedProtocols': [],
        }

    def query_attributes(self):
        """Return CameraStream query attributes."""
        # Point Google at the camera_proxy endpoint for this entity.
        picture_url = (self.hass.config.api.base_url +
                       '/api/camera_proxy/' + self.state.entity_id)
        return {'cameraStreamAccessUrl': picture_url}

    async def execute(self, command, params):
        """Execute a CameraStream command."""
        # Nothing to do here: the stream URL is already returned
        # from query_attributes.
Sounds as though you are well on the way to your first pull request.

The thing is that I don't know if changing the pattern would be safe. The default content type of a camera picture is jpg, although it can differ per camera, and for some cameras (for example LocalFile) the content type can vary depending on the guess_type output. So the only (or proper) way to do this is to change CameraImageView to resolve the type first and register the method with the correct extension based on the MIME type. That, however, is way out of my league, as my Python skills and knowledge of how HASS is built are next to zero :wink:
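For what it's worth, the type-resolution step could be sketched with the standard-library mimetypes module (the helper name is mine; this is not how CameraImageView actually works):

```python
import mimetypes

def extension_for(content_type):
    """Map a camera's MIME type to a URL extension, defaulting to .jpg."""
    # jpg is the camera component's default content type, so special-case
    # it rather than relying on mimetypes' platform-dependent ordering.
    if content_type in (None, "image/jpeg"):
        return ".jpg"
    return mimetypes.guess_extension(content_type) or ".jpg"
```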

Thanks for the work so far! I’d love to see this fully implemented.

I wonder if it would be easier to add support for a different format of camera proxy URL, for example, something like:


and then the code would check for an entity_id parameter first; camera.jpg would only be used as the filename served to the browser / Google endpoint.
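A minimal sketch of that parsing logic (the URL layout is the hypothetical one suggested above, not an existing HASS endpoint):

```python
from urllib.parse import urlparse, parse_qs

def entity_from_url(url):
    """Extract entity_id from a URL ending in camera.jpg with an entity_id parameter."""
    parsed = urlparse(url)
    # camera.jpg is only the filename Chromecast sees; the real target
    # camera comes from the entity_id query parameter.
    if not parsed.path.endswith("/camera.jpg"):
        return None
    return parse_qs(parsed.query).get("entity_id", [None])[0]
```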


I’ve recently done a little work to turn RTMP streams into DASH/HLS streams using nginx. I’m thinking of making a PR that utilises the trait @mariuszluciow made, but adds another variable to cameras in their config (not sure how that would work with the UniFi Video component). Essentially, just like we have the “google_assistant*” properties, we’d add another one called “google_assistant_stream” to cameras, where you could define your HLS/DASH stream URI.

What do you think?


I have a WyzeCam running the Dafang software; it already does HLS/DASH using H264/MP3, so I wouldn’t need the nginx proxy, as per the Google Cast docs H264/MP3 is a supported format. If you do a PR to add the google_assistant_stream option, it would be awesome to have the camera show up on Google Home.

By the way, how did you get the UniFi to show up in the Hub? I’m debating buying two of them to do the same for the external house cameras.

Well, currently the Camera component in HASS exposes only an mjpg stream, so even if your camera supports H264/MP3 I still cannot use it, as it’s not exposed as readable data in HASS. Theoretically, the camera entity could enrich its supported_features configuration with information on whether H264/MP3 is supported, plus an additional property containing the URL to the stream. But that’s still quite a substantial change. @danjenkins, is that more or less what you propose?

@mariuszluciow yeah, as a first pass I wasn’t suggesting HASS actually do any exposing of the HLS/DASH streams, just an extra config option on the camera itself (we’d need to do something to help set up the platforms that doesn’t mean configuring each camera individually, like a “base_cast_url” setting or something).

But I think in its most basic form, a camera (for example the generic IP camera) just gets an extra attribute, “google_assistant_camera_stream_url”, with “” as the value. That then allows the camera trait to be added to Google Assistant with that URL (letting you say “show me my front door on the living room hub”).
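In trait terms, the idea could look roughly like this (attribute name as proposed above; the class is a stripped-down illustration, not the real _Trait plumbing):

```python
class StreamUrlTrait:
    """Illustrative stand-in for a google_assistant camera trait."""

    def __init__(self, attributes):
        # In HASS these would come from the camera entity's state attributes.
        self.attributes = attributes

    def query_attributes(self):
        """Expose the user-configured HLS/DASH URL, if any."""
        url = self.attributes.get("google_assistant_camera_stream_url")
        return {"cameraStreamAccessUrl": url} if url else {}
```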

Not saying to not do the standalone picture casting too - not everyone will be able to have HLS/Dash/MP4 streams external to home assistant

Make sense?

I just added my BlueIris cameras using the ‘mjpeg’ platform; I was hoping cameras in HA would flow through to Google Home as cameras, and when that didn’t happen I stumbled upon this.

Regarding the stream format, based on this implementation, shouldn’t the existing mjpg streams work?

It will if you install a proxy that converts the stream on the fly. Anyway, for mjpeg I have a workaround if you have an Android phone lying around: Casting Anything (including android_ip_cameras) through google cast

You may also be able to use the MotionEye Hassio addon in place of BlueIris to create an unauthenticated feed from your existing cameras.

I’d love something like this. Instead of adding to the camera domain, I propose adding it to the Google Assistant entity_config:

stream_url: http://camera/stream/video.mp4
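For example, it could sit alongside the existing per-entity options (stream_url is the proposed, hypothetical key; the surrounding keys follow the current google_assistant entity_config layout):

```yaml
google_assistant:
  project_id: my-project          # existing option
  entity_config:
    camera.front_door:
      name: Front Door            # existing option
      stream_url: http://camera/stream/video.mp4   # proposed option
```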

Looks like they’re going to put this into the core of HA -

Oh sweet! I missed that. Thanks for pointing it out.

I’m maybe missing something here, but looking at that issue it appears to be talking about transcoding the video. Is there any reason this is needed (if your camera provides MJPEG)?

I currently use MotionEye for my cameras and it provides an MJPEG feed, this is playable on the Google Home Hub by just starting media playback and specifying the type as image/jpeg.
I played around with HLS/WebM and all sorts, but the delay was insane: some were about 30 s behind real time.
MJPEG is about 500ms behind live…

Probably for cameras that have sound.

Good point.
I hope that mjpeg would be an option though, as there are certain scenarios where sound is not required, and not having to transcode would be a big bonus. (Baby camera, outdoor security camera etc)

@mariuszluciow You mentioned in the OP that you have a working POC for this. Can you provide some details on how we might get cameras exposed via HA?

I already have my feeds in a format that chromecast displays (mjpeg), and have the Google assistant setup from here (not using HA cloud)

Is there some code that I can change to expose the cameras?

In the first post. You need to add a trait exposing the URL to Google Assistant. However, it won’t work, as the camera trait expects a link to an encoded stream, not mjpeg (unless something has changed recently).

Thank you. That’s a shame about mjpeg; I already use Home Assistant to start a cast of the mjpeg stream from my camera manually and it works great.
Everything else (transcoding etc.) added far too much latency to the video stream (I’m running Pi Cameras).