Thanks for the quick response. Please excuse my stupidity, but how do I connect to the web UI from a browser?
Yes, you need TinyCam Pro for it; that is the only version that can host a web UI to act as a proxy for the camera feed.
I've got 2 Wyze Cams (v1, hacked) and found that using their RTSP streams as ffmpeg cameras in HA takes a big toll on my server CPU.
Using TinyCam Pro is a way to shift that workload to my spare Android tablet (and TinyCam can handle motion detection).
I hope there will be better RTSP camera support in HA, or I'll have to try hacking the Wyze Cam again to run an HTTP/MJPEG server.
Btw, you can use camera=n (the order of the enabled cameras?) instead of the camera ID (a unique random number) in the HA mjpeg_url.
There is an API documentation link on the web server page (http://whistler.loginto.me:8083/static/api.html) which explains how to access streams from the server and what else you can do with the API.
Go vote for it here: https://forums.wyzecam.com/t/feature-request-home-assistant-intagration/3971
Since I set a username and password on the TinyCam web server in the app, I had to include http://username:password@… as the URL, and that got it showing.
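For anyone piecing this together, here is a rough sketch of what the resulting HA mjpeg camera config can look like. The host, port (8083, as in the link above), and stream path are placeholders from my setup, not gospel; the api.html page mentioned above lists the exact paths your TinyCam server exposes:

camera:
  - platform: mjpeg
    name: Wyze via TinyCam
    # camera=1 picks the first enabled camera in TinyCam, as noted above.
    # The /axis-cgi/... path is an assumption; confirm it on your server's api.html page.
    mjpeg_url: http://username:password@192.168.1.50:8083/axis-cgi/mjpg/video.cgi?camera=1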
Hey guys,
I got my Wyze Cam up and running in HA using the code from @forte, with a custom Raspberry Pi image that comes pre-installed with Android Things and TinyCam Pro. Just download the IMG here: https://plus.google.com/+willymarlian/posts/BpG8Fsh9xJm, write it to an SD card, boot the Pi, and set up TinyCam Pro to find your Wyze Cam. When it finds the cam, set the brand to Wyze Labs and the camera model to Wyze Cam, and enter the credentials you use in the Wyze app. Then configure TinyCam to run the web server on startup, remove the password for the web server, open the web server, select the live view, inspect the URL in Chrome, copy it, and use it in the code from @forte.
Working good so far!
Can you show us a screenshot of how it looks? Also how does it update? Is the video updated every few seconds?
Not really… You mean the half cut-off picture?
They will get RTSP and all the necessary features soon.
Oh, you mean the picture in the overview. It's just a card showing the camera. Mine is 5 fps (I set that intentionally).
Fingers crossed🤞
Unfortunately I am not a programmer, but maybe this could be ported to Home Assistant code?
It looks like the TinyCam source code has been shared by the author.
Wyze has RTSP in beta testing. I'm going to wait until it is ready for release, but I'm thinking it can be added easily using the generic camera component once RTSP is working.
Wyze RTSP is no longer in beta now:
I tried it myself with the official iOS app updated to 2.3.x and updated 3 v2 cams to the firmware described here: https://support.wyzecam.com/hc/en-us/articles/360026245231-Wyze-Cam-RTSP
I used the ffmpeg camera component to add them to Hass. Works like a charm!
Example config:
camera:
  - platform: ffmpeg
    name: YourSuperWyzeCam
    input: -rtsp_transport tcp -i rtsp://username:passwd@ip_address/live
Enjoy! )
I have mine added as generic, with the still image coming from motionEye… Do you think ffmpeg gets any better performance? My images update every 10 seconds.
With the latest RTSP beta firmware I have it working using the generic IP camera component:
# Camera
camera:
  - platform: generic
    name: Wyze cam
    still_image_url: http://192.168.1.123/jpg
    stream_source: rtsp://username:password@192.168.1.123/live
    verify_ssl: false
Any way to actually grab a still image, or just use a stock icon on the dashboard?
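One option for stills, as a sketch rather than anything Wyze-specific: Home Assistant's camera.snapshot service can save a frame from any camera entity to a file. The entity name and file path below are just placeholders, not from my setup:

# Example automation action; entity_id and filename are placeholders.
- service: camera.snapshot
  data:
    entity_id: camera.wyze_cam
    filename: /config/www/wyze_still.jpg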
Does anyone know what the difference is between the generic platform and the ffmpeg platform? I'm guessing they both use ffmpeg.