Hello All,
I wanted to think outside the box by combining the power of AI (to help me write Python code) with that of VRED. The idea was to ‘import’ as much Home Assistant information as possible into an external tool specialized in 3D visualization. The project's current capabilities:
- A VRED session is launched locally and communicates with Home Assistant via MQTT
- Local and remote use/visualization
- Scene luminosity follows the sun's position
- Floor selection and link to the HA session
- The Google Maps background (including ‘animated’ vehicles) is removed when the camera approaches the house
- For sensors: connectivity or battery status is visualized by a colored sphere above them
- For texts: display of temperature, humidity, time, and electrical power
- For show/hide: objects (envelope, human figure, water drops) appear or disappear when mail is in the mailbox, movement is detected by a presence/motion sensor, or a leak is detected
- Day/night processing: ambient brightness, vehicle headlights and urban lighting, shading
- For actuators: most devices can be activated by clicking, with a yellow color applied to show their operation
- Clicking on a floor transitions the camera to a human view
- Visualization of a PC screen, cameras, Raspberry Pi screens, HX99G
- Animations: gates/doors/blinds/vacuum cleaner…, car entry and exit with flashing beacon, music notes above media players while they are playing, and a reaction when either of the 2 doorbells rings
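To give an idea of how the MQTT side can drive the scene, here is a minimal sketch of the routing logic. The topic names and payload shapes are pure assumptions for illustration (my real topics differ), and the VRED-side handler that would actually show/hide nodes (via VRED's Python API) is left out:

```python
import json

def route_state(topic: str, payload: str):
    """Translate a Home Assistant MQTT message into a (target, action)
    pair that a VRED-side handler could apply, e.g. toggling the
    visibility of the envelope, human figure, or water-drop objects.
    Topics and payload keys here are hypothetical examples."""
    state = json.loads(payload)
    if topic.endswith("/mailbox"):
        return ("envelope", "show" if state.get("mail") else "hide")
    if topic.endswith("/motion"):
        return ("human", "show" if state.get("detected") else "hide")
    if topic.endswith("/leak"):
        return ("water_drops", "show" if state.get("leak") else "hide")
    return (None, None)  # unknown topic: nothing to do
```

In the actual project a paho-mqtt subscriber feeds messages like these into the VRED session, which then updates the matching scene nodes.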
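The sun-driven luminosity can be reduced to a simple mapping from sun elevation to a brightness factor. The thresholds and levels below are illustrative placeholders, not the values the scene actually uses:

```python
def brightness_from_elevation(elevation_deg: float,
                              night_level: float = 0.05,
                              day_level: float = 1.0) -> float:
    """Map the sun's elevation (degrees above the horizon) to a scene
    brightness factor: fully night below -6 deg (civil twilight), fully
    day above +30 deg, linear in between. Thresholds are illustrative."""
    lo, hi = -6.0, 30.0
    t = (elevation_deg - lo) / (hi - lo)
    t = max(0.0, min(1.0, t))          # clamp to [0, 1]
    return night_level + t * (day_level - night_level)
```

The same factor can also gate the day/night extras, such as switching on the vehicle headlights and urban lighting once it drops below some threshold.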
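For the status spheres above the sensors, the color choice boils down to a small rule. The exact palette and battery threshold below are assumptions, not the project's real values:

```python
def sphere_color(connected: bool, battery_pct=None):
    """Pick the status-sphere RGB color for a sensor: red when the
    sensor is unreachable, orange when its battery is low, green
    otherwise. Threshold and colors are illustrative."""
    if not connected:
        return (1.0, 0.0, 0.0)   # red: sensor offline
    if battery_pct is not None and battery_pct < 20:
        return (1.0, 0.6, 0.0)   # orange: low battery
    return (0.0, 1.0, 0.0)       # green: healthy
```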
Enjoy the video: https://youtu.be/IqkmNFBgtds