Real-Time Sync Between PC Monitor and Lights Similar to Hue Sync

Hi :wave:

I’m trying to replicate a feature I greatly miss from Philips Hue: real-time syncing of my lights with my PC monitor. Having the lights dynamically change in accordance with whatever I’m watching was so nice, and I was wondering if there is any reliable way to achieve this with Home Assistant.

I’ve stumbled upon a project (HASS Light Sync) that comes close. It syncs a single light with the content displayed on a Windows PC monitor, but it is very choppy and delayed compared to Hue Sync.

I actually tweaked their code a little bit to build my own version that syncs multiple lights according to multiple screen zones, which did work, but made the performance even worse. Has anyone successfully implemented or come across a more efficient method to achieve smooth, real-time light syncing with their PC monitor?

Thanks!

Here’s the code I have right now, mostly thanks to Domi2803:

// Crate-level attribute (#![...]) so it covers the whole file; the outer
// #[allow(...)] form only applied to the item directly below it.
#![allow(unused_must_use)]

use captrs::*;
use console::Emoji;
use std::time::Duration;

use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct Settings {
    api_endpoint: String,
    light_entity_names: Vec<String>,
    token: String,
    grab_interval: i16,
    skip_pixels: i16,
    smoothing_factor: f32,
    monitor_id: i16,
}

#[derive(Serialize, Deserialize)]
struct HASSApiBody {
    entity_id: String,
    rgb_color: [u64; 3],
    brightness: u64,
}

#[tokio::main]
async fn main() {
    let term = console::Term::stdout();
    term.set_title("HASS-Light-Sync running...");
    
    println!("{}hass-light-sync - Starting...", Emoji("💡 ", ""));
    println!("{}Reading config...", Emoji("⚙️ ", ""));
    
    let settingsfile = std::fs::read_to_string("settings.json").expect("❌ settings.json file does not exist");
    let settings: Settings = serde_json::from_str(settingsfile.as_str()).expect("❌ Failed to parse settings. Please read the configuration section in the README");

    println!("{}Config loaded successfully!", Emoji("✅ ", ""));

    let steps = settings.skip_pixels as u64;
    let grab_interval = settings.grab_interval as u64;
    let smoothing_factor = settings.smoothing_factor; // loaded from config, but not applied anywhere below yet
    let _ = smoothing_factor;

    let mut capturer = Capturer::new(settings.monitor_id as usize).expect("❌ Failed to get Capture Object");
    let (w, h) = capturer.geometry();
    let half_h = h / 2;
    let half_w = w / 2;

    let client = reqwest::Client::new();

    let mut last_timestamp = std::time::Instant::now();

    loop {
        match capturer.capture_frame() {
            Ok(ps) => {
                let mut regions_avg = vec![(0, 0, 0); 3]; // top left, top right, bottom

                let mut counts = vec![0; 3];

                for (index, Bgr8 { r, g, b, .. }) in ps.into_iter().enumerate() {
                    let x = index as u32 % w;
                    let y = index as u32 / w;
                    let region = if y < half_h {
                        if x < half_w { 0 } else { 1 }
                    } else {
                        2
                    };

                    if counts[region] % steps == 0 {
                        regions_avg[region].0 += r as u64;
                        regions_avg[region].1 += g as u64;
                        regions_avg[region].2 += b as u64;
                    }
                    counts[region] += 1;
                }

                for region in 0..3 {
                    let (total_r, total_g, total_b) = regions_avg[region];
                    // Number of sampled pixels in this region (every `steps`-th one);
                    // clamp to 1 so a tiny or empty region can't divide by zero.
                    let count = (counts[region] / steps).max(1);
                    let avg_r = total_r / count;
                    let avg_g = total_g / count;
                    let avg_b = total_b / count;

                    let brightness = *[avg_r, avg_g, avg_b].iter().max().unwrap();
                    send_rgb(
                        &client,
                        &settings.light_entity_names[region],
                        &[avg_r, avg_g, avg_b],
                        &brightness,
                        &settings.api_endpoint,
                        &settings.token,
                    ).await;
                }

                let time_elapsed = last_timestamp.elapsed().as_millis();
                last_timestamp = std::time::Instant::now();

                term.move_cursor_up(1);
                term.clear_line();
                println!("{}Colors sent. FPS: {}", Emoji("💡 ", ""), 1000 / time_elapsed.max(1));
                std::thread::sleep(Duration::from_millis(grab_interval));
            },
            Err(error) => {
                println!("{} Failed to grab frame: {:?}", Emoji("❗ ", ""), error);
                std::thread::sleep(Duration::from_millis(100));
                continue;
            }
        }
    }
}

async fn send_rgb(
    client: &reqwest::Client,
    entity_id: &str,
    rgb_vec: &[u64],
    brightness: &u64,
    api_endpoint: &str,
    token: &str,
) {
    let api_body = HASSApiBody {
        entity_id: entity_id.to_string(), // convert &str to an owned String
        rgb_color: [rgb_vec[0], rgb_vec[1], rgb_vec[2]],
        brightness: *brightness,
    };

    let response = client
        .post(format!("{}/api/services/light/turn_on", api_endpoint))
        .header("Authorization", format!("Bearer {}", token))
        .json(&api_body)
        .send()
        .await;

    match response {
        Ok(res) => {
            // Treat any non-2xx status as a failure, and exit non-zero so
            // callers can tell something went wrong.
            if !res.status().is_success() {
                println!("{}Connection to Home Assistant failed: HTTP {}", Emoji("❌ ", ""), res.status());
                std::process::exit(1);
            }
        },
        Err(e) => {
            println!("{}Connection to Home Assistant failed: {}", Emoji("❌ ", ""), e);
            std::process::exit(1);
        }
    }
}
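One thing I noticed while tweaking this: `settings.json` carries a `smoothing_factor`, but the loop never applies it, so every frame's raw average is sent straight to the lights, which is probably part of the choppiness. Here is a minimal sketch of the exponential smoothing I have in mind (the function name `smooth_rgb` and its signature are my own, not from the project) that could be run on each region's color before `send_rgb`:

```rust
/// Exponentially smooth a new RGB sample toward the previously sent one.
/// factor = 1.0 passes the new sample through unchanged; factor = 0.0
/// freezes the old color. Values in between blend the two.
fn smooth_rgb(prev: [u64; 3], new: [u64; 3], factor: f32) -> [u64; 3] {
    let mut out = [0u64; 3];
    for i in 0..3 {
        // Move `factor` of the way from the previous value to the new one.
        out[i] = (prev[i] as f32 + (new[i] as f32 - prev[i] as f32) * factor).round() as u64;
    }
    out
}

fn main() {
    // With factor 0.5, a jump from black to full red lands halfway.
    assert_eq!(smooth_rgb([0, 0, 0], [255, 0, 0], 0.5), [128, 0, 0]);
    // With factor 1.0, the new color passes through unchanged.
    assert_eq!(smooth_rgb([10, 20, 30], [200, 100, 50], 1.0), [200, 100, 50]);
    println!("ok");
}
```

Keeping one `[u64; 3]` of the last sent color per region and passing each new average through this before sending would trade a little responsiveness for much steadier transitions. No idea yet if it fixes the perceived lag, though, since the HTTP round trips per frame are still sequential.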


Hi @mr-elephant
Any update on your work since March?
I am also looking for this feature directly from Home Assistant.
Did you have any success with your script?