No voice commands used:
https://imgur.com/gallery/jIgBmiQ
WARNING: Voice control with Google Assistant - better kill your own Google Assistant before watching this
Commands: prepare pizza | prepare quesadilla | test quesadilla | test homeassistant video
The voice commands are not fixed phrases (the kind you set via the Google Home app) - they are dynamic, accepting any query.
Voice control with Google Assistant was a bit tricky, since the only way to get an Assistant query into HomeAssistant was by using IFTTT.
Command breakdown: prepare "food" will play videos from a single channel, while test "something" is not restricted and will broadcast the first YouTube video that matches the query. There is also a third command, mix, which is also channel-bound but for mixology.
This project is not complete yet. I've been planning this for over a year (it's one of the reasons I installed rail lighting) and I've been working on it (on and off) for the last 6 months to get it to a state where it's stable and fun to use, but since it looks a lot better than I ever expected… here we go… another big ass post…
I still want to buy another projector (the 5th, just for this project), but this time a non-smart one, since I believe I can make it even better without Android's annoying boot times, and I will definitely do a complete code overhaul once I decide what I want from it and how I will use it most often… so a part II of this might happen.
Since I have this installed on the kitchen furniture, I wanted to give it the feel of a kitchen assistant . . . so alcohol-related stuff, maybe cooking, and as soon as I integrate a food delivery system into Home Assistant… there will definitely be a part II.
HARDWARE
The basic idea is very simple: I have rail lighting with Cinema Track Projectors; the E27 socket, bulb, and the rest of the electronics were stripped out, and everything inside was replaced with a DLP P09 projector on a stand and a DIY power supply-adaptor-thingie (official name).
The projector has a standard tripod screw thread, which allows me to fit it on the stand; then, through extremely precise eyeballing measurements, I managed to fit that stand perfectly(ish) horizontally inside the light projector. . . and any hypothetical misalignment is solved by software/keystone correction.
The power supply-adaptor-thingie (which is completely wrapped in electrical tape in the final build) is made out of a random modular power plug connected to a Shelly 1 relay, following this connection scheme. I would recommend using the Shelly 1PM 220V relay instead, but I only had a dry-contact one at hand.
The supply-adaptor-thingie is there to power the projector from its original charger, which I didn't want to cannibalize just in case I decide to take the projector out at some point; its electrical input comes from what initially sat at the end of the E27 light socket, getting power from the lights' electrical rail.
The point of the relay is to regulate the charging of the projector: since it is powered from the lighting circuit, I don't want it charging every time I turn on the lights (even if only for 5 minutes), in order to preserve the life of the battery.
The projector comes with an integrated battery, so it will work even if the lights are off (since it is powered like a light bulb)… it will also work when the power is completely out… but probably with no Netflix/internet/etc.
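I'm not detailing my exact charging logic here, but as a rough sketch, gating the relay from HA can be as simple as a scheduled top-up window (the entity name switch.holodeck_charger and the times below are placeholders, not my actual config):
- id: holodeck_charge
  alias: holodeck_charge
  initial_state: on
  trigger:
    - platform: time
      at: '03:00:00'
  action:
    # close the relay so the original charger gets mains power
    - service: switch.turn_on
      entity_id: switch.holodeck_charger
    # let it top up for two hours, then cut the power again
    - delay: '02:00:00'
    - service: switch.turn_off
      entity_id: switch.holodeck_charger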
CONTROL
This is an Android projector, so you have WiFi and ADB to control it. The greatest challenge (check the end of the post…) is turning it on; this particular model has an IR port on the front, so I'm using a Broadlink (in my kitchen) to turn it on. It also has a physical power button (most projectors this size come with a touch interface instead)… so worst-case scenario, you can connect a smart relay in parallel with the button and you have control.
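The Broadlink side is nothing special: once the power command is learned with remote.learn_command, a script along these lines fires it (a sketch - the remote entity and the projector/power names are whatever you used when learning the command):
script:
  holodeck_power_on:
    sequence:
      # replay the learned IR power command through the kitchen Broadlink
      - service: remote.send_command
        entity_id: remote.kitchen_broadlink
        data:
          device: projector
          command: power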
After it's turned on, there are no more problems. You can add some peripherals to it; I'm only using an H18 keyboard.
ADB seems slow sometimes, so I prefer using the Fully Kiosk Browser API (if you've never heard of it, don't judge it by its looks/logo/etc. - it's amazing).
I prefer turning it on and changing the screens simply by using my light switches, which can control anything in the home. . . if you ever have way too much time, you can check out that integration as well.
KITCHEN ASSISTANT SCENARIO
TL;DR LOGIC:
- I ask Google Assistant for something
- I get that certain something in Home Assistant using IFTTT
- From HA it's sent to a bash script that builds a YouTube URL around it and sends that URL back to HA
- HA casts the final URL to a device
The IFTTT integration is as follows:
- On the HomeAssistant side, you will need to set IFTTT up as an integration and save the webhook URL you will get.
- On the IFTTT applet IF, you will need to use Google Assistant - "Say a phrase with a text ingredient".
- On the IFTTT applet THEN THAT, you will select Webhooks - "Make a web request":
- Under URL you need to use the URL that you got in HomeAssistant in the first step
- Method: POST
- Content Type: application/json
- Body:
{ "action": "holodeck", "cast": "{{TextField}}" }
{{TextField}} is going to be your Assistant query, saved under "$" in the IF step.
So far, IFTTT will send your voice command to your HA instance, but in order to use it you have to intercept it in HomeAssistant using an automation like this:
- id: ifttt_cast
  alias: ifttt_cast
  initial_state: on
  trigger:
    - platform: event
      event_type: ifttt_webhook_received
      event_data:
        action: 'holodeck'
  action:
    - service: input_text.set_value
      data_template:
        entity_id: input_text.holo_query
        value: '{{ trigger.event.data.cast }}'
    - service: shell_command.holodeck_x
This automation is triggered by an IFTTT event and will add the received string (your Assistant query, from "$" or "{{TextField}}") to an input_text.
input_text:
  holo_query:
    name: Holo
    initial: ' '
  holo_video_url:
    name: Video URL
    initial: ' '
Everything up to the last service in the automation above only gets the Google Assistant voice command string into Home Assistant; the last service is what you actually want to do with it.
In my case, this is the shell_command.holodeck_x:
holodeck_x: sudo sh -c 'cd /home/pi/.SCRIPTS/HOLO && sudo bash holodeck_x.sh {{states("input_text.holo_query") | replace(' ', '+') }}'
In the future, I will integrate this completely in HA, but for now, I have it as an independent shell script.
This script accepts an argument, and the command above will send your Google Assistant query as the argument to that script.
If you ask for the recipe for a "Margarita" there will be no problems, but if you ask for something like "jack on the rocks" (just an example… I don't actually need a recipe for that), the spaces between the words will break the URL, so the | replace(' ', '+')
part is there to ensure that the URL stays YouTube-friendly no matter what you say, by replacing spaces with "+"; the script ends up being called as holodeck_x.sh jack+on+the+rocks.
The purpose of the script is to generate a video ID for a YouTube video corresponding to your query, build the corresponding URL (embedded and set to autoplay) and send the final URL back to HA, to use it however you want. (I don't send it straight to the projector because I might want to use it on a smart mirror or on some other DIY display.)
In order to get it working, you will need a Google project API key (with the YouTube Data API v3 enabled) and a HomeAssistant Long-Lived Access Token, and you will need to pass them to the script below.
- Google API:
- Go to https://console.cloud.google.com/
- Create a project
- From the LEFT menu - select APIs & Services - Credentials
- From the top bar - "+ Create credentials" - API key
- Home Assistant Token
- Go to http://your.homeassistant.ip:8123/profile
- Scroll down to the bottom - Long-Lived Access Tokens
- Click "CREATE TOKEN"
holodeck_x.sh:
#!/bin/bash
# CREDENTIALS
API_KEY="YOUR_API_KEY"
# VIDEO DATA
QUERY=$@
VIDEO_ID=$(sudo curl "https://www.googleapis.com/youtube/v3/search?part=snippet&maxResults=1&order=relevance&publishedAfter=2018-01-01T00:00:00Z&q=${QUERY}&safeSearch=none&type=video&key=${API_KEY}" | grep "videoId" | awk '{print $2}' | tr -d \")
# VIDEO URL
URL="https://www.youtube.com/embed/${VIDEO_ID}?autoplay=1"
# # TESTING
# echo ${URL}
# HA INTEGRATION
curl -X POST -H "Authorization: Bearer your_homeassistant_authorization_bearer" \
-H "Content-Type: application/json" \
-d '{"state": "'"${URL}"'", "attributes": {"editable": "false", "min": "0", "max": "100", "pattern": "null", "mode": "text", "friendly_name": "Video URL"}}' \
http://192.168.0.100:8123/api/states/input_text.holo_video_url
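If you want to test the script outside HA, just call it with a '+'-joined query (any example query will do) and check the Video URL input_text afterwards, assuming the key and token are already filled in:
sudo bash holodeck_x.sh old+fashioned+recipe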
Finally, we have the final URL, YouTube-friendly and ready to be broadcast to a device, in this case a smart projector. (Planning to exploit this integration a lot more in the future.)
The bash script will send the URL to another input_text, and from there it will be picked up by an automation.
If you manually write/paste a URL into that input_text in the HA UI, it will also trigger the automation, and the manually entered URL will be broadcast to your device.
The projector is powered by Android, so I could cast directly to it via ADB and wait for the YT app to launch, but it takes way too long, so I'm using Fully Kiosk Browser - an amazing app, very useful in this project, but not mandatory.
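For reference, the ADB route would be just another shell_command; something like this should work after an adb connect to the projector (the IP/port and the network-ADB setup are assumptions, adjust to your device):
shell_command:
  # fire an Android VIEW intent with the URL; the default handler (browser/YouTube app) picks it up
  holodeck_cast_adb: adb -s your.projector.ip:5555 shell am start -a android.intent.action.VIEW -d "{{ states.input_text.holo_video_url.state }}"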
Automation:
- id: holodeck_broadcast
  alias: "HOLODECK BROADCAST"
  initial_state: on
  trigger:
    - platform: state
      entity_id: input_text.holo_video_url
  action:
    - service: script.holodeck_cast_link
    - delay: 00:00:03
    - service: script.daydream_off
Script:
holodeck_cast_link:
  sequence:
    - service: homeassistant.turn_off
      entity_id: automation.holodeck_random
    - service: shell_command.daydream_off
    - service: shell_command.holodeck_cast_link
For now you can ignore the first 2 services.
shell_command:
  holodeck_cast_link: curl "http://your.projector.ip:2323/?password=YOUR_KIOSK_PASSWORD&type=json&cmd=loadUrl&url={{states.input_text.holo_video_url.state}}"
At this point, I have 3 IFTTT applets that eventually link to 3 different bash scripts. I will most probably unify everything in the future, or just scrap what I'm not using. The case presented so far will create a YT link based on YouTube's relevance ranking alone (order=relevance in the script); basically, it will be the first video that YT lists if you manually search for something.
The other 2 are channel-specific, one for food and one for mixology; the main difference is in the bash script, under VIDEO DATA: there is a static CHANNEL_ID (use whichever channel you need), which is then added to the URL template as &channelId=${CHANNEL_ID}
# VIDEO DATA
QUERY=$@
CHANNEL_ID="UCRIZtPl9nb9RiXc9btSTQNw"
VIDEO_ID=$(sudo curl "https://www.googleapis.com/youtube/v3/search?part=snippet&channelId=${CHANNEL_ID}&maxResults=1&order=relevance&publishedAfter=2018-01-01T00:00:00Z&q=${QUERY}&safeSearch=none&type=video&key=${API_KEY}" | grep "videoId" | awk '{print $2}' | tr -d \")
HOMEASSISTANT UI SCENARIO
If you can cast any URL, it only makes sense to have a full HomeAssistant interface on display. Not as practical as a wall-mounted tablet, but a lot more futuristic. For me, this is the default view of the projector.
I'm using Fully Kiosk Browser for this, and I have an automation that randomizes the lovelace tabs to be cast.
- id: holodeck_random
  alias: "HOLODECK RANDOM"
  initial_state: off
  trigger:
    - platform: time_pattern
      minutes: "/10"
  action:
    - service: script.turn_on
      data_template:
        entity_id: '{{ ["script.poltergeist_uix", "script.poltergeist_climate", "script.poltergeist_electricity", "script.poltergeist_player"] | random }}'
script:
  poltergeist_uix:
    sequence:
      - service: shell_command.daydream_off
      - service: shell_command.poltergeist_uix
shell_command:
  poltergeist_uix: curl "http://your_projector_ip:2323/?password=YOUR-KIOSK-PASSWORD&type=json&cmd=loadUrl&url=http://your_homeassistant_ip:8123/lovelace/UIX"
SMART MIRROR INTERFACE
Since I'm using MagicMirror and I'm casting stuff… it makes sense to have it integrated as well.
Funny thing is that this is actually the only integration I could add, considering that my smart mirror setup is already really powerful, being fully integrated with any voice assistant and with Home Assistant.
Full details about that in this… slightly… smaller post.
This part is mostly useful since I have my phone notifications, news and other stuff mirrored . . . on the mirror.
The Smart Mirror is cast just like the Home Assistant UI presented above.
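Casting it is the same Fully Kiosk loadUrl call as above, just pointed at the mirror; something like this (assuming MagicMirror's web server is reachable on its default port 8080 - adjust the IPs to your setup):
shell_command:
  poltergeist_mirror: curl "http://your_projector_ip:2323/?password=YOUR-KIOSK-PASSWORD&type=json&cmd=loadUrl&url=http://your_magicmirror_ip:8080"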
MOVIE STREAMING
All my drive for automation comes from movies, so if a streaming platform has an Android app or a web app, it will most certainly not be forgotten.
It is absolutely amazing that the lighting projector I'm using can rotate 360 degrees horizontally on its rail and is capable of 360 degrees of vertical adjustment as well . . . ish (until it hits the ceiling); that, combined with the auto and manual keystone correction of the projector, means it can be rotated or moved and project any of this on any wall/area of the room it is installed in.
…next project - I need to motorize it…
FULLY FUNCTIONAL ANDROID DEVICE
At this point there is no surprise - there is a fully functional Android 9.0 device with all its perks and drawbacks (boot time is the main problem).
Add a wireless keyboard or any peripherals and you have access to any Android app. Add a 3.5 mm jack microphone and you can use the integrated Google Assistant / Alexa / whatever; the same microphone + Fully Kiosk Browser or Tasker and you can have the projection change depending on noise levels, etc.
I'm thinking about using Tasker to automate some delivery services that don't have APIs.
Maybe call for an Uber and see it on the map on the projection… you get the point… it's nice to have a functional OS as a backup.
CHALLENGES (A.K.A. dumb shit)
The greatest challenge with this project was turning it on… I shit you not… it was exhausting. . . everything else is trivial, but turning on something like this (installed in this manner) almost drove me insane.
The Android build is heavily modified, and even though you have access to the bootloader/fastboot, it will not accept any commands; the idea was to set oem off-mode-charge 0 so it would boot when the charger is connected / the relay is powered.
Then I tried to re-mount the Android /system partition to be able to write to read-only system files, in order to replace the charging icon with a boot/reboot script… so it would start when the charger is connected / the relay is powered… the result was even less promising… I had to buy a new motherboard…
On my third attempt, I got really close… if that had worked, I would actually have been impressed with my own ingenuity (spoiler: it didn't).
The problem was that the IR port was on the back of the device, which is encased in aluminum (by the light projector), so no remote can reach that port… so what is the next logical step? If something is not where you want it to be - you move it…
Therefore, since I already had scrap parts, I desoldered a 38 kHz IR receiver from a previous candidate and soldered it in parallel with the existing IR receiver… but not directly on the motherboard… I used some wires to keep the receiver mobile and at a distance; sure, I had a little dick hanging out of the projector… but it was working… until I turned it off…
That was a DLP X2, which comes with 2 remotes (this is not a typo, it has 2 remotes) and none of them can turn the damn thing on - I'm not crazy, the seller confirmed it: the X2 doesn't start from the remote, you need to use the touch interface first, then you can use the remote . . . so I bought another one.
On my fourth attempt, the results were overwhelming; I bought an L1 projector this time (simply because, at this point, if it was really really quiet, I could hear my wallet screaming in agony); its greatest feature was its price - only 40 USD | no Android, no nothing. I paired it with a Chromecast with Google TV and a MiBox S (for some smarts) and it was amazing - this is the reason I want to buy another non-smart projector (but a DLP one) for a part II.
Boot time was 10 seconds, and it booted into the last used app/screen (HDMI) - perfect! . . . just one problem: it's not a Texas Instruments DLP . . . so the image is absolute garbage… I mean, it's so shit that even the Chinese store that sells it says that it is a TOY and it is for children.
By the time I got to my fifth attempt, they had launched the 2020 DLP P09 - IR port on the front (that actually works), Android 9.0, physical ON button (so if everything else fails, I can remove the button and add a relay to control it) - it's like it was specifically designed for this project - it's the one I'm currently using.
OK… I think it's time to stop… sorry, I talk a lot… Have fun and thanks for the read!