Any way to send all events to Google BigQuery?

Hi,

I’d love to send all my data to Google BigQuery so I can create a dashboard later on. I’m planning on using Google Data Studio to see times, the most-used automations, etc.

Any way to do this? Thanks!

You could read it from the logs.
You could write an AppDaemon app to send it all.
You could send all events out over MQTT and ingest them.
You could subscribe using websockets and send it.
You could query it directly from the SQLite database.
You could use the REST API to query and import (rough sketch below).
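
To make the REST API route a bit more concrete, here is a rough, untested sketch. It assumes you have a long-lived access token (older installs used the API password instead), that jq is installed, and that YOUR_DATASET.states is a table name you’re happy with; TOKEN, the host, and the dataset/table names are all placeholders.

# Dump all current states from the Home Assistant REST API
curl -s -H "Authorization: Bearer TOKEN" \
  http://homeassistant.local:8123/api/states > /tmp/states.json

# Convert the JSON array to newline-delimited JSON, which bq can load
jq -c '.[]' /tmp/states.json > /tmp/states.ndjson

# Load into BigQuery; autodetect may struggle with the varying "attributes"
# field, so treat this as a starting point rather than a finished pipeline
bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON YOUR_DATASET.states /tmp/states.ndjson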

:smiley:


Thanks! But I should’ve said I’m not good at coding, so I was hoping for more of an “export to Google Drive and import from there” kind of thing, LOL. I’ll look into those options, though :slight_smile:

So I was going after the same thing, and instead of trying to get all sorts of custom components in place I decided on the quick and dirty route. It works! Not sure for how long, but we’ll see.

Basically, I export all rows from the events table that are from the previous day to CSV every day, then load the CSV into BigQuery. As simple as that.

First you have to install the Google Cloud SDK (this also installs the bq command line tool).
This is what I did on my DietPi installation, but more instructions on installing the Cloud SDK can be found here: https://cloud.google.com/sdk/docs/quickstart-debian-ubuntu

# Install lsb-release
sudo apt-get install lsb-release

# Create environment variable for correct distribution
export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"

# Add the Cloud SDK distribution URI as a package source
echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | sudo tee -a /etc/apt/sources.list.d/google-cloud-sdk.list

# Import the Google Cloud Platform public key
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -

# Update the package list and install the Cloud SDK
sudo apt-get update && sudo apt-get install google-cloud-sdk
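
You also need to authenticate the SDK and make sure the target dataset exists before bq can load anything into it. Roughly something like this (adjust the dataset name and location to your own):

# Log in and pick the Google Cloud project to use (interactive)
gcloud init

# Create the dataset the script below loads into, if it doesn't exist yet
bq --location=EU mk --dataset YOUR_DATASET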

Then I created this bash script in /etc/cron.daily to export the previous day’s events from the database and load them into BigQuery every day.
#!/bin/bash

# Export yesterday's events to CSV

sqlite3 -header -csv /mnt/YOUR_HASS_INSTALLATION/homeassistant/home-assistant_v2.db "SELECT * FROM events WHERE DATE(time_fired) = DATE('now', '-1 day');" > /tmp/events.csv

# Load the CSV into BigQuery

bq --location=EU load --autodetect --noreplace --source_format=CSV YOUR_DATASET.events /tmp/events.csv

# Delete the temporary CSV

rm /tmp/events.csv
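
One assumption worth calling out about /etc/cron.daily on Debian-based systems: the script only runs if it is executable, and run-parts skips file names containing a dot, so save it without a “.sh” extension. The file name below is just an example:

# Make the cron.daily script executable; keep the name free of dots
sudo chmod +x /etc/cron.daily/hass-bq-export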

Hope this helps!

Kees.

Hi!

Thanks for the workaround! Problem is… I’m running Hassio, so I don’t know whether I can follow those steps. There’s no regular SSH access :confused:

Hi,

What about this solution: GitHub - lborie/home-assistant-to-big-query
Does this work?