Custom integration - file upload to S3

I just published a custom integration to upload files to AWS S3. In case you don't know what S3 is, it is Amazon's file store in the cloud, which offers effectively unlimited capacity at low cost. I am using this integration to back up images captured by my motion-triggered camera to S3 for long-term storage and analysis. If there is interest in this I will add it to HACS.
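Here is a minimal sketch of the kind of automation I use it in (the motion sensor, camera, and bucket names below are placeholders):

- alias: Backup motion snapshot to S3
  trigger:
    - platform: state
      entity_id: binary_sensor.driveway_motion  # placeholder motion sensor
      to: "on"
  action:
    # save a snapshot from the camera to local disk
    - service: camera.snapshot
      data:
        entity_id: camera.driveway  # placeholder camera entity
        filename: /config/www/driveway_latest.jpg
    # upload the snapshot to S3 for long-term storage
    - service: s3.put
      data:
        bucket: my-backup-bucket  # placeholder bucket name
        key: snapshots/driveway_latest.jpg
        file_path: /config/www/driveway_latest.jpg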
Cheers


Yes interested. HACS would be good!


Made PR to HACS


FYI, I submitted a PR to your repo to add support for configuring this integration via the UI, the ability to configure multiple S3 buckets, and the ability to specify the S3 storage class in the ‘put’ service.

https://github.com/robmarkcole/HASS-S3/pull/4
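With those changes, a put call can target any of the configured buckets and a cheaper storage tier, e.g. (bucket name is a placeholder):

service: s3.put
data:
  bucket: my-archive-bucket  # any bucket configured in the integration
  key: archive/snapshot.jpg
  file_path: /config/www/snapshot.jpg
  storage_class: STANDARD_IA  # S3 Standard-Infrequent Access tier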


I can't find this in HACS; was it merged?

You need to add @robmarkcole's GitHub repo to HACS as a custom repository.

@robmarkcole or others – I am stumped. I am most likely doing something stupid, but I cannot get this integration to work.

The UI configuration of the integration does not seem to work correctly: I enter the info and hit submit, and it just clears the fields with no changes made. So I manually edited my configuration.yaml to add the s3 domain info (pasted below).

When trying to call the s3.put service, I get the following error:

Logger: custom_components.s3
Source: custom_components/s3/__init__.py:99
Integration: Amazon S3 (documentation, issues)
First occurred: 12:26:04 PM (1 occurrences)

This is my automation:

  - service: s3.put
    data_template:
      bucket: "******"
      key: "snapshot/doorbell.jpg"
      file_path: "/config/www/snapshot/doorbell.jpg"
      storage_class: "STANDARD_IA"

Like I said, I am sure it is something stupid I have done wrong, but for the life of me I cannot figure it out.

configuration.yaml

# Configure a default setup of Home Assistant (frontend, api, etc)
default_config:

# Text to speech
tts:
  - platform: google_translate

group: !include groups.yaml
automation: !include automations.yaml
script: !include scripts.yaml
scene: !include scenes.yaml
s3:
  aws_access_key_id: AKI*********NPQ
  aws_secret_access_key: UnO*********pXjo0Es
  region_name: us-east-1

What error message? Given the line (99), presumably 'S3 client instance not found', which indicates your config has an error, possibly in region_name.
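As an aside, the keys don't need to sit in configuration.yaml in plain text; Home Assistant's standard !secret lookup works for these fields too:

s3:
  aws_access_key_id: !secret aws_access_key_id
  aws_secret_access_key: !secret aws_secret_access_key
  region_name: us-east-1

with the actual values kept in secrets.yaml.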

Hi @robmarkcole - so I sort of figured it out. It has something to do with the latest release and the config flow. As mentioned, I was manually putting it in configuration.yaml, not realizing that does not work for an integration that has been migrated to config flow. With the latest release I installed it through HACS fine, but when adding the integration under Configuration, entering the Region, Key, and Secret did nothing: I hit submit and the fields just cleared with no indication.

So I removed the HACS version, downloaded S3 0.4, manually copied those files into custom_components, and then added the integration under Configuration, and tada, it worked: after I entered the Region/Key/Secret it said successful, and I now see the entries in the .storage/config files.

Plus it works in my automation.

I do want to thank you for this integration. It solved the final piece of my white whale effort. Specifically, using a combination of your integration, UniFi Protect, and Alexa Media Player: when someone rings my G4 doorbell, it takes a snapshot, sends it to S3, then plays a message on all of the Echo Shows in the house, and finally sets the background image of the Echo Shows to the camera snapshot.
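Roughly, the chain looks like this (entity IDs and the bucket are placeholders, and I've left out the Echo Show background step, which goes through Alexa Media Player):

- alias: Doorbell snapshot to S3 and announce
  trigger:
    - platform: state
      entity_id: binary_sensor.g4_doorbell_ring  # placeholder doorbell sensor
      to: "on"
  action:
    # grab a fresh snapshot from the doorbell camera
    - service: camera.snapshot
      data:
        entity_id: camera.g4_doorbell  # placeholder camera entity
        filename: /config/www/snapshot/doorbell.jpg
    # push the snapshot to S3
    - service: s3.put
      data:
        bucket: my-bucket  # placeholder bucket name
        key: snapshot/doorbell.jpg
        file_path: /config/www/snapshot/doorbell.jpg
    # announce on the Echo devices via Alexa Media Player
    - service: notify.alexa_media
      data:
        target:
          - media_player.kitchen_echo_show  # placeholder Echo device
        message: Someone is at the front door
        data:
          type: announce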


@scott7685 I would be very interested to see a write-up on Hackster.io.

I'm currently struggling with the add-on. I installed it and tried to upload a backup from my /backup folder using:

service: s3.put
data:
  bucket: MyFancyBucketName
  key: myBackup.tar
  file_path: /backup/ccf26263.tar
  storage_class: STANDARD_IA

and the file actually exists when checking via the Terminal add-on.


I receive the error:

2022-06-09 22:38:04 ERROR (SyncWorker_6) [custom_components.s3] Invalid file_path /backup/ccf26263.tar

so I thought that /backup isn't whitelisted.

Tried to add it to my config:

homeassistant:
  allowlist_external_dirs:
    - "/backup"

which results in an invalid config: Not a directory @ data['allowlist_external_dirs'][0]

Any hints on what I am doing wrong? Is /backup mounted somewhere else? I'm running Home Assistant OS 8.1 on an RPi4.

I can't remember the solution, but this has been asked before; a bit more searching should resolve it.

When adding your GitHub repo in Home Assistant under Settings → Add-ons → Add-on Store → Add repository, unlike two others I have added, it comes up with the error 'https://github.com/robmarkcole/HASS-S3 is not a valid add-on repository'.

Is this compatible with the supervised HA install, or am I missing something else?

Because it is not an add-on; it is a custom integration.


You need to add it via HACS.

Hey man @robmarkcole, you just saved my life! Awesome job. I am running it successfully here.


Can you help with how to access a file in an Amazon S3 bucket from an iOS notification? I have your integration up and running, with new camera snapshots uploading correctly, but I can't figure out the permissions needed to allow them to be attached to an iOS notification.

Edit: Never mind, I was able to use the Sign URL service to get a public URL for the image and send that with the notification. Works great!
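For anyone else trying this, attaching the signed URL to an iOS notification looks roughly like this (the notify target and URL below are placeholders; check the integration's README for the exact signing service name):

service: notify.mobile_app_my_iphone
data:
  message: Someone is at the door
  data:
    attachment:
      url: "https://example-signed-url"  # signed URL produced by the integration
      content-type: jpeg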


Were you able to find a fix, @ElGaucho? I'm having the exact same issue as you.