Custom integration - file upload to S3

I just published a custom integration to upload files to AWS S3. In case you don't know what S3 is, it is Amazon's cloud file store, which has effectively unlimited capacity at low cost. I am using this integration to back up images captured by my motion-triggered camera to S3 for long-term storage and analysis. If there is interest in this I will add it to HACS.
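
Roughly, the backup automation looks like this (entity names and the bucket are placeholders, and exact service fields may vary by version):

  - alias: "Backup motion snapshot to S3"
    trigger:
      - platform: state
        entity_id: binary_sensor.garden_motion    # placeholder motion sensor
        to: "on"
    action:
      # Save a snapshot from the camera locally first
      - service: camera.snapshot
        data:
          entity_id: camera.garden                # placeholder camera entity
          filename: "/config/www/snapshot/garden.jpg"
      # Then upload it to S3 for long-term storage
      - service: s3.put
        data:
          bucket: "my-backup-bucket"              # placeholder bucket name
          key: "snapshot/garden.jpg"
          file_path: "/config/www/snapshot/garden.jpg"
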
Cheers


Yes interested. HACS would be good!


Made a PR to HACS.


FYI, I submitted a PR to your repo adding support for configuring this integration via the UI, configuring multiple S3 buckets, and specifying the S3 storage class in the ‘put’ service.

https://github.com/robmarkcole/HASS-S3/pull/4
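
For example (bucket names below are placeholders), the put service can then be pointed at different buckets and storage classes per call:

  - service: s3.put
    data:
      bucket: "camera-snapshots"          # placeholder bucket
      key: "snapshot/front_door.jpg"
      file_path: "/config/www/snapshot/front_door.jpg"
      storage_class: "STANDARD_IA"        # infrequent access tier
  - service: s3.put
    data:
      bucket: "long-term-archive"         # placeholder second bucket
      key: "snapshot/front_door.jpg"
      file_path: "/config/www/snapshot/front_door.jpg"
      storage_class: "GLACIER"            # archival tier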


I can’t find this in HACS. Was it merged?

You need to add @robmarkcole’s repository (https://github.com/robmarkcole/HASS-S3) to HACS as a custom repository.

@robmarkcole or others – I am stumped. I am most likely doing something stupid, but I cannot get this integration to work.

The UI configuration of the integration does not seem to work correctly: I hit submit after entering the info and it just clears the fields with no changes made. So I manually edited my configuration.yaml to add the s3 domain info (pasted below).

When trying to call the s3.put service, I get the following error:

Logger: custom_components.s3
Source: custom_components/s3/__init__.py:99
Integration: Amazon S3 (documentation, issues)
First occurred: 12:26:04 PM (1 occurrences)

This is my automation:

  - service: s3.put
    data_template:
      bucket: "******"
      key: "snapshot/doorbell.jpg"
      file_path: "/config/www/snapshot/doorbell.jpg"
      storage_class: "STANDARD_IA"

Like I said, I am sure it is something stupid I have done wrong, but for the life of me I cannot figure it out.

configuration.yaml

# Configure a default setup of Home Assistant (frontend, api, etc)
default_config:

# Text to speech
tts:
  - platform: google_translate

group: !include groups.yaml
automation: !include automations.yaml
script: !include scripts.yaml
scene: !include scenes.yaml
s3:
  aws_access_key_id: AKI*********NPQ
  aws_secret_access_key: UnO*********pXjo0Es
  region_name: us-east-1

What error message? Given the line (99), presumably "S3 client instance not found", which indicates your config has an error, possibly in region_name.

Hi @robmarkcole - so I sort of figured it out. It has something to do with the latest release and the config_flow. As mentioned, I was manually putting it in configuration.yaml, not realizing that does not work for an integration that has been migrated to the config_flow. With the latest release it installs through HACS fine, but when adding the integration under Configuration, putting in the Region, Key, and Secret did nothing: I hit submit and the fields just cleared with no indication.

So I removed the HACS version, downloaded S3 0.4, and manually copied those files into custom_components. Then I added the integration under Configuration and, tada, it worked: after I entered the Region/Key/Secret it said successful and I now see the entries in the .storage config files.

Plus it works in my automation.

I do want to thank you for this integration. It solved the final piece of my White Whale effort. Specifically, using a combination of your integration, plus Unifi Protect and Alexa Media Player, when someone rings my G4 doorbell it takes a snapshot, sends it to S3, then plays a message on all of the Echo Shows in the house and finally sets the background image of the Echo Shows to the snapshot from the camera.
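
For anyone curious, that automation looks roughly like this (entity names are placeholders, the Alexa Media Player notify service name depends on your install, and the Echo Show background step is omitted):

  - alias: "Doorbell snapshot to S3 and announce"
    trigger:
      - platform: state
        entity_id: binary_sensor.g4_doorbell_ring    # placeholder Unifi Protect doorbell sensor
        to: "on"
    action:
      # Grab a snapshot from the doorbell camera
      - service: camera.snapshot
        data:
          entity_id: camera.g4_doorbell              # placeholder camera entity
          filename: "/config/www/snapshot/doorbell.jpg"
      # Push the snapshot to S3
      - service: s3.put
        data:
          bucket: "my-doorbell-bucket"               # placeholder bucket name
          key: "snapshot/doorbell.jpg"
          file_path: "/config/www/snapshot/doorbell.jpg"
          storage_class: "STANDARD_IA"
      # Announce on the Echo Shows via Alexa Media Player
      - service: notify.alexa_media
        data:
          message: "Someone is at the front door."
          target:
            - media_player.kitchen_echo_show         # placeholder Echo Show entities
            - media_player.living_room_echo_show
          data:
            type: announce
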


@scott7685 I would be very interested to see a write-up on https://www.hackster.io/home-assistant