Upload Images & Videos to S3

I’m struggling to figure out something which I thought would be really simple. I’m trying to upload files local to Home Assistant to an S3 bucket.

I thought this would be what I needed: GitHub - robmarkcole/HASS-S3: Home Assistant integration for S3, but it doesn’t appear to expose any service or action I can call from an automation to upload a file to S3. Nor does the official S3 integration.

Am I missing something very obvious? Surely someone else must have figured this out already? Any help would be greatly appreciated. Here is my automation for context:

alias: Upload Footage
description: ""
triggers:
  - trigger: state
    entity_id:
      - binary_sensor.tapo_indoor_cam_1_cell_motion_detection
conditions: []
actions:
  - variables:
      timestamp: "{{ now().strftime('%Y-%m-%d %H:%M:%S') }}"
  - action: camera.record
    metadata: {}
    data:
      lookback: 0
      filename: /config/www/wildlife/{{ timestamp }}.mp4
      duration: 5
    target:
      device_id: 1173a6736dd2f46a3b00633e5a766eca
  - action: UPLOAD TO S3!
mode: single

Does filename: '/local/wildlife/{{ timestamp }}.mp4' work?

Thanks for the quick response! Yep, that part all works; I can view the video or images via the browser with no issue. I just can’t work out the best way to upload to S3!

Have you explored Folder Watcher? Forgive me if I’m playing catch-up…

No problem, I appreciate your time. That just allows me to watch for new/changed files, right? I’m not having any issues with the local files; I just can’t figure out how to upload them to an Amazon S3 bucket. There don’t appear to be any examples of someone else doing this online, that I can find anyway.

Not exactly

Example automation

The following automation uses the folder_watcher to automatically upload files created in the local filesystem to S3:

- id: '1587784389530'
  alias: upload-file-to-S3
  description: 'When a new file is created, upload to S3'
  trigger:
    event_type: folder_watcher
    platform: event
    event_data:
      event_type: created
  action:
    service: s3.put
    data_template:
      bucket: "my_bucket"
      key: "input/{{ now().year }}/{{ (now().month | string).zfill(2) }}/{{ (now().day | string).zfill(2) }}/{{ trigger.event.data.file }}"
      file_path: "{{ trigger.event.data.path }}"
      storage_class: "STANDARD_IA"
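
For the event to fire, folder_watcher itself (and the S3 integration’s credentials) must be set up in configuration.yaml. A minimal sketch; the watched path and region are assumptions, and the exact credential keys should be checked against the integration’s README:

homeassistant:
  allowlist_external_dirs:
    - /config/www/wildlife  # folder_watcher requires the watched folder to be allowlisted

folder_watcher:
  - folder: /config/www/wildlife

s3:
  aws_access_key_id: !secret aws_access_key_id
  aws_secret_access_key: !secret aws_secret_access_key
  region_name: us-east-1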

I don’t know why I need the folder watcher, shouldn’t I just be able to do this?

alias: Upload Footage
description: ""
triggers:
  - trigger: state
    entity_id:
      - binary_sensor.tapo_indoor_cam_1_cell_motion_detection
conditions: []
actions:
  - variables:
      timestamp: "{{ now().strftime('%Y-%m-%d %H:%M:%S') }}"
  - action: camera.record
    metadata: {}
    data:
      lookback: 0
      filename: /config/www/wildlife/{{ timestamp }}.mp4
      duration: 5
    target:
      device_id: 1173a6736dd2f46a3b00633e5a766eca
  - action: 
      service: s3.put
      data_template:
        bucket: "my_bucket"
        key: "input/{{ now().year }}/{{ (now().month | string).zfill(2) }}/{{ (now().day | string).zfill(2) }}/{{ timestamp }}.mp4"
        file_path: "/config/www/wildlife/{{ timestamp }}.mp4"
        storage_class: "STANDARD_IA"
mode: single

But when I do I can’t save the automation:

Message malformed: value should be a string for dictionary value @ data['actions'][6]['action']

You don’t, but you do need to use the services provided by the S3 integration:

action:
  service: s3.put
  data_template:
     bucket: my_bucket
     key: my_key/xxxx.mp4
     file_path: /local/wildlife/xxxx.mp4

Thanks again. That looks to be exactly what I’ve done in my last post that won’t let me save; I must be missing something obvious?

Why is this invalid:

alias: Upload Footage
description: ""
triggers:
  - trigger: state
    entity_id:
      - binary_sensor.tapo_indoor_cam_1_cell_motion_detection
conditions: []
actions:
  - variables:
      timestamp: "{{ now().strftime('%Y-%m-%d %H:%M:%S') }}"
  - action: camera.record
    metadata: {}
    data:
      lookback: 0
      filename: /config/www/wildlife/{{ timestamp }}.mp4
      duration: 5
    target:
      device_id: 1173a6736dd2f46a3b00633e5a766eca
  - action: 
      service: s3.put
      data_template:
        bucket: "my_bucket"
        key: "input/{{ now().year }}/{{ (now().month | string).zfill(2) }}/{{ (now().day | string).zfill(2) }}/{{ timestamp }}.mp4"
        file_path: "/config/www/wildlife/{{ timestamp }}.mp4"
        storage_class: "STANDARD_IA"
mode: single

Thanks again!

That is not even close to the code posted in your OP. Again, to be clear: you aren’t using the suggested format for your file path.

When I use your example code I get a similar error:

Am I just not understanding how to call a service from an automation?

What happens if you change data_template to just data?

action:
  service: s3.put
  data:
     bucket: my_bucket
     key: my_key/xxxx.mp4
     file_path: /local/wildlife/xxxx.mp4

The data_template option in Home Assistant was deprecated many versions ago and (I’m not sure) might be flagged as an error nowadays.


Good point. After looking at the source code, it does reference data:

Same error, unfortunately. Also, if I try to find this s3.put service through Settings, there is no reference to it. If I look at the config for this integration, this is all that is listed:

Which is why I opened with a general question about how best to do this; I’ve assumed this old integration just doesn’t work anymore. I’ll probably have to write a custom server to do the upload for me and use a webhook to send the image data instead. I was hoping someone else might have already solved this problem 🙂 thanks again for your time.

I’ll give it more time later today. It should still work…

I just tried the code in your OP and I’m able to save it just fine. I’m not able to test it, though, as I don’t have the S3 integration installed.

Anyway, I suggest using data instead of data_template and entity_id instead of device_id (Why and how to avoid device_ids). Also note what the “Message malformed” error is telling you: in the current syntax, action: takes the action name itself as a string (action: s3.put), not a nested mapping with service: under it.

Something like the code below (change the entity_id to that of your camera).

alias: Upload Footage
description: ""
triggers:
  - trigger: state
    entity_id:
      - binary_sensor.tapo_indoor_cam_1_cell_motion_detection
    to: "on"
conditions: []
actions:
  - variables:
      timestamp: "{{ now().strftime('%Y-%m-%d %H:%M:%S') }}"
  - action: camera.record
    metadata: {}
    data:
      duration: 5
      lookback: 0
      filename: /local/wildlife/{{ timestamp }}.mp4
    target:
      entity_id: camera.wildlife_cam
  - action: s3.put
    data:
      bucket: my_bucket
      key: >-
        input/{{ now().year }}/{{ (now().month | string).zfill(2) }}/{{
        (now().day | string).zfill(2) }}/{{ timestamp }}.mp4
      file_path: /local/wildlife/{{ timestamp }}.mp4
      storage_class: STANDARD_IA
mode: single

Btw.

/local is mapped to /config/www/, so /local/wildlife and /config/www/wildlife are the same location.
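
One aside on that example: camera.record writes through the filesystem, so if /local/wildlife/… produces a write error, use the real path /config/www/wildlife/… for filename (the /local URL is only for viewing in the browser). A timestamp without spaces and colons also keeps the filename and S3 key tidy — for instance:

  - variables:
      timestamp: "{{ now().strftime('%Y-%m-%d_%H-%M-%S') }}"
  - action: camera.record
    data:
      duration: 5
      lookback: 0
      filename: /config/www/wildlife/{{ timestamp }}.mp4
    target:
      entity_id: camera.wildlife_cam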

Thank you for trying, that lets me save it, but then I get this error:

The integration does show up as a backup location so it’s installed correctly:

I’m not so sure it’s installed correctly. Home Assistant is complaining about an unknown action, so maybe something is going wrong when the custom component registers the service. Can you find it under Developer Tools > Actions?

I run Home Assistant in Docker and use a simple Python script to watch a folder for new photos, resize them, and save them in HA’s local folder (for use as the last snapshot in still cameras). There are probably also Docker images that do something similar — watching a folder and automatically uploading to an S3 bucket using FTP.
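
Another option, if the integration route keeps failing, is a shell_command wrapping the AWS CLI — a rough sketch, assuming the aws binary is available inside the HA container and credentials are already configured (bucket name and paths are placeholders):

shell_command:
  # templated; 'filename' is passed in from the automation's action data
  s3_upload: >-
    aws s3 cp '/config/www/wildlife/{{ filename }}' 's3://my_bucket/input/{{ filename }}'

Then, in the automation:

  - action: shell_command.s3_upload
    data:
      filename: "{{ timestamp }}.mp4"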

I think I know what I’ve done here… I’ve confused GitHub - PhantomPhoton/S3-Compatible: A custom component to enable using S3 compatible endpoints for home assistant backups with GitHub - robmarkcole/HASS-S3: Home Assistant integration for S3 and installed the wrong thing 🤦

Let me try manually installing the correct thing.