SOLVED (Workaround): NFS - manual mount of NFS folder failed

Hi,

Installation type: HA Supervised (Debian over Proxmox)

When a motion detector is activated, an automation takes a snapshot of the camera and saves the photo to the /config/media/camera folder.
=> This is working

Now I want to mount a NAS shared folder so that the photos are uploaded directly to the NAS.

So I ran the mount command directly from the Home Assistant container console in Portainer, targeting the folder where the photos are stored:

mount -t nfs 192.168.1.xx:/volume1/homes /config/media/camera
=> MOUNT ERROR: Connection refused

What I have done:

  • Searched the forum: no solution found
  • Looked at syslog: nothing
  • Traced the traffic between HA and the NAS with tcpdump: no packets (which seems normal for a refused connection; see the connectivity checks sketched below)
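For example, two quick checks one could run from a shell, assuming nc (with -z support) and showmount (part of nfs-common) are available, and using the placeholder IP from this thread:

nc -zv 192.168.1.xx 2049
showmount -e 192.168.1.xx

If the port probe fails or no exports are listed, the problem is on the NAS side (NFS service disabled, or the export not allowing this client's IP) rather than in Home Assistant.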

Any advice on how to debug this problem?
Thank you

Aside from the connection issue, an NFS export mounted inside a container only lasts until the next container recreation (e.g. an update).

I have never done it myself, but you would need to do the NFS mounting outside the container and pass it in as a volume.

See https://docs.docker.com/storage/volumes/#create-a-service-which-creates-an-nfs-volume
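Roughly, that approach could look like this (an untested sketch: the IP and export path are the placeholders from this thread, and nas_camera is a hypothetical volume name):

docker volume create --driver local --opt type=nfs --opt o=addr=192.168.1.xx,rw --opt device=:/volume1/homes nas_camera

The container would then be recreated with -v nas_camera:/config/media/camera, and Docker re-establishes the NFS mount whenever the container is recreated.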

Thanks @m0wlhed for the proposal.

For the moment I do not feel I have the knowledge to set up that configuration, so I will use the method I have seen in the forum:

=> nfs-common has been installed in the Debian OS, which fixed the connection issue = OK
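For reference, the install itself is a one-liner on Debian (run as root or via sudo):

apt update && apt install -y nfs-common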

1) Manual mount command from the Home Assistant container in the Portainer console

mount -t nfs 192.168.1.xx:/volume1/homes/xxxxx/ /config/media/xxxx/
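To confirm the mount took effect, standard commands work from the same console, e.g.:

mount | grep /config/media
df -h /config/media/xxxx/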

2) Test with Developer Tools / Services

service: camera.snapshot
entity: Test camera

service data:

entity_id: camera.xxxx_ovprofile00
filename: /config/media/camera/test.jpg

=> The test.jpg camera snapshot is written to the proper NAS folder (volume1/homes/xxxx) = OK

3) Test with an automation (camera snapshot sent to the NAS when motion is detected)

Automation code

- id: '1599228545663'
  alias: Motion  FGMS-01
  description: Actions suite motion Fibaro FGMS-01
  trigger:
  - device_id: c52b50d6959a439a8ce79b8cae0db055
    domain: binary_sensor
    entity_id: binary_sensor.fgms001_zw5_motion_sensor_home_security_motion_detected
    platform: device
    type: motion
  condition:
  - condition: or
    conditions:
    - condition: state
      entity_id: alarm_control_panel.home_alarm
      state: armed_away
    - condition: state
      entity_id: alarm_control_panel.home_alarm
      state: armed_night
  action:
  - data:
      message: AUTOMATION! Motion sensor FGMS-01 triggered in Away / Night mode
    service: notify.pushbullet
  - service: camera.snapshot
    entity_id: camera.dcs_942l_ovprofile00
    data:
      filename: /config/media/camera/D942L_{{ now().strftime("%Y%m%d-%H%M%S") }}.jpg
  mode: single

=> There is a camera snapshot (jpg file) in the NAS shared folder = OK, but it does not appear in the HA local folder = NOK

4) Automatic remount of the NFS folder

Create an automation that runs each time Docker or HA restarts, so the mount is restored:

# Custom shell command (in configuration.yaml)
shell_command:
  map_nas_camera_folder: mount -t nfs 192.168.1.xxx:/volume1/homes/xxxxx/ /config/media/xxxxx

- id: '1611660762952'
  alias: NAS folder mounting
  description: ''
  trigger:
  - platform: homeassistant
    event: start
  condition: []
  action:
  - service: shell_command.map_nas_camera_folder
    data: {}
  mode: single
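A possible refinement, assuming the mountpoint utility is available inside the container (an assumption, not verified here): guard the command so that an HA restart without a container recreation does not stack a second mount on top of the first:

shell_command:
  map_nas_camera_folder: mountpoint -q /config/media/xxxxx || mount -t nfs 192.168.1.xxx:/volume1/homes/xxxxx/ /config/media/xxxxx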

5) Remaining issues

a) I can view the camera snapshot through the NAS web interface in the remote shared folder (volume1/homes/xxxx), but not in the HA local folder /config/media/xxxx as I expected.

Is this normal behavior?

I noticed that accessing the local folder over SSH works (the jpg files are there), but from the web browser the folder appears empty.

b) I agree that this NFS setup is only a workaround.

EDIT: After a new test on Core 2021.7.3, this remaining issue is CLOSED (I do not know how!); the camera snapshot can now be viewed with the Media Browser in the Config/Media/camera directory.

Thanks for your work on this. I am considering the same approach; has this been stable for you?

Yes, it has been stable.
