Which ports are required for Synology Integration?

Hi, I was running a Synology NAS and Home Assistant successfully integrated, but have now moved the NAS to one VLAN and Home Assistant to another, which has broken the integration. Please can I ask which ports need to be opened/allowed between the VLANs to support the Synology integration - both for the NAS stats and for Surveillance Station? Thanks!

I use some Synology DSM API calls on top of the integration to get more detailed information from the NAS. All these calls to DSM work on port 80, so I'd try that first. Specific applications' APIs might use different ports, depending on your configuration; in my case, for example, the Download Station API uses port 8000.
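
For what it's worth, Synology publishes a DSM Web API with a discovery endpoint (`SYNO.API.Info` via `query.cgi`); a minimal sketch of building such a request URL, with the host and default port 80 as assumptions for your setup:

```python
from urllib.parse import urlencode

def dsm_query_url(host, port=80, api="SYNO.API.Info", version=1, method="query"):
    """Build a DSM Web API discovery URL (no request is made here)."""
    params = urlencode({"api": api, "version": version, "method": method})
    return f"http://{host}:{port}/webapi/query.cgi?{params}"

# hypothetical NAS address:
print(dsm_query_url("192.168.52.21"))
```

Fetching that URL (e.g. with `requests`) should return JSON describing the available APIs and their paths, which is how you can discover the per-application endpoints and ports.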

I just tested it; it seems to be port 5001 if you tick SSL, and 5000 if you don't (the default ports).

Out of interest @mirekmal, what additional information are you getting from the NAS?

I would like the status of encrypted shares (e.g. unlocked vs. locked). I'm currently writing a C# Windows app which will mount and dismount them automatically via an SSH command, and also report the status via MQTT (C# because I want to be prompted, when I log in to my Windows PC, whether I want to mount the encrypted drives). It would be much neater to get it from the API if that's possible.

Currently I have a mixture of sources and here is my Synology NAS card. Quite a lot of info :slight_smile: BTW for Volume 2 you can see the same info as for Volume 1, but I kept it rolled up, as it contains data for 10 drives and I couldn't fit such a large screenshot on one screen.

An interesting one is the chart with details of storage consumption. For it I use a Python script running once per hour on the NAS itself, producing a JSON file that is then scraped directly into HA sensors.


Nice, looks good!

Do you mind sharing your Python script and your Home Assistant JSON scraping config?

Sure, np!

So first the Python script. It can be located in any shared folder on the Synology NAS:

# importing modules
import os
import shutil

# note the format of the path used to store the outcome of this script;
# in my case I use the same folder (/web) that I use for some web services,
# so it is accessible from the network via http(s).
# start by opening the file and writing the JSON header
with open('/volume1/web/Storage_temp.json', 'w') as s:
    s.write('{')
    s.write('\n')
    s.write('  "share_count":13,')
    s.write('\n')
    s.write('  "data":[')
    s.write('\n')

# this is the actual code to calculate the size of the folder defined in the Folderpath variable.
# for each folder you want to include in the statistics this section needs to be repeated,
# with only Folderpath updated, as the result is immediately written to the temporary file
    size = 0
    Folderpath = '/volume2/ActiveBackupforBusiness'
    for path, dirs, files in os.walk(Folderpath):
        for f in files:
            fp = os.path.join(path, f)
            size += os.stat(fp).st_size
    # convert bytes to GiB after the walk, so an empty folder yields 0
    # rather than reusing a stale value from the previous folder
    res = round(size / (1024 * 1024 * 1024), 1)
    #print("Folder size: " + str(res))
    s.write('    {"share_name":"Active Backup", "share_size":' + str(res) + '},')
    s.write('\n')

# for reference, here is the second folder
    size = 0
    Folderpath = '/volume2/Videos'
    for path, dirs, files in os.walk(Folderpath):
        for f in files:
            fp = os.path.join(path, f)
            size += os.stat(fp).st_size
    # convert bytes to GiB after the walk
    res = round(size / (1024 * 1024 * 1024), 1)
    #print("Folder size: " + str(res))
    s.write('    {"share_name":"Videos", "share_size":' + str(res) + '},')
    s.write('\n')
# and I'm skipping the remaining folders, as the code is exactly the same.

. . .

# finalizing JSON output
    s.write('  ]')
    s.write('\n')
    s.write('}')

    s.close()

# copy the temporary file to the final one.
# this step is added because during the script run the output file is not available
# to the REST sensors in HA, causing some errors; copying the ready-to-use temp file
# to the final one avoids the risk of this happening
    src_path = r'/volume1/web/Storage_temp.json'
    dst_path = r'/volume1/web/Storage.json'
    shutil.copy2(src_path, dst_path)

This script is executed once per hour using Task Scheduler in DSM:

[screenshots of the Task Scheduler task settings]

And finally you need to define your REST sensors to read the folder sizes from the JSON file:

sensor:
  - platform: template
    sensors:
# this sensor calculates free space by deducting the sizes of all individual shares
# from the total volume size (entered manually in the template)
      share_free_space:
        value_template: >-
          {{ ((20.9 * 1024) - (states('sensor.share_size_0') | float + states('sensor.share_size_1') | float + states('sensor.share_size_2') | float + states('sensor.share_size_3') | float + states('sensor.share_size_4') | float + states('sensor.share_size_5') | float + states('sensor.share_size_6') | float + states('sensor.share_size_7') | float + states('sensor.share_size_8') | float + states('sensor.share_size_9') | float + states('sensor.share_size_10') | float + states('sensor.share_size_11') | float + states('sensor.share_size_12') | float)) | round(1) }}
        unit_of_measurement: 'GB'
        friendly_name: 'Free Space' 

# and here goes a pair of sensors, one giving the name of the share and the second giving its size
# note data[x], which needs to be adjusted to the share's position in the JSON file
# resource: http://192.168.52.21/Storage.json refers to the IP of your NAS and the location of the JSON file
# there is no need to add /web to the address, as this is the root for web services on the Synology web server.
  - platform: rest
    resource: http://192.168.52.21/Storage.json
    name: share_name_0
    value_template: '{{ value_json.data[0].share_name }}'
  - platform: rest
    resource: http://192.168.52.21/Storage.json
    name: share_size_0
    value_template: '{{ value_json.data[0].share_size }}'
    unit_of_measurement: 'GB'
. . . 

Of course you need to define as many sensor pairs as there are folders you are checking.
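
The free-space template above is plain arithmetic: the manually entered volume total (in TB, converted to GB) minus the sum of the share sizes. The same calculation in Python terms (the 20.9 TB total is the poster's own volume size, used here only as an example):

```python
def free_space_gb(total_tb, share_sizes_gb):
    """Total volume size (TB, converted to GB) minus the sum of per-share sizes (GB)."""
    return round(total_tb * 1024 - sum(share_sizes_gb), 1)

# e.g. free_space_gb(20.9, [1000.0, 500.5]) -> 19901.1
```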


@mirekmal, legend, thanks for the share! Just confirming: you don't need user/pass as it's in the web folder, correct? I'll have to allow port 80 through the NAS firewall; it's currently blocked (everything is locked down).

I can get the status of the mount from the terminal using the command below; I just need the Python script to run a shell command and capture the output.

e.g.
mount | awk '{if ($3 == "/mnt/backup") { exit 0}} ENDFILE{exit -1}'
The above line will exit with 0 (success) if /mnt/backup is mounted; otherwise it exits with a non-zero status (note that ENDFILE is a gawk extension).
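
If it helps, the same check can be done directly from Python with `subprocess`, mirroring the awk field logic (`$3` is the mount point); the mount point below is illustrative:

```python
import subprocess

def is_mounted(mount_point):
    """Return True if mount_point appears as a mount target in `mount` output."""
    out = subprocess.run(["mount"], capture_output=True, text=True).stdout
    # `mount` prints lines like "/dev/sda1 on /mnt/backup type ext4 (...)";
    # the third field is the mount point, same as awk's $3
    return any(line.split()[2:3] == [mount_point] for line in out.splitlines())
```

`os.path.ismount()` from the standard library is an alternative if you only care about the local filesystem view.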

(other methods here from this site)

I should be able to use that to generate some additional parameters in your JSON file, and report the current status of my encrypted shares to Home Assistant.

Interestingly, hypothetically, you could get all the Synology DSM data via this method without requiring an administrator login for the integration (excluding commands, etc.).

Anyway, thanks again…

Yes, port 80 needs to be opened, but in my case I did nothing special; I think just installing any package via Web Station automatically opens it.
And yes, this method can be used to get virtually any information from DSM, especially since Synology has made a well-documented API public, so it can be used from Python to write the required information straight to JSON and parse it from there using REST. Some time ago I created a set of REST sensors using the Synology Download Station API, but due to some race conditions it was not working well. I was thinking about applying this solution to rewrite it, but as usual time, or rather the lack of it, is the blocking factor :slight_smile:


.py-to-JSON snippet for retrieving the encrypted status of Synology NAS shared folders…

# encrypted state
# (this snippet slots into the script above, inside the same `with open(...) as s:`
# block, and additionally needs `import subprocess` at the top of the script)
    p = subprocess.Popen(["mount"], stdout=subprocess.PIPE)
    out, err = p.communicate()

    if out.find(b"@EncryptedData@") != -1:
        s.write('    {"share_name":"Encrypted_Data", "Encryption_State": "Decrypted"' + '},')
        s.write('\n')
    else:
        s.write('    {"share_name":"Encrypted_Data", "Encryption_State": "Encrypted"' + '},')
        s.write('\n')

# encrypted state, second share
    p = subprocess.Popen(["mount"], stdout=subprocess.PIPE)
    out, err = p.communicate()

    # no trailing comma on either branch, as this is the last entry in the JSON array
    if out.find(b"@ExternalBackup@") != -1:
        s.write('    {"share_name":"External_Backup", "Encryption_State": "Decrypted"' + '}')
        s.write('\n')
    else:
        s.write('    {"share_name":"External_Backup", "Encryption_State": "Encrypted"' + '}')
        s.write('\n')
        s.write('\n')
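
As an aside, the two near-identical blocks above could be driven by one loop over share markers; a sketch under the assumption (taken from the snippet) that `mount` output contains the `@...@` marker only while the encrypted share is mounted, with the JSON trailing-comma handling left out:

```python
import subprocess

def encryption_states(share_markers):
    """Map each marker (e.g. b"@EncryptedData@") to "Decrypted" if it appears
    in `mount` output (i.e. the encrypted share is mounted), else "Encrypted"."""
    out = subprocess.run(["mount"], capture_output=True).stdout
    return {m: ("Decrypted" if m in out else "Encrypted") for m in share_markers}

# hypothetical usage:
# states = encryption_states([b"@EncryptedData@", b"@ExternalBackup@"])
```

Running `mount` once and reusing its output also avoids spawning the subprocess per share.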