Local realtime person detection for RTSP cameras

Thanks for your quick feedback!

I copy/pasted this config into the Supervisor addon and it seems to stay running, in that the addon page keeps the “stop” and “restart” buttons active rather than changing back to showing the “start” option.

However, I don’t think it’s actually working. Here is what the log shows:

Current thread 0x00007feae939c700 (most recent call first):
File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 117 in __init__
File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 161 in load_delegate
File "/opt/frigate/frigate/edgetpu.py", line 55 in __init__
File "/opt/frigate/frigate/edgetpu.py", line 106 in run_detector
File "/usr/lib/python3.8/multiprocessing/process.py", line 108 in run
File "/usr/lib/python3.8/multiprocessing/process.py", line 315 in _bootstrap
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 75 in _launch
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19 in __init__
File "/usr/lib/python3.8/multiprocessing/context.py", line 277 in _Popen
File "/usr/lib/python3.8/multiprocessing/context.py", line 224 in _Popen
File "/usr/lib/python3.8/multiprocessing/process.py", line 121 in start
File "/opt/frigate/frigate/edgetpu.py", line 159 in start_or_restart
File "detect_objects.py", line 113 in run
File "/usr/lib/python3.8/threading.py", line 932 in _bootstrap_inner
File "/usr/lib/python3.8/threading.py", line 890 in _bootstrap
Detection appears to have stopped. Restarting detection process
Starting detection process: 188
Attempting to load TPU as usb:0
Fatal Python error: Illegal instruction
Current thread 0x00007feae939c700 (most recent call first):
File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 117 in __init__
File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 161 in load_delegate
File "/opt/frigate/frigate/edgetpu.py", line 55 in __init__
File "/opt/frigate/frigate/edgetpu.py", line 106 in run_detector
File "/usr/lib/python3.8/multiprocessing/process.py", line 108 in run
File "/usr/lib/python3.8/multiprocessing/process.py", line 315 in _bootstrap
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 75 in _launch
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19 in __init__
File "/usr/lib/python3.8/multiprocessing/context.py", line 277 in _Popen
File "/usr/lib/python3.8/multiprocessing/context.py", line 224 in _Popen
File "/usr/lib/python3.8/multiprocessing/process.py", line 121 in start
File "/opt/frigate/frigate/edgetpu.py", line 159 in start_or_restart
File "detect_objects.py", line 113 in run
File "/usr/lib/python3.8/threading.py", line 932 in _bootstrap_inner
File "/usr/lib/python3.8/threading.py", line 890 in _bootstrap
Detection appears to have stopped. Restarting detection process
Starting detection process: 196
Attempting to load TPU as usb:0
Fatal Python error: Illegal instruction

and the Ingress page shows:

{"detection_fps":0.0,"detectors":{"coral":{"detection_start":0.0,"inference_speed":10.0,"pid":278}},"porch":{"camera_fps":5.1,"capture_pid":20,"detection_fps":0.0,"frame_info":{"detect":1604549649.859921,"process":0.0},"pid":21,"process_fps":5.1,"skipped_fps":0.0}}

Not really sure what this all means. Thanks again for your quick reply; any additional insight would be appreciated!

I have never seen that Illegal instruction error. What hardware are you running this on?

I’m running on an old desktop: a Xeon 3230 with 8 GB RAM, headless with no GPU, running Ubuntu Server 20.04.1, with the Coral USB stick plugged into a USB 3 card. The Coral stick shows up when I run “lsusb”.
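As a quick sanity check on the stick itself, something like the following should list it (before the Edge TPU runtime has initialized the device it typically enumerates as “Global Unichip Corp.”, and after first use it re-enumerates as “Google Inc.”):

lsusb | grep -iE 'google|global unichip'   # Coral USB Accelerator, before/after runtime init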

I restarted the addon and caught the beginning of the log, and it’s now showing:

Fontconfig error: Cannot load default config file
On connect called
ffprobe -v panic -show_error -show_streams -of json "rtsp://user:password@CAMERA_URL:554/cam/realmonitor?channel=1&subtype=1"
Starting detection process: 19
Attempting to load TPU as usb:0
Fatal Python error: Illegal instruction
Current thread 0x00007f900bdf9740 (most recent call first):
File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 117 in __init__
File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 161 in load_delegate
File "/opt/frigate/frigate/edgetpu.py", line 55 in __init__
File "/opt/frigate/frigate/edgetpu.py", line 106 in run_detector
File "/usr/lib/python3.8/multiprocessing/process.py", line 108 in run
File "/usr/lib/python3.8/multiprocessing/process.py", line 315 in _bootstrap
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 75 in _launch
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19 in __init__
File "/usr/lib/python3.8/multiprocessing/context.py", line 277 in _Popen
File "/usr/lib/python3.8/multiprocessing/context.py", line 224 in _Popen
File "/usr/lib/python3.8/multiprocessing/process.py", line 121 in start
File "/opt/frigate/frigate/edgetpu.py", line 159 in start_or_restart
File "/opt/frigate/frigate/edgetpu.py", line 142 in __init__
File "detect_objects.py", line 189 in main
File "detect_objects.py", line 441 in <module>
{'streams': [{'index': 0, 'codec_name': 'h264', 'codec_long_name': 'unknown', 'profile': '100', 'codec_type': 'video', 'codec_time_base': '1/10', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'width': 640, 'height': 480, 'coded_width': 640, 'coded_height': 480, 'closed_captions': 0, 'has_b_frames': 0, 'sample_aspect_ratio': '126:95', 'display_aspect_ratio': '168:95', 'pix_fmt': 'yuvj420p', 'level': 22, 'color_range': 'pc', 'color_space': 'bt709', 'color_transfer': 'bt709', 'color_primaries': 'bt709', 'chroma_location': 'left', 'field_order': 'progressive', 'refs': 1, 'is_avc': 'false', 'nal_length_size': '0', 'r_frame_rate': '5/1', 'avg_frame_rate': '5/1', 'time_base': '1/90000', 'start_pts': 108000, 'start_time': '1.200000', 'bits_per_raw_sample': '8', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}]}
Camera capture process started for porch: 27
Creating ffmpeg process…
ffmpeg -hide_banner -loglevel panic -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://user:password@CAMERA_URL:554/cam/realmonitor?channel=1&subtype=1 -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -an -map 0 /cache/porch-%Y%m%d%H%M%S.mp4 -f rawvideo -pix_fmt yuv420p pipe:
Camera process started for porch: 29

* Serving Flask app "detect_objects" (lazy loading)
* Environment: development
* Debug mode: off
Detection appears to have stopped. Restarting detection process
Starting detection process: 46
Attempting to load TPU as usb:0
Fatal Python error: Illegal instruction
Current thread 0x00007f9001be4700 (most recent call first):
File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 117 in __init__
File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 161 in load_delegate
File "/opt/frigate/frigate/edgetpu.py", line 55 in __init__
File "/opt/frigate/frigate/edgetpu.py", line 106 in run_detector
File "/usr/lib/python3.8/multiprocessing/process.py", line 108 in run
File "/usr/lib/python3.8/multiprocessing/process.py", line 315 in _bootstrap
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 75 in _launch
File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19 in __init__
File "/usr/lib/python3.8/multiprocessing/context.py", line 277 in _Popen
File "/usr/lib/python3.8/multiprocessing/context.py", line 224 in _Popen
File "/usr/lib/python3.8/multiprocessing/process.py", line 121 in start
File "/opt/frigate/frigate/edgetpu.py", line 159 in start_or_restart
File "detect_objects.py", line 113 in run
File "/usr/lib/python3.8/threading.py", line 932 in _bootstrap_inner
File "/usr/lib/python3.8/threading.py", line 890 in _bootstrap

Hi, I’m trying to use your addon. I have Home Assistant Supervised installed on Docker, and I’m getting this error:

Fontconfig error: Cannot load default config file
Starting detection process: 20
Starting detection process: 21
Traceback (most recent call last):
  File "detect_objects.py", line 441, in <module>
    main()
  File "detect_objects.py", line 196, in main
    ffmpeg_input = get_ffmpeg_input(ffmpeg['input'])
KeyError: 'input'

The addon then stops immediately.

Your config has a formatting error.
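The KeyError: 'input' suggests the camera’s ffmpeg section is missing its input key. A minimal sketch of that part of the config (camera name and URL are placeholders), matching the structure shown in the example config further down this thread:

cameras:
  back:
    ffmpeg:
      # Required: the RTSP source passed to ffmpeg after -i
      input: rtsp://user:password@camera-ip:554/stream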

I installed frigate on another computer with a slightly newer CPU that does support AVX. I’ve seen a few posts hinting that AVX support is needed for TensorFlow 1.6+. @blakeblackshear, do you think this could be causing my issue on my original computer (which does not have an AVX-capable CPU)? I do have a Coral USB stick plugged into the system, but I don’t fully understand what part the CPU plays in frigate and whether the lack of AVX could be causing the issue for me.
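For what it’s worth, a quick way to check whether a CPU advertises AVX on Linux is something like:

grep -o 'avx[^ ]*' /proc/cpuinfo | sort -u   # prints nothing if the CPU has no AVX support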

That would make sense, but I don’t know for sure.

Hi @blakeblackshear
Thank you for this project.
Is there any way to get the number of persons detected from a single camera?

Hi,
I searched for this function too, and when I couldn’t find it I added a few lines to object_processing.py to publish the count of identical objects for each object type over MQTT (with the topic <camera_name>/person/count for counting people). If you want, I can share it with you.
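For anyone wanting to experiment with something similar, here is a standalone sketch of the idea in Python (this is not the actual object_processing.py change; the broker address, topic layout, and tracked-object structure are assumptions):

from collections import Counter
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("10.0.0.2", 1883)  # placeholder broker address

def publish_object_counts(camera_name, tracked_objects, topic_prefix="frigate"):
    # tracked_objects is assumed to be an iterable of dicts with a 'label' key
    counts = Counter(obj["label"] for obj in tracked_objects)
    for label, count in counts.items():
        # e.g. frigate/porch/person/count -> 2
        client.publish(f"{topic_prefix}/{camera_name}/{label}/count", count)

# Example: two people and a car currently tracked on the porch camera
publish_object_counts("porch", [{"label": "person"}, {"label": "person"}, {"label": "car"}])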

Not at the moment, but it would be simple enough to add in.

Hello,

I must have missed a step, because after my docker run command nothing seems to start. I’m using an LXC container in Proxmox, and the container can see the Coral USB stick.

docker run --name frigate --privileged --shm-size=1g -v /opt/frigate/config:/config:ro -v /opt/frigate/clips:/clips:rw -v /etc/localtime:/etc/localtime:ro -v /dev/bus/usb:/dev/bus/usb --device=/dev/dri/renderD128 -d -p 5000:5000 -e FRIGATE_RTSP_PASSWORD='password' blakeblackshear/frigate:stable-amd64
2db5c892caa7: Pull complete
cc3520c678aa: Pull complete
ba226f4bdcfe: Pull complete
Digest: sha256:4d7c80e2ffc3cd68b7aa3a8d0e1a5e849091893840fa9ea822ab8c8200340f1f
Status: Downloaded newer image for blakeblackshear/frigate:stable-amd64
f66cfceda08f75493160e8e277d09e0a3419e0cbce8f765fa39864420d1de2e0
root@FrigateLXC:/opt/frigate/config# docker ps
CONTAINER ID        IMAGE               COMMAND             CREATED             STATUS              PORTS               NAMES

So the container does not seem to have started.

I have created a config.yml and placed it in /opt/frigate/config.

Where did I go wrong?

The container must have exited. Try docker ps -a and docker logs frigate.
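For example:

docker ps -a          # list all containers, including ones that have exited
docker logs frigate   # show the container output to see why it exited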


Thank you! Now I can see that error message. What do you think is wrong?

root@FrigateLXC:/opt/frigate/config# docker logs frigate
Fontconfig error: Cannot load default config file
Traceback (most recent call last):
  File "detect_objects.py", line 33, in <module>
    CONFIG = yaml.safe_load(f)
  File "/usr/local/lib/python3.8/dist-packages/yaml/__init__.py", line 162, in safe_load
    return load(stream, SafeLoader)
  File "/usr/local/lib/python3.8/dist-packages/yaml/__init__.py", line 114, in load
    return loader.get_single_data()
  File "/usr/local/lib/python3.8/dist-packages/yaml/constructor.py", line 49, in get_single_data
    node = self.get_single_node()
  File "/usr/local/lib/python3.8/dist-packages/yaml/composer.py", line 36, in get_single_node
    document = self.compose_document()
  File "/usr/local/lib/python3.8/dist-packages/yaml/composer.py", line 55, in compose_document
    node = self.compose_node(None, None)
  File "/usr/local/lib/python3.8/dist-packages/yaml/composer.py", line 84, in compose_node
    node = self.compose_mapping_node(anchor)
  File "/usr/local/lib/python3.8/dist-packages/yaml/composer.py", line 133, in compose_mapping_node
    item_value = self.compose_node(node, item_key)
  File "/usr/local/lib/python3.8/dist-packages/yaml/composer.py", line 84, in compose_node
    node = self.compose_mapping_node(anchor)
  File "/usr/local/lib/python3.8/dist-packages/yaml/composer.py", line 127, in compose_mapping_node
    while not self.check_event(MappingEndEvent):
  File "/usr/local/lib/python3.8/dist-packages/yaml/parser.py", line 98, in check_event
    self.current_event = self.state()
  File "/usr/local/lib/python3.8/dist-packages/yaml/parser.py", line 438, in parse_block_mapping_key
    raise ParserError("while parsing a block mapping", self.marks[-1],
yaml.parser.ParserError: while parsing a block mapping
  in "/config/config.yml", line 50, column 3
expected <block end>, but found '<block sequence start>'
  in "/config/config.yml", line 58, column 5
Traceback (most recent call last):
  File "detect_objects.py", line 441, in <module>
    main()
  File "detect_objects.py", line 202, in main
    ffmpeg_output_args = ["-r", str(config.get('fps'))] + ffmpeg_output_args
TypeError: can only concatenate list (not "NoneType") to list
On connect called
Traceback (most recent call last):
  File "detect_objects.py", line 441, in <module>
    main()
  File "detect_objects.py", line 202, in main
    ffmpeg_output_args = ["-r", str(config.get('fps'))] + ffmpeg_output_args
TypeError: can only concatenate list (not "NoneType") to list
On connect called

My config.yml looks like this:

# Optional: port for http server (default: shown below)
web_port: 5000

# Optional: detectors configuration
# USB Coral devices will be auto detected with CPU fallback
detectors:
  # Required: name of the detector
  coral:
    # Required: type of the detector
    # Valid values are 'edgetpu' (requires device property below) and 'cpu'.
    type: edgetpu
    # Optional: device name as defined here: https://coral.ai/docs/edgetpu/multiple-edgetpu/#using-the-tensorflow-lite-python-api
    device: usb

# Required: mqtt configuration
mqtt:
  # Required: host name
  host: 10.0.0.xx
  # Optional: port (default: shown below)
  port: 1883
  # Optional: topic prefix (default: shown below)
  # WARNING: must be unique if you are running multiple instances
  topic_prefix: frigate
  # Optional: client id (default: shown below)
  # WARNING: must be unique if you are running multiple instances
  client_id: frigate
  # Optional: user
  user: <user>
  # Optional: password
  # NOTE: Environment variables that begin with 'FRIGATE_' may be referenced in {}. 
  #       eg. password: '{FRIGATE_MQTT_PASSWORD}'
  password: <pass>

# Optional: Global configuration for saving clips
save_clips:
  # Optional: Maximum length of time to retain video during long events. (default: shown below)
  # NOTE: If an object is being tracked for longer than this amount of time, the cache
  #       will begin to expire and the resulting clip will be the last x seconds of the event.
  max_seconds: 300
  # Optional: Location to save event clips. (default: shown below)
  clips_dir: /clips
  # Optional: Location to save cache files for creating clips. (default: shown below)
  # NOTE: To reduce wear on SSDs and SD cards, use a tmpfs volume.
  cache_dir: /cache

# Optional: Global ffmpeg args
# "ffmpeg" + global_args + input_args + "-i" + input + output_args
ffmpeg:
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128
    - -hwaccel_output_format
    - yuv420p
 # Optional: global input args (default: shown below)
# Optional: Global object filters for all cameras.
# NOTE: can be overridden at the camera level
objects:
  # Optional: list of objects to track from labelmap.txt (default: shown below)
  track:
    - person
  # Optional: filters to reduce false positives for specific object types
  filters:
    person:
      # Optional: minimum width*height of the bounding box for the detected object (default: 0)
      min_area: 5000
      # Optional: maximum width*height of the bounding box for the detected object (default: max_int)
      max_area: 100000
      # Optional: minimum score for the object to initiate tracking (default: shown below)
      min_score: 0.5
      # Optional: minimum decimal percentage for tracked object's computed score to be considered a true positive (default: shown below)
      threshold: 0.85

# Required: configuration section for cameras
cameras:
  # Required: name of the camera
  back:
    # Required: ffmpeg settings for the camera
    ffmpeg:
      # Required: Source passed to ffmpeg after the -i parameter.
      # NOTE: Environment variables that begin with 'FRIGATE_' may be referenced in {}
      input: rtsp://xxx:[email protected]:554/Streaming/Channels/102
      # Optional: camera specific global args (default: inherit)
      global_args:
      # Optional: camera specific hwaccel args (default: inherit)
      hwaccel_args:
      # Optional: camera specific input args (default: inherit)
      input_args:
      # Optional: camera specific output args (default: inherit)
      output_args:
    
    # Optional: height of the frame
    # NOTE: Recommended to set this value, but frigate will attempt to autodetect.
    height: 720
    # Optional: width of the frame
    # NOTE: Recommended to set this value, but frigate will attempt to autodetect.
    width: 1280
    # Optional: desired fps for your camera
    # NOTE: Recommended value of 5. Ideally, try and reduce your FPS on the camera.
    #       Frigate will attempt to autodetect if not specified.
    fps: 5

    # Optional: motion mask
    # NOTE: see docs for more detailed info on creating masks
    mask: poly,0,900,1080,900,1080,1920,0,1920

    # Optional: timeout for highest scoring image before allowing it
    # to be replaced by a newer image. (default: shown below)
    best_image_timeout: 60

    # Optional: camera specific mqtt settings
    mqtt:
      # Optional: crop the camera frame to the detection region of the object (default: False)
      crop_to_region: True
      # Optional: resize the image before publishing over mqtt
      snapshot_height: 300

    # Optional: zones for this camera
    zones:
      # Required: name of the zone
      # NOTE: This must be different than any camera names, but can match with another zone on another
      #       camera.
      front_steps:
        # Required: List of x,y coordinates to define the polygon of the zone.
        # NOTE: Coordinates can be generated at https://www.image-map.net/
        coordinates: 545,1077,747,939,788,805
        # Optional: Zone level object filters.
        # NOTE: The global and camera filters are applied upstream.
        filters:
          person:
            min_area: 5000
            max_area: 100000
            threshold: 0.8

    # Optional: save clips configuration
    # NOTE: This feature does not work if you have added "-vsync drop" in your input params. 
    #       This will only work for camera feeds that can be copied into the mp4 container format without
    #       encoding such as h264. It may not work for some types of streams.
    save_clips:
      # Required: enables clips for the camera (default: shown below)
      enabled: False
      # Optional: Number of seconds before the event to include in the clips (default: shown below)
      pre_capture: 30
      # Optional: Objects to save clips for. (default: all tracked objects)
      objects:
        - person      

    # Optional: Configuration for the snapshots in the debug view and mqtt
    snapshots:
      # Optional: print a timestamp on the snapshots (default: shown below)
      show_timestamp: True
      # Optional: draw zones on the debug mjpeg feed (default: shown below)
      draw_zones: False
      # Optional: draw bounding boxes on the mqtt snapshots (default: shown below)
      draw_bounding_boxes: True

    # Optional: Camera level object filters config. If defined, this is used instead of the global config.
    objects:
      track:
        - person
        - car
      filters:
        person:
          min_area: 5000
          max_area: 100000
          min_score: 0.5
          threshold: 0.85

Remove all the optional ffmpeg camera args settings if you aren’t using them.
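In other words, something like this for the camera section (a sketch; dropping the empty global_args/hwaccel_args/input_args/output_args keys so the defaults are inherited):

cameras:
  back:
    ffmpeg:
      input: rtsp://user:password@camera-ip:554/Streaming/Channels/102
    height: 720
    width: 1280
    fps: 5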

This is great work, @blakeblackshear, definitely the right way of doing this. I had a Coral device sitting around for about a year and finally put it to some use.

Some observations; I would like to hear your comments and ideas:

  • Stability is the main issue for me. It doesn’t last more than 3-4 hours before I need to restart, and I see the cameras freeze on the web URL when it happens. What is the best way to debug that?
  • It looks like the main issue is ffmpeg instability? What is a good way to debug ffmpeg settings and stability?
  • I have 5 cameras running and have already tried them all at 640p with 5 fps.

Here is an example of what I see in the logs constantly:

ffmpeg -hide_banner -loglevel panic -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://admin:[email protected]/cam/realmonitor?channel=1&subtype=0 -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -an -map 0 /cache/patio-%Y%m%d%H%M%S.mp4 -r 5 -f rawvideo -pix_fmt yuv420p pipe:
frontyard: ffmpeg sent a broken frame. something is wrong.
frontyard: ffmpeg process is not running. exiting capture thread...
backyard: ffmpeg sent a broken frame. something is wrong.
backyard: ffmpeg process is not running. exiting capture thread...
Creating ffmpeg process...

It works, but it is unstable; cameras hang in the web URL after 3 hours or less, some cameras before others.

It never comes back on its own, but it does after a container restart? You can increase ffmpeg logging with global_args on one of the cameras. What type of cameras do you have, and what does the network look like between frigate and the camera?
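For example, a per-camera override could look something like this (a sketch; the generated commands above show the defaults use -loglevel panic, so raising it to info makes ffmpeg much more verbose):

cameras:
  backyard:
    ffmpeg:
      input: rtsp://user:password@camera-ip/stream0
      global_args:
        - -hide_banner
        - -loglevel
        - info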

It comes back with a docker restart.

Here is the “info” log level output:

Creating ffmpeg process...
ffmpeg -hide_banner -loglevel panic -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://192.168.7.47/stream0 -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -an -map 0 /cache/backyard-%Y%m%d%H%M%S.mp4 -r 5 -f rawvideo -pix_fmt yuv420p pipe:
Input #0, rtsp, from 'rtsp://admin:[email protected]/cam/realmonitor?channel=1&subtype=0':
  Metadata:
    title           : Media Server
  Duration: N/A, start: 1604682764.724211, bitrate: N/A
    Stream #0:0: Video: h264, yuvj420p(pc, bt709, progressive), 1920x1080 [SAR 45:59 DAR 80:59], 5 fps, 5 tbr, 90k tbn, 10 tbc
[segment @ 0x55c33bf3f0c0] Opening '/cache/trees-20201106091248.mp4' for writing
Could not write header for output file #0 (incorrect codec parameters ?): No space left on device
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #0:0 -> #1:0 (h264 (native) -> rawvideo (native))
    Last message repeated 1 times
trees: ffmpeg sent a broken frame. something is wrong.
trees: ffmpeg process is not running. exiting capture thread...
openspace: ffmpeg sent a broken frame. something is wrong.
openspace: ffmpeg process is not running. exiting capture thread...
Creating ffmpeg process...

Also, there is that “Could not write header for output file” error.

Multiple cameras: Amcrest PoE, Amcrest wireless, and an S3VC. The issue occurs with both Ethernet and WiFi cameras.

The server is a dual-core Kaby Lake with 8 GB of RAM under Ubuntu 20.04, running the frigate Docker container.

The server has a WiFi connection to the network. I am changing that to wired to see if it makes a difference; do you expect a difference there?

It is telling you the /cache folder is out of space in the container.
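You can confirm from the host with something like:

docker exec frigate df -h /cache   # shows the tmpfs size and how full it is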

Hmm, OK, so I am hitting the 1 GB limit on the tmpfs per the docker-compose file. What is stored in tmpfs? Does it grow forever, or is it flushed every now and then? Since that is DRAM, I don’t want to increase it too much.

I could map a static directory as a volume as an alternative, I guess. Can you elaborate on how the “cache” is used and how you manage it?

version: '2.4'
services:
  frigate:
      container_name: frigate
      # restart: unless-stopped
      privileged: true
      image: blakeblackshear/frigate:stable-amd64
      volumes:
        - /dev/bus/usb:/dev/bus/usb
        - /etc/localtime:/etc/localtime:ro
        - ./config:/config
        - ./clips:/clips
        - type: tmpfs # 1GB of memory, reduces SSD/SD Card wear
          target: /cache
          tmpfs:
            size: 100000000
      ports:
        - "5000:5000"
      environment:
        FRIGATE_RTSP_PASSWORD: "password"
      healthcheck:
        test: ["CMD", "wget" , "-q", "-O-", "http://localhost:5000"]
        interval: 30s
        timeout: 10s
        retries: 5
        start_period: 3m

Thanks!

You could open an issue on GitHub and add the code there so that it can be merged by @blakeblackshear.