Home Assistant Add-on: Promtail
Promtail is an agent which ships the contents of local logs to a private Loki instance or Grafana Cloud. It is usually deployed to every machine that runs applications you want to monitor.
This add-on requires Supervisor version 2021.03.8, as it relies on the newly added journald capability. This is the current stable release as of 4/5. If you haven't updated yet, make sure you update first.
About
By default this add-on version of Promtail will tail logs from the systemd journal. This includes all logs from all add-ons, the Supervisor, and Home Assistant. It can also tail local log files in /share or /ssl if you have a particular add-on that logs to a file instead of to stdout.
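For reference, a journald scrape in Promtail looks roughly like this. This is a minimal sketch based on the Promtail documentation, not this add-on's exact default configuration:

```yaml
scrape_configs:
  - job_name: journal
    journal:
      # On startup, read at most the last 12 hours of the journal
      max_age: 12h
      labels:
        job: systemd-journal
    relabel_configs:
      # Turn the systemd unit name into a queryable label
      - source_labels: ['__journal__systemd_unit']
        target_label: 'unit'
```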
How do I use it?
Promtail is a central piece of what's known as the PLG Stack for application monitoring: Promtail, Loki, and Grafana. I'm sure a lot of you are already familiar with Grafana's data analysis and visualization tools, either from the great community add-on or from use in some other aspect of your life.
But Grafana is also central to system monitoring. The same company also owns Loki and Promtail, which are used to collect and aggregate logs and other metrics from your systems. Grafana can then pull this information from Loki so you can explore, analyze, and create metrics and alerts. Grafana isn't the only tool that can read from Loki, but it is usually used in this stack since it's all designed to work well together.
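If you provision Grafana from files rather than clicking through the UI, adding Loki as a data source is only a few lines of YAML. The URL below is an assumption; point it at wherever your Loki instance actually listens:

```yaml
# e.g. /etc/grafana/provisioning/datasources/loki.yaml
apiVersion: 1
datasources:
  - name: Loki
    type: loki
    access: proxy
    # Hypothetical address; replace with your own Loki instance
    url: http://loki:3100
```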
Essentially the process you probably want to set up is this:
- Promtail scrapes your logs and feeds them to Loki
- Loki aggregates and indexes and makes its API available
- Add Loki as a data source to Grafana and explore
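In Promtail terms, the first step amounts to pointing Promtail's client at Loki's push API. A minimal sketch, where the hostname is an assumption to substitute with your Loki instance's address:

```yaml
clients:
  # Hypothetical address; replace with your Loki instance
  - url: http://loki:3100/loki/api/v1/push
```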
Great! Where’s Loki?
Also in this repository! You can find it here.
Anything else I need to know?
Before making any complicated scrape configs, I'd recommend reading the Loki best practices guide. It's also worth learning about LogQL and what it can do. Less is more in the scraping stage.
Other than that, the README and documentation cover all the options. If you need help, you can:
- Comment here
- Open an issue in the repository
- Ask for help in the #add-ons channel of the HA discord (I’m CentralCommand#0913 there).
Also, big thanks to @massive for letting me know the right way to do this with the journal, and for providing the journal scraping configuration; that's what this add-on now uses as its default scrape configuration.
HA relevant scrape config examples
Before journald
support was released for add-ons, I had to test purely using additional scrape configs pointed at log files other add-ons pumped out. I don't use any of these anymore now that journald support exists, but I figured I'd share them. I thought people needing to configure additional scrape configs might find it useful to have a few other HA-relevant examples at their disposal (in addition to what's in the Promtail docs).
Caddy 2 access logs
You can get Caddy 2 to log all access to the file of your choice and then scrape it with Promtail. Add something like this to your Caddyfile:
:443 {
    log {
        output file /ssl/caddy/logs/caddy.log {
            roll_size 20MiB
            roll_keep_for 168h
        }
    }
}
Then you can add a scrape config like this:
- job_name: caddy
  pipeline_stages:
    - json:
        expressions:
          stream: level
          status_code: status
          host: request.host
          time: ts
    - labels:
        stream:
        status_code:
        host:
    - timestamp:
        source: time
        format: Unix
  static_configs:
    - targets:
        - localhost
      labels:
        job: caddy
        __path__: /ssl/caddy/logs/caddy.log
This one was nice and easy since the output is structured JSON; the others were tougher.
Zigbee2MQTT logs
The Zigbee2MQTT add-on dumps all of its logs into a folder called log inside its data folder (/share/zigbee2mqtt/log by default). This includes every MQTT message it publishes. It's not structured, but it's scrapable; here's the config I used when testing:
- job_name: zigbee2mqtt
  pipeline_stages:
    - regex:
        expression: '^(?P<stream>\S+)\s+(?P<time>\d{4}(?:-\d\d){2} \d\d(?::\d\d){2}):\s+(?P<content>.*)$'
    - regex:
        expression: '^(?P<mqtt_event>MQTT publish):\s+.*$'
        source: content
    - labels:
        stream:
        mqtt_event:
  static_configs:
    - targets:
        - localhost
      labels:
        job: zigbee2mqtt
        __path__: /share/zigbee2mqtt/log/*/log*.txt
Home Assistant log file
This one actually won't work anymore since /config isn't being mapped by the add-on, but I thought it was a good reference since it was a bit tricky to figure out. The multiline stage at the top causes it to pull stack traces into the log line they were generated from. Otherwise it would've been really tough to read these logs, with each line of the stack trace as a separate log entry.
- job_name: homeassistant
  pipeline_stages:
    - multiline:
        firstline: '^\d{4}(?:-\d\d){2} \d\d(?::\d\d){2} '
    - regex:
        expression: '^(?s)(?P<time>\d{4}(?:-\d\d){2} \d\d(?::\d\d){2})\s+(?P<stream>\S+)\s+\((?P<thread>[^)]+)\)\s+\[(?P<component>[^\]]+)\].*$'
    - labels:
        stream:
        time:
        component:
  static_configs:
    - targets:
        - localhost
      labels:
        job: homeassistant
        __path__: /config/home-assistant.log