This is most likely because there is too much info to process and your Pi is under-specced for that type of job.
Dear,
That's why I pointed it out.
I deleted the data and created new data yesterday.
Even after that, my Pi hangs when I want to grab data for 30+ days.
Is there any way I can move the data to a PC or to my OPi and then grab the data from there?
Of course you can move Grafana and InfluxDB off the Raspberry Pi.
The easiest way is probably using Docker containers.
Then in HA, where you define your InfluxDB connection, you can set the IP address of the host that actually holds the InfluxDB data.
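Something along these lines in configuration.yaml, where the IP, database name and credentials are placeholders for whatever machine ends up running InfluxDB:

influxdb:
  host: 192.168.1.50            # placeholder: machine that runs InfluxDB
  port: 8086
  database: home_assistant
  username: homeassistant
  password: !secret influxdb_password
  default_measurement: state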
I want to keep InfluxDB on my Pi.
I just want something like:
backing up the data on a regular basis to another device, from where I can grab the data in Grafana.
Is there any way to do it?
Any reason why you'd want to keep it on the Pi?
Of course you could also have Grafana running on a different machine and pulling data from InfluxDB on the Pi, but if you're going to have Grafana on a separate machine, you may as well move InfluxDB there too.
That's what I'd personally do (and am currently doing) to leave the Pi to do what it needs to: HA.
What takes long is most likely the data retrieval from InfluxDB, so having Grafana elsewhere pulling from the Pi might actually be worse than all-in-one, as the data will now have to get out of the Pi, onto the network, and over to the remote machine…
Not forgetting that with time you'll end up killing your SD card and losing all the data on it…
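Either way, pointing Grafana at a remote InfluxDB is just a data source URL. A minimal provisioning sketch (a file under provisioning/datasources/), assuming InfluxDB stays on the Pi at a placeholder address; you can configure the same thing in the Grafana UI instead:

apiVersion: 1
datasources:
  - name: InfluxDB Home Assistant
    type: influxdb
    access: proxy                   # Grafana's backend does the querying
    url: http://192.168.1.60:8086   # placeholder: the Pi that hosts InfluxDB
    database: home_assistant
    user: homeassistant
    secureJsonData:
      password: influxdb_password_here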
Dear, I have 2 Pis and I don't want to use my Mac or PC for InfluxDB or Grafana.
And that's the reason why I wanted to use the Pi for InfluxDB.
What I was thinking:
saving data in InfluxDB on my Pi,
syncing the data to a PC,
and then using Grafana on the PC to grab data from the InfluxDB that is on the PC.
I understand. On the basis that the DB will remain on one of the Pis, you will probably still experience the same speed issue, albeit it might be a bit quicker if the other Pi that hosts InfluxDB does nothing but InfluxDB…
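If you do want to try the sync approach, InfluxDB's own backup/restore can move the data across. A rough sketch, assuming InfluxDB 1.5+ and placeholder paths and hostnames (restoring into a database that already exists on the PC may need the -newdb option or the old database dropped first):

# On the Pi: dump the Home Assistant database in portable format
influxd backup -portable -database home_assistant /tmp/ha_backup

# Copy the dump to the PC (placeholder user and address)
scp -r /tmp/ha_backup user@192.168.1.20:/tmp/ha_backup

# On the PC: restore it into the local InfluxDB
influxd restore -portable /tmp/ha_backup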
Hi all, I'm using the Grafana and InfluxDB add-ons on Hass.io running directly on an RPi 3B+. I created graphs for the RPi CPU temperature and three graphs for the temperature, humidity and pressure of a Zigbee Aqara sensor connected with a CC2531 dongle. Is it right that in the Lovelace iframe card I don't see updated graphs for the Aqara sensor? I used the embed link but it doesn't refresh the graphs; only the CPU temp is updated. I don't know how to retrieve the data in order to display updated graphs. How can I do it?
Is there any reason why you prefer a picture over the panel iframe?
How did you manage to see the temperatures with a proper name? Mine are shown as: °C {entity_id: mj_unsure_temp}
And that is the best I can get.
Hello everyone, HA newbie here.
I tried to follow these steps to install Grafana and InfluxDB. After deploying Grafana I can't access its interface via http://ip_address:3000.
It seems that port 3000 is not open, and that is a problem. When I ran nmap I got:
nmap localhost
Starting Nmap 7.60 ( https://nmap.org ) at 2019-10-19 16:14 UTC
Nmap scan report for localhost (127.0.0.1)
Host is up (0.000080s latency).
Not shown: 995 closed ports
PORT STATE SERVICE
22/tcp open ssh
139/tcp open netbios-ssn
445/tcp open microsoft-ds
8083/tcp open us-srv
8086/tcp open d-s-n
Nmap done: 1 IP address (1 host up) scanned in 0.04 seconds
Shouldn't port 3000 be displayed as well? Any ideas how to fix this?
I also tried this command to deploy Grafana: docker run -d --name=grafana --restart always -p 3000:3000 -v /volume1/docker/grafana:/var/lib/grafana grafana/grafana
Under status I get that Grafana keeps restarting:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
8c09454abb95 grafana/grafana "/run.sh" 16 minutes ago Restarting (1) 6 seconds ago grafana
After reading the Grafana docs, the issue might be with my storage path. I have to dig into it a bit further.
I ran, per the Grafana instructions:
docker volume create grafana-storage
docker run -d -p 3000:3000 --name=grafana -v grafana-storage:/var/lib/grafana grafana/grafana
Now I am able to access Grafana via http://ip_address:3000.
Has anyone here using Grafana found a way to change the size of the text on the graphs it makes?
Hi,
I'm having a problem where my camera card won't show up. My camera entity remains in status idle. When I open the Grafana rendered image link in my browser, the image shows fine.
What can be the cause of my problem?
camera_image: camera.grafana_graph
type: picture-glance
entities: []

camera:
  - platform: generic
    name: grafana_graph
    still_image_url: 'http://192.168.1.100/api/hassio_ingress/4fpCsWMcL4pSth3Vb_qxznesKZBLi6jG0Wow1-d4nvc/render/d-solo/wctCn1aZz/home-assistant?from=1575198812333&to=1575803612333&orgId=1&panelId=2&width=1000&height=500&tz=Europe%2FBrussels'
    username: !secret grafana_user
    password: !secret grafana_pwd
I can't get this to work despite having followed the guide step by step.
This is my current camera setup:
camera:
  - platform: generic
    name: CPU Temp
    still_image_url: 'https://192.168.0.101:8123/api/hassio_ingress/UmQCp6CniC9izgNcJ1g3FnP5dK9lOIEqcSjAiItj25k/render/d-solo/XA4Is2-Zk/cpu-temp?from=1575721670400&to=1575743270400&orgId=1&panelId=2&width=1000&height=500&tz=Europe%2FStockholm'
    verify_ssl: false
    username: !secret grafana_user
    password: !secret grafana_password
But I have tried different still_image_urls. If I click on share / direct link rendered image I get a URL starting with https://xxxxxx.duckdns.org:8123/xxxxxx, but if I read the guide it should be http:// and the local port 3000. If I use the https://xxxxxx.duckdns.org:8123/xxxxxx URL in a browser the graph shows, but not in the camera image. Also, I would prefer to use the local IP.
I use Docker and Hass.io, from where I installed Grafana as an add-on. In the Docker setup it seems like 3001 is the local port, but if I try :3001 there is still no change.
Can someone guide me on what IP:port I should use to get this going?
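For reference, this is roughly what I think the guide intends: a generic camera pointing at the local Grafana render endpoint instead of the ingress URL. The IP, port (3000, or 3001 if that is what the add-on maps), dashboard UID and panel id below are just my guesses:

camera:
  - platform: generic
    name: CPU Temp
    # local render endpoint; adjust the port to whatever Grafana is actually published on
    still_image_url: 'http://192.168.0.101:3000/render/d-solo/XA4Is2-Zk/cpu-temp?orgId=1&panelId=2&from=now-24h&to=now&width=1000&height=500&tz=Europe%2FStockholm'
    username: !secret grafana_user
    password: !secret grafana_password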
@arsaboo, Nice article, helped me a lot!
I noticed that the camera images would go blank during the retrieval of some slower graphs. I managed to hack my way around this by piping curl to a temp file and then moving it into place locally.
command: 'curl -s -H "Authorization: Bearer API_KEY" "http://grafana/url" > /config/downloads/temperature.tmp; mv /config/downloads/temperature.tmp /config/downloads/temperature.png'
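One way to wire this up end to end is a shell_command plus a local_file camera and a small automation; the file paths, entity names and refresh interval below are illustrative, not necessarily the exact setup:

shell_command:
  refresh_temperature_graph: >-
    curl -s -H "Authorization: Bearer API_KEY" "http://grafana/url"
    > /config/downloads/temperature.tmp &&
    mv /config/downloads/temperature.tmp /config/downloads/temperature.png

camera:
  - platform: local_file
    name: Temperature graph
    file_path: /config/downloads/temperature.png

automation:
  - alias: Refresh temperature graph
    trigger:
      - platform: time_pattern
        minutes: '/5'
    action:
      - service: shell_command.refresh_temperature_graph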
What is the solution for Raspberry Pi users on HassOS, where PhantomJS is not supported?
I would recommend using Raspbian Stretch as there is a pre-compiled PhantomJS binary, or if you prefer the latest Raspbian Buster you may be able to compile PhantomJS from source.
For anyone stumbling onto this thread from a Google search who might run into the following Grafana errors in Docker:
mkdir: can't create directory '/var/lib/grafana/plugins': Permission denied
GF_PATHS_DATA='/var/lib/grafana' is not writable.
You may have issues with file permissions, more information here: http://docs.grafana.org/installation/docker/#migration-from-a-previous-version-of-the-docker-container-to-5-1-or-later
This was solved by going into the docker volume I attempted to mount and running 'chmod 777 grafana' to open up the grafana folder permissions.
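In command form, assuming the bind-mount path from the earlier docker run (the chown variant is the less permissive alternative described in the linked Grafana docs, since the container runs as uid 472 from Grafana 5.1 onwards):

# quick and blunt: open up the mounted folder
chmod 777 /volume1/docker/grafana
# or, less permissive: hand the folder to Grafana's container user
# chown -R 472:472 /volume1/docker/grafana
docker restart grafana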
Also:
- The newest default Grafana credentials are not root:root but admin:admin.
- The Grafana data source URL needs to be accessible from your web browser, not just from the Grafana server. That means localhost:8086 won't work; use the host IP or name.
Excellent, it worked just fine to integrate a complete dashboard with Grafana.
One additional improvement that I would like to mention: in step 5, I find it easier not to list the two entities under 'include', in order to have the full range of sensors included by default. When you include those two, it excludes all the others.
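For example, something along these lines keeps every sensor by default and only filters out noisy domains; the host, entity and domain names are placeholders:

influxdb:
  host: 192.168.1.100          # placeholder
  # Listing entities under include turns the filter into an allow-list,
  # so only those entities would be written:
  # include:
  #   entities:
  #     - sensor.living_room_temperature
  #     - sensor.living_room_humidity
  # Leaving include out (or filtering with exclude) keeps everything else:
  exclude:
    domains:
      - automation
      - updater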