I didn’t realize that Portainer hides add-on containers by default; once I found that out, I was able to find the correct container and it worked!
YOU ARE A LIFE SAVER!
Does anyone have any clue why it is not possible to renew or create a new Let’s Encrypt certificate?
Two other users and I have already reported it, but so far no response.
Renewal of an existing cert is timing out, and creating a new one fails with “Internal Error”.
I don’t know if I had the same issue, but (if you’re not using the DNS challenge) new certificates and renewals failed for me when I got the port 80 forwarding rule wrong (multiple-NAT environment).
Greetings, good people, can somebody explain and help me?
I read the comments but still don’t get it.
I have a web server on my LAN which I want to reach from a remote location on my phone, alongside HASS.
I have Nabu Casa and its DNS name to connect to.
And I want to set up a reverse proxy to the other web server on the LAN.
First I’m trying it without SSL to check if it’s working, and it’s not.
Please help:
404: Not Found
For the Source field you need an actual domain name (not an IP, but something like mydomain.duckdns.org or mysubdomain.mydomain.duckdns.org if you’re also using DuckDNS, so that you don’t have to deal with your ISP renewing your IP).
Thank you for the explanation; as I am using Nabu Casa, I hadn’t figured that out. Nevertheless, I made a tutorial on how to mount a media folder on HAOS, so now I don’t need the NGINX add-on.
In case someone needs it here:
I was recently doing a firmware upgrade on my main router anyway, so I went ahead and recreated my forwarding rules there, but it didn’t help.
I described the other steps I tried in the GitHub issue, but so far it looks like certbot is not configuring nginx properly to serve the “acme-challenge”, so it returns 404 to the Let’s Encrypt servers that are trying to verify the renewal request.
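For anyone debugging the same 404, one quick sanity check is to request the challenge path over plain HTTP from outside the network. This is only a sketch: the hostname is a placeholder, and during a real renewal the token filename is random, not `test-token`.

```shell
# Replace the hostname with your own. Even with a made-up token, the
# response should come from the nginx instance serving the challenge
# directory; a generic 404 from something else suggests the request
# never reaches certbot's challenge location.
curl -i http://mydomain.duckdns.org/.well-known/acme-challenge/test-token
```

If this is answered by the wrong server (or times out), the port 80 forwarding rule is the first thing to re-check.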
This add-on doesn’t start. I installed MariaDB and ran it, then tried to run Nginx Proxy Manager, but nothing happens and nothing is written to the log. Any idea what the problem is? (I normally run InfluxDB as my database; I installed MariaDB just for this add-on.)
Before, I was using the NGINX Home Assistant SSL proxy add-on, which worked well, but when I connect and want to make changes in Node-RED I get an error because the Node-RED payload is too big, and in that add-on I don’t know how to change the max size.
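For what it’s worth, in Nginx Proxy Manager you can usually raise that limit per proxy host: edit the host, open the Advanced tab, and add a custom nginx directive. A minimal sketch (the 64M value is just an example; pick whatever Node-RED actually needs):

```
# Custom config for the proxy host's "Advanced" tab in NPM.
# Raises the maximum accepted request body (nginx's default is 1M).
client_max_body_size 64M;
```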
Hi All,
I have a question. I’m using NPM and my Nextcloud is behind it.
When I connect to my Nextcloud domain, all data goes via NPM… is there an option to send the data directly to the Nextcloud server instead of through NPM, once the incoming domain has been forwarded?
I don’t understand the issue. Are you referring to NPM or to NGINX Home Assistant SSL proxy?
Could you explain in detail (including your config)?
Can you describe what you want to achieve? Have you made any changes to config.php of Nextcloud?
I installed Nginx Proxy Manager a few weeks ago and haven’t changed the NGINX config yet.
I wanted to add Cloudflare Authenticated Origin Pulls,
but how can I access “/etc/nginx/certs/” from HAOS?
Setting up NGINX to use TLS Authenticated Origin Pulls
For Authenticated Origin Pulls to work, use Full SSL in the Cloudflare SSL/TLS app and update the origin web server’s SSL configuration. Download origin-pull-ca.pem and place the certificate in a file on your origin web server, for example in /etc/nginx/certs/cloudflare.crt
Then add these lines to the SSL configuration for your origin web server:
ssl_client_certificate /etc/nginx/certs/cloudflare.crt;
ssl_verify_client on;
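For context, here is a sketch of where those two directives sit inside a typical nginx TLS server block. The server_name and certificate paths are placeholders, not values from this thread:

```
server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/nginx/certs/example.com.pem;
    ssl_certificate_key /etc/nginx/certs/example.com.key;

    # Authenticated Origin Pulls: require the connecting client
    # (i.e. Cloudflare) to present a certificate signed by the
    # downloaded origin-pull CA.
    ssl_client_certificate /etc/nginx/certs/cloudflare.crt;
    ssl_verify_client on;
}
```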
Also, I’d love to see some configuration templates for NGINX, especially as a reverse proxy.
Maybe I don’t know how to explain it correctly.
nc.domain.com goes through NPM, which sends the data to the server on port 4443, for example.
When I use Nextcloud on my mobile and back up photos or videos, all the data goes through the NPM server (RPi4) to the other RPi4 where Nextcloud is running.
Is there an option so that, after NPM performs the redirect, incoming data from the router goes directly to the second RPi, so the first RPi is no longer involved?
Sorry, I’m not sure I understood: do you want to be able to access folders from Nextcloud on LAN without going through NPM (while access from outside the network to Nextcloud is still going through NPM)?
When I access Nextcloud on LAN i can connect direct. Same network.
When I access Nextcloud from outside I use NPM.
But my web server also backs up websites using the WebDAV option, and 2.5 GB is pushed through NPM to the Nextcloud server every night.
I have to connect via NPM to get HTTPS without opening my Nextcloud port to the outside.
It’s just that the whole 2.5 GB goes through NPM before being forwarded to Nextcloud.
I’m sorry, but I still don’t get it.
Is the mentioned 2.5 GB of daily data a full backup (you have roughly 2.5 GB of data that may grow or shrink within some margin, but stays under, say, 3 GB, and you back up all of it each day), or an incremental backup (you add or modify 2.5 GB of data every day and need to keep all of it, so the total size needing backup becomes much larger over longer periods, i.e. into the TB region: 2.5 GB * 365 days = 912.5 GB of new data every year)?
I’ll describe my setup, please mention if parts of it matches yours (device number can be increased/decreased):
- phones 1, 2 and 3 (1 Android and 2 iOS) and tablet 1 (iOS) are running Nextcloud mobile clients that back up data to folder A, B, C and folder D respectively (all folders are with pictures), on the server;
- desktop 1, laptop 1 and laptop 2 (dual boot Windows and Linux) running Nextcloud desktop clients that back up folders E, F and G to the server (folders E to G include some data that are specific to each device, mostly drivers stuff);
- Nextcloud server (folder H is common to all instances and includes mostly configs for devices such as the wireless router config file or bookmarks and folder “I” has files that I need to share with persons outside the house).
Folders A to G are synchronized from client to server (the original data is on the phone or laptop/desktop and pushed to the server), and folder H is synchronized between all devices whenever the client data changes (it depends on the situation, but with working from home there is very little new data).
All folders A to H are backed up (with encryption) to an external site using rclone on a weekly basis with the copy function (so that only new files are pushed to the remote backup). Folder “I” is not backed up to the external site.
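The weekly rclone step could look roughly like this. It is only a sketch: it assumes an rclone “crypt” remote named `remote-crypt` wrapping the external site, and that folders A–H live under `/srv/nextcloud/data`; both names are made up for illustration.

```shell
# Weekly encrypted off-site backup sketch.
# "rclone copy" uploads new/changed files only and never deletes
# anything on the remote, matching the setup described above.
for dir in A B C D E F G H; do
    rclone copy "/srv/nextcloud/data/$dir" "remote-crypt:backup/$dir"
done
```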
You have to exec into the Docker container of this add-on and use curl/wget to download that Cloudflare cert.
Then open your host definition in the UI; in the advanced section you should be able to add those two config lines.
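Roughly something like this. Both the container name and the CA URL are placeholders: find the real container name with `docker ps`, and take the download URL from Cloudflare’s documentation.

```shell
# Add-on containers are hidden in Portainer by default, so list them
# on the host to find the Nginx Proxy Manager container.
docker ps --format '{{.Names}}' | grep -i nginx

# Download the origin-pull CA inside the add-on container
# (replace both placeholders with your actual values; use wget
# instead if curl is not present in the container).
docker exec <npm-container-name> \
  curl -fsSL -o /etc/nginx/certs/cloudflare.crt '<cloudflare-ca-url>'
```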
Thank you.
Unfortunately, as I’m running Home Assistant OS, I don’t think I can. I can try to use the SSL add-on, but it’s so limited that I don’t think I’ll be able to.