It is perfectly possible to use a real SSL certificate locally only.
I used to:
add my registered domain name to my router's DNS service (domain.my).
use a DHCP reservation for HA, e.g. 192.168.0.1
run Apache on a W10 machine
run certbot (Let's Encrypt) on that W10 machine regularly to generate the SSL certificate.
copy the certificate to HA.
In this way ha.domain.my resolves correctly to 192.168.0.1.
No hassle, no nginx, no Nabu Casa, no ports open to HA (only port 80 for Apache, as certbot requires it to work).
You could stop Apache/close port 80 after that, but I ran certbot automatically once a week, as the Let's Encrypt certificate is only valid for 90 days; I wanted it to renew automatically.
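The weekly renewal described above can be sketched as a cron entry; the webroot path, domain, and HA address are placeholders, and it assumes certbot's webroot plugin plus SCP access to the HA box:

```
# Crontab fragment (runs Mondays at 03:00): renew the cert via the
# webroot served by Apache, then copy it to Home Assistant's /ssl dir.
0 3 * * 1  certbot certonly --webroot -w /var/www/html -d ha.domain.my --quiet && scp /etc/letsencrypt/live/ha.domain.my/fullchain.pem /etc/letsencrypt/live/ha.domain.my/privkey.pem root@192.168.0.1:/ssl/
```

After the copy, Home Assistant needs a restart to pick up the renewed certificate.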
I had not read this thread for a while, since no working solution had been proposed here until, I hope, now. But now that I am here I would like to respond to Nid01, because I feel a lot like Nid01 does.
I have used HA for a long time, longer than my profile suggests; probably since version 14.something, right after NoDo.
What started out as people helping each other in this forum now often turns into derogatory remarks: pointing out that it is already in the forum, or "look better in Google, do your search".
The fun of this forum used to be that we were all explorers, one more gifted than the other; now you often have to be a crack not to get looked down upon. This forum has become so huge that finding stuff is hard, and, as Nid01 pointed out so well, if you do not know the search term or phrase, especially as a non-native English speaker, you are lost.
And asking here, more often than it used to, gets met with "do your own searching, do not trouble us gods who know all". Which I think is not how it should be: help, or do not react.
This is possible, and I have been using HTTPS on my local network for a few years. I bought a domain because I could have only four subdomains on a free domain.
I set up an nginx reverse proxy and got my Let's Encrypt certs for the domain and all subdomains. My domain is my HA instance and the subdomains are other Docker containers, e.g. adguard.mydomain.com.
I use AdGuard to do DNS rewrites for mydomain.com and its subdomains, so the FQDNs resolve to my server IP.
And this works with no problems; certs are renewed automatically.
There is just one problem I have not managed to solve: I cannot open my subdomains through Nabu Casa cloud. As other people wrote, to use an iframe from outside your network your subdomain needs to be accessible over the internet, and mine are not.
I have a Docker installation, but this should work with any other type of HA installation. I can provide more info when I get home from work.
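A minimal sketch of the kind of nginx server block such a setup ends up with (domain, IPs, and paths are placeholders; nginx-proxy-manager or the HA nginx add-on would generate something similar):

```
server {
    listen 443 ssl;
    server_name mydomain.com;

    ssl_certificate     /etc/letsencrypt/live/mydomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/mydomain.com/privkey.pem;

    location / {
        proxy_pass http://192.168.0.10:8123;  # Home Assistant

        # WebSocket headers, required by the HA frontend
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Each container (AdGuard, Zigbee2MQTT, and so on) gets its own server block with its own server_name and proxy_pass target.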
you might be right. But it is also true that A LOT of people who "EXPECT" help from volunteers do not provide even the slightest information about what the issue is, where the issue is, whether it is reproducible, how it is reproducible, and other such necessary information. The volunteers help in their free time and are literally not getting paid for that job. You can check any tech forum and you will find this behaviour.
Also, literally many people do not understand how a forum works. It is not just about language barriers either, as I have seen this since the beginning of my times on BBSes and the internet. Yes, also in my own language. And that is since the 14,400-baud times. So for quite some time.
So after all, the expectation of a high percentage of people is to get pre-processed data that fits their problem exactly and solves it.
Coming back to the issue of not understanding the language or not knowing what to search for: I just gave one example directly, even with the precise search term. There are thousands of tutorials for exactly these specific terms. You can even choose your own language, or extend the search with the things you need. In this regard: "Home Assistant +nginx +reverse proxy".
Also, none of the so-called gods and cracks is looking down on anyone. I would rather challenge myself and ask whether I asked the right question about my problem, instead of looking for the fault in someone else.
But I also think that, due to the internet with its AIs, Googles and other services, people are getting lazy and have forgotten how to describe their issue and what they want to achieve in a good way. Especially with the attitude of getting everything for free, precisely fitted to their problem, without describing the problem. Nowadays it is not even necessary to save data in your brain any more, because if you don't know something, you google it.
“Give a Man a Fish, and You Feed Him for a Day. Teach a Man To Fish, and You Feed Him for a Lifetime”
And about hiding my message… Well… “Computer says no”
I too am looking for a solution that is as simple as possible, but so far I have not been successful. In my case, DNS is resolved on the local network via a Pi-hole, where I can also assign local addresses (homeassistant.local > 192.x.x.x). Would it be possible to use a working OpenSSL certificate that contains, for example, homeassistant.local as the address? I did not succeed with the local IP address, which seems logical to me after reading the articles.
Since I don’t use many clients, I wouldn’t have any problems manually installing certificates there if necessary.
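Since no public CA will issue a certificate for a .local name (those are reserved for mDNS), the only route for that hostname is a self-signed certificate that each client is then told to trust manually, which fits the "install certificates on the few clients" approach above. A sketch with OpenSSL (hostname, IP, and validity period are placeholders; requires OpenSSL 1.1.1 or newer for -addext):

```shell
# Self-signed cert whose SAN covers both the mDNS name and the local IP
openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
  -keyout privkey.pem -out fullchain.pem \
  -subj "/CN=homeassistant.local" \
  -addext "subjectAltName=DNS:homeassistant.local,IP:192.168.1.10"
```

Home Assistant can then be pointed at the files via `ssl_certificate:` and `ssl_key:` under `http:` in configuration.yaml, and fullchain.pem is imported into each client's trust store by hand.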
Surely the issue is that even those of us who have a basic knowledge of networking, browsers and Linux find the whole concept of security certificates rather mind-boggling. Many questions arise such as (a) Why do we need a security certificate to send a command from one device to another on our local network, which is meant to be the whole point of Home Assistant? (b) What is that certificate actually doing? (c) Who or what checks and validates the certificate? (d) What are the ways in which the certificate can fail, and does it (as some posts have implied) need to be renewed or updated regularly?
Home Assistant is potentially a great product but it can also create a lot of stress for something which, after all, is meant to make our lives easier.
Edit: I too (having got frustrated with Google Home's deteriorating speed and reliability) am slowly trying to make Home Assistant useful, and am just reaching the hurdle of voice control via web browsers.
You don't need an SSL cert for a local network; when you look at it, SSL certs are not really meant to be used on local hosts, but you can do it. The problem I'm facing is using SSL client certs for accessing different containers, aka add-ons. This would be a great security feature IMHO. For example, you could grant access to, e.g., the Zigbee2MQTT web UI based on the device that is accessing it, via its client SSL cert.
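Client-certificate gating of that kind can be done at a reverse proxy in front of the add-on; a hedged nginx sketch (all names, IPs, and paths are placeholders, and client-ca.pem is a private CA you would create to sign the allowed client certs):

```
server {
    listen 443 ssl;
    server_name z2m.mydomain.com;

    ssl_certificate         /etc/ssl/fullchain.pem;
    ssl_certificate_key     /etc/ssl/privkey.pem;

    # CA that signed the allowed client certificates
    ssl_client_certificate  /etc/ssl/client-ca.pem;
    ssl_verify_client       on;   # reject clients without a valid cert

    location / {
        proxy_pass http://192.168.0.10:8080;  # Zigbee2MQTT web UI
        proxy_set_header Host $host;
    }
}
```

Devices that should have access get a client certificate signed by that CA installed; everything else is rejected during the TLS handshake, before the request ever reaches the container.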
Like many, I struggled with this. My solution doesn't require copying or Samba-sharing the Let's Encrypt SSL cert file, but does require a router running Tomato:
Run nginx-proxy-manager in Docker to obtain the SSL certificate for your [sub]domain and proxy HA. Configure HA to trust the reverse proxy.
To allow for internal-network HTTPS access, add a rule to the router's dnsmasq config, directing the external domain name to the Docker server where nginx-proxy-manager runs, e.g.:
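The "trust the reverse proxy" step lives in HA's configuration.yaml; a sketch, assuming the proxy host is 192.168.0.20 (a placeholder):

```
http:
  use_x_forwarded_for: true
  trusted_proxies:
    - 192.168.0.20   # host running nginx-proxy-manager
```

Without this, HA rejects requests arriving through the proxy.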
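The dnsmasq rule referred to would look something like this (domain and server IP are placeholders):

```
address=/ha.mydomain.com/192.168.0.20
```

Every client that uses the router for DNS then resolves the external name to the internal Docker host, so the same HTTPS URL works inside the LAN.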
So, both internal and external HTTPS requests to the domain use the reverse proxy.
Regular HTTP, non-proxied access to HA also remains available locally.
Optional: for extra security, to avoid exposing 443 externally, forward a different external port to 443 on your server running nginx-proxy-manager. If doing this, also forward the port within your Docker server, e.g.
sudo iptables -t nat -A PREROUTING -p tcp --dport xxxx -j REDIRECT --to-port 443
so that you can use the same URL both internally and externally.
If we use a registered (purchased) domain name, does that need to be purely for the use of the certificate? I’ve had a domain for years that’s currently used just to host a blog which I very rarely update. Would I need to change the domain records to point it to my local machine? (I don’t want external access for my Home Assistant server - or anything on my home network with the exception of a couple of services - and am puzzled by the idea of using an Internet domain name for a local certificate.)
This is what I did. I purchased a domain because you can have only up to four subdomains on a free domain. As I use a Docker installation, I wanted a subdomain for each of my Docker containers; that's why I purchased a domain. It's €20 yearly, so it's not that expensive. I spend way more on beer weekly.
Now the only thing you have to do is get an SSL cert for your domain and subdomains. It's easy to do using nginx. I use subdomains for my containers, e.g. adguard.mydomain.com.
To resolve my domain and subdomains to my HA's local IP, I use AdGuard for DNS rewrites.
There is one shortcoming in my setup: I can't access my Docker containers (e.g. Z2M) remotely. I'm using Nabu Casa for remote access.
As people said on the forum, for remote access your subdomain needs to be accessible over the internet, and mine isn't, and I don't want to make it accessible. Maybe there is another way, but no one has figured it out.
As I read this some months later, I now see exactly what you are attempting to accomplish.
The solution is to follow the steps in this video - just ignore the part about making a firewall rule to let traffic in from the internet.
This video will register a domain name for you (not really that important since you don't want external access, but it WILL get you a valid FQDN for the cert to be assigned to), then it will show you how to configure the certificate part and auto-renew that cert for you, and then it will show you how to configure internal DNS for proper name resolution so you don't get certificate-mismatch errors.
Same problem (which I expect should have a common solution): "How do you run Assist (which requires HTTPS) with an NGINX setup (which doesn't use HTTPS for local access)?"
This is the closest I’ve seen to a solution but am still stumped as to what steps to take.
I also have a Synology, but also run a Pi-hole DNS server. It seems you're "fooling" HA into thinking it's accessing a secure HTTPS URL by rewriting it locally, is that right? Can this be done on a Pi-hole?