Connecting to HA locally using HTTPS

It is perfectly possible to use a real SSL certificate locally only.
What I used to do:

  • add my registered domain name to my router's DNS service
  • use a DHCP reservation for HA, so it keeps a fixed local IP
  • run Apache on a W10 machine
  • run certbot (Let's Encrypt) on that W10 machine regularly to generate the SSL certificate
  • copy the certificate to HA.

In this way the domain name resolves correctly to HA's local IP.
No hassle, no nginx, no Nabu Casa, no ports open to HA (only port 80 for Apache, as that is required for certbot's HTTP challenge to work).
You could stop Apache/close port 80 after that, but I ran certbot once a week, automated, as a Let's Encrypt certificate is only valid for 90 days; I wanted it to renew automatically :wink:
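The weekly certbot run described above can be sketched roughly like this (domain, e-mail, web root and schedule are placeholders, not the poster's actual setup; on W10 the cron line would become a Task Scheduler job, and the copy script is hypothetical):

```shell
# Hypothetical example of the HTTP-01 flow: certbot answers the challenge
# through the web root that Apache serves on port 80.
certbot certonly --webroot -w /var/www/html \
  -d ha.example.com \
  -m admin@example.com --agree-tos --non-interactive

# Renew on a weekly schedule and copy the result to HA afterwards
# (cron syntax shown for illustration only; copy-certs-to-ha.sh is
# a placeholder for whatever transfers fullchain.pem/privkey.pem):
# 0 3 * * 1  certbot renew --quiet --deploy-hook "/usr/local/bin/copy-certs-to-ha.sh"
```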

Sounds awesome, finally a doable workaround and finally working voice. To me it felt like a train running and running while no one was truly interested in offline integration. Thank you.

A few questions:
The registered domain name, is that the local domain name?
Where do I copy the certificate into HA?

I had not read this post for a while, because no working solution had been proposed here until, I hope, now. But now that I am here I would like to react to Nid01, because I feel a lot like Nid01 does.
I have used HA for a long time, more than you can see in my profile; I have probably used it since version 0.14-something, right after NoDo.
What started out as people helping each other in this forum now often turns into derogatory remarks: pointing out that it is already in the forum, or "search better in Google", "do your search".
The fun of this forum used to be that we were all explorers, one more gifted than the other; now you often have to be a crack not to get looked down upon. This forum has become soooo huge that finding stuff is hard, and here it comes, as Nid01 pointed out so well: if you do not know the search term or phrase, especially as a non-native English speaker, you are lost.
And asking here, more often than it used to, gets met with "do your searching, do not trouble us gods who know all". Which I think is not how it should be: help, or do not react.


There are several ways to do this. A self-signed certificate is one way; however, adding each cert to all your devices can be painful.

Another option is to buy a domain name plus a Cloudflare account (better to get the domain from them too); with this you have endless options. A Cloudflare Zero Trust tunnel is the easiest way.

Next is to throw in the reverse proxy/NPM caveat: you will need a public IP and an open port at home, or you need to use the Cloudflare API to keep your public IP updated (you would need to find the script for that).

DuckDNS is something for you to try (I believe there are some guide articles around here somewhere).

This is possible; I have been using HTTPS on my local network for a few years. I bought a domain because I could have only four subdomains on a free domain.
I set up an nginx reverse proxy and got my Let's Encrypt certs for the domain and all subdomains. My domain is my HA instance and the subdomains are my other docker containers.
I use AdGuard to do DNS rewrites, resolving the FQDNs to my server IP.
And this works with no problems; the certs are renewed automatically.

There is just one problem I didn't manage to solve: I can't open the subdomains through Nabu Casa cloud. As other people wrote, to use an iframe from outside your network the subdomain has to be reachable over the net, and mine aren't.
I have a docker installation, but this should work with any other type of HA installation. I can provide more info when I get home from work.
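For anyone reproducing this setup, the per-subdomain nginx part is a standard TLS-terminating reverse-proxy server block. A minimal sketch (hostname, cert paths, IP and port are placeholders, not the poster's actual config):

```nginx
# Hypothetical server block: terminate TLS for one subdomain, proxy to HA.
server {
    listen 443 ssl;
    server_name ha.example.com;

    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://192.168.1.10:8123;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        # HA's frontend needs WebSocket upgrades:
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

Each additional container gets its own server block with a different `server_name` and `proxy_pass` target.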

Hello jayjay,

you might be right. But it is also true that A LOT of people who "EXPECT" help from volunteers do not provide even the slightest information about what the issue is, where it occurs, whether it is reproducible, how it is reproducible, and other necessary details. The volunteers help in their free time and literally do not get paid for that job. You can check any tech forum and you will find this behavior.
Also, many people literally do not understand how a forum works. It is not just about language barriers either, as I have seen this since the beginning of my times on BBSes and the internet. Yes, also in my own language. And that's since the 14,400-baud times, so for quite some time.
So, after all, the expectation of a high percentage of people is to get pre-processed data that fits their problem exactly and solves it.

Coming back to the issue of not understanding the language or not knowing what to search for: I gave one example directly, even with the precise search term. There are thousands of tutorials just for these specific terms. You can even choose your own language, or extend the search with the things you need. In this regard: "Home Assistant +nginx +reverse proxy".

Also, none of the so-called gods and cracks is looking down on anyone. I would rather challenge myself and ask whether I asked the right question about my problem, instead of looking for the fault in someone else.
But I also think that with the internet, its AIs, Googles and other services, people are getting lazy and have forgotten how to describe their issue and what they want to achieve in a good way. Especially with the attitude of getting everything for free, precisely fitted to their problem, without describing the problem. Nowadays it isn't even necessary to save data in your brain anymore, because if you don't know something, you google it.

Give a Man a Fish, and You Feed Him for a Day. Teach a Man To Fish, and You Feed Him for a Lifetime

And about hiding my message… Well… “Computer says no” :poop:


No it isn't, but I do use it locally, as it happens to be my local domain too (according to my router's DNS).

I used to use the Samba share (\\config\ssl, if I am not mistaken)

I recently moved to another domain name provider (TransIP), as the old one didn't support the DNS challenge, only the HTTP challenge.
Turned out it was a little cheaper too :grin:

With that I was able to move from certbot (running on W10) to the Let's Encrypt add-on, so there is no need to copy the certificate anymore.
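For anyone wanting to replicate this, the add-on's DNS-challenge configuration looks roughly like the sketch below. The domain, e-mail and especially the provider plugin name are assumptions from memory, not my actual options; check the Let's Encrypt add-on documentation for the exact keys your DNS provider needs:

```yaml
# Hypothetical Let's Encrypt add-on options (Settings → Add-ons →
# Let's Encrypt → Configuration). With the DNS challenge no inbound
# port needs to be open at all.
domains:
  - example.com
  - "*.example.com"
email: you@example.com
challenge: dns
dns:
  provider: dns-transip   # assumed plugin name; verify in the add-on docs
```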

And I now realize it doesn't need Apache either (though I do have it running anyway for other purposes).

So basically, you just need to get a domain name and assign it to your router's DNS (and add a DHCP reservation).

I too am looking for a solution that is as simple as possible, but so far I have not been successful. In my case, DNS is resolved in the local network via a pi-hole, here I can also assign local addresses (homeassistant.local > 192.x.x.x). Would it be possible to use a functioning OpenSSL certificate that contains, for example, homeassistant.local as the address? I did not succeed with the local IP address, which seems logical to me after reading the articles.
Since I don’t use many clients, I wouldn’t have any problems manually installing certificates there if necessary.
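Generating such a certificate is at least straightforward with OpenSSL (1.1.1 or newer for `-addext`); the remaining, harder part is importing it as trusted on each client. A minimal sketch, with a placeholder IP:

```shell
# Hypothetical example: self-signed cert for homeassistant.local plus a
# placeholder local IP, both as subjectAltName entries (modern browsers
# ignore the CN field and only check SANs).
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout key.pem -out cert.pem \
  -subj "/CN=homeassistant.local" \
  -addext "subjectAltName=DNS:homeassistant.local,IP:192.168.1.50"

# Inspect the resulting SANs:
openssl x509 -in cert.pem -noout -ext subjectAltName
```

Point HA's `ssl_certificate`/`ssl_key` options at the two files, then install `cert.pem` into each client's trust store.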

Surely the issue is that even those of us who have a basic knowledge of networking, browsers and Linux find the whole concept of security certificates rather mind-boggling. Many questions arise such as (a) Why do we need a security certificate to send a command from one device to another on our local network, which is meant to be the whole point of Home Assistant? (b) What is that certificate actually doing? (c) Who or what checks and validates the certificate? (d) What are the ways in which the certificate can fail, and does it (as some posts have implied) need to be renewed or updated regularly?

Home Assistant is potentially a great product but it can also create a lot of stress for something which, after all, is meant to make our lives easier.

Edit: I too (having got frustrated with Google Home’s deteriorating speed and reliability) am slowly trying to make Home Assistant useful, and am just reaching the hurdle of voice control via web browsers.


You don't need an SSL cert for a local network; when you look at it, SSL certs are not really meant to be used on localhosts. But you can do it. The problem I'm facing is using SSL client certs for accessing different containers, aka add-ons. That would be a great security feature IMHO. For example, you could grant access to, say, the Zigbee2MQTT web UI based on the client SSL cert of the device accessing it.
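Client-certificate gating of this kind can in fact be done at a reverse proxy in front of the add-on, rather than in HA itself. A rough nginx sketch (all hostnames, paths, IPs and ports are hypothetical):

```nginx
# Hypothetical mTLS gate: require a client certificate before proxying
# to an add-on UI such as Zigbee2MQTT. Only devices holding a cert
# signed by your private CA get through.
server {
    listen 443 ssl;
    server_name z2m.example.com;

    ssl_certificate         /etc/ssl/server/fullchain.pem;
    ssl_certificate_key     /etc/ssl/server/privkey.pem;
    ssl_client_certificate  /etc/ssl/clients/my-ca.pem;  # CA that signed the client certs
    ssl_verify_client       on;   # reject connections without a valid client cert

    location / {
        proxy_pass http://192.168.1.10:8099;  # placeholder add-on port
    }
}
```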

Like many, I struggled with this. My solution which doesn’t require copying or samba sharing of the Let’s Encrypt SSL cert file, but does require a router running Tomato:

  • Run nginx-proxy-manager in docker to obtain the SSL certificate for your [sub]domain and proxy HA. Config HA to trust the reverse proxy.

  • To allow for internal network HTTPS access, add a rule to the router’s DNSMasq config, directing the external domain name to the docker server where nginx-proxy-manager runs. e.g.:

  • So, both internal and external HTTPS requests to the domain use the reverse proxy.

  • Regular HTTP, non-proxied access to HA also remains available locally.

  • Optional: For extra security, to avoid exposing 443 externally, forward another port to 443 for the server running nginx-proxy-manager. If doing this, also forward the port within your docker server, e.g.
    sudo iptables -t nat -A PREROUTING -p tcp --dport xxxx -j REDIRECT --to-port 443
    so that you can use the same URL both internally and externally.
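To make the DNSMasq step and the "trust the reverse proxy" step concrete, the rule and the matching HA setting would look roughly like this (domain and IPs are placeholders, not the poster's values):

```conf
# Tomato: Advanced → DHCP/DNS → Dnsmasq custom configuration.
# Answer queries for the external domain with the LAN IP of the
# docker host running nginx-proxy-manager:
address=/ha.example.com/192.168.1.20
```

```yaml
# configuration.yaml on HA, so it accepts requests forwarded by the proxy:
http:
  use_x_forwarded_for: true
  trusted_proxies:
    - 192.168.1.20
```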

I have basically the same setup as you do, but I use AdGuard to rewrite DNS requests for the domain and subdomains, and my (sub)domains are not accessible over the net. I use it on the local network only.

If we use a registered (purchased) domain name, does that need to be purely for the use of the certificate? I’ve had a domain for years that’s currently used just to host a blog which I very rarely update. Would I need to change the domain records to point it to my local machine? (I don’t want external access for my Home Assistant server - or anything on my home network with the exception of a couple of services - and am puzzled by the idea of using an Internet domain name for a local certificate.)



This is what I did. I purchased a domain because you can have only up to four subdomains on a free domain. As I use a docker installation I wanted a subdomain for each of my docker containers; that's why I purchased a domain. It's €20 yearly, so it's not that expensive; I spend way more on beer weekly.
Now the only thing you have to do is get an SSL cert for your domain and subdomains. It's easy to do with nginx. I use a subdomain for each of my containers.
To resolve my domain and subdomains to my HA local IP I use AdGuard for DNS rewrites.
There is one shortcoming in my setup: I can't access my docker containers (such as Z2M) remotely. I'm using Nabu Casa for remote access.
As people have said on this forum, for remote access your subdomain needs to be accessible over the net; mine isn't, and I don't want to make it accessible. Maybe there is another way, but no one has figured it out.
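The AdGuard rewrite itself is a one-liner per name. It can be added in the UI (Filters → DNS rewrites) or, if you manage AdGuard Home's config file directly, roughly like this (the section key is from my reading of AdGuardHome.yaml and may differ between versions; domain and IP are placeholders):

```yaml
# Hypothetical AdGuardHome.yaml fragment: answer the domain and all of
# its subdomains with the local server IP instead of any public record.
filtering:
  rewrites:
    - domain: example.com
      answer: 192.168.1.10
    - domain: "*.example.com"
      answer: 192.168.1.10
```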

Sharing my solution for this, if anyone else can benefit from it.

I was running DuckDNS for external access before Nabu Casa came along, but I moved to Nabu Casa because I wanted to support the project and close a port on my firewall.

When attempting to put a button to call assist on my dashboard, I quickly realized that my local connection wasn’t going to work using my IP on mobile, and discovered this thread.

Here’s my solution:

  1. I restored my DuckDNS configuration and it’s maintaining a cert for my home assistant server. I don’t have any ports open on my firewall for Home Assistant.
  2. I happen to be running a NextDNS server on my Synology NAS and it supports rewrites, so I used that to rewrite my URL to my local IP.
  3. On mobile, I configured my Internal URL to my DuckDNS domain, which NextDNS rewrites to my local IP.

With this, the app is happy and WebSocket and Local Push are connected and available, and Assist works great.

Hope this helps someone with a similar config.


Can you write a manual for how to set this up exactly?

I run Home Assistant on an Intel NUC.
I have a Synology.
I need HTTPS locally because right now I can't talk to my doorbell; the microphone only works over HTTPS.


Most routers can do this, as most of them come with their own DNS server :wink:

I happen to use and like NextDNS for its other features as well, so this was just a bonus.


As I read this some months later, I now see exactly what is being attempted.

The solution is to follow the steps in this video - just ignore the part about making a firewall rule to let traffic in from the internet.

The video walks you through registering a domain name (not really that important since you don't want external access, but it WILL get you a valid FQDN for the cert to be assigned to), then shows you how to configure the certificate part, with auto-renewal of that cert, and then shows you how to configure internal DNS for proper name resolution so you don't get certificate-mismatch errors.

Happy TTSing!


Same problem (which I expect should have a common solution): how to run Assist (which requires HTTPS) with an NGINX setup (which doesn't use HTTPS for local access)?
This is the closest I've seen to a solution, but I am still stumped as to what steps to take.
I also have a Synology, but I run a Pi-hole DNS server. It seems you're "fooling" HA into thinking it's accessing a secure HTTPS URL by rewriting the DNS locally, is that right? Can this be done on a Pi-hole?