Using Cloudflare Access and Cloudflare Argo - Thoughts?

Curious what others think about this setup in terms of security and usability. I have been testing it out and it works great so far. I’m not necessarily committed to it long term since there is a cost, so really just experimenting.

To start, my setup is on a NUC. I run Proxmox and then have an Ubuntu Server 16.04 VM that runs Docker for home automation stuff. Then I have the typical containers: Home Assistant, Mosquitto, Node-RED, etc. The connection was previously Cloudflare -> NGINX (with SSL certs and OAuth2 Proxy doing Google account verification) -> Home Assistant (unencrypted, but on my local VLAN'ed network so I don't care about that; NGINX did the encryption for me).
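For context, here is a minimal sketch of what that old NGINX hop looked like. The hostname, upstream address, port, and cert paths are assumptions for illustration, and the OAuth2 Proxy auth_request wiring is left out:

```nginx
server {
    listen 443 ssl;
    server_name hass.mydomain.com;                                               # assumed hostname

    ssl_certificate     /etc/letsencrypt/live/hass.mydomain.com/fullchain.pem;   # assumed cert path
    ssl_certificate_key /etc/letsencrypt/live/hass.mydomain.com/privkey.pem;     # assumed key path

    location / {
        proxy_pass http://192.168.1.10:8123;   # Home Assistant's local address (assumed)
        proxy_set_header Host $host;

        # WebSocket upgrade headers the Home Assistant frontend needs
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```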

The new setup I am trying goes all in with Cloudflare.

On the Cloudflare side, I set up Cloudflare Access, which can authenticate a user through Google or via email BEFORE they can ever attempt to connect to my Home Assistant setup. This is free for up to 5 users per month. Very simple and easy to use. I also created a bypass for API calls (such as GPS Logger). It only bypasses very specific URL patterns, which is necessary since those calls can't complete the interactive login.

Then I set up an Argo Tunnel on my Ubuntu Server VM. It's basically a local service (cloudflared) that creates an outbound tunnel to Cloudflare. You point it at a specific address (Home Assistant's local address) and it creates a CNAME entry on Cloudflare for that subdomain, so hass.mydomain.com routes through the tunnel to the local Home Assistant instance. Note that this lets me close ALL NGINX ports to the outside world, since the connection is initiated from my network. It also only allows traffic to that specific internal address, Home Assistant in this case.
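For reference, the tunnel setup is roughly this (a sketch of the classic cloudflared/Argo Tunnel flow; the hostname and Home Assistant port are examples, not my actual values):

```bash
# Authorize cloudflared against your Cloudflare zone (opens a browser login and saves a cert)
cloudflared login

# Start the tunnel: Cloudflare creates the CNAME for the hostname and proxies
# requests through the outbound tunnel to the local Home Assistant instance
cloudflared tunnel --hostname hass.mydomain.com --url http://localhost:8123
```

Installing it as a service (`cloudflared service install`) keeps the tunnel running across reboots.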

Home Assistant itself then has a separate username/password for each user. TOTP is not enabled.

As an extra step, I enabled a firewall rule on Cloudflare to block all non-US traffic.
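The country block is just a single firewall rule in the Cloudflare dashboard, roughly this expression with the action set to Block (field name per Cloudflare's firewall rules language, as far as I recall):

```
(ip.geoip.country ne "US")
```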

So the whole setup is something like this:

Cloudflare Access (offsite, authenticated through Google, and connection must be US based) ->
Cloudflare Argo Tunnel ->
Home Assistant instance with username/password

Again, I have NO ports open for Hass anymore (I still have one open for OpenVPN). The connection is also a lot quicker when out and about. I have tried various scenarios to break Argo and it always comes back fairly quickly.

The biggest downside is the cost: it's about $5/month for Argo. It's also entirely dependent on Cloudflare. On the other hand, I can create up to 1,000 tunnels for various services if I really wanted to and never open a single port for any of them. And again, users must be authenticated through Google before a request ever hits my local server. If there is ever a problem with Cloudflare, I can always fall back to a VPN connection.

I really like the setup so far, and everything works, including Google Assistant, outside API calls for things like GPS Logger, etc. I'm curious how Home Assistant Cloud access will compare to this.

So, what am I overlooking? Yes, I could do this all in-house for free. Yes, I am relying on Cloudflare as the middleman. But in terms of security, I think it's really good. Setup is pretty easy as well.


This looks really interesting. Going to have a look at that.

I have tried setting up NGINX and Caddy for access to HA and my Synology NAS, but for some reason that doesn't work for me. I have spent many hours on it, tried a lot of configs, and read many instructions, but I can't get access working (HA keeps spinning; sometimes I can log in but then the HA page never loads, and access to Synology doesn't work either). Directly forwarding port 443 to my HA instance doesn't work either, so apparently my router (DD-WRT) is blocking something.

Let’s see if Argo will work for me.

What is your experience with traffic? Does it stay within the first free GB per month?

I don't see how I could ever use that much for just Home Assistant. If you go over, it's only $0.10/GB anyway. I can't be sure what I have used for Home Assistant in the two weeks I've been using Argo because I was testing a lot of other services, including Plex. I don't know that I will keep Plex on there; I just tested it for the heck of it. It actually works really well, but it's probably cost-prohibitive if you stream a lot remotely.

In your port forwarding on your DD-WRT router, make sure your source net is blank, not 0.0.0.0. That drove me nuts for two days.

Thanks. My source net is blank however :frowning_face:

The strange thing is that I am using Geofency and an API call via NGINX works fine. However, when I try to open HA or my Synology page, it somehow doesn't work.
I'm guessing it's JS-related, as I sometimes get related errors. I'll dive into it once more in the coming weeks and maybe try a newer version of the DD-WRT firmware.

If it's routing your Geofency calls, then it's probably not the router, unless you don't have the right ports open, but it sounds like you do.

If you like, I'll try to help you. When you are ready to dive into it again, open a different topic rather than hijacking this one, mention me, and I'll see if we can work something out.


Do you have an example of how you set up the exception for Cloudflare Access, by any chance?

I set up Access maybe two months ago, but since then I can't send snapshots to iOS notifications or trigger NFC tags …

I know that Access has service tokens, but I don't think that's implemented with HAOS.

I was looking to use Client Certificates too, but never figured out how.

These are the routes which I set to bypass the Cloudflare Access auth check: /api, /frontend_es5, /frontend_latest, /local.

I haven't yet figured out how to do it more cleanly :man_shrugging:, so each route is a separate "app" in CF Access:

The first entry in the CF Teams dashboard screenshot above applies auth; the rest are the bypass exceptions. I haven't touched this in a long time and things seem to be working great.
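In other words, the application list ends up looking roughly like this (the hostname is an example and the labels are just illustrative; each "app" is defined in the Access/Teams dashboard with either an Allow or a Bypass policy):

```
hass.mydomain.com                   -> Allow  (Google-authenticated emails only)
hass.mydomain.com/api               -> Bypass (everyone)
hass.mydomain.com/frontend_es5      -> Bypass (everyone)
hass.mydomain.com/frontend_latest   -> Bypass (everyone)
hass.mydomain.com/local             -> Bypass (everyone)
```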