You need to log in at least once and check “Keep me logged in”, otherwise no user profile will be set. If you are being prompted to log in every time, the cookie that saves the user info is likely being blocked. You can test this with Chrome in incognito mode and see that you are prompted to log in every time.
You need to be aware that what you want to do is insecure, so I expect that you totally trust anyone connected to your network.
If you set up the trusted_networks auth provider and the instance has only one user, you can use the allow_bypass_login option to log that user in by default when they access Home Assistant from a specific IP/network. If you have multiple users, check the trusted_users option in the documentation to designate which users will automatically log in depending on the IP/network. For example, if your Home Assistant instance has only one user, the next example will log that user in automatically from any IP (0.0.0.0/0 represents all possible addresses in the IPv4 address space).
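A minimal sketch of what that could look like in configuration.yaml (this follows the trusted_networks auth provider documentation; adapt to your own setup):

```yaml
homeassistant:
  auth_providers:
    - type: trusted_networks
      trusted_networks:
        - 0.0.0.0/0          # all IPv4 addresses
      allow_bypass_login: true
```

With a single user, opening Home Assistant from any address then goes straight to the dashboard without showing the login screen.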
It’s normal that the first time you open the browser and connect to your HA you need to provide credentials.
If you don’t close the browser, your session stays active (you stay logged in to HA), so there is no need to fill in the user/password again.
Everybody here is missing the point. Sorry, maybe I was not clear.
What I want:
A dedicated 7-inch touch screen interface to the lights in the room
No need to use a keyboard/mouse, because these are very hard to use on such a screen
The ability to set it up with a script.
Finding “long-lived tokens”, I thought I had what I needed: I could have a script on the client that starts the browser in kiosk mode and passes the token to bypass the login screen, which, as I say, is useless in this context.
This means the client can be shut down, restarted, or remade if corrupted, any number of times.
Most of this can be achieved by using a keyboard temporarily to enter credentials, after which “remember me” works. It is my current solution.
But:
It is tacky. The login screen has no use in this scenario so even using it once offends me
To rebuild the system this fiddling needs to be repeated (it requires I use a magnifying glass)
Security? Sorry, it is no less secure than putting the credentials in the browser. That seems obvious to me
AFAIK it was never possible to log in via long-lived access tokens. The only way to bypass the login is trusted networks, as mentioned before.
I would however suggest at least setting the IP network in this setting to your subnet (e.g. 192.168.0.0/24), or better, the IP of the kiosk with /32 instead of /24. That way you are still forced to log in from other devices or from another network, which is especially helpful if you access Home Assistant from the internet as well.
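A sketch of that tighter setup (the kiosk address 192.168.0.50 is a placeholder; keeping the homeassistant provider preserves normal login for all other devices):

```yaml
homeassistant:
  auth_providers:
    - type: trusted_networks
      trusted_networks:
        - 192.168.0.50/32    # only the kiosk bypasses the login
      allow_bypass_login: true
    - type: homeassistant    # everyone else logs in normally
```

Combine this with a static DHCP lease for the kiosk so the /32 address cannot drift to another device.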
On the other hand, I can also recommend a multimedia keyboard for when you rebuild such systems here and there. There is one from Logi that integrates a touchpad as a mouse. This is way more secure than using trusted networks.
Long-lived tokens are used as authentication to call the Home Assistant APIs, not to log in to Home Assistant. What is the problem with using trusted networks? If you feel safe exposing an auth token in a script on that device, there should be no issue authorizing the IP of the device to bypass the login screen and log in automatically (assuming you manually assign that IP to the device in the DHCP configuration of your router).
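To illustrate what long-lived tokens are actually for, here is a minimal sketch of calling the Home Assistant REST API with one (the host and token values are placeholders; the token authenticates the API request, it does not create a browser login session):

```python
import json
import urllib.request


def build_request(base_url: str, token: str, path: str = "/api/states"):
    """Build an authenticated request to the Home Assistant REST API.

    The long-lived token goes in the Authorization header as a Bearer token.
    """
    req = urllib.request.Request(base_url + path)
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req


def fetch_states(base_url: str, token: str):
    """Return the list of entity states as parsed JSON."""
    with urllib.request.urlopen(build_request(base_url, token)) as resp:
        return json.loads(resp.read())


# Hypothetical usage (replace host and token with your own):
# states = fetch_states("http://homeassistant.local:8123", "eyJhbGciOi...")
```

This is the intended use of such a token: programmatic API access, not skipping the frontend login screen.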
It is not security that is tacky. What is tacky is using the login screen, on first run only, to enter data that is known at installation time, simply because I do not know how to enter it at installation.
Security-wise, IMO it is no more secure to have the browser hold the credentials than a script.
Why not? If the client is compromised (it is pinned to the wall), access to HA via the profile the client uses is achieved either way.
That is precisely a security measure, so yes, you find that security measure tacky.
That also has a level of insecurity, but bypassing the login is more insecure, because that user will be able to log in even if you delete all the current tokens or change their password.
Because you are opening a security hole in the Home Assistant instance, this is not about the device client but about your Home Assistant instance.
The token is (or should be) tied to the same identity in HA as the password, and both are stored on the client. How is having a token stored on a client less secure than a login/password pair?
You keep saying “insecure” like a magic incantation. That does not make it so.
It looks like I could write a bespoke script to use the API directly (I have moved on from this now, so I will not); how is that less secure than a bespoke script that launches a web client?
Having credentials stored on a client, in whatever form, has the same issues no matter how those credentials are stored. That is the issue on the face of it, and nothing said here changes that.
When you leave the session open, you are not “storing” the password on the client (you are just leaving the session open, no more)
When you leave the session open, the password cannot be read locally on the device
When you leave the session open, nobody can read the password remotely or steal it
Sessions can be killed remotely by the admin without accessing the device
Passwords can be changed remotely by the admin without accessing the device
None of the above can be achieved with a token exposed in a local script.
I am telling you what is insecure because it is one of the fields in which I have some expertise. It is not insecure because of magic; it is insecure because it is. But it is up to you to listen or not.
Both are insecure if they are stored locally. Secrets should be kept hidden (for example, stored as environment variables in CI pipelines), not stored locally where anyone with access to the device can read them without the proper rights.
That is true, you should never store secrets in the clients; I don’t recommend that (and nobody has recommended it in this thread). It seems you are confusing leaving a session open (which is also insecure, but much less so) with writing secrets to a file on a device (something that can be copied, stolen, distributed, and used later from another device without you even noticing).