Time to upgrade the installation instructions for Core on Raspberry Pi?

I set up a new installation from scratch, which I do maybe twice a year, since I figure so much has happened on the setup in that time that it’s good to do something clean. Only this time it wasn’t that easy… I took the latest Bullseye image, downloaded it and made it ready for headless installation (I’m old fashioned, so I don’t do the Pi flasher, I use Balena Etcher and then on the boot partition I put an empty file named SSH and one named userconf.txt with the user name and the hash of the password inside, then I can SSH in, set up VNC and do the rest from VNC, which is what I always work with). I believe that the results should be as standard as an installation can be, I only activated VNC, did a full upgrade and changed the stuff I want in the user interface (removing the background image and making windows white with black text and Monospace Regular 13).

Then I went to the installation instructions, since I usually forget the sequence from time to time. I guess it’s dementia in my advanced age of 55… But this time it stopped on building cryptography. A bit of searching the forum, and some trial and error (there are several things billed as “solutions” in here), got me to part of the actual solution: First, outside the virtual environment, install the package rustc:

sudo apt install rustc

It still didn’t work, but in another thread I found this, done as the Hass user, inside the virtual environment:

pip install sqlalchemy

pip install fnvhash

And that got it up and running. Maybe those steps should be added to the dependencies on that page.

And another one. The same installation gave me an error from dependencies that didn’t work with NumPy. This seems to fix that:

sudo apt-get install libatlas-base-dev

Yeah, that page is a little out of date. They pulled in some new dependencies. In particular, Rust and Cargo are now required to compile some packages. But there are edit and feedback buttons at the bottom of that page, so you could add what you think is missing :slightly_smiling_face:

As a sidenote, installing the latest HA core on a Pi Zero was a major pain in the butt. Orjson depends on maturin, which did not have wheels for that platform. Compiling it from scratch took almost 15 hours, without any meaningful progress report while doing so (did it crash or hang? Oh, no, still compiling…)

I wouldn’t have had the patience to work with a Pi Zero! :rofl: But I’m guessing you have a reason for not using more power than necessary.

It’s my development setup. My production HA is not on a Pi Zero :wink: But I never update it anymore, so for testing my custom cards on the latest and greatest HA, I use the Zero. The orjson compiling was insane though. Never had anything like this in the past with HA.

Aha, I see. :+1:

This is just getting worse. Even after all that, I can’t upgrade to the later versions, because it can’t handle the Norwegian characters æøå and gives me a decode error:

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe5 in position 2271

It happens with different characters as well, even the built-in degrees character. I have tried updating to Python 3.10 and just about everything else I can imagine, but nothing helps. I opened an issue three days ago on the core GitHub, but it seems like I’m the only one writing in the issue… Weird, because this should affect everybody who uses a non-standard code page, I think.
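For what it’s worth, 0xe5 is “å” (and 0xb0 the degree sign) in the legacy ISO-8859-1 (Latin-1) encoding, so the error is consistent with a file that was saved as Latin-1 being read as UTF-8. A minimal reproduction in plain Python:

```python
# 0xe5 is "å" in ISO-8859-1, but it is not a valid byte sequence in
# UTF-8 - which is exactly the error HA reports.
data = "blåbær".encode("iso-8859-1")  # b'bl\xe5b\xe6r', as a Latin-1 editor saves it

try:
    data.decode("utf-8")
except UnicodeDecodeError as err:
    print(f"byte {data[err.start]:#x} at position {err.start}: {err.reason}")

decoded = data.decode("iso-8859-1")  # the right codec recovers the text
print(decoded)  # blåbær
```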

As for Debian try the following from the cli:
sudo dpkg-reconfigure locales

Scroll down and activate:
[*] nb_NO.UTF-8 UTF-8
[*] nb_NO ISO-8859-1


Reboot the host.

@Tamsy Thanks for the attempt, but that didn’t change anything. The second one was already activated. I tried deactivating it and activating only the UTF-8 one, but now all my CLI windows say:

bash: warning: setlocale: LC_ALL: cannot change locale (nb_NO)

So I guess that’s not a winner either. C.UTF-8 seems to be activated as well, but there is no place to remove that. I see something else funky here. When I check the Python version with python -V, I get “Python 3.10.8”. But in the log file for Hass I see:

Could not parse JSON content: /home/homeassistant/.homeassistant/.storage/core.entity_registry
Traceback (most recent call last):
  File "/srv/homeassistant/lib/python3.9/site-packages/homeassistant/util/json.py", line 39, in load_json
    return orjson.loads(fdesc.read())  # type: ignore[no-any-return]
  File "/usr/lib/python3.9/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb0 in position 1428: invalid start byte

So even with Python 3.10 installed and set up as standard, it seems to use 3.9. I think that is probably the thing here, because there was a change in August moving to Python 3.10, but it seems something went wrong with Core and code pages.

Did you recreate the venv using Python 3.10? The venv will keep using the Python version it was created with, even if you update it externally.
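A quick way to confirm which interpreter a venv actually uses, run from inside the activated venv:

```python
import sys

# A venv is pinned to the interpreter it was created with; installing a
# newer Python system-wide does not change an existing venv.
major, minor = sys.version_info[:2]
print(sys.executable)      # points into the venv, e.g. /srv/homeassistant/bin/python
print(f"{major}.{minor}")  # stays at the creation version until the venv is rebuilt
```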

When does that happen? During the update process? Where exactly?

I didn’t recreate it, I started again from an image taken before I installed. I did use alternative installation, can that be the problem? Here are the commands in sequence:

sudo apt-get update
sudo apt-get full-upgrade -y

PYTHON 3.10:

wget https://www.python.org/ftp/python/3.10.8/Python-3.10.8.tgz
sudo apt install build-essential zlib1g-dev libncurses5-dev libgdbm-dev libnss3-dev
sudo apt install git
tar -xzvf Python-3.10.8.tgz 
cd Python-3.10.8/
./configure --enable-optimizations
sudo make altinstall
/usr/local/bin/python3.10 -V
/usr/bin/python3 -V
sudo rm /usr/bin/python
sudo ln -s /usr/local/bin/python3.10 /usr/bin/python
python -V


sudo apt install mosquitto mosquitto-clients -y
sudo mousepad /etc/mosquitto/mosquitto.conf

listener 1883

allow_anonymous true


sudo systemctl status mosquitto

sudo mount -a
sudo dd if=/dev/mmcblk0 of=/home/pi/Sikkerhetskopier/Hytte-Pi-oppdatert-python-installert-mosquitto.img bs=512 count=31109120 status=progress


sudo apt-get install -y python3 python3-dev python3-venv python3-pip bluez libffi-dev libssl-dev libjpeg-dev zlib1g-dev autoconf build-essential libopenjp2-7 libtiff5 libturbojpeg0-dev tzdata rustc libatlas-base-dev
sudo useradd -rm homeassistant -G dialout,gpio,i2c
sudo mkdir /srv/homeassistant
sudo chown homeassistant:homeassistant /srv/homeassistant
sudo -u homeassistant -H -s


cd /srv/homeassistant
python3 -m venv .
source bin/activate
python3 -m pip install wheel
pip3 install homeassistant

As for where that happens, it’s on the second start of Hass after adding one or more integrations, built-in or custom.

Yes. When you create the venv, you need to make sure to specify the version of Python you want to use:

python3.10 -m venv .

Once you are in the venv, python3 should then alias to 3.10. Check it with python3 -V to be sure, after enabling the venv.

source bin/activate
python3 -V

Should say 3.10.x

That said, it doesn’t explain your Unicode problem, even if the venv is still on Python 3.9. HA Core 2022.11.3 is running fine for me here in a Python 3.9.2 venv. I’m using all kinds of locales from within HA for i18n testing, but I do use the US English locale on Raspberry Pi OS for maximum compatibility. Maybe that’s the reason.

You’re right, of course. How weird. When I’m there as the user homeassistant I see version 3.10.8, but when I create the venv it is created with 3.9.5. With your command it’s created with 3.10.8, but now there’s something else: I can’t install wheel. This is the error that I get:

Could not fetch URL https://pypi.org/simple/pip/: There was a problem confirming the ssl certificate: HTTPSConnectionPool(host='pypi.org', port=443): Max retries exceeded with url: /simple/pip/ (Caused by SSLError("Can't connect to HTTPS URL because the SSL module is not available.")) - skipping
Could not fetch URL https://www.piwheels.org/simple/pip/: There was a problem confirming the ssl certificate: HTTPSConnectionPool(host='www.piwheels.org', port=443): Max retries exceeded with url: /simple/pip/ (Caused by SSLError("Can't connect to HTTPS URL because the SSL module is not available.")) - skipping
WARNING: There was an error checking the latest version of pip.

I never got that with 3.9.x. I tried with trusted hosts:

pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org wheel

But that didn’t change anything.
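It wouldn’t, as far as I can tell: “SSL module is not available” usually means the interpreter itself was compiled without OpenSSL support, typically because libssl-dev wasn’t installed when ./configure ran, so no pip flag can work around it. A quick probe (my assumption about the cause, worth checking):

```python
# If this import fails, pip can never reach an HTTPS index with this
# interpreter, no matter what flags are passed.
try:
    import ssl
    has_ssl = True
    print("OK:", ssl.OPENSSL_VERSION)
except ImportError:
    has_ssl = False
    print("no ssl module - install libssl-dev and rebuild Python from source")
```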

I’ve tried lots of Stack Overflow and other solutions/suggestions, but I’m still stuck. :anguished: And I don’t find that problem anywhere here on the forum. Does that mean that nobody’s ever installed Core with Python 3.10? That sounds weird. I think I’ll try to do a full install, not altinstall, and see what happens then.

As a 200+ pound Viking with a beard, I’m of course partial to blunt force, which is what make install really is, I guess. And it seems like that worked for me again, at least partly. :rofl: No error messages this time, and the installation of Hass went as it should. But the setup after install stopped with this error message (not just for recorder, but for several of the integrations, like zeroconf):

Setup failed for recorder: Unable to import component: No module named '_sqlite3'

I think this is a change that’s made to force us old Core stalwarts over to Docker…

Edit: Yeah, looks more and more like it… Because even with Python 3.10 I still get:

homeassistant.exceptions.HomeAssistantError: 'utf-8' codec can't decode byte 0xe5 in position 2271: invalid continuation byte

So it seems now only Docker or Hassio (which is totally useless for me, I need full control over everything else on the Pi running Hass) will work with later versions.

Well, what do you think runs inside that Docker container, on both the Container and HAOS installs? HA Core. It’s just pre-sugarcoated with its perfect running environment for you (and additional layers of virtualization making it less efficient). There’s no reason you couldn’t run Core on its own, provided you have the correct execution environment. And that’s where the problems begin. Because the devs change that all the time (which isn’t too surprising, it’s pretty common to add dependencies in the development process), but they also don’t update the install instructions. So you’re often left guessing and researching yourself what kind of new dependencies you need to install.

The sqlite error is probably just because libsqlite3 or libsqlite3-dev is missing. Installing these packages with apt would quickly solve that. The UTF8 error is weird and looks like a bug in HA. It’s not uncommon for open source packages to have issues with Unicode and locales sadly. Personally I always run the base OS on US locale, which is the most compatible one, and then the frontend on my own locale. That avoids these kind of problems.
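The same kind of probe works for SQLite before reinstalling anything — the stdlib sqlite3 module wraps the C extension _sqlite3, which only gets built if the SQLite headers were present when Python was compiled:

```python
# "No module named '_sqlite3'" means Python was built without the SQLite
# headers (libsqlite3-dev on Debian); install them and rebuild to fix it.
try:
    import sqlite3
    sqlite_ok = True
    print("OK: SQLite", sqlite3.sqlite_version)
except ImportError:
    sqlite_ok = False
    print("no _sqlite3 - install libsqlite3-dev and rebuild Python")
```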

Yeah, I know it’s the environment, and it’s a PITA to get it working this time. Earlier changes haven’t been anywhere near as destructive as this. First the Rust requirement, and then the UTF-8 bug. Which may be considered a feature, since I don’t even get an answer on GitHub about it and only one other user seems to have the problem… If I run the OS on US, then I have a big problem with my automations, which very often contain Norwegian special characters for clarity.

But for now I’ll see where Docker takes me. Hassio (or is it called HAOS now?) is totally useless for me because it doesn’t have a GUI, and I do all my work in the GUI, with VNC. And I have standalone versions of ZWaveJSUI and Node-RED on the same Pi, with a bunch of custom Python scripts running in the back as well.

Agreed. This is mostly a documentation problem though. But that’s nothing new in HA, documentation has always been a major weak point for this project. The sqlite thing isn’t really new though, I had to manually install that package on earlier versions too. It depends what distro you’re running this on and whether the packages are already preinstalled or not.

Really? I mean, running on locale x will not make characters used by locale y non-functional. Linux uses almost exclusively Unicode in the backend now (at least it should). Technically you could use Chinese ideograms in your files while on a South American locale - it’s all just UTF-8 encoding. The problems mainly come from incorrect usage of (or missing) locale conversions in the client code (like HA) when doing symbol comparisons and similar. As long as your entity names don’t contain non-ASCII characters, you should be fine. Norwegian comments, quoted strings, etc., should be OK.

Unless they really messed up UTF8 encoding completely. Which is certainly possible, but I haven’t noticed that myself here.

Edit: looking closer at your error message, it doesn’t look like they fail to take UTF-8 into account. In fact, it looks like one of your files (an automation YAML, maybe?) contains an invalid UTF-8 byte sequence.
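To track down the offending file, a small helper (hypothetical, not part of HA) can scan the config directory for anything that isn’t valid UTF-8:

```python
from pathlib import Path

def find_non_utf8(config_dir):
    """Return (path, byte offset) for every file that fails UTF-8 decoding."""
    bad = []
    for path in Path(config_dir).rglob("*"):
        if not path.is_file():
            continue
        try:
            path.read_bytes().decode("utf-8")
        except UnicodeDecodeError as err:
            bad.append((path, err.start))
    return bad
```

Once the file is found, it can be opened with the encoding it was actually saved in (likely ISO-8859-1 here) and re-saved as UTF-8 — with a backup first, of course.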

Well, I finally got a reaction on GitHub: “Closed as not planned”. So they’re not going to do anything about the JSON thing. Oh, and the SQL problem wasn’t there on the exact same setup, with the same dependencies, on Python 3.9.x. So I guess the new instructions for Core on Raspberry Pi should be: “Don’t. Just don’t. At least not if you’re from anywhere other than an English-speaking country.” :rofl: The irony is that the guy closing it has an ø in his name…

Seems like I need to bite the Docker bullet. As long as I don’t have to go Hassio, I guess I can still keep my Pi’s mostly the way I want them.

Nonsense. I’m from a non-English speaking country and HA Core works just fine, on multiple instances. You just have to do a little more work to get it running. It’s labelled as an advanced install method, after all. Installing a few dependencies and building some packages from source is a bit annoying, agreed, especially if it’s not clearly documented. But it’s hardly rocket science.