Hello, I installed a local Llama 3.1 8B.
I can't go any bigger than that since I'm running a Quadro RTX 4000.
Today I tried to connect HA to that machine, and it asked me which language model I want to use.
I'm confused: I already have the language model installed and I can use it with Open WebUI.
Anyway, I'm open to using another one, but no parameter counts are visible, so it's hard for me to choose the best LLM.
There is Llama 3 and Llama 3.2, but with how many parameters?
If you check the library, Llama 3 has 8B and 70B options, and 3.2 has 1B and 3B.
On my VM I have Llama 3.1 8B installed, and it runs way better than whatever Home Assistant installed (I think), so yeah, can someone explain?
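In case it helps anyone answering: here's a rough sketch of how I check which models (and parameter sizes) the Ollama server actually has, assuming HA is pointed at the same Ollama instance that Open WebUI uses, on the default port 11434.

```python
# Sketch: list models known to the local Ollama server and their parameter sizes.
# Assumes Ollama is running on this host at its default port 11434.
import json
from urllib.request import urlopen

with urlopen("http://localhost:11434/api/tags") as resp:
    data = json.load(resp)

for model in data.get("models", []):
    details = model.get("details", {})
    print(
        model.get("name"),                      # e.g. "llama3.1:8b"
        details.get("parameter_size", "?"),     # e.g. "8.0B"
        details.get("quantization_level", "?")  # e.g. "Q4_K_M"
    )
```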
Thanks