Has anyone figured out a way to add Grok as an LLM conversational agent for Home Assistant?
Reading this statement, I am wondering if an existing integration can interact with Grok.
"Our REST API is fully compatible with the ones offered by OpenAI and Anthropic. This simplifies migration. For example, if you’re currently using the OpenAI Python SDK, you can start using Grok by simply changing the base_url to https://api.x.ai/v1 and using your xAI API key that you created on console.x.ai."
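In practice that means something like the following minimal sketch with the openai Python package; the model ID is a placeholder I'm assuming, so check xAI's docs for the current names:

```python
from openai import OpenAI

# Standard OpenAI client, pointed at xAI's endpoint instead.
client = OpenAI(
    api_key="YOUR_XAI_API_KEY",      # key created on console.x.ai
    base_url="https://api.x.ai/v1",  # the only change from a stock OpenAI setup
)

response = client.chat.completions.create(
    model="grok-beta",  # placeholder model ID; check xAI's docs for current names
    messages=[{"role": "user", "content": "Turn off the kitchen lights."}],
)
print(response.choices[0].message.content)
```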
I read a comment elsewhere suggesting the Ollama integration might work.
Do a search here on the topic. I remember responding to a post about this: the problem is the size of Grok's weights file, which is far too big for regular hardware. It won't fit in GPU memory; you'd need something like 64 or 128 GB of VRAM on the GPU itself to run it.
I suspect you could add it to Ollama, but it would be unusable with all the memory swapping.
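For a rough sense of why, here's a back-of-the-envelope estimate, assuming Grok-1's open-weights release at roughly 314B parameters:

```python
# Back-of-the-envelope: weight memory = parameter count * bytes per weight.
# Grok-1's open release is roughly a 314B-parameter mixture-of-experts model.
params = 314e9
for label, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    gb = params * bits / 8 / 1e9
    print(f"{label}: ~{gb:.0f} GB for the weights alone")
# fp16: ~628 GB, int8: ~314 GB, int4: ~157 GB; well past any consumer GPU.
```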
I’m using a recently frozen, modified copy of the official OpenAI Conversation integration that I’ve pointed at an OpenRouter endpoint.
In terms of cost versus accuracy and usefulness, I’ve actually been surprised by Grok! I hadn’t used it much before, but it generally has no problem calling multiple tools at once and getting them right.
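For context, here's a minimal sketch of the kind of tool calling I mean, using the same OpenAI-compatible client against OpenRouter; the set_light tool is made up for illustration and the model ID is an assumption, so check OpenRouter's model list for the real name:

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_OPENROUTER_API_KEY",
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
)

# A single illustrative home-automation tool in the OpenAI tools schema.
tools = [{
    "type": "function",
    "function": {
        "name": "set_light",
        "description": "Turn a light on or off",
        "parameters": {
            "type": "object",
            "properties": {
                "entity_id": {"type": "string"},
                "state": {"type": "string", "enum": ["on", "off"]},
            },
            "required": ["entity_id", "state"],
        },
    },
}]

response = client.chat.completions.create(
    model="x-ai/grok-beta",  # assumed model ID; check OpenRouter's list
    messages=[{"role": "user", "content": "Turn off the kitchen and hallway lights."}],
    tools=tools,
)
# A request touching two lights should come back as two parallel tool calls.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```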
I modified the OpenAI Conversation agent code to use the Grok model from the xAI API. It requires HACS to be installed: add my repo as a custom repository, install it through HACS, then add it under Devices & Services, where it will ask for your xAI API key. I’ve submitted it to be listed in HACS without the need for a custom repo, so you might first try searching HACS for xAI or Grok.