Connecting an AI assistant to the external world

What is currently considered the “right” way to let assistants/LLMs do web searches? In the past I had a SearXNG MCP server, and it was used via Ollama. I have since moved from Ollama to vLLM, which meant switching from the Ollama integration to “Extended OpenAI conversation”, which doesn’t seem to support MCP. Brief research suggests that this integration instead relies on functions and skills to provide similar functionality.

More background:

About half a year ago I set up the voice assistant with Ollama as the backend. In parallel I set up some MCP servers that provide weather forecasts, SearXNG web search, calendar access, etc. The nice thing about these is that they can also be reused by other AI tools, e.g. OpenWebUI (although there I needed to make them available via OpenAPI, but that’s a different story).

Now I have decided to move from Ollama to vLLM to improve performance. There may be other ways to expose it, but natively it provides an OpenAI-compatible API. A brief search led me to the “Extended OpenAI conversation” integration, which supports setting your own URL for the API endpoint. It seems to work, but the integration doesn’t appear to support MCP. There is another integration that does, called “OpenAI conversation plus”; it’s a fork of “Extended OpenAI conversation”, but it hasn’t been updated in three months and looks a bit abandoned, so I’m reluctant to use it.
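For reference, vLLM’s OpenAI-compatible server exposes the usual `/v1/chat/completions` endpoint, so any OpenAI-style client can talk to it just by overriding the base URL. A minimal stdlib-only sketch of the request shape (host, port, and model name are placeholders for my setup, not anything prescribed by vLLM):

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at a vLLM server."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Placeholder host/port and model name; adjust to the actual vLLM instance.
req = build_chat_request(
    "http://localhost:8000",
    "meta-llama/Llama-3.1-8B-Instruct",
    [{"role": "user", "content": "What's the weather like?"}],
)
# urllib.request.urlopen(req) would actually send it; here we only build the payload.
print(req.full_url)
```

This is the same wire format the “Extended OpenAI conversation” integration speaks, which is why pointing its endpoint URL at vLLM works at all.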

I’m under the impression that I need to look into functions/skills and how to expose web search and other functionality that way. Before going down that route, I wanted to check in with the community to understand whether that’s the “right” way or whether there are alternatives I’m not aware of.
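From what I’ve gathered so far, the functions mechanism in “Extended OpenAI conversation” is configured as YAML in the integration options: each entry pairs an OpenAI-style function spec with an action that Home Assistant executes. A rough sketch of how a SearXNG web search might be wired up (the URL, templates, and field names here are my assumptions from skimming examples, not verified against the integration’s docs):

```yaml
# Hypothetical function entry exposing a SearXNG instance as a web_search tool.
- spec:
    name: web_search
    description: Search the web and return the top results
    parameters:
      type: object
      properties:
        query:
          type: string
          description: The search query
      required:
        - query
  function:
    type: rest
    resource_template: "http://searxng.local/search?q={{ query }}&format=json"
    value_template: "{{ value_json.results[:5] }}"
```

If this is roughly how it works, I’d effectively be re-implementing each of my existing MCP servers as one of these function entries, which is what prompted the question.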

The questions I’d appreciate feedback on are:

  • What are the current options for providing LLMs with access to tools in HA?
  • What are the options for connecting a Voice Assistant in HA to a vLLM instance?