It would be very useful if the documentation contained a list of supported intents (for every language). There is a technical intents dashboard, and it is possible to check out the intents in the GitHub repo, but that is not optimal, especially for non-technical users.
I think it should be possible to automatically generate such documentation from the intents repository.
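To illustrate that generating such documentation is feasible, here is a minimal sketch. It assumes the sentence files have been parsed (e.g. from YAML) into dicts with the simplified layout used in the intents repository: a top-level `language` key and an `intents` mapping whose entries hold `data` blocks with a `sentences` list. The function name `intents_to_markdown` and the exact shape of `sample` are illustrative, not part of any existing tool.

```python
def intents_to_markdown(parsed_files):
    """Render a per-language Markdown list of intents with example sentences.

    `parsed_files`: list of dicts shaped like parsed sentence files
    (assumed layout: {"language": ..., "intents": {name: {"data": [...]}}}).
    """
    by_language = {}
    for doc in parsed_files:
        lang = doc["language"]
        intents = by_language.setdefault(lang, {})
        for name, intent in doc.get("intents", {}).items():
            examples = intents.setdefault(name, [])
            for entry in intent.get("data", []):
                examples.extend(entry.get("sentences", []))
    lines = []
    for lang in sorted(by_language):
        lines.append(f"## {lang}")
        for name, examples in sorted(by_language[lang].items()):
            lines.append(f"- **{name}**: " + " / ".join(f"`{s}`" for s in examples))
    return "\n".join(lines)

# Hypothetical sample mimicking two English sentence files:
sample = [{
    "language": "en",
    "intents": {
        "HassTurnOn": {"data": [{"sentences": ["turn on [the] {name}"]}]},
        "HassTurnOff": {"data": [{"sentences": ["turn off [the] {name}"]}]},
    },
}]
print(intents_to_markdown(sample))
```

A real generator would walk the repository's `sentences/` tree and load each file, but the rendering step would look much like this.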
A full list, maybe not… but we do need an easy way to tell what is supported. Because respectfully, it's not acting like it has N-bajillion sentences, and that leaves me a little bewildered: it often fails to find matches, or tries to match things I didn't ask for.
While I cut my teeth on the intent script sentences, at this point, without an LLM doing that side of the equation for me, I'd have given up a long time ago.
Now that we have the ability to run intents first and then fall back to the LLM, I need to see what my options are, because honestly the LLM is so good at recognizing voice and picking the right intent that everything else feels like a step backwards. If I'm ever going to make it workable, I need to know what it can't do, beyond just "it didn't match something."
(Read: when I turn off the LLM, Assist is instantly 2000x dumber, and it leaves me wondering what it can and can't do. Therefore, I must see the sentences. Right now it's literally easier for me to build a local Ollama with an open-webui server to drive the speech pipeline than to try to figure out which sentences match or not…)
Well, you are correct, I wasn't aware there were so many possible sentences. But most of them are basically just duplicates/variants of a few basic ones. I think it should still be possible to prepare some view that makes it easier to learn the capabilities of Assist (hover to see alternatives for a specific phrase?).
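The "duplicates/variants" point can be made concrete: most of the sentence count comes from templates with optional words like `[the]` and alternatives like `(light|lamp)`, so a hover view would only need to expand one template on demand. Here is a minimal expander sketch; it handles only non-nested `[...]` and `(...|...)` groups, a deliberate simplification of the full template grammar, and the function name is my own.

```python
import re

def expand(template: str):
    """Expand the first [optional] or (alt|alt) group, then recurse.

    Assumes groups are not nested (a simplification of the real
    sentence-template grammar used in the intents repo).
    """
    m = re.search(r"\[([^\[\]()]*)\]|\(([^()\[\]]*)\)", template)
    if not m:
        # No groups left: collapse doubled spaces left by dropped optionals.
        return [" ".join(template.split())]
    pre, post = template[:m.start()], template[m.end():]
    if m.group(1) is not None:
        options = [m.group(1), ""]        # [optional] -> present or absent
    else:
        options = m.group(2).split("|")   # (a|b|c) -> each alternative
    results = []
    for opt in options:
        results.extend(expand(pre + opt + post))
    return results

# One template fans out into four concrete sentences:
print(expand("turn on [the] (light|lamp)"))
```

A documentation view could show only the base template and reveal these expansions on hover, which would keep the page readable despite the large raw sentence count.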