WTH there is no documentation containing supported intents

It would be very useful if the documentation contained a list of supported intents (for every language). There is a technical intents dashboard and it is possible to check out the intents in the GitHub repo, but that is not optimal, especially for non-technical users.

I think it should be possible to automatically generate such documentation from the intents repository.
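Something along these lines could do it. This is only a rough sketch, and it assumes the sentence files in the home-assistant/intents repo live under `sentences/<lang>/*.yaml` with an `intents:` mapping whose `data:` blocks contain `sentences:` lists; the repo path, output folder, and exact file layout here are guesses, not the project's actual tooling:

```python
# Rough sketch: generate a per-language Markdown list of sentence templates
# from a local checkout of the home-assistant/intents repository.
# Assumption (not verified against the current repo layout): sentence files
# live under sentences/<lang>/*.yaml and look roughly like
#   intents:
#     HassTurnOn:
#       data:
#         - sentences:
#             - "turn on [the] {name}"
from pathlib import Path

import yaml  # pip install pyyaml

INTENTS_REPO = Path("intents")  # hypothetical path to a local clone


def collect_sentences(lang_dir: Path) -> dict[str, list[str]]:
    """Map intent name -> list of sentence templates for one language."""
    sentences: dict[str, list[str]] = {}
    for yaml_file in sorted(lang_dir.glob("*.yaml")):
        data = yaml.safe_load(yaml_file.read_text(encoding="utf-8")) or {}
        for intent, body in (data.get("intents") or {}).items():
            for block in (body or {}).get("data") or []:
                sentences.setdefault(intent, []).extend(block.get("sentences") or [])
    return sentences


def write_markdown(lang: str, sentences: dict[str, list[str]], out_dir: Path) -> None:
    """Write a simple Markdown page listing every template per intent."""
    lines = [f"# Supported sentences ({lang})", ""]
    for intent in sorted(sentences):
        lines.append(f"## {intent}")
        lines.extend(f"- `{s}`" for s in sentences[intent])
        lines.append("")
    out_dir.mkdir(parents=True, exist_ok=True)
    (out_dir / f"{lang}.md").write_text("\n".join(lines), encoding="utf-8")


if __name__ == "__main__":
    for lang_dir in sorted((INTENTS_REPO / "sentences").iterdir()):
        if lang_dir.is_dir():
            write_markdown(lang_dir.name, collect_sentences(lang_dir), Path("docs"))
```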

There is one that states it is automatically generated, but it hasn't been updated in a year… it's in the dev docs:

Yeah, but that one only displays the intents' identifiers; what I meant was a list of full sentences.

Voted. The voice docs are… less than stellar. And something tells me that after the 19th of December, this is going to matter a lot more than it does now.


These are the numbers of possible sentences per language:

af: 0
ar: 2120348492953
bg: 37820406
bn: 4
ca: 51143749
cs: 35221956375
da: 89308
de: 220167399
el: 121201419
en: 5364612
es: 29351146100
et: 2
eu: 1203
fa: 4
fi: 59118157
fr: 347303088
gl: 6790160755
gu: 0
he: 11677
hi: 0
hr: 689561105
hu: 14899568523
id: 294
is: 114676899
it: 6783716545813
ka: 180
kn: 0
ko: 3512
lb: 127120
lt: 90316
lv: 11884
ml: 560850
mn: 0
ms: 780
nb: 42059304
nl: 15037377274
pl: 73602030713
pt: 322567951
ro: 155755315
ru: 13225284
sk: 7933
sl: 1236668597
sr: 74592
sv: 36848930
sw: 0
te: 4
th: 114
tr: 10432981
uk: 53625
ur: 861656
vi: 46977583

There are quite a few languages with billions of possible sentences; I don't think you want to list all of those, right?
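For context on why the totals get that large: every `[optional]` chunk in a template doubles the count, every `(a|b)` group multiplies it by the number of alternatives, and every `{name}`-style slot multiplies it by the size of the matching list. Here is a minimal counting sketch for a simplified version of that template syntax; the real expansion rules have more features, and the list size in the example is made up:

```python
# Minimal sketch of why the per-language totals explode combinatorially.
# This models a *simplified* version of the template syntax used by the
# intents repo: [optional] parts, (a|b) alternatives and {list} slots.


def count_expansions(template: str, list_sizes: dict[str, int]) -> int:
    """Count how many distinct sentences a single template can expand to."""

    def parse_seq(i: int, stop: str | None) -> tuple[int, int]:
        total = 1
        while i < len(template) and template[i] != stop and template[i] != "|":
            ch = template[i]
            if ch == "[":  # optional group: present or absent
                inner, i = parse_group(i + 1, "]")
                total *= inner + 1
            elif ch == "(":  # alternatives: pick exactly one
                inner, i = parse_group(i + 1, ")")
                total *= inner
            elif ch == "{":  # slot filled from a named list
                end = template.index("}", i)
                total *= list_sizes.get(template[i + 1 : end], 1)
                i = end + 1
            else:  # plain literal text contributes a single choice
                i += 1
        return total, i

    def parse_group(i: int, closer: str) -> tuple[int, int]:
        # A group is one or more "|"-separated sequences; their counts add up.
        total, i = parse_seq(i, closer)
        while i < len(template) and template[i] == "|":
            alt, i = parse_seq(i + 1, closer)
            total += alt
        return total, i + 1  # skip the closing bracket

    return parse_seq(0, None)[0]


# Example: two optional words and a 50-entry {name} list already give
# 2 * 2 * 50 = 200 sentences from a single template.
print(count_expansions("[please] turn on [the] {name}", {"name": 50}))
```

A single short template with two optional words and a 50-entity name list already yields 200 sentences, which is how the per-language totals climb into the billions.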


List them all? No… But we do need an easy way to tell, because respectfully, it's not acting like it has N-bajillion sentences, and that leaves me a little bewildered. It often doesn't find matches, or tries to match things I didn't ask for.

While I cut my teeth on the intent script sentences… at this point, without an LLM doing that side of the equation for me, I'd have given up a long time ago.

Now that we have the ability to run intent matching first and then fall back to the LLM, I need to see what my options are, because honestly the LLM is so good at recognizing voice and picking the right intent that everything else feels like a step backwards. If I'm ever going to make it workable, I need to know what it can't do beyond just "it didn't match something."

(Read: when I turn off the LLM, Assist is instantly 2000x dumber, and it leaves me wondering what it can and can't do… therefore, I must see the sentences. Right now it's literally easier for me to build a local ollama with open-webui server to drive the speech pipeline than to try to figure out which sentences match or not…)


Well, you are correct, I wasn't aware there are so many possible sentences. But most of them are basically just duplicates/variants of some basic ones. I still think it should be possible to prepare some view that makes it easier to learn the capabilities of Assist (hover to see alternatives for a specific phrase?).
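A very rough sketch of that idea, assuming the same simplified template syntax as above: render each template's shortest form as the visible phrase and put the raw template into a `title` attribute so the variants show up on hover. The intent name and templates below are made-up examples, not real repo data:

```python
# Rough sketch of the "hover to see alternatives" view: show each sentence
# template as its shortest form (optional [..] parts dropped, first (a|b)
# alternative kept) and expose the raw template as a tooltip via the HTML
# title attribute. Example data is made up.
import html
import re


def base_form(template: str) -> str:
    """Reduce a template to one readable example sentence."""
    text = re.sub(r"\[[^\]]*\]", "", template)        # drop optional parts
    text = re.sub(r"\(([^|)]*)[^)]*\)", r"\1", text)  # keep first alternative
    return re.sub(r"\s+", " ", text).strip()


def render(intents: dict[str, list[str]]) -> str:
    """Render a nested HTML list: intent -> phrases with tooltips."""
    lines = ["<ul>"]
    for intent, templates in sorted(intents.items()):
        lines.append(f"  <li><strong>{html.escape(intent)}</strong><ul>")
        for tpl in templates:
            tooltip = html.escape(tpl, quote=True)
            lines.append(f'    <li title="{tooltip}">{html.escape(base_form(tpl))}</li>')
        lines.append("  </ul></li>")
    lines.append("</ul>")
    return "\n".join(lines)


print(render({"HassTurnOn": ["[please] turn on [the] {name}",
                             "(switch|turn) {name} on"]}))
```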