… but it says we are in January 2025 and that the post from 10/2024 would be in the future, so it is not yet available.
Drilling further down, ChatGPT confesses:
"But actually, we are not really in January 2025—I only have knowledge up until June 2024. I cannot access any information published after that date or read future blog posts.
If Home Assistant has indeed introduced a new YAML syntax, feel free to share the real link or the full text from the official documentation. That way, I can adjust my answers accordingly!"
Anyone with an idea how to help ChatGPT eat the new syntax?
Addendum:
It’s no better with Gemini. It tells me:
" Yes, there’s a key error, and a best practice improvement:
action should be service: You’re using action where you should be using service."
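For reference, here is roughly what the rename amounts to. This is a minimal sketch based on the keys listed in the 2024.10 release post; the sun trigger, the alias, and the light.living_room entity are just placeholders, not taken from my actual automation:

```yaml
# Old style (what ChatGPT and Gemini keep insisting on)
automation:
  - alias: "Sunset light (old syntax)"
    trigger:
      - platform: sun
        event: sunset
    action:
      - service: light.turn_on          # 'service' call inside the 'action' block
        target:
          entity_id: light.living_room  # placeholder entity

# New style (Home Assistant 2024.10 and later; the old style is still accepted as far as I can tell)
automation:
  - alias: "Sunset light (new syntax)"
    triggers:
      - trigger: sun
        event: sunset
    actions:
      - action: light.turn_on           # 'action' call inside the 'actions' block
        target:
          entity_id: light.living_room  # placeholder entity
```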
Well, so far I have used ChatGPT (and Gemini) a lot to create and refine my automations. The answers are fast and mostly accurate, but sometimes also incorrect.
Is there some other AI that is better at code support?
I gave DeepSeek (the chatbot at https://chat.deepseek.com/) a try, and it found no errors in my code using actions: / - action:
It seems to be more up to date, so this is promising.
I think/hope the chatbot should be safe. I wouldn’t use the app, as it is said to do a lot of spying.
The trouble is, there are very few examples of HA YAML online - some in the docs, but most (in terms of lines of code) in this forum. About 75% of the code posted in this forum is posted by people looking for help, so by definition it doesn’t work. ChatGPT can’t tell the difference between what works and what doesn’t; it just builds statistical models of which words usually occur together.
In the long run you can avoid a lot of grief by learning to do it yourself.
You’ll need to do that anyway to be able to refine what it gives you effectively.