ChatGPT not aware of new YAML syntax

I use ChatGPT to help me with YAML code, but it still uses the old syntax and even claims I have an error when I use action instead of service.
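To illustrate (a minimal, made-up example; the entity IDs are just placeholders), this is the kind of automation it keeps producing:

```yaml
# Old syntax (pre-2024.10) - what ChatGPT keeps suggesting:
alias: "Hall light on motion"
trigger:
  - platform: state
    entity_id: binary_sensor.hall_motion
    to: "on"
action:
  - service: light.turn_on
    target:
      entity_id: light.hall
```

The same automation in the new 2024.10 notation, which ChatGPT flags as an error:

```yaml
# New syntax (2024.10 and later) - what ChatGPT flags as wrong:
alias: "Hall light on motion"
triggers:
  - trigger: state
    entity_id: binary_sensor.hall_motion
    to: "on"
actions:
  - action: light.turn_on
    target:
      entity_id: light.hall
```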

I told ChatGPT about that change and referred it to this blog post:
2024.10: Heading in the right direction - Home Assistant

… but it says we are in January 2025 and the post from 10/2024 would be in the future, so it is not yet available.

Drilling further down, ChatGPT confesses:

"But actually, we are not really in January 2025—I only have knowledge up until June 2024. I cannot access any information published after that date or read future blog posts.

If Home Assistant has indeed introduced a new YAML syntax, feel free to share the real link or the full text from the official documentation. That way, I can adjust my answers accordingly!"

Anyone with an idea how to help ChatGPT eat the new syntax?

Addendum:
It’s not better with Gemini. It tells me:
" Yes, there’s a key error, and a best practice improvement:

  1. action should be service: You’re using action where you should be using service."

Don’t use it.


It was trained on old data. The whole model is flawed. You can’t re-train it.


If GPT doesn’t work anymore for fixing up YAML, is there anything else anyone uses for times when you need a second set of “eyes” to help?

It never did. It was out of date before it was released.


If it’s pure automation, use the old code, save it and HA will convert it into the new code for you.
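For example (a made-up snippet; the notify service is a placeholder), you can paste the old-style action and the automation editor writes it back with the new keys when you save:

```yaml
# Paste this old-style action into the UI automation editor...
action:
  - service: notify.mobile_app_phone
    data:
      message: "Front door is open"

# ...after saving, it is stored as:
actions:
  - action: notify.mobile_app_phone
    data:
      message: "Front door is open"
```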

You mean, like a community of people?


Thanks for your answers, folks.

Well, so far I have used ChatGPT (and Gemini) a lot to create and refine my automations. The answers are fast and mostly accurate, but sometimes also incorrect.

Is there some other AI that is better at code support?

You are welcome to try, but I wouldn’t recommend it - and neither would many other folks here.

I gave DeepSeek (the chatbot at https://chat.deepseek.com/) a try, and it found no errors in my code using actions: / - action:
It seems to be more up to date, so this is promising.
I think (and hope) the web chatbot should be safe; I wouldn’t use the app, as it is said to do a lot of spying.

The trouble is, there are very few examples of HA YAML online - some in the docs, but most (in terms of lines of code) in this forum. About 75% of the code posted in this forum is posted by people looking for help, so by definition it doesn’t work. ChatGPT can’t tell the difference between what works and what doesn’t; it just builds statistical models of what words usually occur together.

In the long run you can avoid a lot of grief by learning to do it yourself.

You’ll need to do that anyway to be able to refine what it gives you effectively.
