YAML validation

When you run a Home Assistant instance, it has very powerful YAML validation, but when you post YAML on the forum, there is no such validation. Maybe it would be possible to integrate some checks into the website engine. Yes, it would take some computation, but in that case maybe use the power of running Home Assistant instances: to post YAML, you would first check it on your own local instance, with that process integrated into the forum. In the future, code completion could be added too, making Home Assistant a local IDE.

And to make it even worse, that statement doesn’t seem to be true anymore.

Both I and at least one other user have experienced failures of the config checker to return a notification for an error in YAML syntax in an automation. It just failed silently and simply didn’t load the automation, and then you needed to look in the main homeassistant.log file to see the error, or even that there was an error in the first place.

So a user who runs the config checker and is told everything is valid would be even more confused when the automation didn’t work.

Yep, happened to me today (not the first time). Very annoying. I wouldn’t have noticed at all if I didn’t have an ‘unknown entities’ sensor.

Keep in mind the UI checker isn’t as complete as the command line. The command line check should catch all YAML errors.


Thanks for the info.
But, is there any (good) reason why?
It seems to be counterintuitive given the move to the UI.

The answer is complicated, but from memory it boils down to the fact that a complete check requires that the whole config is parsed and loaded. The UI can’t do that since it’d replace the running config, but the command line can since it runs separately from the main HA process.


But it used to be more complete than it is now. For some reason it seems to have changed and become far less useful than it was before.

I’ve only ever used the built-in config checker (never the command line checker), and aside from template errors it has always been pretty reliable. Now, not so much.

I’m using an AppDaemon app I got off the forum years ago; it runs with a click in the GUI and it picks up everything.

If you’re talking about the one by Apop, then I’m using the same one, and that’s the one I’m referring to as not working well.

But I’m almost positive it just uses the built-in config checker on the backend; it simply creates a script to run it and then creates a sensor to report the results (valid, or the errors generated).

    # get HASS URL
    self.apiurl = "{}/api/config/core/check_config".format(self.config["plugins"]["HASS"]["ha_url"])

So there shouldn’t be any difference at all between the built-in UI checker and the AD script.

And I’ve checked using both the AD script and the built-in UI checker: they both return the same “valid” response when the YAML wasn’t valid and the automation failed silently.
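For what it’s worth, the endpoint the AD script hits can be exercised directly. Here’s a minimal sketch (the URL and token are placeholders, and I’m assuming the documented `{"result": ..., "errors": ...}` response shape from the REST API):

```python
import json
import urllib.request


def check_config(ha_url: str, token: str) -> dict:
    """POST to Home Assistant's core config-check endpoint and
    return the parsed JSON response, e.g. {"result": "valid", "errors": None}."""
    req = urllib.request.Request(
        f"{ha_url}/api/config/core/check_config",
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def summarize(result: dict) -> str:
    """Turn the API response into a short sensor-style status string."""
    if result.get("result") == "valid":
        return "valid"
    return "invalid: {}".format(result.get("errors"))
```

The catch discussed in this thread is that this endpoint can report “valid” even when an automation later fails schema validation at setup time, so the sensor built on it inherits the same blind spot.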

Yes, the Apop one. However, it’s never failed to pick up a YAML error for me.

You’re right, it never had for me either, until a couple of weeks ago. I’m not sure what changed to make it fail then, though.

And I’m not saying that it doesn’t generally still work, but I was completely surprised that it didn’t catch the two errors I had in two different automations.

Were they yaml errors though or did the automation just fail for other reasons?

They were YAML errors as far as I know. At least that’s what I gathered from the logbook entry I found when I couldn’t find the automations I’d created in the dev states.

It was an error in a condition (the ‘not’ syntax is wrong):

    condition:
      - condition: or
        conditions:
          - condition: state
            entity_id: input_boolean.as_bedroom_motion
            state: 'on'
          - condition: and
            conditions:
              - condition: not
                entity_id: person.me
                state: 'home'
              - condition: not
                entity_id: person.her
                state: 'home'

here is the log from that automation:

2023-02-07 17:27:02.664 ERROR (MainThread) [homeassistant.components.automation] Automation with alias 'AS Someone is in the Spare Bedroom' could not be validated and has been disabled: extra keys not allowed @ data['condition'][0]['conditions'][1]['conditions'][0]['entity_id']. Got 'person.me'
extra keys not allowed @ data['condition'][0]['conditions'][1]['conditions'][0]['state']. Got 'home'
required key not provided @ data['condition'][0]['conditions'][1]['conditions'][0]['conditions']. Got None
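For reference, those “extra keys not allowed” / “required key not provided” messages are pointing at the fact that a `not` condition wraps a nested `conditions` list rather than taking `entity_id`/`state` keys directly. A corrected version of that block would look like:

```yaml
condition:
  - condition: or
    conditions:
      - condition: state
        entity_id: input_boolean.as_bedroom_motion
        state: 'on'
      - condition: and
        conditions:
          - condition: not
            conditions:
              - condition: state
                entity_id: person.me
                state: 'home'
          - condition: not
            conditions:
              - condition: state
                entity_id: person.her
                state: 'home'
```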

There was another one that complained about the state in the trigger needing to be a string (I forgot to put quotes around a number in a state trigger); it did the same thing in another automation too.

That one also gave a similar entry in the logbook, and it disabled the automation so it didn’t show up in the dev states either. I didn’t save the entry from that one, but I can easily reproduce it if it helps.

Maybe I’ve just been lucky before and the config checker has always worked like this, and it might never have caught these errors. But the other user in the other thread had a similar experience. They didn’t post the failing automation, but they seemed sure it was a YAML syntax error that caused the failure. So I can’t confirm or deny their experience, only mine.

But this is getting way OT…


At this point it might be good to split the thread, because you are talking about an issue with YAML validation on a blog post.

In any case, it is now becoming clear that the value of the YAML checker has increased, because locally there are more and more sources that can generate YAML configs. It may also be worth having the option to run HA checks in a separate process, so that the command line and the UI perform the same validation.

And if we talk about the future of neural networks: HA should hold a dialogue with the neural network. When the network suggests an option, HA would run the config checker and unit tests and tell the network which line to correct. HA could also give the AI’s answer a like or dislike, and after several iterations arrive at a finished answer. In the end, there should be a button so that the user checks the automation. But that is, naturally, the distant future of neural networks. Also, when the chat with the neural network has done its job, it should be disconnected from HA, so the feeling that HA is working locally is not lost. The neural network should remain only a free integrator of the smart home.

Here’s a concrete example that appears to be valid and HA happily starts.
It just secretly doesn’t initialise the automation.

    trigger:
      - platform: state
        entity_id: binary_sensor.some_sensor
        state: 'on'
        for:
          minutes: 3

A silly error: state instead of to, but easily made if you are copying from a condition block to a trigger block.
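For completeness, the working trigger just swaps that one key:

```yaml
trigger:
  - platform: state
    entity_id: binary_sensor.some_sensor
    to: 'on'
    for:
      minutes: 3
```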

But the only indication that your mission-critical/life-dependent ( :sweat_smile:) automation isn’t actually working is if you decide to look in the logs.

Automation with alias 'My Automation' failed to setup triggers and has been disabled: extra keys not allowed @ data['state']. Got None

Surely that can’t be right?

I agree.

AFAIK, it was never like that until the last couple of releases.

The config checker for automations seems to have transitioned/evolved over time.

At one time, errors in automations would show up just like any other error in the config validation window, with the error message, as in your example above, spelled out right there.

Then at some point it stopped doing that and instead popped up a persistent notification that you had to first notice (I didn’t at first), and when you opened that notification it would tell you that “some integrations couldn’t be set up…” and to check the logs for errors.

Now it just fails silently with nothing to give an indication there’s any problem.