Has anyone built a tool to export HA context for AI assistants?

I’ve been using Claude/ChatGPT to help me build integrations and write automations, but I kept running into the same problem: the AI needs context about my setup.

I built a simple Python script that connects to my HA instance and dumps everything to organized JSON/YAML files.

Now I can just run ./gather.sh, grab the relevant files, and paste them into my AI conversation. Makes it way easier to ask things like “help me write an automation using these entities” or “reorganize my dashboard”.
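For anyone curious about the mechanics: the core idea is roughly the following (a simplified sketch, not my actual script — the entity fields I keep, the `HA_URL`/`HA_TOKEN` environment variable names, and the output layout are all illustrative choices). It hits HA's REST API at `/api/states` with a long-lived access token, groups entities by domain, and writes one JSON file per domain so I can paste only the relevant slice into a conversation.

```python
import json
import os
import urllib.request
from collections import defaultdict

# Illustrative defaults -- point these at your own instance/token.
HA_URL = os.environ.get("HA_URL", "http://homeassistant.local:8123")
HA_TOKEN = os.environ.get("HA_TOKEN", "")


def fetch_states(url: str = HA_URL, token: str = HA_TOKEN) -> list:
    """Fetch every entity's state via HA's REST API (/api/states)."""
    req = urllib.request.Request(
        f"{url}/api/states",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def group_by_domain(states: list) -> dict:
    """Group entities by domain (light, sensor, ...) and keep only
    the fields an AI assistant actually needs for context."""
    grouped = defaultdict(list)
    for s in states:
        domain = s["entity_id"].split(".", 1)[0]
        grouped[domain].append({
            "entity_id": s["entity_id"],
            "state": s["state"],
            "name": s.get("attributes", {}).get("friendly_name"),
        })
    return dict(grouped)


def dump_context(out_dir: str = "ha_context") -> None:
    """Write one JSON file per domain, ready to paste into a chat."""
    os.makedirs(out_dir, exist_ok=True)
    for domain, entities in group_by_domain(fetch_states()).items():
        with open(os.path.join(out_dir, f"{domain}.json"), "w") as f:
            json.dump(entities, f, indent=2)
```

Splitting by domain keeps each file small enough to fit in a prompt without blowing the context window.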

Has anyone else built something similar?

I’m curious if this is a common need or if others have solved it differently. Did you:

  • Build export tools like this?
  • Use MCP servers for real-time context?
  • Just manually copy-paste everything?
  • Find existing tools I missed?

I’ve also built an MCP server version for live context, and I use both depending on what I’m doing. I tend to prefer the file export approach though. Having the AI edit YAML files directly makes it much easier to review changes before applying them to HA.

Just wondering if I’m reinventing the wheel or if this is actually a gap in the ecosystem! It’s a bit of a messy setup atm, as it’s specific to my needs. I never really intended to release it, but perhaps I will if there’s enough interest and someone hasn’t already built something far better.

HA is evolving too fast for AI assistants, so you often get deprecated output, and AI assistants often mix other code into the results (Ansible, for example), so the code you get is frequently faulty.

NathanCu is the AI oracle on the forum and he will also tell you this.


My initial approach is to map drives with Samba and have the AI make local copies of everything it needs to understand the full context. Migrations in the other direction are done manually using a compare tool.

The files and configuration change too quickly for manual exports. This way, I can ask the AI to refresh its context and it will replace the local files with the active content.