I’ve been using Claude/ChatGPT to help me build integrations and write automations, but I kept running into the same problem: the AI needs context about my setup.
I built a simple Python script that connects to my HA instance and dumps everything to organized JSON/YAML files.
Now I can just run ./gather.sh, grab the relevant files, and paste them into my AI conversation. Makes it way easier to ask things like “help me write an automation using these entities” or “reorganize my dashboard”.
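For anyone curious what the script actually does, the core is small. Here’s a rough sketch, not my actual code — `HA_URL`, `HA_TOKEN`, and the `ha_context` output folder are placeholders, and it assumes the standard HA REST API (`/api/states` with a long-lived access token):

```python
import json
import urllib.request
from collections import defaultdict
from pathlib import Path

# Placeholders — point these at your own instance and token.
HA_URL = "http://homeassistant.local:8123"
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"


def fetch_states(base_url: str, token: str) -> list[dict]:
    """Fetch every entity state from the HA REST API (/api/states)."""
    req = urllib.request.Request(
        f"{base_url}/api/states",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def group_by_domain(states: list[dict]) -> dict[str, list[dict]]:
    """Bucket entities by domain (the part of entity_id before the dot)."""
    grouped = defaultdict(list)
    for state in states:
        domain = state["entity_id"].split(".", 1)[0]
        grouped[domain].append(state)
    return dict(grouped)


def dump_grouped(states: list[dict], out_dir: str = "ha_context") -> None:
    """Write one JSON file per domain, e.g. ha_context/light.json."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for domain, entities in group_by_domain(states).items():
        (out / f"{domain}.json").write_text(json.dumps(entities, indent=2))


# Usage: dump_grouped(fetch_states(HA_URL, HA_TOKEN))
```

One file per domain keeps each paste small enough that I only hand the AI the entities relevant to the question.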
Has anyone else built something similar?
I’m curious if this is a common need or if others have solved it differently. Did you:
- Build export tools like this?
- Use MCP servers for real-time context?
- Just manually copy-paste everything?
- Find existing tools I missed?
I’ve also built an MCP server version for live context, and I use both depending on what I’m doing. I tend to prefer the file export approach though. Having the AI edit YAML files directly makes it much easier to review changes before applying them to HA.
Just wondering if I’m reinventing the wheel or if this is actually a gap in the ecosystem! It’s a bit of a messy setup atm since it’s specific to my needs. I never really intended to release it, but I might clean it up if there’s enough interest and no one has already built something far better.