Voice Assistant Long-term Memory

Imagine asking Home Assistant:

“Where’s my car parked?”

“What’s my Wi-Fi password?”

“Remind me where I put the spare keys.”

With Memory Tool, your Voice Assist can now remember and recall information long-term - just like a personal assistant that never forgets.

Two editions are available depending on your needs:

Local Only - Private, offline, lightning fast.

LLM Integrated - Natural, conversational, powered by GPT/Gemini.

My GitHub repository for all dependency files

Edition 1: Local Only

This version runs entirely inside Home Assistant, no LLM or internet connection required.

Features (Local Only)

  • Works offline - fast, private, and secure.

  • SQLite with FTS5 full-text search for quick lookups (see the storage sketch after this list).

  • Supports set, get, search, and forget, plus TTL, tags, and scopes.

  • Handles duplicates by updating existing memories or creating new ones.
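
For readers curious about the storage model, here is a minimal, self-contained sketch of an FTS5-backed store. The table layout, column names, and helper functions are assumptions for illustration only and are not necessarily what memory.py implements.

```python
import sqlite3
import time

# Minimal sketch of an FTS5-backed memory store.
# Table layout and function names are illustrative, not the ones in memory.py.
conn = sqlite3.connect("memory_demo.db")
conn.execute(
    "CREATE VIRTUAL TABLE IF NOT EXISTS memories "
    "USING fts5(key, value, tags, scope, expires_at UNINDEXED)"
)

def set_memory(key, value, tags="", scope="default", ttl_seconds=None):
    """Insert or update a memory; an existing key is replaced (duplicate handling)."""
    expires = time.time() + ttl_seconds if ttl_seconds else None
    conn.execute("DELETE FROM memories WHERE key = ?", (key,))
    conn.execute(
        "INSERT INTO memories (key, value, tags, scope, expires_at) VALUES (?, ?, ?, ?, ?)",
        (key, value, tags, scope, expires),
    )
    conn.commit()

def search_memory(query):
    """Full-text search across keys, values, and tags."""
    return conn.execute(
        "SELECT key, value FROM memories WHERE memories MATCH ?", (query,)
    ).fetchall()

def forget_memory(key):
    """Delete a stored memory by key."""
    conn.execute("DELETE FROM memories WHERE key = ?", (key,))
    conn.commit()

set_memory("office_parking_spot", "in basement 3 near the elevator entrance", tags="car")
print(search_memory("parking"))
```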

Installation (Local Only)

  • Install and configure the Pyscript integration through HACS.

  • Copy scripts/memory.py and scripts/common_utilities.py into the config/pyscript folder (an illustrative Pyscript service sketch follows these steps).

  • Import the memory_tool_local.yaml blueprint and create an automation from it.

  • Dependency files are included in the repository.

  • Restart Home Assistant - the tool will automatically create its SQLite database on first run.
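
If you are wondering how the Pyscript piece fits in: any function in config/pyscript decorated with @service becomes a callable Home Assistant service in the pyscript domain. The sketch below is illustrative only; the real service names and parameters are the ones defined in memory.py.

```python
# config/pyscript/example_sketch.py - illustrative only, not the shipped memory.py.
# Pyscript injects @service and log, so no imports are needed here.

@service
def memory_set(key=None, value=None, ttl=None):
    """Would appear in Home Assistant as the service pyscript.memory_set."""
    log.info(f"storing {key} = {value} (ttl={ttl})")
    # ...the real implementation writes to the SQLite database here...
```

The automation created from the blueprint then calls services like this in its actions whenever a matching voice command triggers it.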

Edition 2: LLM Integrated

This version connects with an LLM (e.g., GPT, Gemini, …) to manage memory during conversations automatically.

Features (LLM Integrated)

  • Interact naturally and conversationally, in any language you choose.

  • The LLM decides when to run set, get, search, or forget (see the dispatcher sketch after this list).

  • Can refine queries, update existing keys, or create new entries dynamically.

  • Uses the same SQLite + FTS5 backend, with extra flexibility thanks to LLM reasoning.
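
Conceptually, the LLM only chooses an action and its arguments; a small dispatcher then routes the call to the same storage backend as the Local edition. The following is a rough, self-contained illustration (a plain dict stands in for the SQLite + FTS5 backend, and the function name is hypothetical):

```python
# Illustrative dispatcher: the LLM picks the action and arguments,
# one backend (here just a dict) does the work.
store = {}

def handle_tool_call(action, **kwargs):
    if action == "set":
        store[kwargs["key"]] = kwargs["value"]
        return "Done"
    if action == "get":
        return store.get(kwargs["key"], "I don't have that stored.")
    if action == "search":
        q = kwargs["query"].lower()
        hits = {k: v for k, v in store.items() if q in k.lower() or q in v.lower()}
        return hits or "No matches."
    if action == "forget":
        return "Forgotten." if store.pop(kwargs["key"], None) is not None else "Nothing to forget."
    return f"Unknown action: {action}"

# e.g. the LLM turns "Actually I moved it to basement 2, column 6" into:
print(handle_tool_call("set", key="office_parking_spot", value="basement 2, column 6"))
print(handle_tool_call("get", key="office_parking_spot"))
```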

Installation (LLM Integrated)

  • Install and configure the Pyscript integration through HACS.

  • Copy scripts/memory.py into the config/pyscript folder.

  • Import the memory_tool_full_llm.yaml blueprint and create a script (do not change the default script name).

  • Requires specific usage rules and system policies - see home_assistant_voice_instructions.md for details.

  • Dependency files are included in the repository.

  • Restart Home Assistant - the tool will automatically create its SQLite database on first run (an optional sanity check is sketched after these steps).
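
After restarting, one optional way to sanity-check the backend before involving the LLM is to call a Pyscript service directly over the Home Assistant REST API. The snippet below is only a sketch: the URL, token placeholder, and the pyscript.memory_set service name reuse the hypothetical names from the earlier sketch; substitute the services that memory.py actually registers.

```python
import requests

HA_URL = "http://homeassistant.local:8123"   # adjust to your instance
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"       # created from your Home Assistant user profile

# POST /api/services/<domain>/<service> calls a service with the given data.
resp = requests.post(
    f"{HA_URL}/api/services/pyscript/memory_set",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"key": "office_parking_spot", "value": "in basement 3 near the elevator entrance"},
    timeout=10,
)
print(resp.status_code, resp.text)
```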

Example: Local vs LLM

Scenario: Saving and retrieving your parking spot

Local Only Edition (exact commands, keyword-based)

You: “Remember office parking spot is in basement 3 near the elevator entrance”

You: “Find office parking spot”

Assist: “in basement 3 near the elevator entrance”

You: “Find parking spot”

Assist: “I found a few candidates: 1. office_parking_spot: in basement… Say a number to open it, or say cancel?”

You: “One”

Assist: “in basement 3 near the elevator entrance”
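
The "candidates" behavior in the exchange above falls out of the full-text index: a partial query can match several keys, and the assistant then asks you to pick one by number. A minimal illustration follows; the table and sample rows are made up for the example and are not the tool's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE memories USING fts5(key, value)")
conn.executemany(
    "INSERT INTO memories (key, value) VALUES (?, ?)",
    [
        ("office_parking_spot", "in basement 3 near the elevator entrance"),
        ("airport_parking_spot", "level 2, row F"),
    ],
)

# "Find parking spot" -> several candidates, ordered by FTS5 relevance.
candidates = conn.execute(
    "SELECT key, value FROM memories WHERE memories MATCH ? ORDER BY rank",
    ("parking AND spot",),
).fetchall()

for number, (key, value) in enumerate(candidates, start=1):
    print(f"{number}. {key}: {value}")
```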

LLM Integrated Edition (natural language, context-aware)

You: “Remember office parking spot is in basement 3 near the elevator entrance”

You: “Hey, I left my car somewhere… where did I park again?”

Assist: “You told me your car is parked in basement 3 near the elevator entrance”

You: “Actually I moved it to basement 2, column 6”

Assist: “Got it, I’ve updated your parking spot to basement 2, column 6”

Local = precise, offline, reliable.

LLM = flexible, conversational, smarter.

Real-World Use Cases

  • Store and recall Wi-Fi passwords, parking locations, reminders, phone numbers, or any details you want.

  • Save temporary or long-term notes, with TTL configurable up to 10 years - or forever (see the TTL sketch below).
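
TTL handling can be as simple as storing an absolute expiry timestamp next to each memory and skipping or pruning rows that have passed it. The helpers below are hypothetical, sketched under that assumption:

```python
import time

TEN_YEARS = 10 * 365 * 24 * 60 * 60   # the upper bound mentioned above, in seconds

def expires_at(ttl_seconds=None):
    """None means keep forever; otherwise return an absolute expiry timestamp."""
    return None if ttl_seconds is None else time.time() + ttl_seconds

def is_expired(expiry):
    """A memory with no expiry never expires."""
    return expiry is not None and expiry < time.time()

print(is_expired(expires_at(TEN_YEARS)))   # False: valid for ten years
print(is_expired(expires_at()))            # False: stored forever
```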

Which Edition Should You Choose?

  • Local Only - Best for privacy, speed, and offline reliability.

  • LLM Integrated - Best for natural conversations and smarter handling of memory.

Both editions will continue to be developed in parallel. I’d love feedback from anyone who tries them - especially ideas for new use cases or improvements!

After setting it up following your method (local version), I still can't recall memories when talking to the voice assistant.
I don't know where to find more logs. Could you elaborate on your tutorial?
························
Me: Remember my parking spot is B2 R1
Assist: Done
Me: What is my parking spot?
Assist: I don't have any information about your parking spot in the available smart home devices.
························
Also, does the sentence "Dependency files are included in the repository" mean that the two files scripts/memory.py and scripts/common_utilities.py are already in your repository?

Thanks~

I just updated the post to include my GitHub repository (I hit the new user link limit!). Just follow the installation steps there to get access to the feature. Please use it and send me your feedback so I can improve it. Thank you!

Hey everyone,

Quick heads-up: I just rolled out a minor update to fix bugs in the duplicate keyword score calculation and the incorrect delete confirmation logic.

If you’ve been using this feature, please drop a comment below and let me know if it’s working for you!

On the flip side, if you spot any new bugs or have an idea for a cool new enhancement, please share it! Your feedback is super helpful in deciding what to upgrade next.

Thanks for all your support!

Could you consider making a detailed video or graphical tutorial? I really don't know how to use it after installation. Thanks~

Hello! I assume you’re using the Local Version, correct?
If so, you need to open the automation that was created from the blueprint. Inside the Trigger settings you will see the commands used to store and recall information, and you can customize them there.
The Local Version strictly requires you to speak using the exact syntax for it to work. If you’d like to use more natural language, you will need to switch to the LLM version.

I'm setting this up right now.
I've already translated the Google Generative AI initial instructions.
Do the commands in the blueprint need to be translated too?

Hello!

You’re using the LLM version, right? Regarding the instructions for the LLM, you can either copy the content exactly as is or customize it. You should keep the instructions in English and there’s no need to translate them.

I just realized I forgot one crucial step in the current LLM version installation: you need to expose the script to the Assistant after the script has been saved.