HA Text AI: Transforming Home Automation through Multi-LLM Integration

Hey everyone,

I'm thrilled to introduce HA Text AI, an advanced AI integration designed to elevate your home automation experience! What started as a personal project has now evolved into a powerful tool that I'm excited to share with all of you.

:star2: Core Concept

HA Text AI leverages multiple AI providers, including OpenAI and Anthropic, to generate intelligent event descriptions, create smart notifications, and retrieve structured information. This integration transforms your smart home into a more responsive and intuitive environment.

:globe_with_meridians: Multi-Provider Support

In addition to standard providers like OpenAI and Anthropic, HA Text AI supports custom OpenAI-compatible REST API endpoints, allowing for greater flexibility and integration with various AI models (a minimal configuration sketch follows the requirements below).

Compatibility Requirements:

  • OpenAI-like REST API structure
  • JSON request/response format
  • Standard authentication methods
  • Similar model parameter handling
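
For example, a service that meets these requirements can usually be wired up just by overriding the api_endpoint. This is only a minimal sketch: the URL, key, and model name are placeholders for whatever your endpoint expects, and the full parameter list appears later in this thread:

ha_text_ai:
  api_provider: openai  # use the OpenAI-style client
  api_key: !secret my_custom_ai_key  # whatever your service expects
  api_endpoint: http://my-llm-gateway.local:8000/v1  # any OpenAI-compatible base URL
  model: my-custom-model  # model identifier understood by that endpoint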

:gear: Advanced Control Mechanisms

This integration offers a range of features to optimize your AI interactions (an example service call using them follows the list):

  • Context Depth Management: Customize conversation history (1-20 previous messages).
  • Token Usage Control: Manage memory and optimize API interactions.
  • Configurable Request Intervals: Implement rate limiting and monitor token usage.
  • Granular Response Configuration: Adjust max tokens per response and temperature (creativity) settings.
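
As a rough illustration, several of these settings can be passed per request when calling the ask_question service. The sketch below uses the parameter names that appear in the automations later in this thread; the instance name and question are placeholders:

action: ha_text_ai.ask_question
data:
  instance: sensor.ha_text_ai_assistant  # placeholder instance entity
  context_messages: 5   # include the last 5 messages as context
  max_tokens: 500       # cap the length of this particular response
  question: "Summarize today's weather in one friendly sentence."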

:bulb: Use Case Scenarios

  • Intelligent sensor event descriptions
  • Context-aware notifications
  • Transformation of raw data into human-readable text
  • JSON response generation for further automation

:arrows_counterclockwise: Flexible Integration

HA Text AI is lightweight and extensible, supporting:

  • Multiple AI providers
  • Customizable response formats
  • Easy integration with existing automations

:handshake: Community Invitation

I’m eager to hear your thoughts and ideas! Let’s collaborate and explore how AI can revolutionize home automation together. Your feedback and creative suggestions are invaluable!

:books: Full Documentation

For more details, check out the GitHub repository for comprehensive documentation.

Looking forward to your insights and potential use cases! :rocket:

Hi! Great job!
Can you give us some examples of use?

Looks cool, will see if I can get it working with Ollama

Hi! Great job!
Can you give us some examples of use?

Yes, definitely, as soon as I have a bit of free time, I’ll prepare several Blueprints from my automations. For now, I’ll describe them in text, and maybe you can quickly create something yourselves.

  1. I was tired of trying to configure every automation scenario by hand and account for everything, so the idea grew out of my desire to keep the home climate comfortable. I send the AI a request with a list of humidity and temperature sensors, open/closed window and door states, and a week's history of people's presence by room (which is itself generated weekly by a query to the integration, with the response returned as JSON - you can describe the desired response format to the LLM in plain text; there may be extra comments, but the JSON itself is always valid and easy to extract). The trigger is essentially people's presence, or the approaching time of presence. The AI returns JSON following a template that says how and for how long to run the air conditioner and humidifiers, and if windows are open it sends a push notification that they should be closed. In rooms where people are present the air conditioner's power is reduced, while in rooms that are still empty but will soon be occupied it runs more intensively. (A simplified sketch of this JSON pattern is shown after the list.)

The main point is that the AI understands very well what is comfortable - taking indoor and outdoor humidity into account, which mode is better to enable, and so on. The requests cost money, of course, but it comes to no more than a couple of dollars per month, and the house is always comfortable.

  2. The children's room and bedroom have CO2 sensors. If the level exceeds the threshold, a request with the growth trend goes to the AI, and I receive a push notification that ventilation is needed for 30 minutes or more, together with an estimate of when, at the current rate, the concentration will become harmful (in N minutes or hours).

  3. Similar to the first automation, during the cold season I control the thermostats. An added benefit is that I also send the AI the day's weather forecast and the current outdoor sensor readings. Since my automation maintains message context, this allows for smooth management.

  4. Using the LLM Vision integration (GitHub - valentinfrlch/ha-llmvision: Let Home Assistant see!), I build a typical schedule of people's arrivals and departures, and if something deviates from it, I receive a notification. I use the people descriptions and calendar from LLM Vision to spot anything atypical. Free models on OpenRouter are sufficient for many tasks; from my observations, Anthropic's Sonnet and Haiku models work excellently for more complex ones.

  5. In the mornings, based on the calendar and the weather forecast, I create a daily timeline for myself and my spouse. I tried using raw schedule data, but it turns out the models estimate average times quite well - when and where to travel, and so on.

  6. Based on the weather forecast and other data, I create a card and a message for the nanny about how to dress the child. If the UV index is high, the AI adds a reminder about sunscreen (SPF) and which one to use.

  7. Again based on the presence schedule, the AI builds a task for the robot vacuum by cleaning zones (a single charge has long been insufficient to cover everything, so charging and idle time are taken into account).

  8. Based on the weather forecast, current readings, the on/off history, and a description of the lawn, automatic watering takes place.

  9. Depending on the time of day, weather, season, and the current home environment, it smoothly adjusts the color temperature of the lighting in the work area.

  10. Again based on LLM Vision and Frigate: when the child cries at night, for example, the volume is reduced if we're watching a series or a movie and the light turns on - but only if LLM Vision's description shows no adults in the room. This could be configured with local automations, but I find it convenient that the AI can send everything in JSON, which then drives the control, with a clear understanding of what should and should not be done.

  11. Depending on which room people are in, a TTS command is sent to the speaker there announcing that someone has arrived or is standing at the door - so if I'm far from the door and might not hear the doorbell, or it's turned off, the speaker reports a description of what's happening.

  12. To avoid information overload, adverse weather alerts from government services are loaded into Home Assistant, and if such an alert exists and is truly important, family members receive it. But if it's just a warning about fog at night, then no.

  13. Through the nanny's phone tracking, I monitor how many hours she was at home and when she left and arrived. Obviously you could count everything and write it down yourself, but the AI prepares a weekly summary for us so we don't forget to pay for any overtime.

  14. Also, for the sake of economy, the AI reports where and when lights are most often left on, or other integrated equipment keeps running while presence sensors show no one is around. This happens once a week and arrives in the messenger, just so we don't forget about nature and energy saving.

  15. In general, I used to have many different notifications in the "if, then..." format, but now there are far fewer of them and they are more useful. For example, if it's evening, the AQI is high, there are many pollutants, and all the windows are closed, there's no need to write about it. The AI aggregates all the basic indicators together on a timer and only writes if something is really wrong. Again, I request the response as JSON with an importance parameter and a flag for whether a notification is needed. I'm an allergy sufferer: I deleted all the pollen-level apps, canceled the subscriptions, and now simply pull open data and build allergen-based notifications if something is going to cause me discomfort.

  16. Based on Frigate and LLM Vision, I've set up filtering of false fire-alarm notifications. For instance, Frigate often confuses the alarm with a vacuum cleaner or sounds in music. When it triggers, LLM Vision takes snapshots from all camera streams and builds a description of what's happening; I then pass this to the AI, and if there's no panic, people are home, and so on, I avoid receiving critical notifications in vain.

I've highlighted only the main points, hoping this will inspire ideas that might be useful to others. (This was originally written in Russian, so forgive any rough edges in the translation.)
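
To make the first example a bit more concrete, here is a heavily simplified, hypothetical sketch of the JSON-response pattern. All entity IDs, the prompt, and the parsed fields are illustrative placeholders - the real automation is considerably larger:

alias: AI Climate Advice (sketch)
description: Simplified illustration of the JSON-response pattern from example 1
mode: single
triggers:
  - trigger: state
    entity_id: binary_sensor.bedroom_presence  # placeholder presence sensor
    to: "on"
actions:
  - data:
      instance: sensor.ha_text_ai_openrouter  # placeholder instance name
      context_messages: 3
      question: >
        Bedroom: {{ states('sensor.bedroom_temperature') }} °C,
        {{ states('sensor.bedroom_humidity') }} % humidity,
        window {{ states('binary_sensor.bedroom_window') }}.
        Reply ONLY with JSON like
        {"ac_mode": "cool", "ac_minutes": 30, "close_windows": false}.
    action: ha_text_ai.ask_question
  - wait_for_trigger:
      - trigger: state
        entity_id: sensor.ha_text_ai_openrouter
        attribute: response
  - variables:
      plan: "{{ state_attr('sensor.ha_text_ai_openrouter', 'response') | from_json }}"
  - if:
      - condition: template
        value_template: "{{ plan.close_windows }}"
    then:
      - data:
          message: Please close the bedroom window before the AC starts.
        action: notify.mobile_app_someone  # placeholder notify target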

P.S.: There's also a button in the child's room that, when pressed, makes the speaker recite a kind and funny poem about him, or an interesting fact about something, age-appropriate and engaging. The nanny isn't always happy, and neither are we, when he decides to play with the button for 20 minutes straight :))

Looks cool, will see if I can get it working with Ollama

Yep, it's on the "Potentially Compatible" list. It should theoretically work if you configure Ollama to expose an OpenAI-compatible API endpoint. No guarantees, but worth experimenting.

Pro tips for smooth Ollama integration (example configuration after the list):

  • Use Ollama’s OpenAI compatibility mode
  • Ensure JSON request/response format
  • Enable OpenAI-compatible API in Ollama settings
  • Run Ollama with OLLAMA_HOST=0.0.0.0:11434 ollama serve
  • Configure custom endpoint in HA integration: http://your_ollama_ip:11434/v1
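
If it helps, the matching YAML for that setup could look roughly like this. The model name is just an example - use whichever model you've pulled in Ollama, and note that Ollama doesn't actually validate the API key:

ha_text_ai:
  api_provider: openai                    # Ollama's OpenAI-compatible mode
  api_key: "ollama"                       # placeholder - Ollama ignores it, but the field is required
  api_endpoint: http://your_ollama_ip:11434/v1
  model: llama3.1                         # example - any locally pulled model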

Potential challenges:

  • Ensure exact JSON structure matches OpenAI
  • Some advanced parameters might not translate perfectly

Let me know how it goes!

Just realized that I need to be on 2024.11, so will work on upgrading to that first

Just realized that I need to be on 2024.11, so will work on upgrading to that first

It should work on earlier versions too. I’ve primarily tested on recent versions, so can’t give 100% guarantee, but compatibility looks solid from around Home Assistant 2023.10 onwards.

How do the sensors work? I don’t see where you set them in the yaml or call them via automation (at least via the GitHub). The only thing I see about sensors is the naming convention, but nothing else (unless I missed it somewhere)

How do the sensors work? I don’t see where you set them in the yaml or call them via automation (at least via the GitHub). The only thing I see about sensors is the naming convention, but nothing else (unless I missed it somewhere)

The sensor attributes can be found directly on the sensor's state. I recommend checking the available attributes in Developer Tools → States.

Here’s a basic automation example to demonstrate sensor usage:

AI_Weather_Recommendation.yaml
alias: AI Weather Recommendation
description: Get AI-powered clothing recommendation based on weather conditions
mode: single
max_exceeded: silent
triggers:
  - at: "07:30:00"
    trigger: time
actions:
  - data:
      instance: sensor.ha_text_ai_openrouter
      context_messages: 5
      question: |
        Based on current weather conditions, provide a detailed clothing recommendation: 
          Temperature: {{ state_attr('weather.local', 'temperature') }}°C   
          Feels Like: {{ state_attr('weather.local', 'apparent_temperature') }}°C   
          Cloud Coverage: {{ state_attr('weather.local', 'cloud_coverage') }}%   
          Humidity: {{ state_attr('weather.local', 'humidity') }}%   
          Wind: {{ state_attr('weather.local', 'wind_speed') }} km/h, 
          Direction {{ state_attr('weather.local', 'wind_bearing') }}°   
          Pressure: {{ state_attr('weather.local', 'pressure') }} mmHg  

        Please suggest:   
          1. Optimal clothing layers   
          2. Recommended accessories   
          3. Tips for staying comfortable outdoors  
    action: ha_text_ai.ask_question
  - wait_for_trigger:
      - entity_id: sensor.ha_text_ai_openrouter
        attribute: response
        trigger: state
  - choose:
      - conditions:
          - condition: template
            value_template: >-
              {{ state_attr('sensor.ha_text_ai_openrouter', 'response') is
              defined }}
        sequence:
          - data:
              title: 🧥 Today's Clothing Recommendation
              message: "{{ state_attr('sensor.ha_text_ai_openrouter', 'response') }}"
            action: notify.mobile_app_someone
      - conditions:
          - condition: template
            value_template: "{{ true }}"
        sequence:
          - data:
              title: ❌ AI Recommendation Error
              message: Unable to generate weather-based recommendation
            action: notify.mobile_app_someone

UPD: I realized that the configuration block in YAML was not very clearly described on GitHub, so I added a detailed description and a parameter table to the documentation.

Configuration via YAML

GitHub - smkrv/ha-text-ai: Advanced AI Integration for Home Assistant with multi-provider support

Platform Configuration (Global Settings)

ha_text_ai:
  api_provider: openai  # Required
  api_key: !secret ai_api_key  # Required
  model: gpt-4o-mini  # Optional
  temperature: 0.7  # Optional
  max_tokens: 1000  # Optional
  request_interval: 1.0  # Optional
  api_endpoint: https://api.openai.com/v1  # Optional
  system_prompt: |  # Optional
    You are a home automation expert assistant.
    Focus on practical and efficient solutions.

Sensor Configuration

sensor:
  - platform: ha_text_ai
    name: "My AI Assistant"  # Required, unique identifier
    api_provider: openai  # Optional (inherits from platform)
    model: "gpt-4o-mini"  # Optional
    temperature: 0.7  # Optional
    max_tokens: 1000  # Optional

:clipboard: Configuration Parameters

Platform Configuration

  • api_provider (String, required): AI service provider (openai, anthropic)
  • api_key (String, required): Authentication key for the AI service
  • model (String, recommended, default: provider default): Specific AI model to use; if not specified, the provider's default model will be used
  • temperature (Float, optional, default: 0.7): Response creativity level (0.0-2.0)
  • max_tokens (Integer, optional, default: 1000): Maximum response length
  • request_interval (Float, optional, default: 1.0): Delay between API requests
  • api_endpoint (URL, optional, default: provider default): Custom API endpoint
  • system_prompt (String, optional): Default context for AI interactions

Sensor Configuration

  • platform (String, required): Must be ha_text_ai
  • name (String, required): Unique sensor identifier
  • api_provider (String, optional, default: platform setting): Override the global provider
  • model (String, recommended, default: platform setting): Override the global model; if not specified, uses the platform or provider default
  • temperature (Float, optional, default: platform setting): Override the global temperature
  • max_tokens (Integer, optional, default: platform setting): Override the global max tokens

Thank you for pointing out this inaccuracy!

This is great! Does it work with Google Gemini?

Yes, of course - native Gemini support will come as soon as I'm confident that the current version is stable and reliable. At the moment, Gemini can be used via OpenRouter, for example, and this combination works great.
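
In case it's useful, an OpenRouter-backed instance for Gemini might be configured roughly like this. The model identifier is only an example - check OpenRouter's model list for the exact name:

ha_text_ai:
  api_provider: openai                     # OpenRouter exposes an OpenAI-compatible API
  api_key: !secret openrouter_api_key
  api_endpoint: https://openrouter.ai/api/v1
  model: google/gemini-flash-1.5           # example - verify the current identifier on OpenRouter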

Hi everyone!

Thanks to the community feedback, I've implemented a significant number of improvements.

I’ll be glad to hear any comments and feedback!

Hi everyone, a small update: not a Blueprint yet, but here’s one of the examples:

  • I have a markdown card on the panel that displays the AI response from my integration (a minimal card sketch is shown after the example output below)
  • A separate instance was created for this automation
  • The automation runs once an hour with a prompt:
Write in a friendly tone about the current weather and recommendations:
  - Temperature, humidity, wind.
  - Clothing for adults and children.
  - Possible hazardous weather phenomena and air pollution.
  - Safety recommendations and actions.
  - No greeting.

  Date and Time: {{ now().strftime('%Y-%m-%d %H:%M:%S') }}
  Location: xxx

  Current Weather:
    - Temperature: {{ state_attr('weather.weather_home', 'temperature') }} °C
    - Feels like: {{ state_attr('weather.weather_home', 'feels_like') }} °C
    - Humidity: {{ state_attr('weather.weather_home', 'humidity') }}%
    - Pressure: {{ state_attr('weather.weather_home', 'pressure') }} hPa
    - Wind: {{ state_attr('weather.weather_home', 'wind_speed') }} km/h
    - Conditions: {{ states('weather.weather_home') }}

  Air Quality:
    - Quality Index: {{ states('sensor.street_air_quality_index') }} AQI
    - PM10: {{ states('sensor.street_pm10') }} µg/m³
    - PM2.5: {{ states('sensor.street_pm2_5') }} µg/m³
    - Dominant Pollutant: {{ states('sensor.street_dominant_pollutant') }}

  Precipitation Forecast:
    - Snow: {{ states('sensor.weather_home_snow_amount') }} mm
    - Rain Forecast: {{ states('sensor.weather_home_rain_amount_forecast_0d') }} mm
    - Total Precipitation: {{ states('sensor.weather_home_precipitation') }} mm

  Storm Activity:
    - Current Storm:
    {%- if states('sensor.weather_home_storm') == 'True' %} - Storm
    {%- else %} - No Storm
    {%- endif %}

    - Storm Forecast:
    {%- if states('sensor.weather_home_storm_forecast_0d') == 'True' %} - Storm Possible
    {%- else %} - No Storm Expected
    {%- endif %}

  Additional Sensors:   
    - Feels like: {{ states('sensor.street_fl') }} °C   
    - Max UV: {{ states('sensor.openuv_max_uv_index') }}   
    - Current UV {{ states('sensor.openuv_current_uv_index') }}  

  Forecast: {% for forecast in state_attr('weather.weather_home', 'forecast') %}   
    - Date: {{ forecast.datetime }}   
    - Temperature: {{ forecast.temperature }} °C   
    - Conditions: {{ forecast.condition }} {% endfor %}
  • The response is displayed on the card and also shown on the screen (on the wall):

:crescent_moon: Night Weather Picture in xxx:

:thermometer: Temperature:

  • Current: -3°C to -3.6°C
  • Feels like: -9°C
  • Humidity: 80-82%
  • Wind: light, 7-15 km/h
  • Cloudy, no precipitation

:dress: Clothing Recommendations:

For Adults:

  • Warm thermal underwear
  • Wool sweater
  • Down jacket or winter coat
  • Insulated pants
  • Winter boots
  • Hat, scarf, gloves

For a Child:

  • Warm pajamas
  • Wool socks
  • Blanket for extra warmth
  • If they wake up - a warm robe

:warning: Safety:

  • Night, severe frost
  • Check the temperature in the child’s room
  • Insulate the radiators
  • Close the windows
  • Humidifier is on
  • Warm blanket for the child

:wind_face: Environmental Situation:
Air Quality: Excellent
AQI: 24 (low pollution level)
Precipitation Forecast: 0 mm

Have a peaceful winter night! :crescent_moon::snowflake:

Cute and convenient.
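
For anyone who wants to reproduce that card, a minimal markdown card configuration could look like this (assuming the instance is called sensor.ha_text_ai_openrouter - substitute your own entity):

type: markdown
content: >-
  {{ state_attr('sensor.ha_text_ai_openrouter', 'response') }}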

When I have time, I’ll definitely upload a Blueprint based on my example of monitoring dangerous situations and weather phenomena — from wind to storms and air pollution, with phone notifications and other features.

Thanks for your attention :wink:

Getting this error:
API request failed: Max tokens must be between 1 and 4096
Any tips?

alias: AI Clothes
description: AI Based clothes
triggers:
  - trigger: time
    at: "07:00:00"
conditions: []
actions:
  - data:
      instance: sensor.ha_text_ai_my_assistant
      context_messages: 5
      question: >
        Based on current weather conditions, provide a detailed clothing
        recommendation: 
          Temperature: {{ state_attr('weather.redcliffe', 'temperature') }}°C    
          Humidity: {{ state_attr('weather.redcliffe', 'humidity') }}%   
          Wind: {{ state_attr('weather.redcliffe', 'wind_speed') }} km/h, 
          Direction {{ state_attr('weather.redcliffe', 'wind_bearing') }}°   
          Pressure: {{ state_attr('weather.redcliffe', 'pressure_unit') }} mmHg  

        Please suggest:   
          1. Optimal clothing layers   
          2. Recommended accessories   
          3. Tips for staying comfortable outdoors  
      max_tokens: 1000
    action: ha_text_ai.ask_question
  - wait_for_trigger:
      - entity_id: sensor.ha_text_ai_my_assistant
        attribute: response
        trigger: state
  - choose:
      - conditions:
          - condition: template
            value_template: >-
              {{ state_attr('sensor.ha_text_ai_my_assistant', 'response') is
              defined }}
        sequence:
          - data:
              title: 🧥 Today's Clothing Recommendation
              message: "{{ state_attr('sensor.ha_text_ai_my_assistant', 'response') }}"
            action: notify.mobile_app_doms_iphone
      - conditions:
          - condition: template
            value_template: "{{ true }}"
        sequence:
          - data:
              title: ❌ AI Recommendation Error
              message: Unable to generate weather-based recommendation
            action: notify.mobile_app_doms_iphone
mode: single

I’m setting mine up right now and will let you know if I run into it.
Do we need to specify a model based on our service?

model: "gpt-3.5-turbo"

Thank you for using the integration, and I’m really sorry that you’ve encountered an error.

I suspect the problem arises because, with a context of 5 messages included, the total processing exceeds 1000 tokens. I'll think about how to make the error message clearer. You can verify this by enabling debug mode in the integration and checking the logs - 99% of the time the problem is the token limit.
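
If you want to try that, debug logging can usually be switched on with a logger entry like this in configuration.yaml (assuming the component is installed under custom_components/ha_text_ai):

logger:
  default: warning
  logs:
    custom_components.ha_text_ai: debug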

Regarding the model, if the integration is with OpenRouter, for example, you can specify the model directly in the automation. If not specified, it will use the default model from the integration settings that you set when creating and configuring the instance. This also applies to any other service that allows working with various models using a single API key.
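
For example, the relevant part of the action could end up looking like this. The model string is an OpenRouter-style example - use whatever identifier your provider expects:

  - data:
      instance: sensor.ha_text_ai_my_assistant
      context_messages: 2           # fewer context messages keeps the request smaller
      max_tokens: 800               # response length cap for this request
      model: openai/gpt-4o-mini     # example OpenRouter-style identifier
      question: "What should I wear today?"
    action: ha_text_ai.ask_question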

If nothing helps, you can send me the logs in private messages without sensitive information, and I’ll try to find the error and its cause.

That fixed it!