LLM automations

Hi All,

Would there be a way to integrate LLM assistant queries into automations? For example, asking the assistant each morning for the latest world news or the travel time to work, with the response sent at a fixed time as a notification or something like that.
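To make the idea concrete, here is a rough sketch of what such an automation could look like in Home Assistant YAML, assuming a conversation agent is set up. The `agent_id` and the `notify.mobile_app_phone` service are placeholders for whatever exists in a given setup, and the exact shape of the response variable may differ between versions:

```yaml
# Hypothetical sketch; agent_id and the notify service are placeholders.
automation:
  - alias: "Morning LLM briefing"
    trigger:
      - platform: time
        at: "07:00:00"
    action:
      - service: conversation.process
        data:
          agent_id: conversation.my_llm_agent  # placeholder agent
          text: "What is the latest world news, and how long is my commute today?"
        response_variable: briefing
      - service: notify.mobile_app_phone  # placeholder notify target
        data:
          title: "Morning briefing"
          message: "{{ briefing.response.speech.plain.speech }}"
```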

These examples would assume the LLM has RAG support for live data like news or traffic, but there could also be examples built on completely local Ollama setups.
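For the fully local case, a minimal sketch of querying Ollama's local REST endpoint (`/api/generate`) from a script that an automation could call. The model name is an assumption; it would need to be pulled first (e.g. `ollama pull llama3`):

```python
import json
import urllib.request

# Ollama's default local endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks Ollama to return one complete JSON object
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    # POSTs the prompt to a locally running Ollama server and
    # returns the generated text from the JSON response.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage (requires a running Ollama server): `ask_ollama("llama3", "Summarize today's world news in two sentences.")`, with the result handed to whatever notification mechanism the automation uses.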