I took Greg's code and adapted it for Ollama running llama3.1 locally. Here is the script:
```yaml
get_ollama_response:
  alias: Get ollama response
  sequence:
    - data:
        agent_id: conversation.llama3_1
        text: Prepare a {{tone}} notification about "{{subject}}" using no more than
          {{length}} words and not using any emoji.
      response_variable: llama3_1
      alias: ollama Prompt
      enabled: true
      action: conversation.process
    - metadata: {}
      data:
        value: '{{llama3_1.response.speech.plain.speech | trim | replace(''\"'','''')}}'
      target:
        entity_id: input_text.llama3_1_response
      action: input_text.set_value
  fields:
    subject:
      selector:
        text:
      name: subject
      description: What do you want to do?
      required: true
    tone:
      selector:
        text:
      name: tone
      description: Describes the tone you want to add to the Ollama request.
      required: true
    length:
      selector:
        text:
      name: length
      description: How many words do you want to limit the response to?
      default: '15'
      required: true
  icon: mdi:robot-excited
  mode: single
  description: 'Ollama response with llama3.1 data file'
```
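To try it, call the script with the three fields it defines. A minimal sketch (the subject/tone/length values here are just examples, not from the trace):

```yaml
# Hypothetical call, e.g. from Developer Tools > Actions or an automation:
action: script.get_ollama_response
data:
  subject: The washing machine has finished
  tone: cheerful
  length: "20"
```

The script also assumes the `input_text.llama3_1_response` helper exists. An example definition (note the input_text default `max` is 100 characters, so raise it or longer model replies will be rejected):

```yaml
input_text:
  llama3_1_response:
    name: llama3.1 response
    max: 255
```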
Sample outputs, taken from the trace so you can see both the input and the output.
```yaml
subject: What is the averate temperature in melbotne australia in the spring
tone: australian
length: 50
context:
  id: 01JDKJ1ETH2ZQR71QVJTMRP0BE
  parent_id: null
  user_id: c0fba61d419c44a1892abc47bf3065ae
llama3_1:
  response:
    speech:
      plain:
        speech: >-
          "G'day mate! In Melbourne, Australia during Spring (September to
          November), the average temperature ranges from 12 to 22 degrees
          Celsius. So grab your Akubra and enjoy the mild weather, but don't
          forget your jumper for those chilly mornings!"
```
And a second run, this time with a templated tone:
```yaml
length: '15'
subject: what is the average temperature inside
tone: '{{ ("grumpy", "happy", "pissed off") | random }}'
context:
  id: 01JDKH80J5M18DNF7B82Z3AFGB
  parent_id: null
  user_id: c0fba61d419c44a1892abc47bf3065ae
llama3_1:
  response:
    speech:
      plain:
        speech: >-
          You're looking grumpy today: Current indoor temp is 20 degrees
          Celsius.
extra_data: null
```
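The templated tone from the second trace can be passed straight in when you call the script, so each notification gets a random mood. A sketch of an automation action using it (entity-free, just the script call):

```yaml
- action: script.get_ollama_response
  data:
    subject: what is the average temperature inside
    tone: '{{ ("grumpy", "happy", "pissed off") | random }}'
    length: "15"
```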