Ollama

Use Local LLMs in your Flows


Flow cards

Then...

- Generate response using model [Model] and prompt [Prompt] (Advanced)
- Generate response using model [Model] and prompt [Prompt] with image ... (Advanced)
- Set system prompt to [System prompt]
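Under the hood, these cards map onto Ollama's HTTP generate endpoint. A minimal Python sketch of the equivalent request, assuming a local Ollama server on its default port 11434 (the model name `llama3` is illustrative, not an app setting):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(model, prompt, system=None, images=None):
    """Build the JSON payload for Ollama's /api/generate endpoint.

    `system` mirrors the "Set system prompt" card; `images` (a list of
    base64-encoded strings) mirrors the "... with image" card.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    if system:
        payload["system"] = system
    if images:
        payload["images"] = images
    return payload

def generate(model, prompt, **kwargs):
    """POST the payload and return the model's response text."""
    data = json.dumps(build_request(model, prompt, **kwargs)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

For example, `generate("llama3", "Why is the sky blue?")` returns the model's full answer as a single string (`"stream": False` disables token-by-token streaming).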

Support

Having an issue with this app? Contact the developer here.

What’s new

Version 1.0.9 — Improved autocomplete listeners

