How to use OikoNotes with Ollama
Ollama is the local AI option for users who want to run chat and embedding models entirely on their own machine.
- Install and start Ollama on the machine that will run the desktop app or provider endpoint.
- Pull the chat and embedding models you want to use.
- Open Settings > AI Provider and choose BYOK (bring-your-own-key) mode.
- Select Ollama, set the endpoint if it differs from Ollama's default local URL (http://localhost:11434), and choose from the available model IDs.
- Run a small note through the process queue first, then decide whether to enable broader AI processing.
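The endpoint and model settings above map onto Ollama's local HTTP API. As a minimal sketch of what a client sends, the snippet below builds request bodies for Ollama's `/api/chat` and `/api/embeddings` endpoints; the model names are placeholders, and the exact payloads OikoNotes sends are an assumption, not confirmed by this guide.

```python
import json

# Ollama's default local endpoint; change this if you configured a custom one
OLLAMA_URL = "http://localhost:11434"

def chat_payload(model: str, user_text: str) -> dict:
    """Build the JSON body for a non-streaming POST to /api/chat."""
    return {
        "model": model,  # placeholder: use a chat model ID you have pulled
        "messages": [{"role": "user", "content": user_text}],
        "stream": False,
    }

def embed_payload(model: str, text: str) -> dict:
    """Build the JSON body for a POST to /api/embeddings."""
    return {
        "model": model,  # placeholder: use an embedding-capable model ID
        "prompt": text,
    }

# Example: the body a client would POST to OLLAMA_URL + "/api/chat"
print(json.dumps(chat_payload("llama3.1", "Summarize this note."), indent=2))
```

Keeping `stream` set to `False` returns one complete JSON response per request, which is the simpler shape to start with when testing a single small note.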
Provider capability details live in AI Features.