User Guide

Reference documentation for every feature

How to use OikoNotes with Ollama

Ollama is the local AI option for users who want to run chat and embedding models entirely on their own machine.

  1. Install and start Ollama on the machine that will run the desktop app or provider endpoint.
  2. Pull the chat and embedding models you want to use.
  3. Open Settings > AI Provider and choose BYOK mode.
  4. Select Ollama, set the endpoint if it differs from the default local URL, and pick model IDs from those you have pulled.
  5. Run a small note through the process queue first, then decide whether to enable broader AI processing.
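Before pointing the app at Ollama, it can help to confirm the endpoint is reachable and that your pulled models are visible. The sketch below queries Ollama's `/api/tags` endpoint (which lists locally pulled models) at the default local URL, `http://localhost:11434`; adjust the base URL if you changed the endpoint in step 4.

```python
import json
import urllib.error
import urllib.request


def list_ollama_models(base_url="http://localhost:11434"):
    """Return the model names Ollama reports, or None if the endpoint is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None


if __name__ == "__main__":
    models = list_ollama_models()
    if models is None:
        print("Ollama is not reachable; is `ollama serve` running?")
    else:
        print("Available models:", models)
```

If the list comes back empty, the server is running but no models have been pulled yet; if it comes back `None`, check that Ollama is started and the endpoint URL matches your Settings entry.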

Provider capability details live in AI Features.