
🤖 AI Assistant Integration

Process your speech-recognition results with models deployed in Ollama, and customize prompt templates to handle a variety of tasks.

Last Updated: 2026-04-21 · Language: English

1. Configure AI Service

Local Ollama (Recommended, Fully Offline)

  1. Visit ollama.com/download to download and install Ollama.
  2. Verify the service address in Owl Meeting's AI settings (default is http://localhost:11434/api).
  3. Click "Test" to confirm the connection is normal.
  4. Search for and pull the required model (e.g., qwen3:4b) in the Ollama model library.
[Figure: AI service configuration interface]
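The connection check in step 2–3 can be sketched as a small script against the Ollama REST API. The endpoint paths (`/api/tags`, `/api/generate`) are the public Ollama API; the helper names and the prompt text below are illustrative assumptions, not part of Owl Meeting itself:

```python
import json
from urllib import request, error

# Default service address shown in the AI settings.
OLLAMA_API = "http://localhost:11434/api"

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def test_connection(base_url: str = OLLAMA_API, timeout: float = 3.0) -> bool:
    """Rough equivalent of the "Test" button: list installed models via GET /api/tags."""
    try:
        with request.urlopen(f"{base_url}/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (error.URLError, OSError):
        return False

if __name__ == "__main__":
    payload = build_generate_payload("qwen3:4b", "Summarize this meeting transcript: ...")
    print(json.dumps(payload))
    print("Ollama reachable:", test_connection())
```

If `test_connection()` prints `False`, confirm that the Ollama service is running and that the address in the AI settings matches.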

2. Task Types

3. Input Modes

4. Real-time AI vs Offline AI

5. Custom Model Parameters

Built-in configuration for common LLM parameters (e.g., temperature, top_p). Advanced users can enable the model parameters panel for fine-tuning or pass custom parameters in JSON format.

6. FAQ

7. Recommendations