Model Configuration

Select and configure AI models for Jarvis

Jarvis supports both local models (via Ollama) and cloud providers. You can configure your default model and add fallbacks.

Local Models (Ollama)

Ollama is the recommended way to run AI models locally. It's free, private, and works offline.

Installing Ollama

curl -fsSL https://ollama.com/install.sh | sh

Available Models

Model       Size   Best For           Command
llama3.2    4GB    General purpose    ollama pull llama3.2
qwen2.5     7GB    Coding tasks       ollama pull qwen2.5
mistral     4GB    Balanced           ollama pull mistral
codellama   7GB    Code generation    ollama pull codellama

Setting Default Model

jarvis config set agents.defaults.model "ollama/llama3.2"

Cloud Models

Add cloud providers for more powerful models when needed.

OpenAI Models

  • gpt-4 - Most capable
  • gpt-3.5-turbo - Fast and cost-effective

Anthropic Models

  • claude-3-opus - Most capable
  • claude-3-sonnet - Balanced
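
Using a Cloud Model

A cloud model can be set as the default in the same way as a local one. The snippet below is a sketch that assumes cloud models use the same provider-prefixed identifier convention shown for Ollama ("provider/model"), and that your API key for the chosen provider is already configured:

```
{
  "agents": {
    "defaults": {
      "model": "anthropic/claude-3-sonnet",
      "workspace": "~/.jarvis/workspace"
    }
  }
}
```

Equivalently, the same value could be set from the command line with jarvis config set, as shown for the Ollama default above.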

Configuration File

Configure model behavior in your config file:

{
  "agents": {
    "defaults": {
      "model": "ollama/llama3.2",
      "workspace": "~/.jarvis/workspace"
    }
  }
}
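
Fallbacks are mentioned above but not shown in the example. One possible shape is sketched below; the "fallbacks" key name and its list-of-identifiers format are assumptions for illustration, so check your Jarvis version's configuration reference for the actual field:

```
{
  "agents": {
    "defaults": {
      "model": "ollama/llama3.2",
      "fallbacks": ["ollama/mistral", "gpt-3.5-turbo"],
      "workspace": "~/.jarvis/workspace"
    }
  }
}
```

With a configuration like this, Jarvis would try the default model first and fall through the list in order if a model is unavailable.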

© 2026 Nilkanth Desai. MIT licensed.