# Model Configuration

Select and configure AI models for Jarvis.
Jarvis supports both local models (via Ollama) and cloud providers. You can configure your default model and add fallbacks.
## Local Models (Ollama)
Ollama is the recommended way to run AI models locally. It's free, private, and works offline.
### Installing Ollama
```bash
curl -fsSL https://ollama.com/install.sh | sh
```

### Available Models
| Model | Size | Best For | Command |
|---|---|---|---|
| `llama3.2` | 4GB | General purpose | `ollama pull llama3.2` |
| `qwen2.5` | 7GB | Coding tasks | `ollama pull qwen2.5` |
| `mistral` | 4GB | Balanced | `ollama pull mistral` |
| `codellama` | 7GB | Code generation | `ollama pull codellama` |
### Setting Default Model
```bash
jarvis config set agents.defaults.model "ollama/llama3.2"
```

## Cloud Models
Add cloud providers for more powerful models when needed.
### OpenAI Models
- `gpt-4` - Most capable
- `gpt-3.5-turbo` - Fast and cost-effective
### Anthropic Models
- `claude-3-opus` - Most capable
- `claude-3-sonnet` - Balanced
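
With a cloud provider configured, one way to keep local models available as a backstop is a fallback list. This sketch assumes cloud models use the same `provider/model` naming shown for Ollama, and the `fallbacks` key is an assumption for illustration, not a documented field:

```json
{
  "agents": {
    "defaults": {
      "model": "anthropic/claude-3-sonnet",
      "fallbacks": ["ollama/llama3.2"]
    }
  }
}
```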
## Model Configuration
Configure model behavior in your config file:
```json
{
  "agents": {
    "defaults": {
      "model": "ollama/llama3.2",
      "workspace": "~/.jarvis/workspace"
    }
  }
}
```
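
The `model` value above packs the provider and the model name into a single `provider/model` string. A minimal sketch of how such an identifier could be split for routing; the helper name is hypothetical and not part of Jarvis:

```python
def parse_model_id(model_id: str) -> tuple[str, str]:
    """Split a provider-prefixed identifier like 'ollama/llama3.2'
    into (provider, model). Assumes a single 'provider/model' form."""
    provider, _, model = model_id.partition("/")
    if not model:
        raise ValueError(f"expected 'provider/model', got {model_id!r}")
    return provider, model

print(parse_model_id("ollama/llama3.2"))  # → ('ollama', 'llama3.2')
```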