Open Models Configuration

RA.Aid supports a variety of open source and compatible model providers. This guide covers configuration options and best practices for using different models with RA.Aid.

Overview

RA.Aid supports these model providers:

Provider | Description | Key Features
DeepSeek | Chinese AI lab (backed by a hedge fund) that creates sophisticated LLMs | Strong open models such as R1
OpenRouter | Multi-model gateway service | Access to 100+ models, unified API interface, pay-per-token
OpenAI-compatible | Self-hosted model endpoints | Compatible with Llama, Mistral, and other open models
Anthropic | Claude model series | 200k token context, strong tool use, JSON/XML parsing
Gemini | Google's multimodal models | Code generation in 20+ languages, parallel request support

Provider Configuration

DeepSeek Models

DeepSeek offers powerful reasoning models optimized for complex tasks.

# Environment setup
export DEEPSEEK_API_KEY=your_api_key_here

# Basic usage
ra-aid -m "Your task" --provider deepseek --model deepseek-reasoner

# With temperature control
ra-aid -m "Your task" --provider deepseek --model deepseek-reasoner --temperature 0.7

Available Models:

  • deepseek-reasoner: Optimized for reasoning tasks
  • Access via OpenRouter: deepseek/deepseek-r1 (see the example below)
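
If you prefer to reach DeepSeek R1 through OpenRouter rather than the direct API, the same pattern applies with the OpenRouter key and model id (a minimal sketch; adjust the model id if OpenRouter renames it):

# DeepSeek R1 via OpenRouter
export OPENROUTER_API_KEY=your_api_key_here
ra-aid -m "Your task" --provider openrouter --model deepseek/deepseek-r1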

Advanced Configuration

Expert Tool Configuration

Configure the expert model for specialized tasks; this usually benefits from a more powerful, slower reasoning model:

# DeepSeek expert
export EXPERT_DEEPSEEK_API_KEY=your_key
ra-aid -m "Your task" --expert-provider deepseek --expert-model deepseek-reasoner

# OpenRouter expert
export EXPERT_OPENROUTER_API_KEY=your_key
ra-aid -m "Your task" --expert-provider openrouter --expert-model mistralai/mistral-large-2411

# Gemini expert
export EXPERT_GEMINI_API_KEY=your_key
ra-aid -m "Your task" --expert-provider gemini --expert-model gemini-2.0-flash-thinking-exp-1219

Best Practices

  • Set environment variables in your shell configuration file (see the sketch after this list)
  • Use lower temperatures (0.1-0.3) for coding tasks
  • Test different models to find the best fit for your use case
  • Consider using expert mode for complex programming tasks
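
For the first point, a minimal sketch of persisting keys in a shell configuration file (assumes bash; adapt the file name for zsh or other shells):

# Persist provider keys so every new shell session has them
echo 'export DEEPSEEK_API_KEY=your_api_key_here' >> ~/.bashrc
echo 'export OPENROUTER_API_KEY=your_api_key_here' >> ~/.bashrc

# Apply to the current shell
source ~/.bashrc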

Environment Variables

Complete list of supported environment variables:

Variable | Provider | Purpose
OPENROUTER_API_KEY | OpenRouter | Main API access
DEEPSEEK_API_KEY | DeepSeek | Main API access
OPENAI_API_KEY | OpenAI-compatible | API access
OPENAI_API_BASE | OpenAI-compatible | Custom endpoint URL (see example below)
ANTHROPIC_API_KEY | Anthropic | API access
GEMINI_API_KEY | Gemini | API access
EXPERT_OPENROUTER_API_KEY | OpenRouter | Expert tool
EXPERT_DEEPSEEK_API_KEY | DeepSeek | Expert tool
EXPERT_GEMINI_API_KEY | Gemini | Expert tool
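
The OPENAI_API_KEY and OPENAI_API_BASE variables configure self-hosted, OpenAI-compatible endpoints. The sketch below is illustrative: the endpoint URL and model name are placeholders, and the provider value openai-compatible is an assumption, so confirm the exact name with ra-aid --help.

# Self-hosted OpenAI-compatible endpoint (values are placeholders)
export OPENAI_API_KEY=your_api_key_here
export OPENAI_API_BASE=http://localhost:8000/v1

ra-aid -m "Your task" --provider openai-compatible --model your-model-name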

Troubleshooting

  • Verify API keys are set correctly (quick checks are sketched after this list)
  • Check endpoint URLs for OpenAI-compatible setups
  • Monitor API rate limits and quotas
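
For the first two points, a couple of quick checks (assuming a POSIX shell; the /v1/models route is a common but not guaranteed feature of OpenAI-compatible servers):

# Prints "key is set" only if the variable is non-empty in this shell
echo "${DEEPSEEK_API_KEY:+key is set}"

# Probe an OpenAI-compatible endpoint; a JSON model list indicates the URL and key work
curl -s -H "Authorization: Bearer $OPENAI_API_KEY" "$OPENAI_API_BASE/models"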