YuvaDev Docs

Providers

Configure model providers

YuvaDev supports local and cloud providers so you can balance privacy, speed, and capability.

Ollama (Local)

Best for offline work and sensitive repositories. Runs fully on your machine.

OpenAI

Great for complex reasoning tasks and broad coding support.

Anthropic

Strong long-context planning for larger, multi-file change sets.

DeepSeek

Fast coding output with cost-efficient inference for iterative workflows.

Local Ollama setup
```shell
# Start Ollama locally
ollama serve

# Pull recommended coding model
ollama pull qwen2.5-coder:14b

# In YuvaDev settings
Provider: Ollama
Endpoint: http://localhost:11434
```
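To confirm the local endpoint works before wiring it into YuvaDev, you can send a request to Ollama's `/api/generate` endpoint directly. The sketch below only builds the request payload; the prompt text is a placeholder, and sending it (commented out) assumes `ollama serve` is running on the default port.

```python
import json

# Sketch of the JSON payload Ollama's /api/generate endpoint expects,
# using the qwen2.5-coder:14b model pulled above. The prompt is a
# placeholder for illustration.
payload = {
    "model": "qwen2.5-coder:14b",
    "prompt": "Write a function that reverses a string.",
    "stream": False,  # ask for one complete response instead of a token stream
}

body = json.dumps(payload)

# To actually send it (requires a running Ollama server):
# import requests
# resp = requests.post("http://localhost:11434/api/generate", data=body)
```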
Cloud provider environment variables
```shell
OPENAI_API_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
DEEPSEEK_API_KEY=your_deepseek_key
```
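As a hypothetical sketch of how a client might choose a provider from these variables (the `pick_provider` function and its fallback order are illustrative, not YuvaDev's actual logic):

```python
import os

def pick_provider(env=os.environ):
    """Return the first cloud provider whose API key is set.

    Illustrative only: the variable names match the env vars above,
    but the selection order and Ollama fallback are assumptions.
    """
    for key, provider in [
        ("OPENAI_API_KEY", "OpenAI"),
        ("ANTHROPIC_API_KEY", "Anthropic"),
        ("DEEPSEEK_API_KEY", "DeepSeek"),
    ]:
        if env.get(key):
            return provider
    # With no cloud key configured, fall back to the local provider
    return "Ollama"
```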