
January 28, 2025

Alternative DeepSeek V3 providers

DeepSeek’s API has been experiencing significant reliability issues for the past 24-48+ hours, with many users reporting downtime and overload problems. Their status page notes an ongoing incident.

If you’re affected by these issues, several alternative providers offer access to DeepSeek V3. This article compares their performance on aider’s polyglot benchmark to help you choose a reliable alternative.

Providers

OpenRouter

OpenRouter offers many DeepSeek providers through their unified API. You can use aider with OpenRouter like this:

# Set your API key using environment variables
export OPENROUTER_API_KEY=<your-key>
aider --model openrouter/deepseek/deepseek-chat
# Or use the --api-key command line option
aider --model openrouter/deepseek/deepseek-chat --api-key openrouter=<your-key>
# Or add it to .aider.conf.yml in your home directory or project root:
api-key:
  - openrouter=<your-key>

OpenRouter automatically monitors their providers and routes requests to stable APIs and away from those experiencing unreliable performance.

But not all providers serve the same version of open source models, and not all have the same privacy guarantees. You can control which OpenRouter providers are used to serve the model via aider’s model settings. Create a .aider.model.settings.yml file in your home directory or git project root with settings like this:

- name: openrouter/deepseek/deepseek-chat
  extra_params:
    extra_body:
      provider:
        # Only use these providers, in this order
        order: ["Novita"]
        # Don't fall back to other providers
        allow_fallbacks: false

See OpenRouter’s provider routing docs for more details.
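For reference, the same provider pinning can be expressed directly against OpenRouter's OpenAI-compatible chat-completions API via the `provider` field in the request body. A minimal sketch that only builds the JSON payload (the HTTP request itself is omitted):

```python
import json

# Build an OpenRouter chat-completions payload that pins the provider,
# mirroring the aider model settings above.
payload = {
    "model": "deepseek/deepseek-chat",
    "messages": [{"role": "user", "content": "Hello"}],
    "provider": {
        "order": ["Novita"],       # only use these providers, in this order
        "allow_fallbacks": False,  # don't fall back to other providers
    },
}

body = json.dumps(payload)
# POST `body` to https://openrouter.ai/api/v1/chat/completions with your
# OPENROUTER_API_KEY as a Bearer token (the request itself is omitted here).
```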

Fireworks

You can use aider with Fireworks like this:

# Set your API key using environment variables
export FIREWORKS_API_KEY=<your-key>
aider --model fireworks_ai/accounts/fireworks/models/deepseek-chat
# Or use the --api-key command line option
aider --model fireworks_ai/accounts/fireworks/models/deepseek-chat --api-key fireworks=<your-key>
# Or add it to .aider.conf.yml in your home directory or project root:
api-key:
  - fireworks=<your-key>

Create a .aider.model.settings.yml file in your home directory or git project root with settings like this:

- name: fireworks_ai/accounts/fireworks/models/deepseek-chat
  edit_format: diff
  weak_model_name: null
  use_repo_map: true
  send_undo_reply: false
  lazy: false
  reminder: sys
  examples_as_sys_msg: true
  extra_params:
    max_tokens: 8192
  cache_control: false
  caches_by_default: true
  use_system_prompt: true
  use_temperature: true
  streaming: true

Hyperbolic

You can use Hyperbolic’s API as an OpenAI-compatible provider:

# Set your API key using environment variables
export OPENAI_API_BASE=https://api.hyperbolic.xyz/v1/
export OPENAI_API_KEY=<your-key>
aider --model openai/deepseek-ai/DeepSeek-V3
# Or use the --api-key command line option
aider --model openai/deepseek-ai/DeepSeek-V3 --api-key openai=<your-key>
# Or add it to .aider.conf.yml in your home directory or project root:
api-key:
  - openai=<your-key>
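Here "OpenAI-compatible" means the provider exposes the standard OpenAI REST paths under its own base URL, which is why pointing `OPENAI_API_BASE` at Hyperbolic works. A small stdlib sketch of the resulting request shape (illustrative only; no request is sent):

```python
import urllib.request

# Mirrors the environment variable set above, hardcoded for the sketch.
api_base = "https://api.hyperbolic.xyz/v1/"

# OpenAI-compatible servers expose the standard chat-completions path
# under the provider's own base URL.
url = api_base.rstrip("/") + "/chat/completions"

req = urllib.request.Request(
    url,
    headers={"Authorization": "Bearer <your-key>"},
    method="POST",
)
# urllib.request.urlopen(req, data=json_payload) would send it; omitted here.
```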

Create a .aider.model.settings.yml file in your home directory or git project root with settings like this:

- name: openai/deepseek-ai/DeepSeek-V3
  edit_format: diff
  weak_model_name: null
  use_repo_map: true
  send_undo_reply: false
  lazy: false
  reminder: sys
  examples_as_sys_msg: true
  cache_control: false
  caches_by_default: true
  use_system_prompt: true
  use_temperature: true
  streaming: true
  editor_model_name: null
  editor_edit_format: null
  extra_params:
    max_tokens: 65536

Ollama

You can run DeepSeek V3 via Ollama.

# Pull the model
ollama pull deepseek-v3
# Start your ollama server
ollama serve
# In another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx OLLAMA_API_BASE http://127.0.0.1:11434 # Windows, restart shell after setx
aider --model ollama/deepseek-v3

It’s important to provide model settings, especially the num_ctx parameter to set the context window. Ollama uses a 2k context window by default, which is very small for working with aider. Larger context windows let you work with more code, but use more memory and increase latency.

Unlike most other LLM servers, Ollama does not throw an error if you submit a request that exceeds the context window. Instead, it just silently truncates the request by discarding the "oldest" messages in the chat to make it fit within the context window.

So if your context window is too small, you won’t get an explicit error. The biggest symptom is that aider reports it can’t see (some of) the files you added to the chat, because Ollama has silently discarded them to fit the context window.
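Because Ollama won’t warn you, a rough pre-check of your token budget can help. A minimal sketch, assuming the common ~4-characters-per-token rule of thumb (the file contents and num_ctx value here are illustrative):

```python
def rough_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English/code."""
    return len(text) // 4

num_ctx = 8192  # the context window configured in the model settings

# Hypothetical files added to the chat
chat_files = {
    "app.py": "x = 1\n" * 4000,    # 24,000 chars, ~6,000 tokens
    "utils.py": "y = 2\n" * 3000,  # 18,000 chars, ~4,500 tokens
}

total = sum(rough_tokens(src) for src in chat_files.values())
if total > num_ctx:
    print(f"~{total} tokens won't fit in num_ctx={num_ctx}; "
          "Ollama would silently drop the oldest messages")
```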

Create a .aider.model.settings.yml file in your home directory or git project root with settings like this:

- name: ollama/deepseek-v3
  edit_format: diff
  weak_model_name: null
  use_repo_map: true
  send_undo_reply: false
  lazy: false
  reminder: sys
  examples_as_sys_msg: true
  cache_control: false
  caches_by_default: true
  use_system_prompt: true
  use_temperature: true
  streaming: true
  extra_params:
    num_ctx: 8192 # How large a context window?

Other providers

You will need to configure aider properly to work with DeepSeek V3 when it is served by other providers:

  • Determine the --model name to use.
  • Provide your API key to aider.
  • Add model settings to .aider.model.settings.yml.

Adapt the .aider.model.settings.yml shown above for Fireworks. You will need to change the name field to match your chosen provider’s model naming scheme.
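For example, a minimal settings entry for a hypothetical OpenAI-compatible provider might look like this (the model name is a placeholder; start from the full Fireworks entry above and adjust):

```yaml
- name: openai/<provider-model-name>  # placeholder: your provider's model id
  edit_format: diff
  use_repo_map: true
  examples_as_sys_msg: true
  extra_params:
    max_tokens: 8192  # adjust to your provider's limit
```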

See Advanced model settings for details about all aider model settings.

Results

| Model | Percent completed correctly | Percent using correct edit format | Command | Edit format |
|---|---|---|---|---|
| Hyperbolic | 48.4% | 97.3% | OPENAI_API_BASE=https://api.hyperbolic.xyz/v1/ aider --model openai/deepseek-ai/DeepSeek-V3 | diff |
| Fireworks | 48.4% | 96.9% | aider --model fireworks_ai/accounts/fireworks/models/deepseek-v3 | diff |
| DeepSeek | 48.4% | 98.7% | aider --model deepseek/deepseek-chat | diff |
| OpenRouter: DeepInfra | 48.0% | 99.5% | aider --model openrouter/deepseek/deepseek-chat | diff |
| OpenRouter: Novita | 42.7% | 84.0% | aider --model openrouter/deepseek/deepseek-chat | diff |
