
DeepSeek V3

Provider: DeepSeek

A highly efficient Mixture-of-Experts (MoE) model with 671B total parameters (37B activated per token), using Multi-head Latent Attention (MLA) for fast inference and supporting a 128K-token context window. It offers advanced chain-of-thought support and retains API compatibility with DeepSeek V2.

Key Information:

  • Identifier: deepseek-v3-2024-12-26
  • Fine-tunable: Yes
  • Standard Model: No
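
Since the model is described as API-compatible with V2, a minimal usage sketch with an OpenAI-style chat-completions client is shown below. The base URL, the environment variable name, and whether the identifier above is accepted verbatim by the endpoint are assumptions for illustration; consult the provider documentation for the exact values.

```python
# Minimal sketch: calling DeepSeek V3 through an OpenAI-compatible client.
# The base URL and env var name are assumptions, not confirmed by this page.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # provider-issued key (assumed env var name)
    base_url="https://api.deepseek.com",     # assumed OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-v3-2024-12-26",          # identifier listed above
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the advantages of MoE models."},
    ],
    temperature=0.7,
)

print(response.choices[0].message.content)
```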