DeepSeek V2

Provider: DeepSeek

A strong, economical Mixture‑of‑Experts (MoE) language model with 236B total parameters, of which 21B are activated per token. Designed for efficient inference, long-context reasoning, and chain-of-thought tasks.

Key Information:

  • Identifier: deepseek-v2-2024-05-07
  • Fine-tunable: Yes
  • Standard Model: No
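
For reference, the sketch below shows one way the identifier above might be passed to an OpenAI-compatible chat-completions client. The endpoint URL and API key are placeholders, and whether this exact identifier is accepted by a given provider's endpoint is an assumption, not something stated on this page.

```python
# Minimal sketch: calling the model via an OpenAI-compatible
# chat-completions endpoint. base_url and api_key are placeholders
# (assumptions); substitute the provider's actual values.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # placeholder endpoint (assumption)
    api_key="YOUR_API_KEY",                 # placeholder credential
)

response = client.chat.completions.create(
    model="deepseek-v2-2024-05-07",  # identifier listed above
    messages=[
        {
            "role": "user",
            "content": "Summarize mixture-of-experts models in two sentences.",
        },
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```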