Powerful AI language model (MoE) optimized for efficiency and performance
Open-source, high-performance AI model with advanced reasoning
Contexts Optical Compression
Mixture-of-Experts Vision-Language Models for Advanced Multimodal Understanding
Towards Real-World Vision-Language Understanding
Strong, Economical, and Efficient Mixture-of-Experts Language Model
DeepSeek Coder: Let the Code Write Itself
Pushing the Limits of Mathematical Reasoning in Open Language Models
DeepSeek-Coder-V2: Breaking the Barrier of Closed-Source Models in Code Intelligence
An experimental version of DeepSeek model
Visual Causal Flow
Advancing Formal Mathematical Reasoning via Reinforcement Learning
Access and use all DeepSeek AI models in one program.
Run models like Kimi-K2.5, GLM-5, DeepSeek, gpt-oss, Gemma, Qwen, etc.
From Vibe Coding to Agentic Engineering
DeepSeek LLM: Let there be answers
Prompt, run, edit, & deploy full-stack web applications using any LLM
AI-powered research assistant that performs iterative, deep research
Analyze computation-communication overlap in V3/R1
A high-performance distributed file system
Unleash Next-Level AI
A bidirectional pipeline parallelism algorithm
Alibaba's high-performance LLM inference engine for diverse apps
Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models
Featuring powerful AI capabilities and supporting e-book formats