DeepSeek-V3

Category: Open Source LLMs
Pricing: Free

Description

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
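Because an MoE layer routes each token to only a small subset of experts, roughly 37B / 671B ≈ 5.5% of the weights participate in any single forward pass. The following is a minimal sketch of top-k MoE routing in PyTorch to illustrate the idea; the dimensions, expert count, and top_k value are toy assumptions, not DeepSeek-V3's actual configuration.

```python
# Toy top-k Mixture-of-Experts layer: each token activates only top_k of
# n_experts expert MLPs, so most parameters stay unused per token.
# All sizes below are illustrative, not DeepSeek-V3's real configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    def __init__(self, d_model=64, d_ff=128, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # scores experts per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.router(x)                                # (tokens, n_experts)
        weights, idx = torch.topk(scores, self.top_k, dim=-1)  # keep top_k experts per token
        weights = F.softmax(weights, dim=-1)                   # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e                       # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * self.experts[e](x[mask])
        return out

tokens = torch.randn(4, 64)
layer = ToyMoELayer()
print(layer(tokens).shape)  # torch.Size([4, 64]); each token used only 2 of 8 experts
```

With 8 experts and top_k=2, each token touches only a quarter of the expert parameters; DeepSeek-V3 applies the same principle at far larger scale, which is how 671B total parameters reduce to 37B active per token.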