DeepSeek-V3

Category: Open Source LLMs
Price: Free

Description

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
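The total-vs-activated split is what MoE routing produces: every token is scored against all experts, but only the top-k experts actually run, so only a fraction of the parameters are active per token. A minimal sketch of top-k gating (the expert count and k below are illustrative toy values, not DeepSeek-V3's actual configuration):

```python
import random


def top_k_gate(scores, k):
    """Return the indices of the k highest-scoring experts."""
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]


# Toy example: 8 experts, activate 2 per token.
random.seed(0)
scores = [random.random() for _ in range(8)]  # one router score per expert
active = top_k_gate(scores, 2)                # only these experts process the token
print(active)
```

Scaling the same idea up is how a model can hold 671B parameters in total while running only 37B of them for any given token.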

© 2026 AILS Tools Hub