Mixtral

Category: Open Source LLMs
Price: Free

Description

Mixtral 8x7B is a high-quality sparse mixture-of-experts (SMoE) model with open weights. Mixtral outperforms Llama 2 70B on most benchmarks with 6x faster inference, and it matches or outperforms GPT-3.5 on most standard benchmarks.

Paper: https://arxiv.org/pdf/2401.04088.pdf
News: https://mistral.ai/news/mixtral-of-experts/
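Because the weights are open, the model can be run locally. Below is a minimal sketch using the Hugging Face `transformers` library; the hub ID `mistralai/Mixtral-8x7B-Instruct-v0.1` and the generation settings are assumptions for illustration, not part of this listing.

```python
# Minimal sketch: load Mixtral 8x7B and generate text with Hugging Face
# transformers. The model ID and settings below are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halve memory versus fp32
    device_map="auto",           # shard across available GPUs
)

prompt = "Explain what a sparse mixture-of-experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the full bfloat16 weights need roughly 90 GB of GPU memory; quantized variants (e.g. 4-bit via `bitsandbytes`) are a common workaround on smaller hardware.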
