Mistral AI (@MistralAI) Releases Mistral Small 4: 119B Parameter MoE Model


Mistral AI (@MistralAI) has launched its new Mistral Small 4 model. The model adopts a Mixture of Experts (MoE) architecture with 119 billion total parameters, aiming to deliver strong performance with greater efficiency.
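The announcement does not detail the model's internals, but the efficiency claim follows from how MoE layers work in general: a router activates only a few experts per token, so far fewer than the full 119 billion parameters are used on any single forward pass. Below is a minimal, illustrative sketch of top-k expert routing; the expert count, top-k value, and dimensions are assumptions for illustration and are not Mistral's actual configuration.

```python
import numpy as np

# Toy Mixture-of-Experts layer (illustrative only; not Mistral's architecture).
# Hypothetical sizes: 8 experts, top-2 routing, small dimensions.
rng = np.random.default_rng(0)
d_model, d_hidden, n_experts, top_k = 16, 32, 8, 2

# Each expert is a tiny two-layer MLP; only the routed experts run per token.
experts = [
    (rng.standard_normal((d_model, d_hidden)) * 0.02,
     rng.standard_normal((d_hidden, d_model)) * 0.02)
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts)) * 0.02  # gating weights


def moe_layer(x):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router                                # (tokens, n_experts)
    top_idx = np.argsort(logits, axis=-1)[:, -top_k:]  # chosen experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = logits[t, top_idx[t]]
        gates = np.exp(chosen - chosen.max())
        gates /= gates.sum()                           # softmax over chosen experts
        for gate, e in zip(gates, top_idx[t]):
            w1, w2 = experts[e]
            out[t] += gate * (np.maximum(x[t] @ w1, 0.0) @ w2)  # ReLU MLP expert
    return out


tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # (4, 16); only 2 of 8 experts ran per token
```

The key point the sketch shows is that compute per token scales with the number of active experts (here 2 of 8), not with the total parameter count, which is how a large MoE model can keep inference cost closer to that of a much smaller dense model.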