CoinWorld News — Meituan's new-generation foundational large model, longcat-2.0-preview, has been opened for testing. The model's total parameter count exceeds one trillion, placing it among the world's top large models. An insider revealed that the new-generation V4 model released by DeepSeek on the same day has total and active parameter counts roughly comparable to Meituan's longcat-2.0-preview. The more significant breakthrough of Meituan's model is that both its training and inference rely entirely on domestic computing-power clusters. Reportedly, Meituan used between 50,000 and 60,000 compute cards during the training phase, making this the largest-scale large-model training task completed on domestic computing power to date.
