Meituan's trillion-parameter large model opens for testing, with the entire training process completed by domestic computing clusters

Mars Finance News, April 24 — Meituan's new-generation foundational large model, LongCat-2.0-Preview, has opened for testing. The model has a total parameter count exceeding one trillion, and its entire training process was completed on domestically produced computing clusters. LongCat-2.0-Preview reportedly supports a 1-million-token context window, allowing it to process millions of words of input in a single inference, a scale comparable to the newly released GPT-5.5. The new LongCat model has also been deeply optimized for agent application scenarios, adapting effectively to production uses such as code generation, complex task planning, and enterprise automation.
