Topic: MoE Architecture

A curated collection of WindFlash AI Daily Report items tagged “MoE Architecture” (bilingual summaries with evidence quotes).

We have curated 18 representative technical articles from the Meituan technology team in 2025, covering major directions such as large-model open-sourcing, R&D practices, and product services. This year, our LongCat team reached significant milestones in the AI open-source ecosystem by releasing a comprehensive suite of models, including LongCat-Flash-Chat, which uses a 560B-parameter MoE architecture to optimize computational efficiency. We also introduced specialized tools such as LongCat-Video for world-model exploration and LongCat-Flash-Omni for real-time multi-modal interaction. These contributions address industry pain points such as high inference latency and the difficulty of balancing performance with lightweight deployment. For the developer community, we provide these high-performance, easy-to-adopt open-source resources to foster innovation and collective growth in the AI era.
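The efficiency claim above rests on the core property of sparse MoE layers: each token activates only a few experts, so per-token compute scales with the number of experts selected, not the total parameter count. As a rough illustration (not LongCat-Flash's actual implementation; the function name, shapes, and top-k routing scheme here are generic assumptions), a minimal top-k routed MoE forward pass might look like:

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Minimal sparse-MoE sketch: route each token to its top-k experts.

    x:         (tokens, d_model) input activations
    gate_w:    (d_model, n_experts) router weights
    expert_ws: list of (d_model, d_model) per-expert weights
    Only top_k experts run per token, so compute grows with top_k,
    not with the total expert count -- the source of MoE efficiency.
    """
    logits = x @ gate_w                              # (tokens, n_experts)
    top = np.argsort(logits, axis=1)[:, -top_k:]     # top-k expert indices
    # Softmax over only the selected experts' logits.
    sel = np.take_along_axis(logits, top, axis=1)
    w = np.exp(sel - sel.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                      # combine expert outputs
        for k in range(top_k):
            e = top[t, k]
            out[t] += w[t, k] * (x[t] @ expert_ws[e])
    return out

# Toy sizes, chosen only for illustration.
rng = np.random.default_rng(0)
tokens, d_model, n_experts = 4, 8, 16
x = rng.normal(size=(tokens, d_model))
gate_w = rng.normal(size=(d_model, n_experts))
expert_ws = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
y = moe_forward(x, gate_w, expert_ws)
print(y.shape)  # (4, 8)
```

With 16 experts and top-2 routing, only 2 of the 16 expert matrices multiply each token, which is how a 560B-parameter model can keep inference cost closer to that of a much smaller dense model.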

Meituan Technology Team · Dec 29, 12:00 AM