Breaking: Yang Zhilin Personally Open-Sources Kimi K2.5! A Day of Fierce Competition Among Domestic Large Models

Wallstreetcn
2026.01.27 11:53

Today, Yang Zhilin released Kimi K2.5, an open-source mixture-of-experts (MoE) foundation model with 1 trillion parameters. K2.5 shows significant improvements in visual understanding and programming capability, excelling in particular on benchmarks such as HLE and BrowseComp, where it achieves state-of-the-art results. At the same time, K2.5's operating costs are only a fraction of those of top proprietary models, and it has attracted widespread attention and trial use.