
Yang Zhilin has just personally open-sourced Kimi K2.5, on a day of fierce competition among China's large-model makers

Today, Yang Zhilin released Kimi K2.5, an open-source MoE foundation model with 1 trillion parameters. K2.5 shows significant improvements in visual understanding and programming, achieving current state-of-the-art results on benchmarks such as HLE and BrowseComp. Its operating cost is only a fraction of that of top proprietary models, and it has attracted widespread attention and trial use.

