
What is UE8M0 FP8, the format that has ignited domestic compute chips?

As the parameter scale of deep learning models grows, so does the demand for efficient compute and storage. Reducing the bit width of data types is an effective approach, but preserving accuracy is the challenge. The Microscaling (MX) format, natively supported by NVIDIA's Blackwell GPUs, improves GPU efficiency in these low-bit-width regimes. DeepSeek V3.1 uses a UE8M0 FP8 scale, which drove a short-term rally in domestic-chip concept stocks, and several domestic GPUs/NPUs claim FP8/MX support, reinforcing the software-hardware co-design narrative. OCP released the Microscaling v1.0 specification in 2023, and in 2025 NVIDIA adopted MXFP8 as a native data type to improve training efficiency.
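To make the UE8M0 scale concrete: in the MX layout, a small block of elements shares one 8-bit scale that has only exponent bits (unsigned, 8 exponent bits, 0 mantissa bits), so the scale is always a power of two. The sketch below shows this block scaling in NumPy. The block size of 32, the E4M3 element format, and the bias of 127 follow the usual OCP Microscaling convention but are assumptions here rather than details taken from the text above, and the helper names are hypothetical.

```python
import numpy as np

# Minimal sketch of MX-style block quantization with a UE8M0 (E8M0) scale.
# Assumptions: block size 32, FP8 E4M3 elements (max magnitude 448), bias 127.
BLOCK = 32
E8M0_BIAS = 127
E4M3_MAX = 448.0      # largest representable E4M3 magnitude
E4M3_EMAX = 8         # floor(log2(448)) = 8

def encode_ue8m0_scale(block: np.ndarray) -> int:
    """Pick a power-of-two shared scale for one block, returned as a biased E8M0 byte."""
    amax = float(np.max(np.abs(block)))
    if amax == 0.0:
        return E8M0_BIAS                       # scale = 2^0 for an all-zero block
    # Shift the block's largest exponent down to the element format's emax,
    # so the biggest value still fits in E4M3 after dividing by the scale.
    shared_exp = int(np.floor(np.log2(amax))) - E4M3_EMAX
    shared_exp = max(-E8M0_BIAS, min(E8M0_BIAS, shared_exp))
    return shared_exp + E8M0_BIAS              # 8-bit biased exponent, no mantissa bits

def decode_ue8m0_scale(byte: int) -> float:
    """E8M0 decodes to a pure power of two: 2^(byte - 127)."""
    return 2.0 ** (byte - E8M0_BIAS)

# Usage: quantize one block of 32 activations.
rng = np.random.default_rng(0)
block = rng.normal(scale=3.0, size=BLOCK).astype(np.float32)
scale_byte = encode_ue8m0_scale(block)
scale = decode_ue8m0_scale(scale_byte)
scaled = np.clip(block / scale, -E4M3_MAX, E4M3_MAX)   # values then cast to FP8 E4M3
print(scale_byte, scale, np.max(np.abs(scaled)))
```

Because the scale carries no mantissa bits, decoding it is just an exponent shift, which is what makes UE8M0 cheap for hardware to apply per block.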

