What does NVIDIA's inference context memory storage mean for NAND?

Wallstreetcn
2026.01.14 16:27

Citigroup's report notes that NVIDIA's newly launched AI Inference Context Memory Storage (ICMS) architecture is expected to significantly worsen the global NAND flash supply shortage. Each server is said to require an additional 1,152 TB of SSD capacity, translating into incremental demand equal to roughly 2.8% of total global NAND demand in 2026 and 9.3% in 2027. Beyond pushing NAND prices higher, the shift offers clear structural growth opportunities for leading memory chip makers such as Samsung, SK Hynix, and Micron.
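
As a rough back-of-the-envelope illustration of how a per-server SSD attach figure maps to a share of global NAND demand, the sketch below multiplies hypothetical server shipment counts by the reported 1,152 TB per server and divides by an assumed total NAND demand. The shipment counts and total-demand figures are placeholders chosen only to show the arithmetic, not numbers from the Citigroup report.

```python
# Back-of-the-envelope: incremental NAND demand from ICMS-equipped servers.
# The 1,152 TB-per-server figure comes from the article; server shipments and
# total global NAND demand below are HYPOTHETICAL placeholders, not Citi's estimates.

SSD_PER_SERVER_TB = 1152  # additional SSD capacity per ICMS server (from report)
TB_PER_EB = 1_000_000     # 1 exabyte = 1,000,000 terabytes (decimal units)

# Hypothetical inputs (illustrative only)
scenarios = {
    2026: {"icms_servers": 25_000, "global_nand_demand_eb": 1_000},
    2027: {"icms_servers": 90_000, "global_nand_demand_eb": 1_100},
}

for year, s in scenarios.items():
    extra_eb = s["icms_servers"] * SSD_PER_SERVER_TB / TB_PER_EB
    share = extra_eb / s["global_nand_demand_eb"]
    print(f"{year}: +{extra_eb:.1f} EB incremental NAND "
          f"({share:.1%} of an assumed {s['global_nand_demand_eb']} EB total demand)")
```

Under these placeholder inputs the incremental demand lands near the 2.8% and 9.3% shares cited above; the actual percentages depend on Citigroup's own server-shipment and NAND-demand assumptions.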