HDDs are roughly 100x slower than SSDs and more susceptible to vibration, so why are they still the go-to for cold data in AI data centers? The short answer is capacity per dollar: the cost advantage is overwhelming.
What sits in AI cold storage? Massive raw video, image, and text corpora for model training, plus logs and user-interaction records from inference: large datasets with low access frequency and little need for speed, where the priorities are low cost and high density.
HDDs fit that brief: per-unit storage cost is only 1/4 to 1/5 that of SSDs, and the capex to deliver 1 EB is about 1/50 that of SSDs, yielding huge savings at scale.
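To make the scale of that gap concrete, here is a minimal back-of-the-envelope sketch in Python. The $/TB prices and the 1 EB target are illustrative assumptions, not figures from this article or any vendor; the sketch covers raw drive spend only, so it reflects the 1/4–1/5 unit-cost ratio rather than the full-system capex comparison.

```python
# Back-of-the-envelope comparison of the drive spend needed to store 1 EB of
# cold data on HDDs vs. SSDs. The $/TB figures are assumptions for illustration;
# substitute current street prices to update the result.

HDD_PRICE_PER_TB = 15.0   # assumed $/TB for high-capacity nearline HDDs
SSD_PRICE_PER_TB = 65.0   # assumed $/TB for enterprise SSDs

TARGET_CAPACITY_TB = 1_000_000  # 1 EB expressed in TB

def drive_capex(price_per_tb: float, capacity_tb: float) -> float:
    """Raw drive spend required to provision the target capacity."""
    return price_per_tb * capacity_tb

hdd_cost = drive_capex(HDD_PRICE_PER_TB, TARGET_CAPACITY_TB)
ssd_cost = drive_capex(SSD_PRICE_PER_TB, TARGET_CAPACITY_TB)

print(f"HDD drive capex for 1 EB: ${hdd_cost / 1e6:,.0f}M")
print(f"SSD drive capex for 1 EB: ${ssd_cost / 1e6:,.0f}M")
print(f"HDD cost as a fraction of SSD cost: {hdd_cost / ssd_cost:.2f}")
```

The ratio, not the absolute dollar amounts, is the point: at exabyte scale, even a modest per-terabyte gap compounds into a difference of tens of millions of dollars in drive spend alone.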
SSDs offer higher throughput and lower latency, but at current prices, using them for vast cold datasets would be prohibitively expensive. HDDs do have longer lead times (around one year), but cold data is stored for the long term, and forward planning mitigates delivery risk.
So despite the speed gap, as long as the cost edge holds, AI data centers won’t drop HDDs. In the near term, HDD remains the default choice for cold storage.