According to Wccftech, NVIDIA and SK hynix are co-developing a new “AI SSD” solution internally called “Storage Next.” The goal is a storage device optimized specifically for AI inference workloads, with a target of 100 million IOPS, far beyond current enterprise SSDs. SK hynix reportedly plans to present a prototype by the end of 2026 and aims to ship a full solution by 2027. The project targets a core problem: AI models need continuous, low-latency access to massive parameter sets that can’t fit in expensive HBM or DRAM. The move comes as NVIDIA is integrating GDDR7 memory into its next-gen Rubin GPUs for similar latency reasons. The report warns this innovation could put immense pressure on NAND flash supply chains, potentially creating a shortage similar to the current crisis in the DRAM market.
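To put that 100-million-IOPS target in perspective, here’s a quick back-of-envelope calculation. The 4 KiB I/O size and the ~3M-IOPS figure for today’s fastest enterprise drives are my own rough assumptions for illustration, not numbers from the report:

```python
IO_SIZE_BYTES = 4096           # assumed 4 KiB I/O size, a common benchmark unit
TARGET_IOPS = 100_000_000      # the reported "Storage Next" target
TODAY_ENTERPRISE_IOPS = 3_000_000  # rough ballpark for current top enterprise SSDs

# Implied sustained bandwidth if every I/O is 4 KiB
bandwidth_gb_s = TARGET_IOPS * IO_SIZE_BYTES / 1e9
speedup = TARGET_IOPS / TODAY_ENTERPRISE_IOPS

print(f"Implied bandwidth: ~{bandwidth_gb_s:.0f} GB/s")
print(f"Roughly {speedup:.0f}x today's fastest enterprise SSDs")
```

Even under these loose assumptions, the implied bandwidth lands around 400 GB/s, which is why this is pitched as a new memory tier rather than just a faster drive.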
NAND is the new DRAM
Here’s the thing: we’ve seen this movie before. The AI industry identifies a bottleneck, pours billions into a new, specialized hardware solution to fix it, and in doing so, completely consumes the global supply of a critical component. It happened with HBM. It’s happening right now with high-performance DRAM. And now, all signs point to NAND flash being next on the menu.
This isn’t just about making a faster SSD. It’s about creating a new tier in the memory hierarchy—a “pseudo-memory layer,” as the report calls it. Think of it as a super-fast cache for gigantic AI models. If this takes off, the demand won’t be for a few thousand units. It’ll be for millions, deployed by every cloud provider and AI company on the planet to speed up their services. That’s a demand shock the NAND market, which is already tight from cloud and AI storage needs, is not prepared for.
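To make the “new tier” idea concrete, here’s a sketch of where such a device would sit in the hierarchy. The latency and cost figures are illustrative order-of-magnitude guesses, not vendor specs, and the AI SSD row is pure speculation:

```python
# Illustrative memory/storage hierarchy, slowest-to-access at the bottom.
# All numbers are rough assumptions for illustration only.
tiers = [
    # (tier,                    approx. access latency, relative $/GB)
    ("HBM (on-package)",        "~100 ns",              "very high"),
    ("DRAM (DDR5)",             "~100 ns",              "high"),
    ("AI SSD (pseudo-memory)",  "~1-10 us (guess)",     "medium"),
    ("NVMe SSD (today)",        "~10-100 us",           "low"),
]

for name, latency, cost in tiers:
    print(f"{name:<26} {latency:<20} {cost}")
```

The pitch, in other words, is a layer cheap enough to hold whole model parameter sets but fast enough that the GPU isn’t left waiting on a conventional SSD.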
Winners, losers, and pricing pain
So who wins? SK hynix, obviously, if they can pull this off and own a new, high-margin segment. NVIDIA wins by selling the complete AI stack—GPU, networking, and now, optimized storage. Companies that need to run massive inference workloads at scale might win on performance, but they’ll definitely lose on cost.
And the losers? Basically everyone else. Consumer SSD prices, which have been pleasantly stable, could shoot back up as fabs prioritize this lucrative new AI-grade product. Other NAND makers like Micron, Kioxia, and Samsung will be forced to play catch-up, scrambling to develop their own AI SSD architectures. It also puts more pressure on the entire hardware ecosystem. When you’re building a complex AI server rack, every component matters, and a specialized SSD like this could become a non-negotiable, must-have item.
A supply chain on the brink
The big picture is kind of alarming. The AI boom isn’t just creating new software; it’s forcibly rewiring the entire global hardware supply chain at a breakneck pace. There’s no time for a gradual ramp-up. It’s a series of sudden, massive demand spikes that suppliers can’t possibly anticipate or react to smoothly.
We’re told this AI SSD is for 2027. But that’s basically tomorrow in semiconductor planning time. If this tech is as transformative as it sounds, the bidding wars and capacity allocation fights are probably starting… right now. Buckle up. The memory market is in for another wild ride.
