SK Hynix has announced an enhanced version of its HBM3 memory that increases the data transfer rate by 25%, delivering a sizeable performance boost for applications that use this premium type of DRAM. Companies developing solutions for artificial intelligence (AI) and high-performance computing (HPC) will welcome HBM3E when it becomes available in 2024.
SK Hynix’s HBM3E memory will increase the data transfer rate from today’s 6.40 GT/s to 8.0 GT/s, which will boost per-stack bandwidth from 819.2 GB/s to 1 TB/s. Compatibility of HBM3E with existing HBM3 controllers and interfaces remains unclear, as SK Hynix has yet to reveal details about this aspect of the technology.
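The per-stack bandwidth figures follow directly from the arithmetic of HBM's wide interface: each stack uses a 1024-bit data bus, so bandwidth is simply the per-pin transfer rate multiplied by the bus width. A minimal sketch of that calculation, reproducing the article's numbers (the function name and constant are illustrative, not from any vendor API):

```python
# Per-stack HBM bandwidth = transfer rate (GT/s) x interface width (bits) / 8 bits per byte.
HBM_INTERFACE_BITS = 1024  # bits per stack, standard across HBM generations through HBM3

def per_stack_bandwidth_gbs(transfer_rate_gts: float) -> float:
    """Return per-stack bandwidth in GB/s for a given per-pin transfer rate in GT/s."""
    return transfer_rate_gts * HBM_INTERFACE_BITS / 8

print(per_stack_bandwidth_gbs(6.4))  # HBM3:  819.2 GB/s
print(per_stack_bandwidth_gbs(8.0))  # HBM3E: 1024.0 GB/s, i.e. ~1 TB/s
```

The 25% jump in transfer rate (6.4 GT/s to 8.0 GT/s) maps one-to-one onto the 25% bandwidth increase, since the interface width is unchanged.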
SK Hynix plans to begin sampling its HBM3E memory in the second half of 2023 and start mass production in 2024.
The company will use its 1b fabrication process (its fifth-generation 10nm-class DRAM node) to produce HBM3E memory. This production node is currently used to make DDR5-6400 DRAMs that are validated for Intel's next-generation Xeon Scalable platform, and it will eventually also be used to produce LPDDR5T memory chips for performance-demanding low-power applications.
“Amid growing expectations that the memory market will start to recover from the second half, we believe our industry-leading DRAM technology, proven again through mass production of the 1bnm process this time, will help us improve earnings from the second half,” said Jonghwan Kim, head of DRAM development at SK Hynix.
Assuming that HBM3E development and mass production progress as expected, SK Hynix will have a list of buyers eager to procure the fast memory for their next-generation AI and HPC solutions. Already the biggest HBM supplier according to TrendForce, the company will likely strengthen its position further if it becomes the first to ship premium HBM3E memory.