SK Hynix Unveils First-Ever 12-Layer HBM3 Chips To Feed Memory-Hungry GPUs
This new 12-layer design represents one of the highest capacities expected from the HBM3 memory standard. When JEDEC published the HBM3 specification a year ago, it indicated that DRAM stacking could eventually reach 16 layers, with capacities of up to 32GB.
SK hynix achieved the new 12-layer design by improving the process efficiency and performance stability of its HBM3 with Advanced Mass Reflow Molded Underfill (MR-MUF) technology, while using Through Silicon Via (TSV) technology to reduce the thickness of each DRAM die by 40%, keeping the stack at the same height as the previous 16GB products. In essence, the new HBM modules retain a similar form factor to their predecessors but pack 50% more memory capacity, at 24GB per stack.
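As a back-of-the-envelope sketch of that capacity math (assuming each DRAM layer is a 16Gb, i.e. 2GB, die, which is implied by the 16GB and 24GB figures but not spelled out by SK hynix), the jump from 8 to 12 layers accounts for the added capacity:

# Rough HBM stack capacity math; the 16Gb-per-die figure is an assumption.
GBIT_PER_DIE = 16  # assumed density of each DRAM layer (16Gb = 2GB)

def stack_capacity_gb(layers: int) -> int:
    """Total stack capacity in GB: layers * per-die density in Gb / 8 bits per byte."""
    return layers * GBIT_PER_DIE // 8

print(stack_capacity_gb(8))   # 16 (GB) -- previous 8-layer HBM3 stacks
print(stack_capacity_gb(12))  # 24 (GB) -- new 12-layer stacks, 50% more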
This new tech will be especially beneficial in the datacenter world, where enterprise GPUs need vast amounts of memory to run extremely demanding computational workloads such as simulation and machine learning. SK hynix says demand for its memory products is growing, driven by the boom in AI-powered chatbots.
SK hynix says its new HBM3 modules should come to market starting in the second half of 2023.