Samsung Starts Mass Producing World's Thinnest 12nm LPDDR5X To Fuel Mobile AI Surge

Samsung LPDDR5X DRAM on a fingernail.
As time goes on, running artificial intelligence (AI) workloads directly on devices (as opposed to offloading to the cloud) will become more commonplace. To help push the industry towards that goal, Samsung announced it has begun mass producing what it claims is the industry's thinnest 12-nanometer-class 12 gigabyte (GB) and 16GB LPDDR5X DRAM packages. The announcement comes just three months after Samsung laid claim to the world's fastest LPDDR5X.

"Samsung’s LPDDR5X DRAM sets a new standard for high-performance on-device AI solutions, offering not only superior LPDDR performance but also advanced thermal management in an ultra-compact package," said YongCheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics. "We are committed to continuous innovation through close collaboration with our customers, delivering solutions that meet the future needs of the low-power DRAM market."

Samsung LPDDR5X DRAM airflow infographic.

According to Samsung, it was able to tap into its "extensive expertise in chip packaging" and "optimized back-lapping process" to reduce the thickness of its LPDDR5X DRAM packages by around 9% using a 4-stack structure. Samsung also says that its ultra-slim DRAM, which is as thin as a fingernail at just 0.65 millimeters, offers a sizable 21% improvement in heat resistance compared to the previous generation.

Samsung LPDDR5X DRAM package laying flat.

Razor-thin DRAM makes for some cool imagery for sure, which Samsung is proud as a peacock to flaunt, but it's not just about looks or bragging rights. According to Samsung, the size reduction frees up space inside mobile devices to facilitate more airflow, which in turn makes it easier for device makers to manage thermals.

Side view of a Samsung LPDDR5X DRAM package on a ruler.

Part of the reason this matters is that achieving and maintaining higher levels of performance will play a big role in enabling power-hungry applications and features, including those served by on-device AI. We're already seeing a push in that direction with the current crop of Copilot+ laptops and more advanced smartphones.

Samsung says that as demand for smaller mobile DRAM packages grows, it plans to develop both 6-layer 24GB and 8-layer 32GB modules to cram into increasingly thin devices.
Tags: Samsung, memory, DRAM, AI, LPDDR5X