Samsung’s Breakthrough In Human Brain-Like MRAM Tech Claims To Excel At AI Workloads

In the computing field, it can be extremely difficult to differentiate buzzword bunkum from meaningful messages. That's especially true when we're talking about emerging technologies and the products based on them. It seems like every other week there's a big announcement about an upcoming bit of tech that will "revolutionize [x] industry" or "change our lives." It's easy to dismiss statements like that, given how rarely the promised benefits actually materialize, but sometimes the tech in question has real promise.

Along similar lines, Samsung's latest report on new developments for its MRAM technology could be a big deal at some point in the future. The gist of the announcement is that Samsung has found a way to make in-memory computing practical using MRAM. In-memory computing is a bit of a grand idea, but the core of it is simply doing processing within the memory itself, rather than having the memory and processor be discrete elements. There are a lot of ways to go about this, but historically MRAM hasn't been suited for it due to its very low resistance compared to competing resistance-based memories like RRAM and PRAM.
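To make that idea concrete, here's a minimal sketch of the classic "current-sum" flavor of analog in-memory computing. This is a generic illustration, not Samsung's design, and every name and value in it is made up: each memory cell stores a weight as its conductance, input voltages are applied to the rows, and physics (Ohm's law plus Kirchhoff's current law) performs the multiply-accumulate on the column wires for free.

```python
# Illustrative model of "current-sum" analog in-memory computing.
# Hypothetical names and numbers, for illustration only.

def current_sum_crossbar(voltages, conductances):
    """Return per-column output currents of a crossbar array.

    voltages     -- input voltage per row (the activation vector)
    conductances -- conductances[row][col], the stored weight matrix
    """
    n_rows = len(voltages)
    n_cols = len(conductances[0])
    currents = [0.0] * n_cols
    for r in range(n_rows):
        for c in range(n_cols):
            # Ohm's law per cell (I = V * G); Kirchhoff's current law
            # sums the currents on each shared column wire.
            currents[c] += voltages[r] * conductances[r][c]
    return currents

# A 3x2 weight matrix stored as conductances (arbitrary units):
G = [[0.1, 0.2],
     [0.3, 0.1],
     [0.2, 0.4]]
V = [1.0, 0.5, 2.0]  # input voltages

# The column currents equal the matrix-vector product V @ G,
# computed "in memory" rather than in a separate processor.
print(current_sum_crossbar(V, G))
```

The catch for MRAM is visible in the `I = V * G` step: when every cell's resistance is very low (conductance very high), those summed currents get large and power-hungry, which is the problem the article describes.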

What is MRAM? Well, it isn't new. Magneto-resistive Random Access Memory (MRAM) has been around for more than a decade, and Samsung's been manufacturing it in volume for nearly three years now. Without getting into the weedy details, MRAM offers the performance of DRAM with the non-volatility of flash memory. It also uses much less power than DRAM. (If that sounds familiar, you probably read this article yesterday.)

A board bearing numerous Intel Loihi chips. Loihi is a neuromorphic chip.

The biggest downside of MRAM is that it isn't very dense, which means that having a lot of capacity requires a lot of space, relatively speaking. That means it isn't appropriate for most consumer devices, which need memory on the order of multiple gigabytes to be useful. Instead, Samsung's mostly been exploring uses for MRAM in the IoT and AI markets. The latter of those two is the main target of today's announcement, because in-memory computing is essentially how the human brain works: our neurons store and process information in the same place. Emulating that function with so-called "neuromorphic" computers, like Intel's Loihi, has led to some pretty staggering breakthroughs in AI research.

Neuromorphic computers don't need large amounts of memory because of the way they work. Instead, each core in the network has a couple hundred kilobytes of RAM at most, holding the synaptic weights for its neurons. Those cores are also powered down while they aren't doing anything. That isn't a problem for non-volatile RAM like MRAM, which would make it ideally suited for this kind of computing if not for the aforementioned resistance issue. That's why Samsung is crowing about having resolved it.

Dr. Donhee Ham, Dr. Seungchul Jung, and Dr. Sang Joon Kim, the fellows responsible for the breakthrough.

Rather than revising the hardware to raise the resistance of MRAM, Samsung's clever researchers did an end-around and revised the typical way of doing in-memory computing. Instead of working from the usual "current-sum" values, where MRAM's low resistance would mean large, power-hungry read currents, Samsung's architecture works from a "resistance sum." In doing so, Samsung was able to demonstrate effective and efficient in-memory computing using MRAM. The company says it did so by running an image-classification AI on MRAM circuits, achieving an accuracy of 98% on hand-written digits and 93% on faces.
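Samsung hasn't published circuit schematics in this announcement, but the intuition for why summing resistances helps can be shown with a back-of-the-envelope model. Every number below is hypothetical; the point is only the scaling: cells in parallel add currents, while cells in series add resistances.

```python
# Toy comparison (hypothetical numbers) of why low-resistance memory
# cells are a problem for "current-sum" in-memory computing but not
# for a "resistance-sum" arrangement. This illustrates the general
# circuit intuition, not Samsung's actual published design.

V = 0.1        # read voltage in volts (hypothetical)
R_CELL = 50.0  # a low per-cell resistance in ohms, MRAM-like (hypothetical)
N = 64         # cells contributing to one accumulated value

# Current-sum: cells sit in parallel on a shared output wire, so each
# draws V / R and the currents add. Low R means a large total current
# and large static power.
current_sum_i = N * (V / R_CELL)
current_sum_power = V * current_sum_i

# Resistance-sum: cells are chained in series, so resistances add
# (R_total = N * R) and only one small current flows through the chain.
resistance_sum_i = V / (N * R_CELL)
resistance_sum_power = V * resistance_sum_i

print(f"current-sum:    {current_sum_i * 1000:.2f} mA, "
      f"{current_sum_power * 1000:.3f} mW")
print(f"resistance-sum: {resistance_sum_i * 1e6:.2f} uA, "
      f"{resistance_sum_power * 1e6:.3f} uW")
```

In this toy model the series arrangement draws a factor of N squared less power for the same read voltage, which is why a low per-cell resistance stops being a liability.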

Not that the accuracy numbers are really meaningful, mind you; the news here is simply that it worked at all. Samsung didn't make any statements about new products in the announcement, but it did say it expects its MRAM to find its way into "next-generation low-power AI chip technologies," and it specifically mentioned neuromorphic computing as a possible use case.