AMD Unveils EPYC 7000 Series Processors And Platform To Take On Intel In the Data Center

Introducing The AMD EPYC 7000 Series

AMD EPYC Server Chip Logo

Today marks the launch of AMD's EPYC family of processors for data center servers. Now that the chips are official, it's abundantly clear that AMD targeted the lucrative data center market first and foremost with its new Zen microarchitecture and the highly scalable Naples platform that leverages it. Of course, Zen scales well for client/consumer desktop applications, as we've seen with AMD's successful Ryzen processor launch. However, the data center is near and dear to AMD's heart, thanks to significantly higher chip pricing and better profit margins, not to mention the explosion of the cloud, from software-as-a-service platforms like Amazon AWS to AI and big data analytics.

Today, AMD is giving us a detailed picture of how its Zen-based EPYC processor lineup will flesh out, along with some of its key architectural advantages. It's easy to see now that Zen was built from the ground up with data center-class scalability in mind, across its entire architecture. We've got plenty of information to disclose on the pages ahead, and we also have video of EPYC-based servers in action that demonstrates the performance of the platform.

Power Optimized And Secure

AMD has three tenets in mind for EPYC-based server platforms: "Power, Optimize and Secure." In other words: balanced top-end performance with abundant core resources, memory bandwidth and I/O connectivity; flexible configurability of the platform for targeted workloads; and securing the platform at the silicon level to minimize threat vectors wherever possible. Those are AMD's guiding principles for EPYC, and it looks like EPYC has the building blocks to execute on those goals.

AMD EPYC 7000 Server
AMD EPYC 7000 Series Server With All DIMM Sockets Populated

Above is a fully configured 2P AMD EPYC server with all of its DIMM sockets populated across eight memory channels per CPU - 64 physical cores, 128 threads, up to 2 terabytes of memory and 128 PCI Express lanes with direct CPU root access. It's a beast to be sure, but before we get too far ahead of ourselves, let's take a look at the entire current AMD EPYC processor family...
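For a quick back-of-the-envelope check, those dual-socket totals follow directly from the per-socket specs. Here's a minimal sketch (variable names are ours, for illustration):

```python
# Per-socket specs of the flagship 32-core EPYC part
cores_per_socket = 32
threads_per_core = 2           # simultaneous multithreading (SMT)
pcie_lanes_per_socket = 128
sockets = 2

total_cores = sockets * cores_per_socket        # 2 x 32 = 64 physical cores
total_threads = total_cores * threads_per_core  # 64 x 2 = 128 threads

# In a 2P configuration, half of each socket's 128 lanes are repurposed as
# the socket-to-socket interconnect, so the platform still exposes 128 PCIe
# lanes in total rather than 256.
usable_pcie_lanes = sockets * (pcie_lanes_per_socket // 2)

print(total_cores, total_threads, usable_pcie_lanes)  # 64 128 128
```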


The processors in AMD's EPYC 7000 series share a common set of features and attributes across the line-up: 8-channel DDR4 memory per CPU socket with up to 2TB of memory total, 128 available PCI Express lanes for specialized co-processor expansion (GPUs, storage HBAs, etc.), dedicated integrated secondary processor cores for security functions, and socket compatibility extending beyond just this family of EPYC CPUs.

Speaking of which, here's what the current family looks like... 


AMD EPYC Single Socket

32-, 24-, 16- and 8-core EPYC CPUs comprise AMD's new server CPU stack for now, but all sport 128 lanes of PCIe connectivity and that same 8-channel DDR4 controller, with officially supported memory speeds of up to 2666MHz. Base frequencies of the chips clock in around the 2GHz mark, with boost frequencies topping out at 3.2GHz. The flagship is the AMD EPYC 7601 with 32 cores and 64 threads, and the lineup extends down to the 8-core EPYC 7251, which retains the same memory bandwidth and PCIe connectivity as the 32-core beasts.
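Those eight DDR4-2666 channels add up to substantial theoretical memory bandwidth per socket. As a rough sketch (peak arithmetic only, not measured throughput):

```python
# Theoretical peak DDR4 bandwidth per EPYC socket
channels = 8
transfers_per_sec = 2666e6  # DDR4-2666: 2666 million transfers/sec
bytes_per_transfer = 8      # each channel is 64 bits wide

peak_bw_gb_s = channels * transfers_per_sec * bytes_per_transfer / 1e9
print(round(peak_bw_gb_s, 1))  # ~170.6 GB/s per socket, double that for 2P
```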

The key common attributes of 8-channel DDR4 and 128 lanes of available PCIe expansion alone speak to the platform advantage AMD is pitching versus Intel's current Xeon platforms, and even its upcoming Xeon Scalable processor family. Though we haven't seen official, confirmed details of Intel's new Xeon Scalable offering, recent leaks indicate that AMD could have a core count advantage. In addition, if Intel's PCIe connectivity and memory channel support are similar to its high-end Skylake-X platform, EPYC could have a significant advantage in both PCIe expansion and memory bandwidth as well.

These advantages could play especially well for AMD in single-socket server configurations, a segment of the market that AMD is quick to point out represents a full 25% of the world's deployed data center server population.

Let's drill down beyond the speeds and feeds with EPYC for a quick architecture refresher, along with some key new platform feature details...
