Posting in an EPYC thread: AMD renames Xeon-bashing Naples
Data centre CPU boasts more cores, IO, memory bandwidth than Intel's
AMD has renamed its Zen-based Naples processor EPYC, pitching the chip as a data centre server CPU and hoping to make inroads into both the dual-socket and single-socket server markets.
We wrote about Naples in March, when we said it featured:
- A scalable, 32-core System on Chip (SoC) design, with two threads per core
- Up to 16 DDR4 DIMMs on 8 memory channels per socket, for up to 2TB of memory capacity
- Support for up to 32 DIMMs of DDR4 on 16 memory channels, delivering up to 4TB of total memory capacity in a 2-socket server
- Complete SoC with fully integrated IO supporting 128 lanes of PCIe 3
- Cache structure for high-performance, energy-efficient compute
- Infinity Fabric coherent interconnect for Naples CPUs in a 2-socket system
- Dedicated security hardware
It still does, underneath the EPYC brand, but we now know more about its performance.
At the 2017 AMD Financial Analyst Day, a single EPYC processor was shown beating a Xeon E5-2699A v4 with 45 per cent more cores1, 60 per cent more IO capacity2, and 122 per cent more memory bandwidth3.
We don't know how this translates into application speed improvements.
Well, tough diddly, AMD, as Intel has its Xeon SP coming and so these comparisons will have to be rerun against the Xeon SP Bronze and Silver CPUs. The results should be most interesting to read.
We now know the high-end Platinum Xeon SP runs SAP HANA 1.6x faster than a Xeon E7-8890 v4. If – big if – the other Xeon SPs have similar performance boosts over other Xeon E7s and E5s then AMD might have a marketing problem on its hands. AMD must be hoping that they don't.
IDC SVP Matthew Eastwood gave out a neat canned quote: "Today's single-socket server offerings push buyers toward purchasing a more expensive two-socket server just to get the memory bandwidth and IO they need to support the compute performance of the cores... EPYC [offers] a single-processor solution that delivers the right-sized number of high-performance cores, memory, and IO for today's workloads."
The first EPYC-based servers will launch in June, with AMD promising widespread support from OEMs and channel partners. Dropbox is evaluating using EPYC CPUs. ®
1 EPYC processor includes up to 32 CPU cores versus the Xeon E5-2699A v4 processor with 22 CPU cores.
2 EPYC processor offers up to 64 PCI Express high-speed IO lanes per socket, versus the Xeon E5-2699A v4 processor at 40 lanes per socket.
3 EPYC processor supports up to 21.3GB/s per channel with DDR4-2667 across 8 channels (170.7GB/s total), versus the Xeon E5-2699A v4 processor at 19.2GB/s per channel with DDR4-2400 across 4 channels (76.8GB/s total).
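For the curious, AMD's three percentage claims can be reproduced directly from the footnoted specs. A quick sketch (the figures are the article's; the per-channel bandwidth comes from the standard DDR4 formula of transfer rate multiplied by 8 bytes per transfer):

```python
# Reproduce AMD's comparison figures from the footnoted specs.

def pct_more(a, b):
    """How much more a is than b, as a rounded percentage."""
    return round((a - b) / b * 100)

# Cores: EPYC's 32 versus the Xeon E5-2699A v4's 22
print(pct_more(32, 22))            # -> 45 (per cent more cores)

# IO: 64 PCIe lanes per socket versus 40
print(pct_more(64, 40))            # -> 60 (per cent more IO)

# Memory bandwidth: DDR4 channel bandwidth = MT/s x 8 bytes per transfer
epyc_bw = 2667 * 8 / 1000 * 8      # ~21.3 GB/s per channel x 8 channels
xeon_bw = 2400 * 8 / 1000 * 4      # 19.2 GB/s per channel x 4 channels
print(pct_more(epyc_bw, xeon_bw))  # -> 122 (per cent more bandwidth)
```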