Intel's DDR4-friendly Xeon workhorses bolt for workstations, servers

New E5 v3 chips available in 32 different flavors for Dell, IBM, Cray et al


Intel's latest-generation Xeon E5 v3 processors first showed up in systems from the likes of Dell last month, but Chipzilla made them generally available on Monday – with all of 32 different parts heading for OEMs and the channel.

The new Xeon E5-2600 v3 and E5-1600 v3 chips are all based on Intel's x86-64 Haswell microarchitecture, and fabbed using a 22nm process.

"The processors will be used in servers, workstations, storage and networking infrastructure to power a broad set of workloads such as data analytics, high-performance computing, telecommunications and cloud-based services, as well as back-end processing for the Internet of Things," Intel said in a canned statement, leaving no stone unturned.

The Xeon E5-1600 v3 series comes in six different configurations and mainly targets the workstation market, while the E5-2600 v3 series takes aim at a wider range of applications and will ship in a total of 26 different versions.

Live and let die ... Xeon E5-2600 v3 silicon

Where the previous-generation Xeon E5 v2 series maxed out at 12 cores per chip, the Xeon E5-2600 v3 series ups the limit to 18, with as much as 45MB of cache per socket.

The new Xeons are also the first to support DDR4 RAM, which offers memory bus speeds of up to 2,133MHz and which Intel says can increase performance for memory bandwidth–constrained workloads by up to 1.4 times versus the previous generation of CPUs.
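As a back-of-the-envelope check on that DDR4 figure — assuming the standard 64-bit (8-byte) channel width and the four memory channels per socket on these chips (assumptions for illustration, not figures from Intel's statement) — peak theoretical bandwidth works out like this:

```python
# Rough peak-bandwidth arithmetic for DDR4-2133, assuming a 64-bit
# channel and four channels per socket (illustrative assumptions).
transfers_per_sec = 2_133_000_000   # DDR4-2133: 2,133 mega-transfers/s
bytes_per_transfer = 8              # 64-bit channel width
channels = 4

per_channel = transfers_per_sec * bytes_per_transfer   # bytes/s per channel
per_socket = per_channel * channels                    # bytes/s per socket

print(f"{per_channel / 1e9:.2f} GB/s per channel")     # ~17 GB/s
print(f"{per_socket / 1e9:.2f} GB/s per socket")       # ~68 GB/s
```

Real-world throughput will land well below those theoretical peaks, but the headroom over DDR3-1866 is where Intel's "up to 1.4 times" claim for bandwidth-bound workloads comes from.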

In addition, Intel's Advanced Vector Extensions 2 (AVX2) improve integer and floating-point computation performance by as much as 1.9 times by widening vector operations to 256 bits, doubling the data processed per instruction, while the Intel Advanced Encryption Standard New Instructions (AES-NI) increase the performance of data encryption and decryption by as much as 2 times.
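The arithmetic behind that AVX2 doubling is simple lane-counting — a 256-bit vector holds twice as many elements as a 128-bit SSE vector, so each instruction does twice the work. A quick illustrative sketch (not Intel's benchmark methodology):

```python
# Why 256-bit integer vectors roughly double per-instruction throughput
# over 128-bit SSE: twice as many lanes processed at once.
def lanes(vector_bits: int, element_bits: int) -> int:
    """Number of elements handled by one vector instruction."""
    return vector_bits // element_bits

sse_lanes = lanes(128, 32)    # 4 x 32-bit integers per instruction
avx2_lanes = lanes(256, 32)   # 8 x 32-bit integers per instruction

print(avx2_lanes / sse_lanes)  # 2.0 - the doubling Intel cites
```

The measured "up to 1.9 times" falls just short of that 2x ceiling, as memory access and non-vector code eat into the gain.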

A wide range of vendors have committed to building systems around the new Xeons beginning on Monday, including Bull, Cray, Cisco, Dell, Fujitsu, Hitachi, HP, Huawei, IBM, Inspur, Lenovo, NEC, Oracle, Quanta, Radisys, SGI, Sugon, and Supermicro, among others.

Just what you'll have to pay for boxes built with the new chips, however, will vary widely. Intel says the Xeon E5-1600 v3 series will range in price from $295 to $1,723, while the Xeon E5-2600 v3 series will start at $213 and go all the way up to $2,702 per chip. ®


