Microsoft 'Catapults' geriatric Moore's Law from CERTAIN DEATH

FPGAs DOUBLE data center throughput despite puny power pump-up, we're told


Microsoft has found a way to massively increase the compute capabilities of its data centers, despite the fact that Moore's Law is wheezing towards its inevitable demise.

In a paper to be presented this week at the International Symposium on Computer Architecture (ISCA), titled A Reconfigurable Fabric for Accelerating Large-Scale Datacenter Services, a troupe of top Microsoft Research boffins explain how the company has dealt with the slowdown in single-core clock-rate improvements that has occurred over the past decade.

To get around this debilitating problem – more on this later – Microsoft has built a system it calls Catapult, which automatically offloads some of the advanced tech that powers its Bing search engine onto clusters of highly efficient, low-power FPGA chips attached to typical Intel Xeon server processors.

Think of FPGAs – field-programmable gate arrays – as chips whose circuits can be customised and tweaked as required, allowing crucial tasks to be transferred away from the Xeons and instead accelerated in FPGA hardware.
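To make the division of labour concrete, here's a minimal sketch of how host-side offload to such a board might look. To be clear, this is our own illustration: the class and function names below are invented for the purpose, not Microsoft's actual Catapult interface.

    # A purely illustrative Python sketch of host-side FPGA offload.
    # All names here are ours, not Microsoft's Catapult API.

    class FPGAAccelerator:
        """Stands in for an FPGA board attached to a Xeon host."""

        def load_bitstream(self, path):
            # Reconfigure the gate array with a new circuit, e.g. a
            # document-scoring pipeline for the Bing ranker.
            print(f"reconfiguring FPGA from {path}")

        def score(self, query, documents):
            # In reality the data is shipped to the board and scored
            # in hardware; here we fake the result on the CPU.
            terms = set(query.split())
            return [(doc, len(terms & set(doc.split()))) for doc in documents]

    def rank(query, documents, fpga=None):
        # Take the hardware fast path when a board is attached.
        if fpga is not None:
            scores = fpga.score(query, documents)
        else:
            scores = [(doc, 0) for doc in documents]  # CPU fallback, stubbed
        return sorted(scores, key=lambda pair: pair[1], reverse=True)

The point is architectural rather than syntactic: the CPU keeps the control-heavy work, while the hot, regular, data-parallel inner loop is handed to circuitry wired specifically for it.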

This approach may save Microsoft from a rarely acknowledged problem that lurks in the technology industry: processors are not getting much faster.

Wait. What?

For those not familiar with the chip industry, a primer. For the past 50 years, almost every aspect of our global economy has been affected by Moore's Law, which states that the number of transistors on a chip of the same size will double every 18 months – or so – resulting in faster performance and better power efficiency.
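That doubling cadence compounds fast. A back-of-the-envelope sketch, assuming an idealised 18-month doubling period:

    # Idealised Moore's Law: transistor counts double every 18 months
    def transistors(initial_count, years, doubling_period=1.5):
        return initial_count * 2 ** (years / doubling_period)

    # A chip starting at 1 million transistors, ten years on:
    print(f"{transistors(1_000_000, 10):,.0f}")  # roughly 100 million

Ten years of faithful doubling buys you two orders of magnitude, which is why the law's decline is such a big deal.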

One slight problem: Moore's Law is not, in fact, a law. Instead, it was an assertion by Intel co-founder Gordon Moore in a 1965 article that the semiconductor industry got rather carried away with. In the past ten years, the salubrious effects of Moore's Law have started to wane, because although companies are packing more and more transistors onto their chips, the performance gains that those transistors bring with them are not as great as they were during the law's halcyon days.

Intel has yoked its entire business to the successful fulfillment of Moore's Law, and proudly announces each new boost in transistor counts. And, yes, those "new" transistors can help to increase a compute core's all-important instructions-per-cycle (IPC) metric – improved branch prediction, larger caches, more-efficient scheduling, beefier buffers, whatever – but the simple fact is that although chips have gone multi-core and are getting better at multi-tasking, no significant new discovery is making those individual cores much faster.

As AMD CTO Joe Macri recently told us, "There's not a whole lot of revolution left in CPUs." He did, however, note that "there's a lot of evolution left."

Microsoft's Catapult is a bit of both.

Programmable software, meet programmable hardware

Under new chief executive Satya Nadella, Microsoft is throwing billions of dollars at massive data centers in its attempt to become a cloud-first company. Part of that effort – and that investment – is figuring out how to deliver consistent boosts in data-center compute performance.

The solution that Microsoft Research has come up with is to pair field-programmable gate arrays with typical x86 processors, then let some data-center services such as the Bing search engine offload certain well-understood operations to the arrays.

To say that the performance improvements in this approach have been noticeable would be a gross understatement. Microsoft tells us that a test deployment on 1,632 servers was able to increase query throughput by 95 per cent, while only increasing power consumption by 10 per cent.
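Run those two numbers together and the efficiency story writes itself, a quick bit of arithmetic using only the figures Microsoft quoted:

    # Back-of-the-envelope maths on Microsoft's reported test figures
    throughput = 1.95  # query throughput up 95 per cent
    power = 1.10       # power consumption up 10 per cent

    print(f"{(throughput / power - 1) * 100:.0f}% more queries per watt")  # ~77%

In other words, nearly twice the work for barely more electricity, or roughly 77 per cent more queries per watt.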

Though FPGA technology is well understood and used widely in the embedded technology industry, it's rare to hear of it being paired with standard off-the-shelf CPUs for accelerating web-facing software – until now, that is.

"We're moving into an era of programmable hardware supporting programmable software," Microsoft Research's Doug Burger told The Register. "We're just starting down that road now."

If Microsoft has indeed figured out how to almost double the performance of its computers while only paying a tenth more in electricity for large-scale data center tasks – and we see no reason to doubt it – that's not only a huge saving, but also a way to sidestep the slowdown in run-of-the-mill CPU chips.

"Based on the results, Bing will roll out FPGA-enhanced servers in one data center to process customer searches starting in early 2015," Derek Chiou, a principal architect of Bing, said in a statement emailed to El Reg.

"We were looking to make a big jump forward in data center capabilities. It's an important area," Microsoft Research's Doug Burger explained to us.

"We wanted to do something that we thought could put us on a path that makes some really big leaps. Rather than banking on scaling to many, many more cores, let's take a different path – what can we do in hardware? We think specialization is going to be the next big thing."

Microsoft isn't doing this on a hunch. Burger co-authored a paper [PDF] in 2011, Dark Silicon and the End of Multicore Scaling, which predicted that "left to the multicore path, we may hit a 'transistor utility economics' wall in as few as three to five years, at which point Moore's Law may end, creating massive disruptions in our industry."

So far, there are few signs to the contrary. [And if you want a real horror story, take a gander at the slow development of EUV lithography — Ed.]

