Nemesis looms over TPC-C benchmark

Expensive and outdated


TPC-C - that venerable server-cum-database benchmark from the Transaction Processing Performance Council - looks set to be pensioned off at last with the recent announcement of a new OLTP (OnLine Transaction Processing) benchmark specification, TPC-E.

The new benchmark is aimed at moving the touchstone of database and server performance into the world of today rather than yesterday. This is being achieved by shifting the fundamental premise on which the specification is built to a model that better fits current business needs.

TPC-C was built around a model of the database system needed to run a parts warehouse. TPC-E shifts that model to a typical brokerage operation, which should give a better simulation of the real-world transactions that now take place over the Internet, including the time delays and fractures that can occur between the distributed contributors to a transaction. In operation, the benchmark goes through the process of interacting with the financial markets to execute customer orders and update the relevant account information files.
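The shape of that workload can be sketched in a few lines of code. What follows is a minimal illustration only: the schema, table names, and prices are invented for this article and bear no relation to the actual TPC-E schema, which is far larger.

    # Minimal sketch of the kind of OLTP transaction TPC-E exercises:
    # a customer's buy order is executed against the market and the
    # account records are updated atomically. Schema and figures are
    # invented for illustration, not taken from the TPC-E draft.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE account (acct_id INTEGER PRIMARY KEY, balance REAL);
        CREATE TABLE holding (acct_id INTEGER, symbol TEXT, qty INTEGER);
        CREATE TABLE trade   (trade_id INTEGER PRIMARY KEY AUTOINCREMENT,
                              acct_id INTEGER, symbol TEXT,
                              qty INTEGER, price REAL);
        INSERT INTO account VALUES (1, 10000.0);
    """)

    def execute_buy_order(acct_id, symbol, qty, market_price):
        """Execute a buy order and update the account in one transaction."""
        cost = qty * market_price
        with conn:  # BEGIN ... COMMIT; rolls back on any exception
            bal = conn.execute("SELECT balance FROM account WHERE acct_id=?",
                               (acct_id,)).fetchone()[0]
            if bal < cost:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE account SET balance = balance - ? "
                         "WHERE acct_id=?", (cost, acct_id))
            conn.execute("INSERT INTO holding VALUES (?, ?, ?)",
                         (acct_id, symbol, qty))
            conn.execute("INSERT INTO trade (acct_id, symbol, qty, price) "
                         "VALUES (?, ?, ?, ?)",
                         (acct_id, symbol, qty, market_price))

    execute_buy_order(1, "VULT", 100, 42.50)

The point of the pattern is atomicity: either the balance debit, the new holding, and the trade record all land, or none do, which is exactly the property an OLTP benchmark has to stress.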

The Council has also aimed at getting more reality into the benchmark by creating 'real' names, addresses and business details for the test database from data culled from US census and New York Stock Exchange information.

TPC-E is also scalable, so that the number of customers can be varied. This should expand the benchmark’s usefulness in allowing database and server vendors to pitch at small and mid-range market sectors specifically, and at a price that smaller vendors can afford. The price of conducting benchmark tests has been one of the recent criticisms of TPC-C, so much so that only the major vendors now attempt it.
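As a rough illustration of how a scalable, census-seeded test database might be populated, here is a hypothetical generator. The name lists, weights, and function names are our own stand-ins, not anything taken from the draft specification.

    # Hypothetical sketch in the spirit of TPC-E's data generation:
    # names are sampled from abbreviated census-style frequency lists,
    # and the customer count is a free parameter so the same generator
    # can target small or large configurations.
    import random

    SURNAMES   = [("SMITH", 1.006), ("JOHNSON", 0.810), ("WILLIAMS", 0.699)]
    FIRSTNAMES = [("JAMES", 3.318), ("JOHN", 3.271), ("MARY", 2.629)]

    def generate_customers(num_customers, seed=42):
        rng = random.Random(seed)   # deterministic, so runs are repeatable
        first = [n for n, _ in FIRSTNAMES]
        fw    = [w for _, w in FIRSTNAMES]
        last  = [n for n, _ in SURNAMES]
        lw    = [w for _, w in SURNAMES]
        for cust_id in range(1, num_customers + 1):
            yield {
                "cust_id": cust_id,
                "name": f"{rng.choices(first, fw)[0]} "
                        f"{rng.choices(last, lw)[0]}",
            }

    # Scale the database to the market segment being tested:
    small_config = list(generate_customers(1_000))
    large_config = list(generate_customers(100_000))

The single customer-count knob is what lets the same generator target anything from a departmental server to a datacentre-class configuration.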

One reason for this is the suggestion that results can be affected by factors such as the number of disk drives used, rather than the total disk capacity, so more small-capacity disks can mean a better test result. But it also means an impractical installation configured specifically for the test, sometimes with several thousand disk drives and a price tag running well into the millions. Such a setup is decreasingly relevant to the average business requirement, though the benchmark results are still seen as good guidance.
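The back-of-the-envelope arithmetic behind that criticism is simple: random-I/O throughput scales with the number of spindles, not their capacity. All figures in the sketch below are illustrative assumptions, not numbers from any TPC specification.

    # Why spindle count, not capacity, drives the result: hitting a
    # high transaction rate demands aggregate random IOPS, and each
    # drive contributes a roughly fixed amount. Figures are assumed.
    IOPS_PER_DRIVE = 150       # assumed random IOPS for one spindle
    IOS_PER_TXN    = 20        # assumed disk I/Os per transaction
    TARGET_TPS     = 25_000    # target transactions per second

    required_iops = TARGET_TPS * IOS_PER_TXN
    drives_needed = -(-required_iops // IOPS_PER_DRIVE)  # ceiling division

    print(f"{required_iops:,} IOPS -> ~{drives_needed:,} drives")
    # 500,000 IOPS -> ~3,334 drives: capacity-wise overkill, bought
    # purely to multiply spindles for the benchmark run.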

TPC-E is currently in draft as Version 1.0.0, and more information, together with the detailed specification, can be found here. ®

