When the chips are down, thank goodness for software engineers: AI algorithms 'outpace Moore's law'

ML eggheads, devs get more bang for their buck, say OpenAI duo


Machine-learning algorithms are improving in performance at a rate faster than that of the underlying computer chips, we're told.

AI software techniques have become so efficient, in fact, that engineers can now train a neural network on ImageNet – a top dataset for image-recognition systems – to about 79.1 per cent accuracy using 44 times less compute power than was needed back in 2012. That's according to a study [PDF] emitted this week by OpenAI, which estimated that, at this rate of improvement, algorithmic efficiency doubled every 16 months over seven years.

"Notably, this outpaces the original Moore’s law rate of improvement in hardware efficiency (11x over this period)," the paper, by Danny Hernandez and Tom Brown, stated.

That law is an observation made in the 1960s by Intel co-founder Gordon Moore that the number of transistors on a chip doubles roughly every two years, leading people to expect processor performance to double over the same period. It's also a law that has been dying since at least 1999, and has been considered dead since 2018.
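The two rates can be sanity-checked against each other with some back-of-the-envelope arithmetic. A minimal sketch: the 44x efficiency gain and seven-year span are the paper's figures, while the 24-month doubling period is the classic Moore rule of thumb, not a number from the study.

```python
import math

# Paper's headline figures: reaching AlexNet-level accuracy in 2019
# took 44x less compute than in 2012, a span of roughly seven years.
months = 7 * 12          # 84 months
efficiency_gain = 44

# Implied doubling time for algorithmic efficiency:
# solve 2^(months / T) = 44 for T.
algo_doubling_months = months / math.log2(efficiency_gain)

# Moore's law rule of thumb: a doubling every ~24 months, which over
# the same span yields the ~11x hardware figure quoted in the paper.
moore_gain = 2 ** (months / 24)

print(f"algorithmic efficiency doubles every ~{algo_doubling_months:.1f} months")
print(f"hardware improvement over the same span: ~{moore_gain:.1f}x")
```

The ~15.4-month result rounds to the paper's "every 16 months" claim, and 2^3.5 ≈ 11.3 matches the quoted 11x hardware improvement.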

(We like to joke that Moore's second law is that no journalist can write about Intel without mentioning the first law.)

An accuracy level of 79.1 per cent may seem low at first, yet it was chosen because that was the level of performance for AlexNet when it won the ImageNet challenge in 2012. AlexNet is celebrated as the first model that rekindled computer and data scientists' obsession with neural networks.


The improvement isn’t just in computer-vision models: it can also be seen in other types of neural network architectures, such as those for language translation and reinforcement learning, OpenAI said.

“Increases in algorithmic efficiency allow researchers to do more experiments of interest in a given amount of time and money,” the OpenAI duo wrote. "In addition to being a measure of overall progress, algorithmic efficiency gains speed up future AI research in a way that’s somewhat analogous to having more compute."

Although advances in algorithmic performance are good news for the machine-learning community, it's worth pointing out that models are getting larger and more complex, and require significant resources and money to train. A recent paper by AI21, an Israel-based research hub focused on natural language processing, revealed that it costs anywhere from $2,500 to $50,000 to train a language model with 110 million parameters. When that number increases to 1.5 billion parameters – the size of OpenAI’s GPT-2 – the costs jump to anywhere between $80,000 and a whopping $1.6m. ®

