Google's big Spanner in the works for price war against AWS

Shard this for a game of forks! Mega Mountain View SQL decision 'coming'


The cloud wars won't be won on price – but customers are waking up to the costs, according to Google.

"It's not a zero-sum game," Brian Stevens, Google's vice president for Cloud Platform and former Red Hat chief technology officer, told The Reg in a recent interview.

"We don't think you just win on cost alone, but we do think many companies we talk to are stunned about what cloud delivers but surprised by the bill.

"It's the first time you see the cost all in one. It's the first time they see the cost – before, it had been servers and a data center cost."

That said, Google did at least concede on price – cutting the price of Google Cloud Platform (GCP) API operations in storage following a rejiggering of its storage buckets.

Google had initially held the line in January 2016 in the face of price cuts by AWS, saying GCP remained less expensive than its rival's cloud service.

AWS and Microsoft have become both mind and market share winners when it comes to public infrastructure cloud – numbers one and two respectively.

They have scrambled their way there, helped in part by a cutting and counter-cutting of prices along the way.

Deutsche Bank reckoned in November that the price-cutting era was over, adding that prices for "basic" AWS services had fallen between 10 and 20 per cent annually since 2014.

But just as Deutsche Bank believed we were past the price-cut peak, AWS rolled out its 56th cut since 2006, which came into effect on December 1.

Then again, AWS and Azure do not remain cheap once you're on board. Pennies per user for compute or for storage soon become pounds, and then multiple thousands of pounds as you make greater use of their services.

Or when developers clock off – and leave VM instances running on the meter.

AWS last year announced a budget calculator to help customers control spend, letting them set up alerts as they approach their budget's limit.

Microsoft has given AWS a spirited run for its money, but has never managed to undercut Amazon.

In the UK, at least, Microsoft's prices are even going up – by 22 per cent, a move the company has blamed on the pound's tumble since Britain's June 2016 vote to leave the European Union.

Stevens – hired by Google in November 2014 – reckons the essential GCP plumbing is now in place.

"Google is in all the major deals against the other two," he said. "The technology platform has matured. What was missing was the enterprise integration capabilities – the security, encryption, key management, VPN, virtual private cloud, private IP cloud. Now we've addressed that over the past two years. There's no gap from a capability perspective. We win the majority of proofs of concept against the others so it shows we are there."

He promised that machine learning and telemetry would infuse all GCP services.

"The next phase of this is to show enterprise capabilities they never knew existed, to show them a level of telemetry in their infrastructure, show them the results of using machine learning in their infrastructure," Stevens told The Reg.

But there's at least one big feature left to drop, and it has people wondering: a commercial edition of Spanner, Google's massive post-NoSQL SQL database.

Spanner was used in Google's F1 ads platform and was planned to be rolled out to Gmail. In 2012, Google released a whitepaper to the world detailing Spanner.

Spanner is a massively distributed SQL architecture capable of storing data in different data centres while providing the classic attributes of RDBMS jettisoned by NoSQL, the generation of data architectures spawned to achieve web-scale storage. Spanner, according to Google's paper [PDF], is capable of scaling to millions of machines across hundreds of data centres and trillions of database rows.
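Those "classic attributes" are, in essence, ACID transactions: multi-row changes that commit or roll back as a single unit. A minimal sketch of the guarantee in question, using Python's built-in SQLite as a stand-in for any SQL engine – nothing here is Spanner-specific, and Spanner's client API is not public at this point:

```python
import sqlite3

# SQLite stands in for any ACID-compliant SQL engine. The point is the
# guarantee most NoSQL stores dropped: a multi-row change either commits
# in full or leaves the database untouched.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: both rows change, or neither does."""
    try:
        with conn:  # transaction scope: commit on success, roll back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            cur = conn.execute("SELECT balance FROM accounts WHERE name = ?", (src,))
            if cur.fetchone()[0] < 0:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
    except ValueError:
        pass  # rolled back; both balances untouched

transfer(conn, "alice", "bob", 60)   # succeeds
transfer(conn, "alice", "bob", 60)   # would overdraw: rolled back
balances = dict(conn.execute("SELECT name, balance FROM accounts"))
# balances == {"alice": 40, "bob": 60}
```

Spanner's trick, per the paper, is delivering exactly this behaviour when the two rows live in different data centres.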

Google followed by releasing Spanner to alpha in 2014, and the technology has since been picked up by others – notably the Spanner-inspired CockroachDB, from a team of ex-Googlers.

Google and its MO have form here: it released a white paper on MapReduce, its large-scale data-processing framework, in 2004 – a paper that was picked up by others and went on to inspire Hadoop.
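The core idea of that paper fits in a few lines: a map function emits key/value pairs, the framework groups them by key, and a reduce function folds each group. A toy, single-process version of the paper's word-count example (illustrative only – real deployments shard these phases across thousands of machines):

```python
from collections import defaultdict

def map_phase(doc):
    """Map: emit a (word, 1) pair for every word in the document."""
    for word in doc.split():
        yield word, 1

def reduce_phase(word, counts):
    """Reduce: fold all the counts emitted for one word into a total."""
    return word, sum(counts)

def mapreduce(docs):
    groups = defaultdict(list)
    for doc in docs:                       # map + shuffle: group values by key
        for key, value in map_phase(doc):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())  # reduce

counts = mapreduce(["the quick brown fox", "the lazy dog", "the fox"])
# counts == {"the": 3, "quick": 1, "brown": 1, "fox": 2, "lazy": 1, "dog": 1}
```

Because map and reduce are pure functions over independent inputs, the framework can distribute, retry, and parallelise them freely – the property that made the paper, and later Hadoop, so influential.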

Since then on Spanner? Nothing, but recent reports talk of rumblings inside Mountain View as Google's brains assess how to productise this planet-sized relational weapon.

Speaking to The Reg, Stevens promised "big new services" in SQL databases at scale from Google, but sidestepped the details.

"The world has never seen a distributed version of SQL – never seen a single instance of SQL that scales 100 per cent. That's why you had NoSQL," he said.

"We have been putting it in the hands of early access – putting technology in the hands of select customers before productising.

"After that we will make a production decision. We are getting encouraging signals back on that – it's one of the more traditional IT things we are working on." ®
