Do you have a few hundred million rows of data that need sorting?
If so, Google wants you to send them in the direction of its new BigQuery cloud analytics service, which has left beta status behind it and is now ready to crunch real, live data in exchange for your hard-won cash.
Big G launched the service in November 2011. The blog post announcing the pre-launch mentioned only a REST interface for users; the post green-lighting the product as officially in production says there's now a web application front-end too.
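For the terminally curious, the REST interface takes queries as JSON over HTTP. The sketch below merely builds such a request rather than sending it: the project ID and table name are hypothetical placeholders, the v2 endpoint is our assumption about the current API version, and the OAuth-authorised HTTP client a real call would need is omitted entirely.

```python
import json

# Hedged sketch: assembling the JSON body for a synchronous query
# against BigQuery's REST API. "my-project" and the table name are
# made-up examples; authentication is left out.
PROJECT_ID = "my-project"
ENDPOINT = ("https://www.googleapis.com/bigquery/v2/"
            "projects/%s/queries" % PROJECT_ID)
REQUEST_BODY = json.dumps({
    "query": "SELECT COUNT(*) FROM mydataset.mytable",  # example SQL
    "timeoutMs": 10000,  # how long to wait before the call returns
})
```

Posting that body to the endpoint (with suitable credentials) would return the query results as JSON.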
New users can upload and delve into 100GB of data a month at no charge. Beyond that, get ready to shell out US$0.12 per gigabyte for storage and $0.035 per gigabyte processed. Storage is capped at 2TB, and there's a per-day limit of 1,000 queries and 20TB of data processed. If you want more, Google asks you to step away from the self-provisioning tools and talk directly to its sales folk.
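To put those rates in perspective, here's a rough Python sketch of a monthly bill. It assumes the free 100GB allowance counts against data processed; the exact accounting isn't spelled out, and the workload figures in the example are purely illustrative.

```python
# Published BigQuery rates at launch (US dollars).
STORAGE_PER_GB = 0.12      # per gigabyte stored, per month
PROCESSING_PER_GB = 0.035  # per gigabyte of query data processed
FREE_PROCESSED_GB = 100    # assumption: free tier applies to processing

def monthly_cost(stored_gb, processed_gb):
    """Back-of-envelope monthly bill: storage plus billable processing."""
    billable = max(processed_gb - FREE_PROCESSED_GB, 0)
    return stored_gb * STORAGE_PER_GB + billable * PROCESSING_PER_GB

# e.g. 500GB stored and 1TB processed in a month:
# 500 * 0.12 + 900 * 0.035 = $91.50
```

Even a fairly modest analytics habit, in other words, runs to real money once the free tier is exhausted.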
The service is said to support “hundreds and hundreds of terabytes”. The BigQuery site is silent on methods other than uploads for moving such volumes of data into the service, as is the Google Storage site. We've asked Google to explain just how that's possible, given that uploading even 100GB is not something many would wish to attempt, and in light of the fact that Amazon offers the chance to ship drives for on-site injection into its cloud. ®