A crack in the madness of clouds

Sanity check '09?

Besides providing some of the biggest technical innovations of 2008, the cloud also wins the award for most amorphous product definition. Few people define "the cloud" or "cloud computing" the same way, leading to market noise and a wealth of misinformation.

"The cloud" as a term really started as a metaphor for the "internet" and has since been bastardized to mean pretty much anything that isn't on-premise computing.

What will we see in 2009? Sadly, not a miraculous understanding, but instead a glimmer of hope that the cloud can live up to the hype. In 2008 we were mostly able to break cloud computing (or the cloud - whatever you like) into the following segments:

  • Infrastructure-as-a-Service - operating system hosting with dynamic provisioning and theoretically unlimited resource scaling. A leading example is Amazon's EC2
  • Platform-as-a-Service - hosting for developing and deploying applications with no infrastructure to manage. Examples here include Force.com, Heroku and Google App Engine
  • Software/Application-as-a-Service - applications delivered via a web browser. Examples include Salesforce.com, NetSuite and Gmail

Storage-as-a-Service hovered on the fringe in 2008 and I suspect that it too will become part of the cloud taxonomy in 2009.

Outside of cloud-y infrastructure we also have accessories such as tooling, including monitoring, configuration management and automation, as well as a never-ending supply of APIs to interact with every cloud service. These will become more relevant as cloud consumption increases. With this in mind, these are the masses of precipitation I think we should expect in 2009.

Data-as-a-Service

Breaking away from database tables and silos, data-as-a-service will provide a way to consume disparate structured and unstructured cloud-based data across various networks.

For example, let's say I want to pull inventory data from Amazon S3 and user data stored in Google's Bigtable and join the two in Salesforce.com through some kind of query. In the relational database world you would use SQL, but each cloud provider has implemented a slightly different database structure and API. Somewhere down the line this will have to get easier.
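
To make the pain concrete, here is a minimal sketch of the glue code such a cross-cloud join takes today (classic boto on Python 2). It assumes, purely for illustration, that the inventory sits as a CSV object in an S3 bucket and that the user records are exposed through a hypothetical JSON endpoint standing in for Bigtable, which can't be queried directly from outside App Engine. Every bucket, URL and field name below is made up.

    # A hand-rolled cross-cloud "join". All names here (bucket, key, URL and
    # field names) are illustrative placeholders, not from a real deployment.
    import csv
    import json
    import urllib2
    from StringIO import StringIO

    import boto  # Amazon's Python library for S3 and EC2


    def fetch_inventory(bucket_name, key_name):
        """Pull a CSV inventory file out of S3 and index it by SKU."""
        conn = boto.connect_s3()  # reads AWS credentials from the environment
        key = conn.get_bucket(bucket_name).get_key(key_name)
        rows = csv.DictReader(StringIO(key.get_contents_as_string()))
        return dict((row["sku"], row) for row in rows)


    def fetch_users(url):
        """Fetch user records as JSON from some other provider's API."""
        return json.load(urllib2.urlopen(url))


    def join_users_to_inventory(users, inventory):
        """The 'query': a join that SQL would express in a single statement."""
        for user in users:
            item = inventory.get(user.get("last_sku"))
            if item:
                yield {"user": user["name"], "item": item["description"]}


    if __name__ == "__main__":
        inventory = fetch_inventory("example-inventory-bucket", "inventory.csv")
        users = fetch_users("http://users.example.com/users.json")
        for row in join_users_to_inventory(users, inventory):
            print(row)

Every additional data source means another client library and another ad hoc join like this one - exactly the friction a data-as-a-service layer would remove.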

The ability to apply cohesion to disparate data regardless of what structure it's in or how it's stored could be the cloud's Holy Grail. Former Credit Suisse software analyst Jason Maynard has called this "data-as-an-answer" for its ability to provide insight beyond one silo.

I also think this could introduce the concept of cloud droplets, which will form micro-clouds or transaction points that combine into a larger computing environment.

In physics, a cloud droplet is very small compared to a raindrop: it takes about a million cloud droplets to make a single raindrop. In the same vein, I believe that we'll eventually get to a place where endpoints - or end-users - act as cloud droplets, potentially in the form of peer-to-peer cohesion or possibly as a function of data federation.

Internal clouds

A "compute cloud" is a different animal according to the developers of Eucalyptus, an open-source, EC2-compatible infrastructure-as-a-service. Typically based on virtual machines, "cloud computing allows users to dynamically provision processing time and storage space from a ubiquitous 'cloud' of computational resources."
