Amazon piled on the new cloud features at its AWS Summit event in San Francisco, including new services for file storage and machine learning, and updates to features that debuted last year.
Speaking at Thursday morning's keynote, AWS boss Andy Jassy described cloud computing as "the new normal" for businesses of all sizes and said Amazon offered more and broader functionality than any other cloud IT vendor.
The cloud mega-giant rolled out 516 major new features and services in 2014, Jassy said, and he had several new ones to announce at this week's conference that should please developers and data center admins.
Objects, blocks, and now files
The first was Amazon Elastic File System (EFS), a new storage service that allows customers to store their files directly in the AWS cloud.
Previously, Amazon offered only object-based storage via its Simple Storage Service (S3), SAN-style block-based storage via the Elastic Block Store (EBS), and archival storage via the Amazon Glacier service. The new offering mimics a shared file system in the cloud, accessible via the NFSv4 protocol and scalable up to petabytes in size.
"Whether you want to use this for an application that's small, development test, or if you want to use it for something that's very large with high demand and scalability; because it grows to petabyte-scale or more, it handles all of those use cases," Jassy said.
File systems can be created and managed using a GUI, command-line tools, or APIs. Each file system can be mounted from multiple Elastic Compute Cloud (EC2) instances simultaneously, and uses SSD-based storage for maximum performance. To ensure high availability, all files, directories, and links are replicated across multiple Availability Zones.
Fees for the service are straightforward at $0.30 per gigabyte of storage used, billed monthly based on the average usage throughout the month. You'll have to wait to get your hands on it, though; Amazon says EFS will become available in preview "in the near future," but it's accepting applications to try it out now.
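To make the billing model concrete, here is a rough sketch of how a month's EFS bill works out at the quoted $0.30 per gigabyte rate. The daily-readings averaging below is an illustrative assumption; AWS's actual metering is more fine-grained.

```python
# Sketch of EFS-style billing: $0.30 per GB-month, charged on the
# average amount of storage used across the billing period.
# Daily sampling here is an illustrative simplification.

RATE_PER_GB_MONTH = 0.30

def monthly_efs_cost(daily_usage_gb):
    """Estimate the month's bill from a list of daily storage readings (GB)."""
    average_gb = sum(daily_usage_gb) / len(daily_usage_gb)
    return average_gb * RATE_PER_GB_MONTH

# A file system that grows steadily from 100 GB to 300 GB over a
# 30-day month averages 200 GB, so the bill is 200 * $0.30 = $60.00.
usage = [100 + i * (200 / 29) for i in range(30)]
print(round(monthly_efs_cost(usage), 2))  # 60.0
```

The point of averaging is that you pay for what you actually used, not for the peak: the same file system held at 300 GB all month would instead cost $90.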
Machine learning for the masses
Big data is another topic that has been on Amazon's mind, and on Thursday it announced a new service designed to enable developers to add machine learning to their applications, even if they have no direct experience in the field.
Jassy explained that Amazon has been experimenting with machine learning since its very early days as an online bookseller, for things like recommendation engines and fraud detection. Over the years it has developed in-house tools to make creating new machine learning models easier, and those tools have now become the basis of its new public service, Amazon Machine Learning.
Amazon data scientist Matt Wood took the stage on Thursday to explain how the new service can automatically pull data from S3, Amazon Redshift, or MySQL databases hosted on the AWS Relational Database Service, run that data through its built-in machine learning algorithms, and generate predictive models from it.
The service is designed to be usable by developers with no experience in statistics, data analysis, or any of the other mathematical drudgery that's ordinarily necessary to implement machine learning. All that's needed to create models is the AWS Management Console or the service's APIs. Once the models are built, predictions can be generated in real time or in batches, as needed.
"All of this is using the battle-hardened technology that's been tried and tested inside Amazon.com," Wood explained.
Amazon Machine Learning is available now and there are no fixed up-front costs. Data analysis and model building are billed at a rate of $0.42 per hour. After that, predictions are billed at $0.10 per batch of 1,000, or $0.0001 per real-time prediction.
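At those quoted rates, a back-of-the-envelope cost estimate is easy to sketch. The workload figures in this example (hours of model building, prediction counts) are made-up illustrations, not anything Amazon quoted:

```python
# Sketch of Amazon Machine Learning billing at the quoted rates:
# $0.42/hour for analysis and model building, $0.10 per 1,000 batch
# predictions, $0.0001 per real-time prediction.

ANALYSIS_RATE_PER_HOUR = 0.42
BATCH_RATE_PER_1000 = 0.10
REALTIME_RATE_EACH = 0.0001

def model_build_cost(hours):
    return hours * ANALYSIS_RATE_PER_HOUR

def batch_cost(n_predictions):
    return (n_predictions / 1000) * BATCH_RATE_PER_1000

def realtime_cost(n_predictions):
    return n_predictions * REALTIME_RATE_EACH

# Ten hours of analysis and model building, then a million batch
# predictions: 10 * $0.42 + 1,000 * $0.10 = $104.20.
total = model_build_cost(10) + batch_cost(1_000_000)
print(round(total, 2))  # 104.2
```

Note that the two quoted prediction rates both work out to $0.0001 per prediction; the choice between batch and real-time is about latency, not unit price.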