Computer Business Review

Amazon Cloud Computing goes beta

CBR Staff Writer

18:30, August 24 2006

Amazon.com Inc has launched into the emergent utility computing space, until now the experimental playground of tech heavyweights such as Sun Microsystems Inc and Hewlett-Packard Co.

The retailer's web services division yesterday opened the Elastic Compute Cloud for limited beta testing, promising prices starting at $0.10 per virtual server per hour.

EC2, as Amazon abbreviates it, is one of the first services to address anticipated demand for on-demand computing capacity at the lower end of the market.

It's targeted at smaller web developers that don't want to be caught unawares if their services become unexpectedly popular. The likes of Sun and HP are targeting larger organizations with complex, resource-intensive processing tasks: big number-crunching jobs.

Other options out there are geared toward batch-oriented computing, where developers set up their application, upload their work, watch the job from a dashboard, and pick up their results when it is done, said an Amazon spokesperson.

With the current dot-com-style explosion in web startups, and the myriad new ways in which a site can be discovered by large numbers of people at once and hit by what was once mainly known as the Slashdot effect, there could be demand for services like EC2.

Amazon is basically renting out virtual servers, or instances, at its data centers. Rather than pay a fixed fee for your expected capacity plus redundancy, the typical hosting model, with EC2 you only pay for what you use. Users pay by the server-hour and by the number of gigabytes of data that internet users chug from them.

Customers pay $0.10 for each hour or part-hour an instance is running, and $0.20 per gigabyte of data transfer. An additional $0.15 per gigabyte per month fee for storage is billed separately for using Amazon Simple Storage Service, known as S3.
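Put together, those rates make a back-of-the-envelope monthly estimate straightforward. The sketch below uses the published prices; the traffic and storage figures are illustrative assumptions, not figures from Amazon:

```python
# Rough monthly cost for one always-on EC2 beta instance.
# Rates are Amazon's published beta prices; usage figures are assumed.
HOURS_IN_MONTH = 24 * 30       # one instance running continuously
INSTANCE_RATE = 0.10           # $ per instance-hour
TRANSFER_RATE = 0.20           # $ per GB of data transfer
S3_STORAGE_RATE = 0.15         # $ per GB-month of S3 storage

transfer_gb = 50               # assumed outbound traffic per month
s3_gb = 10                     # assumed AMI/data footprint on S3

compute = HOURS_IN_MONTH * INSTANCE_RATE   # 720 hours * $0.10 = $72.00
transfer = transfer_gb * TRANSFER_RATE     # 50 GB * $0.20 = $10.00
storage = s3_gb * S3_STORAGE_RATE          # 10 GB * $0.15 = $1.50

total = compute + transfer + storage
print(f"Estimated monthly bill: ${total:.2f}")  # $83.50
```

At those assumed usage levels a continuously running instance would cost under $100 a month, with the hourly compute charge dominating the bill.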

The company said each instance is roughly equivalent to a server with a 1.7GHz Xeon processor, 1.75GB of RAM, 160GB of local disk, and a 250Mbps internet connection, bursting to 1Gbps. In future, other tiers of virtual machine will be introduced, the firm said.

To use the service, webmasters create a server image to their own spec, or select one of Amazon's preconfigured images. This Amazon Machine Image is then uploaded to S3, where it sits until it is invoked via a published API.

Each user can have multiple AMIs -- they could have one for a web server and one for a database, for example. In the beta, up to 20 instances can be run at any time by a single user.
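That lifecycle can be sketched in code. The class below is a toy stand-in, not Amazon's actual SDK or API (the real service is driven through the published API mentioned above); the class and method names are invented here purely to illustrate the launch-and-terminate flow and the beta's 20-instance ceiling:

```python
# Hypothetical sketch of the EC2 workflow described above: launch
# instances from named AMIs, then terminate them. Not Amazon's API.
class EC2Client:
    """Toy stand-in that tracks launched instances in memory."""
    MAX_INSTANCES = 20  # beta limit: 20 concurrent instances per user

    def __init__(self):
        self._instances = []

    def run_instances(self, ami_id, count=1):
        """Launch `count` instances of an AMI, enforcing the beta cap."""
        if len(self._instances) + count > self.MAX_INSTANCES:
            raise RuntimeError("beta limit of 20 concurrent instances")
        new_ids = [f"i-{len(self._instances) + n:05d}" for n in range(count)]
        self._instances.extend(new_ids)
        return new_ids

    def terminate_instances(self, instance_ids):
        """Stop the given instances; billing is per (part-)hour running."""
        self._instances = [i for i in self._instances if i not in instance_ids]

client = EC2Client()
web = client.run_instances("ami-webserver", count=2)  # one AMI for the web tier
db = client.run_instances("ami-database")             # another for the database
client.terminate_instances(web + db)                  # stop paying when done
```

The point of the sketch is the billing model: instances exist only while explicitly running, so a developer scales up under load and terminates instances as soon as they are no longer needed.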

With EC2, developers can install their own OS, applications, and configuration, log in, use or secure their network as they see fit, and exercise root control of their instances, the Amazon spokesperson added.

While utility computing has been touted for some time - Sun started testing its Sun Grid last year, and opened the service to the public in March - the jury is out on how much demand there will be.

Sun's plan is to create a utility that can not only be used for big, heavy batch processes, but can also be tapped by the likes of Amazon and other software and service providers that in turn offer transactional utilities to their own customers.

As we reported in Computer Business Review's Infrastructure MarketWatch in June, Sun was said to have a backlog of about 2,000 software vendors waiting to test their applications on the service.

Amazon, by contrast, is taking its service directly to web developers, and has a specific use case in mind.

The company is not currently disclosing how long the beta test will run for, or how many testers it is accepting. Would-be users are advised to sign up promptly to avoid disappointment.
