Understanding Cloud Computing Benchmarks and Their Mileage

Benchmarking is prevalent in the IT world. To prove that their services are better than the competition's, cloud providers steer benchmarking tests toward the performance metrics that favor them. The numbers that come out of cloud benchmarks, however, are often vague. The organizations running the tests acknowledge that benchmarks can be a useful guide in some situations, and although cloud benchmarks offer an objective comparison of providers, users should rely on them at their own risk.

CloudHarmony, one of the first cloud-benchmarking services on the market, is likely to be an important resource for evaluating cloud provider performance. It runs a variety of benchmarks developed by Phoronix.

Jason Read, CloudHarmony's founder, explained that the service is "trying to provide an apples-to-apples comparison of different providers," but it offers no assurance of how an application will actually run on any given provider's cloud infrastructure. Read added that third-party benchmarks can point users toward the cloud that fits their particular circumstances, but users still need to be diligent about running their own tests.

Doug Willoughby, cloud strategy director for Compuware, phrases it this way: "Your mileage may vary." A benchmark puts each cloud through its paces using standard applications and configurations, but a real deployment faces an array of other variables.

One of the most apparent is the user's own application and desired configuration. The odds of recreating the exact benchmark environment are low, because every user's application is distinct and cloud providers offer a wide range of operating systems, storage options, instance sizes, and databases.

The network is another important variable. Compuware's CloudSleuth service, for example, monitors the latency of a generic retail application hosted on a number of leading clouds. To measure network performance, CloudSleuth performs around 200 different tests, hitting each provider with 3,000 individual tests daily.
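
To get a feel for what this kind of latency monitoring involves, here is a minimal Python sketch. The endpoint URLs and sample counts are placeholders, not CloudSleuth's actual targets or methodology; the idea is simply to time the same application deployed on several clouds.

```python
import time
import statistics
import urllib.request

# Hypothetical endpoints for the same sample application deployed on
# several clouds -- replace with your own deployments.
ENDPOINTS = {
    "provider-a": "https://app.provider-a.example.com/",
    "provider-b": "https://app.provider-b.example.com/",
}

def sample_latency(url, samples=10):
    """Fetch the URL repeatedly and return response times in seconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as resp:
            resp.read()  # include the full download in the measurement
        times.append(time.perf_counter() - start)
    return times

for name, url in ENDPOINTS.items():
    t = sample_latency(url)
    print(f"{name}: median {statistics.median(t):.2f}s, "
          f"min {min(t):.2f}s, max {max(t):.2f}s")
```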

Willoughby said that several factors, such as network congestion or packet loss, can drag network traffic below the average baseline. He describes CloudSleuth as a visualization tool that gauges providers' latency so users can see what is going on, rather than as a benchmarking service.

Another issue is choosing a uniform instance size and configuration with which to benchmark each cloud. It is hard to get identical CPUs, storage specs, and memory from every provider, and some clouds offer higher-performing instances than the ones being tested.

Consider the case of Amazon Web Services and its Cluster Compute Instances. In CloudHarmony's tests, AWS seldom achieves the highest performance, yet none of those tests use Cluster Compute Instances. At $1.60 per hour, Cluster Compute Instances have not been benchmarked alongside the rest of the pack, even though they performed on par with or better than the field in CloudHarmony's earlier tests.

Price performance is another concern for cloud users. It is one main reason AWS does not mind when competitors boast that their platforms outperform EC2. As long as customers recognize EC2's low prices and AWS's rich feature set, AWS remains confident that customers' decisions won't rest on performance alone.
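
The price-performance calculation itself is simple arithmetic: divide a benchmark score by the hourly instance price. A rough sketch follows; the scores and the second instance are invented for illustration, and only the $1.60-per-hour Cluster Compute figure comes from above.

```python
# Hypothetical benchmark scores (higher is better); only the $1.60/hour
# Cluster Compute Instance price is taken from the article.
instances = {
    "aws-cluster-compute": {"score": 100.0, "price_per_hour": 1.60},
    "provider-x-large":    {"score": 80.0,  "price_per_hour": 0.96},
}

for name, spec in instances.items():
    ratio = spec["score"] / spec["price_per_hour"]
    print(f"{name}: {ratio:.1f} benchmark points per dollar-hour")
```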

Cloud Benchmarking Guidance

CloudHarmony tested different instance sizes from each provider across several benchmarks in five categories: CPU performance, memory IO, disk IO, encoding and encryption, and Java/Ruby/PHP/Python performance. The most significant results are as follows:

  • Storm On Demand was the top performer across the board, exceeding other clouds by a reasonable margin.
  • Bluelock and GoGrid also ranked consistently among the top three providers in all categories.
  • Rackspace stayed near the middle of the pack, while OpSource was consistently near the bottom.
  • AWS fared well, especially at the larger instance sizes in the Java/Ruby/PHP/Python tests.
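
Readers who want to sanity-check results like these on their own instances can start with very small micro-benchmarks before reaching for a full suite. Below is a minimal Python sketch of a crude CPU and disk IO timing test; it is not CloudHarmony's methodology, which relies on the Phoronix benchmarks.

```python
import os
import time
import hashlib
import tempfile

def cpu_benchmark(iterations=200_000):
    """Time repeated SHA-256 hashing as a crude CPU test."""
    data = b"x" * 4096
    start = time.perf_counter()
    for _ in range(iterations):
        hashlib.sha256(data).digest()
    return time.perf_counter() - start

def disk_benchmark(size_mb=256):
    """Time a sequential write of size_mb megabytes as a crude disk IO test."""
    block = os.urandom(1024 * 1024)
    with tempfile.NamedTemporaryFile() as f:
        start = time.perf_counter()
        for _ in range(size_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())
        return time.perf_counter() - start

print(f"CPU:  {cpu_benchmark():.2f}s for 200k SHA-256 rounds")
print(f"Disk: {disk_benchmark():.2f}s to write 256 MB sequentially")
```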

Performance will depend on the actual application being run and the overall architecture. AWS is a feature-rich platform that has performed well while supporting far more users; Bluelock is a VMware-only cloud; Rackspace will change once it completes its OpenStack makeover; OpSource is built for security and maximum networking flexibility; and Storm On Demand is still unproven.

Network performance is also an important factor to consider. Over the 30-day test period for its sample application, CloudSleuth's data shows that Microsoft Windows Azure (U.S. Central) delivered the fastest response time, at 2.82 seconds. Following in sequence were Rackspace (U.S. East, 3.12 seconds), GoGrid (U.S. East, 3.28 seconds), OpSource (U.S. East, 3.45 seconds) and Google App Engine (3.54 seconds), while AWS ranked 10th at 3.85 seconds. The differences may look small, but on the Internet every second translates into money.

Compuware's Willoughby cited a study showing that every two seconds of waiting time results in an 8 percent abandonment rate.
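
Turning that figure into a dollar estimate is straightforward arithmetic. Here is a rough sketch that treats the 8-percent-per-two-seconds number as a linear rate; the visit count, conversion rate, and order value are invented for illustration.

```python
# Assumption: abandonment grows roughly 8% per 2 seconds of extra wait,
# per the study Willoughby cites. Traffic and order figures are made up.
ABANDON_RATE_PER_SECOND = 0.08 / 2.0

def lost_revenue(extra_wait_s, monthly_visits=100_000, avg_order=50.0,
                 conversion=0.02):
    """Estimate monthly revenue lost to abandonment for extra_wait_s seconds."""
    abandoned = min(1.0, extra_wait_s * ABANDON_RATE_PER_SECOND)
    return monthly_visits * conversion * avg_order * abandoned

# Roughly one second separates the fastest and slowest clouds in the
# CloudSleuth figures above (2.82s vs. 3.85s).
print(f"${lost_revenue(1.0):,.0f} per month lost at +1s of response time")
```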

Cedexis, through its Radar service, also monitors network performance using different measurements and source locations, measuring response times from more than 1.7 million end-user nodes around the world. For the period of April 27 to May 25, the results show AWS's three Eastern regions leading the pack, followed by Windows Azure and GoGrid in the fourth through sixth spots, with Google App Engine landing in the tenth spot.

What Can Be Expected?

The current cloud computing benchmarks and performance measurements can be used as guides, paving the way for more legitimate measures of what users can expect next. More enterprise applications, including some from SAP and Oracle, are moving into the cloud, which will allow more real-world application benchmarking. A large number of products and services on the market can also monitor a user's cloud server performance, ranging from startup cloud services like ServerDensity to big-name vendors such as VMware (Hyperic) and CA (Nimsoft).
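
Even without a commercial monitoring product, basic self-monitoring can start very simply. Here is a minimal sketch using the open-source psutil package (unrelated to the vendors named above) to collect the kinds of metrics a hosted agent would report.

```python
import time
import psutil  # third-party package: pip install psutil

def snapshot():
    """Collect a few basic server metrics, similar in spirit to what
    hosted monitoring agents report back to their dashboards."""
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

# Take a handful of samples; a real agent would run continuously and
# ship the readings to a collector instead of printing them.
for _ in range(3):
    print(snapshot())
    time.sleep(10)
```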

If companies released anonymized data or ran and shared their own analyses, a clearer view of which application types, configurations, and clouds deliver the best performance would likely emerge.

Cloud users do not need to agonize over which cloud is the right one for their applications. Cloud resources are becoming less expensive, and benchmarks and performance measurements are a good starting point. Running trials on several clouds to determine which performs best does not require much time, money, or commitment.



3 comments

  1. Thank you for bringing more clarity to cloud benchmarking; it confuses many people. I was curious, though, why you said "while Storm On Demand is unproven"? I would be happy to provide you a free trial so that you may test our platform yourself. I would also like to point out that Storm On Demand currently powers several thousand websites and has been on the market since February of 2010. Here is a short list of customers, which includes General Motors, Klout, Purdue University, Suave, Best Buy and many more: https://www.stormondemand.com/servers/clients.html. I just wanted to take a moment to clarify that point. Thank you very much for including Storm On Demand in your article. Travis Stoliker
