The McKinsey report "Clearing the air on cloud computing" is getting some attention. It has some good stuff in it, including the warning that cloud computing is approaching the top of the Gartner hype cycle. However, its claim that cloud computing (in the guise of EC2) ends up being more expensive per server-month for large enterprises than doing it in-house seems fatally flawed. In particular, it doesn't seem to account for the costs correctly, and it completely overlooks the benefits of automation in the cloud, which ultimately lead to a revolution in the way compute resources are consumed.
The cost equation in the report starts on slide 22, and it's really sketchy. They mix EC2 compute units and cores together (compare slides 22 and 23). They talk about "$14K/Server (2 CPU, 4 core)", which on my calculator comes out to about $97/core/month amortized over three years, yet the same slide lists a cost of $45/mo/CPU (and the $97 doesn't even account for the facility, power, or cooling).
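The arithmetic behind that $97 figure is simple enough to sketch, assuming a three-year amortization and reading the slide's "4 core" as four cores total (the report itself is ambiguous about whether it means four cores per CPU or per server):

```python
# Back-of-the-envelope check of the per-core cost implied by the
# report's "$14K/Server (2 CPU, 4 core)" figure.
server_cost = 14_000   # dollars per server, hardware only (per the slide)
months = 36            # three-year amortization (my assumption)
cores = 4              # total cores per server (my reading of the slide)

per_core_month = server_cost / (months * cores)
print(round(per_core_month, 2))  # ~97.22 -- hard to square with $45/mo/CPU
```

Note this is hardware only; facility, power, and cooling would push the number even further from the $45/mo/CPU quoted on the same slide.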
On slide 24 they suddenly compare an in-house datacenter server with "75% of EC2 Large Standard Windows configuration on Amazon EC2" and nowhere do they mention that the latter cost includes the Windows license. Ouch!
Unless they actually document their cost accounting in more detail, I can only say that it's flawed. Many business line owners in large corporations come to us and tell us they can't believe how cheap EC2 is, because their internal chargebacks by IT are $400+ per server.
The other big mystery is how McKinsey arrives at just a 10% labor reduction when moving to a "third-party cloud provider," and they quote $96/mo of labor for the cloud servers. For what? For the guy who clicks the "launch" and "terminate" buttons on the management dashboard?
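If the 10% figure is meant to apply to per-server monthly labor cost (the report doesn't actually say), then the implied in-house labor cost is barely higher than the cloud number:

```python
# What McKinsey's numbers imply, under the assumption that the claimed
# 10% labor reduction applies to per-server monthly labor cost.
cloud_labor = 96.0   # $/mo of labor per cloud server, per the report
reduction = 0.10     # McKinsey's claimed labor savings

in_house_labor = cloud_labor / (1 - reduction)
print(round(in_house_labor, 2))  # ~106.67 $/mo in-house -- a savings of ~$10.67
```

In other words, the report would have us believe that running your own data center costs only about ten dollars a month more in labor per server than clicking buttons on a dashboard.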
Again, the report is so thin on details that it's impossible to figure out what they're really thinking. Clearly a lot of staff is required to run a whole data center, along with a lot of service providers: the architects and engineers who build the facility, the HVAC guys cleaning filters, the folks maintaining the UPS batteries and the genset, and the security crew. 10%, yeah, right.
What the report seems to completely overlook is the possible reduction in sysadmin costs. One of the huge benefits of the cloud is that the entire computing infrastructure can be automated, top to bottom. That saves a lot of sysadmin labor, and in the end it means that requisitioning more compute capacity can be done by the end user somewhere in a business unit instead of being an IT chore.
The report also doesn't take into account the cost of the red tape that surrounds corporate IT - things the business can't do because IT can't support them; time wasted on workarounds instead of just launching a few more servers; having to guess, six months ahead of time, how many servers will be needed at launch; and the opportunity cost of projects that don't happen for lack of IT resources.
It would have been great to read a report that lays out all the costs and assumptions clearly, so one could retrace what is included and what is not. I would have loved to learn more about corporate IT costs. Alas, the report fails to do that, and it also fails to recognize that cloud computing revolutionizes the way compute resources are consumed, which ultimately is where the bigger benefits will come from.