The Cloudy Enterprise: Hours More Important Than Dollars

My mother’s latest project is projected to be over budget. Thanks to a change in the way project budgets are allocated, she now has X dollars instead of Y hours. Her project needed 50,000 “IT” hours (yes, she actually did the quote thing with her fingers when she said that), but now it can only have 45,000 “IT” hours because the “cost” (yes, she did the finger quotes there, too, because enterprise dollars are more like Monopoly money than real money) of IT has increased by a few dollars per hour, and she was forced to completely change the way the project was accounted for: in ‘dollars’ instead of ‘hours’. Yes, I was flabbergasted, too, that the valuations changed in the middle of a project and an uncompromising stance of “figure it out” was imposed on all projects, but no one said working in IT was going to be logical, fair, or easy.
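To make the arithmetic concrete, here’s a minimal sketch of why a budget frozen in dollars buys fewer hours when the per-hour rate goes up. The rates below are assumptions, not her actual figures; I picked numbers that happen to reproduce the 50,000-to-45,000 drop.

```python
# Illustrative only: the actual rates weren't shared, so these are
# hypothetical numbers chosen to reproduce the 50,000 -> 45,000 hour drop.
old_rate = 45.0   # assumed original "cost" per "IT" hour, in dollars
new_rate = 50.0   # assumed rate after the "few dollars per hour" increase

budget_dollars = 50_000 * old_rate          # the budget, now frozen in dollars
affordable_hours = budget_dollars / new_rate

print(f"Budget: ${budget_dollars:,.0f}")
print(f"Hours that budget buys at the new rate: {affordable_hours:,.0f}")
# -> 45,000 hours: a 10% cut in staff time with no change in the work to be done
```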

This is problematic primarily because the bulk of the costs of projects within the enterprise are staff-related. Those “IT” hours might be budgeted separately from other project costs because they are “IT money” and not $vendor money that might be used to acquire software, hardware, or other tangible IT assets required for the project.

Staff related costs often were 50% to 70% of the total cost over a period of three years. Cost of communications, power, cooling and facilities could add up to another 30% to 40% of the total. Hardware and software, when combined, usually represented somewhere between 20% and 25% of the costs.

Surprised to find that people are the most costly component of an IT solution?

-- Cloud computing could cost more than using your own systems, April 2010, ZDNet (emphasis added)
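Taken at face value, those ranges explain why staffing dominates a multi-year total. The sketch below simply applies the quoted percentages to a hypothetical three-year total; the $10M figure is invented purely for illustration.

```python
# Apply the quoted cost ranges to a hypothetical three-year project total.
# The $10M total is made up purely for illustration.
three_year_total = 10_000_000

buckets = {
    "staff":                             (0.50, 0.70),
    "comms, power, cooling, facilities": (0.30, 0.40),
    "hardware and software":             (0.20, 0.25),
}

for name, (low, high) in buckets.items():
    low_d, high_d = three_year_total * low, three_year_total * high
    print(f"{name:<35} ${low_d:,.0f} - ${high_d:,.0f}")
# Staff alone is $5M-$7M of the $10M, which is why changing the deployment
# model without changing staffing barely moves the total.
```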

As she was explaining how the mid-project change essentially upended the budget, I realized public cloud computing couldn’t change that. There’s nothing in this math that would change simply because the organization was using a public cloud computing model instead of its traditional mix of client-server, web-based, and legacy mainframe models. The core cost of IT remains the same because it’s based on a complex calculation involving salaries, benefits, and compensation models. That figure, that $X per “IT” hour, doesn’t change just because the deployment architecture does.
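For the sake of illustration, here’s a minimal sketch of where a figure like $X per “IT” hour might come from, assuming a simple fully loaded rate model; real enterprise chargeback formulas are more involved, and every input below is made up.

```python
# A simplified "fully loaded" internal rate: salary plus benefits and overhead,
# spread across the hours available for project work. All inputs are hypothetical.
average_salary      = 90_000   # assumed average annual salary
benefits_multiplier = 1.35     # assumed benefits/compensation load
overhead_per_head   = 15_000   # assumed per-person overhead (tools, training, space)
billable_hours      = 1_800    # assumed project-available hours per year

loaded_cost = average_salary * benefits_multiplier + overhead_per_head
rate_per_hour = loaded_cost / billable_hours

print(f"Loaded cost per person: ${loaded_cost:,.0f} per year")
print(f"Internal rate: ${rate_per_hour:.2f} per 'IT' hour")
# None of these inputs change when the deployment target moves from the data
# center to a public cloud, which is why the per-hour figure stays put.
```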

If an organization uses different costs per “IT” hour based on the function provided, that might change a project’s overall costs. If security is $70 per hour while IBM WAS configuration is $50 per hour, that can make an impact on a project’s budget. Of course, differentiating IT costs based on function might lead to an ugly place: we don’t need all those security hours, we can’t afford them! We don’t need all those testing hours, we can’t afford them. It also impacts architecture. If, from an internal budget standpoint, network security is more “expensive” than putting security in the application, it stands to reason security is going to end up in the application. But when all “IT” hours look the same on the bottom line, there’s really not much cloud can do to change that. By normalizing the cost of IT, prioritization of function becomes focused on, well, function rather than cost. And that’s probably a good thing, because we wouldn’t want security tasks ignored just because they cost a bit more on the organizational balance sheet.
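To show that incentive in numbers, here’s a hedged sketch that prices the same hypothetical staffing plan two ways: one blended internal rate versus function-specific rates. The hours, the blended rate, and the testing rate are all assumptions; only the $70 security and $50 WAS-configuration rates come from the example above.

```python
# Price one hypothetical staffing plan two ways: a single blended internal
# rate versus function-specific rates. Hours and the blended/testing rates
# are invented; the $70 and $50 figures echo the example in the text.
plan_hours = {"WAS configuration": 30_000, "testing": 10_000, "security": 5_000}

blended_rate = 50  # assumed uniform dollars per "IT" hour
by_function  = {"WAS configuration": 50, "testing": 50, "security": 70}

uniform_cost        = sum(hours * blended_rate for hours in plan_hours.values())
differentiated_cost = sum(hours * by_function[func] for func, hours in plan_hours.items())

print(f"Uniform rate:        ${uniform_cost:,}")
print(f"Function-based rate: ${differentiated_cost:,}")
# The same 45,000-hour plan now costs $100,000 more, and the "expensive"
# line items (security, testing) become the obvious place to cut hours,
# exactly the ugly place described above.
```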

So core IT costs aren’t likely to be changed by public cloud computing and, here’s the kicker, public cloud computing might cost the organization more.


Published Apr 19, 2010