For the most part, virtualization has changed the deployment of applications but not their development. Thus far this remains true, primarily because those with an interest in organizations moving to public cloud computing have reason to make the move “easy” and painless, which means no changes to applications.

But eventually changes will be required, if not by cloud providers then by the organizations that pay the bills.

One of the most often cited truisms of development is actually more of a lament on the part of systems administrators. The basic premise is that while Moore’s Law holds true, it really doesn’t matter, because developers’ software will simply use all available CPU cycles and every bit and byte of memory. Basically, the belief is that developers don’t care about writing efficient code because they don’t have to – they have all the memory and CPU in the world to execute their applications. Virtualization hasn’t changed that at all, as instances are simply sized for what the application needs (which is a lot, generally). It doesn’t work the other way around. Yet.

But it will, eventually, as customers demand, and receive, a true pay-per-use cloud computing model.

The premise of pay-for-what-you-use is a sound one, and it is indeed a compelling reason to move to public cloud computing. Remember that according to IDC analysts at Directions 2010, the primary driver for adopting cloud computing is all about “pay per use,” with “monthly payments” also in the top four reasons to adopt cloud. Luckily for developers, cloud computing providers for the most part do not bill “per use”; they bill “per virtual machine instance.”
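The difference between the two models is easy to see with a little arithmetic. As a rough sketch (the rates, usage figures, and function names below are invented for illustration, not any provider’s actual pricing):

```python
# Hypothetical comparison of the two billing models: per-instance
# (pay for every hour the VM is provisioned) versus true pay-per-use
# (pay only for resources actually consumed).

def per_instance_cost(hours_provisioned: float, hourly_rate: float) -> float:
    """Per-VM-instance billing: the meter runs whether or not
    the application is doing any work."""
    return hours_provisioned * hourly_rate

def per_use_cost(cpu_hours_consumed: float, rate_per_cpu_hour: float) -> float:
    """True pay-per-use billing: only consumed resources are charged."""
    return cpu_hours_consumed * rate_per_cpu_hour

# An instance left running for a 720-hour month at a hypothetical $0.10/hour:
instance_bill = per_instance_cost(720, 0.10)

# The same application, actually busy only 10% of the time (72 CPU-hours):
usage_bill = per_use_cost(72, 0.10)

print(f"per-instance: ${instance_bill:.2f}, per-use: ${usage_bill:.2f}")
```

Under per-instance billing, inefficient code costs the same as efficient code; under true pay-per-use, every wasted CPU cycle shows up on the bill, which is exactly why such a model would finally give developers a reason to care about efficiency.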