Framework to calculate TCO for Cloud Computing and Virtualization
A lot has been said about how elastic, on-demand, or cloud computing can be a great cost-containment vehicle for companies. The key element is the pay-per-use costing model common to each of these concepts. If an enterprise can predict its workload and its usage patterns, then these computing models make a lot of sense. The question, notwithstanding issues such as security and the need to control the deployment environment, is whether these elastic computing models truly deliver on the promise of cost savings and/or cost avoidance.
Is there a proven way of calculating the TCO for traditional investment in IT assets that could now be used to compare and contrast the advantages, or to identify the hidden costs, of these pay-per-use computing models? I am also wondering whether any frameworks exist for doing a cost-benefit analysis of concepts such as virtualization and grid computing. What elements need to be plugged into such a TCO model to arrive at an accurate figure? Do you include costs associated with "loss of control", or lost-business-opportunity costs incurred when the cloud provider cannot scale up an environment during your peak? Is there a truly rigorous, proven mechanism, methodology, or process for computing TCO for these newer computing models?
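To make the question concrete, here is the kind of back-of-the-envelope comparison I have in mind. This is a minimal sketch, not a validated methodology: every figure, function name, and cost category below (hardware CapEx, migration cost, a per-year "peak shortfall" opportunity cost, etc.) is a hypothetical assumption I made up for illustration.

```python
# Illustrative TCO sketch only -- all cost categories and figures are
# hypothetical assumptions, not a proven framework.

def on_prem_tco(hardware_capex, years, annual_opex, admin_cost_per_year):
    """Simple total cost of owning IT assets over the period (no discounting)."""
    return hardware_capex + years * (annual_opex + admin_cost_per_year)

def cloud_tco(hourly_rate, avg_hours_per_year, years,
              migration_cost, peak_shortfall_cost_per_year=0.0):
    """Pay-per-use total, plus a one-time migration cost and an estimated
    lost-business-opportunity cost for peaks the provider cannot absorb."""
    usage = hourly_rate * avg_hours_per_year * years
    return usage + migration_cost + years * peak_shortfall_cost_per_year

if __name__ == "__main__":
    on_prem = on_prem_tco(hardware_capex=200_000, years=3,
                          annual_opex=30_000, admin_cost_per_year=60_000)
    cloud = cloud_tco(hourly_rate=2.50, avg_hours_per_year=6_000, years=3,
                      migration_cost=25_000, peak_shortfall_cost_per_year=10_000)
    print(f"3-year on-prem TCO: ${on_prem:,.0f}")  # $470,000
    print(f"3-year cloud TCO:   ${cloud:,.0f}")    # $100,000
```

Even this toy version ignores the time value of money (a serious model would discount each year's cash flows), and it forces soft costs like "loss of control" into a single number, which is exactly the part I don't know how to do rigorously. Hence the question: is there an established framework that tells you which terms belong in these functions and how to estimate them?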
Thanks in advance for your help.