Framework to calculate TCO for Cloud Computing and Virtualization
To my fellow architects - A lot has been said about how elastic computing, on-demand computing, and cloud computing can be great cost-containment vehicles for companies. The key element is the pay-per-use costing model behind each of these concepts. If an enterprise can predict its workload and its usage patterns, then these computing models make a lot of sense. Setting aside issues such as security and the need to control the deployment environment, the question is whether these elastic computing models truly deliver on the promise of cost savings and/or cost avoidance. Is there a proven way of calculating the TCO for traditional investment in IT assets that could now be used to compare and contrast the advantages, or to identify the hidden costs, of these pay-per-use computing models? I am also wondering whether any frameworks exist for doing a cost-benefit analysis of concepts such as Virtualization and Grid Computing.
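To make the question concrete, here is a minimal sketch (in Python, with entirely hypothetical cost categories and figures) of the kind of side-by-side comparison I have in mind: amortized capital plus recurring operating costs for owned infrastructure versus usage-driven charges plus the less visible costs (integration, governance, data egress) for a pay-per-use model. This is an illustration of the arithmetic, not a validated framework.

```python
# Illustrative-only TCO comparison: owned/virtualized environment vs. pay-per-use.
# All cost categories and numbers are hypothetical assumptions for discussion.

def on_premises_tco(years, hardware_capex, useful_life_years,
                    annual_power_cooling, annual_admin_labor,
                    annual_software_licenses, annual_facilities):
    """Amortize capital over its useful life and add recurring operating costs."""
    annual_depreciation = hardware_capex / useful_life_years
    annual_opex = (annual_power_cooling + annual_admin_labor +
                   annual_software_licenses + annual_facilities)
    return years * (annual_depreciation + annual_opex)

def pay_per_use_tco(years, monthly_instance_hours, hourly_rate,
                    monthly_storage_gb, storage_rate_per_gb,
                    monthly_egress_gb, egress_rate_per_gb,
                    annual_integration_and_governance):
    """Usage-driven charges plus hidden costs such as integration and governance."""
    monthly_usage = (monthly_instance_hours * hourly_rate +
                     monthly_storage_gb * storage_rate_per_gb +
                     monthly_egress_gb * egress_rate_per_gb)
    return years * (12 * monthly_usage + annual_integration_and_governance)

if __name__ == "__main__":
    horizon = 3  # planning horizon in years (assumption)
    owned = on_premises_tco(horizon, hardware_capex=250_000, useful_life_years=5,
                            annual_power_cooling=20_000, annual_admin_labor=80_000,
                            annual_software_licenses=30_000, annual_facilities=15_000)
    cloud = pay_per_use_tco(horizon, monthly_instance_hours=5_000, hourly_rate=0.50,
                            monthly_storage_gb=2_000, storage_rate_per_gb=0.10,
                            monthly_egress_gb=500, egress_rate_per_gb=0.09,
                            annual_integration_and_governance=25_000)
    print(f"{horizon}-year on-premises TCO (illustrative): ${owned:,.0f}")
    print(f"{horizon}-year pay-per-use TCO  (illustrative): ${cloud:,.0f}")
```

Of course, the pay-per-use side of a comparison like this is only as good as the workload and usage forecasts feeding it, which is exactly why I am looking for an established framework rather than ad hoc arithmetic.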