Tuesday, November 10, 2009

Framework to calculate TCO for Cloud Computing and Virtualization

To my fellow architects -

A lot has been said about how elastic computing, on-demand computing, and cloud computing can be great cost containment vehicles for companies. The key element is the pay-per-use costing model of each of these concepts. If an enterprise is able to predict its workload and its usage models, then these computing concepts make a lot of sense. The question, notwithstanding issues such as security and the need to control the deployment environment, is whether these elastic computing models truly deliver on the promise of cost savings and/or cost avoidance.


Has there been a proven way of calculating the TCO for traditional investment in IT assets that could now be used to compare and contrast the advantages, or to identify the hidden costs, of these pay-per-use computing models? I am wondering whether any frameworks exist for doing a cost-benefit analysis even for concepts such as Virtualization and Grid Computing. What are some of the elements that need to be plugged into this TCO model to truly arrive at an accurate figure? Do you include costs associated with "loss of control" and/or "lost business opportunity" due to the inability of the cloud provider to scale up an environment during your peak? Is there a truly scientific and proven mechanism, methodology, or process that could be used to compute TCO for these newer computing models?
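To make the question concrete, here is a minimal sketch of the kind of comparison such a framework would need to formalize. Every figure, cost category, and rate below is a hypothetical assumption for illustration, not vendor pricing, and categories like "loss of control" are deliberately left out because pricing them is exactly the open question:

```python
# Hypothetical TCO comparison: owned servers vs. pay-per-use cloud.
# All figures are illustrative assumptions, not real vendor pricing.

def on_premise_tco(servers, years, hw_cost=5000.0, power_cooling_per_yr=800.0,
                   admin_per_yr=1200.0, facility_per_yr=400.0):
    """Capital outlay plus recurring operational costs per server."""
    capex = servers * hw_cost
    opex = servers * years * (power_cooling_per_yr + admin_per_yr + facility_per_yr)
    return capex + opex

def cloud_tco(avg_instances, peak_instances, peak_hours_per_yr, years,
              rate_per_hr=0.40, egress_per_yr=500.0):
    """Pay-per-use: steady baseline usage plus peak bursts, plus data-transfer fees."""
    base_hours = avg_instances * 8760          # hours in a year
    burst_hours = (peak_instances - avg_instances) * peak_hours_per_yr
    return years * ((base_hours + burst_hours) * rate_per_hr + egress_per_yr)

# Sized so the cloud only pays for 20 instances during 500 peak hours a year,
# while on-premise must own all 20 servers around the clock.
owned = on_premise_tco(servers=20, years=3)
cloud = cloud_tco(avg_instances=8, peak_instances=20, peak_hours_per_yr=500, years=3)
print(f"3-year on-premise TCO: ${owned:,.0f}")
print(f"3-year cloud TCO:      ${cloud:,.0f}")
```

The sketch only shows why the workload-predictability point above matters: the closer average utilization sits to peak, the more the pay-per-use advantage erodes.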

Thanks in advance for your help.
surekha -

Thursday, September 3, 2009

Has Anyone Realized ROI with Cloud Computing?

My question to the viewers of this blog is: how many of you have used cloud computing concepts (and I do not mean private clouds here)? Are you looking at cloud computing to reduce infrastructure costs by moving to a pay-as-you-use model, to augment your peak capacity, or to consume cloud software as a service?

What is your experience, and are you realizing the ROI, whether as a lowering of your TCO or as an increase in operational efficiency and availability?
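For readers framing an answer, the classic ROI arithmetic is simple; the hard part is defending the inputs. The dollar figures below are made-up placeholders:

```python
def roi_percent(net_benefit, cost):
    """Classic ROI: (gain - cost) / cost, expressed as a percentage."""
    return (net_benefit - cost) / cost * 100.0

# Illustrative only: $180k of avoided infrastructure spend over three years
# against $120k of cloud fees plus migration effort.
print(f"ROI: {roi_percent(180000.0, 120000.0):.0f}%")  # 50%
```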

Thanks for your input.
surekha -

Sunday, August 9, 2009

Issues with Server Virtualization

Virtualization has been considered a great way to maximize utilization of server resources by increasing server density. In addition, this concept is key to green IT initiatives, as it reduces the number of compute resources that need to be supported in a data center, thus reducing cooling and power costs.

Having said that, there seems to be a performance downside to virtualization. Given that a virtual machine (VM) is yet another layer of indirection above the server hardware, this layer negatively impacts performance any time the application needs to access I/O channels, SAN disks, and other network resources. This is especially so when the application code is running inside a JVM or an application server container such as a JEE container, the .NET CLR, or even a web server servlet container. In all cases, the performance profile of an application seems to vary depending on whether it is running inside a virtual layer or directly on the physical server.
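One way to see the I/O penalty for yourself is a micro-benchmark run twice: once on the physical host and once inside a guest VM on the same hardware. The sketch below measures sequential write throughput; the file sizes and block size are arbitrary choices for illustration, and a real comparison would also exercise network and SAN paths:

```python
# Sequential-write micro-benchmark: run on bare metal and inside a VM,
# then compare the reported throughput. Sizes are arbitrary examples.
import os
import tempfile
import time

def write_throughput_mb_s(total_mb=64, block_kb=256):
    """Write total_mb of data in block_kb chunks, fsync, return MB/s."""
    block = os.urandom(block_kb * 1024)
    blocks = total_mb * 1024 // block_kb
    fd, path = tempfile.mkstemp()
    try:
        start = time.time()
        with os.fdopen(fd, "wb") as f:
            for _ in range(blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())     # force data to disk, not just page cache
        elapsed = time.time() - start
    finally:
        os.remove(path)
    return total_mb / elapsed

print(f"Sequential write: {write_throughput_mb_s():.1f} MB/s")
```

The gap between the two runs, rather than either absolute number, is the quantity of interest here.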

Vendors are now addressing this drawback by introducing the concept of Just Enough OS (JEOS). It remains to be seen how effective these optimizations prove to be in the realm of critical-path enterprise systems.

Thank you.

surekha -