From the dWb website

Feature Article

Virtualisation is the solution - what is the problem again?

You have too many servers and you need to consolidate - help is at hand: all you need is this magic wand and you can virtualise them - job done!

You have too much storage and it's growing at an uncontrollable pace - take a couple of spoonfuls of this virtualisation elixir and all your problems will evaporate.

You have gone global and need support 7 x 24 - no problem. Give the existing team this virtualisation tablet, wait for a couple of hours and your virtual team will be up and running.

What is this virtualisation stuff, you may ask? Can it be used to solve all my resource problems? Should it come with a health warning?

This sounds like the silver bullet that we have been waiting for - I am virtually convinced!

When is it right to use virtualisation solutions?

When is it appropriate to have a non-virtual solution?

Is a Virtual World the right answer?

Many years ago, before PCs had been invented, computing was only done on mainframes. These were very complex and expensive. To reduce the costs, smaller computers were developed, some of which were called minicomputers - they were still complex. One of the characteristics of these computing platforms was their ability to run several programs at the same time. The difficulties arose when introducing change, both hardware and software.

To address the complexity and cost, the minicomputers evolved into low-cost servers running variants of Unix. These servers were linked together through local area networks and a distributed computing environment was created. The performance was good and the investment costs were low - hidden, however, were the consequences of the multiplier: every cost and every management task is repeated for each server added.

The distributed environment was good for the companies, good for the IT industry and for many years stimulated innovation. The platforms were reliable, the number of servers enabled higher availability without the need to duplicate the entire environment. The multiplier was still under control.

After the advent of PCs the need for servers grew. They were used to provide file and print services, directory services, databases, messaging and so on. As their number grew, the management requirement also grew and the complexity grew - any change needed to be implemented on all of the servers. The management overhead increased to include protection from malware attacks, patch management, security policy compliance, operating system consistency and more. Financially, user-based application acquisition, licence fees, support costs and the like exposed the multiplier ogre.

This needed to be stopped.

Analysis showed that many of these servers were underutilised for the job they were doing; analysis showed that the licences were not under control; analysis showed that managing the environment was costing a fortune.
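As a rough illustration of the first of those findings, the sketch below flags underutilised servers from sampled CPU figures; the server names, the sample data and the 20% threshold are assumptions made for the example, not figures from any real estate.

    from statistics import mean

    # Server name -> sampled CPU utilisation figures (percent).
    # The data and the 20% cut-off are illustrative assumptions.
    cpu_samples = {
        "file-01":  [4, 6, 5, 7, 3],
        "print-01": [2, 3, 2, 4, 2],
        "db-01":    [55, 62, 48, 70, 66],
    }

    def underutilised(samples, threshold=20.0):
        """Return the servers whose mean CPU utilisation is below threshold."""
        return [name for name, values in samples.items()
                if mean(values) < threshold]

    print(underutilised(cpu_samples))  # ['file-01', 'print-01']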

What can be done?

With a whoosh, a bang and the smell of burning rubber - consolidation was the new buzzword. Reduce the number of servers - at once. It looks obvious, it sounds easy and it seems to make sense. Unfortunately, the server hardware was not up to the job (I/O and network connections); the software was not really built to handle the requirements; the systems management tools had not been designed for this job; and the applications were positively hostile to the idea.

It became clear that the level of complexity had actually increased, not decreased. Yes, you can combine a number of servers onto a single machine, but apart from the acquisition and finance costs, each environment still has to be managed separately. The maintenance effort grew.

Virtualisation hides complexity behind a layer of abstraction. Done properly, this can be successful in reducing the management effort. Like many things, however, it is not the answer to every problem.

The number of servers can be reduced and replaced by more sophisticated hardware with higher resilience, capacity (I/O and network connections) and performance. The storage can be virtualised, tiered and commoditised, but the root of the problem is the amount of data. Active de-duplication and archiving will help, but removal of obsolete data is the real cure.
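To make the de-duplication point concrete, here is a minimal sketch that groups files by content hash; the directory path is hypothetical, and real storage de-duplication typically works at block level rather than on whole files.

    import hashlib
    from collections import defaultdict
    from pathlib import Path

    def find_duplicates(root):
        """Group the files under root by the SHA-256 of their contents."""
        groups = defaultdict(list)
        for path in Path(root).rglob("*"):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                groups[digest].append(path)
        # Keep only hashes seen more than once, i.e. actual duplicates.
        return {h: ps for h, ps in groups.items() if len(ps) > 1}

    # "/srv/data" is a hypothetical path used for illustration.
    for digest, paths in find_duplicates("/srv/data").items():
        print(digest[:12], [str(p) for p in paths])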

Virtual teams are a good thing. Virtual teams need tools that support virtual working - they need remote presence capabilities. They need a stronger team ethos than a physical team. The creation of virtual teams needs careful behavioural coaching, common objectives and compatible work cultures. Waving an organisational wand does not do the job.

The easy answer to what should or should not be virtualised is to select those areas that are compatible with the virtualisation model: clustered and load-balanced solutions; applications with few dependencies on physical hardware or specific software capabilities; applications that are easy to maintain, e.g. do not need to be rebooted; applications that are easy to re-platform (those that already have a high level of abstraction); and non-business-critical environments.
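As a rough way of applying those criteria, the checklist below scores a candidate workload; the attribute names and the example workload are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Workload:
        clustered_or_load_balanced: bool
        depends_on_physical_hardware: bool
        needs_reboots_to_maintain: bool
        easy_to_replatform: bool
        business_critical: bool

    def virtualisation_score(w):
        """Count how many of the criteria above the workload meets."""
        return sum([
            w.clustered_or_load_balanced,
            not w.depends_on_physical_hardware,
            not w.needs_reboots_to_maintain,
            w.easy_to_replatform,
            not w.business_critical,
        ])

    web_tier = Workload(True, False, False, True, False)
    print(virtualisation_score(web_tier), "of 5 criteria met")  # 5 of 5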

Virtual Worlds are the best place for those things that can be virtualised (and where cost savings can be made) - they are dangerous for those things that are difficult to virtualise (where the costs are correspondingly high, i.e. there are no savings).

This document maintained by dwb@dwb.co.uk. Material Copyright © 1999-2012 dWb
