
Consolidation is the most common driver for adopting server virtualisation, says Cris Colfescu, consultant at The Project Network, a firm specialising in delivering virtualisation and project management advice.
“Some people think they will cut costs in terms of energy consumption and rack space leasing, some are thinking about reduced hardware maintenance costs and others are interested in reducing their carbon footprint,” says Colfescu.
Earlier this year UK charity Comic Relief implemented a virtual server and storage environment designed to do just that. It cut the number of physical servers it was running by 20 per cent, helping the charity to meet its environmental commitments by reducing the power consumption of both physical machines and the air conditioning systems needed to cool them.
“We used to have 20 servers, we now have 16 running virtual machines,” says John Thompson, Comic Relief’s head of IT.
Simon Evans, director of information services at Southgate College in north London, is in the process of consolidating 50 physical servers into 35 using VMware ESX, as well as replacing 600 legacy desktop PCs with VMware’s virtual desktop infrastructure (VDI) software and thin clients. The college spent £250,000 on VMware products, servers, storage area network (SAN) equipment and professional services. Evans says this represents a saving of £450,000 on the £700,000 it would have cost to buy like-for-like replacements for the servers and desktops the college already owned. He also expects to shave £28,000 a year off the college’s electricity bill and £1,200 off annual computer maintenance costs.
“The key is to take a phased approach, especially when migrating servers,” says Evans.
Like Southgate College, Comic Relief took the opportunity to virtualise its storage capacity at the same time as introducing VMware. It installed a Compellent SAN appliance loaded with thin provisioning software to make the most of unused data capacity across its network. This allowed it to increase utilisation and create virtual storage volumes that can be expanded quickly to accommodate rapidly growing datasets.
“Our data grows exponentially with every one of our campaigns and we need to add flexible capacity as our needs grow. Our total capacity is now 15TB, with about 4TB currently available,” says Thompson.
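Thin provisioning achieves this by presenting hosts with a large logical volume while claiming physical blocks only when data is actually written. The Python sketch below is purely illustrative of that allocate-on-write idea; it is not Compellent’s implementation, and all the sizes are invented.

```python
# Illustrative sketch of thin provisioning: the volume advertises a large
# logical size but consumes physical blocks only as data is written.
class ThinVolume:
    def __init__(self, logical_gb):
        self.logical_gb = logical_gb   # capacity reported to the host
        self.blocks = {}               # block index -> data, grown on demand

    def write(self, block, data):
        self.blocks[block] = data      # physical space is claimed only here

    def physical_gb(self, block_size_gb=1):
        return len(self.blocks) * block_size_gb

vol = ThinVolume(logical_gb=1000)      # the host sees a 1TB volume
vol.write(0, b"campaign data")
print(vol.logical_gb, vol.physical_gb())   # 1000GB logical, 1GB physical
```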
Online gambling firm Betfair is six months into a sweeping three-year virtualisation programme that will eventually see between 8,000 and 10,000 virtual servers running in its racks at a Guernsey datacentre. The main driver behind the project is the need to conserve space.
“If we continue to grow at our current rate, we will need a datacentre the size of a football pitch to accommodate all the servers,” says information systems manager Tony Rigby. “Which from a financial perspective is not the thing to do.”

The programme will eventually see all Betfair’s mission-critical software development and quality assurance environments running on virtual servers.
Christopher Venning is head of IT at the Royal College of Physicians (RCP). He recently oversaw the implementation of 45 virtual servers hosted across seven HP blade servers and is in the process of virtualising 160 desktops using VMware’s VDI software.
The implementation focused on improving business continuity, reducing management overheads and achieving PCI security compliance. The college also needed to consolidate because, being based in a residential area, it operates under a cap on its power use.
“We are based in Regent’s Park, and there is a limited amount of power we can draw. So to run 160 VDI desktop images, we had to use administration servers for additional power,” says Venning.
To understand the best route to successfully virtualising servers, IT chiefs must understand the requirements of the applications that will run on the virtual machines (VMs).
For organisations such as Comic Relief, planning was essential. It migrated most of its critical applications to virtual servers, including its contact management, campaigns fulfilment, grants administration and finance systems based on Microsoft’s Dynamics GP software, and an SQL-based data warehouse application.
Stefan Van Overtveldt is vice president of emerging technology and innovation at BT, which is moving toward a cloud computing environment that relies heavily on server virtualisation in the company’s datacentres.
Van Overtveldt agrees that effective planning and close examination of application requirements are key. BT has forged its own approach to this, dubbed the continuous migration process.
“For some applications there is no change at all, but for others you have to take a closer look at the application to figure out how it will work in a virtualised environment,” he says. “You have to look at the databases connected to them, the infrastructure they use, how they interact with the network and storage resources, then create a structured approach to migrating them.”
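One way to give that examination structure is a simple application inventory that records each system’s dependencies and flags the ones needing closer analysis before migration. The Python sketch below is a hypothetical illustration of the idea, not BT’s continuous migration process; the application names and flags are invented.

```python
# Hypothetical migration inventory: record each application's dependencies
# and flag the ones that need closer analysis before virtualisation.
from dataclasses import dataclass, field

@dataclass
class App:
    name: str
    databases: list = field(default_factory=list)
    local_storage: bool = False
    io_heavy: bool = False

    def migration_flags(self):
        flags = []
        if self.databases:
            flags.append("review connected databases: " + ", ".join(self.databases))
        if self.local_storage:
            flags.append("move locally attached storage to the SAN")
        if self.io_heavy:
            flags.append("measure network and storage I/O first")
        return flags or ["no change expected"]

apps = [
    App("finance", databases=["SQL data warehouse"]),
    App("file-and-print", local_storage=True, io_heavy=True),
]
for app in apps:
    print(app.name, "->", "; ".join(app.migration_flags()))
```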
RCP’s Venning believes it is important to understand what services are running where and who uses them. “Some of the applications that you think will be tricky are actually a doddle. It is the general administration services that cause problems,” he says. “File and print services are hard to virtualise without disruptions to users and locally attached storage is hard to accommodate.”
One aspect of virtualisation that often catches people out is the strain it can put on the network. Putting 10 VMs on a single server greatly increases the data traffic to and from that server, and the back-end infrastructure has to handle the increased I/O, caching and disk sharing, otherwise performance suffers, says Colfescu.
“Everyone has invisible problems in the network, but when you virtualise, those problems become visible because you are concentrating everything on one piece of hardware and bottlenecks occur,” he says.

Buying as much server memory and processor power as you can afford is a good way to avoid performance issues in the first place, says the RCP’s Venning. “The network traffic between the server and the client is often tiny – it is the traffic between the servers that is high,” he says.
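The consolidation arithmetic is easy to sanity-check before buying hardware: multiply the per-VM traffic by the number of VMs on the host and compare the result against the host’s network links. Every figure in the Python sketch below is an invented assumption, not a measurement from any of the organisations above.

```python
# Rough check of how consolidation concentrates traffic on one host's links.
# All figures here are invented assumptions for illustration.
vms_per_host = 10
avg_mbps_per_vm = 120        # assumed steady-state traffic per VM
peak_factor = 2.5            # assumed burst multiplier

peak_mbps = vms_per_host * avg_mbps_per_vm * peak_factor
nic_mbps = 1000              # one gigabit link
nics_needed = -(-peak_mbps // nic_mbps)   # ceiling division

print(f"peak demand: {peak_mbps:.0f}Mbps -> at least {nics_needed:.0f} x 1GbE links")
```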
Colfescu says firms do not necessarily have to install new servers to handle virtualisation, depending on what they have already and how it fits alongside the virtualisation software licensing requirements.
“Everything can be re-used – the network, servers, network interface cards – but a lot comes down to the product you select because some are licensed by socket, some by core, and some per server,” he says.
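Those licensing models can produce very different bills for the same box, so it is worth working through the numbers before deciding whether to re-use existing servers. The host specification and prices in the sketch below are invented for illustration only.

```python
# Compare licence counts for one host under the three common models.
# The host specification and prices are invented for illustration.
sockets, cores_per_socket = 2, 4

licences = {
    "per socket": sockets,
    "per core": sockets * cores_per_socket,
    "per server": 1,
}
prices = {"per socket": 1500, "per core": 500, "per server": 2500}

for model, units in licences.items():
    print(f"{model}: {units} licence(s), total {units * prices[model]}")
```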
Colfescu says companies virtualising their infrastructure for the first time should make optimum use of capacity planning software, whether supplied by the virtualisation vendors themselves or by third parties.
“The figures come out slightly different with each one and might tell you that you need ‘x’ amount of servers for a particular task, but it is still worth using them,” he says.
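At heart, such a tool is solving a packing problem: given each workload’s CPU and memory demands and a host’s capacity, how many hosts are needed? The toy first-fit estimate below illustrates the idea with invented workload figures; real planning tools measure demand from live servers and model far more than two resources.

```python
# Toy capacity planner: first-fit packing of VM demands onto hosts.
# Workload figures are invented; real tools measure them from live servers.
HOST_CPU_GHZ, HOST_RAM_GB = 16.0, 64.0

vms = [(1.2, 4), (2.0, 8), (0.5, 2), (3.0, 16), (1.0, 4)] * 6  # 30 x (GHz, GB)

hosts = []  # each entry tracks a host's remaining [cpu, ram]
for cpu, ram in vms:
    for h in hosts:
        if h[0] >= cpu and h[1] >= ram:   # first host with room wins
            h[0] -= cpu
            h[1] -= ram
            break
    else:
        hosts.append([HOST_CPU_GHZ - cpu, HOST_RAM_GB - ram])

print(f"{len(vms)} VMs fit on {len(hosts)} hosts")
```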
Server virtualisation options
VMware
VMware has quickly grown to dominate the server virtualisation market and now claims more than 120,000 paying enterprise customers around the world. It offers a range of server and desktop virtualisation products, along with management tools.
Microsoft
Microsoft only came to market with a viable server virtualisation product, Hyper-V, as part of its Windows Server 2008 release earlier this year. The management side of Hyper-V is handled by the latest version of Microsoft’s System Center systems management software.
Xen
Xen is an open-source virtualisation hypervisor first released under the GNU General Public Licence in 2003. It has been integrated into various open-source products, including Novell’s Suse Linux Enterprise Server distribution and Red Hat’s Enterprise Linux 5. XenSource, the firm that built management products around the Xen hypervisor, was acquired by thin-client veteran Citrix, and Citrix’s XenServer 5 product is now based on what was previously XenSource’s XenEnterprise product.