DCIM: Two worlds collide

Dreaming of one dashboard for facilities and IT.

As data centres become the engine rooms of the modern enterprise, efforts are underway to blend the plant and IT infrastructure assets into a single, manageable framework.

In the past, said David Yip, data centre executive with IBM, there were “two divergent schools of thought” when it came to managing data centre assets – those who had responsibility for facilities (the data centre’s physical plant) and those responsible for IT (the systems).

Developing a single management view of both of these assets is “where all the action is now,” Yip said.

Today, everything from power rails to racks, server hardware and comms equipment is IP-enabled.

That has made the status and performance of these physical assets available to monitoring and management tools that fall under the broad banner of “data centre infrastructure management”, or DCIM.
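The single-pane-of-glass idea behind DCIM can be sketched in a few lines of code. This is purely illustrative – the device names and readings below are hypothetical, and a real deployment would poll them over SNMP, IPMI or a vendor API rather than hard-coding them:

```python
from dataclasses import dataclass

# Hypothetical readings from IP-enabled assets. In practice these
# would be polled over SNMP, IPMI or a vendor-specific API.
@dataclass
class AssetReading:
    name: str        # e.g. a rack PDU, a server, a switch
    kind: str        # "facilities" or "it"
    power_w: float   # instantaneous power draw in watts
    temp_c: float    # reported temperature in Celsius

def dashboard(readings):
    """Aggregate facilities and IT readings into one summary view."""
    summary = {}
    for r in readings:
        s = summary.setdefault(r.kind, {"power_w": 0.0, "max_temp_c": 0.0})
        s["power_w"] += r.power_w
        s["max_temp_c"] = max(s["max_temp_c"], r.temp_c)
    return summary

readings = [
    AssetReading("pdu-a1", "facilities", 3200.0, 24.0),
    AssetReading("crac-1", "facilities", 5500.0, 18.5),
    AssetReading("web-01", "it", 410.0, 27.0),
    AssetReading("sw-core", "it", 150.0, 31.0),
]
print(dashboard(readings))
```

The point of the sketch is the merge: plant readings and IT readings land in the same summary structure, which is the “single management view” Yip describes.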

DCIM dashboard

Rodney Gedda, an analyst with Telsyte, said most of his clients desire this holistic view of IT and plant.

“A recent trend is for the power hardware companies to develop software to hook into the IT – into things like server and networking equipment,” he said.  

“Another trend is the deployment of integrated cloud computing ‘stacks’ or data centres-in-a-box that can feature [autonomous] monitoring of components, reporting heat and power consumption.”

Mike Andrea, director of the Strategic Directions Group, notes that some co-location providers have seen opportunity to profit from this convergence.

Strategic Directions is among those service providers taking on licences for DCIM software and bundling this monitoring capability in with the rental of racks in their facilities.

"The cost of the software licences can otherwise be quite prohibitive for small and medium-sized organisations," he said.

Shifting the power dynamics

One of the key costs of operating IT equipment in the data centre is power — both for running the IT hardware and for running the cooling infrastructure required to keep that hardware at optimum performance.

Paul Tyrer, vice president of Schneider Electric IT, said the pressure to reduce power consumption is coming from the upper levels of management.

Savvy business managers are demanding better oversight of the energy and power management within the data centre and are “demanding visibility from the data centre management team,” he added.

Power is such a significant cost that it has driven two major trends in data centre design: the move towards cooler external environments with abundant hydropower, and the shift towards hardware capable of tolerating higher temperatures within the data centre facility.

The first option, open to the Googles, Apples and Amazons of this world, is to build massive facilities in places like Oregon, USA, that boast cheap power and low ambient temperatures.

Most of Australia, by contrast, is a hostile environment for IT hardware, necessitating cooling.

One fallback option for Australian IT managers is to seek out new cooling and power management features in the latest generation of server hardware.

Servers from all the major brands have become gradually more capable of dealing with higher data centre temperatures.

ASHRAE, which publishes regularly updated guidelines on acceptable temperature envelopes, has steadily widened its recommended operating range for data centres in recent revisions.

Yip said the average temperature of a new data centre has moved from a refrigerated 18 degrees Celsius to a positively balmy 25-27 degrees.

IT managers need to “manage the power and thermal profiles of the hardware, all the way down to the chip level,” said Andrew Cameron, data centre product lead for HP South Pacific.

Managing thermals means understanding aspects such as virtual machine performance on a server, the operation of the server itself, plus the attributes of the rack and row the server is housed in.

For many organisations it may be necessary to run a ‘thermal analysis’ of their infrastructure, moving hotter-running hardware to cooler parts of the centre.
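A thermal analysis of this kind can be as simple as ranking racks by inlet temperature against the guideline envelope. The sketch below is illustrative only — the rack names and readings are invented, and the 27-degree figure reflects the upper end of a commonly cited ASHRAE recommended range:

```python
# A toy 'thermal analysis': given per-rack inlet temperatures,
# flag racks running above a threshold as candidates for
# redistributing their hotter-running hardware.
RECOMMENDED_MAX_C = 27.0  # upper end of a commonly cited ASHRAE envelope

rack_inlet_temps = {
    "row1-rack1": 24.5,
    "row1-rack2": 29.0,   # hot spot
    "row2-rack1": 22.0,
    "row2-rack2": 28.2,   # hot spot
}

def hot_racks(temps, limit=RECOMMENDED_MAX_C):
    """Return racks exceeding the limit, hottest first."""
    over = {rack: t for rack, t in temps.items() if t > limit}
    return sorted(over, key=over.get, reverse=True)

print(hot_racks(rack_inlet_temps))
```

A real analysis would fold in the per-VM, per-server and per-row attributes Cameron describes, but the decision it feeds is the same: move the hottest-running hardware to the coolest parts of the room.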

Read on to find out how this power is being put into the hands of facilities managers...

Copyright © iTnews.com.au . All rights reserved.