DIY data centres

Putting together a computer room on a budget has never been easier.

Using bed sheets and desk fans to alter in-room airflows and Perspex sheets for containment is part of data centre industry folklore.

Often they are cited as examples of bad practice or as proof that a generation of centres designed and configured around mainframe-era thinking face refresh or redundancy as newer, more cost-efficient designs emerge.

But discounting the DIY elements of data centre construction and configuration also serves a handy purpose of driving sales of more expensive custom kit from vendors.

For example, a containment system of Perspex roofs and sliding doors from Rittal covering two rows of 10 racks costs around $7,500, says the company's IT business development manager Mark Roberts.

But Perspex sheets themselves range up to a couple of hundred dollars each at a hardware store or a plastics manufacturer. Add a PVC strip or swing door of the kind you might find in a cold room or butcher's shop, seal any holes or gaps properly, and the cost of containment could be a fraction of the commercial version.
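The numbers can be sanity-checked with some back-of-the-envelope arithmetic. This sketch uses the $7,500 Rittal quote cited above; the DIY part prices are illustrative assumptions based on the hardware-store figures mentioned, not vendor quotes.

```python
# Rough cost comparison: commercial vs DIY aisle containment
# for the same two rows of 10 racks cited in the Rittal quote.

COMMERCIAL_QUOTE = 7500  # article figure: two rows of 10 racks
RACKS = 20

# Assumed DIY bill of materials (illustrative prices, not quotes)
diy_parts = {
    "perspex_sheets": 10 * 200,     # ~$200 per sheet, assumed
    "pvc_strip_doors": 2 * 300,     # one per aisle end, assumed
    "sealant_and_fixings": 150,     # assumed
}

diy_total = sum(diy_parts.values())
print(f"Commercial: ${COMMERCIAL_QUOTE} (${COMMERCIAL_QUOTE / RACKS:.0f}/rack)")
print(f"DIY:        ${diy_total} (${diy_total / RACKS:.0f}/rack)")
print(f"DIY comes in at {diy_total / COMMERCIAL_QUOTE:.0%} of the commercial price")
```

Even with generous allowances for materials, the DIY route lands well under half the commercial figure, which is the trade-off the consultants quoted here are weighing against the risk of getting it wrong.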

Australia is a nation consumed by renovation and DIY. Data centre consultants readily acknowledge that it is part of the data centre and computer room culture for many small to medium businesses.

But there are consequences to getting it wrong - big ones - and some in the industry are just waiting to reap the consultancy fees when DIY jobs go pear-shaped.

"Customers often try to do something and don't quite get it right," says APC's data centre solutions advisor Adam Wilkinson."We get called in to help them do it properly."

"Taking a short-term focus on investment like [DIY Perspex containment] with the knowledge that is out there is risky. You've got to think big - [besides] $7500 aggregated across that many racks really isn't that bad," adds the Frame Group's data centre practice manager Greg Goode.

CRN's investigation of construction and configuration techniques drew a wide variety of responses, ranging in price from sub-$100 to well into the five figures.

As a result, we present our guide to putting together the most cost-effective SMB or SME data centre today.

The floor

Floor design is linked to a number of factors, including power, equipment weight and cooling.

But the traditional raised floor in data centres is in many ways a design leftover from the mainframe era.

Most small businesses are unlikely to have mainframes. As a result, it could be worth dropping racks straight down onto the concrete slab and containing them rather than building a raised floor, particularly if the average rack density is under five kilowatts.

"One of the minimal advantages of having a raised floor - though unadvised - is that you can run a small amount of cabling under it," says Shaun Vosper, the director of Brisbane-based consultancy Data Centre Technologies.

Goode concurs: "Raised floors were initially built to throw cabling under and pump air to the racks," he says. "In the mainframe era, air management was a load of rubbish. Most of the sub-floor plenum was totally occupied by cabling and a mish-mash of other pipes, so the cooling never worked that effectively.

"Slowly people worked out that it was better using the sub-floor plenum just for the distribution of air, and they moved cables into managed or overhead systems. Other people say that with containment you can do away with the raised floor altogether."

Goode says the choice between the raised floor and the bare slab is ideological. "The jury's 50/50 at the moment," he says.

Vosper is convinced the choice is more practical.

"If you're doing a really basic setup I'd suggest not to worry about a raised floor because there's a lot of air conditioning systems on the market that don't require floor-based distribution," Vosper says.

"For the SME market, it's a cost they just don't have to incur."

The raised floor can be costly not just in installation but in floor space, according to Wilkinson.

"The smaller the room, the greater percentage of space is lost by putting in a raised floor," Wilkinson says.

"By the time you address pedestrian access and occupational health and safety issues around the step [up to the floor level], you could lose five out of 25 square metres in the room."

Adds Gordon Makryllos, APC's vice president for Pacific, "The default should be that you don't need a raised floor."


Cooling

If the raised floor is canned, a different approach to cooling from the standard computer room air conditioning (CRAC) unit is required.

Proponents of the slab say that in-row cooling - a method of placing air conditioners between the racks themselves - combined with hot or cold aisle containment is the main alternative to under-floor air distribution.

"If you have a slab then you have to cool the equipment in-row," says Peter Spiteri, director of marketing at Emerson Network Power.

"Although there's some efficiency benefits in not having a raised floor, there could be issues with having plant and IT equipment side-by-side because you've just put in infrastructure that may or may not carry chilled water right beside racks of your server equipment.

"That plumbing requires service on a monthly basis so you could have maintenance cleaning trays right next to someone setting up blade servers."

Most industry players CRN spoke to were broadly dismissive of this risk. Such maintenance visits are generally supervised, they say.

Others, such as Vosper, believe in-row cooling could be unnecessary at the extreme low-end of the market.

"If your business has 2kW of IT load per rack do you need an in-row cooler? No, you don't," he says. "Most SMEs usually just take a basic air conditioner for home use and buy the more industrial model.

"If you've got three racks averaging three or four kilowatts each, a couple of top quality base air conditioners would pose few dramas.

"If you were going to push above five kilowatts per rack as part of your IT strategy only then would you really need to look at other air conditioning solutions."

Vosper, however, urges small businesses considering a base air conditioner to think about redundancy.

"If the unit fails what are you going to do?" he says. "What people have to look at is the cost of downtime to their business because it will really drive their choice."


Containment

Containment has become a de facto design methodology in the data centre. It's about controlling air flows - getting the most efficient use of available cold air while expelling hot air and ensuring the two flows don't mix.

There are a number of ways to achieve this.

Many larger data centres arrange racks into "hot" and "cold" aisles. In the cold aisle, cool air is blown into racks on both sides. The equipment exhausts into the alternating hot aisle, which carries the heated air away, at least in theory.

To augment this arrangement, either the hot or cold aisle is often "contained" - that is, a roof is added on top of the racks with sliding doors at each end.

"You need to start compartmentalising the data centre and lock up areas where air is bypassing the normal cooling cycle," Goode says.

DIY types use Perspex or other acrylic or polycarbonate sheets for the roof and PVC doors to save money. But other DIY forms of containment are also emerging, according to Vosper.

"I've seen a basic data centre built with the type of refrigerant cooling panels you find as walls in a cool room," Vosper says.

"The panels have significant heat and cooling properties so you don't lose a lot of the cooling you're providing.

"I've also seen centres built out of gyprock with house insulation in the walls. Some of these DIY jobs are in the premises of extremely large companies."

Regardless of construction, experts agree there are some cheap ways to optimise the effectiveness of containment architectures.

These include putting blanking plates over the front of empty spaces in the racks to prevent cool air from passing through, and ensuring the containment area itself is properly sealed.

"You need to make sure there's no leakage of air into service corridors or outside of the contained area," Spiteri says.

"For example, where a cable comes in or out of a room you've got to seal around the hole."

The future of containment isn't limited to walling in a number of racks; several vendors including APC are pushing containment within a single rack, opening the door to much smaller computer room deployments.

"We have a single rack configuration that is totally contained and capable of capacities up to 30kW," Makryllos says.

"It's like a data centre in a rack."

Chris Molloy, chief executive of TEX Solutions, says that US rack manufacturer Chatsworth is also pursuing a rack containment model.

"I'm definitely seeing a move towards rack containment," he says.

Goode concurs. "It's becoming the next wave of thinking in data centres."
