Facebook reveals radical Prineville data centre

A sneak peek at Facebook's new, energy-efficient facility.

Facebook has taken the covers off a new data centre located in Prineville, Oregon, which uses custom server hardware and a radical approach to cooling.

The facility, now in an operational state but still undergoing testing, is the result of two years of engineering work, and was launched to select members of the press and guests in April.

Custom hardware

Facebook's Prineville data centre features custom-designed servers, power supplies, server racks, and battery backup systems in an effort to consume less power.

The custom-designed racks each hold 90 servers. The servers themselves are taller than standard 1U rack units (about 1.5U), are completely serviceable from the front, and use Intel and AMD motherboards stripped of any component that does not contribute to efficient use of power.

The company hopes to set an example to its industry peers by publishing these designs in what it calls the "Open Compute Project".

Facebook has also designed custom power supplies that act as AC/DC converters, eliminating the need for a central uninterruptible power supply (UPS) system.

A unique building

The Prineville building uses what Facebook describes as a "penthouse" cooling system – with the server room located on one floor and various cooling apparatus on the floor above – with no ductwork present in the facility.

When temperatures outside are cool, the upper level draws outside air into mixing chambers; the air then passes through a bank of filters and a humidification chamber before large fans push it down into the server room.

When temperatures outside are too cold and dry (conditions that can create static electricity inside the facility), the incoming air is mixed with hot exhaust air that has been cycled through the servers and fed back up into the mixing chambers.

After the air passes through the filters, moisture is added; the mix then travels through a mist eliminator wall to ensure no liquid droplets can fall onto the server room floor below.

Excess hot air is pushed out of the top floor and off the roof of the building.
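The air-handling decision described above – pass cool outside air straight through, but blend in recirculated server exhaust when it is too cold and dry – can be sketched as a simple control rule. This is an illustrative sketch only: the function name, thresholds and linear blending are assumptions for clarity, not Facebook's actual control parameters.

```python
def exhaust_mix_fraction(outside_temp_c, cold_limit_c=10.0):
    """Illustrative fraction of hot server exhaust to blend with
    incoming outside air in the mixing chambers.

    NOTE: cold_limit_c and the linear ramp are hypothetical values,
    not taken from Facebook's published design.
    """
    if outside_temp_c > cold_limit_c:
        # Outside air is cool but not too cold: use it directly,
        # venting all server exhaust off the roof.
        return 0.0
    # Too cold/dry: recirculate exhaust to warm the intake air,
    # ramping up the blend as the outside temperature drops.
    return min(1.0, (cold_limit_c - outside_temp_c) / cold_limit_c)
```

On a mild day the function returns 0.0 (all outside air); as temperatures fall below the assumed limit, progressively more hot exhaust is recirculated, capped at 100 percent.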

Facebook designed its servers to run at 27 degrees Celsius, but has since found the machines are more resilient than first thought. The company now believes the threshold could safely climb above 29 degrees Celsius.

Facebook expects the designs to save it 24 percent on data centre costs and to use 38 percent less power.

iTnews has published a walk-through gallery of these features – with permission from Facebook.

Copyright © iTnews.com.au . All rights reserved.