Running Facebook's Forest City data centre

Onsite manager talks Open Compute, outdoor-air cooling and the need for cheap storage.

This week, we caught up with Keven McCammon, onsite manager at Facebook's flagship Forest City data centre in North Carolina.

Billed as one of the most energy efficient data centres in the world, and the first to deploy v2 Open Compute Project web servers at scale, Forest City is the template that all future Facebook data centres are set to follow.

We asked McCammon about the new technologies being deployed and the day-to-day running of the facility.

How has Facebook's recent Graph Search announcement affected operations at the Forest City data centre (if at all)?

We have added specific hardware resources to ensure that Graph Search functions as quickly and as seamlessly as Facebook's other services. We'll share more soon about the infrastructure we've built to support Graph Search.

Facebook recently announced it would be moving to an all-Flash data centre environment in conjunction with Fusion-io. Can you talk us through the implementation process? How far along is the migration, and is it likely to affect the facilities' size or employee headcount?

We already deploy Flash in our data centres, in many cases to serve the same purposes that spinning disks do, so it's not likely to have a significant impact on the size of the teams at our facilities. What Flash has the potential to do is to be more energy efficient and more flexible than spinning disks.  

Of course, we're not there yet. We need the industry to start producing a greater variety of Flash types, instead of just the premium products being sold now. We need, for example, cheaper types of Flash, with low write-endurance, to enable more cost-effective solutions to challenges like cold storage.

Only when we have a wide variety of Flash types, to match all the different use cases in the data centre, will an all-Flash data centre really start to become a realistic possibility.

What do the typical day-to-day operations at Forest City data centre involve?

The typical day involves installing and configuring new racks of servers and repairing any systems that fail. We've built significant automation into our data centre to alert us to any issues that might crop up and to suggest preventative maintenance work.

We've invested pretty heavily in these systems and in the overall operations of the facilities, and we currently maintain a ratio of one service technician for every 20,000 servers at Forest City. 
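As a purely hypothetical illustration of the kind of automation McCammon describes, the Python sketch below flags servers whose recent failure count crosses a threshold so a technician can be alerted; the server IDs and the threshold are made up and do not reflect Facebook's actual tooling.

    from collections import Counter
    from typing import Iterable

    def servers_needing_maintenance(failure_events: Iterable[str], threshold: int = 3) -> list[str]:
        """Return server IDs whose failure count meets or exceeds the threshold."""
        counts = Counter(failure_events)
        return sorted(server for server, count in counts.items() if count >= threshold)

    # Example: three failures on one (made-up) server trips the (made-up) threshold.
    events = ["rack12-srv04", "rack12-srv04", "rack03-srv17", "rack12-srv04", "rack03-srv17"]
    print(servers_needing_maintenance(events))  # ['rack12-srv04']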

Has the facility introduced any custom server/storage/network/racking configurations that hadn't been trialled previously?

We've deployed lots of new gear in Forest City, including Open Rack and the latest Open Compute Project web server and storage designs.

In some cases, being the second of Facebook's data centres has meant that we've been able to get the new gear more quickly. It'll be the same story in our newest data centre, in Lulea, Sweden. Lulea will come online later this year, and it will be our first data centre with 100 percent ODM [original design manufacturer]-built servers.

Has Facebook been able to pull racks from existing co-lo arrangements as it builds more of its own facilities?

We are in the process of moving capacity from our previous leased facilities to our wholly owned and operated facilities. We've done a lot of work in our leased facilities to make them more efficient, but you can do so much more when you control every aspect of the data centre.

Are there any other technology initiatives that make the Forest City data centre unique?

All our data centres are based on the designs that we've open sourced as part of our participation in the Open Compute Project, but we're always working on improving every aspect of our facilities.

Some of the cool things about our Forest City data centre are that we've deployed the newest versions of the Open Compute web server and storage designs, and we've upgraded all the networking in the facility to 10Gbps.

We've also started using a water-infused membrane instead of misters in our outdoor-air cooling system, and that's helped us become even more efficient in our use of water.

But the coolest thing about our Forest City data centre is that it's proven that we can make our outdoor-air cooling system work in a relatively hot and humid climate. We just went through the second hottest summer on record in North Carolina, and we didn't have to turn on our backup chillers once.

Our PUE at the facility over the summer was 1.07. We are really proud of that.
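For context, PUE (power usage effectiveness) is the ratio of total facility energy to the energy delivered to IT equipment, so 1.07 means only around seven percent of the facility's power goes to cooling, power distribution losses and other overheads. The Python sketch below uses made-up figures purely to show how the ratio is calculated.

    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        """Power usage effectiveness = total facility energy / IT equipment energy."""
        return total_facility_kwh / it_equipment_kwh

    # Illustrative figures only; a PUE of 1.07 means roughly 7% non-IT overhead.
    print(round(pue(total_facility_kwh=10_700_000, it_equipment_kwh=10_000_000), 2))  # 1.07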

Talk us through the data centre's cooling solution.

All our data centres use 100 percent outdoor-air cooling systems, to save the cost and environmental impact associated with the chillers you find in many other facilities. A lot of work went into designing those systems to be as efficient as possible, and we're always tweaking them to make them better.

A good example in our Forest City data centre is the use of a water-infused membrane instead of misters to add water to the incoming air, which we need to do to regulate the temperature and humidity of that air before we send it into the server hall. What we've found is that using the membrane instead of the misters allows us to use even less water in this process.
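As a rough sketch of the principle at work, direct evaporative cooling lowers the incoming air temperature towards its wet-bulb temperature; the effectiveness value and temperatures below are assumptions chosen for illustration, not figures from Forest City.

    def evaporative_supply_temp(dry_bulb_c: float, wet_bulb_c: float, effectiveness: float = 0.8) -> float:
        """Estimate supply-air temperature after a direct evaporative cooling stage:
        T_supply = T_dry_bulb - effectiveness * (T_dry_bulb - T_wet_bulb)."""
        return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

    # An assumed hot, humid summer afternoon: 35 C dry bulb, 24 C wet bulb.
    print(round(evaporative_supply_temp(35.0, 24.0), 1))  # ~26.2 C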

Can you give us some idea of how redundant the facility is?

We manage redundancy at several levels within the facility. We also have varying degrees of redundancy in our infrastructure as a whole.

What are the main challenges that have cropped up since the facility began operating?

We're pretty open about this stuff, and have shared a lot of our findings via the Open Compute Project. We really believe that the more the industry can collaborate on the challenges we all face, the more efficient and sustainable the entire industry will be.
