Inside the Facebook Green Data Centre: Pictures

Facebook’s data centre in Oregon is an energy efficiency leader. We took a look inside

In the midst of the wide-open spaces of farm and grazing land, the central Oregon town of Prineville (population 9,253) has a new type of farm: one producing the Web services that interconnect people. Like horses, cows and sheep, data centres need water, fresh air and plenty of space, and Facebook’s green data centre is the best known.

Unlike a farm, a data centre is mechanical and uses electricity – lots of electricity. With the Columbia River about 80 miles away providing hydroelectric power, Prineville has electricity in plentiful supply, which is why the world’s largest social network decided about four years ago to build its first wholly owned data centre here.

On 16 August we joined a small group of journalists shown round the new facility by Facebook.

The Facebook green story

As this 334,000-square-foot data centre comes online and starts processing more and more of the workload from Facebook’s huge customer base, it becomes an integral part of Facebook’s central computing strategy: To run a single, humongous, infinitely pliable computer system that expands and contracts as necessary to handle all its traffic efficiently.

There are many reasons – more than 75 of which the company has listed publicly – why Prineville was chosen, but suffice it to say that the most important are the ones listed above: land, power, water and fresh air. Certainly those are the most important environmental factors in the location of any data centre.

Prineville, founded in the 1870s, may be a bit on the backward side (it got its first Starbucks in 2006), but it is now on the cutting edge of the data centre business. Not only is Facebook building two huge data centres here – each of which could hold three of Wal-Mart’s superstores – but Apple is also planning to locate its own IT centre in the region.

Facebook, which is closing in on 1 billion registered users and is pounded by billions of transactions each minute, realised early on that it was going to need to own its data centres. The company’s growth simply could not be sustained on rented server space.

The Prineville facility opened in April 2011 following a two-and-a-half-year construction period. It is custom-built for Facebook’s purposes and uses the company’s Open Compute Project architecture.

Open Compute cuts power use

Facebook launched the Open Compute Project (OCP) on 7 April 2011. It is an unprecedented attempt to open-source the specifications the company employs for its hardware and data centres. The OCP held its second summit event in May 2012 in San Antonio, drawing more than 500 attendees.

As part of the project, Facebook has published specs and mechanical designs used to construct the motherboards, power supplies, server chassis, and server and battery cabinets for its data centre. That’s unprecedented enough for a company of Facebook’s growing scale, but the social network is also open-sourcing specs for its data centre’s electrical and mechanical construction.

The move is surprising because Facebook closely guards the information inside its walled garden. It has had to endure its share of flak from users about how it handles personal information, which the company relies upon to earn income.

Nonetheless, Facebook isn’t fearful of showing the world exactly how its system is set up. Could this lead to future security problems? There’s no way to tell at this point, but Facebook appears confident in its ability to separate IT structure from its data stores of personal and business information.

Who would have thought the company would open-source the technological blueprint for how it delivers and supports all that data in the cloud? Indeed, Facebook’s approach diverges completely from the strategies of other Internet companies. Google, Twitter and Amazon closely guard their data centre and hardware specifications to maintain a competitive edge in the cutthroat cloud-computing market. Why is Facebook giving away its specs to other companies?

“It’s amazing how much can happen in a year,” Frank Frankovsky, Facebook engineer and founding board member of OCP, told eWEEK. “In April 2011, when we open-sourced a set of server and data centre designs under the name ‘Open Compute Project,’ we weren’t sure what to expect. It was our hope that we could inspire the industry to be a little more open, a little more innovative and a little more focused on energy efficiency.

“It seems to have worked, although there’s still a lot more to do.”

Show and tell day

So, what did we see on our tour? Most data centres don’t give tours, much less post a real-time PUE meter right in the lobby; Facebook does both.

Power usage effectiveness (PUE) is a standard measurement of the energy efficiency of a data centre (although there are alternatives, notably FVER). It is calculated by dividing the total power entering the data centre by the power actually used by the IT equipment inside it. The lower the number, the better; a PUE of 1.0 is the lowest possible.

The U.S. Environmental Protection Agency regards a PUE of 1.5 as “best practice”. Facebook’s data centre was running at 1.11 on the day of our visit, and it averages between 1.05 and 1.18.
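
For readers who want to see the arithmetic, here is a minimal sketch of the calculation in Python. The kilowatt figures are invented purely to reproduce the ratios quoted above; they are not Facebook’s actual meter readings.

```python
# Power Usage Effectiveness = total facility power / power delivered to the IT equipment.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

# Hypothetical figures chosen only to illustrate the ratios.
print(round(pue(5550, 5000), 2))  # 1.11 -- the reading on the lobby meter that day
print(round(pue(7500, 5000), 2))  # 1.5  -- the EPA's "best practice" benchmark
```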

Unique air filtration

Data centres, especially super-high-transactional facilities such as Facebook’s, which handle billions of Web and mobile transactions per minute, get very hot with all those servers working at the same time. Facebook took special care both to filter and humidify the air coming into the facility and to make sure the hot exhaust doesn’t find its way back inside.

The airflow control process is also state of the art. The company eschewed the air-conditioning units that conventional data centres most often deploy and instead adopted free-air cooling with outside air, combined with evaporative cooling – an approach that has become standard practice in new “green” data centres.

Chief designer Jay Park and his group use a vacuum-type setup that brings in air from the outside along an entire 300-foot-long wall and then forces it laterally through a second wall containing purified water — a subprocess that amounts to a huge mister. The air that comes through that wall is as cool and comfortable to people as a high-end air conditioning system in a department store.

Finally, the cool, newly humidified air is forced through a wall of hundreds of lightweight paper filters before being drawn down into the data centre – not through standard ducts, but through 13 wells, each 6 feet by 6 feet and 14 feet deep, placed around the always-warm data halls containing some 15,000 servers that crank away 24/7.

This air moves down through the server rooms and maintains steady temperatures throughout. Intelligence is built into the system; as it gets hotter outside, the system works harder to keep everything cool inside. When it’s cooler outside, the system winds down to save power. This is a large part of what keeps the data centre’s PUE so low.
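
As a rough illustration of that behaviour – and only an illustration, since Facebook has not published its building-management logic – a temperature-driven control loop might look something like the sketch below. The thresholds, the idle effort and the linear ramp are all assumptions made for the example.

```python
def cooling_effort(outside_temp_c: float) -> float:
    """Return fan/misting effort as a fraction of maximum (0.0 to 1.0).

    Illustrative only: the 18C/35C thresholds and the linear ramp are
    assumptions for this sketch, not Facebook's actual set points.
    """
    low, high = 18.0, 35.0
    if outside_temp_c <= low:
        return 0.2  # cool outside: the system winds down to save power
    if outside_temp_c >= high:
        return 1.0  # hot outside: fans and evaporative misting at full tilt
    # Ramp effort linearly between the two thresholds.
    return 0.2 + 0.8 * (outside_temp_c - low) / (high - low)
```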

The solar contribution

Facebook, very conscious of its carbon footprint, uses solar power to augment its hydroelectric supply. It’s only a small percentage at this point, but the company plans to increase the sun’s contribution over time.

The Prineville environment was carefully chosen for this $210 million investment (although a lot more capital expense has gone into the project since that number was released last spring). In summer the air is mostly arid and temperatures can get hot – it was 93 degrees Fahrenheit (about 34°C) on the day of our visit in August. Conversely, in winter it can get very cold, yet the data centre is agile enough to handle the huge variance in temperature easily with its automated controls.

Half the servers in the data centre are what Facebook calls “store-bought” servers, from Rackable, Dell, HP, IBM and several other vendors. These are slowly being phased out after a normal three-to-five-year lifespan, largely determined by spinning disks running at top speed 24 hours a day.

The other half of the server farm consists of custom-designed memcache servers that are expressly built for moving Web content. You won’t find video cards or anything extraneous inside these machines. They’re not for everybody, Facebook Director of Site Operations Tom Furlong said, but they work perfectly for Facebook’s purpose.

Custom-built servers

Facebook is using custom-built 1Gb and 10Gb servers that are stripped down to the bare essentials for what Facebook needs to do: move tons of Web-based data fast, and to the right place, for millions of simultaneous users.

Officials did not hazard a guess at how many of these are firing away in this one data centre. There are 14 server aisles; Facebook calls them data halls. Each half-aisle has approximately 20 to 25 racks of servers, with an average of 22 19-inch-wide servers per rack. If you do the math, this comes out to about 15,400 servers – 7,700 on each side of the main central aisle.
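
The back-of-the-envelope arithmetic behind that figure, assuming the upper end of the rack range quoted on the tour, looks like this:

```python
# Rough server count from the tour figures (not an official Facebook number).
aisles = 14                 # the "data halls"
half_aisles = aisles * 2    # each aisle is split by the main central aisle
racks_per_half_aisle = 25   # upper end of the 20-25 range quoted
servers_per_rack = 22

total = half_aisles * racks_per_half_aisle * servers_per_rack
print(total)       # 15400 servers in all
print(total // 2)  # 7700 on each side of the central aisle
```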

Anything to do with financial or personal information – credit card info, receipts, purchase info and so on – is maintained in servers behind a high-security cage. Not even the data centre manager can get into this area easily. Once someone is inside, he or she is videoed and monitored at all times. “We take our users’ trust very seriously,” Data Centre Director Ken Patchett said.

Facebook technicians use custom-designed work carts to bring tools and new parts to service servers. Every server is connected to all others so that when a disk or any component burns out or stops for any reason, the I/O through that unit simply moves to another one at exactly the time of the outage.
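
Schematically – and this is only a sketch of the general idea, not Facebook’s actual load-balancing or replication machinery – that failover behaviour amounts to routing requests around any unit that drops out of the pool:

```python
import random

healthy = {"server-01", "server-02", "server-03"}  # hypothetical pool of identical units

def route(request_id: int) -> str:
    """Send a request to any server that is currently healthy."""
    return f"request {request_id} -> {random.choice(sorted(healthy))}"

healthy.discard("server-02")  # a disk burns out; the unit drops out of the pool
print(route(42))              # traffic simply continues on the remaining servers
```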

At the moment, 64 full-time employees run the Facebook data centre in Prineville. About 250 construction jobs over a span of two and a half years added up to roughly 1 million man-hours of work on the massive project.

Prineville is but the first major data centre project for Facebook. Others are currently being built in Forest City, North Carolina, and in Luleå, Sweden, using the same principles as Prineville.

More space at Prineville

Facebook is in the process of constructing a second data centre building next door to its original Prineville facility. The second location will focus on storage of data only – including all photos, videos, documents, cached Web pages, user information, logs – everything. That one’s probably a year away from completion.

Furlong, one of the designers of the Prineville facility, told eWEEK that in addition to the centre helping to carry out the mission of Facebook, “we wanted to build the most efficient system possible. It’s the combination of the data centre, the server and the software that makes it happen. And now we know how to do this in various regions of the world so that they all work together.”

Patchett summed up Facebook’s – and its newest data centre’s – mission in this way: “Anywhere you are in the world, at any given time, no matter what day or what hour it is, there’s always somebody waking up or going to sleep. And when they get on and they go to Facebook.com and they want to interact with their friends and see what’s going on in the world, you’ve got to be available.”

Chris Preimesberger is Editor of Features and Analysis for eWEEK. Twitter: @editingwhiz