www.enr.com/articles/7287-editors-choice-green-project-facebook-data-center

EDITORS' CHOICE & GREEN PROJECT: Facebook Data Center

February 13, 2012

Facebook Data Center

Prineville, Oregon (Region: ENR California)

Project Team

Owner Facebook, Palo Alto, Calif.

Architect Sheehan Partners, Chicago

General Contractor DPR Construction, Redwood City, Calif.; Fortis Construction, Portland, Ore.

Engineers Alfa Tech, San Jose, Calif.; WHPacific, Bend, Ore.; Peoples Associates Structural Engineers, Milpitas, Calif.

Consultant Brightworks Sustainability Advisors, Portland, Ore.

Subcontractor Rosendin Electric, San Jose, Calif.

The Facebook Data Center is a project of many firsts. It was Facebook's first real estate purchase. It was the first data center to be designed, owned and operated by the social networking software giant. It is Facebook's first LEED-Gold certified building and one of the most energy-efficient data centers ever built.

In another first, Facebook is giving away the project's design secrets. In a nod to its hacker roots and open-source software, Facebook last year launched Opencompute.org, a repository for technical specifications and CAD drawings for the sustainable data center's mechanical and electrical systems, battery cabinets, rack systems and servers.

To make the 320,000-sq-ft project more sustainable than Facebook's leased facilities, a key decision was to integrate a new server design with the center's overall design, says Tom Furlong, director of site operations. To reduce power consumption, servers were custom built to eliminate unneeded components usually found in off-the-shelf equipment, such as video cards and multiple network interfaces.

In another deviation from the norm, designers cut out a stage of power transformers and used a higher voltage in the facility. "If you want to focus on efficiency in the electrical system, you want to limit those power conversions or make them with the most efficient components you can," Furlong says. In typical data centers, up to 25% of power is lost in these conversions; the rate was reduced to just 7% at Prineville.
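
For a rough sense of what that difference means, here is a minimal arithmetic sketch. Only the 25% and 7% loss rates come from the article; the 10-MW utility feed is a hypothetical value chosen purely for illustration.

```python
# Back-of-the-envelope comparison of power-conversion losses.
# The 25% and 7% loss rates are the figures cited above; the 10-MW
# utility feed is a hypothetical value used only for illustration.

utility_feed_kw = 10_000                 # assumed incoming utility power, kW

typical_loss = 0.25                      # up to 25% lost in a typical data center
prineville_loss = 0.07                   # roughly 7% lost at Prineville

typical_delivered_kw = utility_feed_kw * (1 - typical_loss)        # 7,500 kW
prineville_delivered_kw = utility_feed_kw * (1 - prineville_loss)  # 9,300 kW

extra_kw = prineville_delivered_kw - typical_delivered_kw
print(f"Extra power reaching the servers: {extra_kw:,.0f} kW "
      f"({extra_kw / utility_feed_kw:.0%} of the utility feed)")
```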

Much of the project's energy savings result from an innovative yet simple mechanical system that relies on evaporative cooling. Most data centers are noticeably chilly, some as cool as 55°F. At Facebook's facility in the high desert of central Oregon, the cold aisles are designed to operate at much higher temperatures. "In 2009, when we were doing this data center's design, the ASHRAE standard was changed to 80.6°F inlet temperature, but no one wanted to step up and be the first to adopt that new code," says Jay Park, the project's design engineer. "We took a big step and got there, proving that the [new] standard works—and we went even further." Prineville's second phase, along with Facebook's recently completed North Carolina data center, will operate using 85°F inlet air.

To cool the air, the mechanical system simply injects water mist as needed after the outside air is drawn through a massive bank of dust filters. "Prineville's high-desert location means they are able to use outside air routinely most times of the year, with evaporative cooling used during the summer months," says Eric Lamb, executive vice president at Redwood City, Calif.-based DPR Construction, which built the project in a joint venture with Portland, Ore.-based Fortis Construction. Variable-frequency-drive-equipped fans work in concert with onboard server fans to draw air over the server components at the ideal rate.
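
The cooling effect itself follows the standard direct-evaporative relation: supply air leaves at the outside dry-bulb temperature minus some fraction of the wet-bulb depression. The sketch below is illustrative only; the effectiveness and weather values are assumptions, not project data.

```python
# Minimal sketch of direct evaporative cooling: supply temperature equals
# the outside dry-bulb temperature minus an effectiveness fraction of the
# wet-bulb depression. Effectiveness and weather values are assumptions.

def evap_supply_temp_f(dry_bulb_f: float, wet_bulb_f: float,
                       effectiveness: float = 0.9) -> float:
    """Approximate supply-air temperature from a direct evaporative cooler (°F)."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# Hypothetical hot, dry summer afternoon in the high desert:
dry_bulb_f = 95.0    # outside air, °F
wet_bulb_f = 62.0    # low humidity gives a large wet-bulb depression, °F

supply_f = evap_supply_temp_f(dry_bulb_f, wet_bulb_f)
print(f"Supply air ≈ {supply_f:.1f} °F")  # ≈ 65 °F, well below an 80.6 °F inlet limit
```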

Despite the water needed for evaporative cooling, the building consumes a third of the water used by a typical data center, Furlong says. A mist-eliminator screen captures and recycles any minute water droplets that haven't been fully absorbed into the air, cutting water use by about 18%. Rainwater is harvested for gray-water use in the facility. Facebook also added a 100-kW photovoltaic array to supply all the power for the 5,000 sq ft of office space.
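
Reading the "about 18%" figure as the share of total water use that the mist-eliminator screen recaptures, the proportions work out as in this rough sketch. The baseline annual consumption is a hypothetical number; only the one-third and 18% ratios come from the article.

```python
# Rough sketch of the water figures quoted above. The baseline annual
# consumption is hypothetical; only the one-third and ~18% ratios are
# taken from the article.

typical_gal_per_year = 30_000_000        # assumed baseline for a comparable data center
prineville_gal_per_year = typical_gal_per_year / 3

# Water the facility would otherwise lose if the mist-eliminator screen
# did not recapture unabsorbed droplets (treating "about 18%" as a share
# of total use):
recaptured_gal = 0.18 * prineville_gal_per_year

print(f"Prineville: ~{prineville_gal_per_year:,.0f} gal/yr; "
      f"mist eliminator recaptures ~{recaptured_gal:,.0f} gal/yr")
```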

Located on a mesa above Prineville, the rocky site required blasting and excavating 60,000 cu yd of material, which was reground and used for fill, taking some 1,500 truckloads off the road, says David Aaroe, executive vice president at Fortis Construction.

The project consumed 620 miles of electrical wire and 48 miles of conduit. The team used building information modeling to design foundations, the steel structure, mechanical and fire protection systems, and underslab conduit. Using BIM, design-assist subcontractors and careful project management, DPR/Fortis brought the project in ahead of schedule and under budget. In phase two, the team expects to cut an additional month out of the project schedule. The as-built BIM will connect with facility management software to assist Facebook with downstream operations and maintenance, Lamb says.

The project benefited one of Oregon's most economically depressed communities. A report prepared by Portland-based ECONorthwest at the behest of Facebook estimates the data center's capital costs at $210.4 million, with $142 million staying in state. Aaroe says that of the more than 2,000 workers who went through safety orientation, more than 70% were from central Oregon.

The decision to open up the data center's design specs to the industry aligns with Facebook's culture, which encourages participation in the open-source community. The company was built around open-source software, says Furlong. Opencompute.org has drawn industry giants such as Intel into the effort. "We feel our competitive advantage is in our software and the service we deliver," not in the data center, Furlong says. "We'll get a lot more out of it by sharing than we will by being insular about it."