How Facebook’s Homegrown Data Centers Serve Billions Of Users, Now And In The Future

Do you enjoy looking at your friends’ Instagram photos? Cherish the Facebook updates about your cousin’s new baby? Rely on Messenger to chat with your college friend living abroad? Or love the immersive nature of Oculus Rift experiences?

None of that would be possible without the people who’ve developed, built, and maintained Facebook’s global data center infrastructure.

In 2009, when Facebook had just a fraction of the 1.8 billion users it has now, not to mention none of the other apps and companies it owns—Instagram, Messenger, WhatsApp, and Oculus—the company operated a single data center, really just a set of servers, at a location close to its Silicon Valley headquarters.

Soon, though, as Facebook vice president of engineering Jay Parikh explained to me recently, the company realized it needed to expand to a second location, on the East Coast. In short order, more expansion followed.

Everything Facebook does is “very, very interconnected,” Parikh said. “It’s not something where you can say, Hey, let’s take our users in California, and put them on servers in California” and do the same for users in other geographic areas. All users are connected to everything the company does, and to all other users, and that presented the company with significant engineering challenges.

In those days, Facebook still relied entirely on third-party hardware and co-location facilities for its server infrastructure. Over time, though, it has shed that reliance on outside technology and facilities and, starting in 2009, has built its own network of data centers, infrastructure that it believes is among the most energy-efficient in the industry and that is essential to delivering the daily experience of its gigantic user base.


The idea? To make it possible for Facebook’s engineers, and those building its apps, to develop new services and then quickly deploy them throughout the entire user base.

An example? Taking something like Facebook Live, which was originally developed as a hackathon project, and launching it to the full Facebook community within five months.

“We treasure that as a core part of our culture,” Parikh said, “to move fast and deliver experiences to a very big community.”

More Complex Apps Demand More Complex Infrastructure

As Facebook began building out its own data centers, it was tempting to simply “lather/rinse/repeat” the kind of facility it had built in Prineville, Oregon, in 2011. In fact, said Parikh, it became an in-house joke that that’s all the company needed to do.

But as the company began offering more immersive experiences, both through the Facebook service itself and then through its other apps, it realized that it needed to ramp up the power and energy efficiency of its new data centers in order to maintain economic efficiency, and to future-proof against further demands on its systems that might come from increased reliance on artificial intelligence and machine learning.

After all, systems set up to work well in 2014, Parikh pointed out, aren’t likely to be ready for 2017.

“We don’t want to play it safe and get conservative,” he said, “and get complacent in how we’re thinking about the technology.”

Green Data Centers

While Facebook itself—the “big blue app”—has 1.8 billion users, the company’s other apps serve at least 2.5 billion more. All of that computing demand requires a global data center network of unparalleled strength and efficiency.

Now, Parikh said, Facebook is “thinking about how to build the platform, make it scalable and reliable for all apps and services, and make it ready for new immersive [services] like live video and 360-[degree] video.”

Today, Facebook has seven data centers around the world: five in the United States, spread across Oregon, Iowa, Texas, New Mexico, and North Carolina, as well as one each in Ireland and Sweden. And that number is growing rapidly, though the company isn’t sharing any future numbers. Parikh did say that it typically takes Facebook 12 to 18 months to take a new data center from groundbreaking to opening.

As that network has grown, so have bandwidth demands. Facebook is “pushing very aggressively” for 100-gigabit-per-second (Gbps) interconnects between the data centers, and the company is already trying to figure out how to stretch that to 400 Gbps.

Even as that bandwidth demand ramps up energy usage, Facebook has committed itself to a data center network that relies on green power. Last year, the company said it expects its data centers to run on 50% clean and renewable energy by 2018. The plan, in fact, is for its facilities in Iowa, Texas, Sweden, Ireland, and New Mexico to run entirely on wind, solar, or hydroelectric power.

One way that’s possible is by limiting the amount of power used in the first place. To do that, Facebook said it has developed systems that waste, on average, just 6% to 8% of the power they draw, versus what it says is an industry standard of 50% to 60%. Google claims its number is about 12%.

The company is confident enough in its power usage effectiveness (PUE) numbers that it shows them in real time for each of its data centers.
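For context, PUE is simply the ratio of total facility power to the power that actually reaches the IT equipment, so the “waste” figures above can be read straight off it. A rough translation (assuming the quoted waste is overhead relative to the IT load; the article doesn’t spell out the exact accounting):

```latex
% PUE compares everything the facility draws to what the servers use:
%   PUE = P_total / P_IT
% The overhead ("wasted" power) is then:
%   overhead = PUE - 1
\[
\mathrm{PUE} = \frac{P_{\text{total}}}{P_{\text{IT}}},
\qquad
\text{overhead} = \mathrm{PUE} - 1
\]
```

On that reading, Facebook’s 6% to 8% waste corresponds to a PUE of roughly 1.06 to 1.08, Google’s figure to about 1.12, and the cited industry standard to a PUE of 1.5 to 1.6.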

One major way it achieves that low overhead is with a facility design in which outside air is brought in to cool the servers, rather than relying on expensive and power-hungry air-conditioning systems. The heat from the servers is then exhausted out of the buildings. In colder climates or seasons, some of that warm air can be routed back inside, reducing the reliance on heaters.
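The article doesn’t detail the control logic behind that design, but the decision it describes (use outside air when it’s cool enough, blend server exhaust back in when it’s cold) amounts to a simple economizer rule. Here is a minimal, hypothetical sketch; the thresholds and function name are illustrative, not Facebook’s actual system:

```python
# Hypothetical airside-economizer decision, illustrating the design described
# above. Thresholds are invented for illustration; real facilities also weigh
# humidity, filtration, and supply-air setpoints.

def cooling_mode(outside_temp_c: float,
                 supply_setpoint_c: float = 24.0,
                 min_mix_temp_c: float = 10.0) -> str:
    """Pick how to condition server intake air for the current conditions."""
    if outside_temp_c > supply_setpoint_c:
        # Outside air alone is too warm; assist with evaporative cooling.
        return "outside air + evaporative assist"
    if outside_temp_c < min_mix_temp_c:
        # Very cold: blend warm server exhaust back in so intake air stays
        # within the hardware's operating range (the "route warm air back
        # into the building" case above).
        return "outside air mixed with recirculated exhaust"
    # In between: free cooling on outside air alone.
    return "outside air only"

print(cooling_mode(4.0))   # cold day: mix in recirculated exhaust heat
print(cooling_mode(18.0))  # mild day: outside air only
print(cooling_mode(30.0))  # hot day: evaporative assist
```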

In the end, what this all means is that Facebook knows its user base, and the complexity of the apps and services it delivers, will continue to grow, and that if it doesn’t prepare for that growth, it won’t be able to keep up.

The company’s users expect high performance, no matter which tool they’re using. Without an adaptable and extensible data center infrastructure, there’s no guarantee Facebook can deliver on its core mission: making it possible for people to share their lives with family and friends, and for the entire world to become a more connected place.

 

Fast Company