Do you ever wonder where a video you post to Facebook ends up being stored?

Or how a message you send via WhatsApp or Messenger ends up getting to the person you send it to?

When you like a photo on Instagram, how does that like get pinned to the picture concerned?

And if you are a user of the Oculus virtual reality system, how is it that you can connect up with other users?

Well, drive out of Dublin along the M3 motorway and you will find the answer lurking behind bushes near Clonee.

Facebook data centre Clonee

Because that’s where Facebook, which also owns Instagram, Messenger, WhatsApp and Oculus, has built an extraordinarily large new data centre.

Now, it is fair to say data centres don’t exactly have a reputation for being particularly sexy. And that’s for two main reasons.

First, they aren’t much to look at in pictures, usually consisting of large industrial units housing hundreds or even thousands of racks of servers.

And second, the people who run these data centres, primarily big tech firms, tend to be very secretive about what goes on inside them, in order to keep what they store safe from hackers.

So it was quite unusual then that a group of media, including this correspondent, were invited by Facebook to venture inside its Clonee facility to learn more about what it does.

Facebook has rightly been in the spotlight for all the wrong reasons lately, with a sharp focus from users, tech experts and the media on how it handles users’ data and its policies around content moderation.

Its actions and inactions in this regard have drawn justified criticism and forced it to admit to and address failings that Mark Zuckerberg has acknowledged the company isn’t proud of.

Mark Zuckerberg

But it is quite clear when you visit the Clonee data centre that the social network’s employees are hugely proud of what they have built.

Announced in 2015, the €200m facility saw ground broken in April 2016 and quickly became the largest construction project in Ireland at the time.

After 7.2 million construction hours by 1,200 workers (at peak there were 1,550 on site), Clonee 1 and 2, as the first two buildings are called, are now up and running.

Clonee 3 is still being constructed and Facebook hopes to have it completed next year, while a planning application for Clonee 4 and 5 has been lodged.

As well as the 1,200 construction workers still on site building Clonee 3, there are now 300 permanent staff working at the centre in all sorts of capacities, including engineering and support services.

The employees come from all over the northeast region, making the plant a big employer.

This, Facebook claims, busts the myth that data centres aren’t good for the local or regional economy.

Niall McEntegart RTE

“In five plus years you are talking about 1,000-1,500 people in construction continuously in a sector that really needed it,” said Niall McEntegart (above), Facebook’s Datacenter Operations Director for EMEA and APAC.

“Plus when you look at the operations staff here, there’s over 300 people and that’s likely to grow by a couple of hundred by the time we are finished on this site.”

The scale of the facility is gigantic, with the entire campus occupying 250 acres, the same as 4.2 Dublin Zoos.

Each of the buildings is 377 metres long, the equivalent distance of walking from St Stephen’s Green to Trinity College Dublin.

The floor area of each unit is 25,500 square metres, the same as four Aviva Stadium pitches, and eventually Facebook hopes there will be five such buildings.

So far, 11,800 tonnes of steel – enough for two Eiffel Towers – have been used in the construction of Clonee 1 and 2, along with 58,000 cubic metres of concrete.

Tech firms choose Ireland to build data centres for a number of reasons.

“We’re really well connected in terms of fibre here in Ireland and that’s improving all the time in terms of cross-Atlantic connections into Europe,” said Mr McEntegart.

But it also makes sense to build data centres here because of the temperate climate.

Servers generate considerable amounts of heat and the facilities have to be kept at a constant moderate temperature.

Traditionally data centres used direct cooling systems to keep the infrastructure from overheating.

But Facebook claims to have pioneered a new indirect cooling mechanism – a heat-exchanging unit that ensures air in the data halls does not mix with outside air, which could risk contamination.

The system is also 50% more efficient to run than standard cooling – a saving equivalent to the annual power consumption of 7,000 homes.

The summer heatwave did increase the need to use water to keep humidity within the optimum 20-80% range, but otherwise didn’t cause any major issues, Mr McEntegart said.


To bring the necessary power into the plant, Facebook has built a 220kV substation, the first privately built substation of its kind in Ireland.

Because they consume vast amounts of power, data centres have been criticised by environmentalists for increasing reliance on fossil fuels and emissions.

Some data centre operators claim to run them on renewable energy.

In reality though, this often means they have just purchased power from a company that sources renewable power, but haven’t actually ensured new renewable capacity has been added.

Facebook claims it does things differently in this regard.

“These use a lot of electricity absolutely,” said Mr McEntegart. “But we’ve made a lot of effort in recent years to address that into the future.”

Facebook has an agreement in place with Brookfield Renewable Partners, which not only ensures the site is 100% powered by wind energy, but also that it is actively investing in constructing renewable generation capacity.

So far, Facebook said it has added 2,500MW of renewable infrastructure including solar, wind and hydro globally in the last year alone and this is increasing all the time.

“We don’t deal in credits, we deal in real investment,” said Mr McEntegart.

“And we are partnering those investments on the grids where the power is actually being used.”

Structurally, each of the data centre buildings is incredibly complex, with 200,000km of fibre cable stretching as far as the eye can see. Enough indeed to go five times around the Earth.
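The cable-length comparison is simple arithmetic: taking the Earth's equatorial circumference as roughly 40,075km, a quick check (figures from the article, circumference from general reference) confirms the claim:

```python
# Quick arithmetic check on the fibre-length comparison in the text.
EARTH_CIRCUMFERENCE_KM = 40_075  # equatorial circumference, approx.
fibre_km = 200_000               # fibre installed at Clonee, per Facebook

laps = fibre_km / EARTH_CIRCUMFERENCE_KM
print(f"{laps:.1f} times around the Earth")
```

The result rounds to 5.0, matching the article's figure.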

What is most surprising, though, is that all the data comes into and out of the data centre via a couple of dozen fairly standard-looking cables, situated in a number of different Main Point of Entry rooms.


“What happens in effect is that the data travels from your phone across the mobile network down fibre to a building just like this one,” Mr McEntegart said.

These central cores of fibre then fan out through a Main Distribution Frame room, where there are 500,000 individual connections, into the main data halls.

These house many hundreds of servers, each stacked one above the other in a rack, with colourful lights twinkling and fans humming.

Facebook said it has invested considerable resources in completely redesigning the servers, simplifying the equipment, increasing automation and making it more efficient – 38% more than traditional data centre setups.

The tech giant said it has been through this process in an open manner, sharing what it has learned through the Open Compute Project so that others can benefit from the advances too.

Not surprisingly, all areas across the whole complex are extremely secure.

Biometric fingerprint readers control access and security guards roam the perimeter in vehicles and on electric scooters.


There is also resiliency and redundancy built into Facebook’s data centre network, so if one part or indeed an entire data centre shuts down, another section in Clonee or a facility elsewhere in the world will pick up the slack within milliseconds.

Users’ data is stored in more than one location, to avoid any data loss. And if servers have to be removed, they get the full treatment.
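The article doesn't describe Facebook's actual storage systems, but the principle it mentions – keeping each piece of data in more than one location so that no single failure loses it – can be sketched in a few lines. All names here are illustrative, not Facebook's real infrastructure:

```python
# Illustrative sketch of location-redundant storage: each object is
# written to several independent "data halls", so losing one hall
# (or a whole data centre) loses no data. All names are hypothetical.

REPLICATION_FACTOR = 3

class ReplicatedStore:
    def __init__(self, halls):
        # Each "hall" is a plain dict standing in for an independent facility.
        self.halls = halls

    def put(self, key, value):
        # Write to the first N halls; a real system would place replicas
        # in separate failure domains (different buildings or regions).
        for hall in self.halls[:REPLICATION_FACTOR]:
            hall[key] = value

    def get(self, key):
        # Any surviving replica can serve the read.
        for hall in self.halls:
            if key in hall:
                return hall[key]
        raise KeyError(key)

halls = [{}, {}, {}]
store = ReplicatedStore(halls)
store.put("photo:123", b"like")

halls[0].clear()  # simulate an entire hall going offline
print(store.get("photo:123"))  # the data survives: another replica answers
```

Real systems add consistency protocols and automatic re-replication on failure, but the survival property is the same.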

“If you have a failure on this equipment or we decommission it because it gets old, we do a seven-stage repeated wipe on this, followed by fully grinding the hard drives themselves down to powder,” said Mr McEntegart.
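A repeated wipe of the kind Mr McEntegart describes can be illustrated in software. This is a toy sketch over an in-memory buffer – real drive sanitisation tools work on the raw device, and as he notes, the process ends with physical destruction anyway:

```python
import os

PASSES = 7  # the article quotes a seven-stage wipe

def multi_pass_wipe(buffer: bytearray, passes: int = PASSES) -> None:
    """Overwrite a buffer repeatedly with random data, then zeros.

    Toy illustration of a repeated-wipe policy; real decommissioning
    operates on the physical drive, not an in-memory buffer.
    """
    for _ in range(passes - 1):
        buffer[:] = os.urandom(len(buffer))  # random-fill passes
    buffer[:] = bytes(len(buffer))           # final pass: all zeros

data = bytearray(b"user data to be destroyed")
multi_pass_wipe(data)
print(data == bytes(len(data)))  # → True: original contents are gone
```

The number of passes and the pattern used (random vs fixed) vary between sanitisation standards; seven passes is simply the figure quoted in the article.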

Outside, Facebook said it has tried to ensure the facility is sympathetic to its surroundings, drawing design cues from the nearby landscape of the Hill of Tara and Brú na Bóinne world heritage site.

It has built in lakes, actively planted native species and even set up ten beehives, which are tended to by members of staff, including Mr McEntegart.


The social network said it is also conscious of being a good neighbour, and has just launched its Community Action Grant Program. It is inviting applications for funding for projects that address community needs, like STEM education or connecting people online or offline.

Data centres may not have the best of reputations, for many reasons, some justified.

But in a world where data is the lifeblood of the global economy and demand for it is rising at an exponential rate, they are and will increasingly be a necessity.

So the next time you ‘Like’ a post, upload a video or send a message on a Facebook-owned platform, you may or may not have helped to “bring the world closer” as the company would wish.

But you can raise a little smile at the fact that you will probably have caused a tiny LED in a dark and noisy data hall in Clonee to flicker on and off, if only for a millisecond.

Comments welcome via Twitter to @willgoodbody




