Bored with Blighty? Relocation lessons for the data centre jetset

Green expertise, language barriers and constant cooling issues


The market where time stood still

Bounteous, cheap power in a country where environmentalism is painfully low on the agenda has hit the progression of data centre design hard. Where most European data centres have been embracing free air cooling, hot- or cold-aisle containment, and even heating office spaces with waste heat, some US data centre operators still rely on chilling the whole data floor, which is ludicrously wasteful. With the cost of oil from the Middle East dropping fast, however, things are starting to change.
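The gap between the two approaches is usually captured by Power Usage Effectiveness (PUE): total facility power divided by IT equipment power, with 1.0 as the unreachable ideal. A minimal sketch, with made-up wattages purely for illustration:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """PUE = total facility power / IT equipment power (1.0 is ideal)."""
    return total_facility_kw / it_load_kw

# A site chilling the whole data floor can burn nearly as much power on
# cooling and overheads as on the servers themselves (figures invented)...
legacy = pue(total_facility_kw=1900, it_load_kw=1000)

# ...while a free-air-cooled site with aisle containment wastes far less.
modern = pue(total_facility_kw=1150, it_load_kw=1000)

print(round(legacy, 2), round(modern, 2))
```

Every point of PUE above 1.0 is power the customer pays for that never reaches a server, which is why cheap electricity has let the older design limp on.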

Ashburn, Virginia, is fast emerging as the very best America has to offer in the data centre world, and it has an awful lot more going for it than just being located in a geologically boring (read: stable) part of the country. Near Washington Dulles airport, the area is affectionately known as "data centre alley" and forms the main connectivity backbone for the US.

What brings people here?

Depending on the source, you might hear that this area carries between 50 and 70 per cent of the world's – not just the US's – internet transit. Ashburn is rapidly becoming one of the premier global hubs, and there are some tasty tax incentives for technology businesses bringing employment to the area to further sweeten the pot.

Being within driving distance of America's largest naval shipyard in Norfolk, Virginia, has brought with it an unexpected benefit: an amazing talent pool. Where most data centres in the US – certainly those outside the new builds coming from the likes of Facebook, Microsoft and Apple – have been built to the same tired spec for twenty years, Virginia data centres are acquiring staff from the US Navy who are used to constructing battleships and maintaining nuclear submarines. This has led to some truly amazing power and efficiency improvements in the area, with one provider in particular granted a patent for 2N+2 redundancy on every single data centre infrastructure component, a totally unprecedented level of power and cooling availability.
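For readers unfamiliar with the jargon, the schemes differ in how many units can fail before the load does. A rough sketch of the arithmetic, for a load needing N units of capacity (the numbers are illustrative and nothing here reflects the patented design itself):

```python
def units_provisioned(n: int, scheme: str) -> int:
    """Units installed under a given redundancy scheme for an N-unit load."""
    return {
        "N":    n,          # no redundancy: any failure drops the load
        "N+1":  n + 1,      # one spare unit
        "2N":   2 * n,      # a fully duplicated system
        "2N+2": 2 * n + 2,  # duplicated system plus a spare on each side
    }[scheme]

def failures_survivable(n: int, scheme: str) -> int:
    """How many units can fail while N still remain to carry the load."""
    return units_provisioned(n, scheme) - n

# For a load needing four units of, say, cooling:
for scheme in ("N", "N+1", "2N", "2N+2"):
    print(scheme, failures_survivable(4, scheme))
```

Applying 2N+2 to *every* infrastructure component, rather than just the power train, is what makes the claim so unusual.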

People power, and problems

The security situation in Ashburn is pretty hot, too – presumably another side effect of the prevalence of highly trained former Navy personnel. It’s simultaneously amazing and somewhat terrifying as a Brit to be led through data centres, those mundane, sterile bastions of technology, by security guards wearing body armour and carrying automatic weapons at their hip.

South America, on the face of it, shares a lot of attributes with the US. While cooling is onerous and the geography and climate of the continent can provide some interesting issues, land is cheap and running costs are low, with the added benefit that techies are cheaper still, which makes it a very attractive proposition. In recent times, major players – including Equinix, Verizon and Microsoft's cloudy services – have established significant footholds there.

When you’re working in a foreign land that might be an eight-hour flight (or more) away, remote hands will quickly become your best friend. These facilities are generally staffed by experienced engineers who can stand up environments in no time at all, ready for the customer’s sysadmins to perform the finer configuration remotely at their leisure.

It’s something of an easy gateway into a logistically difficult foreign market, and saves a lot of time and cost. It was when working with a third-party data centre in Brazil that I experienced one of the real peculiarities of South America – and working with data centres abroad in general – the language barrier.



Biting the hand that feeds IT © 1998–2020