Keep on trucking: Dropbox's Magic Pocket and the curse of the loading bay

Want to build a storage cloud? How many trucks can you unload in a day?

It's not something the average sysadmin has to worry about, but when you're rolling terabyte after terabyte into a data centre at a presto tempo, the loading bay becomes a bottleneck.

That's the kind of thing Dropbox faced in its rapid-fire move to build its own cloud outside the warm embrace of Amazon Web Services.

Anybody who's worked in stadium rock will sympathise with the problem: indoors, there's a bunch of empty space to be filled; outside, there's a queue of semi-trailers; and between them, there's just one bay you can reverse a semi-trailer into.

Physical logistics, says Dropbox storage team lead James Cowling, were an unexpected challenge in the company's Magic Pocket project, which sucked 500 petabytes of customer data from 500 million users into in-house data centres.

It's been billed as Dropbox “getting off the cloud”, but in conversation with Vulture South, Cowling agreed that from the user's point of view, Dropbox has become its own cloud.

Until the Magic Pocket project, Cowling said, Dropbox had operated a hybrid model: “business logic, databases and Web services ran on our own data centres,” he said, with AWS acting as the bulk data store.

The switch to running the storage infrastructure in-house was, Cowling reckons, the biggest-ever migration away from a cloud service.

And until that time, the Australian expat (he encountered Dropbox people while studying for his MIT PhD) was used to thinking of mass storage in strictly Comp. Sci. terms, he told Vulture South.

“When I was doing my PhD in storage and distributed systems, I never had to think about what fits in the loading bay,” he said.

Creating such a big cloud in a short time – Magic Pocket was a two-and-a-half year project, but there are wrinkles in that we'll get to in a minute – changed that.

With between 30 and 40 racks to plug in each day, there were “a lot of interesting conversations about how to pull them into the data centre”.

It's not just about looking down a string of empty bays in the data centre, and telling the storage equivalent of road crew where to put things.

“The racks have to be in different hardware failure domains, plugged into different power supplies, different circuit breakers, different network connections,” he said.
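That placement constraint can be sketched in code. The following Python is purely illustrative — the rack, PDU, breaker, and uplink names and the round-robin policy are assumptions for the example, not Dropbox's actual tooling — but it shows the idea: each incoming rack is spread across independent power and network failure domains so no single feed, breaker, or switch failure hits a whole batch.

```python
# Illustrative sketch: round-robin newly delivered racks across
# independent failure domains (power feed, circuit breaker, network
# uplink) so adjacent racks never share a single point of failure.
from itertools import cycle

def assign_failure_domains(racks, power_feeds, breakers, uplinks):
    """Assign each rack the next power feed, breaker and uplink in turn."""
    feeds, brks, nets = cycle(power_feeds), cycle(breakers), cycle(uplinks)
    return {
        rack: {"power": next(feeds), "breaker": next(brks), "uplink": next(nets)}
        for rack in racks
    }

# Roughly a day's delivery at Magic Pocket pace: 35 racks across
# three independent feeds, breakers and uplinks (all names invented).
plan = assign_failure_domains(
    [f"rack-{i:02d}" for i in range(35)],
    power_feeds=["pdu-a", "pdu-b", "pdu-c"],
    breakers=["cb-1", "cb-2", "cb-3"],
    uplinks=["sw-1", "sw-2", "sw-3"],
)
```

With three domains per dimension, any two consecutively installed racks land on different power feeds, breakers and uplinks, which is the property Cowling describes.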

And the supply chain management had to cope with Dropbox's oft-recounted Bad Hair Day story, when two semi-trailers full of disks crashed in a single week.

Next page: Design and build
