You need to shift millions of repos to AWS without any downtime. How? Bitbucket engineering chief tells all

And you need to do it during a pandemic while working from home


How does one rebuild an aeroplane, at 40,000 feet, while it is still full of passengers? That was the question posed by Daniel Tao, head of engineering at Bitbucket, while discussing the source shack's move to AWS data centres.

The outfit found itself having to migrate 50 million repositories out of parent company Atlassian's bit barns and shunt them into Amazon's cloud without falling into a heap, all while keeping up with the billion or so daily interactions. Oh, and it had to sort it all out during a global pandemic.

Bitbucket Cloud has been around for over 10 years and, along with an on-premises version, is Atlassian's take on source wrangling. It was augmented back in April with Open DevOps, built around Jira, Confluence and Opsgenie as well as Bitbucket, but, unlike the rest of the Atlassian line, had remained firmly in its own data centres.

The service was bedevilled by outages in 2019 and storage woes in 2018. A change was due. And not just the yanking of support for Mercurial in 2020.

"Its architecture has always assumed it would be in a data centre," said Tao. "And so we had to really redo key aspects of Bitbucket's architecture, and rebuild it kind of in a new way in a cloud environment, while still operating our data centres."

Rival DevOps outfit GitLab memorably shunted its own data to Google's cloud in 2018, shortly after GitHub's acquisition by Microsoft.

As for Bitbucket's migration, it took 18 months from start to finish, with the final push happening over three hours at the end of August. The pandemic complicated things as the team faced up to "the same curveballs that were thrown at every tech company of just learning how to work remotely and coming up with new processes and rituals around that," according to Tao.

It also had to ensure sufficient capacity had been purchased and account for any last-minute issues in its plans.

From a technical standpoint, the greatest challenge was one of bandwidth as customer data was first replicated from Atlassian's servers to AWS's. "Every time a customer pushed to their Git repository, every time a customer left a comment on a pull request, and so forth, all that data was being replicated in real time with a lag of milliseconds into our new environment in AWS," said Tao.

The downtime at the end of August was then only required to point Bitbucket's services at this new "source of truth."

"The vast majority of our customers wouldn't have noticed anything," he claimed.

Which is all well and good from a technical standpoint, but some customers might be less than pleased at having their data moved to AWS.

"In terms of data locality," said Robert Krohn, head of Agile and DevOps engineering at Atlassian. "Our data centres are in the same sort of region, so we didn't have to inform our customers that we were moving stuff around."

That might raise an eyebrow or two, although Krohn added: "Some of the big customers we did talk to… but in general, that wasn't a constraint."

Although a leap into the land of Bezos might trigger an ulcer or two in some developers, if one has already signed up for Atlassian's cloud wares, the odds are that a chunk of one's data is already in AWS's data centres. Atlassian's internal Platform-as-a-Service, Micros, runs atop AWS and hosts the majority of the company's cloud products. This includes (after a few tweaks) Bitbucket.

"It was the final boss," remarked Tao.

And despite the move to AWS, the throat to choke if things go wrong is still Atlassian's.

Unsurprisingly, Tao and Krohn were keen to highlight the improvements in performance and scalability after migration. "Behind the scenes," said Tao, "the amount of incidents has dropped to zero." A glance at the company's status page shows just one problem since August's final switchover: an issue with pipelines that cropped up earlier this week.

However, there are still those pesky data centres, now redundant and stuffed full of customer data, to deal with. "We have to take the data that is on those hard drives very seriously," said Krohn, "and we're going through a process of securely destroying them in a certified way."

But what of the remaining hardware? "We've had parties where we've sent people and said, oh, go and cut the cables," said Krohn.

Because, no matter how carefully you plan your migration or how many precautions you take, let's face it: there ain't no party like a DC-decommissioning party. ®
