BBC makes switch to AWS, serverless for new website architecture, observers grumble about the HTML

News aggregator says it's 'way more complicated and much harder to parse'


Updated The BBC website, the sixth most popular in the UK, has mostly migrated from the broadcaster's bit barns to Amazon Web Services (AWS), with around half the site now rendered using AWS Lambda, a serverless platform.

"Until recently much of the BBC website was written in PHP and hosted on two data centres near London," Matthew Clark, head of architecture, said recently. "Almost every part has been rebuilt on the cloud."

PHP runs fine in the cloud, but this is not a matter of lift and shift. Instead, the BBC team devised a new architecture based on serverless computing. It also endeavoured to combine what used to be several sites – such as News, Sport, and so on – into one, though Clark said the World Service, iPlayer video, and the radio site BBC Sounds remain separate.

The rest have been combined into a new thing called WebCore. "By focussing on creating one site, rather than several, we're seeing significant improvements in performance, reliability, and SEO," said Clark.

Web traffic initially hits a Global Traffic Manager (GTM), an in-house solution based on the Nginx web server and running partly on-premises (showing that the BBC has not entirely ditched its data centres) and partly on AWS. GTM handles "tens of thousands of requests a second," said Clark. A second layer on AWS handles caching and routing, before hitting functions running on AWS Lambda, which perform server-side rendering (SSR) of dynamic content using React, a JavaScript framework.

Server-side rendering means the browser gets a page ready to view without having to do a lot of work, and thus it should appear instantly, though it increases the burden on the server – caching mitigates this, we note. Walmart engineer Alex Grigoryan, who also oversaw a migration to SSR, tested SSR vs client-side rendering (CSR) and said: "When we did A/B tests on SSR vs CSR... our numbers showed better engagement from the customer with rendering early," though he noted increased server load as a major disadvantage.
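To make the SSR pattern concrete: a Lambda-style handler assembles the full HTML on the server and returns it, so the browser has something to paint immediately. The sketch below is illustrative only, not the BBC's code; the BBC renders with React's server APIs, but a plain template stands in here so the example has no dependencies.

```javascript
// Hypothetical sketch of server-side rendering in a Lambda-style handler.
// In the BBC's stack React does the rendering; a string template stands in.

// Pretend "component": turns article data into markup on the server.
function renderArticle({ headline, body }) {
  return `<article><h1>${headline}</h1><p>${body}</p></article>`;
}

// Lambda-style handler: the browser receives ready-to-display HTML,
// so no client-side rendering work is needed before first paint.
function handler(event) {
  const html = renderArticle(event.article);
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'text/html' },
    body: `<!doctype html><html><body>${html}</body></html>`,
  };
}
```

The trade-off Grigoryan describes is visible here: all the work of building the markup now happens per-request on the server, which is why a caching layer sits in front of the render functions.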

In the BBC's case, Lambda is used, which is able to auto-scale on demand. "About 2,000 lambdas run every second to create the BBC website; a number that we expect to grow," said Clark. He added that Lambda scales better than VMs on the AWS Elastic Compute Cloud (EC2), saying that "our traffic levels can rocket in an instant; Lambda can handle this in a way that EC2 auto-scaling cannot."

Another aspect of the BBC site is the logic that goes into requesting content, which Clark calls the "business layer". Content is provided to the web rendering layer via a REST API, and a solution called Fast Agnostic Business Layer "allows different teams to create their own business logic," he said, so that different requirements are met while still sharing the same system for things like access control and caching. Clark didn't say much about how the content itself is stored, though he promised to return to this topic in future posts.
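The shape of such a business layer, as described, is a registry of per-team logic behind shared plumbing. The sketch below is a guess at the pattern, not the Fast Agnostic Business Layer itself; all names are invented, and a simple in-memory map stands in for real caching and access control.

```javascript
// Hypothetical sketch of a shared business layer: teams register their own
// content logic, while caching is handled once, centrally. Names invented.

const handlers = new Map();   // route -> team-supplied business logic
const cache = new Map();      // shared response cache

function register(route, fn) {
  handlers.set(route, fn);
}

// Shared entry point: every team's logic gets the same caching behaviour.
function fetchContent(route, params) {
  const key = route + ':' + JSON.stringify(params);
  if (cache.has(key)) return cache.get(key);   // served from shared cache
  const fn = handlers.get(route);
  if (!fn) throw new Error('no handler for ' + route);
  const result = fn(params);                   // team-specific logic runs
  cache.set(key, result);
  return result;
}
```

In use, a News team might call `register('news/article', ...)` and a Sport team `register('sport/fixtures', ...)`; both then get identical caching without writing any of it themselves, which is the point of sharing one system.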

The WebCore platform uses CI/CD to enable rapid iteration, and Clark showed an example monthly report showing 110 releases, or around three per day. Builds take around 3.5 minutes, and the average time from a pull request (a request to merge new code) to running it in production was one day and 23 minutes, in this particular month. On average 67 per cent of pull requests were actually merged into the code.


A small section of the HTML delivered for a news article today on the BBC site. A news aggregator says it is much harder to parse than before

Great work? Comments on Hacker News show that opinions vary. "Running a site the size of the BBC on Lambda is nothing short of an exuberant waste of a government-subsidized budget, it's absolutely crazy. Lambda VM time has a massive markup compared to regular compute... IMHO this is the epitome of serverless gone wrong," said one.

Another comment from John Leach, who runs a headline aggregation site called News Sniffer, said that the generated HTML is not easy to analyze. "I run the News Sniffer project which has to parse BBC News pages and I knew about this rollout a few weeks ago when the HTML all changed format completely and my parsers broke. As a side note, the new HTML is way more complicated and much harder to parse than before – I know the aim isn't to help parsing for content, but I was still saddened to see how it's ended up."

There is also curiosity about unanswered questions. What is the cost impact of moving from on-premises to AWS? What is the cost impact of Lambda versus using EC2? Why, if the caching and content delivery network is working as expected, are 2,000 Lambdas a second required?

We have asked the BBC for more details. ®

Updated at 16:02 UTC on 5 November 2020 to add

The BBC's Matthew Clark got in touch to say: "Although Lambda compute cost is higher than EC2, the amount you need is less, offsetting this." He added, somewhat mysteriously since EC2 can autoscale, that: "With EC2, we provision web servers with plenty of capacity to handle sudden traffic changes (e.g. due to breaking news). Whereas with Lambda, we only pay for what we actually use."
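Clark's argument, and the Hacker News objection to it, both fall out of the same back-of-envelope arithmetic. All figures below are invented for illustration, not the BBC's numbers: EC2 is billed on provisioned (peak) capacity whether busy or idle, while Lambda carries a per-unit markup but bills only actual use.

```javascript
// Toy cost model: provisioned EC2 vs pay-per-use Lambda. Invented figures.

const ec2PerUnit = 1.0;      // normalised EC2 cost per compute-unit
const lambdaMarkup = 3.0;    // assume Lambda costs ~3x per compute-unit

// EC2 bill scales with provisioned peak capacity, idle or not.
function ec2Cost(avgLoad, headroom) {
  return avgLoad * headroom * ec2PerUnit;
}

// Lambda bill scales with actual usage, at a marked-up unit price.
function lambdaCost(avgLoad) {
  return avgLoad * lambdaMarkup * ec2PerUnit;
}

// With 5x headroom held back for breaking-news spikes, Lambda wins:
//   ec2Cost(100, 5) = 500  vs  lambdaCost(100) = 300
// With only 2x headroom, the markup dominates and EC2 wins:
//   ec2Cost(100, 2) = 200  vs  lambdaCost(100) = 300
```

Which side of the break-even line the BBC sits on depends on how spiky its traffic really is, which is exactly the detail neither Clark nor his critics have published.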

To the question of why the org didn't use the opportunity of server-side rendering to deliver more human-readable HTML that would be better for parsing and accessibility tools, he responded: "The web page HTML looks different as it's largely generated by the framework used (React). The BBC has a very high bar for accessibility and performance, and we continue to test the site to ensure that it works well across browsers and screen readers." Lastly, we asked why, if the caching and content delivery network was working as expected, 2,000 Lambdas a second were required.

Clark claimed: "The Lambdas are essential at handling updates so that the site remains up to date. Each BBC page typically involves multiple simple Lambda executions - the majority of which complete in under 150ms."


Biting the hand that feeds IT © 1998–2020