Reg Lectures

The Higgs boson was first predicted by scientists in 1964, but it wasn't until 2012 that the existence of this fundamental particle could finally be proven. That was thanks to CERN's Large Hadron Collider (LHC).
Scientific understanding has, for centuries, been constrained by the volume of data that could be captured and analysed. That's changing in the era of big experiments such as the LHC - the world's largest scientific machine.
But, Houston, we have a problem. The LHC generates petabytes of data each second, and that poses a challenge for the engineering and technical teams who keep this vast lab running.
CERN's head of compute and monitoring, Tim Bell, popped by during a scheduled LHC shutdown and upgrade to discuss the infrastructure challenges faced by CERN and shared by some of the world's other big-science experiments.
Tim discussed the past, present and future of the LHC, and plans to expand the collider's computing capacity - currently 300,000 cores - by 50 per cent over the next two years to support new breakthroughs such as the exploration of dark matter.
You can catch Tim as he discusses the data challenges of big science while touring a physics experiment that knows only extremes. ®