
We've found it! A cloud-and-AI angle on the royal wedding

Harry and Meghan's guests to be ID'd in real time by AWS vid-scrapers, pumped into apps

The wedding of Prince Harry and Meghan Markle will be brought to the world with the help of cloudy machine learning.

As AWS’ video processing limb Elemental recently revealed, “A video feed from an outside broadcast van located near St. George’s Chapel will capture faces of arriving guests and feed the signal to an AWS Elemental Live small form factor appliance located nearby for real-time ingest to an entirely cloud-based workflow.”

That video will be analysed by AWS’ cloudy-celebrity-spotting-as-a-service thing we noticed and ridiculed back in 2017.

The results will be there for all to see in a Sky News app that will “automatically highlight celebrities as they appear on screen and immediately present details about their connection to the royal couple with on-screen captions and graphics.”
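The Register hasn't seen Sky's code, but for readers wondering what the plumbing might look like, here's a minimal sketch. It assumes the unnamed celebrity-spotter is Amazon Rekognition's RecognizeCelebrities API (a real service); the file names, confidence threshold, and caption handling are entirely hypothetical.

```python
# Hypothetical sketch: spot celebrities in a single video frame using
# Amazon Rekognition's RecognizeCelebrities API. The API call is real;
# the frame source, threshold and caption plumbing are made up.
import boto3


def spot_celebrities(frame_path: str, min_confidence: float = 90.0) -> list[dict]:
    """Return name/confidence/link details for celebrities Rekognition finds in a JPEG frame."""
    rekognition = boto3.client("rekognition")  # needs AWS credentials configured

    with open(frame_path, "rb") as f:
        frame_bytes = f.read()

    response = rekognition.recognize_celebrities(Image={"Bytes": frame_bytes})

    captions = []
    for celeb in response.get("CelebrityFaces", []):
        if celeb["MatchConfidence"] >= min_confidence:
            captions.append({
                "name": celeb["Name"],
                "confidence": celeb["MatchConfidence"],
                "links": celeb.get("Urls", []),  # e.g. IMDb pages
            })
    return captions


if __name__ == "__main__":
    # e.g. a frame grabbed from the outside-broadcast feed near the chapel
    for caption in spot_celebrities("chapel_arrivals_frame.jpg"):
        print(f"{caption['name']} ({caption['confidence']:.0f}%): {caption['links']}")
```

Presumably something similar runs against the live feed, with the returned names matched to a curated list of the couple's connections before the captions hit viewers' screens.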

The Register thinks this is a breakthrough moment for the cloud, as readers who struggle to explain what they do at work will at last have a relatable example of what it is to work in IT these days.

“The cloud and AI are for identifying faded pop stars made unrecognisable by plastic surgery” will surely satisfy many. “The cloud means I can rent all the computers needed to supply pointlessly curious people with meaningless information” could also be a useful way to explain the cloud.

Vulture South, the source of this story and a resident of a former colony, has no explanation for the ongoing popularity of the monarchy. But that hasn't stopped us trying to cash in on it with this story. ®
