Register Lecture: For AI's sake – taming the wild data frontier

Blighty's metrology experts define terms

Data is beginning to drive society and industry. Fleets of AI-driven vehicles are planned while major providers in health, banking and financial services seek to take vital decisions about you based simply on their clever analysis of streams and pools of data.

That's all fine until data is described incorrectly and errors get passed downstream. The capacity for mistakes and mishaps has never been greater, meaning there has never been a better time to agree a standard description of data, its quality and provenance.

Are we too late to introduce controls? And how do you get strong-willed and maverick actors like those in IT - famed for attempting to force their own de facto standards - to fall into line?

National Physical Laboratory’s head of digital Neil Stansfield delivered our January 2019 lecture on the nature of trust in data and why his organisation is participating in an ambitious project for a universally agreed, standard definition of "what data is".

NPL has helped define many fundamentals during its lifetime, including the SI units for the second and the metre.

Neil, meanwhile, runs NPL's work on digital technology, data, sensors, quantum technologies and graphene, and has previously led government programmes to identify and harness emerging and disruptive technologies.

You can enjoy Neil’s lecture above. Neil discussed the challenges of standardising data, the technical and international work that helped establish earlier fundamental standards such as length and time, and how this might provide a pathway to quantifying data. He talked about combining data flows, the trustworthiness of real-time data in AI, algorithms - and much more. ®
