Worried we'll make ourselves extinct? Let’s be scientific about it

Register Lecture to calculate Existential Risk of AI, bio-tech and more

If you’ve got a nagging feeling that the emergence of autonomous weapons, bio-tech, all-knowing computers, untracked asteroids, and the breakdown of political norms is all a bit of a worry, congratulations. You’re aware of some of the key existential risks facing us all.

But worrying might not be enough. If we want to avoid that risk, or at least manage it, we need to be able to quantify it before we can act.

Luckily, Dr Adrian Currie of Cambridge’s Centre for the Study of Existential Risk is joining us for a Register Lecture on April 25 to discuss how we can develop a science of existential risk.

As Adrian puts it, existential risks are threats to the very existence of the human species. Some, like meteor strikes, massive volcanic eruptions and climate change, leave traces for us to study. Others, such as the technological developments that have given our species unprecedented effects on a global level, are trickier.

Either way, if we want to reap the benefits of AI, automation, synthetic biology, advanced gene-editing techniques and so on without, well, imperilling our very existence, we need to find a way of understanding, communicating and minimizing those risks.

Adrian believes that a science of existential risk must be speculative and creative: “Which means we need to rethink what science looks like, and perhaps the role of scientists in society.”

This journey into the future begins at the Yorkshire Grey on Theobalds Road, London, on April 25. Doors will be open from 6.30pm, with the lecture proper starting at 7pm. As ever, refreshments of the liquid and solid variety will be available.

We’ll break for a drink and a bite following Adrian’s presentation, after which the floor will be open to questions. It promises to be a fascinating evening, and we look forward to seeing you there.

You can get full details and buy tickets here. ®
