Cardiff researchers get £250k to monitor Brexit hate crime on Twitter

Pre-crime snoops study spread of cruel chatter

Cardiff University's Social Data Science Lab has been awarded a £250,000 grant to set up a centre to monitor “Brexit-related hate crime” on Twitter.

The lab – based in Wales, UK, and dubbed the Centre for Cyberhate Research and Policy – will develop “a monitoring tool that displays a live feed of the propagation of hate speech as it happens on Twitter.”

Cyberhate, a term coined by Dr Peter Burnap, co-director of the Social Data Science Lab, refers to a form of antagonism without reference to the legality of the speech, he told The Register. He added that the ultimate aim of the research is to help the government identify areas that require policy attention and improve "interventions to stop hate crime from spreading".

The £250,000 grant comes from the UK's Economic and Social Research Council, one of the nation's seven research councils that funnel taxpayers' cash to academics.

Professor Matthew Williams, the principal investigator on the project and co-director of the Social Data Science Lab at Cardiff University, said: “Hate crimes have been shown to cluster in time and tend to increase, sometimes significantly, in the aftermath of 'trigger' events. The referendum on the UK’s future in the European Union has galvanized certain prejudiced opinions held by a minority of people, resulting in a spate of hate crimes. Many of these crimes are taking place on social media.

“Over the coming period of uncertainty relating to the form of the UK’s exit, decision makers, particularly those responsible for minimising the risk of social disorder through community reassurance, local policing and online governance, will require near-real-time information on the likelihood of escalation of hateful content spread on social media. This new funding will provide the system and evidence needed to achieve this,” concluded Williams.

The team are collecting data over a 12-month period starting 23 June 2016, the date of the UK's referendum on whether to leave the European Union. The researchers will use "state-of-the-art machine learning technologies to classify, analyse, and evaluate tweets in real-time", with a particular focus on geolocated tweets to examine the spread of hateful chatter.
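The Lab has not published its model, but the general approach it describes – supervised classification of short texts – can be illustrated with a minimal bag-of-words Naive Bayes classifier. The training examples and labels below are invented placeholders, not the project's data or taxonomy:

```python
import math
from collections import Counter, defaultdict

# Minimal multinomial Naive Bayes text classifier: a sketch of the kind
# of supervised tweet classification the article describes, NOT the
# Lab's actual model. Labels and examples are illustrative only.

def tokenize(text):
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs. Returns model parameters."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in examples:
        class_counts[label] += 1
        for tok in tokenize(text):
            word_counts[label][tok] += 1
            vocab.add(tok)
    return class_counts, word_counts, vocab

def predict(model, text):
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        # log prior + log likelihood with add-one (Laplace) smoothing
        score = math.log(class_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        for tok in tokenize(text):
            score += math.log((word_counts[label][tok] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy training set with made-up placeholder labels.
model = train([
    ("go back home you lot", "antagonistic"),
    ("you people should leave", "antagonistic"),
    ("lovely weather in cardiff today", "neutral"),
    ("great football match last night", "neutral"),
])
print(predict(model, "you should go back"))  # -> antagonistic
```

Production systems would of course use far larger labelled corpora and richer features, but the pipeline shape – tokenise, count, score per class – is the same.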

The focus on Twitter does narrow the researchers' view, although the group has expertise in analysing the platform, having previously examined it as part of an $800k pre-crime grant from the US Department of Justice. Dr Burnap admitted as much to The Register, saying "this is the issue," but explained that using geolocation data to track the popularity of malicious tweets would enable the group to "zoom out" on the phenomenon.

The tool is intended to include a dashboard for policy makers and analysts that will provide details of "precursors to hate speech, such as type of social media user, characteristics of their network, the type of hate expressed, the content that is posted (such as URLs and hashtags) and external factors such as mass media reporting."
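Some of the content features named there, such as hashtags and URLs, are straightforward to pull out of tweet text. A minimal sketch, with field names that are illustrative rather than the project's actual schema:

```python
import re

# Extracting the kinds of tweet-content features the dashboard is said
# to surface (hashtags and URLs). Field names are assumptions for
# illustration, not the project's real schema.

HASHTAG_RE = re.compile(r"#\w+")
URL_RE = re.compile(r"https?://\S+")

def extract_features(tweet_text):
    return {
        "hashtags": HASHTAG_RE.findall(tweet_text),
        "urls": URL_RE.findall(tweet_text),
        "length": len(tweet_text),
    }

print(extract_features("Read this #Brexit https://example.com/story"))
```

Network and user-type features would require the tweet's metadata and follower graph rather than its text alone.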

Dr Burnap, who is the computational lead on the project, said: “To date the information available to government on topics such as hate speech around Brexit has been post-hoc and descriptive. What is needed are open and transparent methods that are replicable, interpretable and applicable in real-time as events are unfolding. We will be enhancing our existing language models using cutting edge computational methods to mine massive amounts of public reaction and provide meaningful insights into hateful and antagonistic commentary within minutes of an event occurring.” ®

Biting the hand that feeds IT © 1998–2022