Helping prevent the spread of misinformation

When I worked at Full Fact I spearheaded our automated fact checking and AI work. This was an attempt to dramatically scale the work of fact checkers around the world. How could we use technology that exists today to supercharge the journalists and researchers on the front lines of the misinformation war?

Since 2015, our goal at Full Fact was to create a global collaborative effort to help media outlets, civil society, platforms and public policy makers better understand the misinformation landscape, and to bring the benefits of these tools to everyone by working in partnership.

We launched our roadmap, The State of Automated Fact Checking, in August 2016, setting out a plan for making fact checking dramatically more effective using existing technology. In autumn of that year we became one of the first UK organisations to use the “Fact Check” label in Google News.

In November 2016, we announced support from Google’s Digital News Initiative for the first stages of our automated fact checking work. This funding helped build our first prototypes. In May 2019 we – along with Africa Check, Chequeado and the Open Data Institute – won the Google AI Impact Challenge, one of just 20 international winners chosen from more than 2,600 entrants. Over the next three years, with Google’s support, we used machine learning to dramatically improve and scale fact checking, working with international experts to define how artificial intelligence could transform this work, to develop new tools, and to deploy and evaluate them.

I left Full Fact in 2020, but this work is being continued by Andy Dudfield, who now leads the smart and brilliant technology team there.

What did you build?

At Full Fact we built the infrastructure to monitor claims made in the public sphere. Within that, we apply a few algorithms we developed:

  1. Claim detection algorithm – this separates out factual statements from other sentences.
  2. Claim matching algorithm – this matches new sentences to potential fact checks, so if a claim has already been checked before, you’ll know.
  3. Automated checking – certain sentences (currently in the UK) are parsed out and automatically checked against official statistics.

On top of these services, we have built a suite of products to help fact checkers. These are now being used around the world.

What do fact checkers need?

Phoebe Arnold at Full Fact wrote a fantastic report about the challenges of online fact checking [PDF] and the technology needs of fact checkers worldwide. She interviewed 19 fact checking organisations around the world, and found the areas where each of them particularly needed help.

Source: Challenges of online fact checking, Phoebe Arnold, Full Fact, 2020

Some of our tools meet these needs, but as you can see there is a lot more to be done, and some of the product changes are directly in the hands of technology companies, or governments who might want to make these changes a necessity.

In the news

In its early days the automated fact checking project collected a fair bit of press. Here are some of my favourites.

  • My interview on BBC More or Less with Tim Harford.
  • BBC Click: Full Fact talks automated fact checking
  • Wired: Google is helping Full Fact create an automated, real-time fact-checker
  • The Guardian: Journalists to use ‘immune system’ software against fake news
  • TechCrunch: Full Fact aims to end fake news with automated fact checking tools
  • The Guardian: Fake news clampdown: Google gives €150,000 to fact-checking projects
  • Engadget: Full Fact wants to automate fact checking to fight fake news
  • Independent: Google funds automated fact-checking software in bid to fight fake news
  • Poynter: Full Fact has developed and is using an inward-facing automated fact checking platform
  • Nieman Lab: Fact-checking and data-driven projects among winners of Google’s Digital News Initiative funding