Facebook issues $100K challenge to build an AI that can identify hateful memes

  • May 21, 2022
  • Angela King

Memes are now an integral part of how people communicate on the internet. While many memes can brighten your day, plenty of them are hateful and discriminatory.

At the same time, AI models trained primarily on text to detect hate speech struggle to identify hateful memes. So, Facebook is launching a new $100,000 challenge for developers to create models that can recognize hateful images and memes.

As part of the challenge, Facebook said it’ll provide developers with a dataset of 10,000 ‘hateful’ images licensed from Getty Images.

In a blog post, the company explained that creating an AI model to detect hateful memes is a multimodal problem. The model has to look at the text, look at the image, and then consider the context in which they are used in conjunction. Facebook said that annotators have ensured that examples in the dataset create a multimodal problem for the AI to solve. So, some of the existing models for text or image detection might not work out of the box.
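To make the multimodal idea concrete, here is a minimal, purely illustrative sketch of "early fusion": embeddings from a text encoder and an image encoder are concatenated and passed through a single classifier, so the score depends on both modalities jointly. The encoders below are toy placeholders (a real system would use pretrained language and vision models), and all names and dimensions are assumptions for illustration, not Facebook's actual method.

```python
import math
import random

def embed_text(text, dim=8):
    # Toy text encoder: hashes character codes into a fixed-size vector.
    # Placeholder for a pretrained language model embedding.
    vec = [0.0] * dim
    for i, ch in enumerate(text):
        vec[i % dim] += ord(ch) / 100.0
    return vec

def embed_image(pixels, dim=8):
    # Toy image encoder: mean-pools flattened pixel values into `dim` buckets.
    # Placeholder for a pretrained vision model embedding.
    flat = [p for row in pixels for p in row]
    size = max(1, math.ceil(len(flat) / dim))
    vec = []
    for i in range(dim):
        chunk = flat[i * size:(i + 1) * size]
        vec.append(sum(chunk) / len(chunk) if chunk else 0.0)
    return vec

def fuse_and_score(text, pixels, weights):
    # Early fusion: concatenate both modality embeddings, then apply a
    # linear scorer with a sigmoid to get a probability-like output.
    features = embed_text(text) + embed_image(pixels)
    z = sum(f * w for f, w in zip(features, weights))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical usage: a 16-dim weight vector (8 text dims + 8 image dims).
random.seed(0)
weights = [random.gauss(0, 1) for _ in range(16)]
score = fuse_and_score("example caption", [[10, 20], [30, 40]], weights)
print(0.0 <= score <= 1.0)  # score lies in (0, 1)
```

The point of the sketch is that neither `embed_text` nor `embed_image` alone determines the score: a benign caption on a benign image can fuse into something hateful, which is exactly why text-only or image-only detectors fall short on memes.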

Facebook is careful to open up this dataset only to approved researchers. The company said the dataset contains memes of a sensitive nature, spanning several categories that are often reported on social media.

Detecting hate speech is a difficult problem for Facebook and other social networks. Memes add an extra layer of complexity, as moderators and AI have to understand the context of the posted meme. Companies can’t apply a one-size-fits-all solution because the cultural, racial, and language-based context of memes changes very frequently.

While this challenge might not produce a ready-made solution for the social network giant, it might give the company some ideas about how to solve this problem.

You can learn more about the competition here, and you can read the accompanying paper describing methods and benchmarks here. Selected researchers will present their papers at NeurIPS 2020 in December.