Facebook open-sources its algorithm for finding violent videos

Facebook has decided to open-source the algorithm it uses to unearth violent or copyright-infringing videos. The code is on GitHub

The algorithm developed by Facebook in collaboration with the University of Modena and Reggio Emilia (Unimore) to unearth violent videos is becoming open source. The social media giant and the Italian university have decided to release the code so that others can collaborate on developing and improving the algorithm.

This algorithm, born from the collaboration between Facebook’s Paris-based FAIR (Facebook Artificial Intelligence Research) group and the Italian researchers, uses artificial intelligence to find matching segments across different videos in real time. Originally created mainly to detect copyright violations, the algorithm is also used by the social network to find videos containing images of violence, especially against minors, as well as terrorist propaganda. Until now the code has been proprietary, so it could not be used by anyone other than the university and Facebook. That is about to change.
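To give a rough sense of how this kind of matching works, here is a minimal, self-contained sketch in Python. It is not the released code: the toy frame format, the hashing scheme and the threshold are all invented for illustration. Each frame is reduced to a short binary hash, and frames from two videos are matched when their hashes nearly coincide.

```python
# Toy illustration of frame-level perceptual matching (NOT the
# released algorithm): each frame is reduced to a short binary hash,
# and frames of two videos are matched by Hamming distance.

def average_hash(pixels):
    """Hash a grayscale frame (a list of 0-255 values) into bits:
    each bit records whether a pixel exceeds the frame's mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Number of bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def shared_segments(video_a, video_b, max_distance=2):
    """Yield (i, j) pairs where frame i of video_a matches frame j
    of video_b within the distance threshold."""
    hashes_b = [average_hash(f) for f in video_b]
    for i, frame in enumerate(video_a):
        ha = average_hash(frame)
        for j, hb in enumerate(hashes_b):
            if hamming(ha, hb) <= max_distance:
                yield i, j

# A re-encoded copy of a frame keeps roughly the same hash.
original = [[10, 200, 10, 200, 10, 200, 10, 200, 10]]
reupload = [[12, 190, 15, 205, 8, 198, 11, 202, 9]]
print(list(shared_segments(original, reupload)))  # [(0, 0)]
```

The point of hashing rather than comparing raw pixels is robustness: a re-encoded or slightly edited copy of a video changes many pixel values but flips few bits of the hash, so near-duplicates can still be found quickly.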

Available on GitHub

The algorithm capable of identifying violent videos will be available to everyone on GitHub, so that developers large and small, as well as anti-violence organizations, can use it to compute the “hash” (a sort of digital fingerprint) of dangerous videos. Anyone will soon be able to use the code developed by Facebook and Unimore and, since it is open source, even modify and improve it. A nonprofit working to combat violence against women or children, for example, will be able to use the algorithm to scan social networks, forums and other platforms for videos that should be removed from the web.
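As a hypothetical example of what such an organization could do with these fingerprints, the sketch below screens an upload’s hash against a shared list of known-bad hashes. The 64-bit values and the tolerance threshold are made up for illustration.

```python
# Hypothetical screening of uploads against shared "fingerprints";
# the hash values and threshold below are invented for illustration.

KNOWN_BAD = {0xD1CEB00CAFE42F00, 0x0123456789ABCDEF}
MAX_BITS_DIFFERENT = 6  # tolerate small edits and re-encodes

def bit_distance(a: int, b: int) -> int:
    """Hamming distance between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def is_flagged(upload_hash: int) -> bool:
    """True if the upload sits within the tolerance of any
    known-bad fingerprint."""
    return any(bit_distance(upload_hash, bad) <= MAX_BITS_DIFFERENT
               for bad in KNOWN_BAD)

# A near-duplicate typically flips only a few bits of the hash.
print(is_flagged(0xD1CEB00CAFE42F01))  # True: 1 bit from a known hash
print(is_flagged(0x7777777777777777))  # False: far from every entry
```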

Sharing Technology

In a statement, Facebook explains why it decided to open-source the algorithm: “If we identify terrorist propaganda on our platforms, we remove and index it using a variety of techniques, including the algorithms we’re sharing today. Now we can share the ‘hashes’ with industry partners, including smaller companies, so they can remove the same content if it appears on their services.”
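Note that what gets exchanged between partners is the hashes, not the videos themselves. A minimal sketch of what such an exchange could look like follows; the JSON layout is an assumption for illustration, not Facebook’s actual partner format.

```python
# Hypothetical hash exchange between services; the JSON layout is an
# assumption for illustration, not Facebook's actual partner format.

import json

def export_hashes(removed_hashes, label):
    """Serialize fingerprints of removed content for partners."""
    return json.dumps({
        "label": label,
        "hashes": [format(h, "016x") for h in removed_hashes],
    })

def import_hashes(payload):
    """Load a partner's shared hash list back into integers."""
    return {int(h, 16) for h in json.loads(payload)["hashes"]}

shared = export_hashes({0xD1CEB00CAFE42F00}, "terrorist_propaganda")
print(import_hashes(shared))  # the partner can now screen its uploads
```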

All too often, a video removed by Facebook continues to circulate on other channels that lack the technical tools needed to automatically classify it as dangerous without a human review process.