More than a year ago, Facebook unveiled a new technology to detect abusive content.

Microsoft, more recently, unveiled its “Artemis” project to protect children from abuse on the internet, and major social platforms are reporting and deleting a growing number of videos and images depicting sexual abuse, particularly of minors.

164 companies reported child sexual abuse material

According to John Shehan, Vice President of the National Center for Missing and Exploited Children, “These figures show that any service provider who allows individuals to host images and videos is likely to be hosting child sexual exploitation material.”

Overall, no fewer than 164 companies reported content. Among them, Google reported more than 3.5 million images and videos, Yahoo more than 2 million, and Imgur more than 260,000. Dropbox, Microsoft, Snap, and Twitter each reported more than 100,000 last year.

Apple, on the other hand, reported just over 3,000 items in total, all of them images; no videos were reported. This low figure reflects how difficult it is for the company to analyze content sent through its encrypted messaging application.

Responding to the companies’ reports, Alex Stamos, formerly head of information security at both Facebook and Yahoo, said: “If all the companies involved were looking for this content as aggressively as Facebook, the number of reports could be 50 million, or even 100 million.” In other words, far too much content still goes unreported by companies that lack sufficiently powerful detection tools.

70 million images and videos reported in 2019

As the New York Times reports, technology platforms remain infested with illegal content. In 2019, the volume of reported content increased by more than 50%, with a total of 70 million images and videos reported to the National Center for Missing and Exploited Children, the clearinghouse designated by the US federal government to work with law enforcement.

Video is a format that platforms have become better at detecting; it is also the format most popular with predators. As a result, 41 million videos were reported in 2019, far more than five years earlier, when there were fewer than 350,000 reports.

In 2019, Facebook alone detected 60 million pieces of content, a mix of photos and videos, representing 85% of all reports. This reflects two things: the social network’s vast user base, and its aggressive approach to limiting abusive content, an approach that nevertheless does not eliminate predator activity. For Antigone Davis, Facebook’s global head of safety, it is essential to keep developing better solutions to keep more children safe.