Facebook And Instagram Removed More Than 12 Million Pieces Of Child Porn


Facebook removed tens of millions of posts, photos and videos over the past six months for violating its terms of service, which prohibit child pornography, drug and gun sales, and terrorist content.

Photo: Depositphotos.com/InkDropCreative

The company revealed its sweeping efforts to police its own network in its latest biannual transparency report, which includes data from Instagram for the first time. Facebook says it identifies most of the content automatically using artificial intelligence and advanced software before users can ever see it.

Perhaps the most shocking detail in the report is the dramatic increase in the removal of child pornography. Facebook says that it removed about 11.6 million pieces of content it deemed to be child nudity and sexual exploitation of children in the third quarter, roughly double the number of pieces of content removed for the same reason in the first quarter.

On Instagram, it removed 1.2 million photos and videos involving child nudity or exploitation over the second and third quarters.

“Just because we’re reporting big numbers doesn’t mean there’s so much more harmful content happening on our services than others. ... What it says is we’re working harder to identify this and take action on it.” – Mark Zuckerberg

Facebook CEO Mark Zuckerberg has faced increasing scrutiny over his failed efforts to curtail the spread of child exploitation on Facebook, criticism that came to a head at a congressional hearing last month. Representative Ann Wagner (R-MO) scolded Zuckerberg at the hearing, saying, “You are not working hard enough, and end-to-end encryption is not going to help the problem.”

“We work harder than any other company to identify this behavior,” Zuckerberg said in response.

FBI Director Christopher Wray has also been critical of Facebook’s plan to encrypt all of its users’ messages in the near future, saying it would be a “dream come true” for child pornographers.

The New York Times reported in September that Facebook Messenger was responsible for nearly 12 million of the 18.4 million reports of child sexual abuse material worldwide.

During a press call with reporters on Wednesday, Zuckerberg responded to concerns about the vast amount of child exploitation content found on the company’s networks, arguing that the large number of removals was actually a positive sign for the industry and society.

“There’s this inverted incentive to look at the numbers that we’re putting out and come to the conclusion that just because we’re reporting big numbers, there’s so much more harmful content on our services than others,” Zuckerberg said. “But what it says is we’re working harder to identify and take action on this [type of content] more than others are.”

Zuckerberg also defended his decision to encrypt all user messages across Facebook services when he was asked about how it would affect the company’s ability to police content.

“At a high level you’re pointing to a real tension, which is that encryption makes it harder for us to see some of the content in the services,” he said. “But the encryption we’re proposing wouldn’t make it any harder for us to prevent any of the content [that violates our terms of service].”

He said content violations can often be detected through behavior patterns, such as who is messaging whom. “We can look at patterns of bad activity,” he said. “We look at the patterns of activity of the accounts.”

Michael Nuñez, Forbes Staff
