Facebook said the billions of accounts it removed over the six-month period were attributed to "unsolicited" bad actors caught creating spam profiles, and that those accounts were "never considered active". The figures, Facebook said, did not count toward its total number of monthly users.
The transparency report also detailed the prevalence of hate speech, graphic photos and videos, and other abuse on the platform.
Between October and March, the social media giant said, it removed or labelled about 11.1 million pieces of terrorist content, 52.3 million instances of violent or graphic content, and 7.3 million posts, photos or other uploads containing hate speech.
The report also revealed, for the first time, the company's efforts to fight prohibited posts about guns, ammunition and drugs. Facebook said it removed about 1.4 million pieces of content that violated its rules against selling guns and ammunition, and about 1.5 million pieces of content about drugs, including marijuana.
"We're able to do things that are not possible for other companies to do," Facebook CEO Mark Zuckerberg told reporters, pointing to the company's heightened investments in safety and security. "When you look at it, we really need to decide what issues we think are the most important to address. In some ways, some of the remedies cut against each other."
Following the attacks on two mosques in Christchurch, New Zealand, which were broadcast live on Facebook, the platform has faced heavy criticism for failing to police online extremism.
Facebook says it will begin including data from its photo-sharing service, Instagram, in its quarterly transparency reports.