Meta, the parent company of Facebook and Instagram, has released its monthly compliance report for India for the month of November. The company revealed that it took down over 23 million pieces of content across the two platforms during the month.
Notably, the company removed 37 million pieces of violating content in the preceding month, so the November figure represents a significant month-on-month decrease.
The report also details the content moderation efforts across 13 policies for Facebook and 12 for Instagram.
It also highlights Meta's response to user reports received through the Indian grievance mechanism.
Content removal on Facebook
According to the company, over 18.3 million pieces of content were removed from Facebook. The platform received 21,149 reports and provided tools that helped users resolve 10,710 of those issues.
Specialised Review: Of the remaining 10,739 reports, 4,538 required specialised review and resulted in content removal.
Content removal on Instagram
On Instagram, over 4.7 million pieces of content were removed. The photo- and video-sharing platform received 11,138 reports and provided tools that helped users resolve 4,209 of those issues.
Specialised Review: Of the remaining 6,929 reports, 4,107 required specialised review and resulted in content removal.
“Of the other 6,929 reports where specialised review was needed, we reviewed content as per our policies, and we took action on 4,107 reports in total. The remaining 2,822 reports were reviewed but may not have been actioned,” it said.
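The Instagram breakdown the report describes can be checked with simple arithmetic (a minimal sketch; the variable names are ours, not Meta's):

```python
# Instagram figures from Meta's November compliance report for India
total_reports = 11_138       # user reports received via the grievance mechanism
resolved_with_tools = 4_209  # issues users resolved with the tools provided
actioned = 4_107             # reports actioned after specialised review

remaining = total_reports - resolved_with_tools
not_actioned = remaining - actioned

print(remaining)     # 6929, matching the "remaining 6,929 reports"
print(not_actioned)  # 2822, matching the reports reviewed but not actioned
```

The same three-step funnel (reports received, issues resolved with tools, specialised review) applies to the Facebook figures above.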
Meta emphasised that the removal figures cover content actioned for violating its standards, whether taken down outright or covered with a warning for disturbing material.
Under India's IT Rules 2021, large social media companies, including Meta and X (formerly Twitter), must publish a compliance report every month.