The Indian government has asked Meta, the parent company of social media platforms including Facebook, Instagram and WhatsApp, to complete content takedown orders within one hour of receiving instructions, Moneycontrol reported.
The Ministry of Electronics and Information Technology (MeitY), citing the example of Silicon Valley leader Google, told Meta that it wants faster and higher compliance with content removal. Meta was also asked to add more fact-checkers, given the size of its user base in the country.
Sources said that the discussion took place last week, when Sir Nick Clegg, Meta's president of global affairs, met with Union IT Minister Ashwini Vaishnaw and other senior officials from MeitY to discuss India's internet policy.
People aware of the development told Moneycontrol, “Clegg and others were evidently taken by a bit of surprise when Vaishnaw said that Google was doing better than Meta in terms of compliance with takedown requests. They had the view that Meta was in full adherence to the takedown norms under the IT Rules amendments of last year.”
In 2021, the government issued a set of guidelines governing the removal of content from social media. Under those rules, platforms were expected to remove content within 36 hours of receiving instructions from a government body or court. The government now wants action to be taken much faster than that, the report highlighted.
“We deeply appreciate the opportunity to discuss how Meta can work together with the government to achieve India's Techade goals. Since these were closed-door meetings, we would not be able to share more information,” the source added.
This is the second time in as many weeks that Meta has been asked by a country's government to reassess its content takedown policies.
Two weeks ago, the United Kingdom's government also told Meta to overhaul its process for content takedowns based on removal requests from state entities such as the police. This development came after the company's Oversight Board ruled that it was wrong to ban a music video on its platform, which UK law enforcement authorities argued could "contribute a risk to offline harm".
Meta's Oversight Board, created last year as an independent body to hear appeals against content takedown decisions made by the company on Facebook and Instagram, is the brainchild of former UK deputy prime minister Sir Nick Clegg, who is currently a senior Meta executive. It comprises journalists, academics and politicians who oversee high-profile content moderation cases to protect free speech while curbing hate speech and damaging content.
In its latest transparency report released in August 2022, the Oversight Board claimed that the number of appeals challenging content moderation decisions by Facebook and Instagram grew by 66 per cent in the January to March quarter of 2022.
A total of 480,000 cases were filed with the Board during this period, up from 288,440 cases in the October to December quarter of 2021. The body added that in 14 of the 20 cases it reviewed, Meta acknowledged that its "original decision was incorrect."