Meta human rights report glosses over issues in India, puts blame on users
In its first human rights report, published earlier today, Meta, the parent company of Facebook, Instagram and WhatsApp, among others, sought to detail the steps it has taken to address the impact that its social networks have on human rights around the world. But while an external audit commissioned by the company found that Meta could have a clear impact on numerous issues affecting critical public affairs, the report remains ambiguous about what the company intends to do about those issues going forward.
India survey’s findings
As described by Meta, the impact assessment study conducted in India on its behalf by US-based law firm Foley Hoag LLP said that Facebook and its associated social platforms had the “potential” to be connected to myriad human rights issues “caused by third parties”. The issues Foley Hoag documented in India between March 2020 and June 2021 include “restrictions of freedom of expression and information, third party advocacy of hatred that incites hostility, discrimination or violence, rights to non-discrimination, and violations of rights to privacy and security of person.”
While the findings also listed hate speech and discriminatory content among the issues, Meta’s description pins responsibility on the “end user” for creating such content, rather than on shortcomings in its own content moderation and enforcement of community and content policies.
Meta further said that Foley Hoag’s assessment found a “difference” between how its own employees view its content policy and how those outside the company do. Issues regarding content policy on Facebook, Instagram and WhatsApp have featured in numerous public controversies, and in October 2021, the company’s India leadership received a summons from the Ministry of Electronics and Information Technology (MeitY) to explain allegations of uncontrolled hate speech and inflammatory, communally divisive content on the platform.
The assessment also noted numerous reports of bias in Facebook’s content moderation efforts, while adding that the company faces various challenges in India, including “user education, difficulties of reporting and reviewing content, and challenges of enforcing content policies across different languages.”
‘Back-patting, inadequate’
The issues that Meta has had in India have been widely reported. In November last year, a report by The New York Times cited an internal Facebook study from February 2019, which said that the lack of adequate moderation in the country meant that news feeds of users in India were “a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.”
India, incidentally, is Meta’s biggest market by number of users. As per January 2022 data from Statista, Facebook, Instagram and WhatsApp had 239 million, 231 million and 487 million users in India, respectively.
Anupam Shukla, partner at Mumbai-based law firm Pioneer Legal, said that Meta’s assessment of its issues in India is “extremely self-congratulatory, and back-patting in nature”. The bigger issue, Shukla said, is that the report takes no onus for, nor initiative on, the risks it highlights.
“While Meta mentions the issues that its India survey found, it largely states that it is not to be blamed for most of it – and puts the blame on its users. While it did not have any legal obligation as such to disclose its human rights actions, it fails to meet expectations in terms of what its responsibility is – to the millions of people who use its platforms,” Shukla said.
Meta’s actions
In response, Meta said in its report that it has “taken steps” to build a larger human rights team in India, and that it “aims to ensure there is appropriate regional expertise” in such a team.
“In addition, in 2020-2022, Meta significantly increased its India-related content moderation workforce and language support. As of the time of writing, Meta had reviewers across 21 Indian languages, with both language and cultural expertise, as well as multiple classifiers. Translating widely used languages has substantial practical benefits for our users,” Meta said in the human rights report.
The company also cited its transparency reports in India and the end-to-end encryption offered in its chat apps as actions it has taken to “try to prevent or mitigate acts of abuse”. However, Miranda Sissons, director of human rights at Meta, told The Wall Street Journal that the company has no plans to release the full India study, at least for now.