Facebook had long been aware of the incendiary anti-Muslim content posted on its platforms in India by right-wing Hindu nationalist groups but turned a blind eye to it, whistleblower and former Facebook employee Frances Haugen has revealed.
Citing internal company documents, she said that Facebook had categorized content from Rashtriya Swayamsevak Sangh (RSS) users, groups, and pages as “fear-mongering content.”
However, after taking into account the “political consequences,” Facebook decided against applying that designation to RSS users, groups, and pages and against flagging the incendiary content shared against Muslims.
Had Facebook taken the difficult decision to categorize RSS-related content despite the political repercussions, the designation would have resulted in increased monitoring of RSS users, groups, and pages across all of its platforms.
Citing one example from the internal company documents, Haugen said that Facebook had identified a number of anti-Muslim posts on its platforms that compared Muslims to ‘pigs’ and ‘dogs’ and spread misinformation claiming that the Quran calls for ‘men to rape their female family members.’ Facebook nevertheless decided not to take any action against such posts.
Complaints Filed Against Facebook
Frances Haugen has filed eight complaints against Facebook with the US Securities and Exchange Commission. All of her complaints are based on thousands of internal company documents that she secretly copied before leaving Facebook in May this year.
Four of the eight complaints are related to content linked with India. Facebook has classified only three countries as “Tier-0,” meaning the company is required to closely monitor the content shared on its platforms in those countries during important election events. India is one of them, alongside the US and Brazil.
However, 87% of the Facebook resources available for this purpose are allocated to the US, while the remaining 13% covers the rest of the world.
Despite allocating most of its resources to monitoring content in the US, Facebook failed to curb the misinformation and violent extremism linked with the 2020 US presidential election and the Capitol riot in January this year.
In response to the complaints filed by Frances Haugen, Facebook’s Director of Policy Communications, Lena Pietsch, said that Facebook must balance protecting the right of billions of people to express themselves openly with the need to keep its platforms safe.
She noted that Facebook is making improvements to tackle the spread of fake news and harmful content, adding that the suggestion that Facebook encourages hateful content and does nothing about it is simply untrue.