A whistleblower who leaked a selection of internal documents from Facebook has claimed that "Facebook, over and over again, has shown it chooses profit over safety". Her revelations coincided with a global outage that affected Facebook, Instagram, and WhatsApp, which came back online on Tuesday after almost six hours.

A cache of documents from within Facebook was leaked to the Wall Street Journal by a whistleblower, prompting an exhaustive probe into the social network.


Following the publication of initial investigative reports based on the documents, the whistleblower, Frances Haugen, came forward in an interview, explaining more about the tech giant's workings and why she chose to release the data, reports AppleInsider.

That data release led to a series of reports, including one that revealed Facebook allegedly knew that Instagram was bad for the well-being of teenagers. That report led to a hearing with the Senate Commerce Committee's consumer protection subcommittee on mental health in teenagers.

Shocking revelations by whistleblower

In an interview with 60 Minutes on Sunday, Haugen doubled down on her claim that the document trove proves Facebook cared more about its algorithms than about dealing with hate speech and other toxic content, the report said.

Haugen was formerly a product manager at the company, working in its Civic Integrity Group, but departed after the group was dissolved.


Before her departure, she copied tens of thousands of pages of internal research, which she claims demonstrate that Facebook lies to the public about its progress against hate, violence, and misinformation.

"I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves into the ground," said Haugen over her decision to release the documents.

"At some point in 2021, I realised Ok, I'm gonna have to do this in a systemic way, and I have to get out enough that no-one can question that this is real," the data scientist said.

The documents included studies performed internally by Facebook about its services, with one determining that Facebook had failed to act on hateful content.

(With inputs from IANS)