A former Facebook Inc employee has said the social media giant’s products harm the mental health of some young users, stoke divisions and weaken democracy, urging United States lawmakers to regulate the company.
Whistleblower Frances Haugen told a US Senate subcommittee on Tuesday that Facebook has repeatedly misled the public about the damage it knows teenage girls suffer from its photo-sharing app Instagram, as well as how its products fuel division.
“I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said in a statement before her testimony on Capitol Hill.
“Congressional action is needed. They won’t solve this crisis without your help.”
Her testimony came a day after Facebook and two of its main services, Instagram and messaging app WhatsApp, suffered an hours-long global outage, and after weeks of mounting pressure on the social media company to explain its policies for young users.
Haugen went public in an interview with CBS on October 3 and revealed she was the one who provided documents used in a Wall Street Journal investigation and a Senate hearing on Instagram’s alleged harm.
The WSJ stories showed the company contributed to increased polarisation online when it made changes to its content algorithm, failed to take steps to reduce vaccine hesitancy, and was aware that Instagram harmed the mental health of teenage girls.
“As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable,” Haugen told the panel on Tuesday.
“Until the incentives change, Facebook will not change. Left alone, Facebook will continue to make choices that go against the common good,” she said. “Facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of their system.”
Facebook spokesman Kevin McAlister said in an email to the Reuters news agency that the company sees protecting its community as more important than maximising profits.
He also said it was not accurate to say that leaked internal research demonstrated that Instagram was “toxic” for teenage girls.
That echoed testimony Facebook’s head of global safety, Antigone Davis, delivered before the same Senate committee last week. “We care deeply about the safety and security of the people on our platform,” Davis said at that time.
“We take the issue very seriously … We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17.”
But at Tuesday’s hearing, US senators accused Facebook CEO Mark Zuckerberg of pushing for higher profits while being cavalier about user safety. They also demanded that US regulators investigate Haugen’s accusations that the company’s products harm children and stoke divisions.
In an era of deep political divisions in Washington, DC, both Republican and Democratic lawmakers agreed on the need for big changes.
In an opening statement, Democratic Senator Richard Blumenthal, who chairs the subcommittee holding the hearing, said Facebook knew that its products were addictive, like cigarettes.
“Tech now faces that Big Tobacco jaw-dropping moment of truth,” Blumenthal said.
He called for Zuckerberg to come before the committee, and for the Securities and Exchange Commission and Federal Trade Commission to investigate the company.
“Our children are the ones who are victims. Teens today looking in the mirror feel doubt and insecurity. Mark Zuckerberg ought to be looking at himself in the mirror,” Blumenthal said.
Senator Marsha Blackburn, the top Republican on the subcommittee, said that Facebook had turned a blind eye to children under the age of 13 on its sites. “It is clear that Facebook prioritises profit over the wellbeing of children and all users,” Blackburn said.
Al Jazeera’s Shihab Rattansi, reporting from Capitol Hill, said regulating content on Facebook and other social media platforms would be tricky for Congress, however, due to the US’s First Amendment protections of free speech.
“The question becomes, ‘Well what criteria will be used and who will have oversight of that’,” Rattansi said.
Still, Jason Kint, CEO of the Digital Content Next trade organisation, said Tuesday’s hearing was significant. “What’s different about this moment is we have evidence coming from inside the building,” he told Al Jazeera.
“What this hearing provides is that evidence that they knew and that there was actual empirical data supporting all of these downstream harms of the way the platform works.”