Facebook whistleblower “shocked” by company investment in metaverse


Facebook whistleblower Frances Haugen was “shocked” when she learned that the company was planning to hire 10,000 engineers in Europe to work on the metaverse, a version of the Internet based on virtual and augmented reality, when the money would be better spent on safety, she told British lawmakers Monday.

“I was shocked to learn that Facebook is doubling down on the metaverse and that they are going to hire 10,000 engineers in Europe to work on the metaverse,” she said. “Because I was like, ‘Wow, do you know what we could have done with safety if we had 10,000 more engineers? It would have been incredible.’”

Haugen, 37, a former product manager on Facebook’s civic misinformation team, made the statement while testifying before a joint committee convened to discuss the Online Safety Bill, legislation aimed at regulating social media. Her testimony will inform the committee’s work on developing the bill, which is due to be submitted to Parliament for approval in 2022.

The bill would hold social media companies accountable for harmful content shared on their platforms, subjecting them to significant fines in the UK if they fail to remove material such as child sexual abuse and terrorist content.

“There is a view within the company that safety is a cost center, not a growth center, which I think is very short-term thinking, because Facebook’s own research has shown that when people have worse integrity experiences on the site, they are less likely to be retained,” Haugen added.

Facebook spokesperson Drew Pusateri noted that the company has invested heavily in efforts to improve its platform.

“This criticism makes no sense on its face, given that almost all of the documents deal with the details of the work people are doing to better understand how to meet our challenges,” Pusateri said in an email.

Facebook CEO Mark Zuckerberg has pushed back against claims that the company prioritizes growth over safety.

“We care deeply about issues such as safety, well-being and mental health,” he said in a Facebook post earlier this month after Haugen testified before US lawmakers. “If we wanted to ignore research, why would we create a cutting edge research program to understand these important issues in the first place?”

Facebook told NBC News that it has spent $13 billion since 2016 to fight bad content and that it employs 40,000 people to work on safety and security.

Monday’s hearing took place the same day a consortium of 17 media outlets, including NBC News, released dozens of reports based on tens of thousands of pages of leaked internal documents obtained by Haugen, which included disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by her legal counsel.

Haugen attributed many of Facebook’s failures to effectively police its platform to the company’s “culture of positivity.”

“There is a philosophy that people focus on the good,” she said. “And that’s not always a bad thing, but the problem is, when it’s so intense that it discourages people from looking into tough questions, it becomes dangerous.”

She said Silicon Valley-based Facebook employees are likely to check their own Facebook news feeds and see “a nice place where people are nice to each other,” while not seeing the harm caused to society elsewhere in the world, such as in Ethiopia, where leaked documents show the platform is being used to spread hate speech and incite violence without adequate content moderation in multiple local languages.

While researchers on Facebook’s integrity teams are keenly aware of the harms amplified by the platform, that information doesn’t necessarily reach company executives, Haugen said.

“The good news is coming but not the bad,” she said. “Executives see all the good they generate, and they can write off the bad as the cost of doing all the good.”

She added that the voices of employees highlighting harms and calling for mitigation strategies “are not amplified internally because they slow down the growth of the business a bit, and it’s a business that favors growth.”

“Frances Haugen’s evidence so far has strengthened the case for an independent regulator with the power to audit and inspect large tech companies,” said Damian Collins, UK lawmaker and chairman of the joint committee, in a press release issued before the hearing. “There needs to be more transparency about the decisions companies like Facebook make when trading user safety for user engagement.”

During Haugen’s appearance on Capitol Hill on October 5, she called on lawmakers to demand more transparency from the social media giant.

“I believe Facebook’s products harm children, fuel division, weaken our democracy and more,” she told the Senate Commerce Subcommittee on Consumer Protection.