
Facebook staffers fault own products

Some say false stories, hate amplified

COMPILED BY DEMOCRAT-GAZETTE STAFF FROM WIRE REPORTS

For years, Facebook has fought back against allegations that its platforms play an outsize role in the spread of false information and harmful content that has fueled conspiracies, political divisions and distrust in science, including covid-19 vaccines.

But research, analysis and commentary contained in a vast trove of internal documents indicate that the company’s own employees have studied and debated the issue of misinformation and harmful content at length, and many of them have reached the same conclusion: Facebook’s own products and policies make the problem worse.

In 2019, for instance, Facebook created a fake account for a fictional, 41-year-old North Carolina mom named Carol, who followed then-President Donald Trump and Fox News, to study misinformation and polarization risks in its recommendation systems. Within a day, the woman’s account was directed to “polarizing” content, and within a week to conspiracies including QAnon.

“The content in this account (followed primarily via various recommendation systems!) devolved to a quite troubling, polarizing state in an extremely short amount of time,” according to a Facebook memo analyzing the fictional U.S. woman’s account. When a similar experiment was conducted in India, a test account representing a 21-year-old woman was in short order directed to pictures of graphic violence and doctored images of Indian airstrikes in Pakistan.

Memos, reports, internal discussions and other examples contained in the documents suggest that some of Facebook’s core product features contribute to the spread of false and polarizing information globally and that suggestions to fix them can face significant internal challenges.

Facebook’s efforts to quell misinformation and harmful content, meanwhile, have sometimes been undercut by political considerations, the documents indicate.

“We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook and the family of apps are affecting societies around the world,” an employee noted in an internal discussion about a report titled “What is Collateral Damage?” “We also have compelling evidence that our core product mechanisms, such as virality, recommendations and optimizing for engagement, are a significant part of why these types of speech flourish on the platform.”

The documents were disclosed to the U.S. Securities and Exchange Commission and provided to Congress in redacted form by whistleblower Frances Haugen’s legal counsel. The redacted versions were obtained by a consortium of news organizations.

Some of the documents have been previously reported by the Wall Street Journal, BuzzFeed News and other media outlets.

Facebook Chief Executive Officer Mark Zuckerberg made only a brief mention Monday of what he called the “recent debate around our company.” Largely repeating statements he made after Haugen’s Oct. 5 testimony before a U.S. Senate subcommittee, he insisted that he welcomes “good-faith criticism” but considers the current storm a “coordinated effort” by news organizations to criticize the company based on leaked documents.

Haugen, meanwhile, told a British parliamentary committee Monday that the social media giant stokes online hate and extremism, fails to protect children from harmful content and lacks any incentive to fix the problems, lending momentum to European governments working on stricter regulation of tech companies.

‘MAKING HATE WORSE’

Haugen told the committee of United Kingdom lawmakers that Facebook’s Groups feature amplifies hate, saying algorithms that prioritize engagement take people with mainstream interests and push them to the extremes. The former Facebook data scientist said the company could add moderators to prevent groups over a certain size from being used to spread extremist views.

“Unquestionably, it’s making hate worse,” she said.

The increase in polarization predates social media, the company says, and despite serious academic research there is little consensus on social media’s role.

The documents reflect a company culture that values open debate and disagreement and is driven by the relentless collection and analysis of data. But the resulting output, which often lays bare the company’s shortcomings in stark terms, could create a serious challenge ahead: A whistleblower complaint filed with the SEC, which is included in the cache of documents, alleges that “Facebook knows that its products make hate speech and misinformation worse” and that it has misrepresented that fact repeatedly to investors and the public.

“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” a Facebook employee wrote on their final day in August 2020.

Facebook said in a statement that projects go through rigorous reviews and debates so the company can be confident in any potential changes and their impact on people. In the end, the company implemented many of the ideas raised in this story, according to the statement.

BIG JOB

Facebook has provided some details on ways it has succeeded at curbing misinformation.

For instance, it disabled more than 1.3 billion accounts between October and December 2020 amid the contentious U.S. presidential election. And over the past three years, the company removed more than 100 networks for coordinated inauthentic behavior, in which groups of pages or people work together to mislead users, said Guy Rosen, vice president of integrity.

And yet, aside from the challenges of trying to monitor a colossal volume of data, the company’s system for screening and removing false and potentially harmful claims has significant flaws, according to the documents.

For instance, political concerns can shape how Facebook reacts to false postings. In one September 2019 incident, a decision to remove a video posted by the anti-abortion group Live Action was overturned “after several calls from Republican senators.” The video, which claimed incorrectly that “abortion was never medically necessary,” was reposted after Facebook declared it “not eligible for fact-checking,” according to one of the documents.

Facebook employees repeatedly cite policies and products at Facebook that they believe have contributed to misinformation and harmful conduct.

For instance, employees have cited the fact that misinformation contained in comments to other posts is scrutinized far less carefully than the posts themselves, even though comments have a powerful sway over users.

Many of the employees’ suggestions pertain to Facebook’s algorithms.

CHANGE BACKFIRED

In 2018, the company changed the ranking for its News Feed to prioritize meaningful social interactions and deprioritize things like viral videos, according to its statement. That change led to a decrease in time spent on Facebook, according to the statement, which noted it wasn’t the kind of thing a company would do if it were simply trying to drive people to use the service more.

In internal surveys, Facebook users report that their experience on the platform has worsened since the change, and they say it doesn’t give them the kind of content they would prefer to see. Several tests by the company indicate that the change quickly led users to content supporting conspiracy theories or denigrating other groups.

“As long as we continue to optimize for overall engagement and not solely what we believe individual users will value, we have an obligation to consider what the effect of optimizing for business outcomes has on the societies we engage in,” one employee argued in a report called “We are Responsible for Viral Content,” in December 2019.

Similarly, after The New York Times published an op-ed in January 2021, shortly after the raid on the U.S. Capitol, explaining how Facebook’s algorithms entice users to share extreme views by rewarding them with likes and shares, an employee noted that the article mirrored other research and called it “a problematic side-effect of the architecture of Facebook as a whole.”
