
Facebook reportedly had evidence its algorithms were dividing people, but top executives killed or weakened proposed solutions (FB)

Facebook had internal research showing that its platform encouraged polarization, but Mark Zuckerberg and other top executives rejected ideas aimed at fixing the problem, The Wall Street Journal reported.

  • "Our algorithms exploit the human brain's attraction to divisiveness," one report concluded, according to The Wall Street Journal.
  • But Zuckerberg and policy chief Joel Kaplan repeatedly nixed proposed solutions, either because they feared appearing biased against conservatives or because they simply lost interest in solving the problem, The Wall Street Journal reported.
  • Facebook has come under increasing pressure to address toxic content and polarization on its platform during the coronavirus pandemic and with the looming 2020 presidential election.
  • Visit Business Insider's homepage for more stories.

Facebook had evidence suggesting that the company's algorithms encourage polarization and "exploit the human brain's attraction to divisiveness," but top executives including CEO Mark Zuckerberg killed or weakened proposed solutions, The Wall Street Journal reported Tuesday.

The effort to better understand Facebook's impact on user behavior started in response to the Cambridge Analytica scandal, and its internal researchers determined that, contrary to the company's mission of connecting the world, its products were having the opposite effect, according to the paper.

One 2016 report found "64% of all extremist group joins are due to our recommendation tools," with most people joining at the suggestion of Facebook's "Groups You Should Join" and "Discover" algorithms. Researchers noted that "our recommendation systems grow the problem," according to the paper.


The Wall Street Journal reported that Facebook teams pitched multiple fixes, including: limiting the spread of information from groups' most hyperactive and hyperpartisan users; suggesting a wider variety of groups than users might normally encounter; and creating subgroups for heated debates to prevent them from derailing entire groups.

However, Zuckerberg and policy chief Joel Kaplan often nixed these proposals entirely or significantly diluted them, according to the paper. The Journal reported that Zuckerberg eventually lost interest in trying to address the polarization problem and was concerned that potential solutions could limit user growth.

In response to the pitch about limiting the virality of hyperpartisan users' posts, Zuckerberg reportedly asked the team not to bring something like that to him again.

The company's researchers also determined that, because far-right accounts and pages made up a larger share of those publishing content on Facebook, any changes, including apolitical tweaks like reducing clickbait, would have a larger net effect on conservatives.

That concern particularly worried Kaplan, who had previously halted a project called "Common Ground" that aimed to encourage healthier political discourse on the platform.


Ultimately, many of the team's efforts weren't incorporated into Facebook's products, with managers telling employees in September 2018 that the company was pivoting "away from societal good to individual value," according to The Wall Street Journal.

"We've learned a lot since 2016 and are not the same company today," a Facebook spokeswoman told the paper. "We've built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve."

Facebook has come under fire repeatedly from critics who say the company hasn't done enough to limit the spread of harmful content on its platform. That topic has come into sharper focus due to coronavirus-related misinformation running rampant on social media platforms and the looming 2020 presidential election.
