Facebook

"Among Millennials, Facebook is far and away the most common source for news about government and politics."

- Amy Mitchell, Jeffrey Gottfried, and Katerina Eva Matsa, Facebook Top Source for Political News Among Millennials, Pew Research Center, June 1, 2015


Facebook algorithms are partly programmed and partly based on machine learning, which explains why the company had to carry out research to learn that they are effectively promoting divisive content to users…

The primary job of the algorithms is to maximize user engagement, so they highlight content that achieves this and reduce the visibility of content that doesn’t.

An internal investigation found that one of the unintended consequences of this was that users were actively exposed to sensational and polarized content, as that drove them to respond. The Wall Street Journal reports that senior executives at the social network were asked to take action to limit visibility of divisive content, but chose not to do so.
    A Facebook team had a blunt message for senior executives. The company’s algorithms weren’t bringing people together. They were driving people apart.

    ‘Our algorithms exploit the human brain’s attraction to divisiveness,’ read a slide from a 2018 presentation. ‘If left unchecked,’ it warned, Facebook would feed users ‘more and more divisive content in an effort to gain user attention & increase time on the platform’ […]

    The high number of extremist groups was concerning, an earlier presentation says. Worse was Facebook’s realization that its algorithms were responsible for their growth, finding that ‘64% of all extremist group joins are due to our recommendation tools’ and that most of the activity came from the platform’s ‘Groups You Should Join’ and ‘Discover’ algorithms: ‘Our recommendation systems grow the problem’ […]

    Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms. Chief Executive Mark Zuckerberg had in public and private expressed concern about ‘sensationalism and polarization.’

    But in the end, Facebook’s interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products.

The report says there were two reasons for this, in addition to the obvious one of not wanting to reduce eyeball time for ads.

First, the company took the view that it should not interfere with free speech, even if that were in the interests of users, as that would be “paternalistic.” CEO Mark Zuckerberg is said to be a particularly vigorous proponent of this argument.
- Ben Lovejoy, Facebook algorithms promote divisive content, but company decided not to act, 9to5Mac, May 27, 2020, with quote from Jeff Horwitz and Deepa Seetharaman, Facebook Executives Shut Down Efforts to Make the Site Less Divisive, Wall Street Journal, May 26, 2020
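
The mechanism the excerpt describes, a feed ranked purely by predicted engagement, can be made concrete with a toy sketch in Python. Nothing below reflects Facebook's actual code; the post features, weights, and scores are invented for illustration. The one assumption, taken from the reporting above, is that divisive content tends to provoke more reactions and therefore earns a higher predicted-engagement score.

    # Toy illustration (not Facebook's actual system): a feed ranker that
    # orders posts purely by predicted engagement. All feature names and
    # weights here are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        base_interest: float  # how interesting the post is on its own merits
        divisiveness: float   # 0.0 = neutral, 1.0 = maximally polarizing

    def predicted_engagement(post: Post) -> float:
        # Stand-in for a learned model. Per the reporting above, content
        # that provokes reactions scores higher, so divisiveness
        # contributes positively to predicted engagement.
        return post.base_interest + 0.8 * post.divisiveness

    def rank_feed(posts: list[Post]) -> list[Post]:
        # "Highlight content that achieves this and reduce the visibility
        # of content that doesn't": highest predicted engagement first.
        return sorted(posts, key=predicted_engagement, reverse=True)

    if __name__ == "__main__":
        feed = [
            Post("Local park reopens after renovation", 0.6, 0.1),
            Post("You won't BELIEVE what this politician said", 0.4, 0.9),
            Post("City council publishes annual budget", 0.5, 0.0),
        ]
        for post in rank_feed(feed):
            print(f"{predicted_engagement(post):.2f}  {post.title}")

Run as written, the sensational post outranks the more substantive ones despite being less interesting on its own merits. No one programmed the ranker to prefer divisive content; that preference simply falls out of optimizing for engagement, which is the dynamic the internal presentations warned about.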

