
How to Fix Facebook—Before It Fixes Us

If you read nothing else this weekend, read this:

How to Fix Facebook—Before It Fixes Us

An early investor explains why the social media platform’s business model is such a threat—and what to do about it.

By Roger McNamee, Washington Monthly

Excerpt: 

My familiarity with building organic engagement put me in a position to notice that something strange was going on in February 2016. The Democratic primary was getting under way in New Hampshire, and I started to notice a flood of viciously misogynistic anti-Clinton memes originating from Facebook groups supporting Bernie Sanders. I knew how to build engagement organically on Facebook. This was not organic. It appeared to be well organized, with an advertising budget. But surely the Sanders campaign wasn’t stupid enough to be pushing the memes themselves. I didn’t know what was going on, but I worried that Facebook was being used in ways that the founders did not intend.

A month later I noticed an unrelated but equally disturbing news item. A consulting firm was revealed to be scraping data about people interested in the Black Lives Matter protest movement and selling it to police departments. Only after that news came out did Facebook announce that it would cut off the company’s access to the information. That got my attention. Here was a bad actor violating Facebook’s terms of service, doing a lot of harm, and then being slapped on the wrist. Facebook wasn’t paying attention until after the damage was done. I made a note to myself to learn more.

Meanwhile, the flood of anti-Clinton memes continued all spring. I still didn’t understand what was driving it, except that the memes were viral to a degree that didn’t seem to be organic. And, as it turned out, something equally strange was happening across the Atlantic.

[…]

Algorithms that maximize attention give an advantage to negative messages. People tend to react more to inputs that land low on the brainstem. Fear and anger produce a lot more engagement and sharing than joy. The result is that the algorithms favor sensational content over substance. Of course, this has always been true for media; hence the old news adage “If it bleeds, it leads.” But for mass media, this was constrained by one-size-fits-all content and by the limitations of delivery platforms. Not so for internet platforms on smartphones. They have created billions of individual channels, each of which can be pushed further into negativity and extremism without the risk of alienating other audience members. To the contrary: the platforms help people self-segregate into like-minded filter bubbles, reducing the risk of exposure to challenging ideas.

Read the whole article here. 
