
How Facebook Controls Your News

by Jane Scearce

News articles have become a prominent part of the Facebook platform as the social media site pushes to keep users engaged beyond status updates. The “newsfeed” is now a hub for, well, actual news. An algorithm places news stories on your feed based on what Facebook thinks you like; the more you interact with these, the better Facebook can predict which headlines appeal to you most. The trending topics column now visible on the right side of your newsfeed surfaces currently trending news stories, an obvious bid for Twitter fans. Yet rather than allowing online news to spread rapidly and widely, Facebook's collection of algorithms is doing quite the opposite.

The Trouble With Filtering

We’re beginning to understand that, as a species, we can’t always be trusted with the things we invent. This is particularly true of online news comment sections, where an anything-goes bastardization of “freedom of speech” means running the risk of harassment, hate speech, graphic images, or threats of violence. Some sites have at least begun to acknowledge the problem of commenters who create an unsafe environment for others and to attempt solutions. Similarly, Facebook’s filters try to remove any content that could be deemed offensive, which in theory has good intentions behind it, but in practice can have a polarizing effect.

On top of the filter for potentially offensive content, Facebook fills your newsfeed based on the number of likes a piece receives as well as how people on your friends list are interacting with it. The idea is that you’ll see only news that fits your exact interests and is generally appealing in nature, and that this is a good thing. Where this backfires is when a story breaks that contains uncomfortable, even disturbing information: it often gets weeded out from the rest of the fare.

For example, the situation in Ferguson has been notably absent from many newsfeeds, presumably because Facebook’s algorithms deemed the topic “unsettling” and therefore a Very Bad Thing to include in most people’s newsfeeds. But important news stories are often unsettling, or uncomfortable, or controversial. Critics are saying that by “shielding” users from these kinds of stories, Facebook is creating alienating cocoons that keep them uninformed and their views mostly unchallenged. Ideologically, it feels very 1984.

Ferguson has been largely left out of Facebook’s news algorithms because those algorithms promote the stories with the most “appeal,” typically measured by the “like” button. Stories that are popular with the general public naturally earn lots of likes, but the flip side is that prominent yet “unsettling” stories suffer, because no one wants to “like” something uncomfortable or inhumanly tragic.
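The dynamic described above can be sketched as a toy ranking function. To be clear, this is a hypothetical illustration only: the function, field names, and like counts below are all invented, and Facebook's actual ranking system is proprietary and far more complex than sorting by a single signal.

```python
# Toy illustration of like-driven ranking (all names and numbers are
# hypothetical; this is NOT Facebook's actual algorithm).

def rank_stories(stories):
    """Order stories by like count, highest first."""
    return sorted(stories, key=lambda s: s["likes"], reverse=True)

stories = [
    {"headline": "Rick Perry's mugshot", "likes": 9500},
    # Important but "unsettling" news draws few likes, so it sinks:
    {"headline": "Protests continue in Ferguson", "likes": 1200},
    {"headline": "Celebrity wedding photos", "likes": 7800},
]

feed = rank_stories(stories)
for story in feed:
    print(story["headline"])
```

Under a scheme like this, the Ferguson story lands at the bottom of the feed even if it is the most newsworthy item of the day, simply because a tragedy is not the kind of thing people click "like" on.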

And, as is so often the case with situations like Ferguson, the general public would rather ignore something that makes them uncomfortable than try to understand it. So police brutality gets buried, and Rick Perry’s mugshot climbs the charts ever higher.
