Fakebook: Why Social Media Companies Need to Curb the Spread of Misinformation

Rylee Tan | The Loyola Phoenix

Social media giants such as Twitter and Facebook knowingly push false news to users, causing increased political divisiveness in America.

The Loyola Phoenix is committed to publishing opinion pieces that represent many diverse perspectives and viewpoints. If you have an interest in submitting a piece or writing for us, email phoenixopinion@luc.edu.

In a year defined by a divisive presidential election, racism and a global pandemic, never has there been a time when credible news was needed more. Despite this, there seems to be no consensus on the basic facts gripping our country today.

Simple questions such as “Where was Kamala Harris born?” and “How many people have actually died from coronavirus?” are often debated. There’s a growing sense of division and confusion in the U.S. over what the facts are, and the source of it is a lack of consistent, reliable news on social media.

Approximately 43 percent of adults in the U.S. get their news primarily from social media, according to Pew Research Center. For the first time in U.S. history, more people rely on social media for news than on any sort of print journalism. The emergence of social media as a primary news source has led to everything from knowing your uncle’s political beliefs to having access to a wide range of coronavirus theories. What it’s also done, however, is give social media companies such as Facebook and Twitter a strong business incentive to serve users the news they know people will interact with.

Nearly all social media companies collect large amounts of data on their users, according to Pew Research Center. This data is often used to predict people’s interests and generate ads based on them. With 66 percent of social media users engaging with some sort of political post, companies like Facebook and Twitter profit from serving users false news stories they’re likely to engage with.

One Massachusetts Institute of Technology (MIT) study of Twitter found false news stories are 70 percent more likely to be retweeted than true stories. By the same token, true stories take six times longer than false ones to reach individuals. Because the term “fake news” is often misconstrued as a political tactic used to discredit an individual or organization, the MIT researchers instead use “false news” to describe stories containing verifiably incorrect information.

The study concluded one of the main reasons false news spreads faster than real news on social media is because of novelty. People like hearing new things and always being in the know. Many times, true news stories can be boring and unchanging, making it less likely for users to interact with such news, the study said.

This has created a world where some of America’s top news sources perpetuate stories that are sometimes false. Often, people are completely unaware the news they’re consuming is false, and people with different political beliefs likely have entirely different news feeds. In the past two decades, the percentage of people who consistently hold conservative or liberal beliefs — rather than a mix of the two — has doubled, according to researchers at the University of Cambridge. This suggests the rise of political partisanship and the rise of news on social media are directly related.

Similarly, politicians with more extremist ideologies generally attract larger social media audiences than their more moderate counterparts, according to a Harvard study on political polarization on Twitter. This is further evidence of the so-called “echo chamber” of opinions and extreme ideologies promoted by social media to give people their desired news.

President Donald Trump has publicly stated he uses Twitter as a way to “fight back” when he feels a story is inaccurate, according to a 2016 “60 Minutes” interview. Though potentially useful at times, this philosophy opens the door for people and organizations to spread misinformation.

It doesn’t take an expert to figure out social media has increased divisiveness in the U.S. In fact, the majority of the country believes social media does more to divide the nation than to unite it, according to a 2019 Wall Street Journal poll. What’s missing, however, is a reckoning: public pressure on social media companies and government institutions to do more to curb the spread of misinformation.

Social media companies are well aware of their contribution to political divisiveness. Facebook’s recent decision to ban all new political ads the week before the 2020 presidential election, according to The Washington Post, is an acknowledgment that something needs to be done to restore truth in news. Rather than banning new ads for a single week, Facebook could screen all new news stories for false information before they spread.

Likewise, government regulations could ensure all social media companies are on a level playing field when it comes to controlling the spread of misinformation. Though shareholder pressure is never a good reason to compromise democracy, it realistically makes it difficult for social media companies to financially justify changing their algorithms on their own. Government regulation could be very useful here to hold every company to the same standard.

As this country heads into one of the most highly contested presidential elections in recent history, it’s important to remember that we’re not as divided as it may seem. If you notice that your friends and relatives are spreading around crazy conspiracy theories and hoaxes, it likely isn’t out of malice. Rather, they are probably consuming an entirely different news feed that they believe to be true. At the end of the day, it’s hard to achieve progress when we can’t agree on where we are at the moment — a problem that begins and ends with social media.
