Some fake news creators do so seeking to make money from advertising. However, false information can also arise from misinformed social media posts by regular people, when they share an article or idea without first checking facts.
Join us for a closer look at who’s behind these stories, why some are pointing the finger at Facebook, and how engaging with them can have negative effects beyond shaping your immediate view.
We’ll also share what you can do to reduce the number of fake news posts on your Facebook feed. But first, a few examples.
What Does Fake News Look Like?
To discourage their spread, we’re purposefully limiting links to fake news articles and the websites that publish them. However, here are a few fake news headlines that have recently been read by millions:
1. “Thousands of Fraudulent Ballots for Clinton Uncovered.”
Just before the election, the Christian Times Newspaper ran a story which claimed that “tens of thousands” of fraudulent ballots had been found in a warehouse in Ohio.
According to the paper, the ballots had supposedly been cast for Hillary Clinton and were found inside sealed ballot boxes that would be counted alongside real election ballot boxes.
The story was disproved by Snopes—but not before it went viral, having been shared with over 6.1 million people.
2. “Megyn Kelly Was Fired After Endorsing Clinton.”
The website endingthefed.com published news that Fox anchor Megyn Kelly was fired last August after allegedly endorsing Clinton.
The story, which seems to have had no purpose other than to fuel divisiveness, was picked up by a “Colorado for Trump” Facebook page and shared with more than 724,000 people, despite the fact that Kelly had been offered over $20 million a year to remain with the network.
3. “Elizabeth Warren Endorsed Bernie Sanders.”
In March, a completely fabricated New York Times site published an article claiming that Elizabeth Warren was endorsing Bernie Sanders for president. The story copied the fonts and design of the New York Times and even used the bylines of two of its political reporters.
Despite having a different URL, those cosmetic similarities were enough to fool readers.
The fake article was shared with at least 700,000 people and cast doubt on the real New York Times’ credibility; the paper was forced to address the hoax and clarify that the site on which it appeared was fraudulent and in no way connected with the Times.
Fake news isn’t limited to an article format, either.
Fake news in the form of text or quotes overlaid on an image has frequently been used to spread the false impression that candidates said something damning, like supposed (but totally fake) quotes attributed to Trump and Hillary.
Profit and Partisanship: The Two Driving Factors Behind Fake News
Many fake news creators are morally ambiguous entrepreneurs who get advertising money for every person who clicks their articles, and even more should a reader click one of the ads in the fake news site’s sidebar.
To maximize their profits, fake news writers craft political clickbait titles that are designed to evoke the knee-jerk response of sharing. The more their articles are spread, the more money these writers make.
One stateside writer, Paul Horner, has come forward to admit that he’s made his living off viral news hoaxes for several years. Horner is personally responsible for fake news stories such as the Amish committing their vote to Trump, gay wedding vans that double as mobile abortion clinics, and Obama banning the national anthem in public.
These claims might appear ludicrous, but together they’ve been read and shared over 100,000 times—earning Paul Horner an income of thousands of dollars per month and, even more frightening, possibly influencing voters.
Perhaps it’s the proliferation of fake news stories—or, at least, an increasing disregard for fact-checking—that influenced Eric Tucker, a 35-year-old co-founder of a marketing company in Austin, Texas, to share his unchecked hypothesis regarding local protests.
Eric had only about 40 Twitter followers when he decided to share photos of a line of buses in Austin, along with his opinion that the convoy had been used to transport protesters to demonstrate against President-elect Trump.
He claims to have researched whether the buses were associated with any local conferences before sharing. However, he added, “I’m also a very busy businessman, and I don’t have time to fact-check everything that I put out there, especially when I don’t think it’s going out there for wide consumption.”
Unfortunately, it turns out that Eric Tucker was wrong. The buses were hired by a company called Tableau Software, which was holding a conference that drew more than 13,000 people.
But, by the time that truth came to light, facts didn’t matter.
Mr. Tucker’s post was shared at least 16,000 times on Twitter and more than 350,000 times on Facebook, fueling a nationwide conspiracy theory and feeding the growing distrust that members of each political party have for the other side.
Why Facebook Is Taking Heat for Fake News
Since Facebook isn’t responsible for the content users share, why are they getting caught in the crossfire? It has to do with how these, and similar, platforms decide what kind of posts and ads you see.
Facebook uses algorithms to decide what people see in their search results and news feeds. And, people are more likely to see websites and stories that are already getting attention.
During this election, people were apparently clicking a lot of attention-grabbing—but fake—stories. This has resulted in a lot of negative attention on both Google and Facebook for helping spread false news that potentially misled voters.
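To see why attention begets attention, here is a toy sketch of engagement-weighted feed ranking. It is a hypothetical illustration, not Facebook’s actual, proprietary algorithm: stories with more existing clicks and shares score higher, so a viral fake story can outrank a sober, accurate one.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    shares: int
    clicks: int

def rank_feed(stories):
    # Toy popularity score: prior engagement compounds visibility,
    # so whatever is already spreading gets shown to more people.
    return sorted(stories, key=lambda s: s.shares + s.clicks, reverse=True)

feed = rank_feed([
    Story("Local budget report", shares=40, clicks=120),
    Story("Shocking (fake) election claim", shares=6000, clicks=25000),
])
print([s.title for s in feed])  # the viral fake story ranks first
```

Nothing in this scoring function asks whether a story is true, which is the crux of the criticism leveled at these platforms.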
How Fake News Affects You
Platforms such as Facebook make money off of advertising to you, their users. They do this by collecting information about you through your actions, such as which articles you like, which ads you click on, and which types of information you mark as “not relevant.”
Here’s the problem: Facebook’s advertising model is a really effective way to promote businesses and new products because it tailors what you see to your interests. However, when it comes to news, those same algorithms are inadvertently furthering our divided ideologies.
When you like, click, and share hyper-partisan articles, Facebook takes note. The platform lumps you into a bucket (or interest group) to more easily shape the promoted posts and ads you see in the future.
At the time of writing, over 56 million Facebook users have been placed in either the “very conservative” or “very liberal” bucket, classifications that are available to advertisers looking to target and promote posts.
This means there are 56 million Facebook users, possibly yourself among them, who have already engaged with polarizing articles and posts. And, in doing so, they are less likely to be exposed to fair and balanced news in the months and years to come.
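The bucketing described above can be sketched as a toy classifier. This is a hypothetical illustration of the idea, not Facebook’s real system: a user whose engagement skews heavily toward one side lands in a partisan bucket that advertisers can then target.

```python
def classify_user(engagements):
    """engagements: list of labels like 'left', 'right', or 'neutral',
    one per article the user liked, clicked, or shared."""
    if not engagements:
        return "unclassified"
    counts = {label: engagements.count(label) for label in set(engagements)}
    top, n = max(counts.items(), key=lambda kv: kv[1])
    # A heavy skew toward one side (here, an assumed 70% threshold)
    # lands the user in an advertiser-targetable partisan bucket.
    if top in ("left", "right") and n / len(engagements) > 0.7:
        return "very " + ("liberal" if top == "left" else "conservative")
    return "mixed"

print(classify_user(["right"] * 8 + ["neutral"] * 2))  # → very conservative
```

Once bucketed, a user is shown more of the same, which is how the feedback loop between engagement and polarization closes.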
We’ll share actionable info on how you can reduce the presence of polarizing stories on your feed shortly. But first, what’s Facebook doing to help the problem of political clickbait?
What Facebook Is Doing to Battle Fake News
Earlier this week, both Google and Facebook said they’re banning fake news sites from using their ad platforms. This means websites flagged as pushing out fake news no longer have access to the tools that help them make money by promoting their posts.
This move could hit fake sites where it hurts—their bank accounts. However, these actions won’t stop fake stories from showing up in search results and news feeds as they’re shared by social media users.
Additionally, political clickbait writers have gone on record to state that as soon as one website is blocked, they can easily take another one live, turning the task of taking down fake news into a game of whack-a-mole.
Further muddying the waters, Facebook founder Mark Zuckerberg is fervently denying that the platform has a fake news problem, stating that he believes 99% of the information found on Facebook is true. It later came out that his estimate had lumped the “truthiness” of personal posts in with news headlines to reach a higher average.
In the midst of his denial, Zuckerberg fired Facebook’s news-checking team, a move that has led to rumors of some employees meeting in secret to try to solve the problem. (Vice News Update 11/16/16)
Basically, not only is the problem of fake news far from solved, it’s likely going to get worse before it gets better.
“More Divided Than Ever” – How Fake News Is Shaping Our World Views
According to Pew Research, a whopping 62% of Americans still get their news from social media. The idea of such readily available information was once looked at as an opportunity, explains media personality Josh Zepps:
“People in the 1990s would say that once everyone is online, there’s going to be no need for difference anymore because everyone is going to be able to communicate everything, and everyone is going to be able to be exposed to so many different ideas that you’re not going to be insular anymore. You’re not going to be parochial anymore, trapped in your little circle of beliefs, because with the internet, everything is at everyone’s fingertips—all the time.”
Of course, Mr. Zepps continues, what’s happened with fake news has had the opposite effect.
“The availability of communication on such a widespread scale has actually enabled people to silo themselves into little self-thinking communities of sameness so that we’re actually more divided than we’ve ever been.”
It turns out that, despite a wealth of information at our fingertips, most of us just want to engage with information that’s in line with how we already see the world, not information that contradicts it.
This is called confirmation bias, and you can read how it affects many aspects of your life, including your spending habits, in “Is There a Bullseye On Your Back? How Confirmation Bias Makes You a Target for Scams.”
How to Reduce Your Risk of Seeing (and Believing) Fake News on Facebook
Fake news feeds into our polarized views, but it also depends on those biases to thrive. After all, we’re just so eager to believe the worst about the side that we don’t like!
You can lessen the likelihood of falling for fake news in three steps:
- Unfollowing polarizing, independent political Facebook pages.
- Managing your Facebook advertising preferences.
- Welcoming, or at least exposing yourself to, opinions that aren’t in line with your views.
1. How to Unfollow Independent Political Facebook Pages
The first step in fighting fake news is reducing your exposure to polarizing news sources by “draining the swamp” of your Facebook news feed. What kind of pages should you be on the lookout for?
Examples of popular hyperpartisan, political social media groups that rarely welcome dissenting opinions include:
- Occupy Democrats
- Conservative Post
- Conservative Tribune
- Americans Against the Republican Party
- Being Liberal
- Right Wing News
- Young Conservatives
- Americans Against the Tea Party
- Freedom Daily
To start, check to see if you’ve “liked” or followed any hyper-partisan independent Facebook pages and unfollow them to reduce your chances of being exposed to fake news:
- Go to your Facebook home screen.
- Look on the left-side menu for the Pages category.
- Click Like Pages. This will take you to an overview page.
- Select the Liked Pages tab on top (mine is third from the left).
- Look for the Review Liked Pages option in the upper right corner.
- From there, you can select which pages to stop following.
2. How to Manage Your Facebook Ad Preferences
Next, remember what we explained about Facebook’s algorithms? If you’ve liked, clicked, or shared partisan articles or pages in the past, only seeking information that confirms your views, you might already be lumped into an ultra-conservative or ultra-liberal category.
This limits the perspectives that will appear on your news feed in the future and opens you up to being targeted by advertisers pushing political clickbait titles.
To reduce your exposure to political clickbait and flat-out fake news, you’ll need to manage your ad preferences. There are two ways to do so:
When you see an ad or promoted post on your page, click the small drop-down icon near the top-right corner of the post, then select “Why am I seeing this?” You’ll see an explanation of why you’re seeing the ad, and you can add or remove yourself from the audiences who are shown it.
If you’d prefer not to be a part of that group, click Manage Your Ad Preferences to see more audiences you’re a part of that influence which ads you see on Facebook.
Alternately, if you’d prefer not to wait for an ad to come along, you can just view and adjust your ad preferences here.
3. Select Unbiased News Sources to Follow
Now that you’ve cleared out your page likes and ad preferences, you’ll see far fewer promoted political posts on your feed.
In their place, follow non-partisan sources such as BBC News, Reuters, The Independent, PBS, and C-SPAN. While it can be argued that all news has some bias, these outlets are considerably more objective than others, such as Fox News, MSNBC, and CNN.
See Also: How to Spot & Avoid Media Bias
Bottom Line: Be Selective With Your News Sources
In 1994, Tom Maddox wrote of evolving consumerism in The Cultural Consequences of the Information Superhighway. His theory of the good and bad that could come from the vast sharing of information includes this surprisingly prophetic warning:
“We could be profiled in terrifying detail, almost casually, as a kind of side effect of network software. This is the worst scenario for the future because it implies an audience composed of inert consumers and passive paracitizens, easily manipulated by any technically adept spin doctors with access to their profiles.”
Maddox’s words might be dire, but they’re not definite. The reality is that news, like products, is something we consume. Unlike products, we seem less inclined to “shop around” for different sources of information.
Bottom line: Facebook claims to be making moves that would diminish the amount of fake news on your feed, but you can’t depend on a platform’s algorithms to determine the information that you’re exposed to.
Instead, embrace opposition and dissenting views to develop a balanced opinion. Check facts and seek out other sources. Those who rely on hyperpartisan pages and political clickbait for information, whether on the right or left, risk developing worldviews based on false information.