Why You Should Think Twice About What You Read on Facebook

Update: Since this story was published, Facebook has taken several steps to combat fake news on the site. The company, along with Google, will "no longer allow fake news sites to use their ad-selling services," CNN reports. Mark Zuckerberg also addressed the controversy again in a Facebook post on Nov. 18. In it, the CEO explains that the company usually leaves it to the community to decide what's fake on the site, but that when the community spreads something false, the News Feed "penalizes this content."

Zuckerberg also outlined seven steps the company is taking to stop the spread of fake news: developing better tools to flag fake news, creating a simpler process for people to report it, working with more sources that fact-check news, "showing warnings" when someone decides to read a fake story, surfacing higher-quality content in related articles, "disrupting" how fake news sites make money on the platform, and, lastly, listening to and learning from media companies that already do rigorous fact-checking.

The post has more than 131,000 reactions, 9,000 comments, and 10,000 shares.

Original story: A study in May found that 44 percent of US adults get their news from Facebook, which means you (yes, you) quite possibly clicked on this story from your news feed. There's nothing wrong with that; I also get my news from Facebook and the links my friends share. But what you and I see in our news feeds may look extremely different. Though Facebook continues to insist that it is not a media company, the fact that it is how a significant number of Americans get their news suggests otherwise. And in that media-company role, Facebook helped disseminate misinformation throughout the election, falling short of its own mission statement and letting down its users.

On Facebook's own profile page, the company includes its mission statement:

Founded in 2004, Facebook's mission is to give people the power to share and make the world more open and connected. People use Facebook to stay connected with friends and family, to discover what's going on in the world, and to share and express what matters to them.

Based on this statement, the company clearly has a duty to help you, the user, "discover what's going on in the world." So is the company failing that duty when it lets you find fake news and share it with all your Facebook friends?

Facebook's algorithm surfaced fake news

After the company fired the human curators who ran Facebook's "Trending Topics" section, the section repeatedly surfaced fake news. Did you happen to read that Pope Francis endorsed Donald Trump? Or that Hillary Clinton was calling for civil war should Trump be elected? Neither happened, but the fake stories spread because they were shared on Facebook.

An algorithm, which one researcher warns is inherently not neutral, cannot distinguish between what is fake and what is not. Facebook's vice president of product management, Adam Mosseri, acknowledged the problem in a statement to TechCrunch:

"We take misinformation on Facebook very seriously. We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation. In Newsfeed we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution. In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation. We're committed to continuing to work on this issue and improve the experiences on our platform."

The New York Times reports that employees at Facebook are also concerned internally about the role the company played in the election and are weighing what its responsibility should be. Some are giving product managers suggestions for how to address the news feed's problems.

Employees at Facebook are so dissatisfied with Zuckerberg's words and lack of action that dozens of them have formed a task force to investigate whether the company did enough to stop the spread of fake news, reports BuzzFeed. "If someone posts a fake news article, which claims that the Clintons are employing illegal immigrants, and that incites people to violence against illegal immigrants, isn't that dangerous, doesn't that also violate our community standards?" one anonymous employee told BuzzFeed.

Where is this fake news coming from?

A BuzzFeed investigation found that many of the fake news stories originated in Macedonia, where teens looking for an easy way to make money created sites publishing fake news that catered to Trump supporters. One source told BuzzFeed that a friend who runs one of these sites can make "$3,000 a day" when a story takes off. In a separate investigation, The Washington Post found that Facebook sometimes surfaces stories from known satire sites, like SportsPickle. And fake news isn't the only issue: as an interactive story from The Wall Street Journal shows, your news feed can look vastly different depending on whether you're a liberal or a conservative.

Can we change what we see on Facebook?

To a certain extent, you can control what you see in the news feed. Facebook shows you posts it thinks you may like based on what it already knows about you, including the pages you've "liked," where you're from, and other details in your profile. Those same details are surfaced to advertisers, who then show you things you might like to buy or see. By this point, though, you've probably clicked and "liked" so much on Facebook that it's hard to escape the bubble you've built for yourself on the site. And if what you're seeing is information you want to believe, why would you go to the trouble of rebuilding that world?
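To see why that bubble is so hard to escape, here is a minimal, hypothetical sketch (invented data and scoring, not Facebook's actual ranking system) of interest-based feed ranking. Each click feeds back into the interest profile, so the feed drifts further toward what you already agree with.

```python
from collections import Counter

def rank_feed(posts: list[dict], interests: Counter) -> list[dict]:
    """Score each post by its overlap with topics the user engaged with before."""
    def affinity(post: dict) -> int:
        return sum(interests[topic] for topic in post["topics"])
    return sorted(posts, key=affinity, reverse=True)

# Invented example: a user who has mostly liked one side's pages.
interests = Counter({"conservative": 12, "sports": 3, "liberal": 1})

posts = [
    {"title": "Op-ed A", "topics": ["conservative"]},
    {"title": "Op-ed B", "topics": ["liberal"]},
    {"title": "Game recap", "topics": ["sports"]},
]

feed = rank_feed(posts, interests)
print([p["title"] for p in feed])  # Op-ed A ranks first; Op-ed B sinks

# Clicking the top post reinforces the same interests, so the next
# ranking is even more lopsided: the feedback loop behind the bubble.
for topic in feed[0]["topics"]:
    interests[topic] += 1
```

The feedback loop is the point: nothing in this toy ranker is malicious, yet every interaction narrows what the next feed shows.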

Mark Zuckerberg doesn't believe the fake news on Facebook had a strong effect on the election

Two days after the election, on Nov. 10, Facebook CEO Mark Zuckerberg responded to all these arguments at a conference. "Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think it is a pretty crazy idea. Voters make decisions based on their lived experience."

On Saturday, Nov. 12, Zuckerberg wrote a post stating that "more than 99 percent of what people see [on Facebook] is authentic . . . Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other." He acknowledged, though, that there is more work to be done to curb fake news on the site.

Meanwhile, Gizmodo reports that Facebook may already have built the tools to get rid of fake news. Talking to two sources, Gizmodo found that a news feed update that "would have identified fake or hoax news stories" was in the works. But one source contends the company never moved forward with it because "there was a lot of fear about upsetting conservatives after Trending Topics." The update, in addition to flagging fake news, "disproportionately impacted right-wing news sites by downgrading or removing that content from people's feeds," which may have been another reason it was never released.

Facebook denied the report and all of its findings in an emailed comment to POPSUGAR. "The article's allegation is not true. We did not build and withhold any News Feed changes based on their potential impact on any one political party. We always work to make News Feed more meaningful and informative, and that includes examining the quality and accuracy of items shared, such as clickbait, spam and hoaxes. Mark himself said 'I want to do everything I can to make sure our teams uphold the integrity of our products.' This includes continuously review[ing] updates to make sure we are not exhibiting unconscious bias."

The company did not respond to our other questions for comment.

Where do we go from here?

Joshua Benton, director of the Nieman Lab, explores several options. One of the most promising is to rehire human curators to decide what goes in the Trending Topics section. There is plenty of great, well-reported journalism out there; even if Facebook can't fix the algorithm behind your news feed, it can at least make sure Trending Topics points to reliable reporting. Benton also suggests that a team could sort through content on Facebook and flag fake news, and even remove a publisher or page from the site if it keeps posting such stories.
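As a rough sketch of how Benton's flag-and-remove suggestion could work in practice (the thresholds and names here are invented for illustration, not a description of any real Facebook system), a simple strike policy might look like this:

```python
from collections import defaultdict

# Hypothetical moderation policy: human reviewers flag individual
# stories, and publishers that keep posting flagged stories lose
# their page. Both thresholds are invented for this sketch.
FLAGS_TO_LABEL_STORY = 3     # independent reviewer flags per story
STRIKES_TO_REMOVE_PAGE = 5   # labeled stories before a page is removed

story_flags: defaultdict[str, int] = defaultdict(int)
publisher_strikes: defaultdict[str, int] = defaultdict(int)

def review_flag(story_id: str, publisher: str) -> str:
    """Record one reviewer flag and return the resulting action."""
    story_flags[story_id] += 1
    if story_flags[story_id] == FLAGS_TO_LABEL_STORY:
        publisher_strikes[publisher] += 1
        if publisher_strikes[publisher] >= STRIKES_TO_REMOVE_PAGE:
            return "remove publisher page"
        return "label story as disputed"
    return "no action yet"

# Simulate three reviewers flagging the same (hypothetical) story.
for _ in range(3):
    action = review_flag("story-123", "example-hoax-site")
print(action)  # -> "label story as disputed"
```

The design choice worth noting is that humans supply the judgment (the flags) while the system only enforces consistency, which is exactly the division of labor the fired Trending Topics curators used to provide.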

Facebook can also do what reporters have been asking the company to do for months: declare itself a media company. When it controls this much information and dictates how and when you see it, it is making editorial decisions. When it declines to remove fake news from the site, that too is an editorial decision. Facebook is determining what information is vital to its readers, the same judgment reporters and editors exercise in newsrooms every day.

We'll never know the full extent to which Facebook did or did not influence the election. But with local and state elections now a year away, the company can and should do better. It owes that to its users and to its employees, who signed on to a mission statement that promises to "make the world more open and connected."