Zuckerberg: Fake news on Facebook affected election? That's 'crazy'

Facebook's CEO talks about the News Feed's role in the election and the responsibility his company has to its almost 2 billion users -- most of whom are not actually dead.

Richard Nieva

After Donald Trump won the US presidential election on Tuesday, some commentators argued that fake news circulating on Facebook helped the real estate mogul turned reality TV personality win.

There was, for example, a story from the nonexistent Denver Guardian claiming that an FBI agent associated with Hillary Clinton's email leaks had been found dead in a murder-suicide. Or another claiming the Pope had endorsed Trump.

Even President Barack Obama called out Facebook by name the day before Tuesday's election. "As long as it's on Facebook, and people can see it, as long as it's on social media, people start believing it," Obama said at a Michigan rally. "And it creates this dust cloud of nonsense."

Mark Zuckerberg doesn't buy that fake stories played a role in the election outcome.

"Personally, I think the idea that fake news on Facebook -- it's a very small amount of the content -- to think it influenced the election in any way is a pretty crazy idea," Facebook's CEO said Thursday at the Techonomy conference in Half Moon Bay, California.

Instead, he thinks some people are shocked and still trying to understand the results of the election. "It takes a profound lack of empathy to think that someone voted some way because of a fake news story," Zuckerberg said.

The discussion comes days after the US presidential election. Trump won the office in an upset victory, which blindsided many people -- including pollsters and pundits -- who believed Clinton, the Democratic nominee, would become the next president.

To add to the disorientation, Facebook suffered what looks to be an oddly timed glitch on Friday, in which the site thought many of its users were dead. Lots of those very-much-alive users posted screenshots of memorial banners over their Facebook pages.

"For a brief period today, a message meant for memorialized profiles was mistakenly posted to other accounts. This was a terrible error that we have now fixed," said Facebook in a statement. "We are very sorry that this happened and we worked as quickly as possible to fix it."

Facebook, the news source

Facebook, with its 1.79 billion users, is playing a major role in society as more people look to the social network to get their news. Over 40 percent of American adults get their news from Facebook, according to the Pew Research Center and Knight Foundation.

Earlier on Thursday, Adam Mosseri, vice president of product management at Facebook, said in a statement that "there's so much more we need to do" to fight the spread of misinformation on the social network.

In the aftermath of the election, critics of the service have also blamed Facebook for the unexpected election result, arguing that the social network promotes tunnel vision because people are supposedly only exposed to viewpoints aligned with their own. Your Facebook feed is made up of posts from only the people you choose to populate it. So, the argument goes, there's a Facebook that liberals see and one that conservatives see, depending on the political views of your friends on the site.

Plus, Facebook relies on an algorithm that decides exactly what you see on your News Feed. Generally, it learns from what you've clicked on or Liked in the past and shows you more of what fits your interests. That is, it shows you what it thinks you want to see.

Even though the algorithm learns from your cues, Facebook still has an awesome amount of control over potentially shaping someone's worldview. Zuckerberg denied that Facebook is an echo chamber, arguing that Facebook actually exposes you to more viewpoints because everyone has at least a small number of friends who hold opposing opinions.

Zuckerberg emphasized that Facebook does show people stories they may not agree with, but that sometimes people just tune them out. "It's not that the diverse information isn't there," he said. "We haven't gotten people to engage with it in higher proportions."

This isn't the first time Facebook has been scrutinized for what it does or doesn't show us. It drew ire earlier this year after reports claimed Facebook encouraged its editorial contractors to suppress conservative news in its "trending stories" feature. Soon after that, the feature was redesigned to be more robotic, without human-written descriptions or curation.

Zuckerberg was also asked about his thoughts on the election results in general. In the past, he's been critical of Trump. In April, he took a thinly veiled shot at then-candidate Trump onstage at F8, Facebook's most important conference of the year. Without referring to Trump by name, he talked about the dangers of "building walls," a nod to Trump's promise to build a wall along the US-Mexico border.

Trump has previously attacked Zuckerberg, too, calling the tech CEO's push for more immigration through his public interest group Fwd.us a bad move for American workers.

On Thursday, Zuckerberg was more diplomatic. "Well we have a lot of work to do," he said. "But that would have been true either way."

This story was first published Thursday, November 10 at 7:25 p.m. PST.

Updated, Friday November 11, 2:36 p.m. PST: Adds comments from Facebook about glitch that presumed users were dead.