Mark Zuckerberg Is in Denial (Cut and Paste)
Posted: November 16th, 2016, 4:25 am
http://www.nytimes.com/2016/11/15/opini ... pe=article
Donald J. Trump’s supporters were probably heartened in September, when, according to an article shared nearly a million times on Facebook, the candidate received an endorsement from Pope Francis. Their opinions on Hillary Clinton may have soured even further after reading a Denver Guardian article that also spread widely on Facebook, which reported days before the election that an F.B.I. agent suspected of involvement in leaking Mrs. Clinton’s emails was found dead in an apparent murder-suicide.
There is just one problem with these articles: They were completely fake.
Mark Zuckerberg, Facebook’s chief, believes that it is “a pretty crazy idea” that “fake news on Facebook, which is a very small amount of content, influenced the election in any way.” In holding fast to the claim that his company has little effect on how people make up their minds, Mr. Zuckerberg is doing real damage to American democracy — and to the world.
He is also contradicting Facebook’s own research.
In 2010, researchers working with Facebook conducted an experiment on 61 million users in the United States right before the midterm elections. One group was shown a “go vote” message as a plain box, while another group saw the same message with a tiny addition: thumbnail pictures of their Facebook friends who had clicked on “I voted.” Using public voter rolls to compare the groups after the election, the researchers concluded that the second post had turned out hundreds of thousands of voters.
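To see why that design is persuasive, here is a minimal sketch of the comparison the researchers ran. The group sizes and turnout counts below are invented placeholders, not Facebook's data; the point is only that matching two randomized groups against public voter rolls gives a direct estimate of how many extra votes the "social" version of the message produced.

    # Hypothetical sketch of the 2010 "go vote" A/B comparison.
    # All numbers are made up for illustration; they are not the study's data.

    plain_group = {"size": 600_000, "voted": 358_200}          # saw the plain "go vote" box
    social_group = {"size": 60_000_000, "voted": 36_240_000}   # saw friends' "I voted" thumbnails

    def turnout(group):
        """Fraction of the group that public voter rolls show as having voted."""
        return group["voted"] / group["size"]

    lift = turnout(social_group) - turnout(plain_group)   # per-user effect of the social version
    extra_votes = lift * social_group["size"]              # scaled to everyone who saw it

    print(f"Turnout lift: {lift:.2%} -> roughly {extra_votes:,.0f} additional voters")

Even a fraction-of-a-percent lift, multiplied across tens of millions of users, lands in the hundreds of thousands of votes the researchers reported.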
In 2012, Facebook researchers again secretly tweaked the newsfeed for an experiment: Some people were shown slightly more positive posts, while others were shown slightly more negative posts. Those shown more upbeat posts in turn posted significantly more of their own upbeat posts; those shown more downbeat posts responded in kind. Decades of other research concurs that people are influenced by their peers and social networks.
All of this renders preposterous Mr. Zuckerberg’s claim that Facebook, a major conduit for information in our society, has “no influence.”
The problem with Facebook’s influence on political discourse is not limited to the dissemination of fake news. It’s also about echo chambers. The company’s algorithm chooses which updates appear higher up in users’ newsfeeds and which are buried. Humans already tend to cluster among like-minded people and seek news that confirms their biases. Facebook’s research shows that the company’s algorithm encourages this by somewhat prioritizing updates that users find comforting.
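As a rough illustration of that echo-chamber mechanism, here is a toy ranker. The scoring rule and the "affinity" field are my own stand-ins, not Facebook's actual algorithm; they only show how consistently boosting posts a user already engages with pushes dissenting friends out of view.

    # Hypothetical feed ranker: not Facebook's algorithm, just a toy model
    # of "prioritize updates the user finds comforting."

    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        text: str
        affinity: float  # 0..1, how often the user has liked/clicked this author before

    def rank_feed(posts, top_n=3):
        """Order posts by past affinity and show only the top few."""
        return sorted(posts, key=lambda p: p.affinity, reverse=True)[:top_n]

    feed = [
        Post("like-minded friend A", "Agrees with you", 0.9),
        Post("like-minded friend B", "Agrees with you", 0.8),
        Post("like-minded friend C", "Agrees with you", 0.7),
        Post("friend who voted the other way", "Disagrees with you", 0.2),
    ]

    for post in rank_feed(feed):
        print(post.author, "-", post.text)
    # The dissenting friend's post never makes the cut, so the user rarely sees it.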
I’ve seen this firsthand. While many of my Facebook friends in the United States lean Democratic, I do have friends who voted for Mr. Trump. But I had to go hunting for their posts because Facebook’s algorithm almost never showed them to me; for whatever reason the algorithm wrongly assumed that I wasn’t interested in their views.