
In late 2016, Oxford Dictionaries selected “post-truth” as the word of the year, defining it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”

The 2016 Brexit vote in the United Kingdom and the tumultuous U.S. presidential election highlighted how the digital age has affected news and cultural narratives. New information platforms feed the ancient instinct people have to find information that syncs with their perspectives: A 2016 study that analyzed 376 million Facebook users’ interactions with over 900 news outlets found that people tend to seek information that aligns with their views. This makes many vulnerable to accepting and acting on misinformation. For instance, after fake news stories in June 2017 reported that Ethereum’s founder Vitalik Buterin had died in a car crash, its market value was reported to have dropped by $4 billion.

Misinformation is not like a plumbing problem you fix. It is a social condition, like crime, that you must constantly monitor and adjust to.

When BBC Future Now interviewed a panel of 50 experts in early 2017 about the “grand challenges we face in the 21st century,” many named the breakdown of trusted information sources. “The major new challenge in reporting news is the new shape of truth,” said Kevin Kelly, co-founder of Wired magazine. “Truth is no longer dictated by authorities, but is networked by peers. For every fact there is a counterfact, and all these counterfacts and facts look identical online, which is confusing to most people.”

Americans worry about that: A Pew Research Center study conducted just after the 2016 election found 64% of adults believe fake news stories cause a great deal of confusion, and 23% said they had shared fabricated political stories themselves – sometimes by mistake and sometimes intentionally.

The question arises, then: What will happen to the online information environment in the coming decade? In summer 2017, Pew Research Center and Elon University’s Imagining the Internet Center conducted a large canvassing of technologists, scholars, practitioners, strategic thinkers and others, asking them to react to this framing of the issue:

The rise of “fake news” and the proliferation of doctored narratives that are spread by humans and bots online are challenging publishers and platforms. Those trying to stop the spread of false information are working to design technical and human systems that can weed it out and minimize the ways in which bots and other schemes spread lies and misinformation. The question: In the next 10 years, will trusted methods emerge to block false narratives and allow the most accurate information to prevail in the overall information ecosystem? Or will the quality and veracity of information online deteriorate due to the spread of unreliable, sometimes even dangerous, socially destabilizing ideas?

Respondents were then asked to choose one of the following answer options:

The information environment will improve – In the next 10 years, on balance, the information environment will be IMPROVED by changes that reduce the spread of lies and other misinformation online.

The information environment will NOT improve – In the next 10 years, on balance, the information environment will NOT BE improved by changes designed to reduce the spread of lies and other misinformation online.

Some 1,116 responded to this nonscientific canvassing: 51% chose the option that the information environment will not improve, and 49% said the information environment will improve. (See “About this canvassing of experts” for details about this sample.) Participants were next asked to explain their answers. This report concentrates on these follow-up responses.
