Suppressing Conspiracy Theories on the Internet

Do you think anthropogenic global warming is a hoax? Are you unconvinced that your ancestors had more in common with Cheetah than with Tarzan? Have you any doubts about the official version of how 9/11 went down? Then you, according to Evgeny Morozov, are part of a “kooky” “fringe movement” whose growth must be checked by forcing you to read “authoritative” content whenever you go looking for information on such topics on the Internet.

Morozov is a visiting scholar at Stanford University, a contributing editor to Foreign Policy magazine, and a former fellow at George Soros’ Open Society Institute – in other words, a reliable bellwether of globalist establishment thinking. His musings in Slate – in which he argues that while outright censorship of the web may not be possible, getting browsers and search engines to direct people to establishment-approved opinions would be an excellent idea – offer “proof of how worried the bad guys are about popular disbelief in State pieties, and about sites … that stoke it,” Lew Rockwell averred, citing his own website as an example. The New American undoubtedly would fall under that rubric as well.

The problem, as Morozov sees it, is that people who “deny” global warming or think vaccines may cause autism – opinions that conflict with those proffered by governments, the United Nations, and other globalist organizations – can post anything they want on the Internet with “little or no quality control” over it. As a result, he says, there are “thousands of sites that undermine scientific consensus, overturn well-established facts, and promote conspiracy theories.”

In addition, Morozov worries that those searching for information on a disputed topic will, because of the way search engines are structured, tend to find sites giving the politically incorrect version of events first and may never get around to reading the “authoritative” sources on the subject. “Meanwhile,” he argues, “the move toward social search may further insulate regular visitors to such sites; discovering even more links found by their equally paranoid friends will hardly enlighten them.”

Then comes the big question with the foreordained answer: “Is it time for some kind of a quality control system?” Morozov, not surprisingly, replies strongly in the affirmative. Since dissuading those already committed to these outré views may be impossible, he thinks “resources should go into thwarting their growth by targeting their potential – rather than existent – members.” “Given that censorship of search engines is not an appealing or even particularly viable option” – note that he doesn’t say he opposes censorship per se – Morozov argues for changes to browsers and search engines that would notify users that they are about to see something that the self-appointed arbiters of acceptable opinion have deemed unfit for human consumption and, if possible, direct them elsewhere.

He suggests two approaches to ensuring that web searchers are not exposed to unapproved thoughts:

One is to train our browsers to flag information that may be suspicious or disputed. Thus, every time a claim like “vaccination leads to autism” appears in our browser, that sentence would be marked in red – perhaps, also accompanied by a pop-up window advising us to check a more authoritative source. The trick here is to come up with a database of disputed claims that itself would correspond to the latest consensus in modern science – a challenging goal that projects like “Dispute Finder” are tackling head on.
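Stripped of its rhetoric, the mechanism Morozov describes here amounts to string matching against a blacklist. The sketch below, in plain Python, shows roughly what such a browser-side "flagger" might look like; the claim entries, URLs, and function names are invented placeholders (neither Morozov nor Dispute Finder publishes any such code), and a real system would need far fuzzier matching than exact phrases.

    # Hypothetical database of "disputed claims." In Morozov's scheme this
    # would be curated to track "the latest consensus in modern science."
    DISPUTED_CLAIMS = {
        "vaccination leads to autism": "https://example.org/authoritative-source",
    }

    def flag_disputed_claims(page_text):
        """Wrap each known disputed claim in a red-highlighted HTML span
        that points the reader to a designated 'authoritative' source,
        roughly as the proposed pop-up warning would."""
        for claim, source in DISPUTED_CLAIMS.items():
            warning = ('<span style="color:red" title="Disputed claim - see '
                       '%s">%s</span>' % (source, claim))
            page_text = page_text.replace(claim, warning)
        return page_text

    print(flag_disputed_claims("Some sites say vaccination leads to autism."))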

The second – and not necessarily mutually exclusive – option is to nudge search engines to take more responsibility for their index and exercise a heavier curatorial control in presenting search results for issues like “global warming” or “vaccination.” Google already has a list of search queries that send most traffic to sites that trade in pseudoscience and conspiracy theories; why not treat them differently than normal queries? Thus, whenever users are presented with search results that are likely to send them to sites run by pseudoscientists or conspiracy theorists, Google may simply display a huge red banner asking users to exercise caution and check a previously generated list of authoritative resources before making up their minds.
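The second mechanism is equally simple, and a toy version makes its crudeness plain. The sketch below, again in plain Python with an invented query blacklist and placeholder URLs (it reflects no actual Google feature), checks a search query against a pre-compiled list and, on a match, pushes a cautionary banner and the anointed sources above the ordinary results.

    # Hypothetical blacklist mapping "suspect" queries to hand-picked
    # "authoritative" resources; nothing here is a real Google mechanism.
    FLAGGED_QUERIES = {
        "global warming": ["https://example.org/ipcc-report"],
        "vaccination": ["https://example.org/vaccine-safety-review"],
    }

    def render_results(query, results):
        """Return the result list, prepending a warning banner and the
        pre-approved links whenever the query is on the flagged list."""
        output = []
        resources = FLAGGED_QUERIES.get(query.lower().strip())
        if resources:
            output.append("*** CAUTION: results for this query often lead to "
                          "disputed content. Check these sources first: ***")
            output.extend(resources)
        output.extend(results)
        return output

    for line in render_results("Global Warming", ["http://dissenting-site.example"]):
        print(line)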
