Bullspotting: Finding Facts in the Age of Misinformation

by Loren Collins

eBook

$13.99 (list price $18.00; save 22%)

Available on Compatible NOOK Devices and the free NOOK Apps.


Overview

This entertaining and educational book applies the tools of critical thinking to identify the common features and trends among misinformation campaigns. With illustrations drawn from conspiracy theorists and deniers of every stripe, the author teaches readers how rumors are started and how to recognize the rhetorical techniques and logical fallacies often found in misleading or outright false claims. What distinguishes real conspiracies from conspiracy theories, real science from pseudoscience, and actual history from bogus accounts purporting to be history? How does one evaluate the credibility of rumors and quotes or judge the soundness of legal arguments advanced by tax deniers? Readers will learn how to make these critical distinctions and also how to spot "evidence" that has been manufactured or manipulated in some way to create a false impression.

At a time when average citizens are bombarded with false information every day, this entertaining book will prove to be not only a great read but also an indispensable resource.

Product Details

ISBN-13: 9781616146351
Publisher: Prometheus Books
Publication date: 10/30/2012
Sold by: Barnes & Noble
Format: eBook
Pages: 267
File size: 1 MB

About the Author

Loren Collins (Atlanta, GA) has been published in the Atlanta Journal-Constitution on the topics of misinformation and critical thinking. An attorney and firm associate with the Law Office of W. Bryant Green, III, P.C., he is the creator of www.BirthofaNotion.com, a website that debunks the fallacies propounded by "birthers" regarding the legitimacy of President Barack Obama's US citizenship.

Read an Excerpt

Bullspotting

FINDING FACTS in the Age of MISINFORMATION
By LOREN COLLINS

Prometheus Books

Copyright © 2012 Loren Collins
All rights reserved.

ISBN: 978-1-61614-634-4


Chapter One

BALONEY DETECTION

Every man should have a built-in automatic crap detector operating inside him. —Robert Manning, "Hemingway in Cuba," 1965

Ernest Hemingway said this, as reported in Robert Manning's 1965 Atlantic article. He was talking specifically about the art of writing at the time, but his advice is worthwhile on a far larger scale. Crap detection is not only a valuable tool; it's an essential one. We're constantly bombarded with information, much of it unreliable, from every source imaginable. Television, radio, print news, word of mouth, and especially the Internet are continuous founts of misinformation. Of crap. Operating in the world efficiently depends on one's ability to identify that crap, to spot bull, to know what to trust and what to be suspicious of. But how does one go about making those distinctions as automatically as Hemingway suggested we should?

When I told a friend that I was writing a book on recognizing misinformation, his response was, "You only need one thing: common sense." This was not an undereducated friend; he is a lawyer with a master's degree in business. But he's dead wrong. It's immediately tempting to agree that common sense is all that's needed to tell fact from fiction, but it's common sense that's driving that sense of agreement.

Common sense is, unfortunately, often unreliable. Indeed, it's so unreliable that humanity has developed a mechanism to try to overcome the common errors of common sense—a means by which we can cut through the morass of prejudices and blind spots and unconscious assumptions, and discover the true reality underneath. And we've given that antidote for common sense a name: science. Or, to put it more broadly, the scientific method. A question is posed. Research is conducted. A hypothesis is constructed. The hypothesis is tested. Analysis is done, and a conclusion is drawn. And that conclusion can then be validated or invalidated through further iterations of the scientific method.

It's not perfectly suited to all aspects of life, but the power of the scientific method is evidenced by its results. In under seventy years, humanity went from putting the first man in flight to putting the first man on the moon. In fewer than fifty years, we went from discovering the structure of DNA to successfully cloning a large mammal. Common sense, by contrast, told us for most of human history that the world was flat. That the stars moved, but not the earth. That the world was composed of but four elements: earth, fire, wind, and water. That life had existed for only a few thousand years. That illness was caused by supernatural forces. That certain races of people were inferior. That women were inferior to men. That magic existed.

For millennia, these were our accepted truths. Humanity's knowledge was largely governed by common sense, and progress was slow. It was science—and its tools for rationally examining our universe and uncovering its undiscovered truths—that propelled the rapid change of the last few hundred years.

That's not to say that common sense is useless or that it leads only to false answers. Common sense is frequently helpful and does sometimes aid in making legitimate observations about the world. For instance, common sense might recognize that the consumption of certain natural plants tends to be followed by helpful (or harmful) aftereffects and might thus conclude that the plant is medicinal or poisonous. Identifying that pattern can help lead to the discovery of an underlying truth. Still, it's no guarantee. Hypotheses, after all, are essentially the operation of common sense. We look at existing data and patterns, and we draw a tentative conclusion. Some hypotheses turn out to be correct; many don't. Common sense is the same way.

Perhaps the biggest change that science has provided in the past century has been the introduction of the Internet. It's often said that such developments make the world smaller, but on the individual level, this has made the average person's world immensely larger. Historically, people tended to have local friends, read local newspapers, follow local events. Our lives are no longer so insular.

Our access to information has grown exponentially, and so, simultaneously, has our access to misinformation. Entire libraries of scholarship are available online, but so are innumerable amateur blogs. Legitimate and respectable news sources are ever-more convenient, but ideological and agenda-driven websites offer up factually questionable propaganda posing as news. Video archives preserve the past, but inexpensive cameras and editing software now permit even the nuttiest conspiracy theorist to produce a polished video presentation that can be visually compelling.

Unfortunately, most of us are not conditioned to distinguish between trustworthy and untrustworthy information, particularly given the avalanche of new facts we're presented with each day. So we turn to logical shortcuts. We favor information from people or sources we like. We distrust information that's inconsistent with our personal biases and beliefs. We accept information when it supports a conclusion we like, and we deny it when it supports the opposite. We fall back on common sense.

Using such shortcuts is not necessarily wrong. Indeed, it's often necessary, given the sheer volume of information we're confronted with each day. One can hardly be expected to research and validate every new piece of information encountered; life would be a perpetual series of mundane research projects on insignificant subjects.

Still, there are ways that misinformation can be spotted and singled out for further review. A properly trained skeptical eye is always on the lookout for suspect information and is possessed of the tools to evaluate it. Then, even if a firm answer cannot be easily found, the skeptic knows to keep an open mind as to the validity of the new fact and can avoid treating it as a confirmed truth or a proven falsehood. He or she can avoid being a duped participant in the further spread of misinformation.

Indeed, while the Internet has facilitated the spread of misinformation, there are simultaneously ever more resources that examine, challenge, and debunk false claims. Barbara and David Mikkelson founded the website Snopes® in 1995, with the original mission of addressing urban legends circulating on the web, particularly through e-mails. Today, with a scope that is far broader than just urban legends, Snopes remains the web's go-to source for the lowdown on the latest popular rumors. And despite the site being an independent operation, the Mikkelsons are regularly attacked by cranks for their debunking. Similar websites, such as TruthOrFiction.com and About.com's Urban Legends page, are also good resources that have been challenging false claims since the late 1990s.

It wasn't until the 2000s that websites emerged devoted to a field where misinformation is not only endemic but also has the potential to be far more influential than a simple chain e-mail: politics. Elected officials, political commentators, interest groups ... they all regularly spin the truth to suit their agendas, and during election years, that spin gets broadcast directly into the public's homes as part of every campaign. FactCheck.org was created to evaluate the claims made by candidates and officials, and in 2004, it garnered national attention by being cited in a televised vice-presidential debate. In 2007, it was joined by PolitiFact.com, which graded political claims on its "Truth-O-Meter" scale, ranging from "True" to "Pants on Fire."

Debunking even made its way onto television with the popular Discovery Channel show Mythbusters. The show premiered in 2003; each episode takes on a handful of rumors or myths, and the hosts conduct experiments to see if the claims stand up to scrutiny. Depending on the result, the myth is declared Busted, Plausible, or Confirmed. And while the show's experiments are not scientific in the strictest sense, cohost Adam Savage has noted that the show roughly follows the scientific method: they take a claim, develop a hypothesis, make a prediction, conduct some tests, and evaluate their findings. It's condensed and formatted for broadcast purposes, but the end result is that Mythbusters turns skepticism and critical thinking into enjoyable entertainment. (Having regular explosions doesn't hurt, either.)

The show's methodology also illustrates a critical point: "debunking" may be a result, but it shouldn't be a goal. That's why the word debunking is sometimes frowned upon by the skeptical community, since it implies that one begins with the intention of proving a claim to be false. Such firm commitment to a particular result is not scientific; starting with a preconceived outcome is, rather, the pseudoscientist's approach to evidence.

The scientific approach is to begin with a hypothesis. For most purposes, it's sufficient to use what is called a "null hypothesis," a default proposal that two phenomena are not related, or that a proposed process will not work as promised, or that a speculated fact is not accurate. A null hypothesis could be that crop circles were not created by aliens, that TWA Flight 800 was not felled by a missile, or that the president was not born in Kenya. From that starting point, one evaluates the evidence to see if the null hypothesis can be rejected—if there's not enough evidence to reject it, then the question remains open for further investigation.

Unfortunately, the human mind doesn't operate this way automatically, and even when it tries, it is prone to making errors. Not only that, but the human mind is conditioned to make very particular kinds of errors when evaluating questions, and those errors lead to incorrect beliefs.

Prehistoric humans had no concept of science or any sort of organized system of reasoning. But they were observers of the world around them, and they were capable of recognizing patterns in their environments. The sun would rise and set. The seasons changed in a predictable fashion. Clouds would appear in the sky before it rained. Certain animals might migrate and return at the same times during the seasonal cycles. Poisonous plants and animals could be identified by detecting the morbid pattern of when someone fell ill or died after eating them. Food could be located by identifying the patterns of where certain plants were likely to grow, or where certain animals were prone to live. Making these simple and basic associations helped ancient humans live and better understand their world.

But while we were evolving to recognize valid patterns, we weren't evolving the equivalent skill of rejecting invalid ones. If a man heard a suspicious sound and believed it was a tiger instead of a rodent, he did not particularly suffer if he responded as if it were a tiger. On the other hand, if he failed to sense a pattern between "sound" and "tiger" and he was wrong, then he was cat food. The man who saw a false pattern lived to spread his suspicious genes; the man who failed to see a real pattern won the Paleolithic equivalent of a Darwin Award. People thus evolved to spot patterns, even when they weren't there, and so began our love affair with misinformation. Out of our predilection for making associations, even false ones, came superstitions and some of the earliest forms of pseudoscience, like astrology.

Michael Shermer, founder of Skeptic magazine and author of The Believing Brain, termed this phenomenon patternicity, which he defined as "the tendency to find meaningful patterns in both meaningful and meaningless noise." Shermer further named another psychological phenomenon inherited from our ancient ancestors. Agenticity, as he called it, is "the tendency to infuse patterns with meaning, intention, and agency." He illustrates this with the same "cat in the grass" example: when the ancient wanderer heard the rustle, it was advantageous to his survival to assume that there was a conscious agent behind the sound (a predatory cat) rather than an inanimate force (the wind).

We thus developed the tendency to draw associations not only between physical events but also between physical events and perceived motives behind those events. Superstitions were imbued with supernatural elements, invoking the existence of intelligent forces outside our sight that could be credited as the causes of the effects we observed. There came to be supernatural powers that caused the seasons to change, or caused crops to die, or brought famine. Special rituals could please the gods and earn their good favor, like dances that could bring rain or shamanic practices that could heal the sick. Animals and trees, the sun and the moon, even the earth itself were imbued with spiritual powers.

From these supernatural beliefs, then, grew bodies of myths. Humans are narrative creatures by nature; Professor Walter Fisher coined the term Homo narrans, the storytelling man. As generations of superstitious practice and observations accumulated, they were assembled into narratives. Impersonal natural spirits became pantheons of gods with names and personalities and individual histories unto themselves. Over time, humanity's tendency toward agenticity did not merely ascribe anonymous agency to worldly events; it also personalized those events. To the ancient Greeks, lightning and thunder were the works of Zeus; the seas were controlled by Poseidon; and so on.

These same tendencies toward pattern seeking and myth making underlie much of modern misinformation as well. Conspiracy theories are deeply rooted in the desire to find agenticity in accidental events, or to reassign agenticity to a more comfortable source, thereby creating their own tiny mythologies. Rumors are created when false associations are made, either in drawing incorrect conclusions from one's own observations or by misinterpreting information provided from some other source. Pseudo-scholarship is built on a foundation of faulty patterns and conclusions, and on a willingness to believe myths offered up through poor research rather than accepted truths that are backed by better and broader information.

Since evolution did not provide us with a crap detector that counteracts our worst tendencies toward misinformation, we have to train ourselves to spot misinformation as it presents itself. We have to hone our own crap detectors and learn the skills of proper skepticism and critical thinking.

Spotting misinformation in advance is essential to avoiding it, for the same reason that we're so susceptible to it. The human mind is prone to a multitude of cognitive biases that influence how we think. We will view information differently depending on how it's framed; survey questions can get different results based on nothing more than how the questions are worded. We tend to give more weight to recent events than to older ones. The phenomenon of pareidolia causes us to see or hear messages in random data, allowing the image of Jesus to appear on a taco or satanic messages to be heard when a record is played backward.

But according to Shermer, the king of all cognitive biases is the confirmation bias. Once we believe something, it is not in our nature to equally seek out all new evidence as it comes along and evaluate it neutrally. When we do look for new information, we put our best efforts toward finding information that supports the things we already believe. We make little or no effort toward discovering whether there is credible evidence that contradicts our beliefs. When we're confronted with information and evidence that undermines our already-held beliefs, we try to undermine the value of that information. We question its credibility, we interpret it in the least favorable light, or we just ignore it entirely. And at the same time, we're prone to give every benefit of the doubt to each tidbit of evidence that seems to support our existing beliefs, no matter how questionable its credibility may be or how much we have to bend it to suit our needs.

This bias toward confirmation of our beliefs drastically hinders our ability to rid ourselves of false beliefs. A 2005–2006 study on political beliefs produced the discomfiting finding that subjects presented with corrective information did not have their beliefs corrected. In the first study, subjects were presented with a mock news article suggesting that Saddam Hussein had stockpiles of WMDs prior to the 2003 invasion of Iraq, with a correction appended to the article's end saying that no such stockpiles were found to exist. Not only were conservative subjects not persuaded by the correction; there was a "backfire effect," as the study authors called it, by which the conservative subjects who read the correction became more likely to believe the WMDs were real.

The researchers found similar resistance among their liberal subjects as well. In another study, they used a mock article stating that President George W. Bush had banned stem-cell research, with a correction that stated he had not done so. Conservatives were positively influenced by the correction, but liberal subjects did not change their beliefs in response to the correction (although they did not display any "backfire effect" in the study).

(Continues...)



Excerpted from Bullspotting by LOREN COLLINS Copyright © 2012 by Loren Collins. Excerpted by permission of Prometheus Books. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Introduction

Chapter 1 Baloney Detection

Chapter 2 Denialism

Chapter 3 Conspiracy Theories

Chapter 4 Rumors

Chapter 5 Quotations

Chapter 6 Hoaxes

Chapter 7 Pseudoscience

Chapter 8 Pseudohistory

Chapter 9 Pseudolaw

Chapter 10 What's the Harm?

Notes

Index
