2010-02-27
Marauder!
2010-01-31
Back Once Again
2009-11-20
The Sharpened Pixel
2009-11-10
Stay Classy Pat Robertson
So here's Pat, stirring up shit again. Denouncing Islam, which, last time I checked, has over 1 billion adherents, is not going to make the cause of Christianity any stronger. In fact, Pat, quite frankly, I think you can only make Christianity and its followers look worse.
To denounce Islam is absolutely absurd. As someone who, gasp, has lived and worked among people of the Islamic faith, I can tell you honestly: they are just as good, just as crazy, and no less human than any of the rest of us in the West, Christian, Jew, Buddhist, Hindu, or atheist.
I know a lot of this is happening in light of the Fort Hood shooting. My brother is stationed there. His wife and kids are there; it worries me too. I feel sick for the families affected, the fathers being put into early graves, and the tragedy of it all.
I feel sorry for Nidal Hasan as well. Just one poor bastard stuck in the middle of socio-religious ideologies at war. As surely as we have made Islam into our "hated enemy," they have repaid in kind.
Now ask yourselves, dear Christians: when shall we learn to turn the other cheek? To give kindness to any that would aid us? To do what is right by people, even if they would harm us? When will we stop making the same mistakes, stop seeing these people only as our enemy, and start seeing them as humans, no different and no worse, simply choosing to walk a different path?
2009-11-09
Stateside and the Titanic
If you really travel, really let go, you can really take in so much, see so much, expand yourself so much. I can't think of anything worse than for a man to never leave the comforts of the hearth. You don't know who you are really, deep down, until the juxtaposition is almost a part of you too.
But there is a danger in that. What if you never really liked who you were to begin with? What if in that mirror of yourself you see only an empty vessel, an object of derision? What if you only see what you were as incomplete, or obsolete? If you smash that image, I think you almost become a phantom. Not what you were, but no longer able to truly be anything else. You become lost, a man without an identity, a soul without a home.
I have seen the sunrise on the flip side enough times. Bathed in the water, eaten the food. Lived, loved, learned and taught. I hope the footprints on the foreign shore don't wash away too soon. But if you follow them long enough, you will see the way that points me back, back to my roots, back to my personal insanity, back to my family, back to the cradle of the mountains and the place I call home.
2009-10-14
10/GUI
10/GUI from C. Miller on Vimeo.
I think it's pretty cool. I haven't had a chance to see what Windows 7 actually does with multitouch; like 99.9999% of people, I don't have an interface that supports it. It does seem like 10/GUI has come up with a pretty interesting approach, and they left the keyboard there, which is nice.
The gearhead part of me hopes that in the future everything is like Guitar Hero: separate peripherals for every app. The guy that wants to live in an immaculate, uber-ergonomic Ikea future dome wants the one true interface, whatever that may be.*
In any case, it's 2010 next year. Is it the future yet?
*I think the answer may ultimately look like multitouch breasts, volume, tactility and one for each hand. I am looking right at you, Japan.
Reproduction
How Thinking Goes Wrong
Twenty-five Fallacies That Lead Us
to Believe Weird Things
by Michael Shermer
from his 1997 book "Why People Believe Weird Things"
(used by kind permission of the author; all rights reserved)
In 1994 NBC began airing a New Age program called The Other Side that explored claims of the paranormal, various mysteries and miracles, and assorted "weird" things. I appeared numerous times as the token skeptic -- the "other side" of The Other Side, if you will. On most talk shows, a "balanced" program is a half-dozen to a dozen believers and one lone skeptic as the voice of reason or opposition. The Other Side was no different, even though the executive producer, many of the program producers, and even the host were skeptical of most of the beliefs they were covering. I did one program on werewolves for which they flew in a fellow from England. He actually looked a little like what you see in werewolf movies -- big bushy sideburns and rather pointy ears -- but when I talked to him, I found that he did not actually remember becoming a werewolf. He recalled the experience under hypnosis. In my opinion, his was a case of false memory, either planted by the hypnotist or fantasized by the man.
Another program was on astrology. The producers brought in a serious, professional astrologer from India who explained how it worked using charts and maps with all the jargon. But, because he was so serious, they ended up featuring a Hollywood astrologer who made all sorts of predictions about the lives of movie stars. He also did some readings for members of the audience. One young lady was told that she was having problems staying in long-term relationships with men. During the break, she told me that she was fourteen years old and was there with her high-school class to see how television programs were produced.
In my opinion, most believers in miracles, monsters, and mysteries are not hoaxers, flimflam artists, or lunatics. Most are normal people whose normal thinking has gone wrong in some way.… I would like to ... [look] at twenty-five fallacies of thinking that can lead anyone to believe weird things. I have grouped them in four categories, listing specific fallacies and problems in each. But as an affirmation that thinking can go right, I begin with what I call Hume's Maxim and close with what I call Spinoza's Dictum.
Hume's Maxim
Skeptics owe a lot to the Scottish philosopher David Hume (1711-1776), whose An Enquiry Concerning Human Understanding is a classic in skeptical analysis. The work was first published anonymously in London in 1739 as A Treatise of Human Nature. In Hume's words, it "fell dead-born from the press, without reaching such distinction as even to excite a murmur among the zealots." Hume blamed his own writing style and reworked the manuscript into An Abstract of a Treatise of Human Nature, published in 1740, and then into Philosophical Essays Concerning the Human Understanding, published in 1748. The work still garnered no recognition, so in 1758 he brought out the final version, under the title An Enquiry Concerning Human Understanding, which today we regard as his greatest philosophical work.
Hume distinguished between "antecedent skepticism," such as René Descartes' method of doubting everything that has no "antecedent" infallible criterion for belief; and "consequent skepticism," the method Hume employed, which recognizes the "consequences" of our fallible senses but corrects them through reason: "A wise man proportions his belief to the evidence." Better words could not be found for a skeptical motto.
Even more important is Hume's foolproof, when-all-else-fails analysis of miraculous claims. For when one is confronted by a true believer whose apparently supernatural or paranormal claim has no immediately apparent natural explanation, Hume provides an argument that he thought so important that he placed his own words in quotes and called them a maxim:
The plain consequence is (and it is a general maxim worthy of our attention), "That no testimony is sufficient to establish a miracle, unless the testimony be of such a kind, that its falsehood would be more miraculous than the fact which it endeavors to establish."
When anyone tells me that he saw a dead man restored to life, I immediately consider with myself whether it be more probable, that this person should either deceive or be deceived, or that the fact, which he relates, should really have happened. I weigh the one miracle against the other, and according to the superiority, which I discover, I pronounce my decision, and always reject the greater miracle. If the falsehood of his testimony would be more miraculous than the event which he relates; then, and not till then, can he pretend to command my belief or opinion. ([1758] 1952, p. 491)
Problems in Scientific Thinking
1. Theory Influences Observations
About the human quest to understand the physical world, physicist and Nobel laureate Werner Heisenberg concluded, "What we observe is not nature itself but nature exposed to our method of questioning." In quantum mechanics, this notion has been formalized as the "Copenhagen interpretation" of quantum action: "a probability function does not prescribe a certain event but describes a continuum of possible events until a measurement interferes with the isolation of the system and a single event is actualized" (in Weaver 1987, p. 412). The Copenhagen interpretation eliminates the one-to-one correlation between theory and reality. The theory in part constructs the reality. Reality exists independent of the observer, of course, but our perceptions of reality are influenced by the theories framing our examination of it. Thus, philosophers call science theory laden.
That theory shapes perceptions of reality is true not only for quantum physics but also for all observations of the world. When Columbus arrived in the New World, he had a theory that he was in Asia and proceeded to perceive the New World as such. Cinnamon was a valuable Asian spice, and the first New World shrub that smelled like cinnamon was declared to be it. When he encountered the aromatic gumbo-limbo tree of the West Indies, Columbus concluded it was an Asian species similar to the mastic tree of the Mediterranean. A New World nut was matched with Marco Polo's description of a coconut. Columbus's surgeon even declared, based on some Caribbean roots his men uncovered, that he had found Chinese rhubarb. A theory of Asia produced observations of Asia, even though Columbus was half a world away. Such is the power of theory.
2. The Observer Changes the Observed
Physicist John Archibald Wheeler noted, "Even to observe so minuscule an object as an electron, [a physicist] must shatter the glass. He must reach in. He must install his chosen measuring equipment.… Moreover, the measurement changes the state of the electron. The universe will never afterward be the same" (in Weaver 1987, p. 427). In other words, the act of studying an event can change it. Social scientists often encounter this phenomenon. Anthropologists know that when they study a tribe, the behavior of the members may be altered by the fact that they are being observed by an outsider. Subjects in a psychology experiment may alter their behavior if they know what experimental hypotheses are being tested. This is why psychologists use blind and double-blind controls. Lack of such controls is often found in tests of paranormal powers and is one of the classic ways that thinking goes wrong in the pseudosciences. Science tries to minimize and acknowledge the effects of the observation on the behavior of the observed; pseudoscience does not.
3. Equipment Constructs Results
The equipment used in an experiment often determines the results. The size of our telescopes, for example, has shaped and reshaped our theories about the size of the universe. In the twentieth century, Edwin Hubble's 60- and 100-inch telescopes on Mt. Wilson in southern California for the first time provided enough seeing power for astronomers to distinguish individual stars in other galaxies, thus proving that those fuzzy objects called nebulas that we thought were in our own galaxy were actually separate galaxies. In the nineteenth century, craniometry defined intelligence as brain size and instruments were designed that measured it as such; today intelligence is defined by facility with certain developmental tasks and is measured by another instrument, the IQ test. Sir Arthur Stanley Eddington illustrated the problem with this clever analogy:
Let us suppose that an ichthyologist is exploring the life of the ocean. He casts a net into the water and brings up a fishy assortment. Surveying his catch, he proceeds in the usual manner of a scientist to systematize what it reveals. He arrives at two generalizations:
(1) No sea-creature is less than two inches long.
(2) All sea-creatures have gills.
In applying this analogy, the catch stands for the body of knowledge which constitutes physical science, and the net for the sensory and intellectual equipment which we use in obtaining it. The casting of the net corresponds to observations.
An onlooker may object that the first generalization is wrong. "There are plenty of sea-creatures under two inches long, only your net is not adapted to catch them." The ichthyologist dismisses this objection contemptuously. "Anything uncatchable by my net is ipso facto outside the scope of ichthyological knowledge, and is not part of the kingdom of fishes which has been defined as the theme of ichthyological knowledge. In short, what my net can't catch isn't fish." (1958, p. 16)
Likewise, what my telescope can't see isn't there, and what my test can't measure isn't intelligence. Obviously, galaxies and intelligence exist, but how we measure and understand them is highly influenced by our equipment.
Problems in Pseudoscientific Thinking
4. Anecdotes Do Not Make a Science
Anecdotes -- stories recounted in support of a claim -- do not make a science. Without corroborative evidence from other sources, or physical proof of some sort, ten anecdotes are no better than one, and a hundred anecdotes are no better than ten. Anecdotes are told by fallible human storytellers. Farmer Bob in Puckerbrush, Kansas, may be an honest, church-going, family man not obviously subject to delusions, but we need physical evidence of an alien spacecraft or alien bodies, not just a story about landings and abductions at 3:00 A.M. on a deserted country road. Likewise with many medical claims. Stories about how your Aunt Mary's cancer was cured by watching Marx Brothers movies or taking a liver extract from castrated chickens are meaningless. The cancer might have gone into remission on its own, which some cancers do; or it might have been misdiagnosed; or, or, or.… What we need are controlled experiments, not anecdotes. We need 100 subjects with cancer, all properly diagnosed and matched. Then we need 25 of the subjects to watch Marx Brothers movies, 25 to watch Alfred Hitchcock movies, 25 to watch the news, and 25 to watch nothing. Then we need to deduct the average rate of remission for this type of cancer and then analyze the data for statistically significant differences between the groups. If there are statistically significant differences, we better get confirmation from other scientists who have conducted their own experiments separate from ours before we hold a press conference to announce the cure for cancer.
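To make the last step concrete, here is a minimal sketch of one standard way to test the four groups for statistically significant differences: a chi-square test on a contingency table of remission counts. This is only an illustration of the kind of analysis Shermer describes, not his prescription; the remission numbers are invented, scipy is assumed to be available, and the 0.05 threshold is a conventional choice, not one given in the text.

```python
# Minimal sketch of comparing remission rates across the four hypothetical groups.
# The counts below are invented for illustration only.
from scipy.stats import chi2_contingency

# Rows: Marx Brothers, Hitchcock, news, nothing (25 patients each).
# Columns: [remission, no remission] -- hypothetical outcomes.
observed = [
    [4, 21],   # Marx Brothers
    [3, 22],   # Hitchcock
    [2, 23],   # news
    [3, 22],   # nothing
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")

# Only if p falls below a pre-chosen threshold (e.g. 0.05) -- and independent
# replications agree -- would a difference between groups be worth announcing.
if p_value < 0.05:
    print("Statistically significant difference between groups")
else:
    print("No statistically significant difference between groups")
```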
5. Scientific Language Does Not Make a Science
Dressing up a belief system in the trappings of science by using scientific language and jargon, as in "creation-science," means nothing without evidence, experimental testing, and corroboration. Because science has such a powerful mystique in our society, those who wish to gain respectability but do not have evidence try to do an end run around the missing evidence by looking and sounding "scientific." Here is a classic example from a New Age column in the Santa Monica News: "This planet has been slumbering for eons and with the inception of higher energy frequencies is about to awaken in terms of consciousness and spirituality. Masters of limitation and masters of divination use the same creative force to manifest their realities, however, one moves in a downward spiral and the latter moves in an upward spiral, each increasing the resonant vibration inherent in them." How's that again? I have no idea what this means, but it has the language components of a physics experiment: "higher energy frequencies," "downward and upward spirals," and "resonant vibration." Yet these phrases mean nothing because they have no precise and operational definitions. How do you measure a planet's higher energy frequencies or the resonant vibration of masters of divination? For that matter, what is a master of divination?
6. Bold Statements Do Not Make Claims True
Something is probably pseudoscientific if enormous claims are made for its power and veracity but supportive evidence is scarce as hen's teeth. L. Ron Hubbard, for example, opens his Dianetics: The Modern Science of Mental Health, with this statement: "The creation of Dianetics is a milestone for man comparable to his discovery of fire and superior to his invention of the wheel and arch" (in Gardner 1952, p. 263). Sexual energy guru Wilhelm Reich called his theory of Orgonomy "a revolution in biology and psychology comparable to the Copernican Revolution" (in Gardner 1952, p. 259). I have a thick file of papers and letters from obscure authors filled with such outlandish claims (I call it the "Theories of Everything" file). Scientists sometimes make this mistake, too, as we saw at 1:00 P.M., on March 23, 1989, when Stanley Pons and Martin Fleischmann held a press conference to announce to the world that they had made cold nuclear fusion work. Gary Taubes's excellent book about the cold fusion debacle, appropriately named Bad Science (1993), thoroughly examines the implications of this incident. Maybe fifty years of physics will be proved wrong by one experiment, but don't throw out your furnace until that experiment has been replicated. The moral is that the more extraordinary the claim, the more extraordinarily well-tested the evidence must be.
7. Heresy Does Not Equal Correctness
They laughed at Copernicus. They laughed at the Wright brothers. Yes, well, they laughed at the Marx brothers. Being laughed at does not mean you are right. Wilhelm Reich compared himself to Peer Gynt, the unconventional genius out of step with society, and misunderstood and ridiculed as a heretic until proven right: "Whatever you have done to me or will do to me in the future, whether you glorify me as a genius or put me in a mental institution, whether you adore me as your savior or hang me as a spy, sooner or later necessity will force you to comprehend that I have discovered the laws of the living" (in Gardner 1952, p. 259). Reprinted in the January-February 1996 issue of the Journal of Historical Review, the organ of Holocaust denial, is a famous quote from the nineteenth-century German philosopher Arthur Schopenhauer, which is quoted often by those on the margins: "All truth passes through three stages. First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as self-evident." But "all truth" does not pass through these stages. Lots of true ideas are accepted without ridicule or opposition, violent or otherwise. Einstein's theory of relativity was largely ignored until 1919, when experimental evidence proved him right. He was not ridiculed, and no one violently opposed his ideas. The Schopenhauer quote is just a rationalization, a fancy way for those who are ridiculed or violently opposed to say, "See, I must be right." Not so.
History is replete with tales of the lone scientist working in spite of his peers and flying in the face of the doctrines of his or her own field of study. Most of them turned out to be wrong and we do not remember their names. For every Galileo shown the instruments of torture for advocating a scientific truth, there are a thousand (or ten thousand) unknowns whose "truths" never pass muster with other scientists. The scientific community cannot be expected to test every fantastic claim that comes along, especially when so many are logically inconsistent. If you want to do science, you have to learn to play the game of science. This involves getting to know the scientists in your field, exchanging data and ideas with colleagues informally, and formally presenting results in conference papers, peer-reviewed journals, books, and the like.
8. Burden of Proof
Who has to prove what to whom? The person making the extraordinary claim has the burden of proving to the experts and to the community at large that his or her belief has more validity than the one almost everyone else accepts. You have to lobby for your opinion to be heard. Then you have to marshal experts on your side so you can convince the majority to support your claim over the one that they have always supported. Finally, when you are in the majority, the burden of proof switches to the outsider who wants to challenge you with his or her unusual claim. Evolutionists had the burden of proof for half a century after Darwin, but now the burden of proof is on creationists. It is up to creationists to show why the theory of evolution is wrong and why creationism is right, and it is not up to evolutionists to defend evolution. The burden of proof is on the Holocaust deniers to prove the Holocaust did not happen, not on Holocaust historians to prove that it did. The rationale for this is that mountains of evidence prove that both evolution and the Holocaust are facts. In other words, it is not enough to have evidence. You must convince others of the validity of your evidence. And when you are an outsider this is the price you pay, regardless of whether you are right or wrong.
9. Rumors Do Not Equal Reality
Rumors begin with "I read somewhere that ... " or "I heard from someone that.…" Before long the rumor becomes reality, as "I know that…" passes from person to person. Rumors may be true, of course, but usually they are not. They do make for great tales, however. There is the "true story" of the escaped maniac with a prosthetic hook who haunts the lover's lanes of America. There is the legend of "The Vanishing Hitchhiker," in which a driver picks up a hitchhiker who vanishes from his car along with his jacket; locals then tell the driver that his hitchhiking woman had died that same day the year before, and eventually he discovers his jacket on her grave. Such stories spread fast and never die.
Caltech historian of science Dan Kevles once told a story he suspected was apocryphal at a dinner party. Two students did not get back from a ski trip in time to take their final exam because the activities of the previous day had extended well into the night. They told their professor that they had gotten a flat tire, so he gave them a makeup final the next day. Placing the students in separate rooms, he asked them just two questions: (1) "For 5 points, what is the chemical formula for water?" (2) "For 95 points, which tire?" Two of the dinner guests had heard a vaguely similar story. The next day I repeated the story to my students and before I got to the punch line, three of them simultaneously blurted out, "Which tire?" Urban legends and persistent rumors are ubiquitous. Here are a few:
The secret ingredient in Dr. Pepper is prune juice.
A woman accidentally killed her poodle by drying it in a microwave oven.
Paul McCartney died and was replaced by a look-alike.
Giant alligators live in the sewers of New York City.
The moon landing was faked and filmed in a Hollywood studio.
George Washington had wooden teeth.
The number of stars inside the "P" on Playboy magazine's cover indicates how many times publisher Hugh Hefner had sex with the centerfold.
A flying saucer crashed in New Mexico and the bodies of the extraterrestrials are being kept by the Air Force in a secret warehouse.
How many have you heard … and believed? None of them are true.
10. Unexplained Is Not Inexplicable
Many people are overconfident enough to think that if they cannot explain something, it must be inexplicable and therefore a true mystery of the paranormal. An amateur archeologist declares that because he cannot figure out how the pyramids were built, they must have been constructed by space aliens. Even those who are more reasonable at least think that if the experts cannot explain something, it must be inexplicable. Feats such as the bending of spoons, firewalking, or mental telepathy are often thought to be of a paranormal or mystical nature because most people cannot explain them. When they are explained, most people respond, "Yes, of course" or "That's obvious once you see it." Firewalking is a case in point. People speculate endlessly about supernatural powers over pain and heat, or mysterious brain chemicals that block the pain and prevent burning. The simple explanation is that the capacity of light and fluffy coals to contain heat is very low, and the conductivity of heat from the light and fluffy coals to your feet is very poor. As long as you don't stand around on the coals, you will not get burned. (Think of a cake in a 450°F oven. The air, the cake, and the pan are all at 450°F, but only the metal pan will burn your hand. It has a high heat capacity and high conductivity, while air and cake are light and fluffy and have a low heat capacity and low conductivity.) This is why magicians do not tell their secrets. Most of their tricks are extremely simple and knowing the secret takes the magic out of the trick.
There are many genuine unsolved mysteries in the universe and it is okay to say, "We do not yet know but someday perhaps we will." The problem is that most of us find it more comforting to have certainty, even if it is premature, than to live with unsolved or unexplained mysteries.
11. Failures Are Rationalized
In science, the value of negative findings -- failures -- cannot be overemphasized. Usually they are not wanted, and often they are not published. But most of the time failures are how we get closer to truth. Honest scientists will readily admit their errors, but all scientists are kept in line by the fact that their fellow scientists will publicize any attempt to fudge. Not pseudoscientists. They ignore or rationalize failures, especially when exposed. If they are actually caught cheating -- not a frequent occurrence -- they claim that their powers usually work but not always, so when pressured to perform on television or in a laboratory, they sometimes resort to cheating. If they simply fail to perform, they have ready any number of creative explanations: too many controls in an experiment cause negative results; the powers do not work in the presence of skeptics; the powers do not work in the presence of electrical equipment; the powers come and go, and this is one of those times they went. Finally, they claim that if skeptics cannot explain everything, then there must be something paranormal; they fall back on the unexplained is not inexplicable fallacy.
Also known as "post hoc, ergo propter hoc," literally "after this, therefore because of this." At its basest level, it is a form of superstition. The baseball player does not shave and hits two home runs. The gambler wears his lucky shoes because he has won wearing them in the past. More subtly, scientific studies can fall prey to this fallacy. In 1993 a study found that breast-fed children have higher IQ scores. There was much clamor over what ingredient in mother's milk increased intelligence. Mothers who bottle-fed their babies were made to feel guilty. But soon researchers began to wonder whether breast-fed babies are attended to differently. Maybe nursing mothers spend more time with their babies and motherly vigilance was the cause behind the differences in IQ. As Hume taught us, the fact that two events follow each other in sequence does not mean they are connected causally. Correlation does not mean causation.
13. Coincidence
In the paranormal world, coincidences are often seen as deeply significant. "Synchronicity" is invoked, as if some mysterious force were at work behind the scenes. But I see synchronicity as nothing more than a type of contingency -- a conjuncture of two or more events without apparent design. When the connection is made in a manner that seems impossible according to our intuition of the laws of probability, we have a tendency to think something mysterious is at work.
But most people have a very poor understanding of the laws of probability. A gambler will win six in a row and then think he is either "on a hot streak" or "due to lose." Two people in a room of thirty people discover that they have the same birthday and conclude that something mysterious is at work. You go to the phone to call your friend Bob. The phone rings and it is Bob. You think, "Wow, what are the chances? This could not have been a mere coincidence. Maybe Bob and I are communicating telepathically." In fact, none of these coincidences are coincidences under the rules of probability. The gambler has predicted both possible outcomes, a fairly safe bet! The probability that two people in a room of thirty people will have the same birthday is 71 percent. And you have forgotten how many times Bob did not call under such circumstances, or someone else called, or Bob called but you were not thinking of him, and so on. As the behavioral psychologist B. F. Skinner proved in the laboratory, the human mind seeks relationships between events and often finds them even when they are not present. Slot machines are based on Skinnerian principles of intermittent reinforcement. The dumb human, like the dumb rat, only needs an occasional payoff to keep pulling the handle. The mind will do the rest.
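The 71 percent figure follows directly from the standard birthday-problem calculation. A minimal sketch is below; the function name is mine, and it assumes 365 equally likely birthdays and ignores leap years.

```python
# Probability that at least two people in a room share a birthday,
# computed as 1 minus the probability that all n birthdays differ.
# Assumes 365 equally likely birthdays; leap years are ignored.

def shared_birthday_probability(n: int, days: int = 365) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct

if __name__ == "__main__":
    p = shared_birthday_probability(30)
    print(f"P(shared birthday among 30 people) = {p:.3f}")  # ~0.706, i.e. about 71 percent
```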
As Aristotle said, "The sum of the coincidences equals certainty." We forget most of the insignificant coincidences and remember the meaningful ones. Our tendency to remember hits and ignore misses is the bread and butter of the psychics, prophets, and soothsayers who make hundreds of predictions each January 1. First they increase the probability of a hit by predicting mostly generalized sure bets like "There will be a major earthquake in southern California" or "I see trouble for the Royal Family." Then, next January, they publish their hits and ignore the misses, and hope no one bothers to keep track.
14. Representativeness
We must always remember the larger context in which a seemingly unusual event occurs, and we must always analyze unusual events for their representativeness of their class of phenomena. In the case of the "Bermuda Triangle," an area of the Atlantic Ocean where ships and planes "mysteriously" disappear, there is the assumption that something strange or alien is at work. But we must consider how representative such events are in that area. Far more shipping lanes run through the Bermuda Triangle than its surrounding areas, so accidents and mishaps and disappearances are more likely to happen in the area. As it turns out, the accident rate is actually lower in the Bermuda Triangle than in surrounding areas. Perhaps this area should be called the "Non-Bermuda Triangle." (See Kusche 1975 for a full explanation of this solved mystery.) Similarly, in investigating haunted houses, we must have a baseline measurement of noises, creaks, and other events before we can say that an occurrence is unusual (and therefore mysterious). I used to hear rapping sounds in the walls of my house. Ghosts? Nope. Bad plumbing. I occasionally hear scratching sounds in my basement. Poltergeists? Nope. Rats. One would be well-advised to first thoroughly understand the probable worldly explanation before turning to other-worldly ones.
Logical Problems in Thinking
15. Emotive Words and False Analogies
Emotive words are used to provoke emotion and sometimes to obscure rationality. They can be positive emotive words -- motherhood, America, integrity, honesty. Or they can be negative -- rape, cancer, evil, communist. Likewise, metaphors and analogies can cloud thinking with emotion or steer us onto a side path. A pundit talks about inflation as "the cancer of society" or industry "raping the environment." In his 1992 Democratic nomination speech, Al Gore constructed an elaborate analogy between the story of his sick son and America as a sick country. Just as his son, hovering on the brink of death, was nursed back to health by his father and family, America, hovering on the brink of death after twelve years of Reagan and Bush, was to be nurtured back to health under the new administration. Like anecdotes, analogies and metaphors do not constitute proof. They are merely tools of rhetoric.
16. Ad Ignorantiam
This is an appeal to ignorance or lack of knowledge and is related to the burden of proof and unexplained is not inexplicable fallacies, where someone argues that if you cannot disprove a claim it must be true. For example, if you cannot prove that there isn't any psychic power, then there must be. The absurdity of this argument comes into focus if one argues that if you cannot prove that Santa Claus does not exist, then he must exist. You can argue the opposite in a similar manner. If you cannot prove Santa Claus exists, then he must not exist. In science, belief should come from positive evidence in support of a claim, not lack of evidence for or against a claim.
Literally "to the man" and "you also," these fallacies redirect the focus from thinking about the idea to thinking about the person holding the idea. The goal of an ad hominem attack is to discredit the claimant in hopes that it will discredit the claim. Calling someone an atheist, a communist, a child abuser, or a neo-Nazi does not in any way disprove that person's statement. It might be helpful to know whether someone is of a particular religion or holds a particular ideology, in case this has in some way biased the research, but refuting claims must be done directly, not indirectly. If Holocaust deniers, for example, are neo-Nazis or anti-Semites, this would certainly guide their choice of which historical events to emphasize or ignore. But if they are making the claim, for example, that Hitler did not have a master plan for the extermination of European Jewry, the response "Oh, he is saying that because he is a neo-Nazi" does not refute the argument. Whether Hitler had a master plan or not is a question that can be settled historically. Similarly for tu quoque. If someone accuses you of cheating on your taxes, the answer "Well, so do you" is no proof one way or the other.
18. Hasty Generalization
In logic, the hasty generalization is a form of improper induction. In life, it is called prejudice. In either case, conclusions are drawn before the facts warrant it. Perhaps because our brains evolved to be constantly on the lookout for connections between events and causes, this fallacy is one of the most common of all. A couple of bad teachers mean a bad school. A few bad cars mean that brand of automobile is unreliable. A handful of members of a group are used to judge the entire group. In science, we must carefully gather as much information as possible before announcing our conclusions.
19. Overreliance on Authorities
We tend to rely heavily on authorities in our culture, especially if the authority is considered to be highly intelligent. The IQ score has acquired nearly mystical proportions in the last half century, but I have noticed that belief in the paranormal is not uncommon among Mensa members (the high-IQ club for those in the top 2 percent of the population); some even argue that their "Psi-Q" is also superior. Magician James Randi is fond of lampooning authorities with Ph.D.s -- once they are granted the degree, he says, they find it almost impossible to say two things: "I don't know" and "I was wrong." Authorities, by virtue of their expertise in a field, may have a better chance of being right in that field, but correctness is certainly not guaranteed, and their expertise does not necessarily qualify them to draw conclusions in other areas.
In other words, who is making the claim makes a difference. If it is a Nobel laureate, we take note because he or she has been right in a big way before. If it is a discredited scam artist, we give a loud guffaw because he or she has been wrong in a big way before. While expertise is useful for separating the wheat from the chaff, it is dangerous in that we might either (1) accept a wrong idea just because it was supported by someone we respect (false positive) or (2) reject a right idea just because it was supported by someone we disrespect (false negative). How do you avoid such errors? Examine the evidence.
20. Either-Or
Also known as the fallacy of negation or the false dilemma, this is the tendency to dichotomize the world so that if you discredit one position, the observer is forced to accept the other. This is a favorite tactic of creationists, who claim that life either was divinely created or evolved. Then they spend the majority of their time discrediting the theory of evolution so that they can argue that since evolution is wrong, creationism must be right. But it is not enough to point out weaknesses in a theory. If your theory is indeed superior, it must explain both the "normal" data explained by the old theory and the "anomalous" data not explained by the old theory. A new theory needs evidence in favor of it, not just against the opposition.
21. Circular Reasoning
Also known as the fallacy of redundancy, begging the question, or tautology, this is when the conclusion or claim is merely a restatement of one of the premises. Christian apologetics is filled with tautologies: Is there a God? Yes. How do you know? Because the Bible says so. How do you know the Bible is correct? Because it was inspired by God. In other words, God is because God is. Science also has its share of redundancies: What is gravity? The tendency for objects to be attracted to one another. Why are objects attracted to one another? Gravity. In other words, gravity is because gravity is. (In fact, some of Newton's contemporaries rejected his theory of gravity as being an unscientific throwback to medieval occult thinking.) Obviously, a tautological operational definition can still be useful. Yet, difficult as it is, we must try to construct operational definitions that can be tested, falsified, and refuted.
22. Reductio ad Absurdum and the Slippery Slope
Reductio ad absurdum is the refutation of an argument by carrying the argument to its logical end and so reducing it to an absurd conclusion. Surely, if an argument's consequences are absurd, it must be false. This is not necessarily so, though sometimes pushing an argument to its limits is a useful exercise in critical thinking; often this is a way to discover whether a claim has validity, especially if an experiment testing the actual reduction can be run. Similarly, the slippery slope fallacy involves constructing a scenario in which one thing leads ultimately to an end so extreme that the first step should never be taken. For example: Eating Ben & Jerry's ice cream will cause you to put on weight. Putting on weight will make you overweight. Soon you will weigh 350 pounds and die of heart disease. Eating Ben & Jerry's ice cream leads to death. Don't even try it. Certainly eating a scoop of Ben & Jerry's ice cream may contribute to obesity, which could possibly, in very rare cases, cause death. But the consequence does not necessarily follow from the premise.
Psychological Problems in Thinking
23. Effort Inadequacies and the Need for Certainty, Control, and Simplicity
Most of us, most of the time, want certainty, want to control our environment, and want nice, neat simple explanations. All this may have some evolutionary basis, but in a multifarious society with complex problems, these characteristics can radically oversimplify reality and interfere with critical thinking and problem solving. For example, I believe that paranormal beliefs and pseudoscientific claims flourish in market economies in part because of the uncertainty of the marketplace. According to James Randi, after communism collapsed in Russia there was a significant increase in such belief. Not only are the people now freer to try to swindle each other with scams and rackets but many truly believe they have discovered something concrete and significant about the nature of the world. Capitalism is a lot less stable a social structure than communism. Such uncertainties lead the mind to look for explanations for the vagaries and contingencies of the market (and life in general), and the mind often takes a turn toward the supernatural and paranormal.
Scientific and critical thinking does not come naturally. It takes training, experience, and effort, as Alfred Mander explained in his Logic for the Millions: "Thinking is skilled work. It is not true that we are naturally endowed with the ability to think clearly and logically -- without learning how, or without practicing. People with untrained minds should no more expect to think clearly and logically than people who have never learned and never practiced can expect to find themselves good carpenters, golfers, bridge players, or pianists" (1947, p. vii). We must always work to suppress our need to be absolutely certain and in total control and our tendency to seek the simple and effortless solution to a problem. Now and then the solutions may be simple, but usually they are not.
24. Problem-Solving Inadequacies
All critical and scientific thinking is, in a fashion, problem solving. There are numerous psychological disruptions that cause inadequacies in problem solving. Psychologist Barry Singer has demonstrated that when given the task of selecting the right answer to a problem after being told whether particular guesses are right or wrong, people:
A. Immediately form a hypothesis and look only for examples to confirm it.
B. Do not seek evidence to disprove the hypothesis.
C. Are very slow to change the hypothesis even when it is obviously wrong.
D. If the information is too complex, adopt overly simple hypotheses or strategies for solutions.
E. If there is no solution, if the problem is a trick and "right" and "wrong" is given at random, form hypotheses about coincidental relationships they observed. Causality is always found. (Singer and Abell 1981, p. 18)
If this is the case with humans in general, then we all must make the effort to overcome these inadequacies in solving the problems of science and of life.
25. Ideological Immunity, or the Planck Problem
In day-to-day life, as in science, we all resist fundamental paradigm change. Social scientist Jay Stuart Snelson calls this resistance an ideological immune system: "educated, intelligent, and successful adults rarely change their most fundamental presuppositions" (1993, p. 54). According to Snelson, the more knowledge individuals have accumulated, and the more well-founded their theories have become (and remember, we all tend to look for and remember confirmatory evidence, not counterevidence), the greater the confidence in their ideologies. The consequence of this, however, is that we build up an "immunity" against new ideas that do not corroborate previous ones. Historians of science call this the Planck Problem, after physicist Max Planck, who made this observation on what must happen for innovation to occur in science: "An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarized with the idea from the beginning" (1936, p. 97).
Psychologist David Perkins conducted an interesting correlational study in which he found a strong positive correlation between intelligence (measured by a standard IQ test) and the ability to give reasons for taking a point of view and defending that position; he also found a strong negative correlation between intelligence and the ability to consider other alternatives. That is, the higher the IQ, the greater the potential for ideological immunity. Ideological immunity is built into the scientific enterprise, where it functions as a filter against potentially overwhelming novelty. As historian of science I. B. Cohen explained, "New and revolutionary systems of science tend to be resisted rather than welcomed with open arms, because every successful scientist has a vested intellectual, social, and even financial interest in maintaining the status quo. If every revolutionary new idea were welcomed with open arms, utter chaos would be the result" (1985, p. 35).
In the end, history rewards those who are "right" (at least provisionally). Change does occur. In astronomy, the Ptolemaic geocentric universe was slowly displaced by Copernicus's heliocentric system. In geology, George Cuvier's catastrophism was gradually wedged out by the more soundly supported uniformitarianism of James Hutton and Charles Lyell. In biology, Darwin's evolution theory superseded creationist belief in the immutability of species. In Earth history, Alfred Wegener's idea of continental drift took nearly a half century to overcome the received dogma of fixed and stable continents. Ideological immunity can be overcome in science and in daily life, but it takes time and corroboration.
Spinoza's Dictum
Skeptics have the very human tendency to relish debunking what we already believe to be nonsense. It is fun to recognize other people's fallacious reasoning, but that's not the whole point. As skeptics and critical thinkers, we must move beyond our emotional responses because by understanding how others have gone wrong and how science is subject to social control and cultural influences, we can improve our understanding of how the world works. It is for this reason that it is so important for us to understand the history of both science and pseudoscience. If we see the larger picture of how these movements evolve and figure out how their thinking went wrong, we won't make the same mistakes. The seventeenth-century Dutch philosopher Baruch Spinoza said it best: "I have made a ceaseless effort not to ridicule, not to bewail, not to scorn human actions, but to understand them."