Logical Fallacies

= Fallacies =

This handout discusses common logical fallacies that you may encounter in your own writing or in the writing of others. It provides definitions, examples, and tips on avoiding these fallacies.

Most academic writing tasks require you to make an argument—that is, to present reasons for a particular claim or interpretation you are putting forward. You may have been told that you need to make your arguments more logical or stronger. And you may have worried that you simply aren't a logical person or wondered what it means for an argument to be strong. Learning to make the best arguments you can is an ongoing process, but it isn't impossible: "Being logical" is something //anyone// can do, with practice! Each argument you make is composed of //premises// (statements that express your reasons or evidence) that are arranged in the right way to support your //conclusion// (the main claim or interpretation you are offering). You can make your arguments stronger by:

 * 1) **using good premises** (ones you have good reason to believe are both true and relevant to the issue at hand),
 * 2) making sure your premises **provide good support for your conclusion** (and not some other conclusion, or no conclusion at all),
 * 3) checking that you have **addressed the most important or relevant aspects** of the issue (that is, that your premises and conclusion focus on what is really important to the issue you're arguing about), and
 * 4) **not making claims that are so strong or sweeping that you can't really support them**.

You also need to be sure that you present all of your ideas in an orderly fashion that readers can follow. See our handouts on argument and organization for some tips that will improve your arguments. This handout describes some ways in which arguments often fail to do the things listed above; these failings are called fallacies. If you're having trouble developing your argument, check to see whether a fallacy is part of the problem! It is particularly easy to slip up and commit a fallacy when you have strong feelings about your topic—if a conclusion seems obvious to you, you're more likely to just assume that it is true and to be careless with your evidence. To help you see how people commonly make this mistake, this handout uses a number of controversial political examples—arguments about subjects like abortion, gun control, the death penalty, gay marriage, euthanasia, and pornography. The purpose of this handout, though, is not to argue for any particular position on any of these issues; rather, it is to illustrate weak reasoning, which can happen in pretty much any kind of argument!

//Please be aware that the claims in these examples are just made-up illustrations—they haven't been researched, so you shouldn't use them as evidence in your own writing.//

Fallacies are defects that weaken arguments. By learning to look for them in your own and others' writing, you can strengthen your ability to evaluate the arguments you make, read, and hear. It is important to realize two things about fallacies: First, fallacious arguments are very, very common and can be quite persuasive, at least to the casual reader or listener. You can find dozens of examples of fallacious reasoning in newspapers, advertisements, and other sources. Second, it is sometimes hard to evaluate whether an argument is fallacious. An argument might be very weak, somewhat weak, somewhat strong, or very strong. An argument that has several stages or parts might have some strong sections and some weak ones. The goal of this handout, then, is not to teach you how to label arguments as fallacious or fallacy-free, but to help you look critically at your own arguments and move them away from the "weak" and toward the "strong" end of the continuum.

For each fallacy listed, there is a definition or explanation, an example, and a tip on how to avoid committing the fallacy in your own arguments.



 * Hasty Generalization

 * **Definition**: Making assumptions about a whole group or range of cases based on a sample that is inadequate (usually because it is atypical or just too small). Stereotypes about people ("frat boys are drunkards," "grad students are nerdy," etc.) are a common example of the principle underlying hasty generalization.
 * **Example**: "My roommate said her philosophy class was hard, and the one I'm in is hard, too. All philosophy classes must be hard!" Two people's experiences are, in this case, not enough on which to base a conclusion.
 * **Tip**: Ask yourself what kind of "sample" you're using: Are you relying on the opinions or experiences of just a few people, or your own experience in just a few situations? If so, consider whether you need more evidence, or perhaps a less sweeping conclusion. (Notice that in the example, the more modest conclusion "//Some// philosophy classes are hard for //some// students" would not be a hasty generalization.)

 * Missing the Point

 * **Definition**: The premises of an argument do support a particular conclusion—but not the conclusion that the arguer actually draws.
 * **Example**: "The seriousness of a punishment should match the seriousness of the crime. Right now, the punishment for drunk driving may simply be a fine. But drunk driving is a very serious crime that can kill innocent people. So the death penalty should be the punishment for drunk driving." The argument actually supports several conclusions—"The punishment for drunk driving should be very serious," in particular—but it doesn't support the claim that the death penalty, specifically, is warranted.
 * **Tip**: Separate your premises from your conclusion. Looking at the premises, ask yourself what conclusion an objective person would reach after reading them. Looking at your conclusion, ask yourself what kind of evidence would be required to support such a conclusion, and then see if you've actually given that evidence. Missing the point often occurs when a sweeping or extreme conclusion is being drawn, so be especially careful if you know you're claiming something big.

 * Post Hoc (also called False Cause)

 * **Definition**: This fallacy gets its name from the Latin phrase "//post hoc, ergo propter hoc//," which translates as "after this, therefore because of this." It consists of assuming that because B comes after A, A caused B. Of course, sometimes one event really does cause another one that comes later—for example, if I register for a class, and my name later appears on the roll, it's true that the first event caused the one that came later. But sometimes two events that seem related in time aren't really related as cause and effect. That is, correlation isn't the same thing as causation.
 * **Example**: "President Jones raised taxes, and then the rate of violent crime went up. Jones is responsible for the rise in crime." The increase in taxes might or might not be one factor in the rising crime rates, but the argument hasn't shown us that one caused the other.
 * **Tip**: To avoid the //post hoc// fallacy, the arguer would need to give us some explanation of the process by which the tax increase is supposed to have produced higher crime rates. And that's what you should do to avoid committing this fallacy: If you say that A causes B, you should have something more to say about how A caused B than just that A came first and B came later!

 * Slippery Slope

 * **Definition**: The arguer claims that a sort of chain reaction, usually ending in some dire consequence, will take place, but there's really not enough evidence for that assumption. The arguer asserts that if we take even one step onto the "slippery slope," we will end up sliding all the way to the bottom; he or she assumes we can't stop halfway down the hill. Like //post hoc//, slippery slope can be a tricky fallacy to identify, since sometimes a chain of events really can be predicted to follow from a certain action. Here's an example that doesn't seem fallacious: "If I fail English 101, I won't be able to graduate. If I don't graduate, I probably won't be able to get a good job, and I may very well end up doing temp work or flipping burgers for the next year."
 * **Example**: "Animal experimentation reduces our respect for life. If we don't respect life, we are likely to be more and more tolerant of violent acts like war and murder. Soon our society will become a battlefield in which everyone constantly fears for their lives. It will be the end of civilization. To prevent this terrible consequence, we should make animal experimentation illegal right now." Since animal experimentation has been legal for some time and civilization has not yet ended, it seems particularly clear that this chain of events won't necessarily take place. Even if we believe that experimenting on animals reduces respect for life, and loss of respect for life makes us more tolerant of violence, that may be the spot on the hillside at which things stop—we may not slide all the way down to the end of civilization. And so we have not yet been given sufficient reason to accept the arguer's conclusion that we must make animal experimentation illegal right now.
 * **Tip**: Check your argument for chains of consequences, where you say "if A, then B, and if B, then C," and so forth. Make sure these chains are reasonable.

 * Weak Analogy

 * **Definition**: Many arguments rely on an analogy between two or more objects, ideas, or situations. If the two things that are being compared aren't really alike in the relevant respects, the analogy is a weak one, and the argument that relies on it commits the fallacy of weak analogy. If you think about it, you can make an analogy of some kind between almost any two things in the world: "My paper is like a mud puddle because they both get bigger when it rains (I work more when I'm stuck inside) and they're both kind of murky." So the mere fact that you draw an analogy between two things doesn't prove much, by itself. Arguments by analogy are often used in discussing abortion—arguers frequently compare fetuses with adult human beings, and then argue that treatment that would violate the rights of an adult human being also violates the rights of fetuses. Whether these arguments are good or not depends on the strength of the analogy: Do adult humans and fetuses share the property that gives adult humans rights? If the property that matters is having a human genetic code or the potential for a life full of human experiences, adult humans and fetuses do share that property, so the argument and the analogy are strong; if the property is being self-aware, rational, or able to survive on one's own, adult humans and fetuses don't share it, and the analogy is weak.
 * **Example**: "Guns are like hammers—they're both tools with metal parts that could be used to kill someone. And yet it would be ridiculous to restrict the purchase of hammers—so restrictions on purchasing guns are equally ridiculous." While guns and hammers do share certain features, these features (having metal parts, being tools, and being potentially useful for violence) are not the ones at stake in deciding whether to restrict guns. Rather, we restrict guns because they can easily be used to kill large numbers of people at a distance. This is a feature hammers do not share—it'd be hard to kill a crowd with a hammer. Thus, the analogy is weak, and so is the argument based on it.
 * **Tip**: Identify what properties are important to the claim you're making, and see whether the two things you're comparing both share those properties.


 * Appeal to Authority

 * **Definition**: Often we add strength to our arguments by referring to respected sources or authorities and explaining their positions on the issues we're discussing. If, however, we try to get readers to agree with us simply by impressing them with a famous name or by appealing to a supposed authority who really isn't much of an expert, we commit the fallacy of appeal to authority.
 * **Example**: "We should abolish the death penalty. Many respected people, such as actor Guy Handsome, have publicly stated their opposition to it." While Guy Handsome may be an authority on matters having to do with acting, there's no particular reason why anyone should be moved by his political opinions—he is probably no more of an authority on the death penalty than the person writing the paper.
 * **Tip**: There are two easy ways to avoid committing appeal to authority: First, make sure that the authorities you cite are experts on the subject you're discussing. Second, rather than just saying "Dr. Authority believes x, so we should believe it, too," try to explain the reasoning or evidence that the authority used to arrive at his or her opinion. That way, your readers have more to go on than a person's reputation. It also helps to choose authorities who are perceived as fairly neutral or reasonable, rather than people who will be perceived as biased.


 * Ad Populum

 * **Definition**: The Latin name of this fallacy means "to the people." There are several versions of the //ad populum// fallacy, but what they all have in common is that in them, the arguer takes advantage of the desire most people have to be liked and to fit in with others and uses that desire to try to get the audience to accept his or her argument. One of the most common versions is the bandwagon fallacy, in which the arguer tries to convince the audience to do or believe something because everyone else (supposedly) does.
 * **Example**: "Gay marriages are just immoral. 70% of Americans think so!" While the opinion of most Americans might be relevant in determining what laws we should have, it certainly doesn't determine what is moral or immoral: There was a time when a substantial number of Americans were in favor of segregation, but their opinion was not evidence that segregation was moral. The arguer is trying to get us to agree with the conclusion by appealing to our desire to fit in with other Americans.
 * **Tip**: Make sure that you aren't recommending that your audience believe your conclusion because everyone else believes it, all the cool people believe it, people will like you better if you believe it, and so forth. Keep in mind that the popular opinion is not always the right one!


 * Ad hominem and tu quoque

 * **Definitions**: Like the appeal to authority and //ad populum// fallacies, the //ad hominem// ("against the person") and //tu quoque// ("you, too!") fallacies focus our attention on people rather than on arguments or evidence. In both of these arguments, the conclusion is usually "You shouldn't believe So-and-So's argument." The reason for not believing So-and-So is that So-and-So is either a bad person (//ad hominem//) or a hypocrite (//tu quoque//). In an //ad hominem// argument, the arguer attacks his or her opponent instead of the opponent's argument. In a //tu quoque// argument, the arguer points out that the opponent has actually done the thing he or she is arguing against, and so the opponent's argument shouldn't be listened to. Here's an example: Imagine that your parents have explained to you why you shouldn't smoke, and they've given a lot of good reasons—the damage to your health, the cost, and so forth. You reply, "I won't accept your argument, because you used to smoke when you were my age. You did it, too!" The fact that your parents have done the thing they are condemning has no bearing on the premises they put forward in their argument (smoking harms your health and is very expensive), so your response is fallacious.
 * **Example**: "Andrea Dworkin has written several books arguing that pornography harms women. But Dworkin is an ugly, bitter person, so you shouldn't listen to her." Dworkin's appearance and character, which the arguer has characterized so ungenerously, have nothing to do with the strength of her argument, so using them as evidence is fallacious.
 * **Tip**: Be sure to stay focused on your opponents' reasoning, rather than on their personal character. (The exception to this is, of course, if you are making an argument about someone's character—if your conclusion is "President Clinton is an untrustworthy person," premises about his untrustworthy acts are relevant, not fallacious.)


 * Appeal to Pity

 * **Definition**: The appeal to pity takes place when an arguer tries to get people to accept a conclusion by making them feel sorry for someone.
 * **Examples**: "I know the exam is graded based on performance, but you should give me an A. My cat has been sick, my car broke down, and I've had a cold, so it was really hard for me to study!" The conclusion here is "You should give me an A." But the criteria for getting an A have to do with learning and applying the material from the course; the principle the arguer wants us to accept (people who have a hard week deserve A's) is clearly unacceptable. The information the arguer has given might //feel// relevant and might even get the audience to consider the conclusion—but the information isn't logically relevant, and so the argument is fallacious. Here's another example: "It's wrong to tax corporations—think of all the money they give to charity, and of the costs they already pay to run their businesses!"
 * **Tip**: Make sure that you aren't simply trying to get your audience to agree with you by making them feel sorry for someone.


 * Appeal to Ignorance

 * **Definition**: In the appeal to ignorance, the arguer basically says, "Look, there's no conclusive evidence on the issue at hand. Therefore, you should accept my conclusion on this issue."
 * **Example**: "People have been trying for centuries to prove that God exists. But no one has yet been able to prove it. Therefore, God does not exist." Here's an opposing argument that commits the same fallacy: "People have been trying for years to prove that God does not exist. But no one has yet been able to prove it. Therefore, God exists." In each case, the arguer tries to use the lack of evidence as support for a positive claim about the truth of a conclusion. There is one situation in which doing this is not fallacious: If qualified researchers have used well-thought-out methods to search for something for a long time, they haven't found it, and it's the kind of thing people ought to be able to find, then the fact that they haven't found it constitutes some evidence that it doesn't exist.
 * **Tip**: Look closely at arguments where you point out a lack of evidence and then draw a conclusion from that lack of evidence.


 * Straw Man

 * **Definition**: One way of making our own arguments stronger is to anticipate and respond in advance to the arguments that an opponent might make. In the straw man fallacy, the arguer sets up a wimpy version of the opponent's position and tries to score points by knocking it down. But just as being able to knock down a straw man, or a scarecrow, isn't very impressive, defeating a watered-down version of your opponents' argument isn't very impressive either.
 * **Example**: "Feminists want to ban all pornography and punish everyone who reads it! But such harsh measures are surely inappropriate, so the feminists are wrong: porn and its readers should be left in peace." The feminist argument is made weak by being overstated—in fact, most feminists do not propose an outright "ban" on porn or any punishment for those who merely read it; often, they propose some restrictions on things like child porn, or propose to allow people who are hurt by porn to sue publishers and producers, not readers, for damages. So the arguer hasn't really scored any points; he or she has just committed a fallacy.
 * **Tip**: Be charitable to your opponents. State their arguments as strongly, accurately, and sympathetically as possible. If you can knock down even the best version of an opponent's argument, then you've really accomplished something.


 * Red Herring

 * **Definition**: Partway through an argument, the arguer goes off on a tangent, raising a side issue that distracts the audience from what's really at stake. Often, the arguer never returns to the original issue.
 * **Example**: "Grading this exam on a curve would be the most fair thing to do. After all, classes go more smoothly when the students and the professor are getting along well." Let's try our premise-conclusion outlining to see what's wrong with this argument:

Premise: Classes go more smoothly when the students and the professor are getting along well.
Conclusion: Grading this exam on a curve would be the most fair thing to do.

When we lay it out this way, it's pretty obvious that the arguer went off on a tangent—the fact that something helps people get along doesn't necessarily make it more fair; fairness and justice sometimes require us to do things that cause conflict. But the audience may feel like the issue of teachers and students agreeing is important and be distracted from the fact that the arguer has not given any evidence as to why a curve would be fair.
 * **Tip**: Try laying your premises and conclusion out in an outline-like form. How many issues do you see being raised in your argument? Can you explain how each premise supports the conclusion?
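If it helps, the premise-conclusion outlining suggested here can even be mocked up as a tiny script. This is just an illustrative sketch (the `outline` function and its inputs are my own invention, not part of the handout): it stores each premise and the conclusion separately, then prints them one per line so you can eyeball whether the premises actually support the conclusion.

```python
# Toy illustration of premise-conclusion outlining. This is not a fallacy
# detector: judging whether the premises support the conclusion is up to you.
def outline(premises, conclusion):
    """Return the argument laid out one statement per line."""
    lines = [f"Premise {i}: {p}" for i, p in enumerate(premises, start=1)]
    lines.append(f"Conclusion: {conclusion}")
    return "\n".join(lines)

print(outline(
    ["Classes go more smoothly when the students and the professor "
     "are getting along well."],
    "Grading this exam on a curve would be the most fair thing to do.",
))
```

Laid out this way, the red herring is easy to spot: nothing in the premise line speaks to fairness at all.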


 * False Dichotomy

 * **Definition**: In false dichotomy, the arguer sets up the situation so it looks like there are only two choices. The arguer then eliminates one of the choices, so it seems that we are left with only one option: the one the arguer wanted us to pick in the first place. But often there are really many different options, not just two—and if we thought about them all, we might not be so quick to pick the one the arguer recommends!
 * **Example**: "Caldwell Hall is in bad shape. Either we tear it down and put up a new building, or we continue to risk students' safety. Obviously we shouldn't risk anyone's safety, so we must tear the building down." The argument neglects to mention the possibility that we might repair the building or find some way to protect students from the risks in question—for example, if only a few rooms are in bad shape, perhaps we shouldn't hold classes in those rooms.
 * **Tip**: Examine your own arguments: If you're saying that we have to choose between just two options, is that really so? Or are there other alternatives you haven't mentioned? If there are other alternatives, don't just ignore them—explain why they, too, should be ruled out. Although there's no formal name for it, assuming that there are only three options, four options, etc. when really there are more is similar to false dichotomy and should also be avoided.


 * Begging the Question

 * **Definition**: A complicated fallacy; it comes in several forms and can be harder to detect than many of the other fallacies we've discussed. Basically, an argument that begs the question asks the reader to simply accept the conclusion without providing real evidence; the argument either relies on a premise that says the same thing as the conclusion (which you might hear referred to as "being circular" or "circular reasoning"), or simply ignores an important (but questionable) assumption that the argument rests on. Sometimes people use the phrase "beg the question" as a sort of general criticism of arguments, to mean that an arguer hasn't given very good reasons for a conclusion, but that's not the meaning we're going to discuss here.
 * **Examples**: "Active euthanasia is morally acceptable. It is a decent, ethical thing to help another human being escape suffering through death." Let's lay this out in premise-conclusion form:

Premise: It is a decent, ethical thing to help another human being escape suffering through death.
Conclusion: Active euthanasia is morally acceptable.

If we "translate" the premise, we'll see that the arguer has really just said the same thing twice: "decent, ethical" means pretty much the same thing as "morally acceptable," and "help another human being escape suffering through death" means "active euthanasia." So the premise basically says, "active euthanasia is morally acceptable," just like the conclusion does! The arguer hasn't yet given us any real reasons //why// euthanasia is acceptable; instead, she has left us asking "well, really, why do you think active euthanasia is acceptable?" Her argument "begs" (that is, evades) the real question. Here's a second example of begging the question, in which a dubious premise which is needed to make the argument valid is completely ignored: "Murder is morally wrong. So active euthanasia is morally wrong." The premise that gets left out is "active euthanasia is murder." And that is a debatable premise—again, the argument "begs" or evades the question of whether active euthanasia is murder by simply not stating the premise. The arguer is hoping we'll just focus on the uncontroversial premise, "Murder is morally wrong," and not notice what is being assumed.
 * **Tip**: One way to try to avoid begging the question is to write out your premises and conclusion in a short, outline-like form. See if you notice any gaps, any steps that are required to move from one premise to the next or from the premises to the conclusion. Write down the statements that would fill those gaps. If the statements are controversial and you've just glossed over them, you might be begging the question. Next, check to see whether any of your premises basically says the same thing as the conclusion (but in other words). If so, you're begging the question. The moral of the story: You can't just assume or use as uncontroversial evidence the very thing you're trying to prove.


 * Equivocation

 * **Definition**: Equivocation is sliding between two or more different meanings of a single word or phrase that is important to the argument.
 * **Example**: "Giving money to charity is the right thing to do. So charities have a right to our money." The equivocation here is on the word "right": "right" can mean both something that is correct or good (as in "I got the right answers on the test") and something to which someone has a claim (as in "everyone has a right to life"). Sometimes an arguer will deliberately, sneakily equivocate, often on words like "freedom," "justice," "rights," and so forth; other times, the equivocation is a mistake or misunderstanding. Either way, it's important that you use the main terms of your argument consistently.
 * **Tip**: Identify the most important words and phrases in your argument and ask yourself whether they could have more than one meaning. If they could, be sure you aren't slipping and sliding between those meanings.

 * So how do I find fallacies in my own writing?

Here are some general tips for finding fallacies in your own arguments:

 * **Pretend you disagree with the conclusion you're defending.** What parts of the argument would now seem fishy to you? What parts would seem easiest to attack? Give special attention to strengthening those parts.


 * **List your main points**; under each one, **list the evidence** you have for it. Seeing your claims and evidence laid out this way may make you realize that you have no good evidence for a particular claim, or it may help you look more critically at the evidence you're using.


 * **Learn which types of fallacies you're especially prone to**, and be careful to check for them in your work. Some writers make lots of appeals to authority; others are more likely to rely on weak analogies or set up straw men. Read over some of your old papers to see if there's a particular kind of fallacy you need to watch out for.


 * **Be aware that broad claims need more proof than narrow ones**. Claims that use sweeping words like "all," "no," "none," "every," "always," "never," "no one," and "everyone" are sometimes appropriate—but they require a lot more proof than less-sweeping claims that use words like "some," "many," "few," "sometimes," "usually," and so forth.

 * **Double check your characterizations of others**, especially your opponents, to be sure they are accurate and fair.


= A List of Fallacious Arguments =

· **Ad Hominem (Argument To The Man):** attacking the person instead of attacking his argument. For example, "Von Daniken's books about ancient astronauts are worthless because he is a convicted forger and embezzler." (Which is true, but that's not why they're worthless.) Another example is this syllogism, which alludes to Alan Turing's homosexuality: Turing thinks machines think. Turing lies with men. Therefore, machines don't think. (Note the [|equivocation] in the use of the word "lies".) A common form is an attack on sincerity. For example, "How can you argue for vegetarianism when you wear leather shoes ?" The [|two wrongs make a right] fallacy is related. A variation (related to [|Argument By Generalization]) is to attack a whole class of people. For example, "Evolutionary biology is a sinister tool of the materialistic, atheistic religion of Secular Humanism." Similarly, one notorious net.kook waved away a whole category of evidence by announcing "All the scientists were drunk." Another variation is attack by innuendo: "Why don't scientists tell us what they really know; are they afraid of public panic ?" There may be a pretense that the attack isn't happening: "In order to maintain a civil debate, I will not mention my opponent's drinking problem." Or "I don't care if other people say you're [opinionated/boring/overbearing]." Attacks don't have to be strong or direct. You can merely show disrespect, or cut down his stature by saying that he seems to be sweating a lot, or that he has forgotten what he said last week. Some examples: "I used to think that way when I was your age." "You're new here, aren't you ?" "You weren't breast fed as a child, were you ?" "What drives you to make such a statement ?" "If you'd just listen.." "You seem very emotional." (This last works well if you have been hogging the microphone, so that they have had to yell to be heard.) Sometimes the attack is on the other person's intelligence. 
For example, "If you weren't so stupid you would have no problem seeing my point of view." Or, "Even you should understand my next point." Oddly, the stupidity attack is sometimes reversed. For example, dismissing a comment with "Well, you're just smarter than the rest of us." (In Britain, that might be put as "too clever by half".) This is Dismissal By Differentness. It is related to [|Not Invented Here] and [|Changing The Subject]. Ad Hominem is not fallacious if the attack goes to the credibility of the argument. For instance, the argument may depend on its presenter's claim that he's an expert. (That is, the Ad Hominem is undermining an [|Argument From Authority].) Trial judges allow this category of attacks. · **Needling:** simply attempting to make the other person angry, without trying to address the argument at hand. Sometimes this is a delaying tactic. Needling is also [|Ad Hominem] if you insult your opponent. You may instead insult something the other person believes in ("Argumentum Ad YourMomium"), interrupt, clown to show disrespect, be noisy, fail to pass over the microphone, and numerous other tricks. All of these work better if you are running things - for example, if it is your radio show, and you can cut off the other person's microphone. If the host or moderator is firmly on your side, that is almost as good as running the show yourself. It's even better if the debate is videotaped, and you are the person who will edit the video. If you wink at the audience, or in general clown in their direction, then we are shading over to [|Argument By Personal Charm]. Usually, the best way to cope with insults is to show mild amusement, and remain polite. A humorous comeback will probably work better than an angry one. · **Straw Man (Fallacy Of Extension):** attacking an exaggerated or caricatured version of your opponent's position. For example, the claim that "evolution means [|a dog giving birth to a cat]." 
Another example: "Senator Jones says that we should not fund the attack submarine program. I disagree entirely. I can't understand why he wants to leave us defenseless like that." On the Internet, it is common to exaggerate the opponent's position so that a comparison can be made between the opponent and Hitler. · **Inflation Of Conflict:** arguing that scholars debate a certain point. Therefore, they must know nothing, and their entire field of knowledge is "in crisis" or does not properly exist at all. For example, two historians debated whether Hitler killed five million Jews or six million Jews. A Holocaust denier argued that this disagreement made //his// claim credible, even though his own count is only a tenth to a third of the known minimum. Similarly, in "The Mythology of Modern Dating Methods" (John Woodmorappe, 1999) we find on page 42 that two scientists "cannot agree" about which one of two geological dates is "real" and which one is "spurious". Woodmorappe fails to mention that the two dates differ by less than one percent. · **Argument From Adverse Consequences (Appeal To Fear, Scare Tactics):** saying an opponent must be wrong, because if he is right, then bad things would ensue. For example: God must exist, because a godless society would be lawless and dangerous. Or: the defendant in a murder trial must be found guilty, because otherwise husbands will be encouraged to murder their wives. Wishful thinking is closely related. "My home in Florida is six inches above sea level. Therefore I am certain that global warming will not make the oceans rise by one foot." Of course, wishful thinking can also be about positive consequences, such as winning the lottery, or eliminating poverty and crime. · **Special Pleading (Stacking The Deck):** using the arguments that support your position, but ignoring or somehow disallowing the arguments against.
Uri Geller used special pleading when he claimed that the presence of unbelievers (such as stage magicians) made him unable to demonstrate his psychic powers. · **Excluded Middle (False Dichotomy, Faulty Dilemma, Bifurcation):** assuming there are only two alternatives when in fact there are more. For example, assuming Atheism is the only alternative to Fundamentalism, or being a traitor is the only alternative to being a loud patriot. · **Short Term Versus Long Term:** this is a particular case of the [|Excluded Middle]. For example, "We must deal with crime on the streets before improving the schools." (But why can't we do some of both ?) Similarly, "We should take the scientific research budget and use it to feed starving children." · **Burden Of Proof:** the claim that whatever has not yet been proved false must be true (or vice versa). Essentially the arguer claims that he should win by default if his opponent can't make a strong enough case. There may be three problems here. First, the arguer claims priority, but can he back up that claim ? Second, he is impatient with ambiguity, and wants a final answer right away. And third, "absence of evidence is not evidence of absence." · **Argument By Question:** asking your opponent a question which does not have a snappy answer. (Or anyway, no snappy answer that the audience has the background to understand.) Your opponent has a choice: he can look weak or he can look long-winded. For example, "How can scientists expect us to believe that anything as complex as a single living cell could have arisen as a result of random natural processes ?" Actually, pretty well any question has this effect to some extent. It usually takes longer to answer a question than ask it. Variants are the [|rhetorical question], and the **loaded** question, such as "Have you stopped beating your wife ?" · **Argument by Rhetorical Question:** asking a question in a way that leads to a particular answer. 
For example, "When are we going to give the old folks of this country the pension they deserve ?" The speaker is leading the audience to the answer "Right now." Alternatively, he could have said "When will we be able to afford a major increase in old age pensions?" In that case, the answer he is aiming at is almost certainly //not// "Right now." · **Fallacy Of The General Rule:** assuming that something true in general is true in every possible case. For example, "All chairs have four legs." Except that rocking chairs don't have any legs, and what is a one-legged "shooting stick" if it isn't a chair ? Similarly, there are times when certain laws should be broken. For example, ambulances are allowed to break speed laws. · **Reductive Fallacy (Oversimplification):** over-simplifying. As Einstein said, everything should be made as simple as possible, but no simpler. Political slogans such as "Taxation is theft" fall in this category. · **Genetic Fallacy (Fallacy of Origins, Fallacy of Virtue):** if an argument or arguer has some particular origin, the argument must be right (or wrong). The idea is that things from that origin, or that social class, have virtue or lack virtue. (Being poor or being rich may be held out as being virtuous.) Therefore, the actual details of the argument can be overlooked, since correctness can be decided without any need to listen or think. · **Psychogenetic Fallacy:** if you learn the psychological reason why your opponent likes an argument, then he's biased, so his argument must be wrong. · **Argument Of The Beard:** assuming that two ends of a spectrum are the same, since one can travel along the spectrum in very small steps. The name comes from the idea that being clean-shaven must be the same as having a big beard, since in-between beards exist. Similarly, all piles of stones are small, since if you add one stone to a small pile of stones it remains small. 
However, the existence of pink should not undermine the distinction between white and red. · **Argument From Age (Wisdom of the Ancients):** snobbery that very old (or very young) arguments are superior. This is a variation of the [|Genetic Fallacy], but has the psychological appeal of seniority and tradition (or innovation). Products labelled "New ! Improved !" are appealing to a belief that innovation is of value for such products. It's sometimes true. And then there are cans of "Old Fashioned Baked Beans". · **Not Invented Here:** ideas from elsewhere are made unwelcome. "This Is The Way We've Always Done It." This fallacy is a variant of the [|Argument From Age]. It gets a psychological boost from feelings that local ways are superior, or that local identity is worth any cost, or that innovations will upset matters. An example of this is the common assertion that America has "the best health care system in the world", an idea that this 2007 [|New York Times editorial] refuted. People who use the Not Invented Here argument are sometimes accused of being stick-in-the-muds. Conversely, foreign and "imported" things may be held out as superior. · **Argument By Dismissal:** an idea is rejected without saying why. Dismissals usually have overtones. For example, "If you don't like it, leave the country" implies that your cause is hopeless, or that you are unpatriotic, or that your ideas are [|foreign], or maybe all three. "If you don't like it, live in a Communist country" adds an [|emotive] element. · **Argument To The Future:** arguing that evidence will someday be discovered which will (then) support your point. · **Poisoning The Wells:** discrediting the sources used by your opponent. This is a variation of [|Ad Hominem]. · **Argument By Emotive Language (Appeal To The People):** using emotionally loaded words to sway the audience's sentiments instead of their minds. Many emotions can be useful: anger, spite, envy, condescension, and so on.
For example, argument by condescension: "Support the ERA ? Sure, when the women start paying for the drinks! Hah! Hah!" Americans who don't like the Canadian medical system have referred to it as "socialist", but I'm not quite sure if this is intended to mean "foreign", or "expensive", or simply guilty by association. [|Cliche Thinking] and [|Argument By Slogan] are useful adjuncts, particularly if you can get the audience to chant the slogan. People who rely on this argument may seed the audience with supporters or "shills", who laugh, applaud or chant at proper moments. This is the live-audience equivalent of adding a laugh track or music track. Now that many venues have video equipment, some speakers give part of their speech by playing a prepared video. These videos are an opportunity to show a supportive audience, use emotional music, show emotionally charged images, and the like. The idea is old: there used to be professional cheering sections. (Monsieur Zig-Zag, pictured on the cigarette rolling papers, acquired his fame by applauding for money at the Paris Opera.) If the emotion in question isn't harsh, [|Argument By Poetic Language] helps the effect. Flattering the audience doesn't hurt either. · **Argument By Personal Charm:** getting the audience to cut you slack. Example: Ronald Reagan. It helps if you have an opponent with much less personal charm. Charm may create trust, or the desire to "join the winning team", or the desire to please the speaker. This last is greatest if the audience feels sex appeal. Reportedly George W. Bush lost a debate when he was young, and said later that he would never be "out-bubba'd" again. · **Appeal To Pity (Appeal to Sympathy, The Galileo Argument):** "I did not murder my mother and father with an axe ! Please don't find me guilty; I'm suffering enough through being an orphan." Some authors want you to know they're suffering for their beliefs. 
For example, "Scientists scoffed at Copernicus and Galileo; they laughed at Edison, Tesla and Marconi; they won't give my ideas a fair hearing either. But time will be the judge. I can wait; I am patient; sooner or later science will be forced to admit that all matter is built, not of atoms, but of tiny capsules of TIME." There is a strange variant which shows up on Usenet. Somebody refuses to answer questions about their claims, on the grounds that the asker is mean and has hurt their feelings. Or, that the question is personal. · **Appeal To Force:** threats, or even violence. On the Net, the usual threat is of a lawsuit. The traditional religious threat is that one will burn in Hell. However, history is full of instances where expressing an unpopular idea could get you beaten up on the spot, or worse. "The clinching proof of my reasoning is that I will cut anyone who argues further into dogmeat." -- Attributed to Sir Geoffery de Tourneville, ca 1350 A.D. · **Argument By Vehemence:** being loud. Trial lawyers are taught this rule: If you have the facts, pound on the facts. If you have the law, pound on the law. If you don't have either, pound on the table. The above rule paints vehemence as an act of desperation. But it can also be a way to seize control of the agenda, use up the opponent's time, or just intimidate the easily cowed. And it's not necessarily aimed at winning the day. A tantrum or a fit is also a way to get a reputation, so that in the future, no one will mess with you. Depending on what you're loud about, this may also be an [|Appeal To Force], [|Argument By Emotive Language], [|Needling], or [|Changing The Subject]. · **Begging The Question (Assuming The Answer, Tautology):** reasoning in a circle. The thing to be proved is used as one of your assumptions. For example: "We must have a death penalty to discourage violent crime". (This assumes it discourages crime.) Or, "The stock market fell because of a technical adjustment."
(But is an "adjustment" just a stock market fall ?) · **Stolen Concept:** using what you are trying to disprove. That is, requiring the truth of something for your proof that it is false. For example, using science to show that science is wrong. Or, arguing that you do not exist, when your existence is clearly required for you to be making the argument. This is a relative of [|Begging The Question], except that the circularity there is in what you are trying to prove, instead of what you are trying to disprove. It is also a relative of [|Reductio Ad Absurdum], where you //temporarily// assume the truth of something. · **Argument From Authority:** the claim that the speaker is an expert, and so should be trusted. There are degrees and areas of expertise. The speaker is actually claiming to be //more// expert, in the relevant subject area, than anyone else in the room. There is also an implied claim that expertise in the area is worth having. For example, claiming expertise in something hopelessly [|quack] (like [|iridology]) is actually an admission that the speaker is gullible. · **Argument From False Authority:** a strange variation on [|Argument From Authority]. For example, the TV commercial which starts "I'm not a doctor, but I play one on TV." Just what are we supposed to conclude ? · **Appeal To Anonymous Authority:** an [|Appeal To Authority] is made, but the authority is not named. For example, "Experts agree that ..", "scientists say .." or even "they say ..". This makes the information impossible to verify, and brings up the very real possibility that the arguer himself doesn't know who the experts are. In that case, he may just be spreading a rumor. The situation is even worse if the arguer admits it's a rumor. · **Appeal To Authority:** "Albert Einstein was extremely impressed with this theory." (But a statement made by someone long-dead could be out of date. Or perhaps Einstein was just being polite. 
Or perhaps he made his statement in some specific context. And so on.) To justify an appeal, the arguer should at least present an exact quote. It's more convincing if the quote contains context, and if the arguer can say where the quote comes from. A variation is to appeal to [|unnamed authorities]. There was a New Yorker cartoon, showing a doctor and patient. The doctor was saying: "Conventional medicine has no treatment for your condition. Luckily for you, I'm a quack." So the joke was that the doctor boasted of his //lack// of authority. · **Appeal To False Authority:** a variation on [|Appeal To Authority], but the [|Authority] is outside his area of expertise. For example, "Famous physicist John Taylor studied [|Uri Geller] extensively and found no evidence of trickery or fraud in his feats." Taylor was not qualified to detect trickery or fraud of the kind used by stage magicians. Taylor later admitted Geller had tricked him, but he apparently had not figured out how. A variation is to appeal to a non-existent authority. For example, someone reading an article by Creationist Dmitri Kuznetsov tried to look up the referenced articles. Some of the articles turned out to be in non-existent journals. Another variation is to [|misquote] a real authority. There are several kinds of misquotation. A quote can be inexact or have been edited. It can be taken out of context. (Chevy Chase: "Yes, I said that, but I was singing a song written by someone else at the time.") The quote can be separate quotes which the arguer glued together. Or, bits might have gone missing. For example, it's easy to prove that Mick Jagger is an assassin. In "Sympathy For The Devil" he sang: "I shouted out, who killed the Kennedys, When after all, it was ... me." · **Statement Of Conversion:** the speaker says "I used to believe in X". This is simply a weak form of asserting expertise. 
The speaker is implying that he has learned about the subject, and now that he is better informed, he has rejected X. So perhaps he is now an authority, and this is an implied [|Argument From Authority]. A more irritating version of this is "I used to think that way when I was your age." The speaker hasn't said what is wrong with your argument: he is merely claiming that his age has made him an expert. "X" has not actually been countered unless there is agreement that the speaker has that expertise. In general, a bald claim has to be buttressed. For example, there are a number of Creationist authors who say they "used to be evolutionists", but the scientists who have rated their books haven't noticed any expertise about evolution. · **Bad Analogy:** claiming that two situations are highly similar, when they aren't. For example, "The solar system reminds me of an atom, with planets orbiting the sun like electrons orbiting the nucleus. We know that electrons can jump from orbit to orbit; so we must look to ancient records for sightings of planets jumping from orbit to orbit also." Or, "Minds, like rivers, can be broad. The broader the river, the shallower it is. Therefore, the broader the mind, the shallower it is." Or, "We have pure food and drug laws; why can't we have laws to keep movie-makers from giving us filth ?" · **Extended Analogy:** the claim that two things, both analogous to a third thing, are therefore analogous to each other. For example, this debate: "I believe it is always wrong to oppose the law by breaking it." "Such a position is odious: it implies that you would not have supported Martin Luther King." "Are you saying that cryptography legislation is as important as the struggle for Black liberation ? How dare you !" A person who advocates a particular position (say, about gun control) may be told that Hitler believed the same thing. The clear implication is that the position is somehow tainted.
But Hitler also believed that window drapes should go all the way to the floor. Does that mean people with such drapes are monsters ? · **Argument From Spurious Similarity:** this is a relative of [|Bad Analogy]. It is suggested that some resemblance is proof of a relationship. There is a WW II story about a British lady who was trained in spotting German airplanes. She made a report about a certain very important type of plane. While being quizzed, she explained that she hadn't been sure, herself, until she noticed that it had a little man in the cockpit, just like the little model airplane at the training class. · **Reifying:** an abstract thing is talked about as if it were concrete. (A possibly [|Bad Analogy] is being made between concept and reality.) For example, "Nature abhors a vacuum." · **False Cause:** assuming that because two things happened, the first one caused the second one. (Sequence is not causation.) For example, "Before women got the vote, there were no nuclear weapons." Or, "Every time my brother Bill accompanies me to Fenway Park, the Red Sox are sure to lose." Essentially, these are arguments that the sun goes down because we've turned on the street lights. · **Confusing Correlation And Causation:** earthquakes in the Andes were correlated with the closest approaches of the planet Uranus. Therefore, Uranus must have caused them. (But Jupiter is nearer than Uranus, and more massive too.) When sales of hot chocolate go up, street crime drops. Does this correlation mean that hot chocolate prevents crime ? No, it means that fewer people are on the streets when the weather is cold. The bigger a child's shoe size, the better the child's handwriting. Does having big feet make it easier to write ? No, it means the child is older. · **Causal Reductionism (Complex Cause):** trying to use one cause to explain something, when in fact it had several causes. For example, "The accident was caused by the taxi parking in the street." 
(But other drivers went around the taxi. Only the drunk driver hit the taxi.) · **Cliche Thinking:** using as evidence a well-known wise saying, as if that is proven, or as if it has no exceptions. · **Exception That Proves The Rule:** a specific example of [|Cliche Thinking]. This is used when a rule has been asserted, and someone points out the rule doesn't always work. The cliche rebuttal is that this is "the exception that proves the rule". Many people think that this cliche somehow allows you to ignore the exception, and continue using the rule. In fact, the cliche originally did no such thing. There are two standard explanations for the original meaning. The first is that the word "prove" meant //test//. That is why the military takes its equipment to a //Proving Ground// to test it. So, the cliche originally said that an exception tests a rule. That is, if you find an exception to a rule, the cliche is saying that the rule is being tested, and perhaps the rule will need to be discarded. The second explanation is that the stating of an exception to a rule, proves that the rule exists. For example, suppose it was announced that "Over the holiday weekend, students do not need to be in the dorms by midnight". This announcement implies that normally students //do// have to be in by midnight. Here is a [|discussion] of that explanation. In either case, the cliche is not about waving away objections. · **Appeal To Widespread Belief (Bandwagon Argument, Peer Pressure, Appeal to Common Practice):** the claim, as evidence for an idea, that many people believe it, or used to believe it, or do it. If the discussion is about social conventions, such as "good manners", then this is a reasonable line of argument. However, in the 1800's there was a widespread belief that bloodletting cured sickness. All of these people were not just wrong, but horribly wrong, because in fact it made people sicker. Clearly, the popularity of an idea is no guarantee that it's right. 
Similarly, a common justification for bribery is that "Everybody does it". And in the past, this was a justification for slavery. · **Fallacy Of Composition:** assuming that a whole has the same simplicity as its constituent parts. In fact, a great deal of science is the study of //emergent properties//. For example, if you put a drop of oil on water, there are interesting optical effects. But the effect comes from the oil/water system: it does not come just from the oil or just from the water. Another example: "A car makes less pollution than a bus. Therefore, cars are less of a pollution problem than buses." Another example: "Atoms are colorless. Cats are made of atoms, so cats are colorless." · **Fallacy Of Division:** assuming that what is true of the whole is true of each constituent part. For example, human beings are made of atoms, and human beings are conscious, so atoms must be conscious. · **Complex Question (Tying):** unrelated points are treated as if they should be accepted or rejected together. In fact, each point should be accepted or rejected on its own merits. For example, "Do you support freedom and the right to bear arms ?" · **Slippery Slope Fallacy (Camel's Nose):** there is an old saying about how if you allow a camel to poke his nose into the tent, soon the whole camel will follow. The fallacy here is the assumption that something is wrong because it is right next to something that is wrong. Or, it is wrong because it could slide towards something that is wrong. For example, "Allowing abortion in the first week of pregnancy would lead to allowing it in the ninth month." Or, "If we legalize marijuana, then more people will try heroin." Or, "If I make an exception for you then I'll have to make an exception for everyone." · **Argument By Pigheadedness (Doggedness):** refusing to accept something after everyone else thinks it is well enough proved. For example, there are still Flat Earthers.
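The car-versus-bus example under the Fallacy Of Composition can be made concrete with a little arithmetic. Here is a minimal sketch; the per-vehicle emission figures and fleet sizes are invented for illustration only, not real data:

```python
# Per-vehicle vs. fleet-wide pollution: why the Fallacy Of Composition
# fails. All numbers here are illustrative assumptions, not measurements.

def total_emissions(per_vehicle, vehicle_count):
    """Fleet-wide emissions are per-vehicle emissions times fleet size."""
    return per_vehicle * vehicle_count

# Assume a single bus emits five times what a single car emits...
car_emission = 1.0   # arbitrary units per vehicle
bus_emission = 5.0

# ...but suppose cars outnumber buses a hundred to one.
cars_total = total_emissions(car_emission, 1000)
buses_total = total_emissions(bus_emission, 10)

# Each car pollutes less than each bus, yet the cars as a whole pollute more.
assert car_emission < bus_emission
assert cars_total > buses_total
print(cars_total, buses_total)  # 1000.0 50.0
```

Whether the conclusion about the whole holds depends on fleet sizes, not just the per-vehicle comparison, and that is exactly the step the fallacy skips.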
· **Appeal To Coincidence:** asserting that some fact is due to chance. For example, the arguer has had a dozen traffic accidents in six months, yet he insists they weren't his fault. This may be [|Argument By Pigheadedness]. But on the other hand, coincidences do happen, so this argument is not always fallacious. · **Argument By Repetition (Argument Ad Nauseam):** if you say something often enough, some people will begin to believe it. There are some net.kooks who keep reposting the same articles to Usenet, presumably in hopes it will have that effect. · **Argument By Half Truth (Suppressed Evidence):** this is hard to detect, of course. You have to ask questions. For example, an amazingly accurate "prophecy" of the assassination attempt on President Reagan was shown on TV. But was the tape recorded before or after the event ? Many stations did not ask this question. (It was recorded afterwards.) A book on "sea mysteries" or the "Bermuda Triangle" might tell us that the yacht Connemara IV was found drifting crewless, southeast of Bermuda, on September 26, 1955. None of these books mention that the yacht had been directly in the path of Hurricane Ione, with 180 mph winds and 40-foot waves. · **Argument By Selective Observation:** also called cherry picking, the enumeration of favorable circumstances, or as the philosopher Francis Bacon described it, counting the hits and forgetting the misses. For example, a state boasts of the Presidents it has produced, but is silent about its serial killers. Or, the claim "Technology brings happiness". (Now, there's something with hits and misses.) Casinos encourage this human tendency. There are bells and whistles to announce slot machine jackpots, but losing happens silently. This makes it much easier to think that the odds of winning are good. · **Argument By Selective Reading:** making it seem as if the weakest of an opponent's arguments was the best he had.
Suppose the opponent gave a strong argument X and also a weaker argument Y. Simply rebut Y and then say the opponent has made a weak case. This is a relative of [|Argument By Selective Observation], in that the arguer overlooks arguments that he does not like. It is also related to [|Straw Man (Fallacy Of Extension)], in that the opponent's argument is not being fairly represented. · **Argument By Generalization:** drawing a broad conclusion from a small number of perhaps unrepresentative cases. (The cases may be unrepresentative because of [|Selective Observation].) For example, "They say 1 out of every 5 people is Chinese. How is this possible ? I know hundreds of people, and none of them is Chinese." So, by generalization, there aren't any Chinese anywhere. This is connected to the [|Fallacy Of The General Rule]. Similarly, "Because we allow terminally ill patients to use heroin, we should allow everyone to use heroin." It is also possible to under-generalize. For example, "A man who had killed both of his grandmothers declared himself rehabilitated, on the grounds that he could not conceivably repeat his offense in the absence of any further grandmothers." -- "Ports Of Call" by Jack Vance · **Argument From Small Numbers:** "I've thrown three sevens in a row. Tonight I can't lose." This is [|Argument By Generalization], but it assumes that small numbers are the same as big numbers. (Three sevens is actually a common occurrence. Thirty-three sevens is not.) Or: "After treatment with the drug, one-third of the mice were cured, one-third died, and the third mouse escaped." Does this mean that if we treated a thousand mice, 333 would be cured ? Well, no. · **Misunderstanding The Nature Of Statistics (Innumeracy):** President Dwight Eisenhower expressed astonishment and alarm on discovering that fully half of all Americans had below average intelligence. Similarly, some people get fearful when they learn that their doctor wasn't in the top half of his class.
(But that's half of them.) "Statistics show that of those who contract the habit of eating, very few survive." -- Wallace Irwin. Very few people seem to understand "regression to the mean". This is the idea that things tend to go back to normal. If you feel normal today, does it really mean that the headache cure you took yesterday performed wonders ? Or is it just that your headaches are always gone the next day ? Journalists are notoriously bad at reporting risks. For example, in 1995 it was loudly reported that a class of contraceptive pills would double the chance of dangerous blood clots. The news stories mostly did not mention that "doubling" the risk only increased it by one person in 7,000. The "cell phones cause brain cancer" reports are even sillier, with the supposed increase in risk being at most one or two cancers per 100,000 people per year. So, if the fearmongers are right, your cellphone has increased your risk from "who cares" to "who cares". · **Inconsistency:** for example, the declining life expectancy in the former Soviet Union is due to the failures of communism. But, the quite high infant mortality rate in the United States is not a failure of capitalism. This is related to [|Internal Contradiction]. · **Non Sequitur:** something that just does not follow. For example, "Tens of thousands of Americans have seen lights in the night sky which they could not identify. The existence of life on other planets is fast becoming certainty !" Another example: arguing at length that your religion is of great help to many people. Then, concluding that the teachings of your religion are undoubtedly true. Or: "Bill lives in a large building, so his apartment must be large." · **Meaningless Questions:** irresistible forces meeting immovable objects, and the like. · **Argument By Poetic Language:** if it sounds good, it must be right. Songs often use this effect to create a sort of credibility - for example, "Don't Fear The Reaper" by Blue Oyster Cult.
Politically oriented songs should be taken with a grain of salt, precisely because they sound good. · **Argument By Slogan:** if it's short, and connects to an argument, it must **be** an argument. (But slogans risk the [|Reductive Fallacy].) Being short, a slogan increases the effectiveness of [|Argument By Repetition]. It also helps [|Argument By Emotive Language (Appeal To The People)], since emotional appeals need to be punchy. (Also, the gallery can chant a short slogan.) Using an old slogan is [|Cliche Thinking]. · **Argument By Prestigious Jargon:** using big complicated words so that you will seem to be an expert. Why do people use "utilize" when they could utilize "use" ? For example, crackpots used to claim they had a Unified Field Theory (after Einstein). Then the word Quantum was popular. Lately it seems to be Zero Point Fields. · **Argument By Gibberish (Bafflement):** this is the extreme version of [|Argument By Prestigious Jargon]. An invented vocabulary helps the effect, and some net.kooks use lots of CAPitaLIZation. However, perfectly ordinary words can be used to baffle. For example, "Omniscience is greater than omnipotence, and the difference is two. Omnipotence plus two equals omniscience. META = 2." [From R. Buckminster Fuller's //No More Secondhand God//.] Gibberish may come from people who can't find meaning in technical jargon, so they think they should copy style instead of meaning. It can also be a "snow job", AKA "baffle them with BS", by someone actually familiar with the jargon. Or it could be [|Argument By Poetic Language]. An example of poetic gibberish: "Each autonomous individual emerges holographically within egoless ontological consciousness as a non-dimensional geometric point within the transcendental thought-wave matrix." · **Equivocation:** using a word to mean one thing, and then later using it to mean something different. For example, sometimes "Free software" costs nothing, and sometimes it is without restrictions. 
Some examples: "The sign said 'fine for parking here', and since it was fine, I parked there." All trees have bark. All dogs bark. Therefore, all dogs are trees. "Consider that two wrongs never make a right, but that three lefts do." - "Deteriorata", National Lampoon · **Euphemism:** the use of words that sound better. The lab rat wasn't killed, it was //sacrificed//. Mass murder wasn't genocide, it was //ethnic cleansing//. The death of innocent bystanders is //collateral damage//. Microsoft doesn't find bugs, or problems, or security vulnerabilities: they just discover an //issue// with a piece of software. This is related to [|Argument By Emotive Language], since the effect is to make a concept emotionally palatable. · **Weasel Wording:** this is very much like [|Euphemism], except that the word changes are done to claim a new, different concept rather than soften the old concept. For example, an American President may not legally conduct a war without a declaration of Congress. So, various Presidents have conducted "police actions", "armed incursions", "protective reaction strikes," "pacification," "safeguarding American interests," and a wide variety of "operations". Similarly, War Departments have become Departments of Defense, and untested medicines have become alternative medicines. The book "1984" has some particularly good examples. · **Error Of Fact:** for example, "No one knows how old the Pyramids of Egypt are." (Except, of course, for the historians who've read records and letters written by the ancient Egyptians themselves.) Typically, the presence of one error means that there are other errors to be uncovered. · **Lies:** intentional [|Errors of Fact]. If the speaker thinks that lying serves a moral end, this would be a [|Pious Fraud]. · **Hypothesis Contrary To Fact:** arguing from something that might have happened, but didn't. · **Internal Contradiction:** saying two contradictory things in the same argument. 
For example, claiming that [|Archaeopteryx] is a dinosaur with hoaxed feathers, and also saying in the same book that it is a "true bird". Or another author who said on page 59, "Sir Arthur Conan Doyle writes in his autobiography that he never saw a ghost." But on page 200 we find "Sir Arthur's first encounter with a ghost came when he was 25, surgeon of a whaling ship in the Arctic..." This is much like saying "I never borrowed his car, and it already had that dent when I got it." This is related to [|Inconsistency].
· **Changing The Subject (Digression, Red Herring, Misdirection, False Emphasis):** this is sometimes used to avoid having to defend a claim, or to avoid making good on a promise. In general, there is something you are not supposed to notice. For example, I got a bill which had a big announcement about how some tax had gone up by 5%, and the costs would have to be passed on to me. But a quick calculation showed that the increased tax was only costing me a dime, while a different part of the bill had silently gone up by $10. This is connected to various diversionary tactics, which may be obstructive, obtuse, or [|needling]. For example, if you quibble about the meaning of some word a person used, they may be quite happy about being corrected, since that means they've derailed you, or changed the subject. They may pick nits in your wording, perhaps asking you to define "is". They may deliberately misunderstand you: "You said this happened five years before Hitler came to power. Why are you so fascinated with Hitler? Are you anti-Semitic?" It is also connected to various rhetorical tricks, such as announcing that there cannot be a question period because the speaker must leave. (But then he doesn't leave.)
· **Argument By Fast Talking:** if you go from one idea to the next quickly enough, the audience won't have time to think. This is connected to [|Changing The Subject] and (to some audiences) [|Argument By Personal Charm].
However, some psychologists say that to understand what you hear, you must for a brief moment believe it. If this is true, then rapid delivery does not leave people time to reject what they hear.
· **Having Your Cake (Failure To Assert, or Diminished Claim):** almost claiming something, but backing out. For example, "It may be, as some suppose, that ghosts can only be seen by certain so-called sensitives, who are possibly special mutations with, perhaps, abnormally extended ranges of vision and hearing. Yet some claim we are all sensitives." Another example: "I don't necessarily agree with the liquefaction theory, nor do I endorse all of Walter Brown's other material, but the geological statements are informative." The strange thing here is that liquefaction theory (the idea that the world's rocks formed in flood waters) was demolished in 1788. To "not necessarily agree" with it, today, is in the category of "not necessarily agreeing" with 2+2=3. But notice that the writer implies some study of the matter, and only a partial rejection. A similar thing is the failure to rebut. Suppose I raise an issue. The response that "Woodmorappe's book talks about that" could possibly be a reference to a resounding rebuttal. Or perhaps the responder hasn't even read the book yet. How can we tell? [I later discovered it was the latter.]
· **Ambiguous Assertion:** a statement is made, but it is sufficiently unclear that it leaves some sort of leeway. For example, a book about Washington politics did not place quotation marks around quotes. This left ambiguity about which parts of the book were first-hand reports and which parts were second-hand reports, assumptions, or outright fiction. Of course, lack of clarity is not always intentional. Sometimes a statement is just vague. If the statement has two different meanings, this is Amphiboly. For example, "Last night I shot a burglar in my pyjamas."
· **Failure To State:** if you make enough attacks, and ask enough questions, you may never have to actually define your own position on the topic.
· **Outdated Information:** information is given, but it is not the latest information on the subject. For example, some creationist articles about [|the amount of dust on the moon] quote a measurement made in the 1950s. But many much better measurements have been done since then.
· **Amazing Familiarity:** the speaker seems to have information that there is no possible way for him to have gotten, on the basis of his own statements. For example: "The first man on deck, seaman Don Smithers, yawned lazily and fingered his good luck charm, a dried seahorse. To no avail! At noon, the Sea Ranger was found drifting aimlessly, with every man of its crew missing without a trace!"
· **Least Plausible Hypothesis:** ignoring all of the most reasonable explanations. This makes the desired explanation into the only one. For example: "I left a saucer of milk outside overnight. In the morning, the milk was gone. Clearly, my yard was visited by fairies." There is an old rule for deciding which explanation is the most plausible. It is most often called "Occam's Razor", and it basically says that the simplest explanation is the best. The current phrase among scientists is that an explanation should be "the most parsimonious", meaning that it should not introduce new concepts (like fairies) when old concepts (like neighborhood cats) will do. On ward rounds, medical students love to come up with the most obscure explanations for common problems. A traditional response is to tell them, "If you hear hoof beats, don't automatically think of zebras."
· **Argument By Scenario:** telling a story which ties together unrelated material, and then using the story as proof that they are related.
· **Affirming The Consequent:** logic reversal. A correct statement of the form "if P then Q" gets turned into "Q therefore P".
For example, "All cats die; Socrates died; therefore Socrates was a cat." Another example: "If the earth orbits the sun, then the nearer stars will show an apparent annual shift in position relative to more distant stars (stellar parallax). Observations show conclusively that this parallax shift does occur. This proves that the earth orbits the sun." In reality, it proves that Q [the parallax] //is consistent with// P [orbiting the sun]. But it might also be consistent with some other theory. (Other theories did exist. They are now dead, because although they were consistent with a few facts, they were not consistent with all the facts.) Another example: "If space creatures were kidnapping people and examining them, the space creatures would probably hypnotically erase the memories of the people they examined. These people would thus suffer from amnesia. But in fact many people do suffer from amnesia. This tends to prove they were kidnapped and examined by space creatures." This is also a [|Least Plausible Hypothesis] explanation.
· **Moving The Goalposts (Raising The Bar, Argument By Demanding Impossible Perfection):** if your opponent successfully addresses some point, then say he must also address some further point. If you can make these points more and more difficult (or diverse), then eventually your opponent must fail. If nothing else, you will eventually find a subject that your opponent isn't up on. This is related to [|Argument By Question]. Asking questions is easy: it's answering them that's hard. If each new goal causes a new question, this may get to be Infinite Regression. It is also possible to lower the bar, reducing the burden on an argument. For example, a person who takes Vitamin C might claim that it prevents colds. When they do get a cold, they move the goalposts, by saying that the cold would have been much worse if not for the Vitamin C.
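The "Q therefore P" reversal described in the Affirming The Consequent entry above can be checked mechanically: enumerate every truth assignment and look for one where "if P then Q" holds but the reversed inference fails. A minimal sketch in Python (the variable and function names are ours, not the article's):

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material implication: 'if a then b' is false only when a is true and b is false."""
    return (not a) or b

# Search for a truth assignment where "if P then Q" holds
# but the reversed "Q therefore P" does not.
counterexamples = [(p, q) for p, q in product([False, True], repeat=2)
                   if implies(p, q) and not implies(q, p)]

print(counterexamples)  # [(False, True)]
```

The single counterexample (P false, Q true) is exactly the parallax case in the text: the observation Q can be true while some theory other than P explains it.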
· **Appeal To Complexity:** if the arguer doesn't understand the topic, he concludes that nobody understands it. So, his opinions are as good as anybody's.
· **Common Sense:** unfortunately, there simply isn't a common-sense answer for many questions. In politics, for example, there are a lot of issues where people disagree. Each side thinks that their answer is common sense. Clearly, some of these people are wrong. The reason they are wrong is that common sense depends on the context, knowledge and experience of the observer. That is why instruction manuals will often have paragraphs like these: "When boating, use common sense. Have one life preserver for each person in the boat." "When towing a water skier, use common sense. Have one person watching the skier at all times." If the ideas are so obvious, then why the second sentence? Why do they have to spell it out? The answer is that "use common sense" actually means "pay attention, I am about to tell you something that inexperienced people often get wrong." Science has discovered a lot of situations which are far more unfamiliar than water skiing. Not surprisingly, beginners find that much of it violates their common sense. For example, many people can't imagine how a mountain range could form. But in fact anyone can take good GPS equipment to the Himalayas, and measure for themselves that those mountains are rising today.
· **Argument By Laziness (Argument By Uninformed Opinion):** the arguer hasn't bothered to learn anything about the topic. He nevertheless has an opinion, and will be insulted if his opinion is not treated with respect. For example, someone looked at a picture on one of my [|web pages], and made a complaint which showed that he hadn't even skimmed through the words on the page. When I pointed this out, he replied that I shouldn't have had such a confusing picture.
· **Disproof By Fallacy:** if a conclusion can be reached in an obviously fallacious way, then the conclusion is incorrectly declared wrong. For example: "Take the division 64/16. Now, canceling a six on top and a six on the bottom, we get that 64/16 = 4/1 = 4." "Wait a second! You can't just cancel the six!" "Oh, so you're telling us 64/16 is not equal to 4, are you?" Note that this is different from [|Reductio Ad Absurdum], where your opponent's argument can lead to an absurd conclusion. In this case, an absurd argument leads to a normal conclusion.
· **Reductio Ad Absurdum:** showing that your opponent's argument leads to some absurd conclusion. This is //in general// a reasonable and non-fallacious way to argue. If the issues are razor-sharp, it is a good way to completely destroy his argument. However, if the waters are a bit muddy, perhaps you will only succeed in showing that your opponent's argument does not apply in all cases. That is, using Reductio Ad Absurdum is sometimes using the [|Fallacy Of The General Rule]. However, if you are faced with an argument that is poorly worded, or only lightly sketched, Reductio Ad Absurdum may be a good way of pointing out the holes. An example of why absurd conclusions are bad things: Bertrand Russell, in a lecture on logic, mentioned that in the sense of material implication, a false proposition implies any proposition. A student raised his hand and said, "In that case, given that 1 = 0, prove that you are the Pope." Russell immediately replied, "Add 1 to both sides of the equation: then we have 2 = 1. The set containing just me and the Pope has 2 members. But 2 = 1, so it has only 1 member; therefore, I am the Pope."
· **False Compromise:** if one does not understand a debate, it must be "fair" to split the difference, and agree on a compromise between the opinions. (But one side is very possibly wrong, and in any case one could simply suspend judgment.)
Journalists often invoke this fallacy in the name of "balanced" coverage. "Some say the sun rises in the east, some say it rises in the west; the truth probably lies somewhere in between." Television reporters like balanced coverage so much that they may give half of their report to a view held by a small minority of the people in question. There are many possible reasons for this, some of them good. However, viewers need to be aware of this tendency.
· **Fallacy Of The Crucial Experiment:** claiming that some idea has been proved (or disproved) by a pivotal discovery. This is the "smoking gun" version of history. Scientific progress is often reported in such terms. This is inevitable when a complex story is reduced to a soundbite, but it's almost always a distortion. In reality, a lot of background work happens first, and a lot of buttressing (or retraction) happens afterwards. And in natural history, most of the theories are about how often certain things happen (relative to some other thing). For those theories, no one experiment could ever be conclusive.
· **Two Wrongs Make A Right (Tu Quoque, You Too, What's sauce for the goose is sauce for the gander):** a charge of wrongdoing is answered by a rationalization that others have sinned, or might have sinned. For example, Bill borrows Jane's expensive pen, and later finds he hasn't returned it. He tells himself that it is okay to keep it, since she would have taken his. War atrocities and terrorism are often defended in this way. Similarly, some people defend capital punishment on the grounds that the state is killing people who have killed. This is related to [|Ad Hominem (Argument To The Man)].
· **Pious Fraud:** a [|fraud] done to accomplish some good end, on the theory that the end justifies the means. For example, a church in Canada had a statue of Christ which started to weep tears of blood. When analyzed, the blood turned out to be beef blood.
We can reasonably assume that someone with access to the building thought that bringing souls to Christ would justify his small deception. In the context of debates, a Pious Fraud could be a [|lie]. More generally, it would be when an emotionally committed speaker makes an assertion that is shaded, distorted or even fabricated. For example, British Prime Minister Tony Blair was accused in 2003 of "sexing up" his evidence that Iraq had Weapons of Mass Destruction. Around the year 400, Saint Augustine wrote two books on this subject, [|De Mendacio] [On Lying] and //Contra Mendacium// [Against Lying]. He argued that the sin isn't in what you do (or don't) say, but in your intent to leave a false impression. He strongly opposed Pious Fraud. I believe that Martin Luther also wrote on the subject.
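A footnote on the 64/16 example in the Disproof By Fallacy entry above: the digit-canceling "rule" happens to give the right answer there, which is exactly what makes it a good trap. A quick sketch (the helper name is ours, not a standard one) shows that the same rule fails for almost any other two-digit fraction:

```python
from fractions import Fraction
from typing import Optional

def cancel_shared_digit(num: int, den: int) -> Optional[Fraction]:
    """Apply the bogus 'cancel a shared digit' rule to a two-digit fraction.

    Strikes one digit that appears in both numerator and denominator and
    returns the leftover fraction, or None if no digit is shared.
    """
    n, d = str(num), str(den)
    for digit in n:
        if digit in d:
            n2 = n.replace(digit, "", 1)
            d2 = d.replace(digit, "", 1)
            if n2 and d2 and d2 != "0":
                return Fraction(int(n2), int(d2))
    return None

# The fallacious cancellation happens to be correct here...
print(cancel_shared_digit(64, 16) == Fraction(64, 16))  # True: 4/1 == 64/16

# ...but the same "rule" is wrong almost everywhere else.
print(cancel_shared_digit(34, 46) == Fraction(34, 46))  # False: 3/6 != 34/46
```

An absurd method that sometimes yields a correct result disproves nothing about the result itself, which is the point of the entry.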

I apologize for not citing the source of this article. An internet search revealed that various versions exist in different places. If you know the original author, please let us know.