Logical Fallacies
Handlist:
Fallacies
are statements that might sound reasonable or superficially true
but are actually flawed or dishonest. When readers detect them,
these logical fallacies backfire by making the audience think
the writer is (a) unintelligent or (b) deceptive. It is important
to avoid them in your own arguments, and it is also important
to be able to spot them in others' arguments so a false line of
reasoning won't fool you. Think of this as intellectual kung-fu:
the vital art of self-defense in a debate. For extra impact, learn both
the Latin terms and the English equivalents.
In general, one useful way
to organize fallacies is by category. We have below fallacies
of relevance, component
fallacies, fallacies
of ambiguity, and fallacies
of omission. We will discuss each type in turn. The
last point to discuss is Occam's
Razor.
FALLACIES
OF RELEVANCE:
These fallacies
appeal to evidence or examples that are not relevant to the argument
at hand.
Appeal
to Force (Argumentum
Ad Baculum or the "Might-Makes-Right" Fallacy): This
argument uses force, the threat of force, or some other
unpleasant
backlash to make the audience accept a conclusion. It commonly
appears as a last resort when evidence or rational arguments
fail
to convince a reader. If the debate is about whether or
not 2+2=4, an opponent's argument that he will smash
your nose in if you don't agree with his claim doesn't change
the truth
of an issue. Logically, the threat has nothing to do
with the points under consideration. The fallacy is not limited
to
threats of violence, however. The fallacy includes threats of
any unpleasant backlash--financial, professional, and so on. Example:
"Superintendent, you should cut the school budget by $16,000.
I need not remind you that past school boards have fired superintendents
who cannot keep down costs." While intimidation may force the
superintendent to conform, it does not convince him that the
choice
to cut the budget was the most beneficial for the school or community.
Lobbyists use this method when they remind legislators that
they
represent so many thousand votes in the legislators' constituencies
and threaten to throw the politician out of office if he doesn't
vote the way they want. Teachers use this method if they state
that students should hold the same political or philosophical
position as the teachers or risk failing the class. Note that
it isn't a logical fallacy, however, to assert that students
must fulfill certain requirements in the course or risk failing
the class!
Genetic
Fallacy: The genetic fallacy is the claim that
an idea, product, or person must be untrustworthy because
of its racial, geographic,
or ethnic origin. "That car can't possibly be any good!
It was made in Japan!" Or, "Why should I listen to
her argument? She comes from California, and we all know those
people
are
flakes."
Or, "Ha! I'm not reading that book. It was published
in Tennessee, and we know all Tennessee folk are hillbillies
and
rednecks!" This type of fallacy is closely related to the
fallacy of argumentum ad hominem or personal
attack, appearing immediately below.
Personal
Attack (Argumentum Ad Hominem, literally,
"argument toward the man." Also called "Poisoning the Well"):
Attacking or praising the people who make an argument, rather
than discussing the argument itself. This practice is fallacious
because the personal character of an individual is logically
irrelevant
to the truth or falseness of the argument itself. The statement
"2+2=4" is true regardless if it is stated by criminals,
congressmen, or pastors. There are two subcategories:
(1)
Abusive: To argue that proposals, assertions, or arguments
must be false or dangerous because they originate with atheists,
Christians, Muslims, communists, capitalists, the John Birch Society,
Catholics, anti-Catholics, racists, anti-racists, feminists,
misogynists (or any other group) is fallacious. This persuasion
comes from irrational psychological transference rather than
from an appeal to evidence or logic concerning the issue at
hand. This is similar to the genetic
fallacy, and only an anti-intellectual would argue
otherwise.
(2)
Circumstantial: To argue that an opponent should accept
or reject an argument because of circumstances in his or her life. If
one's adversary is a clergyman, suggesting that he should accept
a particular argument because not to do so would be incompatible
with the scriptures is such a fallacy. To argue that, because
the reader is a Republican or Democrat, she must vote for a
specific measure is likewise a circumstantial fallacy. The opponent's
special circumstances have no bearing on the truth or untruth of a specific
contention. The speaker or writer must find additional evidence beyond those circumstances to make a strong case. This is also similar to the genetic
fallacy in some ways. If you are a college student
who wants to learn rational thought, you simply must avoid circumstantial
fallacies.
Argumentum
ad Populum (Literally "Argument to the People"):
Using an appeal to popular assent, often by arousing
the feelings and
enthusiasm of the multitude rather than building an argument.
It is a favorite device with the propagandist, the demagogue,
and the advertiser. An example of this type of argument is Shakespeare's
version of Mark Antony's funeral oration for Julius Caesar.
There
are three basic approaches:
(1)
Bandwagon Approach: “Everybody is doing it.”
This argumentum ad populum asserts that, since the
majority of people believes an argument or chooses a particular
course of action, the argument must be true, or the course of
action must be followed, or the decision must be the best choice.
For instance, “85% of consumers purchase IBM computers
rather than Macintosh; all those people can’t be wrong.
IBM must make the best computers.” Popular acceptance
of any argument does not prove it to be valid, nor does popular
use of any product necessarily prove it is the best one. After
all, 85% of people may once have thought planet earth was flat,
but that majority's belief didn't mean the earth really was
flat when they believed it! Keep this in mind, and remember
that everybody should avoid this type of logical fallacy.
(2)
Patriotic Approach: "Draping oneself in the flag." This
argument asserts that a certain stance is true or correct because
it is somehow patriotic, and that those who disagree are unpatriotic.
It overlaps with pathos
and argumentum
ad hominem to a certain extent. The best way
to spot it is to look for emotionally charged terms like Americanism,
rugged individualism, motherhood, patriotism, godless communism,
etc. A true American would never use this approach. And a truly
free man will exercise his American right to drink beer, since
beer belongs in this great country of ours. This approach is unworthy of a good citizen.
(3)
Snob Approach: This type of argumentum
ad populum doesn’t assert “everybody
is doing it,” but rather that “all the best people
are doing it.” For instance, “Any true intellectual
would recognize the necessity for studying logical fallacies.”
The implication is that anyone who fails to recognize the truth
of the author’s assertion is not an intellectual, and
thus the reader had best recognize that necessity.
In all three of these examples,
the rhetorician does not supply evidence that an argument
is true;
he merely makes assertions about people who agree or disagree
with the argument. For Christian students in religious schools
like Carson-Newman, we might add a fourth category, "Covering
Oneself in the Cross." This argument asserts that
a certain political or denominational stance is true or correct
because it is somehow "Christian," and that anyone
who disagrees is behaving in an "un-Christian" or "godless"
manner. (It is similar to the patriotic
approach except it substitutes a gloss of piety
for patriotism.) Examples include the various "Christian
Voting Guides" that appear near election time, many of
them published by non-Church related organizations with hidden
financial/political
agendas, or the stereotypical crooked used-car salesman who keeps
a pair of bibles on his dashboard in order to win the trust
of
those he would fleece. Keep in mind Moliere's question in Tartuffe:
"Is not a face quite different than a mask?"
Is not the appearance of Christianity quite different than actual
Christianity? Christians should beware of such manipulation since
they are especially
vulnerable
to
it.
Appeal
to Tradition (Argumentum Ad Traditionem; aka Argumentum Ad Antiquitatem): This line
of thought asserts that a premise must be true because people
have always believed it or done it. For example, "We know the earth is flat because generations have thought that for centuries!" Alternatively, the appeal to tradition might conclude
that the premise has always worked in the past and will thus always
work in the future: “Jefferson City has kept its urban growth
boundary at six miles for the past thirty years. That has been
good enough for thirty years, so why should we change it now?
If it ain’t broke, don’t fix it.” Such an argument
is appealing in that it seems to be common sense, but it ignores
important questions. Might an alternative policy work even better
than the old one? Are there drawbacks to that long-standing policy?
Are circumstances changing from the way they were thirty years
ago? Has new evidence emerged that might throw that long-standing policy into doubt?
Appeal
to Improper Authority (Argumentum Ad Verecundiam, literally
"argument from modesty" or reverence): An appeal to an improper
authority, such as a famous person or a source that may not
be reliable or may not know anything about the topic. This fallacy attempts to capitalize upon feelings of
respect or familiarity with a famous individual. It is not fallacious
to refer to an admitted authority if the individual’s expertise
is within a strict field of knowledge. On the other hand, to
cite
Einstein to settle an argument about education or economics is
fallacious. To cite Darwin, an authority on biology, on religious
matters is fallacious. To cite Cardinal Spellman on legal problems
is fallacious. The worst offenders usually involve movie stars
and psychic hotlines. A subcategory is the Appeal
to Biased Authority. In this sort of appeal, the authority
is one who actually is knowledgeable on the matter, but one
who may have
professional or personal motivations that render his professional
judgment suspect: for instance, "To determine whether fraternities
are beneficial to this campus, we interviewed all the frat
presidents."
Or again, "To find out whether or not sludge-mining
really is endangering the Tuskogee salamander's breeding grounds,
we interviewed the owners of the sludge-mines, who declared
there is no problem." Indeed, it is important to get "both
viewpoints" on an argument, but basing a substantial part of
your argument on a source that has personal, professional,
or financial
interests at stake may lead to biased arguments. As Upton Sinclair once stated, "It's difficult to get a man to understand something when his salary depends upon his not understanding it." Sinclair is pointing out that even a knowledgeable authority might not be entirely rational on a topic when he has economic incentives that bias his thinking.
Appeal
to Emotion (Argumentum Ad Misericordiam,
literally, "argument from pity"): An emotional appeal concerning
what should be a logical issue during a debate. While pathos
generally works to reinforce a reader’s sense of duty or
outrage at some abuse, if a writer tries to use emotion merely
for the sake of getting the reader to accept what should be a
logical conclusion, the argument is a fallacy. For example, in
the 1880s,
prosecutors
in a Virginia court presented overwhelming proof that a boy was
guilty of murdering his parents with an ax. The defense presented
a "not-guilty" plea for on the grounds that the boy
was now an orphan, with no one to look after his interests if
the court was not lenient. This appeal to emotion obviously seems
misplaced, and the argument is irrelevant to the question of
whether
or not he did the crime.
Argument from Adverse Consequences: Asserting that an argument must be false because the implications of it being true would create negative results. For instance, “The medical tests show that Grandma has advanced cancer. However, that can’t be true because then she would die! I refuse to believe it!” The argument is illogical because truth and falsity are not contingent upon how much we like or dislike the consequences of that truth. Grandma, indeed, might have cancer, in spite of how negative that fact may be or how cruelly it may affect us.
Argument from Personal Incredulity: Asserting that an opponent’s argument must be false because you personally don’t understand it or can’t follow its technicalities. For instance, one person might assert, “I don’t understand that engineer’s argument about how airplanes can fly. Therefore, I cannot believe that airplanes are able to fly.” Au contraire, that speaker’s own mental limitations do not limit the physical world, so airplanes may very well be able to fly in spite of a person's inability to understand how they work. One person’s comprehension is not relevant to the truth of a matter.

COMPONENT FALLACIES:
Component fallacies are errors in inductive and deductive reasoning
or in syllogistic terms that fail to overlap.
Begging
the Question (also called Petitio Principii,
this term is sometimes used interchangeably with Circular
Reasoning): If writers assume as evidence
for their argument the very conclusion they are attempting
to prove, they
engage in the fallacy of begging the question. The most common
form of this fallacy is when the first claim is initially
loaded
with the very conclusion one has yet to prove. For instance,
suppose a particular student group states, "Useless
courses like English 101 should be dropped from the college's
curriculum."
The members of the student group then immediately move on in
the argument, illustrating that spending money on a useless
course
is something nobody wants. Yes, we all agree that spending money
on useless courses is a bad thing. However, those students never
did prove that English 101 was itself a useless course--they
merely "begged the question" and moved on to the next
"safe" part of the argument, skipping over the part
that's the real controversy, the heart of the matter, the most
important component. Begging the question is often hidden in
the form of a complex
question
(see below).
Circular
Reasoning is closely related to begging
the question. Often the writers using this fallacy take one idea and phrase it in two statements. The assertions
differ sufficiently to obscure the fact that the same proposition
occurs as both a premise and a conclusion. The speaker or author
then tries to "prove" his or her assertion by merely
repeating it in different words. Richard Whately wrote in Elements
of Logic (London 1826): “To allow every man unbounded
freedom of speech must always be on the whole, advantageous to
the state; for it is highly conducive to the interest of the community
that each individual should enjoy a liberty perfectly unlimited
of expressing his sentiments.” Obviously the premise is
not logically irrelevant to the conclusion, for if the premise
is true, the conclusion must also be true. The premise is useless
in proving the conclusion, however, because it merely assumes what it sets out to prove. In the example,
the author is repeating the same point in different words, and
then attempting to "prove" the first assertion with
the second one. A more complex but equally fallacious type of
circular reasoning is to create a circular chain of reasoning
like this one: "God exists." "How do you know that
God exists?" "The Bible says so." "Why should
I believe the Bible?" "Because it's the inspired word
of God." If we draw this out as a chart, it looks like this:

The so-called "final
proof" relies on unproven evidence set forth initially as
the subject of debate. Basically, the argument goes in an endless
circle, with each step of the argument relying on a previous one,
which in turn relies on the first argument yet to be proven. Surely
God deserves a more intelligible argument than the circular reasoning
proposed in this example!
Hasty
Generalization (Dicto Simpliciter, also called “Jumping
to Conclusions,” "Converse Accident"): Mistaken
use of inductive reasoning when there are too few samples to prove
a point. Example: "Susan failed Biology 101. Herman failed
Biology 101. Egbert failed Biology 101. I therefore conclude that
most students who take Biology 101 will fail it." In understanding
and characterizing general situations, a logician cannot normally
examine every single example. However, the examples used in inductive
reasoning should be typical of the problem or situation at hand.
Maybe Susan, Herman, and Egbert are exceptionally poor students.
Maybe they were sick and missed too many lectures that term to
pass. If a logician wants to make the case that most students
will fail Biology 101, she should (a) get a very large sample--at
least one larger than three--or (b) if that isn't possible, she
will need to go out of her way to prove to the reader that her
three samples are somehow representative of the norm.
If a logician considers only exceptional or dramatic cases and
generalizes a rule that fits these alone, the author commits the
fallacy of hasty generalization.
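To see numerically how weak such a sample is (a quick illustrative calculation, not part of the original example), suppose that in reality only half of all Biology 101 students fail. Three students chosen at random would then all turn out to be failures with probability (1/2) x (1/2) x (1/2) = 1/8, or 12.5%, so three observed failures are hardly strong evidence that most students fail the course, especially if Susan, Herman, and Egbert were not chosen at random in the first place.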
One
common type of hasty generalization is the Fallacy of
Accident. This error occurs when one applies a general
rule to a particular case when accidental circumstances render
the general rule inapplicable. For example, in Plato’s Republic,
Plato finds an exception to the general rule that one should return
what one has borrowed: “Suppose that a friend when in his
right mind has deposited arms with me and asks for them when he
is not in his right mind. Ought I to give the weapons back to
him? No one would say that I ought or that I should be right in
doing so. . . .” What is true in general may not be true
universally and without qualification. So remember, generalizations
are bad. All of them. Every single last one. Except, of course,
for those that are not.
Another
common example of this fallacy is the misleading statistic.
Suppose an individual argues that women must be incompetent drivers,
and he points out that last Tuesday at the Department of Motor
Vehicles, 50% of the women who took the driving test failed. That
would seem to be compelling evidence from the way the statistic
is set forth. However, if only two women took the test that day,
the results would be far less clear-cut. Incidentally, the cartoon
Dilbert makes much of an incompetent manager who cannot
perceive misleading statistics. He does a statistical study of
when employees call in sick and cannot come to work during the
five-day work week. He becomes furious to learn that 40% of office
"sick-days" occur on Mondays (20%) and Fridays (20%)--just
in time to create a three-day weekend. Suspecting fraud, he decides
to punish his workers. The irony, of course, is that these two
days compose 40% of a five-day work week, so the numbers are completely
average. Similar nonsense emerges when parents or teachers complain
that "50% of students perform at or below the national average
on standardized tests in mathematics and verbal aptitude."
Of course they do! The very nature of an average implies that!
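To make the arithmetic explicit (a worked check, not part of the cartoon): if sick days were spread perfectly evenly across the five workdays, each day would account for 1/5 = 20% of them, so Mondays and Fridays together would account for 2/5 = 40%, which is exactly the figure the manager finds so suspicious.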
False
Cause: This fallacy establishes a cause/effect relationship
that does not exist. There are various Latin names for various
analyses of the fallacy. The two most common include these types:
(1) Non Causa Pro Causa
(Literally, "Not the cause for a cause"): A
general, catch-all category for mistaking a false
cause of an event for the real cause.
(2) Post Hoc, Ergo Propter
Hoc
(Literally: "After this, therefore because of this"): This
type of false cause occurs when the writer mistakenly assumes
that, because the first event preceded the second event,
it must mean the first event caused the later one. Sometimes
it does, but sometimes it doesn't. It is the honest writer's
job to establish clearly that connection rather than merely
assert it exists. Example: "A black cat crossed my path
at noon. An hour later, my mother had a heart-attack. Because
the first event occurred earlier, it must have caused the
bad luck later." This is how superstitions begin.
The most common examples are arguments
that viewing a particular movie or show, or listening to a
particular type of music “caused” the listener
to perform an antisocial act--to snort coke, shoot classmates,
or take up a life of crime. These may be potential suspects
for the cause, but the mere fact that an individual watched the show
or listened to the music and subsequently behaved in a certain way does not
conclusively rule out other causes. Perhaps the listener
had
an abusive home-life or school-life, suffered from a chemical
imbalance leading to depression and paranoia, or made a bad
choice in his companions. Other potential causes must be
examined
before asserting that a single earlier event or circumstance alone caused an event or behavior
later. For more information, see correlation
and causation.
Irrelevant
Conclusion (Ignoratio Elenchi): This
fallacy occurs when a rhetorician adapts an argument purporting
to establish
a particular conclusion and directs it to prove a different conclusion.
For example, when a particular proposal for housing legislation
is under consideration, a legislator may argue that decent housing
for all people is desirable. Everyone, presumably, will
agree.
However, the question at hand concerns a particular measure.
The question really isn't, "Is it good to have decent
housing?"
The question really is, "Will this particular measure actually
provide it or is there a better alternative?" This type
of fallacy is a common one in student papers when students
use a
shared assumption--such as the fact that decent housing is a
desirable thing to have--and then spend the bulk of their essays
focused
on that fact rather than the real question
at issue. It's similar to begging
the question, above.
One
of the most common forms of Ignoratio Elenchi is the
"Red Herring." A red herring is a deliberate
attempt to change the subject or divert the argument from the
real question at issue to some side-point; for instance, “Senator
Jones should not be held accountable for cheating on his income
tax. After all, there are other senators who have done far worse
things.” Another example: “I should not pay a fine
for reckless driving. There are many other people on the street
who are dangerous criminals and rapists, and the police should
be chasing them, not harassing a decent tax-paying citizen like
me.” Certainly, worse criminals do exist, but that
is another issue! The questions at hand are (1) did the speaker
drive recklessly, and (2) should he pay a fine for it?
Another
similar example of the red herring is the fallacy known as Tu
Quoque (Latin for "And you too!"), which
asserts that the advice or argument must be false simply because
the person presenting the advice doesn't consistently follow it herself. For
instance, "Susan the yoga instructor claims that a low-fat diet and exercise are good for you--but I saw her last week pigging out on oreos, so her argument must be a load of hogwash." Or, "Reverend Jeremias claims that theft is wrong,
but how can theft be wrong if Jeremias himself admits he stole
objects when he was a child?" Or "Thomas Jefferson made many arguments about equality and liberty for all Americans, but he himself kept slaves, so we can dismiss any thoughts he had on those topics."
Straw
Man Argument: A subtype
of the
red herring,
this fallacy includes any lame attempt to "prove" an argument
by overstating, exaggerating, or over-simplifying the arguments
of the opposing side. Such an approach is building a straw man
argument. The name comes from the idea of a boxer or fighter
who
meticulously fashions a false opponent out of straw, like a scarecrow,
and then easily knocks it over in the ring before his admiring
audience. His "victory" is a hollow mockery, of course, because
the straw-stuffed opponent is incapable of fighting back. When
a writer makes a cartoon-like caricature of the opposing argument,
ignoring the real or subtle points of contention, and then proceeds
to knock
down each "fake" point one-by-one, he has created a straw man
argument.
For instance, one speaker
might be engaged in a debate concerning welfare. The opponent
argues, "Tennessee should increase funding to unemployed
single mothers during the first year after childbirth because
they need sufficient money to provide medical care for their newborn
children." The second speaker retorts, "My opponent
believes that some parasites who don't work should get a free
ride from the tax money of hard-working honest citizens. I'll
show you why he's wrong . . ." In this example,
the second speaker is engaging in a straw man strategy, distorting
the opposition's statement about medical care for newborn children
into an oversimplified form so he can more easily appear to "win."
However, the second speaker is only defeating a dummy-argument
rather than honestly engaging in the real nuances of the debate.
Non
Sequitur (literally, "It does not follow"): A non
sequitur is any argument that does not follow from the previous
statements. Usually what happened is that the writer leaped from
A to B and then jumped to D, leaving out step C of an argument
she thought through in her head, but did not put down on paper.
The phrase is applicable in general to any type of logical fallacy,
but logicians use the term particularly in reference to syllogistic
errors such as the undistributed
middle term, non
causa pro causa, and ignoratio
elenchi. A common example would be an argument
along these lines: "Giving up our nuclear arsenal in the
1980s weakened the United States' military. Giving up nuclear
weaponry also weakened China in the 1990s. For this reason, it
is wrong to try to outlaw pistols and rifles in the United States
today." There's obviously a step or two missing here.
The
"Slippery Slope" Fallacy (also called "The
Camel's Nose Fallacy") is a non
sequitur in which the speaker argues that, once
the first step is undertaken, a second or third step will inevitably
follow, much like the way one step on a slippery incline will
cause a person to fall and slide all the way to the bottom. It
is also called "the Camel's Nose Fallacy" because of
the image of a sheik who is afraid to let his camel stick its nose
into his tent on a cold night: the idea is that once the beast
sticks in its nose, it will inevitably stick in its head, and
then its neck, and eventually its whole body. However, this sort
of thinking does not allow for any possibility of stopping the
process. It simply assumes that, once the nose is in, the rest
must follow--that the sheik can't stop the progression once it
has begun--and thus the argument is a logical fallacy. For instance,
if one were to argue, "If we allow the government to infringe
upon our right to privacy on the Internet, it will then feel free
to infringe upon our privacy on the telephone. After that, FBI
agents will be reading our mail. Then they will be placing cameras
in our houses. We must not let any governmental agency interfere
with our Internet communications, or privacy will completely vanish
in the United States." Such thinking is fallacious; no logical
proof has been provided yet that infringement in one area will
necessarily lead to infringement in another, no more than a person
buying a single can of Coca-Cola in a grocery store would indicate
the person will inevitably go on to buy every item available in
the store, helpless to stop herself. So remember to avoid the
slippery slope fallacy; once you use one, you may find yourself
using more and more logical fallacies.
Either/Or
Fallacy (also called "the Black-and-White Fallacy," "Excluded Middle," "False Dilemma," or "False Dichotomy"): This fallacy occurs when a writer
builds an argument upon the assumption that there are only two
choices or possible outcomes when actually there are several.
Outcomes are seldom so simple. This fallacy most frequently appears
in connection to sweeping generalizations: “Either we must
ban X or the American way of life will collapse.” "We
go to war with Canada, or else Canada will eventually grow in
population and overwhelm the United States." "Either
you drink Burpsy Cola, or you will have no friends and no social
life." Either you must avoid either/or fallacies, or everyone
will think you are foolish.
Faulty
Analogy: Relying only on comparisons to prove a point
rather than arguing deductively and inductively. For example, “education
is like cake; a small amount tastes sweet, but eat too
much and your
teeth will rot out. Likewise, more than two years of education
is bad for a student.” The analogy is only acceptable
to the degree a reader thinks that education
is similar
to cake. As you can see, faulty analogies are like flimsy wood,
and just as no carpenter would build a house out of flimsy
wood,
no writer should ever construct an argument out of flimsy material.
Undistributed
Middle Term: A specific type of error in deductive reasoning
in which the middle term of a syllogism (the term shared by both premises
but absent from the conclusion) is never distributed, so the premises do not
guarantee that the other two terms overlap. Consider these two examples: (1) “All
reptiles are cold-blooded. All snakes are reptiles. Therefore, all snakes
are cold-blooded.” In the first example, the middle term
“reptiles” is distributed in the first premise, so everything in the
category “snakes” must fall inside the category “things-that-are-cold-blooded,”
and the syllogism is valid. (2) “All snails
are cold-blooded. All snakes are cold-blooded. Therefore, all snails are
snakes.” In the second example, the middle term “cold-blooded things”
is not distributed in either premise, so nothing guarantees that the categories
“snails” and “snakes” overlap at all.
Sometimes, equivocation
(see below) leads to an undistributed middle term.
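Put schematically (a sketch of the two forms, with S for the subject term, P for the predicate term, and M for the middle term shared by both premises):
\[
\text{Valid:}\quad \text{All } M \text{ are } P.\ \ \text{All } S \text{ are } M.\ \ \therefore\ \text{All } S \text{ are } P.
\]
\[
\text{Invalid:}\quad \text{All } S \text{ are } M.\ \ \text{All } P \text{ are } M.\ \ \therefore\ \text{All } S \text{ are } P.
\]
In the valid form, the middle term M (“reptiles”) is distributed in the first premise, so the premises force S inside P. In the invalid form, M (“cold-blooded things”) is never distributed, so S and P may each sit anywhere inside M without overlapping at all.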
Contradictory
Premises (also known as a logical paradox): Establishing
a premise in such a way that it contradicts another, earlier
premise.
For
instance, "If
God can do anything, he can make a stone so heavy that
he can't lift
it." The first premise establishes a deity that has the
irresistible capacity to move other objects. The second premise
establishes
an immovable object impervious to any movement. If a being capable
of moving anything exists, then by definition the immovable
object cannot exist, and vice-versa.
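Written out formally (a sketch of the logical structure, not part of the original example, with g standing for God and CanLift(g, x) meaning “g can lift x”), the two premises cannot both be true:
\[
P_1:\ \forall x\,\mathrm{CanLift}(g, x) \qquad\qquad P_2:\ \exists s\,\neg\mathrm{CanLift}(g, s)
\]
P_1 says that g can lift every object, including any stone s; P_2 says that some stone s exists that g cannot lift. Asserting both at once yields CanLift(g, s) and not-CanLift(g, s), a contradiction.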
Closely related is the fallacy of Special Pleading, in which the writer creates a universal principle, then insists that principle does not for some reason apply to the issue at hand. For instance, “Everything must have a source or creator. Therefore God must exist and he must have created the world. What? Who created God? Well, God is eternal and unchanging--He has no source or creator.” In such an assertion, either God must have His own source or creator, or else the universal principle of everything having a source or creator must be set aside—the person making the argument can’t have it both ways.

FALLACIES
OF AMBIGUITY: These errors occur
with ambiguous words or phrases, the meanings of which shift and
change in the course of discussion. Such more or less subtle changes
can render arguments fallacious.
Equivocation:
Using a word in a different way than the author used it in the
original premise, or changing definitions halfway through a discussion.
When we use the same word or phrase in different senses within
one line of argument, we commit the fallacy of equivocation. Consider
this example: “Plato says the end of a thing is its perfection;
I say that death is the end of life; hence, death is the perfection
of life.” Here the word end means "goal"
in Plato's usage, but it means "last event" or "termination"
in the author's second usage. Clearly, the speaker is twisting
Plato's meaning of the word to draw a very different conclusion.
Compare with amphiboly,
below.
Amphiboly
(from the Greek word "indeterminate"): This fallacy is similar
to equivocation. Here, the ambiguity results from grammatical
construction. A statement may be true according to one interpretation
of how each word functions in a sentence and false according to
another. When a premise works with an interpretation that is true,
but the conclusion uses the secondary "false" interpretation,
we have the fallacy of amphiboly on our hands. In the command,
"Save soap and waste paper," the amphibolous use of "waste" results
in the problem of determining whether "waste" functions as a verb
or as an adjective.
Composition:
This fallacy is a result of reasoning from the properties of the
parts of the whole to the properties of the whole itself--it is
an inductive error. Such an argument might hold that, because
every individual part of a large tractor is lightweight, the entire
machine also must be lightweight. This fallacy is similar to Hasty
Generalization (see above), but it focuses on parts
of a single whole rather than using too few examples to create
a categorical generalization. Also compare it with Division
(see below).
Division:
This fallacy is the reverse of composition.
It is the misapplication of deductive reasoning. One fallacy of
division argues falsely that what is true of the whole must be
true of individual parts. Such an argument notes that, "Microtech
is a company with great influence in the California legislature.
Egbert Smith works at Microtech. He must have great influence
in the California legislature." This is not necessarily true.
Egbert might work as a graveyard shift security guard or as the
copy-machine repairman at Microtech--positions requiring little
interaction with the California legislature. Another fallacy of
division attributes the properties of the whole to the individual
member of the whole: "Sunsurf is a company that sells
environmentally safe products. Susan Jones is a worker at Sunsurf.
She must be an environmentally minded individual." (Perhaps
she is motivated by money alone?)
Fallacy of Reification (Also called “Fallacy of Misplaced Concreteness” by Alfred North Whitehead): The fallacy of treating a word or an idea as equivalent to the actual thing represented by that word or idea, or the fallacy of treating an abstraction or process as equivalent to a concrete object or thing. In the first case, we might imagine a reformer trying to eliminate illicit lust by banning all mention of extra-marital affairs or certain sexual acts in publications. The problem is that eliminating the words for these deeds is not the same as eliminating the deeds themselves. In the second case, we might imagine a person or a government declaring “a war on poverty.” In this case, the fallacy comes from the fact that “war” implies a concrete struggle with another concrete entity which can surrender or be exterminated. “Poverty,” however, is an abstraction that cannot surrender or sign peace treaties, cannot be shot or bombed, etc. Reification of the concept merely muddles the issue of what policies to follow and leads to sloppy thinking about the best way to handle a problem. It is closely related to and overlaps with faulty analogy and equivocation.

FALLACIES
OF OMISSION:
These errors occur because the logician leaves out necessary material
in an argument or diverts attention away from the missing information.
Stacking
the Deck: In this fallacy, the speaker "stacks the
deck" in her favor by ignoring examples that disprove the
point and listing only those examples that support her case.
This fallacy is closely related to hasty generalization, but the
term usually implies deliberate deception rather than an accidental
logical error. Contrast it with the straw
man argument.
‘No True Scotsman’ Fallacy: Attempting to stack the deck specifically by defining terms in such a narrow or unrealistic manner as to exclude or omit relevant examples from a sample. For instance, suppose speaker #1 asserts, “The Scottish national character is brave and patriotic. No Scottish soldier has ever fled the field of battle in the face of the enemy.” Speaker #2 objects, “Ah, but what about Lucas MacDurgan? He fled from German troops in World War I.” Speaker #1 retorts, “Well, obviously he doesn’t count as a true Scotsman because he did not live up to Scottish ideals, thus he forfeited his Scottish identity.” By this fallacious reasoning, any individual who would serve as evidence contradicting the first speaker’s assertion is conveniently and automatically dismissed from consideration. We commonly see this fallacy when a company asserts that it cannot be blamed for one of its particularly unsafe or shoddy products because that particular one doesn’t live up to its normally high standards, and thus shouldn’t “count” against its fine reputation. Likewise, defenders of Christianity as a positive historical influence in their zeal might argue the atrocities of the eight Crusades do not “count” in an argument because the Crusaders weren’t living up to Christian ideals, and thus aren’t really Christians, etc. So, remember this fallacy. Philosophers and logicians never use it, and anyone who does use it by definition is not really a philosopher or logician.
Argument
from the Negative: Arguing from the negative asserts
that, since one position is untenable, the opposite stance must
be true. This fallacy is often used interchangeably with Argumentum
Ad Ignorantiam (listed below) and the either/or
fallacy (listed above). For instance, one might
mistakenly argue that, since the Newtonian theory of mathematics
is not one hundred percent accurate, Einstein’s theory of
relativity must be true. Perhaps not. Perhaps the theories of
quantum mechanics are more accurate, and Einstein’s theory
is flawed. Perhaps they are all wrong. Disproving an opponent’s
argument does not necessarily mean your own argument must be true
automatically, no more than disproving your opponent's assertion
that 2+2=5 would automatically mean your argument that 2+2=7 must
be the correct one. Keeping this in mind, students should remember that arguments from the negative are bad, so arguments from the positive must automatically be good.
Appeal
to a Lack of Evidence (Argumentum Ad Ignorantiam,
literally "Argument from Ignorance"): Appealing to a
lack of information to prove a point, or arguing that, since the
opposition cannot disprove a claim, the opposite stance must be
true. An example of such an argument is the assertion that ghosts
must exist because no one has been able to prove that they do
not exist. Logicians know this is a logical fallacy because no
competing argument has yet revealed itself.
Hypothesis
Contrary to Fact (Argumentum Ad Speculum):
Trying to prove something in the real world by using imaginary
examples
alone, or asserting that, if hypothetically X had occurred, Y would have been the result. For instance, suppose an individual
asserts that if Einstein had been aborted in utero,
the world would never have learned about relativity, or that
if
Monet
had been trained as a butcher rather than going to college, the
impressionistic movement would have never influenced modern
art.
Such hypotheses are misleading lines of argument because it is
often possible that some other individual would have solved
the
relativistic equations or introduced an impressionistic art style.
The speculation might make an interesting thought-experiment,
but it is simply useless when it comes to actually proving anything
about the real world. A common example is the idea that one "owes"
her success to another individual who taught her. For instance,
"You owe me part of your increased salary. If I hadn't taught
you how to recognize logical fallacies, you would be flipping
hamburgers at McDonald's for minimum wages right now instead
of taking in hundreds of thousands of dollars as a lawyer."
Perhaps. But perhaps the audience would have learned about logical
fallacies elsewhere, so the hypothetical situation described
is
meaningless.
Complex
Question (Also called the "Loaded Question"):
Phrasing a question or statement in such a way as to imply another
unproven statement is true without evidence or discussion. This
fallacy often overlaps with begging
the question (above), since it also presupposes a
definite answer to a previous, unstated question. For instance,
if I were to ask you “Have you stopped taking drugs yet?”
my hidden supposition is that you have been taking drugs.
Such a question cannot be answered with a simple yes or no.
It is not a simple question but consists of several questions
rolled into one. In this case the unstated question is, “Have
you taken drugs in the past?” followed by, “If you
have taken drugs in the past, have you stopped taking them now?”
In cross-examination, a lawyer might ask a flustered witness,
“Where did you hide the evidence?” or "when did
you stop beating your wife?" The intelligent procedure when
faced with such a question is to analyze its component parts.
If one answers or discusses the prior, implicit question first,
the explicit question may dissolve.
Complex questions appear
in written argument frequently. A student might write, “Why
is private development of resources so much more efficient than
any public control?” The rhetorical question leads directly
into his next argument. However, an observant reader may disagree,
recognizing the prior, implicit question remains unaddressed.
That question is, of course, whether private development of resources
really is more efficient in all cases, a point which
the author is skipping entirely and merely assuming to be true
without discussion.

To master logic more fully, become familiar with the tool of Occam's
Razor.