It’s no secret to people who read my essays, posts and comments that I am unabashedly critical of far-right-wing thought in America. That is not to say that there are no rational and humane conservative ideas, and no rational and humane conservatives, but rather that the current dominant brand of conservatism in America is neither rational nor humane (and it is this more extreme, currently popular version that I am referring to when I refer to “right-wing” thought). This is not a unique perspective, nor is it unusual for an intellectual to hold it; indeed, intellectualism is explicitly disdained by the ideological camp in question. Precisely those professions that methodically gather, verify, analyze and contemplate information are dismissed as bastions of liberal bias, and the (undoubtedly fallible) conclusions arrived at by those professional disciplines and held by the majority in rational deference to the greater reliability of such information are considered by right-wingers to be inferior to their own arbitrary, dogmatic false certainties.
Though we will not win the battle of narratives through rational argumentation alone, we will win it by driving home the fact that we are promoting the narrative of reason and humanity, because whether people actually engage in rational thought or not, the overwhelming majority recognize in principle the greater credibility of rationally over irrationally derived conclusions. The more that rational and humane people drive home the fact that they ARE rational and humane people, opposing the ideologies of irrationality and inhumanity, the more successful we will be in the battle of narratives that is the political arena. Therefore, be prepared, in every debate with a right-wing ideologue (or even, as is sometimes the case, an irrational left-wing ideologue) to mobilize formal logic and to name formal logical fallacies, or to describe specific fallacies routinely employed by right-wing ideologues. Let’s distinguish ourselves from them by looking like, as well as being, the voice of reason and humanity, because it is by making that distinction constantly and abundantly clear that we will move this country and this world in positive directions.
I’ve examined the very abundant universe of right-wing fallacy from many angles, tackling specific dimensions, specific issues, and specific aspects of it. But I’m not sure if I’ve yet published (on this blog) my growing typology of specific fallacies most particular to right-wing argumentation. Some don’t fit neatly into the list of conventional logical fallacies, or are very particular variations of them, and those are the ones I shall address first, because I find them the most interesting.
For instance, I’m fascinated by what I call “the right-wing two-step,” which involves first insulating poorly informed and poorly argued opinions from critical analysis on the basis of a relativistic argument, and then promoting them to the status of unassailable absolute truth on the basis of the argument that to fail to do so would be to commit the error of relativism. This fallacy, most common among right-wing evangelicals, is so luxurious in its absurdity that one has to admire the poetry of dogged ignorance that it represents.
It operates as follows: In Conversation 1, a right-wing opinion is challenged on the basis of a factual and rational critical argument, to which the right-wing ideologue replies, “I’m sure that there are equally good arguments supporting my position” or “whose reason, yours or mine?” as if there is no such thing as “reason” which exists independently of each person’s arbitrary claim to it. The right-wing ideologue will dismiss the critical argument not with a counterargument of any kind, but with an assertion of the equality of all opinions, and the right of each to have their own. In this way, they insulate their own opinion from any intrusion of fact or reason.
In Conversation 2, the right-wing ideologue is challenged on the more general basis that there are many different people from many different ideological camps who are as certain of their absolute truths as the right-wing ideologue is of his, and that there is no a priori reason for assuming that one is correct and the others false (this would be a good introduction to the critical challenge posed in Conversation 1, if it could get that far). This is in fact similar to the reasoning that the right-wing ideologue used in Conversation 1 to insulate his ideology from fact and reason, but rather than using it to bring the certainty of his own dogma into question, he uses it to reduce any other contention to a condition of a priori equality to his own. Now, however, in Conversation 2, he rejects that same line of reasoning, insisting that to accept it is to commit the error of relativism by failing to recognize that there IS one absolute truth rather than a variety of competing truths!
So, first, his opinion can’t be challenged because all opinions are equal, and then no other opinion can be considered because there is only one absolute truth, and since his can’t be challenged it must be that one absolute truth! It’s hard to overstate the wonder of such perfect irrationality.
It’s worth emphasizing that the actual order of the conversations is irrelevant. I’ve ordered them as I have because I believe that that is the order in which they are used to insulate one’s own fortress of ideological dogma from both specific and general critical examination: the specific insulated against by a general argument, and the general insulated against by an appeal to specificity. This is a very particular and elaborate form of circular reasoning (a close cousin of the “begging the question” fallacy described below).
The right-wing two-step is a particular variation of the broad spectrum of prevalent right-wing fallacy that involves selective perception, cherry-picked evidence, and what I call “pettifogging,” or the obfuscation of the big picture and the overwhelming thrust of evidence and reason by means of relentless picking at peripheral and often barely relevant details. This generally involves the narrowing of the frame of reference so as to ignore all contextual information, and focusing on anomalous or isolated information that supports their conclusions (and can generally be easily explained in the context of opposing conclusions) while ignoring the overall weight of the evidence comprehensively considered.
The George Zimmerman trial and the public debates surrounding it provide an excellent recent example of the narrowing of the frame of reference to an isolated instant, conveniently filtering out any consideration of the context leading up to that instant. By focusing exclusively on the moment in which the fatal shot was fired, and by assuming the facts most favorable to the person they most identify with (the guy going out with his gun to find “bad guys”), the far-right manages to disregard the fact that Zimmerman made aggressive choices that led to the shooting death, at his hands, of a kid who at least up until Zimmerman’s choice to follow him with a gun, had been committing no crime. But for Zimmerman’s choices to arm himself, leave his home, identify a black teen walking home from the store as suspicious looking, and stalk him, the violent encounter that led to Zimmerman shooting that teen to death would never have occurred. But by narrowing the frame to the instant of the shooting itself, this fact can be completely disregarded and the challenge it poses to their overall ideology ignored.
Another variation of this fallacy involves responding to statistical evidence with anecdotal evidence, as if finding any case that does not support the statistical correlation is disproof of that correlation’s validity. This is a favorite technique in arguments over gun regulations, when the statistical evidence demonstrating a positive correlation both domestically and internationally among developed nations of gun ownership and homicide rates is dismissed on the basis of the trope that “Chicago (or Washington DC) has the strictest gun regulations in America and the highest homicide rates,” or “crime rates went up right after gun regulations were implemented in X locale.” Sometimes this is true (sometimes invented), but the real point is that it is anecdotal, and not a meaningful response to the statistical data which does not cherry-pick convenient cases but rather considers all cases at once. (It also ignores the obvious causal relationship that jurisdictions with high homicide rates and strict gun laws generally implemented the latter in response to the former.)
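The hollowness of the anecdotal rejoinder can be illustrated with a quick simulation. This is a minimal sketch in Python using purely synthetic, made-up numbers (no real crime data whatsoever): even when a positive correlation genuinely holds across a whole set of jurisdictions, individual counterexamples are mathematically guaranteed to exist, so citing one proves nothing.

```python
import random

random.seed(1)

# Hypothetical, synthetic data: homicide rate rises with gun prevalence
# on average, plus noise. None of these figures are real-world numbers.
jurisdictions = []
for i in range(200):
    prevalence = random.uniform(0, 1)          # gun prevalence (0 to 1)
    noise = random.gauss(0, 2)
    homicide = 2 + 6 * prevalence + noise      # true positive relationship
    jurisdictions.append((prevalence, homicide))

def pearson(pairs):
    # Standard Pearson correlation coefficient, computed by hand.
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x, _ in pairs)
    vy = sum((y - my) ** 2 for _, y in pairs)
    return cov / (vx * vy) ** 0.5

# The population-level correlation is clearly positive...
print(f"correlation: {pearson(jurisdictions):.2f}")

# ...yet anecdotal counterexamples still abound: low-prevalence
# jurisdictions deadlier than at least one high-prevalence one.
low = [h for p, h in jurisdictions if p < 0.25]
high = [h for p, h in jurisdictions if p > 0.75]
counterexamples = sum(1 for h in low if h > min(high))
print(f"anecdotal counterexamples: {counterexamples} of {len(low)}")
```

The counterexamples here are pure noise, built into the data by construction; pointing to one of them no more refutes the underlying relationship than a lottery winner refutes the odds of the lottery.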
My favorite analogy for the fallacy of using anecdotal evidence for rejecting statistical evidence is that of arguing that wearing seat belts in a car increases the likelihood that you will die in a car accident. One can argue against that position, which we all know to be absurd, by citing the statistics which demonstrate it to be absurd. But if a right-winger had some ideological reason to want to arrive at the opposite conclusion, they could simply cite every example they can find of the anomalous event that someone wearing a seatbelt died as a result of wearing their seatbelt, thus in their mind disproving what the statistical evidence demonstrates. Or, ideologues engaging in pseudo-science can data-mine for anomalous correlations, such as (hypothetically) “people driving four-door sedans on city streets in the third largest urban area in Illinois between 10:00 pm and 11:00 pm on weekdays are more likely to die if they are wearing seatbelts than if not.”
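The data-mining version of the fallacy is just as easy to demonstrate. In the following sketch (synthetic data only; the “subgroups” are purely hypothetical, like the four-door-sedans-in-Illinois example above), seatbelts have no effect whatsoever by construction, yet scanning enough arbitrary subgroups reliably turns up some in which belted drivers appear markedly worse off:

```python
import random

random.seed(7)

# Purely random outcomes: seatbelt use has NO effect in this synthetic
# data. Both groups share an identical 5% underlying fatality rate.
def fatality_rates(n_drivers=500):
    belted = sum(random.random() < 0.05 for _ in range(n_drivers))
    unbelted = sum(random.random() < 0.05 for _ in range(n_drivers))
    return belted / n_drivers, unbelted / n_drivers

# "Mine" 1,000 arbitrary subgroups (think: car type x city x time slot)
# and keep any in which the belted rate happens to exceed the unbelted
# rate by 50% -- an apparent "seatbelts kill" anomaly.
anomalies = 0
for subgroup in range(1000):
    b, u = fatality_rates()
    if u > 0 and b > 1.5 * u:
        anomalies += 1

print(f"subgroups where belts 'increase' fatalities: {anomalies} of 1000")
```

Dozens of such “anomalies” appear even though the true effect is exactly zero; this is the multiple-comparisons problem, and it is why a correlation dredged out of many arbitrary slices of data carries almost no evidentiary weight.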
I’ve made the “cherry-picking” of the statistical correlation obvious in this case, in order to illustrate how it can be done (anomalous correlations can be found if you search long and hard enough) and the similarity to finding simple anecdotal anomalies to “refute” statistical evidence, but when used by right-wing ideologues, it is often less obvious to an untrained eye. (A favorite tactic, for instance, is to replace “firearms” with “rifles,” and then to cite homicide statistics by rifles as if rifles represented all firearms, often actually switching to “guns” from rifles when presenting the statistic.) John Lott’s study in “More Guns, Less Crime” for instance, has been widely criticized for the selection of parameters to arrive at desired conclusions, and has been rejected as invalid by a panel of experts convened by the National Research Council (as well as by numerous individual scholars), and yet is the study on which the most knowledgeable gun advocates almost exclusively rely.
(As a side note, this focus on anomalous data as a way of rebuffing the weight of all data considered comprehensively not only disregards that comprehensive weight, but also disregards the explanations for such anomalies within the context of the larger causal relationships suggested by the comprehensive data. For instance, even accepting, for the sake of argument, the validity of John Lott’s thoroughly rejected study finding a positive correlation between laxer gun regulations and lower violent crime rates, such a correlation would not necessarily imply that laxer regulation is the optimal solution to America’s comparatively very high rate of deadly violence. Several considerations point the other way. In a nation with no internal barriers to the movement of goods across state and municipal lines, local regulations are undermined by laxer regulations elsewhere; the statistical fact that the overwhelming majority of the guns used in the commission of crimes anywhere in America are initially bought in the jurisdictions with the laxest regulations reinforces this point. And the difference between local and global optima matters: in a gun-saturated society, laws that intensify “the war of all against all” might slightly reduce local deadly violence rates while keeping them far higher than in other nations that do not rely on “the war of all against all” to keep the peace. Understanding that distinction helps to put such anomalous evidence into perspective in the context of a comprehensive examination of the global evidence.)
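The local-versus-global-optimum point can be made concrete with a toy game-theoretic sketch. The payoff numbers below are entirely made up for illustration (this is the classic prisoner’s dilemma structure, not a model fitted to any real policy data): each side’s locally best move leaves everyone stuck at an outcome worse than the one cooperation would reach.

```python
# Illustrative, made-up payoffs: each jurisdiction chooses to "arm" or
# "disarm"; entries are (safety payoff for A, safety payoff for B).
payoffs = {
    ("disarm", "disarm"): (8, 8),   # global optimum: low violence overall
    ("disarm", "arm"):    (2, 9),
    ("arm",    "disarm"): (9, 2),
    ("arm",    "arm"):    (4, 4),   # the "war of all against all"
}

def best_response(opponent_choice):
    # Given the other side's choice, pick the option with the higher payoff.
    return max(["disarm", "arm"],
               key=lambda mine: payoffs[(mine, opponent_choice)][0])

# Arming is each side's best response no matter what the other does...
assert best_response("disarm") == "arm"
assert best_response("arm") == "arm"

# ...so both arm and land at (4, 4), even though (8, 8) was available.
print("equilibrium:", payoffs[("arm", "arm")],
      "vs. global optimum:", payoffs[("disarm", "disarm")])
```

The equilibrium at (4, 4) is a local optimum: neither side can improve by changing its own choice alone, yet both are worse off than at the globally optimal (8, 8). A local reduction in violence under one set of rules says nothing about whether a different set of rules could do far better overall.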
One elaboration of narrowing the frame of reference, that also segues nicely into the issue of “pettifogging” discussed next, is the right-wing shell-game of isolated attention. This usually takes the form of focusing on one peripheral fact or anomaly or doctored study, which, once debunked, is replaced with another, until, after having exhausted their available supply, they return to the first one as if it had never been debunked. This is the more general tactic of which “the right-wing two-step” discussed above is one variation.
By far the favorite technique in right-wing “debate” is the tactic of “pettifogging”: picking away at every marginal and barely relevant detail of an opposing argument in order to avoid having to confront the central argument itself. This involves questioning the credibility of the source, even when the sources used are generally considered among the most credible (Harvard and other university peer-reviewed studies and conventional journalistic reporting by major media outlets are all dismissed as products of a liberal propaganda machine, while the arbitrary products of what is in reality a propaganda machine are embraced without question); insisting that every inconvenient assertion be cited in every casual exchange (though no one else is doing so); and finding peripheral and often irrelevant details to obsess over (definitions of conventionally understood terms, etc.). In this way, they can endlessly monopolize the time and energy of anyone arguing against any position they hold, never permitting the argument to be compiled and presented in any coherent form, always derailing it with a barrage of irrelevant and peripheral demands, eventually wearing down the critic and then claiming victory for having done so.
There is a hybrid fallacy that merits mention, even weaker than the others that it resembles: changing the subject entirely. It has some straw man aspects (arguing against a position on an unrelated issue that no one has advanced at all, rather than against a caricature of a position actually advanced on the issue at hand), some pettifogging aspects (picking at something not merely barely relevant and marginal, but completely irrelevant and not even marginal), and some shell game aspects (not merely switching among distinct issues within the same argument, but switching to another topic altogether). A very recent example: after I provided comprehensive evidence debunking the notion that our gun culture has no relation to our rates of deadly violence, my opponent said, “so you must love ‘Fast & Furious,’ then.” The discussion, of course, had no relation to that bungled Obama administration program; the idea was simply to get me on the defensive about something, anything, no matter how irrelevant it might be.
One last technique merits mention: Rejection by typology. This usually involves some label imbued with a strong negative judgment, and the shoving of all things to be critiqued into that label, the assumption being that by doing so the defectiveness of the thing so labeled has been proved. The most common label is “socialist” (though libertarians are increasingly fond of “statist” instead, imbuing the identical folly with a false veneer of intellectualism that the overuse of the word “socialism” lacks), and its use incorporates an element of the cherry-picking fallacy described above. By this technique, all governments with large administrative infrastructures are “socialist” or “statist,” and all socialist or statist countries are known to have been dismal failures. The problem is that using a definition that broad renders the second point simply false, since every single modern, prosperous, free nation on Earth has a large administrative infrastructure, and has had such an infrastructure in place since prior to participating in the historically unprecedented post-WWII expansion in the production of prosperity.
What really distinguishes the famously failed “socialist” or “statist” countries from the famously successful ones that share that completely non-distinguishing trait are a set of other variables: Freedom of speech and press, relatively legitimate democratic processes, and protection of individual civil rights and due process in criminal proceedings. The existence of a large administrative state not only is not exclusively associated with failed states, but, in fact, the most successful states all, without exception, have such large administrative infrastructures, and have had them for generations. This fallacy combines the “false dichotomy” fallacy described below (i.e., there are just two categories of states, socialist and non-socialist) with the selective perception tactic described above (only noticing those states with large administrative infrastructures that failed, and not those that comprise the entire set of the most successful political economies in human history).
Following is a fairly complete list of major logical fallacies, excerpted verbatim from “The Skeptics’ Guide to the Universe” website, which also includes a very good introduction on the structure of logical arguments (http://www.theskepticsguide.org/resources/logicalfallacies.aspx).
Ad hominem: An ad hominem argument is any that attempts to counter another’s claims or conclusions by attacking the person, rather than addressing the argument itself. True believers will often commit this fallacy by countering the arguments of skeptics by stating that skeptics are closed-minded. Skeptics, on the other hand, may fall into the trap of dismissing the claims of UFO believers, for example, by stating that people who believe in UFOs are crazy or stupid.

A common form of this fallacy is also frequently present in the arguments of conspiracy theorists (who also rely heavily on ad hoc reasoning). For example, they may argue that the government must be lying because they are corrupt.

It should be noted that simply calling someone a name or otherwise making an ad hominem attack is not in itself a logical fallacy. It is only a fallacy to claim that an argument is wrong because of a negative attribute of someone making the argument. (“John is a jerk” is not a fallacy; “John is wrong because he is a jerk” is a logical fallacy.)

The term “poisoning the well” also refers to a form of ad hominem fallacy. This is an attempt to discredit the argument of another by implying that they possess an unsavory trait, or that they are affiliated with other beliefs or people that are wrong or unpopular. A common form of this also has its own name – Godwin’s Law, or the reductio ad Hitlerum. This refers to an attempt at poisoning the well by drawing an analogy between another’s position and Hitler or the Nazis.
Ad ignorantiam: The argument from ignorance basically states that a specific belief is true because we don’t know that it isn’t true. Defenders of extrasensory perception, for example, will often overemphasize how much we do not know about the human brain. It is therefore possible, they argue, that the brain may be capable of transmitting signals at a distance.

UFO proponents are probably the most frequent violators of this fallacy. Almost all UFO eyewitness evidence is ultimately an argument from ignorance – lights or objects sighted in the sky are unknown, and therefore they are alien spacecraft.

Intelligent design is almost entirely based upon this fallacy. The core argument for intelligent design is that there are biological structures that have not been fully explained by evolution, therefore a powerful intelligent designer must have created them.

In order to make a positive claim, however, positive evidence for the specific claim must be presented. The absence of another explanation only means that we do not know – it doesn’t mean we get to make up a specific explanation.
Argument from authority: The basic structure of such arguments is as follows: Professor X believes A, Professor X speaks from authority, therefore A is true. Often this argument is implied by emphasizing the many years of experience, or the formal degrees held by the individual making a specific claim. The converse of this argument is sometimes used: that someone does not possess authority, and therefore their claims must be false. (This may also be considered an ad hominem logical fallacy – see above.)

In practice this can be a complex logical fallacy to deal with. It is legitimate to consider the training and experience of an individual when examining their assessment of a particular claim. Also, a consensus of scientific opinion does carry some legitimate authority. But it is still possible for highly educated individuals, and a broad consensus, to be wrong – speaking from authority does not make a claim true.

This logical fallacy crops up in more subtle ways also. For example, UFO proponents have argued that UFO sightings by airline pilots should be given special weight because pilots are trained observers, are reliable characters, and are trained not to panic in emergencies. In essence, they are arguing that we should trust the pilot’s authority as an eyewitness.

There are many subtypes of the argument from authority, essentially referring to the implied source of authority. A common example is the argument ad populum – a belief must be true because it is popular, essentially assuming the authority of the masses. Another example is the argument from antiquity – a belief has been around for a long time and therefore must be true.
Argument from Final Consequences: Such arguments (also called teleological) are based on a reversal of cause and effect, because they argue that something is caused by the ultimate effect that it has, or the purpose that it serves. Christian creationists have argued, for example, that evolution must be wrong because if it were true it would lead to immorality.

One type of teleological argument is the argument from design. For example: the universe has all the properties necessary to support life, therefore it was designed specifically to support life (and therefore had a designer).
Argument from Personal Incredulity: I cannot explain or understand this, therefore it cannot be true. Creationists are fond of arguing that they cannot imagine the complexity of life resulting from blind evolution, but that does not mean life did not evolve.
Begging the Question: The term “begging the question” is often misused to mean “raises the question” (and common use will likely change, or at least add, this new definition). However, the intended meaning is to assume a conclusion in one’s question. This is similar to circular reasoning, in that the argument tries to slip a conclusion into a premise or question – but it is not the same as circular reasoning, because the question being begged can be a separate point, whereas with circular reasoning the premise and conclusion are the same.

The classic example of begging the question is to ask someone if they have stopped beating their wife yet. Of course, the question assumes that they ever beat their wife.

In my appearance on the Dr. Oz show I was asked – what are alternative medicine skeptics (termed “holdouts”) afraid of? This is a double feature of begging the question. By using the term “holdout,” the question assumes that acceptance has already become the majority position and is inevitable. But also, Oz begged the question that skeptics are “afraid.” This also created a straw man (see below) of our position, which is rather based on a dedication to reasonable standards of science and evidence.
Confusing association with causation: This is similar to the post-hoc fallacy in that it assumes cause and effect for two variables simply because they occur together. This fallacy is often used to give a statistical correlation a causal interpretation. For example, during the 1990s both religious attendance and illegal drug use were on the rise. It would be a fallacy to conclude that, therefore, religious attendance causes illegal drug use. It is also possible that drug use leads to an increase in religious attendance, or that both drug use and religious attendance are increased by a third variable, such as an increase in societal unrest. It is also possible that both variables are independent of one another, and it is mere coincidence that they are both increasing at the same time.

This fallacy, however, has a tendency to be abused, or applied inappropriately, to deny all statistical evidence. In fact this constitutes a logical fallacy in itself, the denial of causation. This abuse takes two basic forms. The first is to deny the significance of correlations that are demonstrated with prospective controlled data, such as would be acquired during a clinical experiment. The problem with assuming cause and effect from mere correlation is not that a causal relationship is impossible; it’s just that there are other variables that must be considered and not ruled out a priori. A controlled trial, however, by its design attempts to control for as many variables as possible in order to maximize the probability that a positive correlation is in fact due to causation.

Further, even with purely epidemiological, or statistical, evidence it is still possible to build a strong scientific case for a specific cause. The way to do this is to look at multiple independent correlations to see if they all point to the same causal relationship. For example, it was observed that cigarette smoking correlates with getting lung cancer. The tobacco industry, invoking the “correlation is not causation” logical fallacy, argued that this did not prove causation. They offered as an alternate explanation “factor x,” a third variable that causes both smoking and lung cancer. But we can make predictions based upon the smoking-causes-cancer hypothesis. If this is the correct causal relationship, then duration of smoking should correlate with cancer risk, quitting smoking should decrease cancer risk, smoking unfiltered cigarettes should carry a higher cancer risk than filtered cigarettes, etc. If all of these correlations turn out to be true, which they are, then we can triangulate to the smoking-causes-cancer hypothesis as the most likely causal relationship, and it is not a logical fallacy to conclude from this evidence that smoking probably causes lung cancer.
Confusing currently unexplained with unexplainable: The fact that we do not currently have an adequate explanation for a phenomenon does not mean that it is forever unexplainable, or that it therefore defies the laws of nature or requires a paranormal explanation. An example of this is the “God of the Gaps” strategy of creationists: whatever we cannot currently explain is unexplainable and was therefore an act of God.
False Analogy: Analogies are very useful as they allow us to draw lessons from the familiar and apply them to the unfamiliar. Life is like a box of chocolates – you never know what you’re going to get.

A false analogy is an argument based upon an assumed similarity between two things, people, or situations when in fact the two things being compared are not similar in the manner invoked. Saying that the probability of a complex organism evolving by chance is the same as that of a tornado ripping through a junkyard and creating a 747 by chance is a false analogy. Evolution, in fact, does not work by chance but is the non-random accumulation of favorable changes.

Creationists also make the analogy between life and your home, invoking the notion of thermodynamics or entropy. Over time your home will become messy, and things will start to break down. The house does not spontaneously become cleaner or in better repair. The false analogy here is that a home is an inanimate collection of objects, whereas life uses energy to grow and reproduce – the addition of energy to the system of life allows for the local reduction in entropy, for evolution to happen.

Another way in which false analogies are invoked is to make an analogy between two things that are in fact analogous in many ways – just not the specific way being invoked in the argument. Just because two things are analogous in some ways does not mean they are analogous in every way.
False Continuum: The idea that because there is no definitive demarcation line between two extremes, the distinction between the extremes is not real or meaningful: there is a fuzzy line between cults and religions, therefore they are really the same thing.
False Dichotomy: Arbitrarily reducing a set of many possibilities to only two. For example: evolution is not possible, therefore we must have been created (this assumes these are the only two possibilities). This fallacy can also be used to oversimplify a continuum of variation into two black-and-white choices. For example, science and pseudoscience are not two discrete entities; rather, the methods and claims of all those who attempt to explain reality fall along a continuum from one extreme to the other.
Genetic Fallacy: The term “genetic” here does not refer to DNA and genes, but to history (and therefore a connection through the concept of inheritance). This fallacy assumes that something’s current utility is dictated and constrained by its historical utility. This is easiest to demonstrate with words – a word’s current use may be entirely unrelated to its etymological origins. For example, if I use the term “sunset” or “sunrise” I am not implying belief in a geocentric cosmology in which the sun revolves around the Earth and literally “rises” and “sets.”
Inconsistency: Applying criteria or rules to one belief, claim, argument, or position but not to others. For example, some consumer advocates argue that we need stronger regulation of prescription drugs to ensure their safety and effectiveness, but at the same time argue that medicinal herbs should be sold with no regulation for either safety or effectiveness.
No True Scotsman: This fallacy is a form of circular reasoning, in that it attempts to include a conclusion about something in the very definition of the word itself. It is therefore also a semantic argument.

The term comes from this example: if Ian claims that all Scotsmen are brave, and you provide a counterexample of a Scotsman who is clearly a coward, Ian might respond, “Well, then, he’s no true Scotsman.” In essence Ian claims that all Scotsmen are brave by including bravery in the definition of what it is to be a Scotsman. This argument does not establish any facts or new information, and is limited to Ian’s definition of the word “Scotsman.”
Non Sequitur: In Latin this term translates to “doesn’t follow.” It refers to an argument in which the conclusion does not necessarily follow from the premises. In other words, a logical connection is implied where none exists.
Post hoc ergo propter hoc: This fallacy follows the basic format of: A preceded B, therefore A caused B. It assumes cause and effect for two events just because they are temporally related (the Latin translates to “after this, therefore because of this”).
Reductio ad absurdum: In formal logic, the reductio ad absurdum is a legitimate argument. It follows the form that if the premises are assumed to be true, and they necessarily lead to an absurd (false) conclusion, then one or more premises must be false. The term is now often used to refer to the abuse of this style of argument, by stretching the logic in order to force an absurd conclusion. For example, a UFO enthusiast once argued that if I am skeptical about the existence of alien visitors, I must also be skeptical of the existence of the Great Wall of China, since I have not personally seen either. This is a false reductio ad absurdum because he is ignoring evidence other than personal eyewitness evidence, as well as logical inference. In short, being skeptical of UFOs does not require rejecting the existence of the Great Wall.
|Slippery Slope: This logical fallacy is the argument that a position is not consistent or tenable because accepting the position means that the extreme of the position must also be accepted. But moderate positions do not necessarily lead down the slippery slope to the extreme.|
|Special pleading, or ad hoc reasoning: This is a subtle fallacy which is often difficult to recognize. In essence, it is the arbitrary introduction of new elements into an argument in order to fix it so that it appears valid. A good example of this is the ad hoc dismissal of negative test results. For example, one might point out that ESP has never been demonstrated under adequate test conditions, and therefore ESP is not a genuine phenomenon. Defenders of ESP have attempted to counter this argument by introducing the arbitrary premise that ESP does not work in the presence of skeptics. This fallacy is often taken to ridiculous extremes, with more and more bizarre ad hoc elements added to explain experimental failures or logical inconsistencies.|
|Straw Man: A straw man argument attempts to counter a position by attacking a different position – usually one that is easier to counter. The arguer invents a caricature of his opponent’s position – a “straw man” – that is easily refuted, but that is not the position his opponent actually holds. For example, defenders of alternative medicine often argue that skeptics refuse to accept their claims because those claims conflict with the skeptics’ world-view, and that if “Western” science cannot explain how a treatment works, it is dismissed out of hand. If you read skeptical treatments of so-called “alternative” modalities, however, you will find the skeptical position much more nuanced than that. Claims are not dismissed a priori because they are not currently explained by science. Rather, in some cases (like homeopathy) there is a vast body of scientific knowledge that says that homeopathy is not possible. Having an unknown mechanism is not the same thing as being demonstrably impossible (at least as best as modern science can tell). Further, skeptical treatments of homeopathy often thoroughly review the clinical evidence. Even when the question of mechanism is put aside, the evidence shows that homeopathic remedies are indistinguishable from placebo – which means they do not work.|
|Tautology: Tautology in formal logic refers to a statement that must be true in every interpretation by its very construction. In rhetorical logic, it is an argument that utilizes circular reasoning, which means that the conclusion is also its own premise. Typically the premise is simply restated in the conclusion, without adding additional information or clarification. The structure of such arguments is A=B, therefore A=B, although the premise and conclusion might be formulated differently so that the circularity is not immediately apparent. For example, saying that therapeutic touch works because it manipulates the life force is a tautology, because the definition of therapeutic touch is the alleged manipulation (without touching) of the life force.|
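The formal-logic sense of “true in every interpretation” can be checked mechanically with a truth table: enumerate every assignment of true/false to the variables and verify the statement holds under all of them. Here is a minimal sketch in Python (the function and variable names are my own illustration, not from any particular logic library):

```python
from itertools import product

def is_tautology(formula, variables):
    """Return True if the formula is true under every possible
    assignment of truth values to its variables (a truth-table check)."""
    return all(
        formula(dict(zip(variables, values)))
        for values in product([True, False], repeat=len(variables))
    )

# "A or not A" holds in every interpretation: a tautology.
excluded_middle = lambda v: v["A"] or not v["A"]
print(is_tautology(excluded_middle, ["A"]))      # True

# "if A then B" fails when A is true and B is false: not a tautology.
implication = lambda v: (not v["A"]) or v["B"]
print(is_tautology(implication, ["A", "B"]))     # False
```

The rhetorical tautology is the degenerate case: an argument of the form “A, therefore A” is trivially true under this check precisely because it asserts nothing beyond its own premise.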
|The Fallacy Fallacy: As I mentioned near the beginning of this article, just because someone invokes an unsound argument for a conclusion, that does not necessarily mean the conclusion is false. A conclusion may happen to be true even if an argument used to support it is not sound. I may argue, for example, that Obama is a Democrat because the sky is blue – an obvious non-sequitur. But the conclusion, that Obama is a Democrat, is still true. Related to this, and common in the comments sections of blogs, is the position that because some random person on the internet is unable to defend a position well, the position is therefore false. All that has really been demonstrated is that the one person in question cannot adequately defend their position. This is especially relevant when the question is highly scientific, technical, or requires specialized knowledge. A non-expert likely does not have the knowledge at their fingertips to counter an elaborate, but unscientific, argument against an accepted science. “If you (a lay person) cannot explain to me,” the argument frequently goes, “exactly how this science works, then it is false.” Rather, such questions are better handled by actual experts. And, in fact, intellectual honesty requires that at least an attempt be made to find the best evidence and arguments for a position, as articulated by those with recognized expertise, and then to account for those arguments before a claim is dismissed.|
|The Moving Goalpost: A method of denial in which the criteria for “proof” or acceptance are arbitrarily moved out of range of whatever evidence currently exists. If new evidence comes to light meeting the prior criteria, the goalpost is pushed back further – keeping it out of range of the new evidence. Sometimes impossible criteria are set up at the start – moving the goalpost impossibly out of range – for the purpose of denying an undesirable conclusion.|
|Tu quoque: Literally, “you too.” This is an attempt to justify a wrong action on the grounds that someone else also does it. “My evidence may be invalid, but so is yours.”|