Addendum to Skeptic's Dictionary: Hidden Persuaders Of Anthropogenic Global Warming

(original here of course, with plenty of links to explore each dictionary entry below in depth)

(the text outside the blockquotes is (mostly) mine)

hidden persuaders: A term used by Geoffrey Dean and Ivan Kelly (2003) to describe affective, perceptual, and cognitive biases or illusions that lead to erroneous beliefs.

A NOTE TO THOSE OF AGW-BELIEVING ATTITUDE:

The hidden persuaders sometimes seem to affect people in proportion to their intelligence: the smarter one is, the easier it is to develop false beliefs. There are several reasons for this: (1) the hidden persuaders affect everybody to some degree; (2) the smarter one is, the easier it is to see patterns, fit data to a hypothesis, and draw inferences; (3) the smarter one is, the easier it is to rationalize, i.e., explain away strong evidence contrary to one’s belief; and (4) smart people are often arrogant and incorrectly think that they cannot be deceived by others, the data, or themselves.

And now for some examples:

 

ad hoc hypothesis: An ad hoc hypothesis is one created to explain away facts that seem to refute one’s belief or theory. Ad hoc hypotheses are common in paranormal research and in the work of pseudoscientists. It is always more reasonable to apply Occam’s razor than to offer speculative ad hoc hypotheses.

AGW example: The discovery that aerosols cooled the Earth just when the Earth was cooling, their effect miraculously declining exactly when the Earth was warming due to CO2 emissions.

AGW example: The discovery that heavy (winter) snow and cold temperatures are in fact caused by temperature increases.

 

affect bias: Our judgment regarding the costs and benefits of items is often significantly influenced by a feeling evoked by pictures or words not directly relevant to the actual cost or benefit

AGW example: Justifying reduction in CO2 emissions by way of how “green” things could become, and civilization “sustainable” in “harmony” with nature.

 

apophenia: Apophenia is the spontaneous perception of connections and meaningfulness of unrelated phenomena. “The propensity to see connections between seemingly unrelated objects or ideas most closely links psychosis to creativity … apophenia and creativity may even be seen as two sides of the same coin”. In statistics, apophenia is called a Type I error, seeing patterns where none, in fact, exist.

AGW example: The propensity to see Anthropogenic Global Warming at work in each and every (bad) thing that happens anywhere on Earth, including in earthquakes
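The Type I error mentioned in the definition is easy to demonstrate for oneself. Here is a minimal Python sketch (the coin-flip setup, trial counts and cutoff are my own illustrative assumptions, not the dictionary's) showing that pure noise reliably produces "significant" patterns:

```python
import random

def false_positive_rate(n_trials=2000, flips=100, cutoff=10, seed=42):
    """Flip a fair coin `flips` times per trial; flag the trial as a
    'discovery' when the heads count deviates from 50 by `cutoff` or
    more (roughly the two-sided 5% threshold for Binomial(100, 0.5)).
    Every flag is a Type I error: the coin is fair by construction."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        heads = sum(rng.random() < 0.5 for _ in range(flips))
        if abs(heads - 50) >= cutoff:
            hits += 1
    return hits / n_trials
```

With a perfectly fair coin, roughly one trial in twenty still clears the cutoff: patterns "found" where none, in fact, exist.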

 

autokinetic effect: The autokinetic effect refers to perceiving a stationary point of light in the dark as moving

AGW example: The incredible inability of past and present temperature measurements to record the actual values, leading contemporary researchers to continuously adjust the figures (lowering the old ones, raising the new ones).

 

availability error: availability heuristic, determining probability “by the ease with which relevant examples come to mind” (Groopman 2007: p. 64) or “by the first thing that comes to mind” (Sutherland 1992: p. 11)

AGW example: The IPCC declaring in 2007 that tens of thousands of indicators were all compatible with global warming, even if the overwhelming majority of those indicators concerned Europe alone.

 

backfire effect: The “backfire effect” is a term coined by Brendan Nyhan and Jason Reifler to describe how some individuals when confronted with evidence that conflicts with their beliefs come to hold their original position even more strongly

AGW example: AGWers patting each other on the back about climate science remaining totally unscathed by the Climategate e-mails.

 

change blindness: Change blindness is the failure to detect non-trivial changes in the visual field.

AGW example: The obsession with computing linear trends, making it impossible even to fathom the step-function behaviors (=”tipping points”) the very same AGWers like to talk about

 

Clever Hans phenomenon: A form of involuntary and unconscious cuing

AGW example: Journalist AGWers crowding RealClimate to know how long to count for

 

Clever Linda phenomenon: A form of involuntary and unconscious cuing

AGW example: Climate scientists writing to journalists making sure they conform, because fortunately, the prestige press doesn’t fall for this sort of stuff, right?

 

clustering illusion: The clustering illusion is the intuition that random events which occur in clusters are not really random events

AGW example: All the global village idiots who tell the world that climate change is upon us, as shown by the year’s news, rather than by relying on properly conducted scientific research capable of isolating climate-change effects from others such as poverty.
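How easily random events look clustered can be checked in a few lines of Python (the event and bin counts below are my own illustrative numbers): scatter events uniformly over bins, and the fullest bin will always look like a "hot spot".

```python
import random

def bin_counts(n_events=100, n_bins=20, seed=3):
    """Drop events uniformly at random into bins. Each bin expects
    n_events / n_bins = 5 events on average, yet the fullest bin
    routinely holds far more, with no cause behind it whatsoever."""
    rng = random.Random(seed)
    counts = [0] * n_bins
    for _ in range(n_events):
        counts[rng.randrange(n_bins)] += 1
    return counts
```

Point at the fullest bin after the fact, and uniform chance looks like a pattern begging for an explanation.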

 

cognitive dissonance: Cognitive dissonance is a theory of human motivation that asserts that it is psychologically uncomfortable to hold contradictory cognitions. The theory is that dissonance, being unpleasant, motivates a person to change his cognition, attitude, or behavior.

What distinguishes the chiropractor’s rationalization from the cult member’s is that the latter is based on pure faith and devotion to a guru or prophet, whereas the former is based on evidence from experience. Neither belief can be falsified because the believers won’t let them be falsified: nothing can count against them. Those who base their beliefs on experience and what they take to be empirical or scientific evidence (e.g., astrologers, palm readers, mediums, psychics, the intelligent design folks, and the chiropractor) make a pretense of being willing to test their beliefs. They only bother to submit to a test of their ideas to get proof for others. That is why we refer to their beliefs as pseudosciences. We do not refer to the beliefs of cult members as pseudoscientific, but as faith-based irrationality.

The chiropractors’ misguided belief is probably not due to worrying about their self-image or removing discomfort. It is more likely due to their being arrogant and incompetent thinkers, convinced by their experience that they “know” what’s going on, and probably assisted by communal reinforcement from the like-minded arrogant and incompetent thinkers they work with and are trained by. They’ve seen how AK works with their own eyes. They’ve demonstrated it many times. If anything makes them uncomfortable it might be that they can’t understand how the world can be so full of idiots who can’t see with their own eyes what they see!

AGW example: Thousands and thousands of words written by journalists, scientists and activists about anthropogenic global warming, and not one of them indicating what if anything could falsify…anthropogenic global warming

 

law of truly large numbers (coincidence): The law of truly large numbers says that with a large enough sample many odd coincidences are likely to happen.

AGW example: Romm scouring the planet’s press agencies to list all sorts of disasters that might somehow be connected to anthropogenic global warming
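The arithmetic behind the law is one line of Python (the one-in-a-million figure below is an illustrative assumption of mine, not a measured probability):

```python
def chance_of_at_least_one(p_single, n_opportunities):
    """For independent opportunities, P(at least one hit) = 1 - (1 - p)^n.
    Rare events become near-certain once n is large enough."""
    return 1 - (1 - p_single) ** n_opportunities

# A one-in-a-million daily coincidence, watched across ten million
# place-days, is all but guaranteed to show up somewhere:
near_certainty = chance_of_at_least_one(1e-6, 10_000_000)
```

With enough press agencies reporting on enough places on enough days, a steady stream of odd coincidences is a statistical certainty, not a signal.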

 

cold reading: Cold reading refers to a set of techniques used by professional manipulators to get a subject to behave in a certain way or to think that the cold reader has some sort of special ability that allows him to “mysteriously” know things about the subject

AGW example: The popularity of climate model ensembles among politicians looking for something to confirm they deserve to be voted for, and in the process getting convinced that science can really tell us something about the climate of 2100.

 

communal reinforcement: Communal reinforcement is the process by which a claim becomes a strong belief through repeated assertion by members of a community

AGW example: The tendency of warmist websites to censor dissenting comments away, leaving readers (believers) with the impression that there is really a huge number of them, and just a handful of nasty skeptics.

 

confabulation: A confabulation is a fantasy that has unconsciously emerged as a factual account in memory. A confabulation may be based partly on fact or be a complete construction of the imagination

AGW example: The decade-long fight to remove from collective memory the substantial agreement among scientists about global cooling (potentially, an ice age), a consensus that lasted at least from 1972 to 1975.

 

confirmation bias: Confirmation bias refers to a type of selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs

AGW example: Briffa’s uncanny ability to avoid for years any mention of the misbehaving trees he had himself published a paper about, in the Yamal saga

 

file-drawer effect: The file-drawer effect refers to the practice of researchers filing away studies with negative outcomes. Negative outcome refers to finding nothing of statistical significance or causal consequence, not to finding that something affects us negatively. Negative outcome may also refer to finding something that is contrary to one’s earlier research or to what one expects

AGW example: Extreme lack of interest among prominent climate scientists in publishing anything (not even an Op-Ed) about the “travesty” that was (is) their inability to explain why temperatures (actually, the averages of the global temperature anomaly) have not risen since 1998.

 

Forer effect: The Forer effect refers to the tendency of people to rate sets of statements as highly accurate for them personally even though the statements could apply to many people

AGW example: The worldwide phenomenon that sees most Ministers and Prime Ministers announce that their own particular country is being affected by climate change at twice or more the planetary average

 

gambler’s fallacy: The gambler’s fallacy is the mistaken notion that the odds for something with a fixed probability increase or decrease depending upon recent occurrences

AGW example: Tamino’s (?) absurdist blog about the probability of having consecutive hot periods being astronomically low
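A quick simulation (parameters mine) shows the structure of the fallacy, assuming independent fair coin flips (which yearly temperatures, with their autocorrelation, are not): the frequency of heads right after a long streak of heads is still about one half.

```python
import random

def heads_after_streak(streak=5, n_flips=200_000, seed=1):
    """Frequency of heads immediately following `streak` consecutive
    heads. For independent flips the streak tells us nothing: the
    answer hovers around 0.5, not anything 'astronomically low'."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]
    after = [flips[i] for i in range(streak, n_flips)
             if all(flips[i - streak:i])]
    return sum(after) / len(after)
```

The fixed-probability caveat is the whole game: before declaring a run of hot years improbable, one has to establish the events are independent with fixed odds, which is exactly what is in dispute.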

 

hindsight bias: Hindsight bias is the tendency to construct one’s memory after the fact (or interpret the meaning of something said in the past) according to currently known facts and one’s current beliefs. In this way, one appears to make the past consistent with the present and more predictive or predictable than it actually was.

AGW example: The Met Office discovering in January how it had forecast a cold December back in October, as shown by a statement nobody read at the time, and nobody has read since.

AGW example: The silly notion that Anthropogenic Global Warming was consensually recognized in the 1970s or even earlier.

 

inattentional blindness: Inattentional blindness is an inability to perceive something that is within one’s direct perceptual field because one is attending to something else

AGW example: The Lancet publishing an incredibly misleading Climate Change report with little mention of the huge difference in the number and type of deaths during cold and warm snaps.

AGW example: The complete lack of interest in linking the generalized Northern Hemisphere cold and the silent Sun.

 

magical thinking: According to anthropologist Dr. Phillips Stevens Jr., magical thinking involves several elements, including a belief in the interconnectedness of all things through forces and powers that transcend both physical and spiritual connections. Magical thinking invests special powers and forces in many things that are seen as symbols. One of the driving principles of magical thinking is the notion that things that resemble each other are causally connected in some way that defies scientific testing (the law of similarity).

AGW example: CO2’s mysterious ability to free the Arctic from the ice, and to increase the amount of ice in Antarctica, plus its long hand in anything and everything that ever happens and has bad consequences.

 

motivated reasoning: Motivated reasoning is confirmation bias taken to the next level. Motivated reasoning leads people to confirm what they already believe, while ignoring contrary data. But it also drives people to develop elaborate rationalizations to justify holding beliefs that logic and evidence have shown to be wrong

AGW example: The Anthropogenic Global Warming crowd’s supernatural swiftness in explaining every (bad) phenomenon as a consequence of human CO2 emissions.

 

nonfalsifiability: Scientific theories not only explain empirical phenomena, they also predict empirical phenomena. One way we know a scientific theory is no good is that its predictions keep failing. Predictions can’t fail unless a theory is falsifiable. Some pseudoscientific [theories] can’t be falsified because they are consistent with every imaginable empirical state of affairs. Karl Popper noted that psychoanalytic theory, including Freud’s theory of the Oedipus complex, is pseudoscientific because it seems to explain everything and does not leave open the possibility of error. Even contradictory behaviors are appealed to in support of the theory.

AGW example: Thousands and thousands of words written by journalists, scientists and activists about anthropogenic global warming, and not one of them indicating what if anything could falsify…anthropogenic global warming

 

positive-outcome (publication) bias: Positive-outcome (or “publication”) bias is the tendency to publish research with a positive outcome more frequently than research with a negative outcome. Negative outcome refers to finding nothing of statistical significance or causal consequence, not to finding that something affects us negatively. Positive-outcome bias also refers to the tendency of the media to publish medical study stories with positive outcomes much more frequently than such stories with negative outcomes

AGW example: The amount of time some highly-functioning minds have spent justifying scientifically the reasons for the “hide the decline”.

 

post hoc fallacy: The post hoc ergo propter hoc (after this therefore because of this) fallacy is based upon the mistaken notion that simply because one thing happens after another, the first event was a cause of the second event. Post hoc reasoning is the basis for many superstitions and erroneous beliefs

AGW example: The Anthropogenic Global Warming crowd’s supernatural completeness in explaining every (bad) phenomenon as a consequence of human CO2 emissions.

 

pragmatic fallacy: The pragmatic fallacy is committed when one argues that something is true because it works and where ‘works’ means something like “I’m satisfied with it,” “I feel better,” “I find it beneficial, meaningful, or significant,” or “It explains things for me.”

AGW example: The inane request to publish via peer-review a scientific alternative to mainstream Anthropogenic Global Warming theory because “it works”. One doesn’t need to be a leader or a tailor to see if the Emperor is naked.

 

regressive fallacy: The regressive fallacy is the failure to take into account natural and inevitable fluctuations of things when ascribing causes to them

AGW example: The general agreement that natural variability doesn’t count much for Anthropogenic Global Warming, even if the very same people go on to claim temperatures have not increased in a decade because of natural variability
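Regression to the mean is easy to reproduce. A Python sketch (the noise distribution and cutoff are my own assumptions) with nothing but noise around a fixed mean:

```python
import random

def rebound_after_extremes(n_years=50_000, cutoff=1.5, seed=7):
    """Generate noise-only 'anomalies' around a constant mean of zero,
    select the extreme years (anomaly > cutoff), and look at the year
    following each: the extremes average well above the cutoff, while
    the following years average back near zero."""
    rng = random.Random(seed)
    series = [rng.gauss(0.0, 1.0) for _ in range(n_years)]
    extreme = [i for i in range(n_years - 1) if series[i] > cutoff]
    mean_extreme = sum(series[i] for i in extreme) / len(extreme)
    mean_next = sum(series[i + 1] for i in extreme) / len(extreme)
    return mean_extreme, mean_next
```

Ascribe a cause to the extremes, and the inevitable pullback toward the mean will look like that cause switching itself on and off.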

 

representativeness error: In judging items, we compare them to a prototype or representative idea and tend to see them as typical or atypical according to how they match up with our model. The problem with the representativeness heuristic is that what appears typical sometimes blinds you to possibilities that contradict the prototype

AGW example: The sterile obsession with studying climate science by climate models alone

 

retrospective falsification: D. H. Rawcliffe coined this term to refer to the process of telling a story that is factual to some extent, but which gets distorted and falsified over time by retelling it with embellishments

AGW example: The abuse of Arrhenius’ “greenhouse gas” works: the first one is continuously mentioned exactly as the second one, a more sober rethinking of the original ideas, gets forgotten.

 

selection bias: Selection bias comes in two flavors: (1) self-selection of individuals to participate in an activity or survey, or as a subject in an experimental study; (2) selection of samples or studies by researchers to support a particular hypothesis

AGW example: Mann’s “obviously irrelevant” picking and choosing of which series to use for the Hockey Stick.

 

selective thinking: Selective thinking is the process whereby one selects out favorable evidence for remembrance and focus, while ignoring unfavorable evidence for a belief

AGW example: Any post at Skeptical Science, with its incredible list of peer-reviewed all-mutually-consistent scientific papers

 

self-deception: Self-deception is the process or fact of misleading ourselves to accept as true or valid what is false or invalid. Self-deception, in short, is a way we justify false beliefs to ourselves

AGW example: Connolley et al publishing an article about a “Myth” of global cooling consensus in the 1970s despite themselves providing ample evidence to support the same “myth”.

 

shoehorning: Shoehorning is the process of force-fitting some current affair into one’s personal, political, or religious agenda

AGW example: Also known as “decorating the Christmas tree”…at every UN climate negotiation, thousands of people try to add their pet project to the cause, including “forest protection, poverty alleviation, water equity, women’s and indigenous rights”.

 

subjective validation: Subjective validation is the process of validating words, initials, statements, or signs as accurate because one is able to find them personally meaningful and significant

AGW example: Anthropogenic Global Warming causing a (temporary?) shutdown in critical thinking among those worried about getting the world “greener”

 

sunk-cost fallacy: When one makes a hopeless investment, one sometimes reasons: I can’t stop now, otherwise what I’ve invested so far will be lost. This is true, of course, but irrelevant to whether one should continue to invest in the project. Everything one has invested is lost regardless. If there is no hope for success in the future from the investment, then the fact that one has already lost a bundle should lead one to the conclusion that the rational thing to do is to withdraw from the project

AGW example: The UN’s COP bandwagon, moving a lot of people a lot of times to a lot of different locations (but never to Moldova or North Korea, for some reason) even if everybody agrees it will never mean anything substantial.

 

anecdotal (testimonial) evidence: Testimonials and vivid anecdotes are one of the most popular and convincing forms of evidence presented for beliefs in the supernatural, paranormal, and pseudoscientific

AGW example: Monbiot’s famous February floral musings brought to the world as evidence of anthropogenic global warming, back when Februarys were still warm.

 

Texas-sharpshooter fallacy: The Texas-sharpshooter fallacy is the name epidemiologists give to the clustering illusion. Politicians, lawyers and some scientists tend to isolate clusters of diseases from their context, thereby giving the illusion of a causal connection between some environmental factor and the disease. What appears to be statistically significant (i.e., not due to chance) is actually expected by the laws of chance

AGW example: Pretty much any Al Gore speech.

 

wishful thinking: Wishful thinking is interpreting facts, reports, events, perceptions, etc., according to what one would like to be the case rather than according to the actual evidence

AGW example: Pretty much any warmist blog or statement.

=======

Obviously there are much better examples out there, so do send them across if you see any…