
Format of references

Should the references perhaps be made in APA format? They don't seem to be at the moment...

confirmation bias

Wason abandoned his confirmation bias account of the card selection task in favor of the matching bias account...so something needs to change.

The notion that the scientific method of falsification was invented "to compensate for this observed human tendency" is just plain nonsense. --Arno Matthias 14:51, 8 August 2006 (UTC)

Whoever added it might be confusing it with doing double-blind studies to minimise bias? David D. (Talk) 16:50, 15 September 2006 (UTC)
Probably. Falsificationism has better philosophical justifications than just compensating for us being human, after all. -- Gwern (contribs) 16:54, 15 September 2006 (UTC)

"confirmation bias" versus "biased assimilation" etc.

Psychologists generally use "confirmation bias" just to refer to a tendency to search for information confirming a hypothesis. This is different from a tendency to interpret new information in a way that is consistent with a previous belief. A hypothesis need not even be a belief, and confirmation bias for search seems to operate on different principles (based on cognitive constraints) from belief perseverance/biased assimilation, which is generally thought to be at least partly motivational. So I don't think the stuff about biased assimilation (e.g. Lord, Ross, and Lepper, 1979) should be referred to as "confirmation bias", but this would require more time for a rewrite than I have right now. If someone wants to split off this concept from biased assimilation, and create a separate article for the latter, that would be excellent. I may do it in the future if I have time. -- Trbdavies 17 September 2008 —Preceding undated comment was added at 06:52, 18 September 2008 (UTC).

deleted material

I deleted interesting, but verbatim, text copied from the Scientific American Skeptic column published in July 2006, titled The Political Brain - A recent brain-imaging study shows that our political predilections are a product of unconscious confirmation bias, by Michael Shermer. This content needs to be rewritten and cited. David D. (Talk) 22:32, 2 September 2006 (UTC)

If the participants in the study were strong partisans (of either party), it is probable that they already knew the political messages they were asked to assess. So why should they have resorted to reasoning instead of emotions, if what they heard was nothing new to them?

polarization effect

http://en.wikipedia.org/wiki/Polarization_effect seems to be related to confirmation bias —Preceding unsigned comment added by Sgeiger (talkcontribs)

need for cognitive closure

This article needs to go into an explanation of how confirmation bias does not affect everyone equally, but is largely influenced by situational pressures and by individual attributes. I'm not a psychology student so I can't write it, but I know, for example, that Webster & Kruglanski's Need for Closure Scale is one method used to determine an individual's need for cognitive closure and thus how susceptible they are to confirmation bias. Daniel 13:29, 27 October 2006 (UTC)

merge article: myside bias?

It appears that the article myside bias largely overlaps with this article. I proposed merging the article into this one. Another article to merge it with could be disconfirmation bias.

JuhazOne 23:18, 20 November 2006 (UTC)

They do look to be the same thing. If a source can be cited to that effect, then I agree that they should be merged. Ruakh 23:28, 20 November 2006 (UTC)
Yes. I would remove myside bias and just merge it here. Marky48 03:32, 12 February 2007 (UTC)
I agree. Confirmation bias is the basic phenomenon. Jeremy Tobacman 01:47, 18 February 2007 (UTC)
They should be merged, but there is much more merging yet to be done. Grumpyyoungman01 12:38, 3 March 2007 (UTC)

Big merger proposal

These are all obviously the same article under different names with different etymologies and different text. A term such as 'belief preservation' can take into account both 'confirmation' and 'disconfirmation' bias. But I see no need for a distinction in article titles between confirmation and disconfirmation, particularly as it is part of the same psychological process. More pages link to this article. We shouldn't mention all of the possible names for the cognitive bias in the introduction in the single merged article, only some, and have a naming section to systematically deal with the rest. Grumpyyoungman01 12:50, 3 March 2007 (UTC)

Oppose. Firstly, I'm not convinced that confirmation bias and disconfirmation bias are really part of the same process. Secondly, this proposal is way too vague for me to consider voting for it: exactly which articles are you saying should be merged? What should be the title of the resulting article? Will the resulting article discuss the differences, or just pretend that the terms are synonyms? —RuakhTALK 20:07, 3 March 2007 (UTC)

Comment. Confirmation and disconfirmation bias are part of the same process, just two different aspects of it, but complementary, and they should be in the same article. For instance, straight from the Wikipedia policy page on mergers, 'flammable' and 'non-flammable' should be combined in a page on flammability. From the disconfirmation page: "Disconfirmation bias refers to the tendency for people to extend critical scrutiny to information which contradicts their prior beliefs and uncritically accept information that is congruent with their prior beliefs." They are confirming their prior-held opinions. They do this in two ways: one is to uncritically accept congruent information, and the other is to critically assess contradictory information. Both of these are opposite sides of the same coin: the coin of maintaining your beliefs regardless of an objective analysis of the evidence.
The articles that should be merged into this article are shown on the template at the top of this article. The title of the resulting page shall be Confirmation bias, because this is the most comprehensive of the many pages dealing with the same thing and has many internal links pointing to it. If you think another name is appropriate, then after the merge, if it happens, you can suggest another name. At the moment I don't think that the name is important. The article would discuss the differences between the initial articles, which mainly revolve around who coined what terms and who did what research when, in a "naming" section or something like that.
In the merge itself, if it happens, no information that appeared on any original page should be lost; it would all be incorporated, and it would need some working out over a few days or weeks. Of course the current pages will not be deleted: a redirect will be left and the page history will be accessible. Grumpyyoungman01 09:53, 5 March 2007 (UTC)

Disconfirmation bias.

Disconfirmation bias was recently merged here, despite there not being a clear consensus about this (since it was rather mixed with the discussion of merging Myside bias). What do people think? Should Disconfirmation bias have its own article, or not? My vote is that it should; Grumpyyoungman01's is obviously that it should not. —RuakhTALK 15:48, 8 March 2007 (UTC)

As the only significant editor of disconfirmation bias has not edited WP for 18 months (Special:Contributions/Taak), and as there doesn't seem to be much activity on this page from other users, I suggest that you contact the people involved in Wikipedia:WikiProject Psychology. Otherwise I can just see two of us here and no consensus. Grumpyyoungman01 23:21, 8 March 2007 (UTC)

Pseudoscience

Can anyone justify why this article is categorised as Pseudoscience? Grumpyyoungman01 23:18, 8 March 2007 (UTC)

Real life example of confirmation bias

I've posted a quote from Sen. Sam Brownback in which he exhibits confirmation bias. Quote from [1] --Loren 10:41, 1 June 2007 (UTC)

Anthropogenic Global Warming (AGW) is a fantastic example of Confirmation Bias on a huge scale. Discussion of what is needed to outline this? Happy to start things off. --BadCop666 08:40, 28 October 2007 (UTC)

That would be original research.--Boffob 16:48, 28 October 2007 (UTC)

Political study is not confirmation bias

I wouldn't classify the political bias study as a study of confirmation bias, but a different bias phenomenon. In the Wason experiment, subjects were given a chance to test their hypotheses, and the choices they made showed the confirmatory bias. The study involving quotes from Bush and Kerry didn't require subjects to choose an information source or test their own hypotheses: it was more about interpretation and justification. The political bias study could be moved to another cognitive bias page, as it concerns motivated rationality, cognitive dissonance or similar. MartinPoulter 11:45, 1 August 2007 (UTC)

I think that both are indeed confirmation bias, but it's possible that confirmation bias is really two distinct things. Mirosław Kofta et al's Personal Control in Action: Cognitive and Motivational Mechanisms writes of "the two concurrently used and often confused meanings of the ‘confirmation bias’—tendentious handling of evidence and a confirmatory strategy of hypothesis testing" (pages 234–5). We can try to make the distinction more clear in the article, but I don't think we can move it to an entirely different page. —RuakhTALK 13:32, 1 August 2007 (UTC)
Thanks for this, I was unaware of that reference. It would be a good idea to mention the two kinds. I still think there will be a lot of overlap between an entry on motivated-biased-evaluation-of-evidence and cognitive dissonance. MartinPoulter 20:29, 8 August 2007 (UTC)
Two years after proposing that the political bias study was irrelevant, I've read up more about this subject and admit I was totally wrong. I've put it back in (with my own summary which focuses on the behavioral effects rather than the brain imaging results). MartinPoulter (talk) 14:53, 15 August 2009 (UTC)

An observation about a danger in the section "Naming".

It is stated in this section that confirmation bias is "also known as" many, many other biases. This suggests to the reader who is not up to date with the science that there is no difference between all these effects. So there is a very big danger that all the efforts science has made to work out the differences and relationships and to classify them carefully will have been for nothing, and people will go back to the previous state of affairs and call all these effects, biases and syndromes by the old unscientific name of "wishful thinking", because science will not tolerate a hundred names for the very same thing. -- Hanno Kuntze (talk) 20:10, 12 December 2007 (UTC)

Confirmation bias in Cold Reading / Psychic practices?

Hi All,

Just wondering if confirmation bias is the mechanism at work in Cold Reading / Professional Psychic practices, whereby a participant will often lend greater weight to accurate guesses made by the 'psychic' as opposed to inaccurate ones? If so, should this be mentioned / explored in this article? —Preceding unsigned comment added by 202.12.144.21 (talk) 01:21, 26 February 2008 (UTC)

I'm pretty sure that Susan Blackmore, Christopher French or other members of the skeptical community have addressed this. Haven't the references to hand though but it's worth pursuing if we can find the sources.MartinPoulter (talk) 11:38, 26 November 2008 (UTC)
They have, and confirmation bias is just one of the mechanisms cold reading uses. (I believe Shermer even uses the term in the psychics section of Why People Believe Weird Things.) --Gwern (contribs) 21:44 26 November 2008 (GMT)
The book Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit by Jonathan C. Smith addresses this: it's previewable via Google Books. I'll add some relevant content to the article. MartinPoulter (talk) 14:12, 22 August 2009 (UTC)

Learning Object

I made an interactive learning object on CB for work, but it never got used. Rather than edit the article itself, I thought I'd mention it here and let a Real Editor decide whether it was worthy: Confirmation Bias Activity 24.84.6.197 (talk) 06:01, 2 April 2008 (UTC)

Confirmation bias and authority

The examples in the section on "myside bias", were tilted in favor of those in positions of authority. Nobody, as far as I know, claims that teachers and doctors are less susceptible to confirmation bias than bad students and contrarian patients. The difference between these groups is that the bad students and contrarian patients are on the wrong side of power. Confirmation bias is just as real (and much more difficult to detect) when it reinforces a position of power. The examples cited make the term into a tool against the powerless.Likebox (talk) 20:02, 25 November 2008 (UTC)

The paragraph under dispute is a statement about the work of Jonathan Baron. Do your edits reflect the content of that source? If not, they should not be present in that paragraph. If you want to redress the balance, find other examples and put them in- the Aaronson and Tavris book probably has some of what you want. We don't want readers to be misled about the content of a crucial academic source. If you object to the content of a book, an encyclopedia is not the place to express that objection.MartinPoulter (talk) 11:36, 26 November 2008 (UTC)
The paragraph is listing examples of confirmation bias, that is all it is doing. These examples illustrate the contents of this book. But if the book gives slanted examples, there's no need to include it at all.Likebox (talk) 01:56, 27 November 2008 (UTC)


As a person who knows little about this subject, but was interested, I can say this is the dullest article I've ever read. —Preceding unsigned comment added by 219.89.10.29 (talk) 12:34, 30 January 2009 (UTC)

Reasons for effect

Am I right in thinking that the first paragraph in the 'reasons for effect' section is not relevant? It seems to be suggesting other cognitive biases and/or causes for 'holding beliefs despite contrary evidence.' WotherspoonSmith (talk) 06:11, 10 June 2009 (UTC)

Link for Reference 3

Link for Reference 3 (On the conflict between logic and belief in syllogistic reasoning)

http://www.psychonomic.org/search/view.cgi?id=9308 —Preceding unsigned comment added by 63.99.162.99 (talk) 00:27, 9 August 2009 (UTC)

Clean-up of Further reading section

Presently this is an excessively large mix of papers (some of questionable relevance) and books. As a first step, I will move the journal articles here and keep the books. Also, the Wason ref is very old, so not the place to go straight away to learn more about this topic.

  • Cohen, L. J. (1981). Can human irrationality be experimentally demonstrated? The Behavioral and Brain Sciences, 4, 317-370.
  • Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism: Use of differential decision criteria for preferred and non-preferred conclusions. Journal of Personality and Social Psychology, 63, 568-584.
  • Edwards, K., & Smith, E. E. (1996). A disconfirmation bias in the evaluation of arguments. Journal of Personality and Social Psychology, 71, 5-24.
  • Fugelsang, J., Stein, C., Green, A., & Dunbar, K. (2004). Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory. Canadian Journal of Experimental Psychology, 58, 132-141.
  • Griggs, R. A., & Cox, J. R. (1982). The elusive thematic materials effect in the Wason selection task. British Journal of Psychology, 73, 407-420.
  • Jeng, M. (2006). A selected history of expectation bias in physics. American Journal of Physics, 74, 578-583.
  • Mynatt, C. R., Doherty, M. E., & Tweney, R. D. (1977). Confirmation bias in a simulated research environment: An experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29, 85-95.
  • Schumann, D. W. (Ed.) (1989). Causal reasoning and belief perseverance. Proceedings of the Society for Consumer Psychology (pp. 115-120). Knoxville, TN: University of Tennessee.
  • Tutin, J. (1983). Belief perseverance: A replication and extension of social judgment findings. Educational Resources Information Center, ED240409.
  • Wason, P. C. (1966). Reasoning. In B. M. Foss (Ed.), New horizons in psychology I (pp. 135-151). Harmondsworth, UK: Penguin.

MartinPoulter (talk) 14:33, 26 August 2009 (UTC)

Mynatt, Doherty and Tweney (1977) has now been added back into the article. MartinPoulter (talk) 20:53, 15 September 2009 (UTC)

Morton's demon

The term "Morton's demon" arises from an online essay on talkorigins.org. It has been picked up by a number of online forums, but not the academic literature. I can find one book (Mark Isaak's The counter-creationism handbook) which briefly mentions the term (and cites the essay) and a Psychology Today article which mentions it based on its inclusion in this article. Hence it seems clear that "Morton's demon" deserves an external link but not a mention in the main text, as its notability is simply not on a par with the academic terms. MartinPoulter (talk) 14:50, 26 August 2009 (UTC)

Tolstoy syndrome

I'm going to remove the phrase "Tolstoy syndrome" from the article, because I can't find this use of the phrase in academic sources. I only find it on Wikipedia, and online sources that are based on Wikipedia. What's more, there seems to be a Tolstoy Syndrome which is a fear of not measuring up to one's heroes, so using it for confirmation bias seems confusing and misleading. If there are academic paper publications that I haven't found that use "Tolstoy syndrome" as a synonym for "confirmation bias" then please put it back in the article. MartinPoulter (talk) 20:49, 15 September 2009 (UTC)

I've also removed the Tolstoy quotes from the article, since they arguably illustrate cognitive dissonance rather than confirmation bias, and Tolstoy isn't mentioned in the article's sources with the same prominence that Bacon is. MartinPoulter (talk) 15:25, 17 October 2009 (UTC)

Another application

Confirmation bias in self-verification. Refs to be added. MartinPoulter (talk) 22:28, 4 December 2009 (UTC)

Quotations and Thucydides

I previously had more quotations in the article, but then realised that it's not 100% clear what effect some of these old quotes are talking about, and making a link that isn't made in the literature runs the risk of Original Research. Hence I've pared it down to the Bacon quote, which is mentioned in multiple sources about the topic. I think the newly inserted quote might have a similar problem, in that "it is a habit of mankind to entrust to careless hope what they long for" sounds like a reference to Wishful thinking rather than confirmation bias. "To use sovereign reason to thrust aside what they do not fancy" does sound to me like confirmation bias, or at least some sort of valence effect. I would prefer, though, if there were a secondary source explicitly making the connection between this quote and confirmation bias. Is there one? MartinPoulter (talk) 13:28, 11 January 2010 (UTC)

examples

Might it be worth expanding some of the sections with examples? When I was reading the science section, I immediately thought of [this], and I would imagine there are good examples for the other areas too. No doubt there are many examples in the various references, but linking to examples within WP seems more immediately useful. --86.181.123.179 (talk) 20:29, 7 April 2010 (UTC)

Alt med

There are any number of RS which point to the link between alternative medicine proponents and confirmation bias (just do a Google Scholar search for the terms and you'll see what I mean), so I propose to add a load of these references and remove the "According to Ben Goldacre" weasel words from the appropriate section. I just thought I'd mention it here first in case there was a possibility that a cadre of altmed wingnuts were planning on disputing the change.--Dr Marcus Hill (talk) 12:50, 12 April 2010 (UTC)

Don't worry about the altmed wingnuts. At Wikipedia the "so-called complementary 'medicine' must not be called 'medicine' because it is not medicine but criminal pseudoscience" faction is much stronger. Hans Adler 13:11, 12 April 2010 (UTC)
Ha ha! Hans, why am I not surprised by your hyperbolic straw man character assassination of skeptics who try in vain to stop the flood of wingnuts who try to misuse Wikipedia for advocacy of their nonsense? Wikipedia isn't a free webhost, blog, or a blank advertising board to be exploited. I for one do all I can to make sure their ideas are included, not deleted, but also not worded as advocacy for false claims. What's so bad about that? OTOH there's too much going on that facilitates them in their attempts.
Dr. Hill, give it a try. BTW, "[a]ccording to Ben Goldacre" isn't necessarily weasel wording, but proper attribution that lends strength to the statement quoted. (Note that I haven't read the statement...I'm just speaking about the principle of attributing statements. We do it all the time.)-- Brangifer (talk) 14:51, 12 April 2010 (UTC)
True, it's not weasel words as it stands, it is indeed correct attribution as only one source is listed. I think a stronger statement is needed, and for that we need several reliable sources. I'll get on it in a week or two if nobody beats me to it.Dr Marcus Hill (talk) 11:11, 14 April 2010 (UTC)

where is the 2-4-6 problem explanation?

. —Preceding unsigned comment added by 88.218.159.38 (talk) 15:11, 11 May 2010 (UTC)

It's in the History section. MartinPoulter (talk) 11:37, 18 May 2010 (UTC)

First sentence

Thank you Martin for the great job you're doing on this and on other articles. However, to me the first sentence of the article, "to treat information preferentially", is not at all as clear as the old wording, "prefer information". You're a native English speaker and I am not, and I understand that there are many subtleties in the English language that I miss. But the English Wikipedia is also the big international Wikipedia, and I would really appreciate it if you would reconsider this sentence and change it back to "prefer information" - or maybe something different but clearer. PS I'm sorry I sound so formal in English, because I am not a formal person at all... Lova Falk talk 16:06, 4 June 2010 (UTC)

Thank you, Lova Falk, for your kind words. A reviewer in the Featured Article Review wrote, "I'm not sure I'm fully happy with the starting paragraph. The first two sentences don't appear to be communicating the same concept, at least to me. The first relates a preference to types of information; the second regards how information is processed." In response to this, I changed the wording. I agree that even though it might be more accurate it is more clumsy and we need something better. I have suggested "...tendency to favor information that confirms their preconceptions..." and will see if that gets support. "Prefer" could be taken to simply mean "like", whereas what's meant is that people give a preferable treatment to some information and not others. I think "favor" has the connotation of not just liking something but giving it a better treatment. I'm keen to hear what others think. MartinPoulter (talk) 16:34, 4 June 2010 (UTC)
I'm all in favor for favor. Lova Falk talk 16:40, 4 June 2010 (UTC)

FAC comments

I'm trying to help clear the backlog of FAC nominations. I'm working from the bottom of the list up. Here we go:

Thanks a lot for this set of helpful comments. Specific answers below. MartinPoulter (talk) 16:37, 24 June 2010 (UTC)
  • Is this a sociological or psychological phenomenon? Can we state that in the first sentence of the lead? Confirmation bias (also called confirmatory bias or myside bias) is a sociological phenomenon where people favor information that confirms their preconceptions or hypotheses...
Definitely not sociological. The literature is mainly psychological, but it could be argued that this is a topic in inductive inference. For the sake of readability, I'm wary of loading the first sentence with too many clauses. This is above all an effect in how people process information. That's what the existing lead conveys.
  • For example, various contradictory ideas about someone's personality could each be supported by looking at isolated things that he or she does. This is a tiny issue, and probably idiosyncratic, but I generally think "things" is too generic a word and try not to use it. I used to tell my students not to use it when they wrote, if only to force them to find a better word and be more descriptive. To express ideas that need to be very clear to readers, I think this is also applicable to this section in the article.
Good catch. I occasionally lapse into vague language and am glad to have it pointed out. This bit has been rephrased.
  • the homosexual men in the set were more likely to report seeing buttocks or anuses in the ambiguous figures. That's hilarious. That kind of crazyass left field assumption makes me want to become a researcher.
It's what the subjects in the experiment thought, but they were objectively wrong. That's the surprising outcome of that research.
  • It may be me, but I didn't understand the Klayman and Ha critique section at all.
I've made another pass at rewording. Can you say more specifically what the difficulty is?
Seriously, probably me. I froth at the mouth at probability and statistics. --Moni3 (talk) 19:39, 24 June 2010 (UTC)
  • The Informal observation section should be a cohesive paragraph instead of four one-sentence paragraphs.
I've put the first three sentences together.
  • How might this Confirmation bias contrast with the results of the Asch conformity experiments? I believe people think they're pretty smart and not easily led when they're in the minority, as was refuted by Asch. So when they are presented with opposing information they become maybe overly stubborn. Has there been any discussion about this in the literature? If not, why the hell not?
Confirmation bias and Asch conformity experiments are both mentioned in many of the sources (Baron, Sutherland, Kunda) but in different chapters. Yes they are interesting for similar reasons, but one is a social pressure phenomenon and one is an internal effect in how someone internally handles information.
Bias blind spot (another article that's mostly my work) seems to be the phenomenon you're talking about above. Again, this is a related finding in the same area of psychology, but not the same thing.
The experiments mentioned in the early parts of the article introduce external factors such as research that participants were required to read, statements from political candidates, or stories about a theft. Research ostensibly comes from an expert. Politicians portray themselves (usually) as experts in law or policy. The story about the theft would come, I assume, from someone who has factual knowledge of the incident. While these experts weren't in the room staring at the participant such as in Asch, nonetheless, using information that comes from an expert has a bearing on how people come to their conclusions. The effect of people learning about the Asch experiments no doubt is that they will overcompensate for being too willing to follow the group (or random expert) and subsequently refuse to believe what the expert says, creating a bias.
I'm aware that if no sources make a connection between the two then you can't put one in the article. However, I thought it, and I was always the idiot in class who raised my hand and asked a question. Sometimes someone whispered "Thank God someone else asked that." I wasn't sure if you followed my connection here, so I felt I should explain it better. --Moni3 (talk) 17:08, 24 June 2010 (UTC)
This is a really interesting line of thought: maybe some of the confirmation bias effect can be understood in terms of obedience to authority or social conformity. Hand on heart, I don't find the social conformity (or compensation for it) in the sources on confirmation bias. MartinPoulter (talk) 18:02, 24 June 2010 (UTC)
Feel free to list me on your paper resulting from a massive research project when you devise experiments to connect the two. --Moni3 (talk) 19:39, 24 June 2010 (UTC)
  • Husband E. Kimmel's confirmation bias played a role in the success of the Japanese attack on Pearl Harbor in what way? What decision did Kimmel make that led to greater loss of life and materiel?
Yes, this was vague. Now rephrased.
This is what it says now: US Admiral Husband E. Kimmel showed confirmation bias when playing down the first signs of the Japanese attack on Pearl Harbor was an example of . --Moni3 (talk) 17:14, 24 June 2010 (UTC)
Ouch - I was rushing edits before I left my desk at the end of a working day. Is it okay now?
  • What is shotgunning? Is there no literature on conspiracy theories? Religious arguments?
Now you mention it, "shotgunning" is a technical term that the article could well do without.
In ordinary discussion (on blogs etc.) it's common to see people accusing each other of confirmation bias because they disagree about religion or whatever. In the scholarly sources on which this article is based, the examples are more abstract or more political such as gun control, hence the examples that the article concentrates on. The Nickerson material on pyramids comes close to conspiracy theory stuff. I think the reader can make their own connection with other conspiracy theories.
  • In my own experience of being required to meet the NPOV policy by including observations and statements that I think are pretty stupid and have no merit in articles I construct, I notice that I react emotionally negatively to reading this information. I guess this is the unpalatable part of what is described in the Explanations section. A natural connection, to me at least, is discussing why people react so emotionally when faced with information that refutes a core belief. It's a physical sensation, like a gland just went nuts. I was hoping to read a discussion that connects how early ideas are formed about emotional beliefs like the death penalty and gun control, how those ideas are maintained more by emotion than by reason, and how that affects brain chemistry when facing the opposing view. Of course, I didn't read the source material for the article. Was there anything about this issue in the sources?
Very nearly all the literature I've seen deals with confirmation bias as an abstract information processing effect. The most notable research that related it to brain processes is the Westen et al. MRI research mentioned in the article. This is pretty recent: it's one of the more recent references in the article. I expect more research like this will make it into textbooks and review papers in the coming years.

I wanted to have a discussion here about these issues before supporting or opposing. I like to choose one or the other in FAC. I think this article is well-written and very interesting. I like thought-provoking articles, and this one fit the bill. However, there are parts of the article where the language is so obtuse, or assumes the reader is already familiar with psychological or statistical precepts, that it's difficult to discern what the point of the sentence or paragraph is. Let me know your thoughts. Thanks. --Moni3 (talk) 14:40, 24 June 2010 (UTC)

Footnotes in lede

The lede should either have no footnotes, or else a lot more footnotes than it has now.Anythingyouwant (talk) 22:16, 24 June 2010 (UTC)

That's not necessarily the case. Leads don't have to cite what is cited and discussed lower in the article. If the lead covers something likely to be contested, such as statistics, quotes, or a fact that is controversial, it should be cited. --Moni3 (talk) 22:27, 24 June 2010 (UTC)
You're undermining my authority, Moni3. Seriously, in this case, the cited sentence is not controversial, and unlikely to be challenged. So, it should be easy to move the cite to the body of the article. Leaving it as-is will be a magnet attracting people to add more footnotes to the lead, IMO. BTW, maybe they should open a MOBS for pics like the Uriah Heep one (Museum of Bad Subjects).  :-)Anythingyouwant (talk) 22:40, 24 June 2010 (UTC)
The lead used to have fifteen footnotes. In the FAC, I was told that these break up the readability and, since the lead is a summary of the rest of the article, it shouldn't generally need its own references, except, as Moni3 says, for controversial statements. This sounds sensible to me. So to progress the article towards FA I cut out almost all the references. The remaining notes are necessary because 1) the phrase "myside bias" doesn't appear in the rest of the article, and it comes from one specific source, so we need to explain what that source is. 2) as the article explains, the meaning of the phrase "confirmation bias" has changed across time and across different sources, so we need to spell out at the outset which source the article uses for the definition in the first sentence. MartinPoulter (talk) 08:21, 25 June 2010 (UTC)
That's very reasonable, but there is an alternative way to do it. This is totally up to you, and I'm not going to implement it myself, but here's how I'd do it....In the first paragraph of the body (immediately under the heading "Types"), we could add after the first two words "or myside bias" and move the footnote from the lede. In that same paragraph (under the heading "Types") you write that some people use the term "confirmation bias" one way and some other people use it another way. You could add a sentence saying how the term is used in this Wikipedia article, and move the other footnote in the lede to the end of that paragraph. This might save you a lot of aggravation later when people see two footnotes in the lede and therefore start adding lots of "citation needed" tags to the rest of the lead.Anythingyouwant (talk) 15:11, 25 June 2010 (UTC)
Is the readability concern with regards to the presentation or the edit view? If the latter, you could consider using WP:LDR, at least for the lede.—RJH (talk) 17:17, 25 June 2010 (UTC)
It just seems that the article would be neater with the lede either fully footnoted or not footnoted at all. Having only one sentence footnoted will attract "citation needed" tags to the rest of the lede, and (as explained above) there doesn't seem to be any reason why the footnotes now in the lede couldn't be moved to the body of the article. This is not about having two different types of footnotes right next to each other in the lede (although I do think that looks odd too, and I've never seen such a thing in a scholarly article, even if it may be technically allowed here).Anythingyouwant (talk) 17:28, 25 June 2010 (UTC)
AYW, you and I are both used to editing articles that are very controversial and where users are likely to take content out of the lead if it isn't cited. This isn't the case with most of WP. Look through recent FAs, including today's Featured Article of the Day, Privilege of Peerage which has just two footnotes (to the same cite) in the lead. I'll consider your suggestions above. MartinPoulter (talk) 15:26, 26 June 2010 (UTC)
Not to quibble or anything, but when Privilege of Peerage became a featured article on July 27, 2004 there were no footnotes in the lead. When it passed featured article review on November 22, 2007 the lead was fully footnoted (five footnotes). This is basically an issue of consistency in the lede; readers are less likely to be distracted (or subconsciously confused) if a pattern is used (no footnotes in the lead, everything footnoted in the lead, parentheticals after every listed item, no parentheticals after any listed item, naming the researchers for every study mentioned, naming the researchers for none of the studies mentioned, et cetera).Anythingyouwant (talk) 16:24, 26 June 2010 (UTC)
Thanks for the recent burst of copy editing, by the way! MartinPoulter (talk) 15:40, 26 June 2010 (UTC)
My pleasure, Martin. I haven't looked at your contributions, but I will now.  :-)Anythingyouwant (talk) 15:49, 26 June 2010 (UTC)
Thank you. I haven't noticed a particular problem such as you describe in other featured articles (attracting "citation needed" tags because some references are included in the lede), but perhaps the situation is different because of the subject matter.—RJH (talk) 17:12, 27 June 2010 (UTC)

Confirmation bias and search engine design

Reading a thought-provoking review of Nicholas Carr's The Shallows: What the Internet is Doing to Our Brains, I was struck by this passage:

Concerns that the internet traps users in unchallenging information ghettos are not new, stretching back to 2001 and the US legal scholar Cass Sunstein’s book Republic.com. Sunstein argues that, when compared to older media, the internet allows users to seek out opinions and news with which they already agree, creating online news ghettos in which the views of right and left rarely mix.

What is surprising, however, is that today’s technology companies seem to use that book as a to-do-list. Google, for example, has been pushing to provide personalised search results to its users, meaning that two people searching for the same term may now get different results, altered according to what they have clicked on before. In December 2009, Google tweaked its rules in such a way that even users who are not signed into Google—thus denying the search giant access to their previous search history—will see their results personalised too.

I've seen this point raised before, but—with confirmation bias on my brain—it struck me in a new way. As regards users' preconceptions and hypotheses, we could describe the objective of this approach to search engine design as the creation of a feedback loop that reinforces confirmation bias. I wonder if there's been any serious consideration of the topic in these terms.—DCGeist (talk) 17:01, 3 July 2010 (UTC)

It's occasionally mentioned as an informal observation in this article's sources that people seek out sources that confirm what they already believe. The most prominent serious investigation of this that I've found is the Taber and Lodge study mentioned under Confirmation_bias#Polarization_of_opinion, which gave people the chance to access chunks of text from different sources on a computer screen.
As for the implications for search engines, they've not been mentioned in the sources I've looked at, but I agree it's worth keeping an eye out for a serious treatment. Facebook especially worries me, with its eagerness to answer people's demands for reinforcement of their attitudes. I hope this article's increasing prominence will prompt many more thoughts along similar lines. MartinPoulter (talk) 18:16, 3 July 2010 (UTC)
There are frequently new articles right here on Wikipedia that are essentially POV forks from existing articles so that some editors can go on believing what they believed before they became Wikipedians. This is definitely an issue to watch out for, and sourced information on Internet information ghettos would be very helpful for this article. -- WeijiBaikeBianji (talk) 20:57, 3 July 2010 (UTC)

Too few wikilinks or too many?

User:Piotrus asked for more wikilinks in the lede, citing WP:BTW. User:Dabomb87 has removed the resulting links as overlinking. I'm not sure myself whether we need terms like "bias", "belief" and "evidence" to be wikilinked. Can we build consensus here? MartinPoulter (talk) 12:04, 7 July 2010 (UTC)

BTW is not carte blanche to link any word that's somewhat "relevant" to the topic. It is true that words such as "belief", "evidence", and "emotion" are relevant, but the information in those articles does not at all aid the readers' understanding of confirmation bias. For example, click on belief. The first sentence is this: "Belief is the psychological state in which an individual holds a proposition or premise to be true." Is that likely to be helpful? Most, if not all, readers could give you the basic essence of that definition if asked. There is no net benefit to linking these common terms, which dilute the higher-value links that readers should see. I left the link to "bias" because I thought there would be background information on biases that would help readers, but on clicking the link I saw it was just a glorified disambiguation page, so I delinked it. Dabomb87 (talk) 14:04, 7 July 2010 (UTC)
Thanks for spelling that out. Seems reasonable to me, though I don't feel strongly either way. MartinPoulter (talk) 16:35, 7 July 2010 (UTC)

Proposed re-wording

In the "In scientific procedure" section, I think the sentence, "In the context of scientific research, confirmation biases can sustain theories or research programs in the face of inadequate or even contradictory evidence; the field of parapsychology has been particularly affected by confirmation bias in this way." would be better without the final clause, "by confirmation bias in this way". Since this wording was introduced in the FAC, I'd like to see if there's consensus for a change. MartinPoulter (talk) 12:13, 7 July 2010 (UTC)

  • Support.—RJH (talk) 16:15, 8 July 2010 (UTC)

Hostile media effect

The statement recently added, that people may regard evidence that opposes their opinions as biased, is something I agree with and is a documented effect. I don't contest that the statement is true. I do question whether it should be mentioned in the lede of this article. There is a cluster of related biases, including confirmation bias, belief bias and the hostile media effect. The lede of this article isn't meant to introduce all of them, just the different kinds of confirmation bias and some effects that are described in sources as being directly related. It may well be my own biased recall, but I don't think the article's sources mention the hostile media effect directly in conjunction with confirmation bias, although clearly the same books will often mention both topics. I'm wary of packing descriptions of other biases into the lede, especially when those other biases are not mentioned in the article. MartinPoulter (talk) 11:54, 22 July 2010 (UTC) As a constructive suggestion, could Hostile media effect be added as a See also rather than put in the lede of this article? MartinPoulter (talk) 12:20, 22 July 2010 (UTC)

Unless someone comes along and can demonstrate that HME is described as a kind or effect of confirmation bias, that seems like a very reasonable way to deal with it.—DCGeist (talk) 18:35, 22 July 2010 (UTC)

I always knew Wikipedia was biased

I always knew Wikipedia was biased, and this article confirms it!μηδείς (talk) 02:59, 23 July 2010 (UTC)

How? Burpelson AFB (talk) 03:00, 23 July 2010 (UTC)
I believe he was making a joke :) —Preceding unsigned comment added by 94.140.42.98 (talk) 12:01, 23 July 2010 (UTC)

Negation bias

Isn't there also a similar bias which works the other way? If so, what is it called?--78.144.102.48 (talk) 09:44, 23 July 2010 (UTC)

In the history section:
He interpreted his results as showing a preference for confirmation over falsification, hence the term "confirmation bias"
So it would be a falsification bias, according to the term's inventor, although I'm not sure he used that term explicitly. --Sfnhltb (talk) 22:43, 23 July 2010 (UTC)

Praise!

What an interesting and thought-provoking article! Very well written, with an excellent selection of subtopics. Well done! —Preceding unsigned comment added by 94.140.42.98 (talk) 12:46, 23 July 2010 (UTC)

Yeah, I agree that this is a well written and fascinating article. Not enough 'carrot' on Wikipedia and too much 'stick' sometimes. Well done editors. ValenShephard 17:41, 24 July 2010 (UTC) —Preceding unsigned comment added by ValenShephard (talkcontribs)

Subjective validation

I created the above stub a couple of years ago and haven't really visited it much since. I separated it out into an article when it was just a redirect to the Forer effect. My knowledge of cognitive biases was, and still is, pretty limited, as were my sources at the time. Now I'm not sure one way or the other whether subjective validation deserves an article of its own or if it's better merged with (or redirected to) this article. What do others think? Big Bird (talkcontribs) 15:13, 23 July 2010 (UTC)

As I understand it, the Forer effect is the subjective validation effect, and they are related to, but distinct from, confirmation bias. The "Forer effect" is a label that has become attached to what Forer himself called the fallacy of subjective validation. Hence the status quo is correct and neither term should redirect to this article. Robert Carroll in the Skeptic's Dictionary has separate articles for Forer effect and subjective validation, but to be honest I just don't understand what the difference is supposed to be. MartinPoulter (talk) 21:53, 25 July 2010 (UTC)

What about religion

If bias equates to preference, and that preference leads to omission of important information, then the glaring omission of the effect of confirmation bias on people's religious beliefs clearly makes this article biased, in the sense that it (perhaps deliberately) omits any discourse around faith.

Why are intellectual minds still shying away from discussing religious belief in the context of its inherent prejudice against scientific fact, and the significant role various biases and other cognitive filters play in building, internalising and reinforcing religious beliefs? Despite the mounting evidence against certain religious premises, such as creationism, which is probably one of the prime examples of confirmation bias in everyday (Western) culture, the topic fails to generate meaningful debate.

Surely the authors of both the article and the discussion must have considered the topic of religion when they decided to elaborate on finance, health, politics, science and even the paranormal (which is a very marginal brush with mainstream religion), yet they somehow decided to censor themselves from advancing the theme or even mentioning the word "religion".

One contributor said that he always believed that Wikipedia is biased (perhaps as a joke). Another asked "How?". This is how. —Preceding unsigned comment added by 165.146.110.12 (talk) 08:14, 23 July 2010 (UTC)


I agree, I was amazed that this article didn't mention the word "religion" even one time. Nortonew (talk) 21:05, 23 July 2010 (UTC)
The answer is that religion is not mentioned prominently in the academic sources from which the article is written, and as the article says the key experiments focused on issues like gun control and affirmative action. I personally think that confirmation bias could be applied to many many more things in life, and I expect there will be academic sources about those things in due course, but until then applying the idea of confirmation bias to other domains is left as an exercise for the reader. MartinPoulter (talk) 21:55, 25 July 2010 (UTC)
Agreed with User:MartinPoulter on all counts. Obamafan70 (talk) 03:12, 2 August 2010 (UTC)

The first image

This image of multiple guns is a tangential example, and is out of place at the head of this article. The caption reads "Strong opinions on an issue such as gun ownership...". This is a weird statement. "gun ownership" is not an opinion. It is a condition. This condition may be highly correlated with having a strong opinion, but that is not what the caption says. Further, commas should be placed around the "such as" clause because it gives an example which can be removed from the sentence without changing its meaning. Finally, it is not clear what exactly this example is trying to imply. It seems to me to imply that gun owners are generally less objective because they have strong opinions. If that is the intended message, why not state it clearly?--76.27.92.148 (talk) 23:24, 2 August 2010 (UTC)

The caption as it stands is a clear statement. There is no suggestion that gun ownership is an opinion: that would be absurd. But it is, as the caption says, an issue on which people might have opinions. Note that it's "opinions on the issue", not "opinions in favor" or "opinions against". The sentence doesn't say anything about gun owners, so I don't understand how anyone can interpret it in terms of a sentence with "gun owners" as the subject. As for the commas, they can be included or not: it's a stylistic choice. MartinPoulter (talk) 10:24, 3 August 2010 (UTC)

First sentence

The first sentence used to say: "Confirmation bias (also called confirmatory bias or myside bias) is a tendency for people to favor information that confirms their preconceptions or hypotheses whether or not it is true." (emphasis added)

To clarify what "it" is, I changed the sentence to this: "Confirmation bias (also called confirmatory bias or myside bias) is a tendency for people to favor information that confirms their preconceptions or hypotheses regardless of whether that information is true."

Should we amend this to say the following?

"Confirmation bias (also called confirmatory bias or myside bias) is a tendency for people to favor information that confirms their preconceptions or hypotheses regardless of whether that information a preconception or hypothesis is true."108.18.182.123 (talk) 02:10, 5 August 2010 (UTC)

It may be appropriate to say that the bias applies regardless of whether the information is true and regardless of whether the preconception or hypothesis is true. I think we need the article's primary contributor, MartinPoulter, to weigh in here. I do agree that the lede sentence has been ambiguous and should be made more clear.—DCGeist (talk) 02:17, 5 August 2010 (UTC)
Good idea. I tentatively tweaked the lead sentence accordingly, pending input from the article's primary contributor.108.18.182.123 (talk) 06:25, 5 August 2010 (UTC)
I agree there's a problem with the lede, but am not sure how to fix it. The important sense to get is "whether or not the information is true". "Whether or not the hypotheses or preconceptions are true" would apply as well. I think the current first sentence has too many clauses. I'll consult the sources (in a few days' time: they're not here) and feed back. MartinPoulter (talk) 09:03, 5 August 2010 (UTC)

Is the Tolstoy quote bogus?

The full text of "What is Art?" is online here, and I cannot find the quote that appears in this Wikipedia article:

I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life.

But on page 124 there is this:

I know that most men — not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems — can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.

I'll go ahead and install the more accurate quote.108.18.182.123 (talk) 16:05, 5 August 2010 (UTC)

quotes from Bacon and Tolstoy

I appreciate the article, and particularly, the quotes from Bacon and Tolstoy. Would it be possible to see the quotes moved up somewhere nearer to the top of the article? -Civilizededucationtalk 04:29, 13 January 2011 (UTC)

At the moment, the article has what I think is a logical structure. Those quotes are in the historical section, explaining the observations of bias that were made in the past. This article is mainly about the modern understanding of confirmation bias, which derives from experiments done in recent decades. Confirmation bias isn't a literary observation about human nature, but a set of related effects which can be demonstrated in repeatable experiments. The historical material needs to be there, but not given as much priority or weight as the more modern references. Hence I don't see a case for moving those bits closer to the top: they are already relatively early in the article (note that the historical section occurs before the explanations section). MartinPoulter (talk) 21:17, 14 January 2011 (UTC)
I see that you are a qualified expert on the subject. As such, I am happy to go along with what you say. However, I would request you to consider moving up the history section because it can help "let in" newcomers into the topic. Tolstoy and Bacon are guys most newcomers can relate to, and rely on. I have seen books on computers start off with a chapter on the history of computers. However, I would not prod on this point again.-Civilizededucationtalk 02:16, 15 January 2011 (UTC)

Gun control - using this article as a platform for the gun lobby?

I think this article would be much improved if the gun example were removed and a more neutral (and less contentious) one included instead. The example may have some validity, but it creates a huge distraction from the purpose of the article, which is about a psychological phenomenon, not a political debate. —Preceding unsigned comment added by 128.243.253.107 (talk) 16:06, 18 January 2011 (UTC)

The image caption refers to "an issue such as gun ownership". Thus it doesn't associate the article with any particular position on the issue. It's equally relevant to either side. I think it would surprise a lot of readers that it can be taken as "a platform for the gun lobby" because it doesn't express an opinion either way. When people complain that it's not neutral, it's not clear which POV it is supposed to be supporting. That indicates that it is in fact neutral.
The article shows that confirmation bias applies in particular to hot-button political issues, with gun politics named as one of those issues where the effect was demonstrated. Hence there is a clear relevance to that political debate. As said in the other talk thread above, it would be better to have another picture, say, depicting guns on sale in a shop, but the political debate itself is definitely relevant. MartinPoulter (talk) 17:29, 18 January 2011 (UTC)
I agree with MartinPoulter. There is nothing POV about mentioning gun ownership as an issue, and the language doesn't favor either side of the debate. —tktktk 02:24, 20 January 2011 (UTC)
It's clear that the text is neutral. While I agree that the image is distracting (as in the discussion in the section above), I don't think it's a "platform for the gun lobby" in any way. -Porlob (talk) 17:32, 22 January 2011 (UTC)

Image for the article

It's too bad that there is no image in the lead. How about this image to lead off the article?: http://en.wikipedia.org/wiki/File:Bush_mission_accomplished.jpg Or, maybe just an image of a brain from commons? -- Ssilvers (talk) 14:21, 23 July 2010 (UTC)

The image currently used in the lead (making the point about gun ownership) is awful: it does nothing to illustrate the article's subject. As it is therefore completely unnecessary, and I cannot find where a consensus was reached that the picture is any good, I have been bold and removed it. I suggest that, until or unless a better one can be found, it remain that way. An example of a decent one might be a person reading a newspaper, as said newspaper may report its news biased towards that person's viewpoint (as the caption may explain). --92.25.213.145 (talk) 20:39, 23 July 2010 (UTC)

  • I fully support the decision of the above editor; it added nothing to the article, and was distracting to say the least. Overall, though, I was very impressed by the quality of this article. City of Destruction (The Celestial City) 23:48, 23 July 2010 (UTC)

I dunno, the example of gun politics seems like an excellent visualization giving an example of confirmation bias. SaltyBoatr get wet 16:31, 24 July 2010 (UTC)

It's certainly an example of an issue that can be affected by confirmation bias. My point is that it is in no way an illustration of confirmation bias itself. --AdamM (talk) 15:26, 25 July 2010 (UTC)

Couldn't the same be said of non-gun-ownership? Or owning a dog? Or anything? It seems to give an implicit opinion of gun-owners and their likelihood of having a bias. —Preceding unsigned comment added by 75.92.158.79 (talk) 08:48, 18 August 2010 (UTC)

Some topics, such as this, do not lend themselves to "direct" illustration, but to related matters. That's perfectly appropriate.—DCGeist (talk) 17:52, 25 July 2010 (UTC)
As discussed at length when this article was an FA candidate, the image is not meant to illustrate confirmation bias itself, but to bring to mind the issue of gun politics which, as the article says, is the subject of one of the clearest demonstrations. MartinPoulter (talk) 21:59, 25 July 2010 (UTC)
Would it be a good idea to have "issue such as gun politics" rather than "issue such as gun ownership"? It seems to me that it shouldn't matter, as logically the issue of whether or not to permit private gun ownership is the same as the issue of whether or not to not permit private gun ownership. MartinPoulter (talk) 14:52, 1 September 2010 (UTC)

Maybe the image at the top could be made even more relevant, by modifying it a little bit. It could be converted into an animated picture of the guns that at first is blurry but gradually comes into focus. As the article states later on, such an animated picture is one of the primary tests for confirmation bias; people who see the blurry image before the clear image are less likely to properly identify the image once it becomes clear. Anyway, that's just a thought, take it or leave it. Without some modification or change to the top picture, you're likely to spend the rest of your life responding to objections to it....though not any objections from me.  :-)108.18.182.123 (talk) 17:30, 5 August 2010 (UTC)

I'm going to go ahead and remove the image. The image says nothing whatsoever about confirmation bias, and seems incredibly out of place. While, yes, views on gun ownership are subject to confirmation bias, so is virtually every other topic on which opinions can be formed. A photograph of a collection of guns seems very tenuously related even to its own caption, and is needlessly jarring in this article. -Porlob (talk) 21:51, 14 January 2011 (UTC)
This has already been discussed at length, including in the FA review. Gun ownership isn't an arbitrarily chosen topic, but the topic of one of the pivotal experiments. The image merely illustrates a topic connected to the article: it doesn't suggest an attitude on the issue. What image do you suggest in its place? There's no clear consensus from Talk: please try to establish a consensus before acting. MartinPoulter (talk) 22:26, 14 January 2011 (UTC)
I don't necessarily suggest any image in its place. One is not necessary and should only be there if it helps to illustrate the article as a whole. Perhaps this image would be more appropriate in the section on the Taber and Lodge study, but it does not appear to be a very useful illustration of the subject of the article as a whole. -Porlob (talk) 23:21, 14 January 2011 (UTC)
The image should be reinstated. As a newcomer to the article, I found the image thought-provoking; it sparked an interest, and gradually helped me see the meaning of this article.-Civilizededucationtalk 01:27, 15 January 2011 (UTC)
Consider the entries on List of cognitive biases. While many of the articles included on that list contain images, very few have an illustration in the lead: only those having specifically to do with visual phenomena (such as Pareidolia), plus this article.
Now, I'm not saying that by no account should any of these articles ever have a lead illustration, but it is indicative of the fact that these articles may not necessarily lend themselves to a "summary" illustration. And I definitely don't think a picture of a variety of different handguns says anything about confirmation bias. -Porlob (talk) 03:08, 15 January 2011 (UTC)
Under the section "Polarization of opinion", you do find gun control being discussed. And the image sparked an interest in me as to "what is this image doing in an article like this". However, when I looked through the article, I knew why it was there, and the image helped me visualize the thing and see the argument clearly. Otherwise, I may have just read the thing about gun control without seeing the meaning. So, the image does help.-Civilizededucationtalk 11:23, 15 January 2011 (UTC)
Isn't it indicative of the fact that those other articles weren't taken to FA? We can't use non-FA articles as a guide on how to improve an FA. I notice that you are complaining about the gun image, not, say, the Uriah Heep image, which has less of a connection to the content of the article. If you were to consistently apply your principle, we'd remove all the images from this article, and almost all of them from all the articles about abstract concepts. That's not how an encyclopedia works.
The caption "Strong opinions on an issue such as gun ownership can bias how someone interprets new evidence." is a crucial part of the lead, giving newcomers the gist of what confirmation bias is. It's clear from the article that we can't use "just about any other topic on which opinions can be formed." We don't want a lead picture that includes people (for reasons explained in the FA review). The existing image isn't perfect, but something like a rack of guns in a shop would illustrate the concept.MartinPoulter (talk) 12:19, 15 January 2011 (UTC)
I take your point on the images in the other articles. However, you are mischaracterizing my objection. As I have said, I wouldn't object to the image in the section of the article on the Taber and Lodge study, just as I do not mind the Uriah Heep image in its appropriate section. It is the existence of this image in the lead section that I object to, and I do not propose substituting another image (though neither am I opposed to a more appropriate image). While I do find the image rather jarring, I would not object to it if it illustrated the subject of this article, which I do not believe it does. Furthermore, if the impression the caption is meant to give is that gun control is more prone to confirmation bias than other issues ("It is clear from the article that we can't use 'just about any topic on which opinions can be formed.'"), then it is definitely inappropriate, as this article is about a cognitive bias which, indeed, affects just about any topic on which opinions can be formed, including gun control (and, to go a bit meta, whether or not this image illustrates confirmation bias ;) ). -Porlob (talk) 16:04, 15 January 2011 (UTC)
Thanks for the clarification and I have a better idea of where you're coming from now. Gun ownership and affirmative action are the two hot-button issues that were the focus of the research on how strong opinions affect the interpretation of new evidence. The point of the Taber & Lodge research was that the effect does not happen with just any topic, but with "hot button" political topics, and they explored those two in particular. To use other topics as examples, when they weren't addressed in the research, would smell of OR. Affirmative action seems to be even harder to illustrate visually than gun politics: at least with gun politics there's a physical object that can be depicted. I submitted the article to FA review without a lead image, but one was introduced as part of the review process, after a great deal of debate about what was suitable. I'd like to replace the current image with a better one for the caption, but I'm against the removal of that caption in particular. I welcome other opinions. MartinPoulter (talk) 16:21, 15 January 2011 (UTC)
I strongly advocate that the 'Gun Ownership' reference in the intro should be removed. By reference I mean the picture and the sentence "For example, in reading about gun control, people usually prefer sources that affirm their existing attitudes...". The intro of an abstract concept like Confirmation Bias should be general. Further down in the article more specific statements can be made. Also, the 'Gun Ownership' debate is a very American one. In most of Europe and Asia it's just not a topic. Maybe one could make an Examples section: 'Gun Ownership' in the US, abortion in Christian countries, diesel vs. gasoline fuel... 195.49.47.227 (talk) 08:39, 1 February 2011 (UTC)
The fact that it's an abstract topic makes it all the more important to have an example to help people understand it. What sources relate abortion or diesel versus gasoline to the topic of the article? MartinPoulter (talk) 13:29, 1 February 2011 (UTC)
' abstract topic makes it all the more important to have an example ' I agree with that. Only there should not be just one example which dominates the intro, even with a picture on the 'front page'. Confirmation bias is a very widespread phenomenon, which happens in all kinds of circumstances to all kinds of people. It's part of human nature. Also -again- 'Gun Control' is a very American issue. In many parts of the world people just don't care about it, which already rules it out as a prime example, IMO. 195.49.47.227 (talk) 17:00, 1 February 2011 (UTC)
Please address the argument above: the main sources explicitly use gun control as an example: they don't use the other examples you've cited. What sources justify using a different topic? You could remove the lead image from any article with this reasoning. MartinPoulter (talk) 14:47, 8 February 2011 (UTC)
The American bias of the gun imagery had not occurred to me on my initial objection (probably due to my Americanness), but I agree. It seems the weight of opinion on the matter is to remove (or at least move to the section on the relevant study). -Porlob (talk) 20:34, 1 February 2011 (UTC)
What should the lead image be, then? How do we get the caption, or something similar, back into the lead? The weight of discussion isn't against having an example to illustrate the article's topic. MartinPoulter (talk) 14:47, 8 February 2011 (UTC)
I have no problem with the gun image, but if consensus is that we should find something else, how about an image relating to psychic readings or parapsychology?—DCGeist (talk) 22:20, 8 February 2011 (UTC)

Please include the act of ignoring information in defining confirmation bias

It seems that this article lacks, or at least fails to stress, a very important aspect of confirmation bias: not only do people pay attention to (and sometimes misinterpret) information to reinforce their beliefs, but they also ignore (and/or distort) information and evidence that disputes their views. This is a considerable part of the definition, and one that must be presented in order to fully inform the reader, especially in the opening paragraph. Below are two recent textbooks in the field of psychology which give similar definitions. Please consider adding to and revising the current explanation.

Baumeister, R. F., & Bushman, B. J. (2010). Social psychology and human nature. Belmont, CA: Wadsworth.

Myers, D. G. (2009). Exploring psychology. New York, NY: Worth Publishers.

Xsmod (talk) 20:44, 23 January 2011 (UTC)

If you have those books, you could try doing it yourself. If you want to discuss it first, you could place some relevant excerpts on the talk page here (not too much in order to avoid copyvio).-Civilizededucationtalk 06:48, 24 January 2011 (UTC)
I thought "people gather evidence and recall information from memory selectively, and interpret it in a biased way" covers the phenomenon you're talking about. If you say (as the article does) that people prefer information that reinforces their existing attitude, that implies a comparison; that information that conflicts with the attitude is downplayed or ignored. There's more detail in the Biased Interpretation section. MartinPoulter (talk) 13:35, 1 February 2011 (UTC)

A good example of confirmation bias

Here is an example of articles listed at the Journal of Chinese Medicine ("the preeminent English Language Journal on Traditional Chinese Medicine"):

  • 14/02/2011 - Acupuncture effective for bedwetting
  • 10/02/2011 - Acupuncture reduces androgen levels in PCOS
  • 09/02/2011 - Leafy greens protect against diabetes
  • 09/02/2011 - Acupuncture improves symptoms of ADHD
This is a very good, simple, and clear example of confirmation bias for the article, but it is WP:OR to say so. PPdd (talk) 14:03, 16 February 2011 (UTC)

Introductory paragraph

This sentence: "Hence they can lead to disastrous decisions, especially in organizational, military, political and social contexts" strikes me as odd. Decisions in organizational, military, political, and social contexts. Seems to me that's pretty much all contexts there are when it comes to making a decision. I would argue the second part of the sentence could be dropped and if need be the rest appended to the previous. 84.226.37.91 (talk) 03:51, 19 March 2011 (UTC)

There are many areas in which people make decisions, but those areas in particular are mentioned because the sources specifically mention the possible consequences of biased reasoning in those areas. I'm open to excluding "social" from the list, because although the sources extensively talk about confirmation bias in social contexts (getting to know another person), it's not described as a disaster in the same sense that a bad political decision can be a disaster. MartinPoulter (talk) 14:08, 20 March 2011 (UTC)

Style concern

A superficial reading of this article leads me to believe that parts of it are written in a style which is essay-like. The sections which most likely might need some stylistic tweaking to reduce the essay effect are: Biased interpretation, Related effects and History. I can't specify any particular thing which needs re-writing and I surely wouldn't want to try tearing up such a work. If you look at each little subsection they are well written. However, taken as a whole the article tends toward this style. I wonder if it might be better broken into smaller articles? Regards to all and kudos for your hard work. Trilobitealive (talk) 16:36, 16 April 2011 (UTC) (Its obvious that this is one of the best Wikipedia articles written and I apologize in advance for this nagging perception I couldn't let go of.) Trilobitealive (talk) 16:46, 16 April 2011 (UTC)

Could you be more specific about what needs fixing? What do you mean in particular by the "essay effect"? MartinPoulter (talk) 13:08, 26 June 2011 (UTC)

Adding the methodological problems and some criticisms of confirmation bias

I was wondering whether the article could do with some criticisms of confirmation bias. There is of course the methodological problem of how one would extract confirmation bias from any research on confirmation bias.

But the main thing, and the reason I ended up deciding to add to the discussion, was this: the first thing that struck me when reading the article (understanding, or so I believe, the implications of the theory) was that, having stated that people like to confirm their beliefs without recourse to potentially falsifying counter-arguments, or, as someone else suggested in the discussion, by ignoring evidence, the article did not contain any such counter-arguments itself. Neither did it contain any criticisms other than Klayman and Ha's. Indeed Klayman and Ha's critique, whilst critiquing Wason's experiment, is presented as one that was subsequently absorbed into the paradigm. I'm not saying this is bad; it indeed shows the later psychologists' ability to take on criticism. However, as that is the only critique here, that is a potential own goal. I would have thought that such scientific rigour would be important in such a subject. I think this theory is an important one, but on the importance of the self-reflexive application of one's own theory... If I come across any examples I shall add some. (I've only just come across this article.) Schizo stroller (talk) 20:07, 23 April 2011 (UTC) schizo stroller

The Explanations section includes more criticisms of the idea that it's an irrational process; for instance in the idea that the confirmation behaviour in social settings is an example of social skills. This article has to be guided by the major secondary sources on the topic: what criticisms from those sources are missing? MartinPoulter (talk) 13:12, 26 June 2011 (UTC)

Faulty math: "Illusory association between events"

The article reads:

"Days Rain No Rain Arthritis 14 6 No arthritis 7 2"

"In the above fictional example, arthritic symptoms are more likely on days with no rain. "

There are 29 days, 21 of them rainy, 8 not. The odds of being arthritic on a rainy day are 7/21, i.e. 0.333. The odds of being arthritic on a non-rainy day are 2/8, i.e. 0.250. So in the example arthritic symptoms are more likely on days with rain, no? — Preceding unsigned comment added by 24.77.29.136 (talk) 03:43, 26 June 2011 (UTC)

No, the statement in the article is correct. In your writing above, you seem to have reversed "Arthritis" and "No arthritis". MartinPoulter (talk) 13:07, 26 June 2011 (UTC)
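For anyone who wants to check the arithmetic themselves, here is a minimal Python sketch of the conditional probabilities implied by the article's table (the variable names are my own, not from the article):

```python
# Fictional 2x2 contingency table from the article:
# rows = arthritis / no arthritis, columns = rain / no rain
table = {
    "rain":    {"arthritis": 14, "no_arthritis": 7},
    "no_rain": {"arthritis": 6,  "no_arthritis": 2},
}

def p_arthritis_given(weather):
    """Probability of arthritic symptoms conditional on the weather."""
    cell = table[weather]
    return cell["arthritis"] / (cell["arthritis"] + cell["no_arthritis"])

print(round(p_arthritis_given("rain"), 3))     # 14/21 -> 0.667
print(round(p_arthritis_given("no_rain"), 3))  # 6/8   -> 0.75
```

Since 0.75 > 0.667, symptoms are indeed more likely on days with no rain, as the article says; swapping the two rows of the table is what produces the 7/21 versus 2/8 figures quoted above.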

Incorrect definition of the confirmation bias?

One issue with this article is the definition of the confirmation bias. Originally Wason intended it to mean only the biased search for evidence (i.e. the 2-4-6 task or the A K 4 7 selection task). Biased assimilation and belief perseverance are related but fundamentally different. — Preceding unsigned comment added by Jeffrotman (talkcontribs) 02:06, 1 July 2011 (UTC)

In order to reflect up-to-date sources, we can't just rely on Wason's original definition. The definition used by the article comes from a recent university-level textbook. Other sources explicitly mention assimilation and perseverance as phenomena arising from conf. bias. MartinPoulter (talk) 14:12, 4 July 2011 (UTC)

Explanations missing cognitive consistency

Hi, Excellent work with the article. However, I would argue that the most common explanation of the confirmation bias is cognitive consistency (e.g. cognitive dissonance, expectancy theory etc.) However, there is little discussion given to that topic in terms of the confirmation bias. Jeffrotman (talk) 23:58, 22 August 2011 (UTC) Jeff

The connection is mentioned fleetingly in the third para of the Explanations section, but maybe it could be expanded. What sources do you recommend? MartinPoulter (talk) 13:40, 24 October 2011 (UTC)

Guns and attitude

From the article: "For example, in reading about gun control, people usually prefer sources that affirm their existing attitudes." Is this statement based upon an actual real-world study (which possibly should be referenced to if so; given the political nature of the decision) or is a hypothetical example to illustrate the point (which should also be labelled as so)?

If you read down a bit you come to the Confirmation_bias#Polarization_of_opinion section which explains and cites the relevant study. Please sign your posts. Thanks. MartinPoulter (talk) 13:38, 24 October 2011 (UTC)
I was about to mention the same; it seems a bit non-NPOV, unless the "example" of gun control is self-referential confirmation bias, in which case it should be identified as such. btw, I'm not the author of the previous unsigned entry. Signed. You're welcome. ~Eric F 184.76.225.106 (talk) 00:29, 7 March 2012 (UTC)
I don't understand how that summary of a significant experiment can be interpreted as non-NPOV. What POV is it pushing, and how does it not fairly represent the source? This comes up again and again that the mere mention of gun control as an issue, without taking a position on it, is somehow (unfairly) perceived as advancing a position. MartinPoulter (talk) 13:53, 25 May 2012 (UTC)
The current wording does not present a POV problem, whatsoever. Our neutral point of view policy directs us, in writing articles, not to take sides between significant, reliably sourced views in the disputes we may describe. This article obviously does not take a side on the issue of gun control. Seems simple enough, but unfortunately, "non-NPOV" is tossed around freely and inaptly all too often here by editors who don't understand what the policy actually concerns.—DCGeist (talk) 14:19, 25 May 2012 (UTC)

Verification bias

"Verification bias" shouldn't be used as a synonym for "confirmation bias". Strictly, "verification bias" refers to something which probably doesn't exist. As the article explains, Wason thought that people had a tendency to test hypotheses in a way that sought verification of their existing beliefs. However, Wason later changed his view on this and later work (such as Klayman and Ha) showed that people were actually testing their hypotheses in an informative way. However, a number of effects have been shown to exist whereby people favour information that confirms existing beliefs, attitudes and hypotheses, and the main sources of the article use "confirmation bias" collectively to refer to these effects. So confirmation bias is definitely a thing according to modern psychology, and verification bias isn't. They aren't synonyms. You can find "confirmation bias" referred to in many papers and textbooks, while "verification bias" was an obscure and short-lived coinage. So the latter belongs in a footnote and I'll remove it from the lede. MartinPoulter (talk) 13:53, 25 May 2012 (UTC)

Merge proposal

I think Backfire effect and Confirmation bias cover the same ground. Since Wikipedia articles are about things, not words, having two separate articles is a form of content forkery. — Ƶ§œš¹ [ãːɱ ˈfɹ̠ˤʷɪ̃ə̃nlɪ] 13:11, 10 September 2012 (UTC)

  • Agree. Good argument! Lova Falk talk 18:56, 10 September 2012 (UTC)
  • Yep. It seems specifically that attitude polarisation (covered in a section of this article) is the same or closely related. I haven't looked at the sources yet, but do they mention any link? MartinPoulter (talk) 16:21, 21 September 2012 (UTC)
  • I agree as well. Same durn thing. DeistCosmos (talk) 15:24, 22 September 2012 (UTC)
  • Forkery, indeed! Gulbenk (talk) 00:33, 1 October 2012 (UTC)
  • I agree the articles should be merged. IjonTichyIjonTichy (talk) 02:58, 31 October 2012 (UTC)
  • Comment I'm not an expert on cognitive psychology, so I will avoid !voting. However, to the extent backfire effect belongs in the confirmation bias article, I think it would belong under the "related effects" section. --Jprg1966 (talk) 23:11, 2 November 2012 (UTC)
  • The Backfire Effect is NOT the same as Confirmation Bias. Confirmation Bias is a tendency to seek out information that confirms existing views. The Backfire Effect is the tendency for contradictory information to actually strengthen existing views. The two phenomena also differ substantially in the amount and kind of research supporting their existence. If Backfire Effect is mentioned in this article, it should be made clear that the concepts are different in important ways, and research supporting the existence of Confirmation Bias does not necessarily support the existence of the Backfire Effect. — Preceding unsigned comment added by 24.123.14.50 (talk) 23:01, 20 November 2012 (UTC)
  • Agree TheOriginalSoni (talk) 14:01, 12 December 2012 (UTC)

Done, per talk page consensus. TheOriginalSoni (talk) 14:01, 12 December 2012 (UTC)

"Informal Observation" in history

While I was reading that section, I immediately recalled a quote from Greek philosopher Epictetus (c. 55 - c. 135 AD):

The form I recalled:

It is impossible for a man to learn what he thinks he already knows.

I wonder if that quote could be considered "close enough" to be included.


But, being vigilant for confirmation bias, I have to include the more complete version of the quote, which is given as:

If a man would pursue Philosophy, his first task is to throw away conceit. For it is impossible for a man to begin to learn what he has a conceit that he already knows.
(http://en.wikiquote.org/wiki/Epictetus)


I didn't remember it with the "conceit" angle. Did he consider the self-importance and hubris aspect to be the main driver, or the contingency of which viewpoint/knowledge you happened to get in touch with first, which then colors your evaluation of later information?

JH-man (talk) 11:23, 11 February 2013 (UTC)

"Current political issues"

This is mainly directed at Lova Falk. Lova, thanks for your improvements to this article and for the heroic work you continue to do on many psychology articles. This is about this edit. The Taber and Lodge study is one of the central studies in this topic, and the topics they examined were gun control and affirmative action. The point is that it's emotive, contentious issues that trigger the bias effects, not political issues. Political issues are generally boring and unemotional to a lot of people. That's why the article originally gave gun control as an example (because for gun control there is one of the clearest experimental demonstrations), and why it's not strictly correct to replace this with the more general phrase "current political issues". The change was originally made by an anonymous user in an unexplained edit, and isn't based on the sources. Using affirmative action as an example, instead of gun control, would be just as valid: I think that affirmative action is a less internationally-understood phrase. I don't feel strongly about that, but we need a specific example in order to be correct. MartinPoulter (talk) 15:38, 1 June 2013 (UTC)

Hi MartinPoulter, thank you for your comment! I undid my edit. What happened was that I worked from this diff but didn't read your edit summary. So I thought the whole thing happened the other way around: originally it was about current political issues, and it was edited into gun control. A good reminder I should not revert edits without checking edit summaries. Thank you! Lova Falk talk 18:09, 1 June 2013 (UTC)
MartinPoulter, do you have a source for the "gun control" information? Who is making this claim? I don't see why you want to have an unexplained, unsourced "example" in the lead/introduction. It seems like the subject should either be defined and introduced without an example, or if an example is needed it should have further explanation of what it means and why it is relevant or necessary to have this in the introduction. --Mikiemike (talk) 16:38, 4 June 2013 (UTC)
I don't understand why you're asking here and not just reading the article. Your web browser probably has a search function to find particular words (gun, Taber, Lodge) within a page you're reading. This is the relevant section: Confirmation_bias#Polarization_of_opinion. The lede summarises the article: it's not normally needed to have references in the lede if the information is sourced within the article itself. Why do I want an example in the lede? Simple: concrete examples help people understand the topic. MartinPoulter (talk) 10:53, 5 June 2013 (UTC)

"You are not so smart"

Paging User:Lova Falk re this edit:[2]. The "You are not so smart" blog and book are not written by an expert, but they do reflect a mainstream scientific understanding of the topics, sourced to the same mainstream academic sources that are used in this and similar Wikipedia articles. What's more, the writing makes the topics more accessible than the Wikipedia article does. The external links policy says to consider "Sites that fail to meet criteria for reliable sources yet still contain information about the subject of the article from knowledgeable sources." YANSS meets this exactly, so I'd like to re-include it in the external links for this article. Cheers, MartinPoulter (talk) 12:45, 20 June 2013 (UTC)

Thank you for telling me MartinPoulter! Lova Falk talk 19:48, 20 June 2013 (UTC)

Biased Interpretation and Intelligence

I’d like to add another study to the biased interpretation section in relation to intelligence levels. In a study conducted by Stanovich and West, myside bias was shown to be persistent in belief formation and interpretation, regardless of intelligence level. Participants had to evaluate if they would allow a dangerous German car on American streets and a dangerous American car on German streets. Both groups completed an intelligence assessment before taking the surveys. Participants responded to a scale ranging from one to six, where one indicated “definitely yes” and six indicated “definitely no,” based on their opinion of whether the car should be banned. Results indicated that participants believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets. When people were not warned in advance to avoid biased processing, participants with higher intelligence were just as likely to demonstrate myside bias as those with lower intelligence scores. These results indicated a sizable myside bias, for the rate at which participants would ban a car did not differ among intelligence levels. There is no association between the magnitude of bias and intelligence. Individuals, regardless of their respective intelligence levels or cognitive ability, will still commit errors in rational thinking.

Stanovich, K. E., & West, R. F. (2008b). On the relative independence of thinking biases and cognitive ability. Journal of Personality and Social Psychology, 94, 672–695. --Hacleek (talk) 22:42, 28 October 2013 (UTC)

Hello @Hacleek: and thanks for your improvements to the article. You've taken on an extraordinarily difficult assignment, improving an article that's already at Featured Article status. I have some questions about your additions. First, for the paragraph that begins "Myside bias has been shown to influence the accuracy of memory recall." the paragraph seems to describe a bias in memory, but does not seem to relate to myside bias, also called confirmation bias. Could you post here a quote from the source that establishes that this is indeed a case of myside bias? MartinPoulter (talk) 20:43, 7 December 2013 (UTC)
Hi @MartinPoulter:, thanks for your message. The Safer article relates primarily to wishful thinking. The wishful thinking article on Wiki defines this as "the formation of beliefs and making decisions according to what might be pleasing to imagine instead of by appealing to evidence, rationality, or reality." Safer et al.'s purpose of this article was to demonstrate that current emotional appraisals influence how individuals recall emotionally traumatic information, as they hypothesized " (1) participants would overestimate past grief; (2) current levels of grief would predict recalled grief; (3) recalled grief would predict current coping; and (4) participants who improved the least would be particularly likely to overestimate past grief" (197). They found presence of memory distortion when participants were asked to recall initial grief levels. "Objective change in intrusive ideation correlated significantly with memory distortion for avoidant thoughts (r = −.35, p < .05) and for grief symptoms (r = −.43, p < .01), and objective change in avoidant thoughts correlated significantly with memory distortion for grief symptoms (r = −.33, p < .05)" (201).
Previous research has shown that individuals in therapy will overestimate recalling prior emotional distress in order to exaggerate the benefits of therapy (202). This team found that "Although most participants reported much less grief at 5 years than at 6 months, those who improved relatively little tended to overestimate their initial grief, perhaps so as to feel good about however much they had improved" (202). Studies by Schwarz et al. and Southwick et al. showed that "one’s current emotional state affected what was recalled, but we would also argue that what was recalled, including the retrospective reappraisal of how one coped, contributed to and justified the current emotional state."
People will utilize wishful thinking principles when recalling previous traumatic memories in order to justify current emotional states, i.e., participants in the Safer study overestimate their prior grief levels when asked to recall how they felt 6 months after the death. The participants wanted to believe that they were doing much better five years later. In reality, the way that they felt five years later more closely resembled how they felt 6 months following the death rather than how they recalled the period after 6 months. I think that this relates to myside bias because: 1) the participants favored information (grief-related intrusive ideation) that confirmed their beliefs that time or therapy helped; 2) people remembered the information selectively and interpreted it in a biased way to show that time helped; and 3) we know that the effect for myside bias is stronger for emotionally charged issues. Would you have any recommendations to making this relate to wishful thinking more explicitly? Thanks for your time.--Hacleek (talk) 22:20, 7 December 2013 (UTC)
Hi all. Thanks for the explanation Hacleek, but the fact that your explanation is so elaborate only highlights to me the fact that the research was not conducted with confirmation bias or myside bias in mind. That is, the explanation you are describing was not in the source article but rather is your own interpretation. As such it is a pretty clear case of original research on your part. Moreover, I think that by trying to shoehorn this content into the article you are adding confusion to an already confusing article. My recommendation would still be to exclude this content altogether (as per my earlier removal). Does this make sense to you? Cheers Andrew (talk) 04:57, 8 December 2013 (UTC)
Thanks for the quick reply, Hacleek. The finding you're talking about is really interesting and belongs on Wikipedia somewhere, but like Andrew above, I'm not convinced this is the right article. There are lots of biases in memory, judgement, or decision-making and "confirmation bias" or "myside bias" are not catch-all terms for any of these biases but refer to specific phenomena. In particular, wishful thinking (the shaping of beliefs/interpretation by desires) is not necessarily the same as myside bias (when the assessment of evidence is shaped by prior beliefs or hypotheses). Wishful thinking might be one of the possible explanations of confirmation bias, but that doesn't mean that any finding relevant to wishful thinking belongs in an explanation of myside bias. Emotion and memory seems to be a better place to preserve your paragraph. MartinPoulter (talk) 11:12, 8 December 2013 (UTC)
On another topic, the "individual differences" section has "Myside bias can cause an inability to effectively and logically evaluate the opposite side of an argument." On the face of it, that doesn't seem right, because isn't it fair to say that myside bias is an inability to evaluate opposing arguments? It seems like you're saying that myside bias is its own cause. I've replaced the phrase "mutable and subject to change over time" with just "mutable" because it means the same thing. The sentence "These data also reveal that personal belief is not a source of myside bias.," is another that seems odd. If people don't have a personal belief, then they don't have a "side", so how can they exhibit myside bias? Thanks in advance for any help explaining this. MartinPoulter (talk) 11:28, 8 December 2013 (UTC)

Hi all, and thanks for your suggestions regarding my writing. I understand why the Safer article might be better applicable for a different article. The "individual differences" section was actually updated by @Jamela Peterson: - might be helpful if she joins us in discussing her reasoning for writing in that way, as she could answer that question better than I could.--Hacleek (talk) 19:19, 9 December 2013 (UTC)

First sentence

@WikiTryHardDieHard: Sorry I've had to undo your latest edit. The definition that you pasted into the first sentence is the original definition of confirmation bias, rather than the sense in which the term is used today. The history section of the article explains this change. Your first sentence would have contradicted the rest of the article. Note that there is relatively little or ambiguous evidence that people test hypotheses in a way that seeks to confirm hypotheses, but the evidence is stronger (according to the Oswald ref) for the other effects mentioned in the article, which would not be covered by your lead sentence. The lead sentence is what passed Featured Article Review. MartinPoulter (talk) 22:20, 27 March 2014 (UTC)

Note 2

Why do we have such a very long quote about confirmation bias in government, when more has been published about confirmation bias in corporations and financial markets (which is why the lead uses the phrase "organizational contexts")? Maybe I'm being over-sensitive, but it smacks of agenda-pushing. MartinPoulter (talk) 15:37, 15 September 2014 (UTC)

VDM Publishing

It seems to me that "Miller, Frederic P.; Vandome, Agnes; McBrewster, John (2009), Confirmation Bias, VDM Publishing, ISBN 6-130-20015-3" (note no. 2) is simply a print-on-demand copy of this article and should be replaced by a reliable source or removed. — Александр Крайнов (talk) 15:04, 29 October 2014 (UTC)

VDM Publishing is notorious for reproducing wikipedia. All "books" published by them and their subsidiaries need to be double checked, especially those whose ISBN begins with (978)613 (apparently the Mauritius office is devoted to wikipedia copies) Ihaveacatonmydesk (talk) 20:31, 29 October 2014 (UTC)

"Myside" bias

The article is called "Confirmation bias", and "confirmation bias" is mentioned as the primary name of the concept. "Myside" bias is mentioned as something it is "also called", rather parenthetically. When I read the introduction, I did not pay much attention to the name "myside bias", as I always focus on the primary name. And then I became confused when later in the article "myside" bias is suddenly used a lot (my thought was: "Does myside bias mean something different that confirmation bias, since now it is suddenly referred to as "myside bias"? Are we talking a specific submeaning here?"). The concept is called "confirmation bias" 51 places and "myside bias" 17 places. If "confirmation bias" is considered to be the best name, why not use it all over the article? I find it confusing that the article switches between the two; as a reader, I believe there should be some kind of consistency within the article. We did mention early on that myside bias is a synonym, so that fact is there if people need it. But I see no reason to mix the two names rather arbitrarily within the article; I only find that confusing as a reader.

Also, does "myside bias" really mean the exact same as "confirmation bias"? "Myside bias" seems to be something always related to "my side" of things, but does "confirmation bias" really always relate to "my side" of things? Couldn't confirmation bias happen even if there is no "me" involved, but just as a simple logical malfunction of the brain (which seems to be what we are talking about)? I tend to see some nuance, some slight difference, between the meanings of the (in that case two) concepts; as if "myside bias" is a subset of "confirmation bias".

--Jhertel (talk) 14:04, 8 December 2014 (UTC)

Hi Jhertel. This isn't my area of expertise, but in the course of my limited exposure I have not come across any argued distinction between 'myside bias' and 'confirmation bias'. Moreover, I believe confirmation bias is the more common term so it seems to me that adopting that term as the uniform vernacular in this article would be the way to go. I would be keen to hear what others have to say though.
In terms of your own reasoning about a potential distinction, I actually don't see a way for confirmation bias to exist without the 'me'. Confirmation bias is about the way the theories we hold about the world inform our interpretation of stimuli. As such, if there is no "me" then there are no theories, and if there are no theories then there is no way to interpret the world in light of them; thus, no confirmation bias. Does that make sense to you? Cheers Andrew (talk) 01:52, 11 December 2014 (UTC)
Hi Andrew. Thanks for your feedback! About the potential distinction: While I do believe I understand what you mean (in its extreme: nothing exists unless there is someone (a "me") to sense it), what I was thinking was that a person could have multiple different theories in his thoughts at the same time, without calling any one of them "my" theory, but simply "one" theory among others. In that case there is no "my side", but just "sides" or plainly "theories" without any feeling of ownership. What I was thinking was that there could perhaps still be confirmation bias even when a person is trying as neutrally as possible to investigate one theory among many he wants to test – even if that person did not in any way believe that theory to be "his" ("my") theory or "the right" theory. I think the name "myside bias" seems to insinuate some kind of ownership feeling that is not necessarily there and might not be necessary for confirmation bias to appear. At least that is my theory ;-) – which could very well be affected by in this case "myside" bias, as I do feel some kind of myness connected to that theory :-). Which is why I would also prefer that more people stated their view on the subject. :-) --Jhertel (talk) 18:09, 11 December 2014 (UTC)
Hi Jhertel. I see the distinction that you are trying to make, but I don't think that approach is reflected in the literature. As I understand it (and again, not exactly my area) both confirmation bias and myside bias are about the perceptual effects of believing something to be true. As such, both necessitate some ownership of the idea/theory/attitude. What you are describing could be true (i.e. that knowing of, but not believing in, a theory may result in confirmation bias), but this is a novel empirical question and not an issue for this article. Of course, if you come across some sources that do support that line of thinking then it does potentially become something that should get some coverage here. Does that resonate with you? Cheers Andrew (talk) 10:34, 16 December 2014 (UTC)
It resonates completely with me, Andrew! --Jhertel (talk) 19:57, 16 December 2014 (UTC)

Unclear sentence

The last sentence in the following text from the article (from this section) is unclear and does not make logical sense:

"In one experiment, participants read a set of psychiatric case studies, including responses to the Rorschach inkblot test. The participants reported that the homosexual men in the set were more likely to report seeing buttocks, anuses or sexually ambiguous figures in the inkblots. In fact the case studies were fictional and, in one version of the experiment, had been constructed so that the homosexual men were less likely to report this imagery."

What does the single fact that the case studies were fictional say about the reports of the participants? I don't see it. Note that only in one version of the experiment the cases had been constructed "so that the homosexual men were less likely to report this imagery", implying that they were not constructed that way in other versions of the experiment. So, in the other versions (or other version), we only know that the cases were constructed, but nothing is said about in which way; they could in fact have been constructed so that the homosexual men actually were more likely to report this imagery – in which case the participants would be absolutely right in their observations in these versions of the experiment.

So some information is missing for the above quoted text to make sense. I do not know enough about the subject to figure out what is missing, but I wanted to report the error so that someone more knowledgeable on the subject can correct it.

--Jhertel (talk) 13:40, 8 December 2014 (UTC)

It's been a long while since I checked this Talk page, but this is a valid criticism and I've re-worded the section to be more clear. MartinPoulter (talk) 15:34, 16 December 2015 (UTC)

Merger with Subjective validation

Merge - I think that Subjective validation should be merged into this article because it is very short and is essentially another name for the same thing. The merge wouldn't be much, probably just a mention of the information or an AKA. JoshBM16 (talk) 17:07, 16 December 2015 (UTC)

I saw that the subjective validation article mentions it is closely related to the Forer effect and so checked the confirmation bias article and discovered it never mentions the words "Forer", "Barnum", nor "synchronicity". Also, the Barnum effect article has a definition for subjective validation that's quite different than the one provided in the subjective validation article. While digging into that I found that the term "subjective validation" came from a 1980 book and so added that to the SV article. Subjective validation, Forer effect, Barnum effect, synchronicity, and confirmation bias, may well all be blind men's descriptions of the same elephant but I suspect it'll be hard to get them in the same room. --Marc Kupper|talk 05:57, 10 February 2016 (UTC)
  • I also oppose this because I disagree strongly that "subjective validation" "is essentially another name for the same thing" as confirmation bias. I don't see any sign of them being treated as synonyms in the literature. They're part of the same topic, sure, but I can't see why they'd be seen as the same thing. It at least needs some argument. MartinPoulter (talk) 11:58, 10 February 2016 (UTC)
  • I oppose the merger because confirmation bias predates subjective validation and was established by scientific testing. I see no particular need to admix these different concepts, and a good reason not to. That is, subjective validation is defined rather subjectively with vague definitions, and by comparison, confirmation bias is crisply defined and mechanistic. CarlWesolowski (talk) 17:34, 18 May 2016 (UTC)

Related to "Power corrupts"?

Seems to me that it would be much easier for the powerful to succumb to confirmation bias, since without disproportionate power, one is more likely to need to examine one's justifications for choices. -lifeform (talk) 05:02, 30 May 2016 (UTC)

External links modified

Hello fellow Wikipedians,

I have just modified 2 external links on Confirmation bias. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ (User:Cyberpower678/FaQs#InternetArchiveBot) for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

 Y An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot II (Talk to my owner: Online) 08:10, 2 June 2016 (UTC)

Belief Perseverance (BP)

Hi Martin,

We talked briefly in the past about a minor revision of your excellent article.

On June 25, 2016 I submitted my first article for consideration. The topic is Belief Perseverance, and you can see the submission, and the reason it was declined, here: https://en.wikipedia.org/wiki/Draft:Belief_perseverance

This was my first submission ever, and it took about 2 weeks to put it together, so I’m a bit discouraged—to the point of finding another retirement hobby. At the same time, I suspect that the reviewer is right, and that it might be written more like an essay than an entry in an encyclopedia.

The reviewer, Omni Flame, later suggested that I incorporate the article into your Confirmation Bias article, making it part of the Confirmation Bias entry. He noted that BP already redirects to CB, and thus we can avoid the hassle of making a new entry. I feel CB and BP are distinct, though overlapping. What do you think? Would it be appropriate to incorporate my BP into your CB? If so, would you be interested in editing the draft before I incorporate it?

If on the other hand you feel that BP deserves its own entry, would you be willing to revise my draft and perhaps submit it as your own entry? I hope to hear from you soon. I obviously don’t think it’s appropriate to make such a major revision in a great article without discussing this matter with you first!

With many thanks, Brachney Brachney (talk) 23:41, 26 June 2016 (UTC)

Focus on negative aspects

This article very much focuses on the negative aspects of confirmation bias. And, yes, there are many. But it leaves out why confirmation bias still exists. And it exists because it is *useful* (more often than not). In huge amounts of data (with no *apparent* conclusion to be found and careful analysis taking too long a time (or just being plain impossible)) confirmation bias allows us to come up with a "working theory" very quickly. And while the individual might very well be wrong with his theory (but still being more likely to be right) the group is most likely to reach consent on the best theory (because of positive feedback mechanisms). Obviously that's true only as long as the best theory is supported by more data (or things that are perceived as data). So in short confirmation bias is a mechanism that allows you to arrive at conclusions almost instantly - but at the cost of guaranteed correctness (which might take years or lifetimes). And as much as we like to think of ourselves as rational animals... confirmation bias is still how we go through our day. When we are on a street alone at night it is confirmation bias that drives our decision on whether we should cross the street and avoid the person coming down on our side. The sweet old grandma might actually have a gun and shoot us, the person with gang tattoos might be a loving father of 4 who couldn't hurt a fly. But with the data at hand (and considering the timeframe available for making the decision) confirmation bias is the only way to reach a decision (that's more helpful than going 50-50). I feel like the negative aspects of confirmation bias have been covered enough, so I won't dive into that here.

I think it is important to point out that it is useful in some situations. Because only then can you move forward to discussing when it is not (which is whenever you have the time to analyze and the means to get more hard data). --95.89.98.17 (talk) 13:33, 21 October 2016 (UTC)

I think the general issues raised above are adequately addressed in the encyclopedia. For example: (1) Confirmation bias § Biased search for information states: "The preference for positive tests in itself is not a bias, since positive tests can be highly informative." (2) Heuristics in judgment and decision-making states: "This heuristics-and-biases tradition has been criticised by Gerd Gigerenzer and others for being too focused on how heuristics lead to errors. The critics argue that heuristics can be seen as rational in an underlying sense. According to this perspective, heuristics are good enough for most purposes without being too demanding on the brain's resources. Another theoretical perspective sees heuristics as fully rational in that they are rapid, can be made without full information and can be as accurate as more complicated procedures." (3) See also Reason § Logical reasoning methods and argumentation and the links to other articles therein. Biogeographist (talk) 14:46, 21 October 2016 (UTC)

Backfire effect

This term presently redirects here, and I don't think it's well served to do so. There's a lot of media coverage right now for "backfire effect" and it's a much more direct idea, which makes this page a poor substitute for most searchers. — MaxEnt 01:17, 3 November 2016 (UTC)

@MaxEnt: If you are willing to make an effort to improve the article, you could expand the sentence about backfire effect into a section with that title, and alter backfire effect so that it redirects to that section specifically. Backfire effect is just one of many terms for closely interrelated (if not identical) cognitive phenomena; see also: Belief perseverance, Cognitive dissonance, Defence mechanisms, Denialism, Psychological resistance, Self-justification, Semmelweis reflex, etc. Biogeographist (talk) 12:36, 3 November 2016 (UTC)
That's not a bad idea, but I'm not sure I'll get to it myself. Perhaps. — MaxEnt 16:53, 3 November 2016 (UTC)

Confirmation Bias Sources

This is a very well-written and cited article. Most of the sources are academic papers with research and facts to back up their claims, and all facts in the article are cited with a footnote to the source. However, upon looking at the references, most of them are from the mid-1990s, which is about twenty years ago. I am wondering how well this information and research holds up today in the digital age, when most people get their information from the internet, rather than the local paper or evening news. Personally, I would like to see more sources like the one from 2015, which discuss social media and its effect on confirmation bias. Overall, however, the article is well-rounded and describes the topic of confirmation bias very well. Maddienousaine (talk) 18:14, 14 March 2017 (UTC)

Evaluation of the article

This article seems to have a good amount of links to definitions or further explanations of words in the article with other wikipedia articles, and the external sites used seemed good, but I was confused a little about the reliability of the activity for information. One other thing that I took particular note of was that all three of the areas under the "Types" category had a good amount of examples in them, but I feel the "Biased Interpretation" section had less actual information in it. Shayma0414 (talk) 05:52, 16 March 2017 (UTC)

Article Review

Overall I thought this was a very solid article. However, there were two specific things that I thought could potentially use improvement. I thought the introduction was a little wordy and could have been more concise. It wasn't bad but as its purpose is to give someone a very quick idea of what the article talks about I thought it could be slightly simpler. The other thing I noticed was that in the first section, talking about bias when searching for information, there was a bit of a tangent about seeking positive results. It was stated in this portion that searching for positive results is not a bias because one would draw the same conclusion as if they searched for negative results. To me this seemed to stray a little bit from the concept of bias in research and more just to people's tendencies that do not likely affect the results. This could be wrong but it just seemed a little off topic to me. Weavere1998 (talk) 16:25, 16 March 2017 (UTC)

Confirmation Bias Critique

I don't see any aspects of this article that have major flaws. In fact, it's been identified as one of the best the Wikipedia community has written. Looking at the edit history of the article, I noticed there were a few times where users mentioned website references that had broken links or opened irrelevant webpages. I assume this is a persistent issue dealt with across all Wikipedia articles as the internet is constantly evolving and expanding. In addition, some studies or references cited for use in the article seem to be outdated. Many references come from books and journals published in the 1980s and 1990s. One would think more recent information on this topic has been published to incorporate our ever changing society. -Reed1998 (talk) 04:40, 17 March 2017 (UTC)

religion, creationism, esoterism

I'm astonished that religion, creationism, esoterism are not mentioned in this article. — Preceding unsigned comment added by 178.82.249.85 (talk) 19:28, 15 July 2017 (UTC)

It has "In the paranormal" however (I'm not saying it cannot be improved of course)... —PaleoNeonate - 19:42, 15 July 2017 (UTC)

Confirmation Bias in Science

@PaleoNeonate: I support Jhanna2's addition of confirmation bias in science to the lede, because there is a large section on it in the body. Pursuant to WP:LEDE, it must be covered. The 2013 reference is recent. The other two references of 1977 and 1974, I have no position on: They are older, but if they are relevant and make the case, I don't see a problem, unless more recent material contradicts their finding or is somehow more on point. Without a link to the materials, I have no easy way to verify their relevance. @Jhanna2: --David Tornheim (talk) 12:50, 28 December 2017 (UTC)

Other than the source's age, I considered it a bit suspicious considering an earlier edit using the same source ([3] which was reverted by another editor). If it seems adequate here, all for the better. Confirmation bias is indeed one of the issues that good science takes in consideration and attempts to minimize using various methods. Thanks, —PaleoNeonate – 20:04, 28 December 2017 (UTC)
I see what you mean about diff [1] in William A. Dembski and the previous edits on that subject. I definitely agree with those who have reverted Jhanna2 at that article. However, that diff looks more like WP:OR--and I see that was the reason NeilN reverted. I would have done the same. Here, however, the source might be relevant. Given that I can't easily verify it, and we have other sources on the subject, it's okay with me if you remove all the sources (or replace them with one good one from the body), as long as the line remains.
"Confirmation bias is indeed one of the issues that good science takes in consideration and attempts to minimize using various methods." True, that is the goal, and I like your use of the word "attempt." As the sources show, especially the ground-breaking text by Thomas Kuhn, The Structure of Scientific Revolutions, or the work of Michel Foucault, scientists are "Human, all too Human" [4] and often stick to the traditional or institutional view even when strong evidence proves otherwise, until some revolution topples the traditional thinking. Think what happened to Galileo and Copernicus for the blasphemy of challenging the most current knowledge of science handed down from Aristotle. Did these two really have the arrogance to say they knew better than one of the most brilliant minds of the Western world, Aristotle?  :) --David Tornheim (talk) 21:14, 28 December 2017 (UTC)

Backfire Effect Not Significant

I’m a new user; I’d like input from more experienced users to decide whether the "Confirmation Bias" page should be edited to include this new information:

A paper published in SSRN on 12/31/2017 challenges conclusions drawn about the “Backfire Effect.”

The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2819073

Excerpt from the paper’s abstract: “Evidence of factual backfire is far more tenuous than prior research suggests. By and large, citizens heed factual information, even when such information challenges their ideological commitments.”

Further information about the paper (and the source for this section’s subject/headline) is here: https://theness.com/neurologicablog/index.php/backfire-effect-not-significant/

Abd1771 (talk) 14:53, 4 January 2018 (UTC)

Confirmation bias and "synchromysticism"?

Regarding this recent edit (which I reverted), I have started a discussion at Wikipedia talk:WikiProject Skepticism § Confirmation bias and "synchromysticism"? Thanks, Biogeographist (talk) 13:31, 24 June 2018 (UTC)

Why revert the criticism section with inline citations?

Why was the most recent edit with the criticism section reverted? The references were placed inline as the tutorial said. 98.128.166.173 (talk) 16:40, 21 April 2020 (UTC)

the changes read like an essay you have written. There is a position that biases are actually heuristics and as it happens I agree with it, but it is a minority view. I don't think the references fully support what you say anyway. Something a lot shorter and closer to the sources possibly but not an essay - a lot looked like synthesis -----Snowded TALK 16:43, 21 April 2020 (UTC)
It is not an essay. It is written as a regular Wikipedia criticism section, which often contains views that are different from those of other, non-criticism sections in the same article. I can find no rule on Wikipedia that says that criticism sections must be shorter than other sections. And your assumption about the references proves nothing. 98.128.166.173 (talk) 17:08, 21 April 2020 (UTC)
There is a rule on reaching consensus with other editors. Unless you have edited wikipedia before (in which case you should have declared it) this is your first piece of work. I suggest you find a couple of quotes from your citations which link with what you want to say. We also have to be proportionate - the suggestion that biases are actually heuristics is a minority view, and we therefore need to be proportionate. -----Snowded TALK 17:50, 21 April 2020 (UTC)

Wiki Education Foundation-supported course assignment

  This article is or was the subject of a Wiki Education Foundation-supported course assignment. Further details are available on the course page. Student editor(s): Fox1988CF.

Above undated message substituted from Template:Dashboard.wikiedu.org assignment by PrimeBOT (talk) 18:18, 16 January 2022 (UTC)