Illusory truth effect
The illusory truth effect is the tendency to believe false information to be correct after repeated exposure. The phenomenon was first identified in a 1977 study at Villanova University and Temple University. When truth is assessed, people rely on whether the information is in line with their understanding or whether it feels familiar. The first condition is logical, as people compare new information with what they already know to be true. Repetition makes statements easier to process relative to new, unrepeated statements, leading people to believe that the repeated conclusion is more truthful. The illusory truth effect has also been linked to hindsight bias, in which the recollection of confidence is skewed after the truth has been received.
In a 2015 study, researchers discovered that familiarity can overpower rationality and that repeatedly hearing that a certain fact is wrong can affect the hearer's beliefs. Researchers attributed to "processing fluency" the effect's power to sway participants who knew the correct answer to begin with but who were persuaded to believe otherwise through the repetition of a falsehood.
The illusory truth effect plays a significant role in such fields as election campaigns, advertising, news media, and political propaganda.
Initial study
The effect was first named and defined following the results of a 1977 study at Villanova University and Temple University in which participants were asked to rate a series of trivia statements as true or false. On three occasions, Lynn Hasher, David Goldstein, and Thomas Toppino presented the same group of college students with lists of sixty plausible statements, some of them true and some of them false. The second list was distributed two weeks after the first, and the third two weeks after that. Twenty statements appeared on all three lists; the other forty items on each list were unique to that list. Participants were asked how confident they were of the truth or falsity of the statements, which concerned matters about which they were unlikely to know anything. Specifically, they were asked to grade their belief in the truth of each statement on a scale of one to seven. While the participants' confidence in the truth of the non-repeated statements remained steady, their confidence in the truth of the repeated statements increased from the first to the second and from the second to the third sessions, with the average score for those items rising from 4.2 to 4.6 to 4.7. The researchers concluded that repeating a statement makes it more likely to appear factual.
In 1989, Hal R. Arkes, Catherine Hackett, and Larry Boehm replicated the original study, with similar results showing that exposure to false information changes the perceived truthfulness and plausibility of that information.
The effect works because when people assess truth, they rely on whether the information agrees with their understanding or whether it feels familiar. The first condition is logical, as people compare new information with what they already know to be true and consider the credibility of both sources. However, researchers discovered that familiarity can overpower rationality, so much so that repeatedly hearing that a certain fact is wrong can have a paradoxical effect.
Relation to other phenomena
Processing fluency
At first, the truth effect was believed to occur only when individuals are highly uncertain about a given statement. Psychologists had also assumed that "outlandish" headlines would not produce the effect, but recent research shows that the illusory truth effect is indeed at play with false news. The first of these assumptions was challenged by the results of a 2015 study by Lisa K. Fazio, Nadia M. Brashier, B. Keith Payne, and Elizabeth J. Marsh. Published in the Journal of Experimental Psychology, the study suggested that the truth effect can influence participants who actually knew the correct answer to begin with, but who were swayed to believe otherwise through the repetition of a falsehood. For example, when participants encountered on multiple occasions the statement "A sari is the name of the short plaid skirt worn by Scots," some of them were likely to come to believe it was true, even though these same people were able to correctly answer the question "What is the name of the short pleated skirt worn by Scots?"
After replicating these results in another experiment, Fazio and her team attributed this curious phenomenon to processing fluency, the facility with which people comprehend statements. "Repetition," the researchers explained, "makes statements easier to process relative to new statements, leading people to the false conclusion that they are more truthful." When an individual hears something for a second or third time, their brain responds faster to it and misattributes that fluency as a signal for truth.
Hindsight bias
In a 1997 study, Ralph Hertwig, Gerd Gigerenzer, and Ulrich Hoffrage linked the truth effect to the phenomenon known as "hindsight bias", described as a situation in which the recollection of confidence is skewed after the truth or falsity has been received. They described the truth effect as a subset of hindsight bias.
Other studies
In a 1979 study, participants were told that repeated statements were no more likely to be true than unrepeated ones. Despite this warning, the participants perceived repeated statements as being more true than unrepeated ones.
Studies in 1981 and 1983 showed that information deriving from recent experience tends to be viewed as "more fluent and familiar" than new experience. A 2011 study by Jason D. Ozubko and Jonathan Fugelsang built on this finding by demonstrating that, generally speaking, information retrieved from memory is "more fluent or familiar than when it was first learned" and thus produces an illusion of truth. The effect grew even more pronounced when statements were repeated twice, and more pronounced still when they were repeated four times. The researchers thus concluded that memory retrieval is a powerful method for increasing the perceived validity of statements and that the illusion of truth is an effect that can be observed without directly polling the factual statements in question.
A 1992 study by Ian Maynard Begg, Ann Anas, and Suzanne Farinacci suggested that a statement will seem true if the information seems familiar.
A 2012 experiment by Danielle C. Polage showed that some participants exposed to false news stories went on to develop false memories. The conclusion was that repeated false claims increase believability and may also result in errors.
In a 2014 study, Eryn J. Newman, Mevagh Sanson, Emily K. Miller, Adele Quigley-McBride, Jeffrey L. Foster, Daniel M. Bernstein, and Maryanne Garry asked participants to judge the truth of statements attributed to various people, some of whose names were easier to pronounce than others. Consistently, statements by persons with easily pronounced names were viewed as being more truthful than statements by persons whose names were harder to pronounce. The researchers' conclusion was that subjective, tangential properties can matter when people evaluate sourced information.
Examples
Although the truth effect has been demonstrated scientifically only in recent years, it is a phenomenon with which people have been familiar for millennia. One study notes that the Roman statesman Cato closed each of his speeches with a call to destroy Carthage, knowing that the repetition would breed agreement, and that Napoleon reportedly "said that there is only one figure in rhetoric of serious importance, namely, repetition", whereby a repeated affirmation fixes itself in the mind "in such a way that it is accepted in the end as a demonstrated truth". Others who have taken advantage of the truth effect have included Quintilian, Ronald Reagan, Bill Clinton, Barack Obama, Donald Trump, and Marcus Antonius in Shakespeare's Julius Caesar.
The truth effect plays a significant role in various fields of activity. During election campaigns, false information about a candidate, if repeated in TV commercials, can cause the public to believe it. Similarly, advertising that repeats unfounded claims about a product may boost sales because some viewers may come to think that they heard the claims from an objective source. The truth effect is also used in news media and is a staple of political propaganda.