Cognitive Biases in Fact-Checking and Their Countermeasures: A Review

  • We identify 39 cognitive biases that may compromise the fact-checking process.
  • Through a systematic review, we highlight key categories of cognitive biases influencing human assessors.
  • We propose a set of 11 countermeasures to mitigate the impact of cognitive biases on fact-checking activities.
  • We describe the constituting elements of a bias-aware fact-checking pipeline.

Types of user tasks that may involve cognitive biases:

  • Causal Attribution: tasks involving an assessment of causality.
  • Decision: tasks involving the selection of one over several alternative options.
  • Estimation: tasks where people are asked to assess the value of a quantity.
  • Hypothesis Assessment: tasks involving an investigation of whether one or more hypotheses are true or false.
  • Opinion Reporting: tasks where people are asked to answer questions regarding their beliefs or opinions on political, moral, or social issues.
  • Recall: tasks where people are asked to recall or recognize previous material.
  • Other: tasks which are not included in one of the previous categories.

Flavors of the phenomena that affect human cognition:

  • Association: cognition is biased by associative connections between information items.
  • Baseline: cognition is biased by comparison with (what is perceived as) a baseline.
  • Inertia: cognition is biased by the prospect of changing the current state.
  • Outcome: cognition is biased by how well something fits an expected or desired outcome.
  • Self-Perspective: cognition is biased by a self-oriented viewpoint.

Categorization of cognitive biases:

The task/flavor classification summarized below provides a structured and detailed approach to understanding the multifaceted ways in which cognitive biases can influence the fact-checking process.
By mapping each bias to a specific fact-checking task and cognitive flavor, we aim to offer a comprehensive framework that helps researchers and practitioners identify and address potential biases in their work.

The selected biases, grouped by the fact-checking task they may affect (each bias is also assigned one of the five flavors: Association, Baseline, Inertia, Outcome, Self-Perspective):

  • Causal Attribution: B26. Hostile Attribution Bias; B31. Just-World Hypothesis; B23. Fundamental Attribution Error; B30. Ingroup Bias
  • Decision: B4. Authority Bias; B5. Automation Bias; B22. Framing Effect
  • Estimation: B7. Availability Heuristic; B16. Conjunction Fallacy; B2. Anchoring Effect; B11. Base Rate Fallacy; B14. Compassion Fade; B21. Dunning-Kruger Effect; B35. Overconfidence Effect; B17. Conservatism Bias; B27. Illusion of Validity; B34. Outcome Bias; B32. Optimism Bias; B37. Salience Bias
  • Hypothesis Assessment: B6. Availability Cascade; B29. Illusory Truth Effect; B10. Barnum Effect; B12. Belief Bias; B15. Confirmation Bias; B28. Illusory Correlation
  • Opinion Reporting: B36. Proportionality Bias; B8. Backfire Effect; B9. Bandwagon Effect; B38. Stereotypical Bias; B19. Courtesy Bias
  • Recall: B24. Google Effect; B39. Telescoping Effect; B18. Consistency Bias; B13. Choice-Supportive Bias; B20. Declinism; B25. Hindsight Bias
  • Other: B3. Attentional Bias; B33. Ostrich Effect

Note: The details on the biases presented here are discussed in the list of 39 cognitive biases selected in section 5 of the article, while a full list of the 221 cognitive biases considered is reported in Appendix B.
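
To make the classification concrete, here is a minimal sketch (in Python, written for this post rather than taken from the article) of how the task/flavor taxonomy could be encoded as a data structure. The catalogue entries and, in particular, the flavors assigned to each example bias are illustrative assumptions, not the article's official mapping.

```python
from dataclasses import dataclass
from enum import Enum


class Task(Enum):
    CAUSAL_ATTRIBUTION = "Causal Attribution"
    DECISION = "Decision"
    ESTIMATION = "Estimation"
    HYPOTHESIS_ASSESSMENT = "Hypothesis Assessment"
    OPINION_REPORTING = "Opinion Reporting"
    RECALL = "Recall"
    OTHER = "Other"


class Flavor(Enum):
    ASSOCIATION = "Association"
    BASELINE = "Baseline"
    INERTIA = "Inertia"
    OUTCOME = "Outcome"
    SELF_PERSPECTIVE = "Self-Perspective"


@dataclass(frozen=True)
class Bias:
    code: str       # e.g. "B2"
    name: str
    task: Task      # the user task in which the bias may surface
    flavor: Flavor  # the cognitive phenomenon driving the bias


# Illustrative entries only: the tasks follow the grouping above, while the
# flavor assignments are assumptions made for this example.
BIAS_CATALOG = {
    "B2": Bias("B2", "Anchoring Effect", Task.ESTIMATION, Flavor.BASELINE),
    "B5": Bias("B5", "Automation Bias", Task.DECISION, Flavor.INERTIA),
    "B15": Bias("B15", "Confirmation Bias", Task.HYPOTHESIS_ASSESSMENT, Flavor.OUTCOME),
}


def biases_for_task(task: Task) -> list[Bias]:
    """Return the catalogued biases that may affect a given task."""
    return [b for b in BIAS_CATALOG.values() if b.task == task]
```

A catalogue of this kind makes it easy to look up, for a given assessment task, which biases to monitor and which countermeasures to attach to them.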


The literature allows us to specify 11 countermeasures that can be employed in a fact-checking context to help prevent the cognitive biases outlined above from manifesting.

Countermeasures by task phase, with a brief description and the biases involved:

General Purpose
  • C7. Randomized or constrained experimental design: employ a randomization process when pairing assessors and information items. Biases involved: bias in general.
  • C8. Redundancy and diversity: use more than one assessor for each information item, and a variegated pool of assessors. Biases involved: bias in general.
  • C10. Time: allocate an adequate amount of time for the assessors to perform the task. Biases involved: B2. Anchoring Effect; B9. Bandwagon Effect.

Pre-Task
  • C4. Engagement: put the assessors in a good mood and keep them engaged.
  • C5. Instructions: prepare a clear set of instructions for the assessors before the task. Biases involved: B21. Dunning-Kruger Effect; B35. Overconfidence Effect.
  • C11. Training: train the assessors before the task. Biases involved: bias in general.

During the Task
  • C1. Custom search engine: deploy a custom search engine. Biases involved: bias in general.
  • C2. Inform assessors: inform the assessors about AI-based support systems. Biases involved: B5. Automation Bias.
  • C3. Discussion: synchronous discussion between assessors. Biases involved: bias in general.
  • C6. Require evidence: ask the assessors to provide supporting evidence. Biases involved: B2. Anchoring Effect; B8. Backfire Effect; B11. Base Rate Fallacy; B17. Conservatism Bias; B22. Framing Effect; B27. Illusion of Validity; B28. Illusory Correlation.
  • C9. Revision: ask the assessors to revise the assessments. Biases involved: B2. Anchoring Effect; B3. Attentional Bias; B7. Availability Heuristic; B9. Bandwagon Effect; B11. Base Rate Fallacy; B22. Framing Effect; B37. Salience Bias.

Post-Task
  • C8. Redundancy and diversity: aggregate the final scores. Biases involved: bias in general.
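
As an illustration of how some of these countermeasures could be operationalized in a bias-aware pipeline, the sketch below (Python, written for this post rather than taken from the article) combines C7 (randomized pairing of assessors and information items) with C8 (redundancy: several assessors per item, and aggregation of their scores after the task). All function names, parameters, and data are hypothetical.

```python
import random
import statistics


def assign_assessors(items, assessors, assessors_per_item=3, seed=42):
    """C7 + C8: randomly pair assessors and items, with redundancy.

    Each item is judged by `assessors_per_item` distinct assessors, chosen at
    random so that no assessor is systematically matched with particular items.
    """
    rng = random.Random(seed)
    return {item: rng.sample(assessors, k=assessors_per_item) for item in items}


def aggregate_scores(judgments):
    """Post-task C8: aggregate the redundant scores collected for each item.

    `judgments` maps each item to the list of scores given by its assessors;
    the median is used here as a simple, outlier-robust aggregation.
    """
    return {item: statistics.median(scores) for item, scores in judgments.items()}


# Hypothetical usage with made-up claims, assessors, and scores.
items = ["claim-001", "claim-002", "claim-003"]
assessors = ["a1", "a2", "a3", "a4", "a5"]

assignments = assign_assessors(items, assessors)
# ... each assigned assessor returns one truthfulness score per item (e.g. on a 1-5 scale) ...
judgments = {"claim-001": [2, 3, 2], "claim-002": [5, 4, 4], "claim-003": [1, 1, 2]}
print(aggregate_scores(judgments))  # {'claim-001': 2, 'claim-002': 4, 'claim-003': 1}
```

A constrained variant of C7 could, for example, balance the topics or sources of the items assigned to each assessor; the purely randomized version above is just the simplest case.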

It is important to note that our findings may not be generalizable to all possible fact-checking contexts or human populations, as cognitive biases may manifest differently depending on the specific context and individual differences.

Regarding the set of countermeasures presented, it is difficult to ascertain the extent to which they are effective and generalizable: their effectiveness could be influenced by various factors, such as the specific context, individual differences, and the nature of the misinformation.


Some visuals to conclude 🙂

[Visuals: "Lucy", "Spreadsheets Rules", "Flawed Data"]
