Fact checking is the act of verifying the factual assertions in non-fictional text in order to determine their veracity and correctness. This may be done either before (ante hoc) or after (post hoc) the text has been published or otherwise disseminated.
Ante hoc fact-checking (fact-checking before dissemination) aims to remove errors and allow text to proceed to dissemination (or to rejection, if it fails confirmation or other criteria). Post hoc fact-checking is most often followed by a written report of inaccuracies, sometimes with a visual metric from the checking organization (e.g., Pinocchios from The Washington Post Fact Checker, or Truth-O-Meter ratings from PolitiFact). Several organizations are devoted to post hoc fact-checking, such as FactCheck.org and PolitiFact.
Research on the impact of fact-checking is relatively recent but the existing research suggests that fact-checking does indeed correct misperceptions among citizens, as well as discourage politicians from spreading misinformation.
Post hoc fact-checking
Consistency across fact-checkers
One study finds that the fact-checkers PolitiFact, FactCheck.org, and The Washington Post's Fact Checker overwhelmingly agree in their evaluations of claims.
However, a study by Morgan Marietta, David C. Barker and Todd Bowser found "substantial differences in the questions asked and the answers offered." They concluded that this limited the "usefulness of fact-checking for citizens trying to decide which version of disputed realities to believe."
A paper by Chloe Lim, a Ph.D. student at Stanford University, finds little overlap in the statements that fact-checkers check. Out of 1,065 fact-checks by PolitiFact and 240 fact-checks by The Washington Post's Fact Checker, only 70 statements were checked by both. The fact-checkers gave consistent ratings for 56 of those 70 statements, meaning that roughly one in five times (14 of 70, or 20%) the two fact-checkers disagreed on the accuracy of a statement.
Effects
Studies of post hoc fact-checking have made clear that such efforts often change the behavior of both the speaker (making them more careful in their pronouncements) and the listener or reader (making them more discerning about the factual accuracy of content). Observed effects include audiences being entirely unswayed by corrections of errors on the most divisive subjects, audiences being more readily persuaded by corrections of negative reporting (e.g., "attack ads"), and minds changing only when the individual in error was someone reasonably like-minded to begin with.
Correcting misperceptions
A 2015 study found evidence of a "backfire effect" (correcting false information may make partisan individuals cling more strongly to their views): "Corrective information adapted from the Centers for Disease Control and Prevention (CDC) website significantly reduced belief in the myth that the flu vaccine can give you the flu as well as concerns about its safety. However, the correction also significantly reduced intent to vaccinate among respondents with high levels of concern about vaccine side effects, a response that was not observed among those with low levels of concern." A 2017 study attempted to replicate the findings of the 2015 study but failed to do so.
A 2016 study found little evidence for the "backfire effect": "By and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments." A study of Donald Trump supporters during the 2016 race similarly found little evidence for the backfire effect: "When respondents read a news article about Mr. Trump's speech that included F.B.I. statistics indicating that crime had 'fallen dramatically and consistently over time,' their misperceptions about crime declined compared with those who saw a version of the article that omitted corrective information (though misperceptions persisted among a sizable minority)."
Studies have shown that fact-checking can affect citizens' belief in the accuracy of claims made in political advertisement. A paper by a group of Paris School of Economics and Sciences Po economists found that falsehoods by Marine Le Pen during the 2017 French presidential election campaign (i) successfully persuaded voters, (ii) lost their persuasiveness when fact-checked, and (iii) did not reduce voters' political support for Le Pen when her claims were fact-checked. A 2017 study in the Journal of Politics found that "individuals consistently update political beliefs in the appropriate direction, even on facts that have clear implications for political party reputations, though they do so cautiously and with some bias... Interestingly, those who identify with one of the political parties are no more biased or cautious than pure independents in their learning, conditional on initial beliefs."
A study by Yale University cognitive scientists Gordon Pennycook and David G. Rand found that Facebook tags on fake articles "did significantly reduce their perceived accuracy relative to a control without tags, but only modestly". A Dartmouth study led by Brendan Nyhan found that Facebook tags had a greater impact than the Yale study suggested: a "disputed" tag on a false headline reduced the share of respondents who considered the headline accurate from 29% to 19%, whereas a "rated false" tag pushed that share down to 16%. The Yale study found evidence of a backfire effect among Trump supporters younger than 26, whereby the presence of both untagged and tagged fake articles made the untagged fake articles appear more accurate. In response to research questioning the effectiveness of its "disputed" tags, Facebook dropped the tags in December 2017, instead placing articles that fact-check a fake news story next to the link to that story whenever it is shared on Facebook.
Based on the findings of a 2017 study in the journal Psychological Science, the most effective ways to reduce misinformation through corrections are:
- limiting detailed descriptions of, or arguments in favor of, the misinformation;
- walking through the reasons why a piece of misinformation is false rather than just labelling it false;
- presenting new and credible information which allows readers to update their knowledge of events and understand why they developed an inaccurate understanding in the first place;
- using video, as videos appear to be more effective than text at increasing attention and reducing confusion, and therefore at correcting misperceptions.
A forthcoming study in the Journal of Experimental Political Science found "strong evidence that citizens are willing to accept corrections to fake news, regardless of their ideology and the content of the fake stories."
A paper by Andrew Guess (Princeton University), Brendan Nyhan (Dartmouth College) and Jason Reifler (University of Exeter) found that consumers of fake news, Trump supporters in particular, tended to have less favorable views of fact-checking. The paper also found that fake news consumers rarely encountered fact-checks: "only about half of the Americans who visited a fake news website during the study period also saw any fact-check from one of the dedicated fact-checking websites (14.0%)."
A 2018 study found that Republicans were more likely to correct their false beliefs about voter fraud if the correction came from Breitbart News rather than from a non-partisan source such as PolitiFact.
Political discourse
A 2015 experimental study found that fact-checking can discourage politicians from spreading misinformation, and that it might help improve political discourse by increasing the reputational costs or risks of spreading misinformation for political elites. The researchers sent legislators "a series of letters about the risks to their reputation and electoral security if they were caught making questionable statements. The legislators who were sent these letters were substantially less likely to receive a negative fact-checking rating or to have their accuracy questioned publicly, suggesting that fact-checking can reduce inaccuracy when it poses a salient threat."
Political preferences
One experimental study found that fact-checking during debates affected viewers' assessments of the candidates' debate performance and produced "greater willingness to vote for a candidate when the fact-check indicates that the candidate is being honest."
A study of Trump supporters during the 2016 presidential campaign found that while fact checks of false claims made by Trump reduced his supporters' belief in the false claims in question, the corrections did not alter their attitudes towards Trump.
Controversies and criticism
Political fact-checking is sometimes criticized as being opinion journalism. In September 2016, a Rasmussen Reports national telephone and online survey found that "just 29% of all Likely U.S. Voters trust media fact-checking of candidates' comments. Sixty-two percent (62%) believe instead that news organizations skew the facts to help candidates they support."
Organizations and individuals
The Reporters' Lab at Duke University maintains a database of fact-checking organizations that is managed by Mark Stencel and Bill Adair. The database tracks more than 100 non-partisan organizations around the world. The Lab's inclusion criteria are based on whether the organization
- examines all parties and sides;
- examines discrete claims and reaches conclusions;
- tracks political promises;
- is transparent about sources and methods;
- discloses funding/affiliations;
- has news and information as its primary mission.
Africa
- Africa Check: Africa's first independent fact-checking organisation, with offices in Kenya, Nigeria, South Africa, Senegal and the UK, checking claims made by public figures and the media in Africa.
India
- Boom is a fact checking digital journalism website.
- SMHoaxSlayer is a broad-spectrum fact-checking website that verifies social media hoaxes and scams circulating in India.
Iran
- Gomaneh: an online Persian magazine devoted to the investigation of rumours and hearsay.
Japan
- GoHoo: Launched by the nonprofit association Watchdog for Accuracy in News-reporting, Japan (WANJ) on November 16, 2014. Crowd-funded approximately 1.6 million yen through Ready For. Awarded the Social Business Grand Prize in summer 2012.
- Japan Center of Education for Journalists (JCEJ): Fosters journalists and fact checkers by referring to a Journalist's Guide to Social Sources published by First Draft News, a project of the Harvard Kennedy School's Shorenstein Center. JCEJ itself also debunks falsehoods.
Europe
- BBC Reality Check
- Full Fact: An independent fact checking organisation based in the UK which aims to "promote accuracy in public debate", launched in 2009.
- The FactCheck blog: A fact checking blog run by the Channel 4 News organization in the UK.
- Les Décodeurs: French fact-checking blog run by Le Monde.
- Pagella Politica: an Italian fact-checking website.
- Ellinikahoaxes.gr: a Greek fact-checking website launched in 2013 that debunks hoaxes, urban legends, fake news, internet scams and other stories of questionable origin.
- Factchecker.gr: an independent Greek fact-checking website launched in February 2017, specializing in pseudoscience and medical frauds. Affiliated with Ellinika Hoaxes.
- Bufale.net: an Italian fact-checking website
- Ferret Fact Service: Scotland's first fact-checker launched in April 2017 after a grant from the Google Digital News Initiative.
- Mimikama: Austrian fact-checking website which mainly focuses on Facebook hoaxes in the German and Dutch language area.
- Miniver.org: the first dedicated fact-checking website in Spain, launched in 2017 with the purpose of debunking fake news. Accredited by Google as a fact-checking organization.
Latin America
- Argentina: Chequeado.com
- Central America: Rete al candidato
- Brazil: E-farsas
United States
- FactCheck.org and FactCheckEd.org: non-partisan, nonprofit sister websites, self-described as an "advocate for voters that aims to reduce the level of deception and confusion in U.S. politics" and serving as an educational resource for high school teachers and students, respectively (the latter founded in 2005). They are projects of the Annenberg Public Policy Center of the Annenberg School for Communication at the University of Pennsylvania, and are funded primarily by the Annenberg Foundation.
- Fact Checker (The Washington Post): A project of The Washington Post, known for grading politicians on the factual accuracy of their statements with zero to four "Pinocchios." Created September 2007 by Post diplomatic writer Michael Dobbs specifically for the 2008 presidential campaign. Ceased operation 4 November 2008, but relaunched with a broader focus in January 2011, led by veteran Post diplomatic correspondent Glenn Kessler.
- Our.News crowdsources fact-checking from users for any news article, and allows users to rate news for spin, trust, accuracy, and relevance. Fact sources are both user-contributed and automatically scraped. Publisher and author information and statistics are also provided.
- PolitiFact.com: A service of the Tampa Bay Times, created in August 2007, which uses the "Truth-O-Meter" to rate the amount of truth in public persons' statements. Winner of a 2009 Pulitzer Prize.
- Snopes.com focuses on, but is not limited to, validating and debunking urban legends and other stories in American popular culture.
- TruthOrFiction.com validates and debunks urban legends, Internet rumors, e-mail forwards, and other stories of unknown or questionable origin.
- RealClearPolitics' Fact Check Review aims to critique the work of fact-checking organizations such as those listed above. In its inaugural review item on April 9, 2018, RCP writer Kalev Leetaru said its efforts at "checking the fact checkers" were to "explore how the flagship fact-checking organizations operate in practice (as opposed to their self-reported descriptions), from their claim and verification sourcing to their topical focus to just what constitutes a 'fact.'" Leetaru is a Georgetown University fellow in residence, holding the chair established there for the study and promotion of "international values, communications technology and the global Internet."
Ante hoc fact-checking
Among the benefits of printing only checked copy is that it averts serious, sometimes costly, problems such as lawsuits and discreditation. Fact checkers are primarily useful in catching accidental mistakes; they are not guaranteed safeguards against those who wish to commit journalistic fraud.
The possible societal benefit of honing the fundamental skill of fact checking has been noted in a round table discussion by Moshe Benovitz, who observes that "modern students use their wireless worlds to augment skepticism and to reject dogma," but goes on to argue that this has positive implications for values development. He argues:
"We can encourage our students to embrace information and vigorously pursue accuracy and veracity. Fact checking can become a learned skill, and technology can be harnessed in a way that makes it second nature... By finding opportunities to integrate technology into learning, students will automatically sense the beautiful blending of... their cyber... [and non-virtual worlds]. Instead of two spheres coexisting uneasily and warily orbiting one another, there is a valuable experience of synthesis...".
He closes by noting that this constitutes "new opportunities for students to contribute to the discussion like never before, inserting technology positively into academic settings" (rather than it being seen purely as an agent of distraction).
Controversy
One journalistic controversy involves the disgraced reporter and admitted plagiarist Stephen Glass, who began his journalism career as a fact-checker. The fact checkers at The New Republic and other weeklies for which he worked never flagged the numerous fictions in Glass's reporting. Michael Kelly, who edited some of Glass's concocted stories, blamed himself rather than the fact-checkers, saying: "Any fact-checking system is built on trust ... If a reporter is willing to fake notes, it defeats the system. Anyway, the real vetting system is not fact-checking but the editor."
Individuals
- Sarah Harrison Smith worked in, and eventually headed, the fact-checking department at The New York Times. She is the author of The Fact Checker's Bible.
- Jim Fingal worked for several years as a fact-checker at The Believer and McSweeney's and is co-author, with John D'Agata, of The Lifespan of a Fact, an inside look at the battle between Fingal as fact-checker and D'Agata as author over one of D'Agata's essays, which pushed the limits of the "artistic license" acceptable in a work of non-fiction.
Alumni of the role
Many individuals have been reliably reported to have played such a fact-checking role at some point in their careers, often as a stepping stone to other journalistic endeavors or to an independent writing career.
Further reading
- The Poynter Institute's summary of research on fact-checking.
- Silverman, Craig (23 October 2007). Regret The Error: How Media Mistakes Pollute The Press And Imperil Free Speech. Penguin Canada. ISBN 9780143186991.
- Amazeen, Michelle (2015) "Monkey Cage: Sometimes political fact-checking works. Sometimes it doesn't. Here's what can make the difference," The Washington Post (online), 3 June 2015, accessed 27 July 2015.
- Davis, Katy (2012) "Study: Fact-checkers disagree on who lies most," The Center for Media and Public Affairs (CMPA), George Mason University (online, press release), 22 October 2012.
- Lewis-Kraus, Gideon (2012) "RIFF: The fact-checker versus the fabulist," The New York Times Magazine (online), 21 February 2012 (print edition, 26 February 2012, p. MM45, title "I Have Taken Some Liberties").
- Heffernan, Virginia (2010) "The Medium: What 'fact-checking' means online," The New York Times Magazine (online), 20 August 2010 (print edition, 22 August 2010, p. MM14). Accessed 27 July 2015.
- Silverman, Craig (2010) "Top fact checkers and news accuracy experts gather in Germany," Regret the Error (online), 4 September 2010, accessed 28 July 2015. Cited by Tobias Reitz & Kersten Alexander Riechers (2011) "Quo vadis Qualitätssicherung? Corrigo, Konzeption eines Crowdsourced Media Accountability Services," p. 151, Fachbereich Media, 31 May 2011 (Hochschule Darmstadt, University of Applied Sciences), accessed 28 July 2015.
- Bergstrom, Carl T. and Jevin West "Calling Bullshit: Data Reasoning in a Digital World." Online Lecture INFO 198 / BIOL 106B, 2017, University of Washington.
- Sagan, Carl; Druyan, Ann (1995). "The Fine Art of Baloney Detection". The Demon-Haunted World: Science as a Candle in the Dark. Random House. pp. 201-218.
- Adler, Mortimer J.; Doren, Charles Van (1972) [1940]. "Agreeing or Disagreeing with an Author". How to Read a Book: The Classic Guide to Intelligent Reading (Revised ed.). New York: Simon & Schuster. pp. 154-167.
After he has said 'I understand but I disagree,' he can make the following remarks to the author: (1) 'You are uninformed'; (2) 'You are misinformed'; (3) 'You are illogical; your reasoning is not cogent'; (4) 'Your analysis is incomplete.'
- "Rapidly expanding fact-checking movement faces growing pains", Washington Post, 25 June 2018
External links
- Duke Reporters' Lab
- RealClearPolitics' Fact Check Review