
5 Activities to teach your students how to spot fake news

By Matt Phillpott


How do you spot fake news? “Fake news”: these two words have trended in the last decade as a way of describing news and information that is false. It is not as simple as that, though. Other terms, such as misinformation, disinformation, propaganda, satire, hoaxes, and conspiracy theories, describe something very similar and have been around for much longer. They do not, however, convey the snappy, dismissive air conjured by the words “fake news.” To understand this trend, it is important to realize that there is no single description of what fake news is. In reality, fake news is three separate things:

  • Stories that are not true (making people believe something entirely false);
  • Stories that are partially true (a deliberate attempt to convince a reader of a viewpoint using skewed information or opinion);
  • A tactic used to discredit other people’s views (to make another person’s opinion or even facts appear to someone else to be false, even when there is no sign that this is the case).

What all three descriptions have in common is the attempt to confuse and misdirect. As such, the tools that we need to teach our students are the same ones that we use to help them assess information and conduct their own research for assignments.

Five activities to teach students how to spot fake news

Students must learn skills and capabilities to check the quality, bias, and background of news they encounter daily. Thus, when instructing students how to spot fake news, we need to teach them to:

  • Develop a critical mindset (consider bias, quality, sensationalism, and the date of creation);
  • Ask: why has this been written, and by whom?;
  • Check the source of the story;
  • Check elsewhere to see if the story appears in more than one place. 

With all of this in mind, how should you approach this topic in the classroom? Here are five ideas that will help you navigate this challenging subject:

1. The News Comparison exercise

I love this one. Ask your students to select three or four national news websites. However, there’s a catch: they should include sites they would generally avoid because those sites conflict with their opinions. Next, ask them to select the main news item of the day, visit each of the websites, and compare the articles to one another.

What is different? What is similar? Your students can write a short reflection about what they have found and then discuss how they might feel about the topic if only one of these sites were their sole source of news. You might wish to take this one step further by asking them to “fact-check” the news story using one of the fact-checking websites such as FactCheck.org, Snopes.com, the Washington Post Fact Checker, or PolitiFact.com. While not specifically about fake news, this exercise helps students understand the nature of news and the variability and quality of the information found online.

Read more: Digital reflection tools your students can use in class

2. Google Reverse Image Search

Images are just as likely as text to have been falsified or altered. Set your students a task to trace the history of an image through Google Reverse Image Search:

  • Right-click on an image on a website and copy the image address, or select an image on your hard drive;
  • Go to Google Reverse Image Search;
  • Click on the camera icon, then paste the image URL into the search field or upload your image from your hard drive;
  • The results will show you where the image has appeared online.

This allows you to see where that image has appeared online (context) and to find similar images (which might reveal that it has been doctored).
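For older students, the steps above can even be scripted: given a copied image address, you can build the search link directly. Here is a minimal sketch in Python; the `searchbyimage` endpoint is the long-standing URL form and an assumption here (Google may redirect it to Google Lens in current browsers), and the example image address is made up.

```python
from urllib.parse import urlencode

def reverse_image_search_url(image_url):
    """Build a Google reverse-image-search link from a copied image address."""
    # The /searchbyimage endpoint is the long-standing URL form; Google may
    # redirect it to Google Lens. The query parameter carries the image URL.
    return "https://www.google.com/searchbyimage?" + urlencode({"image_url": image_url})

print(reverse_image_search_url("https://example.com/shark-in-living-room.jpg"))
```

Opening the printed link in a browser performs the same trace as the manual steps: it shows where the image has appeared online and surfaces visually similar images.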

3. LMS Quiz

Another way to engage students with issues around fake news is to develop a quiz on your learning management system (LMS) which asks students to spot fake news. Here are a few sample questions you might wish to use:

Q: Is this a photograph of how MGM created the legendary MGM intro of a lion roaring?

A: Show students the supposed MGM version, then reveal the real version, which is a 2005 picture of a lion receiving a CAT scan. You can find plenty of other examples online.

Q: In the lead-up to the 2016 US Presidential election, Pope Francis broke papal tradition by endorsing the US Presidential candidate Donald Trump. True or False?

A: False. This fake story appeared on the now-defunct website WTOE 5 News and spread from there. Reuters and other reputable news sources confirmed that the news was false. See the fact check on this story.

Q: NASA plans to install internet on the Moon. True or False?

A: True. NASA plans to build a 4G network on the Moon to help it control lunar robots. This effort is called LunaNet. Find out more on the NASA website and see one of the news stories on CNBC.
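If your LMS lets you build question banks programmatically, the sample questions above can be kept as plain data with an answer key. This is a minimal sketch; the field names ("prompt", "answer", "explanation") are illustrative assumptions, not any particular LMS's import format.

```python
# True/false quiz items drawn from the sample questions above.
QUESTIONS = [
    {
        "prompt": "Pope Francis broke papal tradition by endorsing Donald Trump "
                  "before the 2016 US Presidential election. True or False?",
        "answer": False,
        "explanation": "A fake story from the now-defunct website WTOE 5 News.",
    },
    {
        "prompt": "NASA plans to install internet on the Moon. True or False?",
        "answer": True,
        "explanation": "NASA's LunaNet effort aims to network lunar missions.",
    },
]

def grade(responses):
    """Count how many true/false responses match the answer key."""
    return sum(r == q["answer"] for r, q in zip(responses, QUESTIONS))

print(grade([False, True]))  # a student who spots both correctly scores 2
```

Keeping the explanation alongside each answer makes it easy to show students *why* a story was fake, not just that it was.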

Read more: 9 Types of assignments teachers can create in their LMS to evaluate student progress

4. Discuss research about fake news

Scholars have published articles about fake news in recent years; examples include Apuke and Omar (2021), Tsfati et al. (2020), and Leeder (2019). Select one or two articles for students to read and appraise, then discuss the points raised in the classroom, asking them questions about how to spot fake news and fake websites.

As a homework assignment, ask students to investigate a current fake news story and compare their findings to the research. You might ask them to upload a brief response on an LMS forum , blog, or digital portfolio as an additional exercise. Or, perhaps, to create a poster to advertise to their peers why they should not fall for it.

5. Make up a fake news story

This is a fun exercise for a lesson on spotting fake news. Divide your class into two groups:

Group A: write a fake news story;

Group B: write a real story.

Ensure that they do not reveal which group they are in. Ask each student to write a short 500-word news story and then post it to the LMS forum or blog. Students in Group A should invent their story but add three elements of truth to it. Students in Group B should write about a real story that they find online from a reputable source (but one that is on a niche topic). Once done, divide your class into small groups, and ask them to read through the stories, discuss them, and label each one as Fake News or Real. Bring the class together to discuss the results, and ask each student to update their forum/blog post to identify it as Fake or Real.
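One practical wrinkle is keeping the group split secret. A quick way is to shuffle the roster and split it in half, so neither you nor the students can infer a pattern. A minimal sketch, assuming a hypothetical class roster; the function name and names are made up for illustration.

```python
import random

def assign_groups(students, seed=None):
    """Secretly split the class: group 'A' invents a fake story (with three
    true elements), group 'B' writes up a real but niche story."""
    rng = random.Random(seed)          # seed makes the split reproducible
    shuffled = list(students)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {name: ("A" if i < half else "B") for i, name in enumerate(shuffled)}

groups = assign_groups(["Ana", "Ben", "Caro", "Dev"], seed=42)
print(groups)  # each student maps to "A" or "B"; half the class in each group
```

Message each student their letter privately (e.g. via the LMS) rather than announcing the lists aloud.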

Learning how to spot fake news is a crucial skill 

Fake news is a real challenge for educators. However, by teaching research skills to students, it becomes easier for them to identify misinformation and assess the quality of their sources. If there is one more piece of advice I would offer, it is to make these activities as real as possible for students. Let them discover information themselves using the tools that they would normally use, and focus on how they share stories.

In addition to the activities listed above, you might wish to try exercises created by Noah Tavlin, Vicki Davis, or Terry Heick. SFU Library also provides a nice infographic and some videos about fake news, and Mindtools provides some useful examples.



“Fake News” Resources

Great resources to help you avoid “fake news” and foster critical thinking. Book a virtual or in-person presentation.

JOYCE PRESENTING

Joyce Grant, TKN’s co-founder, gives high-energy, engaging virtual presentations for students and for educators, as well as keynotes, on “fake news”: how to spot it and how to teach about it.

Her passion for the subject and her broad base of knowledge, built over more than a decade, combined with her background as a journalist, make for interesting, exciting presentations.

Find out more about how to book Joyce Grant for a virtual or an in-person visit. Click here or contact [email protected] .

ONLINE GAMES

These may be the best way (they’re certainly the most fun) to learn about fake news and journalism.

Click here or scroll for great online games. We highly recommend these games because they’re fun, educational, and support your teaching.

ARTICLES & RESEARCH

Smart, succinct, readable articles about fake news and how to help young people spot it.

Click here or scroll for excellent articles and research studies about fake news. Use these in the classroom or as background information for your own learning.

MEDIA LITERACY ORGANIZATIONS

Our curated collection of the best websites, fact-checkers and organizations to support you in learning, and teaching, about fake news.

Click here or scroll for fake-news fighting organizations. They have tons of resources.



BBC iReporter may be the single best online resource for learning about how real-world journalism works. Highly recommended.

In BBC iReporter you’re a BBC journalist covering breaking news and have to decide whether or not to post things on social media. It’s real-world and it’s exciting. It will help young people to understand the pressures on journalists to be accurate and at the same time, publish news in a timely manner.


Fake or Real? Look at the headline and then guess if the article is real … or fake news.

The Guardian has put together a fun and interesting group of real and fake headlines for this simple-to-use quiz .


Harmony Square is an online game in which you’re the bad guys, posting fake news and sowing disharmony. It’s silly and fun, though it involves quite a bit of reading.

https://inoculation.science/inoculation-games/harmony-square/


“Go Viral!” is a game about medical misinformation. You are the fake news creator, and you win by going viral with lies and fake news. (By the same people who brought you Breaking Harmony Square, above.)

Not for everyone, because it shows you bad habits in the hopes that you’ll recognize good ones. Older students (grade 7+) may come away with some solid ideas about misinformation that will help them spot it in the future.

Spot the Troll

Can you tell if a commenter or poster is a troll, a bot or a real human? It’s important to know the difference so you don’t accidentally share their deliberately created fake news!

This is a really great (and fun) game that not only teaches you what to look for, but lets you guess to see if you’re right. https://spotthetroll.org/start


Doubt It or Trust It, by the Canadian Journalism Foundation’s NewsWise: an online game in which you guess whether something is fake or real.


Fake or Foto? Take a look at the image and then take a guess. You’ll get the answers at the end.

https://area.autodesk.com/fakeorfoto

We’ve used this very successfully with classes, both in-person and online. Young people love trying to out-guess the adults (and they usually do)!


Play “Reality Check” by Media Smarts and learn how to check whether something is fake or real.

http://mediasmarts.ca/sites/mediasmarts/files/games/reality-check/index.html#/


Play FakeOut (CIVIX/Newsliteracy.ca): https://newsliteracy.ca/fakeOut


Is it Real or Photoshopped? (by Adobe).

https://landing.adobe.com/en/na/products/creative-cloud/69308-real-or-photoshop/index.html


In Bad News (Junior), created by a team of academics from Cambridge University and media experts, you become a fake news creator. Can you go viral with your lies and exaggerations? The aim is to teach what “not” to do, by teaching what the baddies do.

http://getbadnews.com/droggame_book/junior/#intro


University of Akron: Fake News Quiz

https://akron.qualtrics.com/jfe/form/SV_2bhqIwpegOtj5yZ


Fake News board game. We played this several times with different age groups and found it fun and educational, with lots of interesting facts for the curious. Worth the price ($20 US from BreakingGames.com; Amazon Canada had it online for a while at $12 CDN, but we can’t find it there now. It’s likely online in other places. Let us know if you can find it in Canada, please.)


In Factitious2020 you try to figure out which articles are fake and which are real.

It’s okay, but a bit wordy, with articles that are fairly lengthy and involved. I’m not sure that just looking at the name of a source would truly help you figure out whether something’s fake; you’d probably also have to do some Googling. But if you want to do a deep dive into teaching about whether an article is fake or not, and talk about sources, this game provides some excellent articles to work with.


Can You Believe It?: How to Spot Fake News and Find the Facts

Written by Teaching Kids News’ co-founder, Joyce Grant and beautifully illustrated by Kathleen Marcotte; published by Kids Can Press in 2022 and suitable for young people 9 to 12 as well as classrooms.

You can buy this illustrated non-fiction book in most independent bookstores or from one of the big chains including Amazon.com and Chapters Indigo .

For today’s tech-savvy kids, here’s the go-to resource for navigating what they read on the internet.

Should we believe everything we read online? Definitely not! And this book will tell you why. This fascinating book explores in depth how real journalism is made, what “fake news” is and, most importantly, how to spot the difference. It’s chock-full of practical advice, thought-provoking examples and tons of relevant information on subjects that range from bylines and credible sources to influencers and clickbait.

For more information and to purchase

MIT study published in Science magazine shows misinformation travels faster than the truth (2018)

Interesting follow-up: The Atlantic published an article saying the MIT study, which had been widely shared, was itself misinformation. AND THEN The Atlantic walked that back, saying the MIT study was valid after all.

All of this “misinformation” was duly retweeted and shared.

https://news.mit.edu/2018/study-twitter-false-news-travels-faster-true-stories-0308

https://www.theatlantic.com/technology/archive/2022/03/fake-news-misinformation-mit-study/629396/

Wonder why people believe in conspiracy theories? This article gives insight into who these people are and why they believe seemingly ridiculous claims.

https://medium.com/jigsaw/7-insights-from-interviewing-conspiracy-theory-believers-c475005f8598

If you’re teaching about POV or “filter bubbles,” this 2-minute BBC video is a must watch.

Journalist Steve Rosenberg is one of the BBC’s main Russia correspondents. He lives in, and reports on, Russia. His video shows very clearly why we all must break out of our “filter bubble” or “information silo.” Russians who only listen to the state TV there have a very different perspective on the invasion of Ukraine (btw, the word “invasion” is banned on Russian state TV) than those who get their news from outside sources via the Internet.

A story of two Russias. We report on Russians who believe what state TV is telling them about Ukraine. And those who don’t, and who are leaving the country. Camera @AntonChicherov Producer @BBCWillVernon @BBCNews @BBCWorld https://t.co/TYRf0LYINR pic.twitter.com/Wbr4m4bNOU — Steve Rosenberg (@BBCSteveR) March 1, 2022


This is a must-read. If you’re an educator interested in media literacy, this is well worth your time.

Click here: STANFORD STUDY ABOUT KIDS AND FAKE NEWS .

Researchers at Stanford University wanted to know if kids can recognize–and avoid–fake news. The bottom line: not very well.

The study explains what kids were asked to do (i.e., tell the difference between an ad and an advertorial) and how they approached it. The study looks at kids who were good at critical thinking (i.e., uncovering fake news), kids who were okay but still had much to learn, and kids who weren’t able to distinguish between real and fake news. It’s a fascinating look at kids and media, and many of the results may surprise you.


Filter Bubbles, by Eli Pariser

In this 2011 TED Talk (9 mins), activist Eli Pariser talks about what he calls “filter bubbles.” We at TKN call them “silos,” but it’s the same thing: how social media give you more of what you like, and less (to the point of nothing) of what you’re not clicking on.

We need to consciously seek out information that’s not in our bubble.


Really good NPR cartoon about how to spot fake news. https://www.npr.org/2020/04/17/837202898/comic-fake-news-can-be-deadly-heres-how-to-spot-it (May 2021) Illustration by Connie Jin.


Great article by The Walrus’ fact-checker Viviane Fairbank. She talks about why people share news they don’t even believe, and about the importance of showing people how “real” news is gathered, something we’ve been advocating for years. A must-read if you’re interested in facts and news and in helping young people understand the difference.

https://thewalrus.ca/how-do-we-exit-the-post-truth-era/ (May 2021)

Art by JOSH HOLINATY (Image excerpt used with permission.)


“We must save democracy from conspiracies” TIME magazine article by Sacha Baron Cohen, Oct. 8, 2020. Cohen, as you may know, is a satirist whose humour is often, as he puts it, prepubescent. Nevertheless, he is brilliant and in his article, has this to say about satire: “When it works, satire can humble the powerful and expose the ills of society.”

https://time.com/5897501/conspiracy-theory-misinformation/


“Time to act on newsroom inequality” — Toronto Star

https://www.thestar.com/opinion/public_editor/2020/06/11/time-to-act-on-newsroom-inequality.html


Interesting BBC article about “Why (even) smart people believe coronavirus myths.”


Cognitive immunity. This PDF contains an infographic that is a virtual master class in all aspects of disinformation. We’re still making our way through it ourselves, but if there’s a critical-thinking topic you have never heard of before, chances are it’s here: https://www.iftf.org/fileadmin/user_upload/downloads/ourwork/IFTF_ODNI_Cognitive_Immunity_Map__2019.pdf


Great article (“Fighting Fake News in the Classroom”) in the American Psychological Association journal about teaching about misinformation. It’s a comprehensive article that includes tons of resources and links to great information about how to get kids thinking about this important subject. https://www.apa.org/monitor/2022/01/career-fake-news


CBC article on how to (tactfully) discourage the spread of false pandemic information in chats and email

https://www.cbc.ca/news/canada/covid-19-misinformation-rumour-1.5532302


This article, on a website called #30 Seconds to Check It Out, talks about various forms of fake news:

https://web.archive.org/web/20201125174822/https://30secondes.org/en/module/what-is-fake-news/


This Forbes article talks about microtransactions in games. You know, how games get harder and suddenly you want to buy (with real money) that “thing” that will help you get to the next level. Tell your kids it’s not them, it’s a deliberate business strategy. (Note: I don’t love the headline on this Forbes article–I’m not sure Mario Kart needs “two big warnings.” I don’t think the article really echoes the headline; but the point is a good one.)

https://www.forbes.com/sites/davidthier/2019/09/27/two-big-warnings-about-mario-kart-tour-on-ios-and-android

The Toronto Star’s Classroom Connections has a great series of one-pagers about journalism (TKN’s Joyce Grant is a contributing writer and editor). They’re free to download and they include curriculum questions.

Teaching how “real journalism” is done is key to helping young people understand what “fake news” looks like and how it falls short.

Click here for For the Record: https://www.classroomconnection.ca/for-the-record.html


There is a nice infographic here, by Simon Fraser University, with eight “simple steps” to spotting fake news — not that spotting fake news is that simple, but these offer a good starting point.

https://www.lib.sfu.ca/help/research-assistance/fake-news


Article from Webwise, which Google calls “the Irish Internet Safety Awareness Centre.” A good article and well written: clear and succinct.

https://www.webwise.ie/teachers/what-is-fake-news/


Sometimes, it’s just nice to hear it from librarians. Here’s much of the same information about what fake news is, this time from the Enoch Pratt Free Library in Baltimore, Maryland, US.

https://www.prattlibrary.org/research/guides/spotting-fake-news


So interesting. In 2016, The New York Times followed a fake news tweet to show how it started and how it went viral. It started as a tweet by Eric Tucker on Nov. 9. It was then posted by someone else on Reddit and on a conservative discussion forum, was shared 5,000 times, and was posted on a Facebook page with 300,000 users.

On Nov. 10, US president-elect Donald Trump tweeted about “the protesters,” which “emboldened” Tucker to think maybe he had something after all, if the president-elect was tweeting about it.

Anyway, read the article for the full story; it’s pretty interesting stuff. Tucker eventually did republish his tweet with “FALSE” stamped on it, but that correction was retweeted just 29 times.

https://www.nytimes.com/2016/11/20/business/media/how-fake-news-spreads.html?_r=0


Interesting 2016 NPR article (and audio clip) about how they tracked down the creator of one specific fake news article: “FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide.”

“Everything about it was fictional… the town, the sheriff…,” says the “fake news entrepreneur” who created it.

NPR tracked him down in Los Angeles; he makes money from ads on his website. How much money? He wouldn’t say for himself but he said that “$10,000 to $30,000 a month is ‘in the ballpark.'”

https://www.npr.org/sections/alltechconsidered/2016/11/23/503146770/npr-finds-the-head-of-a-covert-fake-news-operation-in-the-suburbs


This Common Sense Media article talks about deepfakes (videos created, using AI, that make it look like someone is doing or saying something they didn’t). Note that the video they link to as a deepfake example is NOT suitable for children, because it contains swearing and other inappropriate language.

https://www.commonsensemedia.org/blog/common-sense-explains-what-are-deepfake-videos


Article in Rolling Stone magazine about Russia’s massive involvement in the dissemination of “fake news,” as well as how fake news plays on our emotions: not just provoking “shocked” or “sad” reactions, but “smug” and “happy” ones.

https://www.rollingstone.com/politics/politics-features/russia-troll-2020-election-interference-twitter-916482


This article, on the Thompson Rivers University (TRU) website, provides a good, simple run-down of the basics about fake news. It also gives links to sources if you’d like to go a bit more in-depth.

https://libguides.tru.ca/fakenews/falling


A Feb. 15, 2019 study (the Edelman Trust Barometer) says 71 per cent of Canadians are worried about “fake news.”

Here is an excellent Global News article that analyses the study.


A fun and quick video that really drives home why it’s so important to CHECK YOUR FACTS . By the News Literacy Project. You will want to share this with your students.


This article in THE SACRAMENTO BEE talks about a new bill proposed in California, to develop “statewide school standards on internet safety and digital citizenship, including cyberbullying and privacy.”

Great article in the NEW YORK TIMES ABOUT PRINT VS. DIGITAL NEWS: the reporter “took a step back in time,” as he puts it, and read news only from printed newspapers for two months. Here are his fascinating insights about the good and the bad of print vs. digital news. https://www.nytimes.com/2018/03/07/technology/two-months-news-newspapers.html

From the article: “Real life is slow; it takes professionals time to figure out what happened, and how it fits into context. Technology is fast. Smartphones and social networks are giving us facts about the news much faster than we can make sense of them, letting speculation and misinformation fill the gap.”


VANESSA OTERO’S AWESOME MEDIA BIAS CHART  can help you plot your favourite “real news” sources. (NOTE: Don’t use her chart without crediting Vanessa Otero and/or linking to her website, AllGeneralizationsAreFalse.com– her chart is copyrighted .)

In fact, you should check out VANESSA OTERO’S WEBSITE, ALLGENERALIZATIONSAREFALSE.COM  for tons of great information on bias and the news.


HOW TO DO A GOOGLE REVERSE IMAGE SEARCH  (includes links to places where you can do this).

Here’s why a reverse-image search is useful: Let’s say there’s a picture of a living room with a real live shark swimming around in it! You might want to double-check whether that’s a real image or one that’s been Photoshopped. You can do a reverse-image search and find the original image.

If it’s, say, a living room with no water and no shark, then you’ve got your answer!


THE HOUSE HIPPO: “That looked really real, but you knew it couldn’t be true.” One of our favourite go-to videos about critical thinking; it still holds true today (even more, in fact). By Concerned Children’s Advertisers. YouTube video, 1:02.

The House Hippo video has had a makeover by Media Smarts! Check out House Hippo 2.0 here:


BBC’S INSIDE LOOK AT THE WHITE HOUSE PRESS CORPS  and how they cover the president . (YouTube, 13:49 but totally worth the time–so interesting.)

This shows you just how difficult it can be to get the information you need to write your news article.


Who are the people who create fake news? The University of Massachusetts Amherst and the University of Leeds in the UK teamed up to find out. Read THEIR INTERESTING REPORT, “ARCHITECTS OF NETWORKED DISINFORMATION” on who these people are, and why they do what they do.

Download executive summary  HERE . 

Download full report  HERE .


What’s FACEBOOK DOING TO DISCOURAGE FAKE NEWS? This is Mark Zuckerberg’s statement.

One of the things they’re doing is asking people to rank a source’s trustworthiness; they’re hoping that will help to identify some fake news sources. (If you Google this topic, you’ll see lots of columns and insights on what these new Facebook initiatives may mean. It’s an ongoing project, and these are merely the early stages.)


Stanford University published this STUDY ABOUT THE INFLUENCE OF SOCIAL MEDIA AND FAKE NEWS IN THE 2016 US ELECTION.


Common Sense Media’s video, 5 WAYS TO SPOT FAKE NEWS .


“How Stuff Works” — a pretty good little flipchart-type presentation: 10 WAYS TO SPOT FAKE NEWS .


THREE LESSON PLANS  about “fake news” from Cool Cat Teacher.com.

Study suggests that “lies spread faster than the truth” on Twitter. “Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information…” according to researchers Vosoughi  et al. https://science.sciencemag.org/content/359/6380/1146.full

An NPR article discusses the above study, which found that false stories on Twitter are about 70 percent more likely to be retweeted than true ones. https://www.npr.org/sections/alltechconsidered/2018/03/12/592885660/can-you-believe-it-on-twitter-false-stories-are-shared-more-widely-than-true-one
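To see what a "70 percent more likely" statistic means in practice, here is a toy calculation of relative retweet likelihood. The counts are invented for illustration, not the study's data:

```python
# Invented counts illustrating the kind of comparison the researchers made:
# how much more likely is a false story to be retweeted than a true one?
false_seen, false_retweeted = 10_000, 340
true_seen, true_retweeted = 10_000, 200

false_rate = false_retweeted / false_seen
true_rate = true_retweeted / true_seen
relative_increase = (false_rate - true_rate) / true_rate

print(f"False stories were {relative_increase:.0%} more likely to be retweeted")
```

With these made-up counts the false stories come out 70 percent more likely, matching the shape of the finding reported in the study.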

screen illustration

The Poynter Institute trains journalists and is a journalism watchdog for ethics and best practices. Its media literacy arm, MediaWise, supports the education of young people about journalism, fact-checking and fake news.

On this page, they list a number of resources including games that teach about fake news.


Check out the BBC’s broad collection of excellent fake news resources: https://www.bbc.co.uk/beyondfakenews/


The Toronto Star’s Washington correspondent, DANIEL DALE, is @ddale8 on Twitter. He’s in the trenches, uncovering facts, digging around and fighting fake news on a daily basis.

*We can’t fully vouch for third-party links – the ones above are all useful, in our opinion, but their owners could change the material on them at any time, etc. etc. etc., blah, blah, blah. (Please excuse the legalese.) Also, if you have an excellent resource about “fighting fake news” please let us know on OUR FACEBOOK PAGE.


Fake news and the spread of misinformation: A research roundup

This collection of research offers insights into the impacts of fake news and other forms of misinformation, including fake Twitter images, and how people use the internet to spread rumors and misinformation.



This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License .

by Denise-Marie Ordway, The Journalist's Resource September 1, 2017


It’s too soon to say whether Google’s and Facebook’s attempts to clamp down on fake news will have a significant impact. But fabricated stories posing as serious journalism are not likely to go away as they have become a means for some writers to make money and potentially influence public opinion. Even as Americans recognize that fake news causes confusion about current issues and events, they continue to circulate it. A December 2016 survey by the Pew Research Center suggests that 23 percent of U.S. adults have shared fake news, knowingly or unknowingly, with friends and others.

“Fake news” is a term that can mean different things, depending on the context. News satire is often called fake news as are parodies such as the “Saturday Night Live” mock newscast Weekend Update. Much of the fake news that flooded the internet during the 2016 election season consisted of written pieces and recorded segments promoting false information or perpetuating conspiracy theories. Some news organizations published reports spotlighting examples of hoaxes, fake news and misinformation  on Election Day 2016.

The news media has written a lot about fake news and other forms of misinformation, but scholars are still trying to understand it — for example, how it travels and why some people believe it and even seek it out. Below, Journalist’s Resource has pulled together academic studies to help newsrooms better understand the problem and its impacts. Two other resources that may be helpful are the Poynter Institute’s tips on debunking fake news stories and the First Draft Partner Network, a global collaboration of newsrooms, social media platforms and fact-checking organizations that was launched in September 2016 to battle fake news. In mid-2018, JR’s managing editor, Denise-Marie Ordway, wrote an article for Harvard Business Review explaining what researchers know to date about the amount of misinformation people consume, why they believe it and the best ways to fight it.

—————————

“The Science of Fake News” Lazer, David M. J.; et al. Science, March 2018. DOI: 10.1126/science.aao2998.

Summary: “The rise of fake news highlights the erosion of long-standing institutional bulwarks against misinformation in the internet age. Concern over the problem is global. However, much remains unknown regarding the vulnerabilities of individuals, institutions, and society to manipulations by malicious actors. A new system of safeguards is needed. Below, we discuss extant social and computer science research regarding belief in fake news and the mechanisms by which it spreads. Fake news has a long history, but we focus on unanswered scientific questions raised by the proliferation of its most recent, politically oriented incarnation. Beyond selected references in the text, suggested further reading can be found in the supplementary materials.”

“Who Falls for Fake News? The Roles of Bullshit Receptivity, Overclaiming, Familiarity, and Analytical Thinking” Pennycook, Gordon; Rand, David G. May 2018. Available at SSRN. DOI: 10.2139/ssrn.3023545.

Abstract: “Inaccurate beliefs pose a threat to democracy and fake news represents a particularly egregious and direct avenue by which inaccurate beliefs have been propagated via social media. Here we present three studies (MTurk, N = 1,606) investigating the cognitive psychological profile of individuals who fall prey to fake news. We find consistent evidence that the tendency to ascribe profundity to randomly generated sentences — pseudo-profound bullshit receptivity — correlates positively with perceptions of fake news accuracy, and negatively with the ability to differentiate between fake and real news (media truth discernment). Relatedly, individuals who overclaim regarding their level of knowledge (i.e. who produce bullshit) also perceive fake news as more accurate. Conversely, the tendency to ascribe profundity to prototypically profound (non-bullshit) quotations is not associated with media truth discernment; and both profundity measures are positively correlated with willingness to share both fake and real news on social media. We also replicate prior results regarding analytic thinking — which correlates negatively with perceived accuracy of fake news and positively with media truth discernment — and shed further light on this relationship by showing that it is not moderated by the presence versus absence of information about the news headline’s source (which has no effect on perceived accuracy), or by prior familiarity with the news headlines (which correlates positively with perceived accuracy of fake and real news). Our results suggest that belief in fake news has similar cognitive properties to other forms of bullshit receptivity, and reinforce the important role that analytic thinking plays in the recognition of misinformation.”
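For readers unfamiliar with the statistics, "correlates positively" here refers to Pearson's r between two sets of participant scores. A self-contained sketch of that computation, using made-up scores rather than the study's data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed with the standard library only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up scores for six participants; the real study (N = 1,606) used
# validated scales, so these numbers are purely illustrative.
bullshit_receptivity = [1.0, 2.0, 2.5, 3.0, 4.0, 4.5]
perceived_fake_news_accuracy = [1.2, 1.8, 2.6, 2.9, 3.8, 4.4]

r = pearson_r(bullshit_receptivity, perceived_fake_news_accuracy)
print(f"Pearson r = {r:.2f}")  # strongly positive for this toy data
```

An r near +1 means the two scores rise together; the study reports positive correlations of this kind between receptivity measures and perceived accuracy of fake headlines.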

“Social Media and Fake News in the 2016 Election” Allcott, Hunt; Gentzkow, Matthew. Working paper for the National Bureau of Economic Research, No. 23089, 2017.

Abstract: “We present new evidence on the role of false stories circulated on social media prior to the 2016 U.S. presidential election. Drawing on audience data, archives of fact-checking websites, and results from a new online survey, we find: (i) social media was an important but not dominant source of news in the run-up to the election, with 14 percent of Americans calling social media their “most important” source of election news; (ii) of the known false news stories that appeared in the three months before the election, those favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared eight million times; (iii) the average American saw and remembered 0.92 pro-Trump fake news stories and 0.23 pro-Clinton fake news stories, with just over half of those who recalled seeing fake news stories believing them; (iv) for fake news to have changed the outcome of the election, a single fake article would need to have had the same persuasive effect as 36 television campaign ads.”

“Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation” Chan, Man-pui Sally; Jones, Christopher R.; Jamieson, Kathleen Hall; Albarracín, Dolores. Psychological Science, September 2017. DOI: 10.1177/0956797617714579.

Abstract: “This meta-analysis investigated the factors underlying effective messages to counter attitudes and beliefs based on misinformation. Because misinformation can lead to poor decisions about consequential matters and is persistent and difficult to correct, debunking it is an important scientific and public-policy goal. This meta-analysis (k = 52, N = 6,878) revealed large effects for presenting misinformation (ds = 2.41–3.08), debunking (ds = 1.14–1.33), and the persistence of misinformation in the face of debunking (ds = 0.75–1.06). Persistence was stronger and the debunking effect was weaker when audiences generated reasons in support of the initial misinformation. A detailed debunking message correlated positively with the debunking effect. Surprisingly, however, a detailed debunking message also correlated positively with the misinformation-persistence effect.”

“Displacing Misinformation about Events: An Experimental Test of Causal Corrections” Nyhan, Brendan; Reifler, Jason. Journal of Experimental Political Science, 2015. DOI: 10.1017/XPS.2014.22.

Abstract: “Misinformation can be very difficult to correct and may have lasting effects even after it is discredited. One reason for this persistence is the manner in which people make causal inferences based on available information about a given event or outcome. As a result, false information may continue to influence beliefs and attitudes even after being debunked if it is not replaced by an alternate causal explanation. We test this hypothesis using an experimental paradigm adapted from the psychology literature on the continued influence effect and find that a causal explanation for an unexplained event is significantly more effective than a denial even when the denial is backed by unusually strong evidence. This result has significant implications for how to most effectively counter misinformation about controversial political events and outcomes.”

“Rumors and Health Care Reform: Experiments in Political Misinformation” Berinsky, Adam J. British Journal of Political Science, 2015. DOI: 10.1017/S0007123415000186.

Abstract: “This article explores belief in political rumors surrounding the health care reforms enacted by Congress in 2010. Refuting rumors with statements from unlikely sources can, under certain circumstances, increase the willingness of citizens to reject rumors regardless of their own political predilections. Such source credibility effects, while well known in the political persuasion literature, have not been applied to the study of rumor. Though source credibility appears to be an effective tool for debunking political rumors, risks remain. Drawing upon research from psychology on ‘fluency’ — the ease of information recall — this article argues that rumors acquire power through familiarity. Attempting to quash rumors through direct refutation may facilitate their diffusion by increasing fluency. The empirical results find that merely repeating a rumor increases its power.”

“Rumors and Factitious Informational Blends: The Role of the Web in Speculative Politics” Rojecki, Andrew; Meraz, Sharon. New Media & Society, 2016. DOI: 10.1177/1461444814535724.

Abstract: “The World Wide Web has changed the dynamics of information transmission and agenda-setting. Facts mingle with half-truths and untruths to create factitious informational blends (FIBs) that drive speculative politics. We specify an information environment that mirrors and contributes to a polarized political system and develop a methodology that measures the interaction of the two. We do so by examining the evolution of two comparable claims during the 2004 presidential campaign in three streams of data: (1) web pages, (2) Google searches, and (3) media coverage. We find that the web is not sufficient alone for spreading misinformation, but it leads the agenda for traditional media. We find no evidence for equality of influence in network actors.”

“Analyzing How People Orient to and Spread Rumors in Social Media by Looking at Conversational Threads” Zubiaga, Arkaitz; et al. PLOS ONE, 2016. DOI: 10.1371/journal.pone.0150989.

Abstract: “As breaking news unfolds people increasingly rely on social media to stay abreast of the latest updates. The use of social media in such situations comes with the caveat that new information being released piecemeal may encourage rumors, many of which remain unverified long after their point of release. Little is known, however, about the dynamics of the life cycle of a social media rumor. In this paper we present a methodology that has enabled us to collect, identify and annotate a dataset of 330 rumor threads (4,842 tweets) associated with 9 newsworthy events. We analyze this dataset to understand how users spread, support, or deny rumors that are later proven true or false, by distinguishing two levels of status in a rumor life cycle i.e., before and after its veracity status is resolved. The identification of rumors associated with each event, as well as the tweet that resolved each rumor as true or false, was performed by journalist members of the research team who tracked the events in real time. Our study shows that rumors that are ultimately proven true tend to be resolved faster than those that turn out to be false. Whilst one can readily see users denying rumors once they have been debunked, users appear to be less capable of distinguishing true from false rumors when their veracity remains in question. In fact, we show that the prevalent tendency for users is to support every unverified rumor. We also analyze the role of different types of users, finding that highly reputable users such as news organizations endeavor to post well-grounded statements, which appear to be certain and accompanied by evidence. Nevertheless, these often prove to be unverified pieces of information that give rise to false rumors. Our study reinforces the need for developing robust machine learning techniques that can provide assistance in real time for assessing the veracity of rumors. The findings of our study provide useful insights for achieving this aim.”

“Miley, CNN and The Onion” Berkowitz, Dan; Schwartz, David Asa. Journalism Practice, 2016. DOI: 10.1080/17512786.2015.1006933.

Abstract: “Following a twerk-heavy performance by Miley Cyrus on the Video Music Awards program, CNN featured the story on the top of its website. The Onion — a fake-news organization — then ran a satirical column purporting to be by CNN’s Web editor explaining this decision. Through textual analysis, this paper demonstrates how a Fifth Estate comprised of bloggers, columnists and fake news organizations worked to relocate mainstream journalism back to within its professional boundaries.”

“Emotions, Partisanship, and Misperceptions: How Anger and Anxiety Moderate the Effect of Partisan Bias on Susceptibility to Political Misinformation” Weeks, Brian E. Journal of Communication, 2015. DOI: 10.1111/jcom.12164.

Abstract: “Citizens are frequently misinformed about political issues and candidates but the circumstances under which inaccurate beliefs emerge are not fully understood. This experimental study demonstrates that the independent experience of two emotions, anger and anxiety, in part determines whether citizens consider misinformation in a partisan or open-minded fashion. Anger encourages partisan, motivated evaluation of uncorrected misinformation that results in beliefs consistent with the supported political party, while anxiety at times promotes initial beliefs based less on partisanship and more on the information environment. However, exposure to corrections improves belief accuracy, regardless of emotion or partisanship. The results indicate that the unique experience of anger and anxiety can affect the accuracy of political beliefs by strengthening or attenuating the influence of partisanship.”

“Deception Detection for News: Three Types of Fakes” Rubin, Victoria L.; Chen, Yimin; Conroy, Niall J. Proceedings of the Association for Information Science and Technology, 2015, Vol. 52. DOI: 10.1002/pra2.2015.145052010083.

Abstract: “A fake news detection system aims to assist users in detecting and filtering out varieties of potentially deceptive news. The prediction of the chances that a particular news item is intentionally deceptive is based on the analysis of previously seen truthful and deceptive news. A scarcity of deceptive news, available as corpora for predictive modeling, is a major stumbling block in this field of natural language processing (NLP) and deception detection. This paper discusses three types of fake news, each in contrast to genuine serious reporting, and weighs their pros and cons as a corpus for text analytics and predictive modeling. Filtering, vetting, and verifying online information continues to be essential in library and information science (LIS), as the lines between traditional news and online information are blurring.”

“When Fake News Becomes Real: Combined Exposure to Multiple News Sources and Political Attitudes of Inefficacy, Alienation, and Cynicism” Balmas, Meital. Communication Research, 2014, Vol. 41. DOI: 10.1177/0093650212453600.

Abstract: “This research assesses possible associations between viewing fake news (i.e., political satire) and attitudes of inefficacy, alienation, and cynicism toward political candidates. Using survey data collected during the 2006 Israeli election campaign, the study provides evidence for an indirect positive effect of fake news viewing in fostering the feelings of inefficacy, alienation, and cynicism, through the mediator variable of perceived realism of fake news. Within this process, hard news viewing serves as a moderator of the association between viewing fake news and their perceived realism. It was also demonstrated that perceived realism of fake news is stronger among individuals with high exposure to fake news and low exposure to hard news than among those with high exposure to both fake and hard news. Overall, this study contributes to the scientific knowledge regarding the influence of the interaction between various types of media use on political effects.”

“Faking Sandy: Characterizing and Identifying Fake Images on Twitter During Hurricane Sandy” Gupta, Aditi; Lamba, Hemank; Kumaraguru, Ponnurangam; Joshi, Anupam. Proceedings of the 22nd International Conference on World Wide Web, 2013. DOI: 10.1145/2487788.2488033.

Abstract: “In today’s world, online social media plays a vital role during real world events, especially crisis events. There are both positive and negative effects of social media coverage of events. It can be used by authorities for effective disaster management or by malicious entities to spread rumors and fake news. The aim of this paper is to highlight the role of Twitter during Hurricane Sandy (2012) to spread fake images about the disaster. We identified 10,350 unique tweets containing fake images that were circulated on Twitter during Hurricane Sandy. We performed a characterization analysis, to understand the temporal, social reputation and influence patterns for the spread of fake images. Eighty-six percent of tweets spreading the fake images were retweets, hence very few were original tweets. Our results showed that the top 30 users out of 10,215 users (0.3 percent) resulted in 90 percent of the retweets of fake images; also network links such as follower relationships of Twitter, contributed very little (only 11 percent) to the spread of these fake photos URLs. Next, we used classification models, to distinguish fake images from real images of Hurricane Sandy. Best results were obtained from Decision Tree classifier, we got 97 percent accuracy in predicting fake images from real. Also, tweet-based features were very effective in distinguishing fake images tweets from real, while the performance of user-based features was very poor. Our results showed that automated techniques can be used in identifying real images from fake images posted on Twitter.”
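The paper's actual classifier was a trained decision tree; the hand-written rules below only illustrate what classifying tweets on "tweet-based features" looks like. The feature names and thresholds here are invented for illustration, not taken from the paper:

```python
# A hand-rolled stand-in for a decision tree: a couple of nested rules over
# tweet-based features. The features and thresholds are invented for
# illustration; Gupta et al. trained a real decision-tree classifier on a
# labeled corpus of Hurricane Sandy tweets.
def likely_fake_image_tweet(tweet: dict) -> bool:
    if tweet["is_retweet"] and tweet["author_followers"] < 100:
        return True
    if tweet["exclamation_marks"] >= 3 and tweet["has_image_url"]:
        return True
    return False

sample = {"is_retweet": True, "author_followers": 42,
          "exclamation_marks": 1, "has_image_url": True}
print(likely_fake_image_tweet(sample))  # True for this made-up tweet
```

A learned tree differs from hand-written rules in that the split features and thresholds are chosen automatically from labeled examples, which is how the paper reached its reported 97 percent accuracy.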

“The Impact of Real News about ‘Fake News’: Intertextual Processes and Political Satire” Brewer, Paul R.; Young, Dannagal Goldthwaite; Morreale, Michelle. International Journal of Public Opinion Research, 2013. DOI: 10.1093/ijpor/edt015.

Abstract: “This study builds on research about political humor, press meta-coverage, and intertextuality to examine the effects of news coverage about political satire on audience members. The analysis uses experimental data to test whether news coverage of Stephen Colbert’s Super PAC influenced knowledge and opinion regarding Citizens United, as well as political trust and internal political efficacy. It also tests whether such effects depended on previous exposure to The Colbert Report (Colbert’s satirical television show) and traditional news. Results indicate that exposure to news coverage of satire can influence knowledge, opinion, and political trust. Additionally, regular satire viewers may experience stronger effects on opinion, as well as increased internal efficacy, when consuming news coverage about issues previously highlighted in satire programming.”

“With Facebook, Blogs, and Fake News, Teens Reject Journalistic ‘Objectivity’” Marchi, Regina. Journal of Communication Inquiry, 2012. DOI: 10.1177/0196859912458700.

Abstract: “This article examines the news behaviors and attitudes of teenagers, an understudied demographic in the research on youth and news media. Based on interviews with 61 racially diverse high school students, it discusses how adolescents become informed about current events and why they prefer certain news formats to others. The results reveal changing ways news information is being accessed, new attitudes about what it means to be informed, and a youth preference for opinionated rather than objective news. This does not indicate that young people disregard the basic ideals of professional journalism but, rather, that they desire more authentic renderings of them.”

Keywords: alt-right, credibility, truth discovery, post-truth era, fact checking, news sharing, news literacy, misinformation, disinformation


About The Author


Denise-Marie Ordway

HCC Libraries Home

Fake News, Misleading News, Biased News: Assignments on Evaluating Sources


Assignments

  • Caulfield, Mike. The Four Moves: Adventures in Fact-Checking for Students
  • CORA (Community of Online Research Assignments). Evaluating news sites: Credible or Clickbait?
  • McCormick Foundation. Introduction to news literacy: Structured engagement with current and controversial issues.
  • University of Delaware. Curing fake news phobia (Google Doc with lesson plan - by Lauren Wallis)
  • University of Texas El Paso. News gathering and investigation: An evaluation exercise
  • Whiting, Jacquelyn. (2019, September 4). Everyone has invisible bias. This lesson shows students how to recognize it. EdSurge.

More Assignments

C-SPAN Classroom: Lesson idea: Media Literacy and Fake News

SchoolJournalism.com. News and media literacy lessons.

Walsh-Moorman, Elizabeth and Katie Ours. Introducing lateral reading before research. MLA Style Center. (Objectives include identifying the credibility and/or bias of a source, and identifying how professional fact-checkers assess information versus a general audience.)

  • The Media Manipulation Casebook. Includes methods and definitions of terms related to misinformation, disinformation, and media manipulation.

A Course on News Literacy

Making Sense of the News: News Literacy Lessons for Digital Citizens. A six-week course offered by The University of Hong Kong and The State University of New York via Coursera. Audit the course for free. Resources include a glossary of terms such as bias, cognitive dissonance, confirmation bias, propaganda, selective exposure, verification, etc.

News Literacy. Digital Resource Center. Stony Brook University

Stony Brook University, Digital Resource Center. The 14 Lessons. This course pack consists of lessons that can be taught in sequence or separately and cover topics such as verification, fairness and balance, bias, etc. This material is the basis for the Coursera course (above) on news literacy.

Fake News: Curriculum. Cal State University Long Beach

  • Fake News: Curriculum Curriculum about fake news, curriculum guides, presentations; instructional strategies from librarian Lesley Farmer.

Fake News in First-Year Writing - Paul Corrigan

  • Corrigan, Paul T. Fake News in First-Year Writing. WritingCommons.org. A description of a first-year writing course that integrates feeling and fact-checking, with an overview of the writing projects.

Need to Evaluate a Source? Try a Worksheet

  • Evaluating Web Sites: A Checklist (University of Maryland)

Quality of News Sources - You Decide!

Vanessa Otero, a patent attorney, made a chart with her views on various news sites, and you can too! She put out a blank version so you can decide for yourself. See her blog post on news quality and her chart on Twitter.

Valenza, J.  (2016, November 26).   Truth, truthiness, triangulation: A news literacy toolkit for a "post-truth" world.   School Library Journal.  

A course from University of Washington, Seattle, WA

  • Calling Bullshit - Course Syllabus. A proposal for a course by two professors from the University of Washington, Seattle, meant to teach students how to recognize bullshit. "Bullshit is language, statistical figures, data graphics, and other forms of presentation intended to persuade by impressing and overwhelming a reader or listener, with a blatant disregard for truth and logical coherence."
  • Check, Please! Starter Course "In this course, we show you how to fact and source-check in five easy lessons, taking about 30 minutes apiece. The entire online curriculum is two and a half to three hours and is suitable homework for the first week of a college-level module on disinformation or online information literacy, or the first few weeks of a course if assigned with other discipline focused homework." This course has been released into the public domain.
  • Real vs. Fake, Science vs. Pseudoscience: A course syllabus, Fall 2019. A course by Dr. Douglas Duncan, University of Colorado Boulder.

A course from University of Michigan on fake news

  • Fake News, Lies, and Propaganda: The Class. An entire seven-week course developed by librarians at the University of Michigan.

Learning Tools Suggested by Richard Byrne

Learning tools suggested by Richard Byrne in his Practical Ed Tech Tip of the Week .

  • Can You Spot the Problem with These Headlines? A TED-Ed lesson.
  • Checkology: A free version with interactive modules (that become increasingly difficult).
  • Civic Online Reasoning: Lesson plans from the Stanford History Education Group (SHEG). Lessons on lateral reading, fact-checking organizations, who's behind the information, and what's the evidence. Create a free SHEG account to access these lessons.
  • Spot the Troll. A troll is a fake social media account, often created to spread misleading information. Learn to spot them! From Clemson University's Media Forensics Lab.
  • This One Weird Trick Will Help You Spot Clickbait. A TED-Ed lesson
  • Last Updated: Mar 21, 2024 6:23 PM
  • URL: https://libguides.hccfl.edu/fakenews



Fake News Assignment: Death by Vaccines

With the growth of social media, fake news websites are appearing with greater frequency, fueling the rapid spread of misinformation on topics such as vaccines, food safety, and global warming. Students need to be able to evaluate these news sites and sources.

This assignment presents students with three articles about vaccines: one from the Washington Post, one from Natural News, and one from a website called "Vaccines.news". Students are guided through an analysis of the articles, including identifying clickbait, researching the backgrounds of the authors, and judging the validity of several statements within the articles.
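As a classroom extension, the clickbait-identification step can even be turned into a small program. The signal phrases below are common clickbait patterns chosen for illustration, not a validated list; real source evaluation still requires checking the author, the outlet, and the claims themselves:

```python
import re

# A toy clickbait detector for classroom discussion. The signal phrases are
# common clickbait patterns, not a validated list.
CLICKBAIT_PATTERNS = [
    r"you won'?t believe",
    r"doctors hate",
    r"\bshocking\b",
    r"what happens next",
    r"the truth about",
]

def clickbait_score(headline: str) -> int:
    """Count how many clickbait patterns a headline matches."""
    text = headline.lower()
    return sum(bool(re.search(pattern, text)) for pattern in CLICKBAIT_PATTERNS)

print(clickbait_score("You Won't Believe The Truth About Vaccines - SHOCKING"))  # 3
```

A higher score is a reason to slow down and verify, not proof that a story is false; that distinction is a useful discussion point with students.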

Essential concepts: Pseudoscience, fake news, news analysis, sources.

Answer Key: Available as part of an environmental science instructor resources subscription.



Fake News Unit – Media Literacy Analysis Unit

  • Total Pages: 250
  • Answer Key: Included with rubric
  • Teaching Duration: 2 weeks
  • File Size: 31.7 MB
  • File Type: PDF (Zip)

  • Description

This Fake News Unit contains 10 high-interest lessons about media literacy, fake news, and digital citizenship to help 21st-century students learn to think critically about the information they consume from print and digital media sources. It is highly relevant to today's learners, as it contains lessons on social media content and algorithms as well as artificial intelligence and deep fakes. Teachers will use these scaffolded fake news lessons, starting with an interactive QR code vocabulary scavenger hunt and headline analysis, then moving on to specific lessons about fake news and how to spot it.

After the scaffolded lessons, students will love working through the eight different fake news stations about satire, websites, videos, news articles, TED talks, social media algorithms, deep fakes, and AI music in groups with their peers. The Fake News Unit culminates with students creating their own fake news stories in either print or digital formats.


Lesson Overview

  • Introduction – QR Code Vocabulary Search
  • Lesson 1 – What is Media Literacy?
  • Lesson 2 – Media Literacy in Action: Headline Analysis
  • Lesson 3 – What is Fake News?
  • Lesson 4 – Social Media and Fake News
  • Lesson 5 – Case Study: The War of the Worlds Radio Broadcast
  • Lesson 6 – How to Spot Fake News
  • Lesson 7 – Fake News Stations: Spot The Fake (Satire, Websites, Videos, News Articles, TED Talks, Social Media Algorithms, Deep Fakes, AI Music)
  • Lesson 8 – Digital Citizenship
  • Lesson 9 – Creating Fake News Assignment

What’s Inside:

  • 10 Engaging Lessons: Created specifically for middle school students, these lessons are designed to capture their attention and foster meaningful discussions.
  • Detailed Teacher Lesson Plans: Teachers can seamlessly integrate each lesson into their plans.
  • Interactive Content (QR Codes, Scenarios, Group Work): Lesson variety helps students stay interested and motivated.
  • Group Work Stations: Foster collaboration and dialogue with group work stations that encourage peer-to-peer learning and interactive discussions.
  • Vocabulary Lesson: Help meet curriculum objectives with this lesson on media vocabulary.
  • Historical Example Case Study: Students will learn about a time in history when the media created panic by not disclosing that a news story was fake.
  • MP3 Audio Files: Students can read the articles independently or use the provided MP3 audio files.
  • Answer Keys: Guide your students through each lesson with the comprehensive answer keys.
  • Video Links: Enhance learning through multimedia experiences that reinforce key concepts.
  • Graphic Organizers: Help students organize their thoughts and ideas visually, promoting deeper understanding and engagement with the content.
  • Pre-Made Google Slideshow: Seamlessly integrate technology into your lessons with a pre-made Google Slideshow that's ready to use.
  • Print & Digital Formats: Cater to your classroom's needs with both print and digital formats, ensuring accessibility and flexibility.

Teacher Feedback – Fake News Unit:

  • “This is honestly one of my favourite resources I have ever purchased. I have done this unit with grade 8 and a 6/7 class and it has been a hit! Students have loved all the opportunities to explore fake news, and to create their own fake news websites. It is honestly such a great resource and is so highly recommended.”
  • “This was a fun and engaging unit that is necessary to teach effectively in today’s world!  The students enjoyed it too!!!”
  • “This is a great way to introduce a very important topic for students who regularly encounter misleading or fake news while they are online. It is a fun unit for a serious topic – lots of ways for students to make connections with the things they are seeing online in social media or hearing from other students. Helps to build critical thinking skills!”

With 10 captivating fake news lessons that include a variety of independent and group work activities, this Fake News Unit ensures that your students are well-prepared to make informed decisions about the media content they consume. Grab this unit today!

Other Engaging Media Literacy Lessons:

Media Literacy Unit – Analyzing Public Service Announcements and Commercials

  • Media Literacy Bundle 1 –  Consumer Awareness Lessons
  • Media Literacy Bundle 2 – Consumer Awareness Lessons
  • Media Literacy Review Writing 16 Lessons

Related products

Book Versus Movie Comparison Analysis Project

Media Literacy and Consumer Awareness Lesson – Outlet vs. Retail

This Media Literacy Unit is designed specifically for middle school students, offering a comprehensive 10-lesson program that will deepen their understanding of media and critical thinking.

Remembrance Day Unit

This free persuasive writing unit is:

  • Perfect for engaging students in public speaking and persuasive writing
  • Time and energy saving
  • Ideal for in-person or online learning

By using highly engaging rants, your students won't even realize you've channeled their daily rants and complaints into high-quality writing!



PLOS ONE

Determinants of individuals’ belief in fake news: A scoping review

Kirill Bryanov

Laboratory for Social and Cognitive Informatics, National Research University Higher School of Economics, St. Petersburg, Russia

Victoria Vziatysheva

Associated data.

All relevant data are available within the paper. Search protocol is described in the text, and Table 3 contains information about all studies included in the review.

Proliferation of misinformation in digital news environments can harm society in a number of ways, but its dangers are most acute when citizens believe that false news is factually accurate. A recent wave of empirical research focuses on factors that explain why people fall for so-called fake news. In this scoping review, we summarize the results of experimental studies that test different predictors of individuals’ belief in misinformation.

The review is based on a synthetic analysis of 26 scholarly articles. The authors developed and applied a search protocol to two academic databases, Scopus and Web of Science. The sample included experimental studies that test factors influencing users’ ability to recognize fake news, their likelihood to trust it or intention to engage with such content. Relying on scoping review methodology, the authors then collated and summarized the available evidence.

The study identifies three broad groups of factors contributing to individuals’ belief in fake news. Firstly, message characteristics—such as belief consistency and presentation cues—can drive people’s belief in misinformation. Secondly, susceptibility to fake news can be determined by individual factors including people’s cognitive styles, predispositions, and differences in news and information literacy. Finally, accuracy-promoting interventions such as warnings or nudges priming individuals to think about information veracity can impact judgements about fake news credibility. Evidence suggests that inoculation-type interventions can be both scalable and effective. We note that study results could be partly driven by design choices such as selection of stimuli and outcome measurement.

Conclusions

We call for expanding the scope and diversifying designs of empirical investigations of people’s susceptibility to false information online. We recommend examining digital platforms beyond Facebook, using more diverse formats of stimulus material and adding a comparative angle to fake news research.

Introduction

Deception is not a new phenomenon in mass communication: people had been exposed to political propaganda, strategic misinformation, and rumors long before much of public communication migrated to digital spaces [ 1 ]. In the information ecosystem centered around social media, however, digital deception took on renewed urgency, with the 2016 U.S. presidential election marking the tipping point where the gravity of the issue became a widespread concern [ 2 , 3 ]. A growing body of work documents the detrimental effects of online misinformation on political discourse and people’s societally significant attitudes and beliefs. Exposure to false information has been linked to outcomes such as diminished trust in mainstream media [ 4 ], fostering the feelings of inefficacy, alienation, and cynicism toward political candidates [ 5 ], as well as creating false memories of fabricated policy-relevant events [ 6 ] and anchoring individuals’ perceptions of unfamiliar topics [ 7 ].

According to some estimates, the spread of politically charged digital deception in the buildup to and following the 2016 election became a mass phenomenon: for example, Allcott and Gentzkow [ 1 ] estimated that the average US adult could have read and remembered at least one fake news article in the months around the election (but see Allen et al. [ 8 ] for an opposing claim regarding the scale of the fake news issue). Scholarly reflections upon this new reality sparked a wave of research concerned with a specific brand of false information, labelled fake news and most commonly conceptualized as non-factual messages resembling legitimate news content and created with an intention to deceive [ 3 , 9 ]. One research avenue that has seen a major uptick in the volume of published work is concerned with uncovering the factors driving people’s ability to discern fake from legitimate news. Indeed, in order for deceitful messages to exert the hypothesized societal effects—such as catalyzing political polarization [ 10 ], distorting public opinion [ 11 ], and promoting inaccurate beliefs [ 12 ]—the recipients have to believe that the claims these messages present are true [ 13 ]. Furthermore, research shows that the more people find false information encountered on social media credible, the more likely they are to amplify it by sharing [ 14 ]. The factors and mechanisms underlying individuals’ judgements of fake news’ accuracy and credibility thus become a central concern for both theory and practice.

While message credibility has been a longstanding matter of interest for scholars of communication [ 15 ], the post-2016 wave of scholarship can be viewed as distinct on account of its focus on particular news formats, contents, and mechanisms of spread that have been prevalent amid the recent fake news crisis [ 16 ]. Furthermore, unlike previous studies of message credibility, the recent work is increasingly taking a turn towards developing and testing potential solutions to the problem of digital misinformation, particularly in the form of interventions aimed at improving people’s accuracy judgements.

Some scholars argue that the recent rise of fake news is a manifestation of a broader ongoing epistemological shift, where significant numbers of online information consumers move away from the standards of evidence-based reasoning and pursuit of objective truth toward “alternative facts” and partisan simplism—a malaise often labelled as the state of “post-truth” [ 17 , 18 ]. Lewandowsky and colleagues identify large-scale trends such as declining social capital, rising economic inequality and political polarization, diminishing trust in science, and an increasingly fragmented media landscape as the processes underlying the shift toward the “post-truth.” In order to narrow the scope of this report, we specifically focus on the news media component of the larger “post-truth” puzzle. This leads us to consider only the studies that explore the effects of misinformation packaged in news-like formats, perforce leaving out investigations dealing with other forms of online deception–for example, messages coming from political figures and parties [ 19 ] or rumors [ 20 ].

The apparently vast amount and heterogeneity of recent empirical research output addressing the antecedents to people’s belief in fake news calls for integrative work summarizing and mapping the newly generated findings. We are aware of a single review article published to date synthesizing empirical findings on the factors of individuals’ susceptibility to believing fake news in political contexts, a narrative summary of a subset of relevant evidence [ 21 ]. In order to systematically survey the available literature in a way that permits both transparency and sufficient conceptual breadth, we employ a scoping review methodology, most commonly used in medical and public health research. This method prescribes specifying a research question, search strategy, and criteria for inclusion and exclusion, along with the general logic of charting and arranging the data, thus allowing for a transparent, replicable synthesis [ 22 ]. Because it is well-suited for identifying diverse subsets of evidence pertaining to a broad research question [ 23 ], scoping review methodology is particularly relevant to our study’s objectives. We begin our investigation with articulating the following research questions:

  • RQ1: What factors have been found to predict individuals’ belief in fake news and their capacity to discern between false and real news?
  • RQ2: What interventions have been found to reduce individuals’ belief in fake news and boost their capacity to discern between false and real news?

In the following sections, we specify our methodology and describe the findings using an inductively developed framework organized around groups of factors and dependent variables extracted from the data. Specifically, we approached the analysis without a preconceived categorization of the factors in mind. Following our assessment of the studies included in the sample, we divided them into three groups based on whether the antecedents of belief in fake news that they focus on 1) reside within the individual or 2) are related to the features of the message, source, or information environment or 3) represent interventions specifically designed to tackle the problem of online misinformation. We conclude with a discussion of the state of play in the research area under review, identify strengths and gaps in existing scholarship, and offer potential avenues for further advancing this body of knowledge.

Materials and methods

Our research pipeline has been developed in accordance with PRISMA guidelines for systematic scoping reviews [ 24 ] and contains the following steps: a) development of a review protocol; b) identification of the relevant studies; c) extraction and charting of the data from selected studies, elaboration of the emerging themes; d) collation and summarization of the results; e) assessment of the strengths and limitations of the body of literature, identification of potential paths for addressing the existing gaps and theory advancement.

Search strategy and protocol development

At the outset, we defined the target population of texts as English-language scholarly articles published in peer-reviewed journals between January 1, 2016 and November 1, 2020 and using experimental methodology to investigate the factors underlying individuals’ belief in false news. We selected this time frame with the intention to specifically capture the research output that emerged in response to the “post-truth” turn in the public and scholarly discourse that many observers link to the political events of 2016, most notably Donald Trump’s ascent to the U.S. presidency [ 17 ]. Because we were primarily interested in causal evidence for the role of various antecedents to fake news credibility perceptions, we decided to focus on experimental studies. Our definition of experiment has been purposefully lax, since we acknowledged the possibility that not all relevant studies could employ rigorous experimental design with random assignment and a control group. For example, this would likely be the case for studies testing factors that are more easily measured than manipulated, such as individual psychological predispositions, as predictors of fake news susceptibility. We therefore included investigations where researchers varied at least one of the elements of news exposure: either a hypothesized factor driving belief in fake news (between or within subjects), or the veracity of the news used as a stimulus (within subjects). Consequently, the studies included in our review presented both causal and correlational evidence.

Upon the initial screening of relevant texts already known to the authors or discovered through cross-referencing, it became apparent that proposed remedies and interventions enhancing news accuracy judgements should also be included in the scope of the review. In many cases practical solutions are presented alongside fake news believability factors, while in several instances testing such interventions is the reports’ primary concern. We began by developing the string of search terms informed by the language found in the titles of the already known relevant studies [ 14 , 25 – 27 ], then enhanced it with plausible synonymous terms drawn from the online service Thesaurus.com. As the initial version of this report went into peer review, we received reviewer feedback suggesting that some of the relevant studies, particularly on the topic of inoculation-based interventions, were left out. We modified our search query accordingly, adding three further inoculation-related terms. The ultimate query looked as follows:

  • (belie* OR discern* OR identif* OR credib* OR evaluat* OR assess* OR rating OR rate OR suspic* OR "thinking" OR accura* OR recogn* OR susceptib* OR malleab* OR trust* OR resist* OR immun* OR innocul*) AND (false* OR fake OR disinform* OR misinform*)

Based on our understanding that the relevant studies should fall within the scope of such disciplines as media and communication studies, political science, psychology, cognitive science, and information sciences, we identified two citation databases, Scopus and Web of Science, as the target corpora of scholarly texts. Web of Science and Scopus are consistently ranked among leading academic databases providing citation indexing [ 28 , 29 ]. Norris and Oppenheim [ 30 ] argue that in terms of record processing quality and depth of coverage these databases provide valid instruments for evaluating scholarly contributions in social sciences. Another possible alternative is Google Scholar, which also provides citation indexing and is often considered the largest academic database [ 31 ]. Yet, according to some appraisals, this database lacks quality control [ 32 ], transparency, and can contribute to parts of relevant evidence being overlooked when used in systematic reviews [ 33 ]. Thus, for the purposes of this paper, we chose WoS and Scopus as sources of data.

Relevance screening and inclusion/exclusion criteria

Using title search, our queries resulted in 1622 and 1074 publications in Scopus and Web of Science, respectively. The study selection process is demonstrated in Fig 1 .

[Fig 1: flow diagram of the study selection process (pone.0253717.g001)]

We began the search with a crude title screening performed by the authors (KB and VV) on each database independently. At this stage, we mainly excluded obviously irrelevant articles (e.g. research reports mentioning false-positive biochemical test results) and those whose titles unambiguously indicated that the item was outside our original scope, such as work in the field of machine learning on automated fake news detection. Both authors’ results were then cross-checked and disagreements resolved. This stage narrowed our selection down to 109 potentially relevant Scopus articles and 76 WoS articles. Having removed duplicate items present in both databases, we arrived at a list of 117 unique articles retained for abstract review.

At the abstract screening stage, we excluded items that could be identified as utilizing non-experimental research designs. Furthermore, at this stage we required that all articles fitting our intended scope include at least one of the following outcome variables: 1) perceived credibility, believability, or accuracy of false news messages or 2) a measure of the capacity to discern false from authentic news. Screening potentially eligible abstracts suggested that studies not addressing one of these two outcomes do not answer the research questions at the center of our study. Seventy articles were thus removed, leaving us with 45 articles for full-text review.

The remaining articles were read in full by the authors independently, with disagreements on whether specific items fit the inclusion criteria resolved, resulting in the final sample of 26 articles (see Table 1 for the full list of included studies). Since our primary focus is on perceptions of false media content and corresponding interventions designed to improve news delivery and consumption practices, we only included experiments that utilized a news-like format for the stimulus material. As a result, we forwent investigations focusing on online rumors, individual politicians’ social media posts, and other stimuli that were not meant to represent content produced by a news organization. We did not limit the range of platforms where the news articles were presented to participants, since many studies simulated the processes of news selection and consumption in high-choice environments such as social media feeds. We then charted the evidence contained therein according to a categorization based on the outcome and independent variables that the included studies investigate.

* Note: In study design statements, all factors are between-subjects unless stated otherwise.

Outcome variables

Having arranged the available evidence along a number of ad-hoc dimensions, including the primary independent variables/correlates and focal outcome variables, we opted for a presentation strategy that opens with a classification of study dependent variables. Our analysis revealed that the body of scholarly literature under review is characterized by a significant heterogeneity of outcome variables. The concepts central to our synthesis are operationalized and measured in a variety of ways across studies, which presents a major hindrance to comparability of their results. In addition, in the absence of established terminology these variables are often labelled differently even when they represent similar constructs.

In addition to several variations of the dependent variables that we used as one of the inclusion criteria, we discovered a range of additional DVs relevant to the issue of online misinformation that the studies under review explored. The resulting classification is presented in Table 2 below.

Note: A single study could yield several observations if it considered multiple outcome variables.

As visible from Table 2 , the majority of studies in our sample measured the degree to which participants identified news messages or headlines as credible, believable or accurate. This strategy was utilized in experiments that both exposed individuals to made-up messages only, and those where stimulus material included a combination of real and fake items. Studies of the former type examined the effects of message characteristics or presentation cues on perceived credibility of misinformation, while the latter stimulus format also enabled scholars to examine the factors driving the accuracy of people’s identification of news as real or fake. In most instances, these synthetic “media truth discernment” scores were constructed post-hoc by matching participants’ credibility responses to the known “ground truth” of messages that they were asked to assess. These individual discernment scores could then be matched with the respondent’s or message’s features to infer the sources of systematic variation in the aggregate judgement accuracy.
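The post-hoc score construction described above can be sketched in a few lines. This is a minimal illustration, not the authors' code; the function name, the 1–4 rating scale, and the "mean rating of real items minus mean rating of fake items" formulation are assumptions, though that difference score is a common operationalization of truth discernment.

```python
# Sketch of a post-hoc "media truth discernment" score: match each
# credibility rating to the known veracity of the item, then take the
# mean rating of real items minus the mean rating of fake items.
# (Variable names and the 1-4 rating scale are illustrative assumptions.)

def discernment_score(ratings, ground_truth):
    """ratings: credibility ratings (e.g. on a 1-4 scale);
    ground_truth: parallel booleans, True for a real headline."""
    real = [r for r, t in zip(ratings, ground_truth) if t]
    fake = [r for r, t in zip(ratings, ground_truth) if not t]
    return sum(real) / len(real) - sum(fake) / len(fake)

# A participant who rates real headlines high and fake ones low gets a
# positive score; a score near zero indicates no discernment.
print(discernment_score([4, 3, 3, 2, 1, 2],
                        [True, True, True, False, False, False]))
```

Scores computed this way can then be correlated with respondent- or message-level features to locate sources of systematic variation in judgement accuracy.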

Looking at credibility perceptions of real and false news separately also enabled scholars to determine whether the effects of factors or interventions were symmetric for both message types. In a media environment where the overwhelming majority of news is real after all [ 27 ], it is essential to ensure both that fake news is dismissed, and high-quality content is trusted.

Another outcome that several studies in our sample investigated is the self-reported likelihood to share the message on social media. Given that social platforms like Facebook are widely believed to be responsible for the rapid spread of deceitful political content in recent years [ 2 ], the determinants of sharing behavior are central to developing effective measures for limiting the reach of fake news. Moreover, in at least one study [ 34 ] researchers explicitly used sharing intent as a proxy for a news accuracy judgement in order to estimate perceived accuracy without priming participants’ thinking about veracity of information. This approach appears promising given that this as well as other studies reported sizable correlations between perceived accuracy and sharing intent [ 35 – 37 ], yet it is obviously limited as a host of considerations beyond credibility can inform the decision to share a news item on social media.

Having extracted and classified the dependent variables in the reviewed studies, we proceed to mapping our observations against the factors and correlates that were theorized to exert effects on them (see Table 3 ).

Note: Only outcome variables with more than one observation are included in the table.

A single study could yield several observations if it considered multiple independent and/or outcome variables.

We observed that the experimental studies in our sample measure or manipulate three types of factors hypothesized to influence individuals’ belief in fake news. The first category encompasses variables related to the news message, the way it is presented, or the features of the information environment where exposure to information occurs. In other words, these tests seek to answer the question: What kinds of fake news are people more likely to fall for? The second category takes a different approach and examines respondents’ individual traits predictive of their susceptibility to disinformation. Put simply, these tests address the broad question of who falls for fake news. Finally, the effects of measures specifically designed to combat the spread of fake news constitute a qualitatively distinct group. Granted, this is a necessarily simplified categorization, as factors do not always easily lend themselves to inclusion into one of these baskets. For example, the effect of a pro-attitudinal message can be seen as a combination of both message-level (e. g. conservative-friendly wording of the headline) and an individual-level predisposition (recipient embracing politically conservative views). For presentation purposes, we base our narrative synthesis of the reviewed evidence on the following categorization: 1) Factors residing entirely outside of the individual recipient (message features, presentation cues, information environment); 2) Recipient’s individual features; 3) Interventions. For each category, we discuss theoretical frameworks that the authors employ and specific study designs.

A fundamental question at the core of many investigations that we reviewed is whether people are generally predisposed to believe fake news that they encounter online. Previous research suggests that individuals go about evaluating the veracity of falsehoods similarly to how they process true information [ 38 ]. Generally, most individuals tend to accept information that others communicate to them as accurate, provided that there are no salient markers suggesting otherwise [ 39 ].

Informed by these established notions, some of the authors whose work we reviewed expect to find the effects of “truth bias,” a tendency to accept all incoming claims at face value, including false ones. This, however, does not seem to be the case. No study under review reported the majority of respondents trusting most fake messages or perceiving false and real messages as equally credible. If anything, in some cases a “deception bias” emerges, where individuals’ credibility judgements are biased in the direction of rating both real and false news as fake. For example, Luo et al. [ 40 ] found that across two experiments where stimuli consisted of equal numbers of real and fake headlines participants were more likely to rate all headlines as fake, resulting in just 44.6% and 40% of headlines marked as real across two studies. Yet, it is possible that this effect is a product of the experimental setting where individuals are alerted to the possibility that some of the news is fake and prompted to scrutinize each message more thoroughly than they would while leisurely browsing their newsfeed at home.

The reviewed evidence of individuals’ overall credibility perceptions of fake news as compared to real news, as well as of people’s ability to tell one from another, is somewhat contradictory. Several studies that examined participants’ accuracy in discerning real from fake news report estimates that are either below or indistinguishable from random chance: Moravec et al. [ 41 ] report a mean detection rate of 43.9%, with only 17% of participants performing better than chance; in Luo et al. [ 40 ], detection accuracy is slightly better than chance (53.5%) in study 1 and statistically indistinguishable from chance in study 2 (49.2%). Encouragingly, the majority of other studies where respondents were exposed to both real and fake news items provide evidence suggesting that people’s average capacity to tell one from another is considerably greater than chance. In all studies reported in Pennycook and Rand [ 25 ], average perceived credibility of real headlines is above 2.5 on a four-point scale from 1 to 4, while average credibility of fake headlines is below 1.6. A similar distance—about one point on a four-point scale—marks the difference between real and fake news’ perceived credibility in experiments reported in Bronstein et al. [ 42 ]. In Bago et al. [ 43 ], participants rated less than 40% of fake headlines and more than 60% of real headlines as accurate. In Jones-Jang et al. [ 44 ], respondents correctly identified fake news 6.35 attempts out of 10.
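The "better than chance" and "indistinguishable from chance" benchmarks above can be checked with an exact two-sided binomial test against p = 0.5, implementable with the standard library alone. This is a sketch; the 200-trial sample size below is a hypothetical illustration, not a figure taken from the cited studies.

```python
from math import comb

def binom_two_sided(k, n, p=0.5):
    """Exact two-sided binomial test: sum the probabilities of all
    outcomes that are no more likely than the observed count k."""
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    return sum(q for q in pmf if q <= pmf[k] * (1 + 1e-9))

# Hypothetical example: 107 correct identifications out of 200 trials
# (53.5%, roughly the rate in Luo et al. study 1; n is assumed here).
print(round(binom_two_sided(107, 200), 3))  # well above 0.05: consistent with chance
```

At this sample size a 53.5% hit rate is not distinguishable from coin-flipping, which illustrates why such detection rates are reported as being at or near chance.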

Following the aggregate-level assessment, we proceed to describing three main groups of factors that researchers identify as sources of variation in perceived credibility of fake news.

Message-level and environmental factors

When apparent signs of authenticity or fakeness of a news item are not immediately available, individuals can rely on certain message characteristics when making a credibility judgement. Two major message-level factors stand out in this cluster of evidence as most frequently tested (see Table 3 ). Firstly, alignment of the message source, topic, or content with the respondent’s prior beliefs and ideological predispositions; secondly, social endorsement cues. Theoretical expectations within this approach are largely shaped by dual-process models of learning and information processing [ 58 , 59 ] borrowed from the field of psychology and adapted for online information environments. These theories emphasize how people’s information processing can occur through either the more conscious, analytic route or the intuitive, heuristic route. The general assumption traceable in nearly every theoretical argument is that consumers of digital news routinely face information overload and have to resort to fast and economical heuristic modes of processing [ 60 ], which leads to reliance on cues embedded in messages or the way they are presented. For example, some studies that examine the influence of online social heuristics on evaluations of fake news’ credibility build on Sundar’s [ 61 ] concept of bandwagon cues, or indicators of collective endorsement of online content as a sign of its quality. More generally, these studies continue the line of research investigating how perceived social consensus on certain issues, gauged from online information environments, contributes to opinion formation (e. g. Lewandowsky et al. [ 62 ]).

Exploring the interaction between message topic and bandwagon heuristics on perceived credibility of fake news headlines, Luo et al. [ 40 ] find that a high number of likes associated with the post modestly increases (by 0.34 points on a 7-point scale) perceived credibility of both real and fake news compared to few likes. Notably, this effect is observed for health and science headlines, but not for political ones. In contrast, Kluck et al. [ 35 ] fail to find an effect of the numeric indicator of Facebook post endorsement on perceived credibility. This discrepancy could be explained by differences in the design of the two studies: whereas in Luo et al. participants were exposed to multiple headlines, both real and fake, Kluck et al. assessed perceived credibility of just one made-up news story, so the unique properties of this single story may have contributed to the observed result. Kluck et al. further reveal that negative comments questioning the stimulus post’s authenticity do dampen both perceived credibility (by 0.21 standard deviations) and sharing intent. In a rare investigation of news evaluation on Instagram, Mena et al. [ 46 ] demonstrate that trusted endorsements by celebrities do increase credibility of a made-up non-political news post, while bandwagon endorsements do not. Again, this study relies on one fabricated news post as a stimulus. These discrepant results of social influence studies suggest that the likelihood of detecting such effects may be contingent on specific study design choices, particularly the format, veracity, and sampling of stimulus messages. Generalizability and comparability of the results generated in experiments that use only one message as a stimulus could be enhanced by replications that employ stimulus sampling techniques [ 63 ].

Following one of the most influential paradigms in political communication research—the motivated reasoning account postulating that people are more likely to pursue, consume, endorse and otherwise favor information that matches their preexisting beliefs or comes from an ideologically aligned source—most studies in our sample measure the ideological or political concordance of the experimental messages and most commonly use it in statistical models as a covariate or hypothesized moderator. Where they are reported, the pattern of direct effects of ideological concordance largely conforms to expectations, as people tend to rate congenial messages as more credible. In Bago et al. [ 43 ], headline political concordance increased the likelihood of participants rating it as accurate (b = 0.21), which was still meager compared to the positive effect of the headline’s actual veracity (b = 1.56). In Kim, Moravec and Dennis [ 50 ], headline political concordance was a significant predictor of believability (b = 0.585 in study 1; b = 0.153 in study 2), but the magnitude of this effect was surpassed by that of low source ratings by experts (b = -0.784 in study 1; b = -0.365 in study 2). In turn, increased believability heightened the reported intent to read, like, and share the story. In the same study, both expert and user ratings of the source displayed alongside the message influenced its perceived believability in both directions. According to the results of the study by Kim and Dennis [ 14 ], increased relevance and pro-attitudinal directionality of the statement contained in the headline predicted increased believability and sharing intent. Similarly, Moravec et al. [ 41 ] argued that the confirmatory nature of the headline is the single most powerful predictor of belief in false but not true news headlines.
Tsang [ 55 ] found sizable effects of the respondents’ stance on the Hong Kong extradition bill on perceived fakeness of a news story covering the topic in line with the motivated reasoning mechanism.

At the same time, the expectation that individuals will use the ideological leaning of the source as a credibility cue when faced with ambiguous messages lacking other credibility indicators was not supported by the data. Relying on data collected from almost 4000 Amazon Mechanical Turk workers, Clayton et al. [ 45 ] failed to detect the hypothesized influence of motivated reasoning, induced by a right- or left-leaning mainstream news source label, on belief in a false statement presented in a news report.

Several studies tested the effects of factors beyond social endorsement and directional cues. Schaewitz et al. [ 13 ] looked at the effects of such message characteristics as source credibility, content inconsistencies, subjectivity, sensationalism, and the presence of manipulated images on message and source credibility appraisals, and found no association between these factors and the focal outcome variables—against the background of the significant influence of personal-level factors such as the need for cognition. As already mentioned, Luo et al. [ 40 ] found that fake news detection accuracy can also vary by topic, with respondents recording the highest accuracy rates in the context of political news—a finding that could be explained by users’ greater familiarity and knowledge of politics compared to science and health.

One study under review investigated the possibility that news credibility perceptions can be influenced not by the features of specific messages, but by characteristics of a broader information environment, for example, the prevalence of certain types of discourse. Testing the effects of exposure to the widespread elite rhetoric about “fake news,” van Duyn and Collier [ 26 ] discovered evidence that it can dampen believability of all news, damaging people’s ability to identify legitimate content in addition to reducing general media trust. These effects were sizable, with primed participants rating real articles on average 0.47 credibility points lower (on a 3-point scale) than those who had not been exposed to politicians’ tweets about fake news.

As this brief overview demonstrates, the message-level approaches to fake news susceptibility consider a patchwork of diverse factors, whose effects may vary depending on the measurement instruments, context, and operationalization of independent and outcome variables. Compared to research on individual-level factors, scholars espousing this paradigm tend to rely on more diverse experimental stimuli. In addition to headlines, they often employ story leads and full news reports, while the stimulus news stories cover a broader range of topics than just politics. At the same time, out of ten studies attributed to this category, five used either one or two variations of a single stimulus news post. This constitutes an apparent limitation to the generalizability of their findings. To generate evidence generalizable beyond specific messages and topics, future studies in this domain should rely on more diverse sets of stimuli.

Individual-level factors

This strand of research recognizes the differences in people’s individual cognitive styles, predispositions, and conditions as the main source of variation in fake news credibility judgements. Theoretically, these studies largely rely on dual-process approaches to human cognition as well [ 64 , 65 ]. Scholars embracing this approach explain some people’s tendency to fall for fake news by their reliance, either innate or momentary, on less analytical and more reflexive modes of thinking [ 37 , 42 ]. Generally, they tend to ascribe fake news susceptibility to lack of reasoning rather than to directionally motivated reasoning.

Pennycook and Rand [ 25 ] employ the established measure of analytical thinking, the Cognitive Reflection Test, to demonstrate that respondents who are more prone to override intuitive thinking with further reflection are also better at discerning false from real news. This effect holds regardless of whether the headlines are ideologically concordant or discordant with individuals’ views. Importantly, the authors also find that headline plausibility (understood as the extent to which it contains a statement that sounds outrageous or patently false to an average person) moderates the observed effect, suggesting that more analytical individuals can use extreme implausibility as a cue indicating news’ fakeness.

In a 2020 study [ 37 ], Pennycook and Rand replicated the relationship between CRT and fake news discernment, in addition to testing novel measures—pseudo-profound bullshit receptivity (the tendency to ascribe profound meaning to randomly generated phrases) and a tendency to overclaim one’s level of knowledge—as potential correlates of respondents’ likelihood to accept claims contained in false headlines. Pearson’s r ranged from 0.30 to 0.39 in study 1 and from 0.20 to 0.26 in study 2 (all significant at p<0.001 in both studies), indicating modestly sized yet significant correlations. All three measures were correlated with perceived accuracy of fake news headlines as well as with each other, based on which the authors speculated that these measures are all connected to a common underlying trait that manifests as the propensity to uncritically accept various claims of low epistemic value. The researchers labelled this trait reflexive open-mindedness , as opposed to reflective open-mindedness observed in more analytical individuals. In a similar vein, Bronstein et al. [ 42 ] added cognitive tendencies such as delusion-like ideation, dogmatism, and religious fundamentalism to the list of individual-level traits weakly associated with heightened belief in fake news, while analytical and open-minded thinking slightly decreased this belief.
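The Pearson correlations quoted above can be unpacked with a from-scratch computation on invented data. The scores below are made up purely so that the result lands in the reported 0.30–0.39 range; they are not data from the study.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical bullshit-receptivity scores and perceived accuracy of fake
# headlines for five respondents (invented for illustration only).
receptivity = [1.0, 2.0, 2.5, 3.0, 4.0]
perceived_accuracy = [1.2, 2.6, 1.4, 2.1, 1.9]
print(round(pearson_r(receptivity, perceived_accuracy), 2))  # ≈ 0.32
```

A correlation of this size means the trait explains only about 10% of the variance (r² ≈ 0.1), which is why the text describes these associations as modest yet significant.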

Schaewitz et al. [ 13 ] linked the classic concept from credibility research, need for cognition, to the tendency to rate down the credibility (in some models but not others) and accuracy of non-political fake news. This concept overlaps with analytical thinking from Pennycook and Rand’s experiments, yet is distinct in that it captures the self-reported pleasure from (and not just the proneness to) performing cognitively effortful tasks.

Much like the studies reviewed above, experiments by Martel et al. [ 48 ] and Bago et al. [ 43 ] challenged the motivated reasoning argument as applied to fake news detection, focusing instead on the classical reasoning explanation: the more analytic the reasoning, the higher the likelihood to accurately detect false headlines. In contrast to the above accounts, both studies investigate momentary conditions, rather than stable cognitive features, as sources of variation in fake news detection accuracy. In Martel et al. [ 48 ], increased emotionality (as both the current mental state at the time of task completion and the induced mode of information processing) was strongly associated with increased belief in fake news, with induced emotional processing resulting in a 10% increase in believability of false headlines. Fernández-López and Perea [ 49 ] reached similar conclusions about the role of emotion, drawing on a sample of Spanish residents.

Bago et al. [ 43 ] relied on the two-response approach to test the effects of the increased time for deliberation on perceived accuracy of real and false headlines. Compared to the first response, given under time constraints and additional cognitive load, the final response to the same news items for which participants had no time limit and no additional cognitive task indicated significantly lower perceived accuracy of fake (but not real) headlines, both ideologically concordant and discordant. The effect of heightened deliberation (b = 0.36) was larger than the effect of headline political concordance (b = -0.21). These findings lend additional support to the argument that decision conditions favoring more measured, analytical modes of cognitive processing are also more likely to yield higher rates of fake news discernment.
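If the b coefficients quoted for Bago et al. [ 43 ] are read as log-odds (logistic) coefficients—an assumption on our part, though typical when a binary “rated accurate” outcome is modeled—they can be translated into odds ratios to compare magnitudes more intuitively:

```python
import math

# exp(b) converts a log-odds coefficient into an odds ratio.
# Signs are as quoted in the text; the log-odds scale is our assumption.
for name, b in [("deliberation", 0.36), ("political concordance", -0.21)]:
    print(f"{name}: odds ratio ~ {math.exp(b):.2f}")
```

On this reading, heightened deliberation changes the odds of rating a headline as accurate by a factor of about 1.43, versus about 0.81 for political concordance—a larger multiplicative shift, consistent with the authors’ comparison of effect magnitudes.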

Pennycook et al. [ 47 ] provide evidence supporting the existence of the illusory truth effect—the increased likelihood to view already seen statements as true, regardless of their actual veracity—in the context of fake news. In their experiments, a single exposure to either a fake or real news headline slightly yet consistently (by 0.09 or 0.11 points on a 4-point scale) increased the likelihood to rate it as true on the second encounter, regardless of political concordance, and this effect persists for as long as a week.

It is not always how individuals process messages, but how competent they are about the information environment, that affects their ability to resist misinformation. Amazeen and Bucy [ 57 ] introduce a measure of procedural news knowledge (PNK), or working knowledge of how news media organizations operate, as a predictor of the ability to identify fake news and other online messages that can be viewed as deliberately deceptive (such as native advertising). In their analysis, a one standard deviation decrease in PNK increased perceived accuracy of fabricated news headlines by 0.19 standard deviations. Interestingly, Jones-Jang et al. [ 44 ] find a significant correlation between information literacy (but not media and news literacies) and identification of fake news stories.

Taken together, the evidence reviewed in this section provides robust support to the idea that analytic processing is associated with more accurate discernment of fake news. Yet, it has to be noted that the generalizability of these findings could be constrained by the stimulus selection strategy that many of these studies share. All experiments reviewed above, excluding Schaewitz et al. [ 13 ] and Fernández-López and Perea [ 49 ], rely on stimulus material constructed from equal shares of real mainstream news headlines and actual fake news headlines sourced from fact-checking websites like Snopes.com. As these statements are intensely political and often blatantly untrue, the sheer implausibility of some of the headlines can offer a “fakeness” cue easily picked up by more analytical—or simply politically knowledgeable—individuals, a proposition tested by Pennycook and Rand [ 25 ]. While such stimuli preserve the authenticity of the information environment around the 2016 U.S. presidential election, it is unclear what these findings can tell us about the reasons behind people’s belief in fake news items that are less egregiously “fake” and therefore do not carry a conspicuous mark of falsehood.

Accuracy-promoting interventions

The normative foundation of much of the research investigating the reasons behind people’s vulnerability to misinformation is the need to develop measures limiting its negative effects on individuals and society. Two major approaches to countering fake news and its negative effects can be distinguished in the literature under review. The first approach, often labelled inoculation, is aimed at preemptively alerting individuals to the dangers of online deception and equipping them with the tools to combat it [ 44 , 56 ]. The second manifests in tackling specific questionable news stories or sources by labelling them in a way that triggers increased scrutiny by information consumers [ 51 , 54 ]. The key difference between the two is that inoculation-based strategies are designed to work preemptively, while labels and flags are most commonly presented to information consumers alongside the message itself.

Some of the most promising inoculation interventions are those designed to enhance various aspects of media and information literacy. Recent studies demonstrated that preventive techniques—like exposing people to anti-conspiracy arguments [ 66 ] or explaining deception strategies [ 67 ]—can help neutralize harmful effects of misinformation before the exposure. Grounded in the idea that the lack of adequate knowledge and skills among news consumers makes people less critical and, thus, more susceptible to fake news [ 68 ], such measures aim at making online deception-related considerations salient in the minds of large swaths of users, as well as at equipping them with basic techniques that help spot false news.

In a cross-national study that involved respondents from the United States and India, Guess et al. [ 52 ] find that exposing users to a set of simple guidelines for detecting misinformation modelled after similar Facebook guidelines (e.g., “Be skeptical of headlines,” “Watch for unusual formatting”) improves the fake news discernment rate by 26% in the U.S. sample and by 19% in the Indian sample, regardless of whether the headlines are politically concordant or discordant. These effects persist several weeks post-exposure. Interestingly, it might be that the effect is caused not so much by participants heeding the instructions as by simply priming them to think about accuracy. When testing the effects of accuracy priming in the context of COVID-19 misinformation, Pennycook et al. [ 34 ] reveal that inattention to accuracy considerations is rampant: people asked whether they would share false stories appear to rarely consider their veracity unless prompted to do so. Yet, asking them to rate the accuracy of a single unrelated headline before going into the task dramatically improved accuracy and reduced the likelihood to share false stories: the difference in sharing likelihood of true relative to false headlines was 2.8 times higher in the treatment group compared to the control group.
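The “2.8 times” figure is a ratio of sharing discernment—the true-minus-false gap in sharing likelihood—between the treatment and control groups. The sketch below shows the shape of that comparison with invented proportions, not the actual values from Pennycook et al. [ 34 ]:

```python
def sharing_discernment(p_share_true, p_share_false):
    """Gap in sharing likelihood between true and false headlines."""
    return p_share_true - p_share_false

# Hypothetical sharing probabilities, chosen only to illustrate the arithmetic.
control = sharing_discernment(0.45, 0.40)    # veracity barely shapes sharing
treatment = sharing_discernment(0.36, 0.22)  # after a single accuracy prime
print(round(treatment / control, 1))  # prints 2.8
```

Note that the ratio can grow either because true news is shared more or because false news is shared less; in the toy numbers above it is driven by the latter, mirroring the reduced sharing of falsehoods described in the text.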

On a more general note, the latter finding suggests that the results of all experiments that include false news discernment tasks could be biased in the direction of more accuracy simply by virtue of priming participants to think about news’ veracity, compared to their usual state of mind when browsing online news. Lutzke et al. [ 36 ] obtain similar results when they prime critical thinking in the context of climate change news, observing diminished trust and sharing intentions for falsehoods even among climate change doubters.

A study by Roozenbeek and van der Linden [ 56 ] demonstrated the capacity of a scalable inoculation intervention in the format of a choice-based online game to confer resistance against several common misinformation strategies. Over an average of 15 minutes of gameplay, users were tasked with choosing the most efficient ways of misinforming the audience in a series of hypothetical scenarios. Post-gameplay credibility scores of fake news items embedded in the game were significantly lower than pre-test scores according to a one-way repeated-measures test, F(5, 13559) = 980.65, Wilks’ Λ = 0.73, p < 0.001, η² = 0.27. These findings were replicated in a between-subjects design with a control group in Basol et al. [ 69 ], although this study was not included in our sample based on formal criteria.
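As a quick consistency check of the statistics quoted above: for a multivariate test of a single within-subjects effect (hypothesis df of 1, which we assume applies to this design), the multivariate effect size relates to Wilks’ Λ simply as η² = 1 − Λ:

```python
# Effect size implied by the reported Wilks' lambda (assuming a single
# hypothesis df, so that eta squared = 1 - lambda).
wilks_lambda = 0.73
eta_squared = 1 - wilks_lambda
print(round(eta_squared, 2))  # prints 0.27, matching the reported value
```

An η² of 0.27 means the gameplay manipulation accounts for roughly a quarter of the multivariate variance in credibility scores, which is a large effect by conventional benchmarks.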

Fact-checking is arguably the most publicly visible format of real measures used to combat online misinformation. Studies in our sample present mixed evidence of the effectiveness of fact-checking interventions in reducing credibility of misinformation. Using different formats of fact-checking warnings before exposing participants to a set of verifiably fake news stories, Morris et al. [ 53 ] demonstrated that the effects of such measures can be limited and contingent on respondents’ ideology (liberals tend to be more responsive to fact-checking warnings than conservatives). Encouragingly, Clayton et al. [ 51 ] found that labels indicating that a particular false story has been either disputed or rated false do decrease belief in this story, regardless of partisanship. The “Disputed” tag placed next to the story headline decreased believability by 10%, while the “Rated false” tag decreased it by 13%. At the same time, in line with van Duyn and Collier [ 26 ], they showed that general warnings that are not specific to particular messages are less effective and can reduce belief in real news. Finally, Garrett and Poulsen [ 54 ], comparing the effects of three types of Facebook flags (fact-checking warning; peer warning; humorous label), found that only self-identification of the source as humorous reduces both belief and sharing intent. The discrepant conclusions that these three studies reach are unsurprising given the differences in format and meaning of the warnings they test.

In sum, findings in this section suggest that general warnings and the non-specific rhetoric of “fake news” should be employed with caution so as to avoid outcomes opposite to the desired effects. Recent advances in scholarship on the backfire effect of misinformation corrections have called into question the empirical soundness of this phenomenon [ 70 , 71 ]. However, multiple earlier studies across several issue contexts have documented specific instances where attitude-challenging corrections were linked to compounding misperceptions rather than rectifying them [ 72 , 73 ]. Designers of accuracy-promoting interventions should at least be aware of the possibility that such effects could follow.

Overall, while the evidence of the effects of labelling and flagging specific social media messages and sources remains inconclusive, it appears that priming users to think of online news’ accuracy is a scalable and cheap way to improve the rates of fake news detection. Gamified inoculation strategies also hold potential to reach mass audiences while preemptively familiarizing users with the threat of online deception.

Discussion

We have applied a scoping review methodology to map the existing evidence of the effects of various antecedents to people’s belief in false news, predominantly in the context of social media. The research landscape presents a complex picture, suggesting that the focal phenomenon is driven by the interplay of cognitive, psychological, and environmental factors, as well as characteristics of a specific message.

Overall, the evidence under review speaks to the fact that people on average are not entirely gullible, and they can detect deceitful messages reasonably well. While there has been no evidence to support the notion of “truth bias,” i.e., people’s propensity to accept most incoming messages as true, the results of some studies in our sample suggested that under certain conditions the opposite—a scenario that can be labelled “deception bias”—can be at work. This is consistent with some recent theoretical and empirical accounts suggesting that a large share of online information consumers today approach news content with skepticism [ 74 , 75 ]. In this regard, the problem with fake news could be not only that people fall for it, but also that it erodes trust in legitimate news.

At the same time, given the scarcity of attention and cognitive resources, individuals often rely on simple rules of thumb to make efficient credibility judgements. Depending on many contextual variables, such heuristics can be triggered by bandwagon and celebrity endorsements, topic relevance, or presentation format. In many cases, messages’ concordance with prior beliefs remains a predictor of increased credibility perceptions.

There is also consistent evidence supporting the notion that certain cognitive styles and predilections are associated with the ability to discern real from fake headlines. The overarching concept of reflexive open-mindedness captures an array of related constructs that are predictive of the propensity to accept claims of questionable epistemological value, a category of which fake news is representative. Yet, while many of the studies focusing on individual-level factors demonstrate that the effects of cognitive styles and mental states are robust across both politically concordant and discordant headlines, the overall effects of belief consistency remain powerful. For example, in Pennycook and Rand [ 25 ] politically concordant items were rated as significantly more accurate than politically discordant items overall (this analysis was used as a manipulation check). This suggests that individuals may not necessarily be engaging in motivated reasoning, yet still using belief consistency as a credibility cue.

The line of research concerned with accuracy-improving interventions reveals limited efficiency of general warnings and Facebook-style tags. Available evidence suggests that simple inoculation interventions embedded in news interfaces to prime critical thinking and exposure to news literacy guidelines can induce more reliable improvements while avoiding normatively undesirable effects.

Conclusions and future research

The review highlighted a number of blind spots in the existing experimental research on fake news perceptions. Since this literature has to a large extent emerged as a response to particular societal developments, the scope of investigations and study design choices bear many contextual similarities. The sample is heavily skewed toward the U.S. news and news consumers, with the majority of studies using a limited set of politically charged falsehoods for stimulus material. While this approach enhances external validity of studies, it also limits the universe of experimental fake news to a rather narrow subset of this sprawling genre. Future studies should transcend the boundaries of the “fake news canon” and look beyond Snopes and Politifact for stimulus material in order to investigate the effects of already established factors on perceived credibility of misinformation that is not political or has not yet been debunked by major fact-checking organizations.

Similarly, the overwhelming majority of experiments under review seek to replicate the environment where many information consumers encountered fake news during and after the misinformation crisis of 2016, to which end they present stimulus news items in the format of Facebook posts. As a result, there is currently a paucity of studies looking at all other rapidly emerging venues for political speech and fake news propagation: Instagram, messenger services like WhatsApp, and video platforms like YouTube and TikTok.

The comparative aspect of fake news perceptions, too, is conspicuously understudied. The only truly comparative study in our sample [ 52 ] uncovered meaningful differences in effect sizes and decay time between U.S. and Indian samples. More comparative research is needed to specify whether the determinants of fake news credibility are robust across various national political and media systems.

Two methodological concerns also stand out. Firstly, a dominant approach to constructing experimental stimuli rests on the assumption that the bulk of news consumption on social media occurs on the level of headline exposure—i.e. users process news and make sharing decisions based largely on news headlines. While there are strong reasons to believe that it is true for some news consumers, others might engage with news content more thoroughly, which can yield differences in effects observed on the headline level. Future studies could benefit from accounting for this potential divergence. For example, researchers can borrow the logic of Arceneaux and Johnson [ 76 ] and introduce an element of choice, thus enabling comparisons between those who only skim headlines and those who prefer to click on articles to read.

Finally, the results of most existing fake news studies could be systematically biased by the mere presence of a credibility assessment task. As Kim and Dennis [ 14 ] argue, browsing social media feeds is normally associated with a hedonic mindset, which is less conducive to critical assessment of information compared to a utilitarian mindset. This is corroborated by Pennycook et al. [ 34 ] who show that people who are not primed to think about accuracy are significantly more likely to share false news. A small credibility rating task produces massive accuracy improvement, underscoring the difference that a simple priming intervention can make. Asking respondents to rate credibility of treatment news items could work similarly, thus distorting the estimates compared to respondents’ “real” accuracy rates. In this light, future research should incorporate indirect measures of perceived fake and real news accuracy that could measure the focal construct without priming respondents to think about credibility and veracity of information.

Limitations

The necessary conceptual and temporal boundaries that constitute the framework of this review can also be viewed as its limitation. By focusing on a specific type of online misinformation—fake news—we intentionally excluded other variations of deceitful messages that can be influential in the public sphere, such as rumors, hoaxes, conspiracy theories, etc. This focus on the relatively recent species of misinformation led us to apply specific criteria to the stimulus material, as well as to limit the search by the period beginning in 2016. Since belief in both fake news and adjacent genres of misinformation could be driven by the same mechanisms, focusing on just fake news could result in leaving out some potentially relevant evidence.

Another limitation is related to our methodological criteria. We selected studies to review based on the experimental design. Yet, the evidence of how people interact with misinformation may also be generated from questionnaires, behavioral data analysis, or qualitative inquiry. For example, recent non-experimental studies reveal certain demographic characteristics, political attitudes or media use habits associated with increased susceptibility to fake news [ 77 , 78 ]. Finally, our focus on articles published in peer-reviewed scholarly journals means that potentially relevant evidence that appeared in formats more oriented toward practitioners and policymakers could be overlooked. Future systematic reviews can present a more comprehensive view of the research area by expanding their focus beyond the exclusively “news-like” online misinformation formats, relaxing methodological criteria, and diversifying the range of data sources.

Funding Statement

The research was supported by the Russian Scientific Fund Grant № 19-18-00206 (2019–2021) at the National Research University Higher School of Economics. Funder website: https://grant.rscf.ru/enexp/ The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability

  • PLoS One. 2021; 16(6): e0253717.

Decision Letter 0

24 Mar 2021

PONE-D-21-00513

Determinants of individuals’ belief in fake news: A scoping review

Dear Dr. Bryanov,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

The expert reviewers gave generally favorable opinions on the manuscript. However, revision is needed, especially to better justify the inclusion and exclusion of sources, clarify the structure, and elaborate on some methodological and discussion choices.


Kind regards,

Stefano Triberti, Ph.D.

Academic Editor

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please ensure that you refer to Figure 1 in your text as, if accepted, production will need this reference to link the reader to the figure.

3. We note you have included a table to which you do not refer in the text of your manuscript. Please ensure that you refer to Table 2 in your text; if accepted, production will need this reference to link the reader to the Table.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: N/A

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This is a very useful paper. It is also clear and well written. A pleasure to review.

I’m not familiar with the methodologies used to do systematic reviews, thus I cannot properly evaluate the rigor of their methodology. However, from what I understood, their methodology is sound and robust.

I have two minor comments and two personal comments. But the manuscript can be published as is.

Minor comments:

P24: “In sum, findings in this section suggest that the general warnings and non-specific rhetoric of “fake news” should be employed with caution so as to avoid the possible backfire effect.” ---> What do the authors mean by backfire effect? It is used in the literature to mean a lot of different things.

P26: “The line of research concerned with accuracy-improving interventions reveals limited efficiency of general warnings and Facebook-style tags and suggests that simple interventions embedded in news interfaces to prime critical thinking and exposure to news literacy guidelines can induce more reliable improvements while avoiding normatively undesirable backfire effects.” ---> Add commas (e.g. after “tags”) otherwise there is no space to breathe.

Personal comments:

The “truth bias” is, I believe, psychologically implausible (Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., & Wilson, D. (2010). Epistemic vigilance. Mind & Language, 25(4), 359-393). However, it is, for some reason, very influential in the literature on misinformation. The authors’ result regarding the absence of truth bias could be mentioned in the discussion. Indeed, this implausible assumption could influence the way researchers design their experiments and frame their results.

On the other hand, the “deception bias” makes sense in light of what we know about trust in the digital age: the problem is not that people trust fake news sources too much but rather that they don’t trust good sources enough (e.g. Fletcher, R., & Nielsen, R. K. (2019). Generalised scepticism: how people navigate news on social media. Information, Communication & Society, 22(12), 1751-1769 ; Altay, S., Hacquin, AS. & Mercier, H. (2020) Why do so Few People Share Fake News? It Hurts Their Reputation. New Media & Society; Pennycook, G., & Rand, D. G. (2019). Fighting misinformation on social media using crowdsourced judgments of news source quality. Proceedings of the National Academy of Sciences, 116(7), 2521-2526.; Mercier, H. (2020). Not born yesterday: The science of who we trust and what we believe.).

Reviewer #2: Review of PONE-D-21-00513

Reviewers: Stephan Lewandowsky and Muhsin Yesilada

Overall, the paper has clear importance; identifying the determinants of fake news beliefs can have useful implications for targeted interventions. The authors mapped out factors that could affect the outcome variables set out in the included studies. These factors were: message-level factors, individual-level factors, and intervention & ecological factors (although it was hard to determine how they identified these three factors).

The research team decided to use precedents to guide their scoping review (such as the PRISMA review guidelines). Still, there were major issues with the presentation of evidence. For example, certain aspects of the methodology would have benefited from being in the results or even discussion section. There was also a lack of comprehensive coverage of certain research areas (such as fake news interventions). These issues are described below.

Overall, the submission requires major revision before a favourable opinion can be given.

Major Points:

1. The number of studies included in the scoping review that investigate fake news interventions was limited. The intervention section highlights the prominence of inoculation-based research in this context; however, we noticed some studies that could have been included but that were not (e.g., Basol, Roozenbeek, & van der Linden, 2020; Roozenbeek & van der Linden, 2019). These studies should be included to provide a comprehensive overview of the research area.

2. The discussion needs further organization into subsections to make it clear to the reader where to locate information. There is no real conclusion subsection which makes it difficult to tie together the report's implications and findings. Also, much of the methods section (page 7-9) seems to assess and evaluate the included studies' methodological decisions rather than describing the review's methodology. Although this information is valuable, it is perhaps better suited for the results or discussion section.

3. It is not entirely clear how the research team identified the three-factor groups (message-level factors, individual-level factors, and intervention factors). Were these groups based on a precedent, or is there consensus in the research area that factors can be categorized into these three groups? It is important to have this information to justify the methodology and determine if any potentially important factors were missed. Also, the intervention factor group is paired with ecological factors - it is not entirely clear what the research team means by "ecological factors".

4. Table three, which sets out the key methodological aspects and results of the included studies, needs more procedural information. At present, it is not easy to interpret how the included studies might have arrived at their results. This information could provide a more comprehensive summary of the research area, particularly for people who might want to know more about the commonalities amongst procedures across studies.

Detailed Comments

[page]:[para]:[line]

2:1:3 The end of this sentence requires a citation. A study or report that has mapped out propaganda, misinformation, and deception in the public sphere over time would be relevant to cite here.

2:1:6-7 The authors state, "a lack of consensus over the scale and persuasiveness of the phenomenon". However, it is not clear what they are referring to. We assume they are discussing the persuasiveness of misinformation in general. It is also unclear how the references they cited support such a claim.

3:1:1-5 The authors refer to a "massive spread of politically charged deception". To a certain extent, the word massive is subjective and does not point to the problem's true extent. With that in mind, some statistics or figures would be helpful.

3:1:7 The authors note the "hypothesized societal effects" of deceitful messages; however, they do not explain what these societal effects might be. There are some studies out there that have investigated the causal effects of misinformation. These studies might be a good idea to cite to set the scene for the study. See below for a selection:

Schaub, M., & Morisi, D. (2020). Voter mobilisation in the echo chamber: Broadband internet and the rise of populism in europe. European Journal of Political Research, 59(4), 752–773. https://doi.org/10.1111/1475-6765.12373

Bursztyn, L., Egorov, G., Enikolopov, R., & Petrova, M. (2019). Social media and xenophobia: evidence from Russia (No. w26567). National Bureau of Economic Research.

Allcott, H., Braghieri, L., Eichmeyer, S., & Gentzkow, M. (2020). The welfare effects of social media. American Economic Review, 110(3), 629–76. DOI: 10.1257/aer.20190658

3:2:4 Citation needed to support this claim.

3:2:6 The authors refer to a "focal issue", but it is not entirely clear what the focal issue is.

4:1:5 The sentence starts with "because", but because of what? Consider re-writing for clarity.

4:3:2 The authors use the term "inductively developed framework", it would be good if the authors described what this means.

5 The eligibility criteria would benefit from being placed in a table. At present, the criteria are embedded in the text, and it does not make it easy for the reader to identify the information.

5:1:2 The time frame for the included studies should be explained. We assume the time frame for the included studies starts in 2016 to coincide with the initial Trump presidential campaign, but this an assumption. Further clarification would help.

5:1:7-12 This sentence is far too long; consider rewording or breaking it down into several sentences.

5:1:12-14 The search started from studies already known to the researchers; cite this here for clarity.

5:2 This sentence is too long and would benefit from shortening. Also, the paragraph states that the trio of databases would most likely yield the most comprehensive results - but why? This needs to be clearly explained.

6:3:4 The authors do not explain why they chose these outcome variables.

8:1:1 The authors wrote, "As visible from the table", but do not state which table they refer to.

10:1 This paragraph would be a good place to explain how the factor groups were identified.

10:2:1 Avoid using rhetorical questions.

11:1:3-5 This sentence is wordy and unclear; consider re-writing.

11:2:10-12 It is unclear what these numbers mean concerning the scale.

16:1:3 The authors note that "two major message level factors stand out"; however, it is not explained why these two stand out in particular.

17:1:1-5 This sentence is too long, consider rewording or breaking it down into several sentences.

17:1:7-10 This sentence is not clear on its own. Another sentence is needed to explain why these methodological differences lead to differing results.

17:1:16-18 It is argued that the differences in findings might be down to different study design choices - however, this needs more unpacking. The sentence alone does not explain why.

17:1:1-5 sentence is too long; consider rewording.

19:2:2 The authors use the terms "vary dramatically"; however, it is unclear what this means exactly; some figures or further quantification would be handy here.

19:2:8 The authors note an apparent limitation, but further explanation is needed to determine why it is a limitation.

19:3:3-4 Citation needed.

19:3:4-6 Citation needed.

20:2:5-7 The authors state that the correlations are statistically significant but do not provide an indicator of significance.

23:1:2 Citation needed.

23:2:2-5 Citation needed.

23:3:1 What were the guidelines?

23:3:11-12 "2.8 times less people were willing to share fake news following the treatment than before the treatment." - It is not clear what this statistic means and how it was identified.

24:3:9-10 What are the flags? We assume they are materials in a study but this is not entirely clear.

25:2:1-3 The authors discuss avoiding backfire effects, but research surrounding backfire effects is complicated. The current understanding is that backfire effects are not nearly as much of a concern as once thought - these recent findings should be reflected in this paragraph. (e.g., see Swire et al., 2020, DOI: 10.1016/j.jarmac.2020.06.006).

27:2:2 What is meant by a "common decision environment"? A definition here would be useful.

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy .

Reviewer #1:  Yes:  Sacha Altay

Reviewer #2:  Yes:  Muhsin Yesilada and Stephan Lewandowsky

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Author response to Decision Letter 0

14 May 2021

Dear Drs. Altay, Lewandowsky, and Yesilada,

We are extremely grateful for your insightful and generously detailed feedback to our work. Based on your comments and suggestions, we have introduced some major changes to our report’s structure and evidence presentation. We believe that what resulted from this collaborative effort is a significantly improved manuscript. Our detailed responses to your comments, in the order we have received them, are listed in the table that can be found in an enclosed file entitled Response to Reviewers. We hope that you will find these responses sufficient.

The authors.

Submitted filename: Response to reviewers.docx

Decision Letter 1

PONE-D-21-00513R1

Some minor modifications have been suggested by the previous Reviewers. I believe these could be addressed in a short time to improve the completeness of the manuscript.

Please submit your revised manuscript by Jul 16 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see:  http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols . Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at  https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols .

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: All comments have been addressed

2. Is the manuscript technically sound, and do the data support the conclusions?

3. Has the statistical analysis been performed appropriately and rigorously?

4. Have the authors made all data underlying the findings in their manuscript fully available?

Reviewer #1: No

5. Is the manuscript presented in an intelligible fashion and written in standard English?

6. Review Comments to the Author

Reviewer #1: The authors did a great job at addressing my (minor) comments, I am now satisfied with the manuscript.

It's too bad that the very small scale of the fake news problem is not mentioned in the introduction (e.g. Allen et al. 2020) but the article is, I believe, good enough to be published as is.

Finally I would like to thank the authors for their work, it's a very useful paper!

Reviewer #2: Review of MS PONE-D-21-00513-R1

by Bryanov & Vziatysheva

Reviewer: Stephan Lewandowsky

Summary and Overall Recommendation

The paper is clearly important; identifying the determinants of fake news beliefs can have implications for successful interventions. The authors mapped out factors that could affect the outcome variables set out in the included studies. These factors were: message-level factors, individual-level factors, and intervention & ecological factors (although it was hard to determine how they identified these three factors).

I reviewed the paper at the previous round (together with a PhD student whom I did not consult at this round to save time). Our judgment was positive in principle, but we requested major revisions, in particular relating to (1) the small number of studies; (2) clarity of the discussion and (3) the three factors being identified; and (4) expansion of the main Table (Table 3 in the original submission).

The revision has addressed these points and I found the manuscript to be much improved and (nearly) ready for publication, subject to the minor comments below.

Detailed comments

165 I am not entirely clear why “cognitive science research on false memory recognition” would be “obviously irrelevant”?

173 Does “non-experimental” mean the authors excluded correlational studies? I would have thought most individual-differences research may involve correlational studies that do not include an experimental intervention. Perhaps the authors mean “non-empirical”? If they did exclude correlational studies I would be curious to know why.

23 The authors may be interested in DOI 10.3758/s13421-019-00948-y as another demonstration of social influences (although it is only indirectly related to fake news because the study compares pro- and anti-science blog posts).

387 Insert paragraph break before “As this…”.

392-394 This sentence is ungrammatical.

504 “the Indian sample” pops out of nowhere—this deserves a bit more explanation. Why India? What can be learned from this?

579 “messages” should be plural?

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

Reviewer #2:  Yes:  Stephan Lewandowsky

Author response to Decision Letter 1

Dear Drs. Altay and Lewandowsky,

We are grateful for your continued contributions to the improvement of our work. The revised manuscript addresses each comment you have raised in the latest round of review. Further details on the changes we have made can be found in the table appended to the enclosed file, titled Response to reviewers. We hope that you will deem the resulting manuscript fit for publication.

Decision Letter 2

11 Jun 2021

PONE-D-21-00513R2

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Additional Editor Comments (optional):

Acceptance letter

16 Jun 2021

Determinants of individuals’ belief in fake news: A scoping review

Dear Dr. Bryanov:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

PLOS ONE Editorial Office Staff

on behalf of

Dr. Stefano Triberti


How to Identify Fake News


What is fake news?

Fake news refers to false or misleading information that masquerades as legitimate news. Generally, fake news falls into two categories:

  • Deliberately inaccurate stories – that is, the people publishing them know them to be false but publish them anyway. This might be done to manipulate public opinion or to drive traffic to a specific website.
  • Stories that contain elements of truth but are broadly inaccurate. This might be because the writer hasn’t checked all their facts or has exaggerated certain aspects to make a particular point.

Misinformation isn’t a new phenomenon – the term “fake news” was actually used in the 19th century – but the internet and social media have transformed how it’s created and spread. Pre-internet, people tended to receive their news from trusted media sources whose journalists were required to follow strict codes of practice. The internet enabled new ways to publish, share, and consume news and information, with relatively little regulation or editorial standards. Many people now consume news from social media and other online sources – but it’s not always easy to determine which stories are credible and which are false.

Types of fake news

There are different types of fake news, depending on the motivation of those who create it. For example:

Clickbait

Sensationalism sells, and outlandish or weird stories and distorted images drive clicks and shares online. Clickbait refers to stories deliberately designed to attract more website visitors and increase advertising revenue for website owners – often at the expense of truth and accuracy.

Propaganda

This refers to false or distorted stories written to deceive audiences and promote a political agenda or biased perspective.

Poor-quality journalism

Sometimes, journalists don’t have time to check all their facts before publishing, leading to genuine mistakes becoming fake news. However, trusted news sources will correct errors in their stories and be transparent with readers when they’ve got things wrong.

Misleading headlines

Sometimes a story may be broadly true, but a sensationalist or misleading headline is used to entice readers to click on it. This can lead to fake news – since usually only the headline and small snippets of the article are displayed on social media, where it can quickly spread.

Imposter content

This is when genuine news sources are impersonated with false, made-up stories to deceive or mislead audiences.

Satire or parody

Some fake news is published for entertainment value. For example, satirical stories use humor, irony, or exaggeration to joke about the news or famous people. These stories don’t attempt to mislead audiences because they aren’t meant to be taken seriously. Notable examples of satirical websites include The Onion and The Daily Mash.

High-profile politicians have been known to dismiss stories they disagree with – which may be factual and verified – as “fake news.” Because the term “fake news” is broad and means different things to different people, it can be contested. In 2018, the British government banned the term from official papers and documents, claiming it was too poorly defined to be meaningful. Instead, it prefers the terms “misinformation” and “disinformation” when describing false stories:

  • Disinformation – fake or misleading stories created and shared deliberately, often by a writer who may have a financial or political motive to do so.
  • Misinformation – this also means fake or misleading stories, but in this case, the stories may not have been deliberately created or shared with the intention to mislead.

How does fake news work?

Fake news is often spread through fake news websites which, in an attempt to gain credibility, often emulate authentic news sources. According to research, social media enables false claims to spread quickly – more quickly, in fact, than real news. Fake news spreads fast because it’s typically designed to grab attention and appeal to emotions – which is why it often features outlandish claims or stories that provoke anger or fear.

Social media feeds often prioritize content based on engagement metrics – that is, how often it’s shared and liked – rather than how accurate or well-researched it is. This approach can allow clickbait, hyperbole, and misinformation to spread widely. Social media companies are treated as platforms rather than publishers, which means they don’t have the same legal liabilities as traditional media outlets – although this may change as the political and legal landscape evolves.
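The difference between engagement-based prioritization and credibility-aware ranking can be illustrated with a small, entirely hypothetical sketch. This is not any platform's actual algorithm; the `Post` class, scoring weights, and credibility values are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    shares: int
    likes: int
    source_credibility: float  # invented score: 0.0 (unvetted) to 1.0 (well-vetted)

def engagement_score(post: Post) -> float:
    # Ranks purely by interactions: sensational content tends to win.
    return post.shares * 2.0 + post.likes

def credibility_weighted_score(post: Post) -> float:
    # Same engagement signal, discounted by an editorial credibility estimate.
    return engagement_score(post) * post.source_credibility

feed = [
    Post("Outlandish claim goes viral", shares=900, likes=1500, source_credibility=0.1),
    Post("Carefully reported investigation", shares=200, likes=800, source_credibility=0.9),
]

by_engagement = sorted(feed, key=engagement_score, reverse=True)
by_credibility = sorted(feed, key=credibility_weighted_score, reverse=True)
print(by_engagement[0].title)   # the viral claim ranks first
print(by_credibility[0].title)  # the investigation ranks first
```

Under pure engagement scoring, the sensational post dominates (3300 vs. 1200); once the same signal is discounted by credibility, the ordering flips. Real ranking systems use far richer signals, but the structural point is the same: what the score rewards is what spreads.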

Social media bots can spread fake news since they mass-produce and spread articles, regardless of the credibility of their sources. Bots can create fake accounts online, which then gain followers, recognition, and authority – some of which are programmed to spread misinformation.
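Mass production is itself a detectable signature. As a toy illustration only – real bot detection uses far richer behavioral signals, and the thresholds below are invented – an account that posts implausibly often or mostly repeats content can be flagged:

```python
def looks_bot_like(posts_per_day: float, duplicate_ratio: float) -> bool:
    """Toy heuristic: flag accounts that post implausibly often or mostly
    repeat content.

    duplicate_ratio: fraction of the account's posts that are near-identical
    to other posts (0.0 to 1.0). Thresholds are illustrative, not empirical.
    """
    return posts_per_day > 50 or duplicate_ratio > 0.8

print(looks_bot_like(posts_per_day=120, duplicate_ratio=0.9))  # True: mass-producing account
print(looks_bot_like(posts_per_day=3, duplicate_ratio=0.1))    # False: ordinary usage pattern
```

The point of the sketch is the principle, not the numbers: bot networks trade subtlety for volume, so volume-based signals are often where detection starts.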

Trolls – internet users who deliberately try to start arguments or upset people – also play a part in spreading fake news. Often they can be paid to do so for political reasons. The terms “troll farm” or “troll factory” are sometimes used in this context to refer to institutionalized groups of trolls who attempt to interfere in political decision-making.

Fake news sometimes involves the use of deepfakes. These are fake videos created using digital software, machine learning, and face-swapping. Images are combined to create new footage that shows events or actions that never actually took place. The results can be very convincing and difficult to identify as false.

Fake news examples


Coronavirus fake news

The Covid-19 pandemic provided fertile ground for false information online, with numerous examples of fake news throughout the crisis. A persistent piece of fake news on social media claimed that 5G technology was linked to the spread of the virus – supposedly because 5G suppressed the immune system while the virus communicated through radio waves. These claims were not true and were repeatedly debunked by official sources but were still shared extensively.

US presidential election in 2016

Fake news and misinformation became a big issue during the US election in 2016, with false and misleading claims across the political spectrum. One analysis suggested that a large proportion of the fake news generated during the election was created by teenagers in Macedonia, who found that the more hyper-partisan stories they created, the more people clicked through and shared, and the more money they made as a result.

Boston Marathon bombing

In the wake of the Boston Marathon bombing in 2013, false claims that the incident was an elaborate ruse staged by the US government circulated online. In the aftermath of many terrorist events across the world, conspiracy theories are often popular. The notion that they are "false flag" operations – i.e., carried out by the state or a secret cabal to pin the blame on others or provide cover for other activities – remains a common theme.

Kim Jong-un – the sexiest man alive?

In 2012, satirical website The Onion ran an article claiming the North Korean dictator Kim Jong-un had been voted the sexiest man alive, declaring that "the Pyongyang-bred heartthrob is every woman's dream come true." In an example of how satire can be misunderstood across cultures, publications in China – including the online version of China's Communist Party newspaper – reported the claim as though it were true.

What are the dangers of fake news?

People often make important decisions – for example, how to vote in an election or what medical treatment to follow when they’re ill – based on what they read in the news. That’s why trusted news is so important. The dangers of fake news include:

  • When people can’t distinguish between real and fake news, it creates confusion and misunderstanding about important social and political issues. When people have a generalized sense of “you can’t believe anything you read,” it undermines overall trust in legitimate news sources.
  • Fake and misleading stories relating to medical treatments and major illnesses – such as cancer or Covid-19 – could lead to individuals making misinformed decisions about their health.
  • A lot of fake news is designed to stir up and intensify social conflict. When different sides of an argument have their own ‘facts’, it leads to greater polarization within societies and can affect electoral outcomes.
  • Universities and colleges expect students to use quality sources of information for assignments. Students who use sources with false or misleading information could receive lower grades.

How to identify fake news

You may be wondering how to identify fake news on Facebook and other social media sites. As a student, how do you avoid fake news? And how do you avoid accidentally sharing misinformation online? Here are ten tips to identify misinformation, recognize fake news websites, and think before you share:

1. Check the sources:

Check the web address for the page you're looking at. Sometimes, fake news sites may have spelling errors in the URL or use less conventional domain extensions such as ".infonet" or ".offer". If you are unfamiliar with the site, look in the About Us section.
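To make this concrete, here is a minimal Python sketch of such a URL check. The list of unusual extensions and the `looks_suspicious` helper are illustrative assumptions for this example, not a real blocklist:

```python
from urllib.parse import urlparse

# Illustrative examples only -- not a real or complete blocklist.
SUSPICIOUS_EXTENSIONS = (".infonet", ".offer")

def looks_suspicious(url: str) -> bool:
    """Flag URLs whose hostname ends in an unusual domain extension."""
    host = urlparse(url).hostname or ""
    return host.endswith(SUSPICIOUS_EXTENSIONS)

print(looks_suspicious("https://bbc-news.offer/article"))  # True
print(looks_suspicious("https://www.bbc.co.uk/news"))      # False
```

A real checker would also look for look-alike hostnames and misspellings, but even this simple heuristic shows how the manual check can be automated.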

2. Check the author:

Research them to see whether they are credible – for example, are they real, do they have a good reputation, are they writing about their specific area of expertise, and do they have a particular agenda? Consider what the writer’s motivation might be.

3. Check other sources:

Are other reputable news or media outlets reporting on the story? Are credible sources cited within the story? Professional global news agencies have editorial guidelines and extensive resources for fact-checking, so if they are also reporting the story, that’s a good sign.

4. Maintain a critical mindset:

A lot of fake news is cleverly written to provoke strong emotional reactions such as fear or anger. Maintain a critical mindset by asking yourself – why has this story been written? Is it promoting a particular cause or agenda? Is it trying to make me click through to another website?

5. Check the facts:

Credible news stories will include plenty of facts – data, statistics, quotes from experts, and so on. If these are missing, question why. Reports with false information often contain incorrect dates or altered timelines, so it’s a good idea to check when the article was published. Is it a current or an old news story?
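The date part of this check is easy to automate. The short Python sketch below flags stories older than a cutoff; the 180-day threshold is an arbitrary assumption chosen for illustration:

```python
from datetime import date, timedelta
from typing import Optional

def is_stale(published: date, max_age_days: int = 180,
             today: Optional[date] = None) -> bool:
    """Return True if an article is older than the given threshold."""
    today = today or date.today()
    return (today - published) > timedelta(days=max_age_days)

# A story from early 2020 resurfacing in late 2021 is old news:
print(is_stale(date(2020, 3, 1), today=date(2021, 11, 1)))  # True
```

Old stories resurfacing as if new is a common misinformation pattern, so a simple age check like this catches more than you might expect.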

6. Check the comments:

Even if the article or video is legitimate, the comments below may not be. Often links or comments posted in response to content can be autogenerated by bots or by people hired to put out misleading or confusing information.

7. Check your own biases:

We all have biases – could these be influencing the way you respond to the article? Social media can create echo chambers by suggesting stories that match your existing browsing habits, interests, and opinions. The more we read from diverse sources and perspectives, the more likely it is that we can draw accurate conclusions.

8. Check whether it’s a joke:

Satirical websites are popular, and sometimes it is not clear whether a story is just a joke or parody. Check the website to see whether it’s known for satire or creating funny stories.

9. Check that images are authentic:

Images you see on social media could have been edited or manipulated. Possible signs include warping – where straight lines in the background now appear wavy – as well as strange shadows, jagged edges, or skin tone that looks too perfect. Bear in mind, too, that an image may be accurate but simply used in a misleading context. You can use tools such as Google’s Reverse Image Search to check where an image originates from and whether it has been altered.
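Reverse image search tools typically compare perceptual hashes, which give similar images similar fingerprints even after resizing or mild edits. The toy average-hash below works on a tiny grayscale pixel grid rather than a decoded image file; it is a sketch of the idea, not a production detector:

```python
def average_hash(pixels):
    """Toy perceptual hash: each bit records whether a grayscale pixel
    is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Number of differing bits; small distances suggest near-duplicates."""
    return sum(x != y for x, y in zip(a, b))

original   = [[10, 200], [220, 30]]
brightened = [[20, 210], [230, 40]]   # mild edit: same light/dark pattern
unrelated  = [[200, 10], [30, 220]]   # different image

print(hamming(average_hash(original), average_hash(brightened)))  # 0
print(hamming(average_hash(original), average_hash(unrelated)))   # 4
```

Real tools first decode and downscale the image and use larger grids, but the principle is the same: a brightened copy hashes identically, while a genuinely different image is far away in Hamming distance.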

10. Use a fact-checking site:

Some of the best known include:

  • BBC Reality Check

Fake news relies on readers reposting, retweeting, or otherwise sharing false information. If you're not sure whether an article is authentic, pause and think before you share. To help stay safe online, use an antivirus solution like Kaspersky Total Security , which protects you from hackers, viruses, malware, and other online threats.

Related Articles:

  • Deepfake And Fake Videos
  • What Is Doxing
  • Most Secure Messaging Apps
  • How to Avoid Social Engineering Attacks


Lesson 3: Bias

Lesson Plan

Bias can sneak into any news story, influencing an unsuspecting audience. Bias often seems like the boogeyman of the news industry, but is it really so terrifying? This lesson strips the fear out of bias by showing students how to notice the word choices and framing that show up when bias is present in a news story. Students learn about methods journalists use to produce high-quality objective reporting to see how journalists address bias and present stories from neutral viewpoints.

Web Activity Link:  https://www.icivics.org/node/2518289  (To assign, go here .)

Got a 1:1 classroom?  Download fillable PDF versions of this lesson's materials below!

This resource was created with support from the Raab Family Foundation.

Pedagogy Tags

ELA-literacy

Teacher Resources

Get access to lesson plans, teacher guides, student handouts, and other teaching materials.

  • Bias_Lesson Plan.pdf
  • Bias_StudentDocs.pdf

I find the materials so engaging, relevant, and easy to understand – I now use iCivics as a central resource, and use the textbook as a supplemental tool. The games are invaluable for applying the concepts we learn in class. My seniors LOVE iCivics.

Lynna Landry , AP US History & Government / Economics Teacher and Department Chair, California

Related Resources

Lesson 1: Journalism

Lesson 2: Misinformation

Lesson 4: Opinion & Analysis

Mini-Lesson A: Monetization

Mini-Lesson B: Satire

Mini-Lesson C: Algorithms & You

Mini-Lesson D: Privacy Policies & You

NewsFeed Defenders

NewsFeed Defenders Extension Pack

See how it all fits together!

COMMENTS

  1. fake news Flashcards

    What are the three definitions of fake news? Lies spread for a political agenda or for profit; news that people do not want to believe; and false stories that appear to be news.

  2. 5 Activities to teach your students how to spot fake news

    As a homework assignment, ask students to investigate a current fake news story and compare their findings to the research. You might ask them to upload a brief response on an LMS forum, blog, or digital portfolio as an additional exercise. Or, perhaps, to create a poster to advertise to their peers why they should not fall for it. 5. Make up a ...

  3. Misinformation and Fake News Lesson Plan

    Lesson Plan. Reliable news outlets always answer the question "How do we know?". Train your students to examine news stories for evidence of transparency and verification that will help them distinguish legitimate news from unreliable information or "fake news.". Students practice spotting misinformation and learn fact-checking tricks ...

  4. Ten Questions to Ask About Fake News

    Fake news has a long history. If you include opinion columns in your discussion, you can point back to Swift's Modest Proposal and then jump to contemporary pieces. If you want to explore the difference between satire and misinformation, Swift is a strong starting point. Once students think about the situation that led to Swift's satirical ...

  5. "Fake News" Resources

    How to Spot Fake News and Find the Facts. Written by Teaching Kids News' co-founder, Joyce Grant and beautifully illustrated by Kathleen Marcotte; published by Kids Can Press in 2022 and suitable for young people 9 to 12 as well as classrooms. You can buy this illustrated non-fiction book in most independent bookstores or from one of the big ...

  6. PDF How to Teach Your Student About Fake News Lesson Plan

    News Literacy Project, NAMLE, Media Education Lab, and the Center for Media Literacy. Extension Activities Fake news might be a case of history repeating itself. Check out the role fake news has played in U.S. history in this Washington Post piece: Fake News? That's a very old story. Who are some of the people behind fake news? What would make ...

  7. Fake news and the spread of misinformation: A research roundup

    Summary: "The rise of fake news highlights the erosion of long-standing institutional bulwarks against misinformation in the internet age. Concern over the problem is global. However, much remains unknown regarding the vulnerabilities of individuals, institutions, and society to manipulations by malicious actors.

  8. Hoaxes and Fakes

    How can you avoid being fooled by fake videos and other information online? Check out Hoaxes and Fakes, a free digital citizenship lesson plan from Common Sense Education, to get your grade 9 students thinking critically and using technology responsibly to learn, create, and participate.

  9. PDF FAKE NEWS ASSIGNMENT

    The instructor provides detailed assignment instructions. Students attend an in-class workshop where a librarian addresses the topics of fake news and media literacy. Students also have the opportunity to ask the instructor questions about the assignment. Students receive 6-7 articles to get them started on their research.

  10. Assignments on Evaluating Sources

    C-SPAN Classroom: Lesson idea: Media Literacy and Fake News. SchoolJournalism.com: News and media literacy lessons. Walsh-Moorman, Elizabeth and Katie Ours. Introducing lateral reading before research. MLA Style Center. (Objectives include identifying the credibility and/or bias of a source, and identifying how professional fact-checkers assess information vs. a general audience.)

  11. Fake news, disinformation and misinformation in social media: a review

    Social media outperformed television as the major news source for young people in the UK and the USA. Moreover, as it is easier to generate and disseminate news online than with traditional media or face to face, large volumes of fake news are produced online for many reasons (Shu et al. 2017). Furthermore, it has been reported in a previous study about the spread of online news on Twitter ...

  12. Detecting Fake News Flashcards

    the impact of desire on beliefs. context. the situation surrounding an event. credibility. the quality of being trustworthy. deep fake. machine learning technology that manipulates or fabricates audio and/or video recordings to show people doing or saying things that they never said or did. Fringe source.

  13. Detecting Fake News

    Fake News Assignment. Death by Vaccines. With the growth of social media, fake news websites are appearing with greater frequency. This has spurred the rapid spread of misinformation on topics regarding vaccines, food safety, global warming, and many other topics. Students need to be able to evaluate these news sites and sources. This assignment ...

  14. Fake News Unit

    This Fake News Unit contains 10 high-interest lessons about media literacy, fake news and digital citizenship to help 21st-century students learn to think critically about the information they consume from print and digital media sources. ... Answer Key: Included with rubric Teaching Duration: 2 Weeks File Size: 31.7 MB File Type: PDF (Zip ...

  15. Determinants of individuals' belief in fake news: A scoping review

    Attitudinal questions → Random assignment to a fake news story mimicking a WhatsApp post in one of three conditions (Source: legacy News outlet/no source/online forum) → Outcome measurement. 1) Perceived news fakeness: scales (1-5) indicating whether the news was (a) invented, (b) fabricated, and (c) could be considered fake news;

  16. Solved Assignment, watch and note key points with the videos

    Question: Assignment: watch and note key points from the videos listed below. Video Clip: "Fake News" Sites and Effects on Democracy (4:48). Description: New York Magazine's Max Read discusses his piece examining the rise of "fake news" and whether the internet is a reliable tool for furthering democracy. Satire vs Fake News (1:35): Cole Bolton and Chad Nackers

  17. How to Identify Fake News / journalism, 'fake news' & disinformation

    Fake news is information that is false or misleading. Learn about fake news examples, the dangers of fake news & how to identify misinformation.

  18. Finding Credible News

    Vocabulary Show definitions. bias · corroboration · credible · evaluate. bias - showing a strong opinion or preference for or against something or someone. corroboration - an additional source that confirms or supports a news story, article or piece of information. credible - able to be believed; trustworthy.

  21. Lesson 3: Bias

    This lesson strips the fear out of bias by showing students how to notice the word choices and framing that show up when bias is present in a news story. Students learn about methods journalists use to produce high-quality objective reporting to see how journalists address bias and present stories from neutral viewpoints.
