In advance of National Media Literacy Week (October 27-31), we invite readers to spend some time with these insights and resources prepared by a leading national media literacy educator.
by Frank W. Baker
What do we want students to know and learn when it comes to social media, artificial intelligence, and all of the ways today’s media and technology can be influential and deceptive?
As educators you no doubt have your work cut out for you. That was just one of the messages I delivered when I spoke to Ohio school librarians this past summer at a meeting themed around the importance of fact checking.
As a long-time media literacy educator, I had plenty to share, and I am grateful to MiddleWeb for giving me this opportunity to provide you with some highlights of my talk and a list of resources I believe will be helpful as educators work to support students in an increasingly “deep fake” world.
It Takes All of Us
Traditionally the school librarian has been the primary person who took on the task of teaching information and media literacy. Today the responsibility to address the risks attached to new and emerging technologies must be shared more widely.
Every teacher in every discipline has opportunities to ensure their students are media literate. And in schools that (sadly) no longer fund a librarian/media specialist, teachers should be discussing how they can share this obligation effectively.
With the number of AI-created deepfakes exploding across every social media platform and algorithms influencing what we see (and what we know) in every corner of the Internet, teaching media and visual literacy skills has become a top priority. We may not always know ourselves if what we are seeing is real or true, but we can certainly encourage students to be more skeptical and discerning.
With that in mind, the Columbia Journalism Review recently produced a short video designed to challenge us all to question AI-manipulated images. According to CJR it “was created by animating AI generated images that were widely circulated and believed to be real.”
Sounding the Alarm
Another source, the World Economic Forum, has identified misinformation and disinformation as a top global risk. The WEF said: “Misinformation and disinformation…may fuel instability and undermine trust in governance, complicating the urgent need for cooperation to address shared crises.” (Source)
A global study released in April 2025 found that members of Generation Z (aged 13-28) appear to be the group most vulnerable to misinformation. The researchers blame the low quality of information these younger consumers encounter day in and day out. (Source)
In August 2024, in an article published in the Journal of Experimental Physiology, two U.K. scientists sounded the alarm about the misrepresentation of scientific research and information. They warned that “the truth is under attack” and highlighted the urgent need for critical thinking and scientific literacy to combat the rising problem. (Source)
When a 2019 Stanford University report indicated that most high school students were unprepared to judge the credibility of online information, researcher Sam Wineburg called the results “disturbing” and “troubling.” In response, the Stanford History Education Group created the Civic Online Reasoning curriculum to help educators prepare students to better evaluate what they encounter and consume online.
Wineburg and University of Washington professor Mike Caulfield went on to co-author Verified: How to Think Straight, Get Duped Less, and Make Better Decisions about What to Believe Online, a book that I highly recommend. I would urge you to examine a copy and consider how their guidance might apply to your instruction.
Where Do Students Get Their News?
When young people get their news from social media, typically they don’t know, nor do they question, the source or the reliability of what they read there.
A recent peer-reviewed study published in the journal Journalism and Media found that individuals with low critical thinking skills – especially young adults – are significantly more vulnerable to fake news generated by artificial intelligence (AI). (Source)
Several recent surveys have found that while many young people rely on social media for their news, when asked they say they do not regard it as a trusted source. This seeming paradox underscores the need to engage students in discussion. We can help them connect the importance of seeking out trustworthy news sources with the impact that current events will have on their own quality of life, now and in the future.
To encourage students to seek out news they can trust, we need a renewed focus on media literacy education, including verifying information, recognizing potential biases, and understanding how social media algorithms shape news consumption.
Students should be taught to cross-reference information with multiple, reputable sources, including mainstream news outlets and nonprofit online newsrooms. Additionally, encouraging critical thinking and skepticism towards sensational headlines and emotionally charged content is crucial.
Fact-checking and Beyond
Sam Wineburg and other media literacy educators often call on students to think like fact-checkers:
- Fact-checkers verify claims by consulting multiple sources, not just by reading a page’s content.
- Learning lateral reading helps people spot false or biased info more accurately online.
- Checking a source’s bias, funding, and transparency strengthens media literacy skills.
In a recent article published on Medium, Harvard professor and 2015 National Teacher of the Year Shanna Peeples identified a troubling current question in America: What happens when reality becomes negotiable? She writes:
“As someone who has spent decades working in education and journalism, I’ve watched with growing concern as the boundaries between fact and fiction have blurred into a haze that obscures our democratic foundations…
“The most unsettling aspect isn’t that lies spread – that’s nothing new. What’s new is how sophisticated the machinery of deception has become and how willingly we’ve accepted its presence in our lives.”
As we work to counteract deception, here’s one strategy Peeples suggests to educators: “Instead of just fact-checking after the fact, teach students to recognize disinformation tactics before they encounter them.” (Source)
What are those disinformation tactics? Here’s a list of related terms (some were adapted from The Economist):
- astroturfing: the deceptive practice of presenting an orchestrated marketing or public relations campaign in the guise of unsolicited comments from members of the public. For example, pharmaceutical companies may sponsor patient support groups and simultaneously push them to help market their products. Bloggers who receive free products, paid travel or other perks may also be engaging in astroturfing if those gifts are not disclosed to the reader.
- clickbait: (on the internet) content whose main purpose is to attract attention and encourage visitors to click on a link to a particular web page, often to view advertising.
- echo chambers: an environment, often created by algorithms, in which a person encounters only beliefs or opinions that coincide with their own, so that their existing views are reinforced and alternative ideas are not considered.
- catfish: A person who creates a fake social-media profile, posing as someone else. Some create a range of fake profiles to support their made-up identity with fictional friends and family. Catfish typically try to form an emotional connection with other social-media users for the purpose of harassment or financial gain.
- content farm: Websites that rely on low-paid writers or AI to churn out articles, with the goal of becoming highly ranked by search engines in order to boost revenue from advertisers. They may also be used to spread false information.
- deepfake: An AI-generated image or video that convincingly shows a person doing or saying something they have not, by superimposing their face onto the body of someone else or generating an entirely new visual. As AI tools that mimic people’s voices have been developed, the malicious use of fake audio is sometimes also described as a deepfake.
- fake news: This term was once used to describe disinformation. But it has become politicized in recent years and now is frequently used as a label to attack legitimate critics and as a retort to dismiss verified facts as untruths.
- AI-generated news: Misleading articles produced by AI for made-up news sites or content farms. At first glance the websites that host such stories may look similar to those of real media outlets. They often run innocuous stories about travel or entertainment alongside articles that peddle harmful falsehoods.
- meme: Visual creations, ideas or inside jokes that are spread rapidly and replicated among those with similar interests, often without identifying the original creator. Because they can resonate with a target audience and go viral, they have become a popular tool for spreading false information and propaganda.
- pre-bunking: This term describes a positive step we can take as an inoculation method against disinformation. The idea is that fact-checkers identify a false narrative that is starting to circulate and make people aware of it early, so that if they do encounter it elsewhere on social media, they will do so with skepticism. Perhaps your students could contribute to the pre-bunking effort.
Because so much of our students’ world is visual, it’s important that educators also consider teaching visual literacy skills – helping young people to question deepfakes, AI-generated images, digitally altered photos and videos – all of which appear regularly on social media. Students also need a better understanding of how the makers of film and video routinely manipulate us with their cameras.
Recently the producers of a new 16-minute documentary about the importance of visual literacy, Death of a Fantastic Machine, wrote in a New York Times op-ed piece: “We spend years in school learning to read and write, but almost no time learning how to practically understand the camera and critically analyze images – even though they dominate our lives.” (Source) You can view the short documentary on YouTube.
There Is No Time Like the Present
As educators, we need to take every opportunity to help students think critically, question what they see, and learn all the ways that information online can be untrustworthy. One strategy I recommend: encourage your students to bring examples of fake news, disinformation and the like into the classroom. Spend some time each week dissecting their examples and applying the techniques recommended in the resources below.
More resources and references
► Graphics and advice on “the art of lateral reading,” the four-step SIFT model, and the CRAAP critical thinking strategy are all ways to introduce critical thinking skills.
► My popular 2021 MiddleWeb article, Teach Kids to Read the Images They See, is a pre-AI take on interrogating a photograph for information and authenticity.
► I recently created VERIFY THE IMAGE, an online activity which challenges students to document the steps they undertake when determining whether an image is authentic or fake.
► “Students Should Know”: I’ve put together this collection of short media literacy videos for educators related to AI, deepfakes and other topics.
You might also read (or have your students read and summarize) some of these recent articles:
► Falling for AI-Generated Photos: A Growing Concern in Media Literacy. (AInvest, September 2025)
► How Gen Z Became the Most Gullible Generation (Politico Magazine, April 2025)
► How to identify AI-generated videos online. (Mashable, August 2025)
► Spotting AI: Knowing How to Recognise Real vs AI Images. (Britannica Education)
► Real or AI? How to Teach Your Child to Spot AI-Generated Content. (The Bark Blog, July 2025)
► AI How and Why: Addressing AI-Generated Images in Schools (Learning Technology Center, April 2025)
► As Misinformation Rises, Can Students Believe What They See? (Seattle Spectator, October 2025)
► Why We Need a Right Not to Be Manipulated. (The Guardian, July 2025)
For 10 years, media educator Frank W. Baker contributed columns and blogs to MiddleWeb about media literacy. He is also the author of Close Reading The Media – a publishing collaboration between MiddleWeb and Routledge/Eye On Education. He can be reached at fbaker1346@gmail.com. Visit his extensive media literacy website here.