
Generative AI


Teaching Misinformation and Information Literacy

While the inaccuracies of ChatGPT and other generative AI tools can cause real problems, Cline Library librarians see an opportunity to teach information literacy and common-sense fact-checking skills.

One example comes from Google's AI tool, Bard, which presented inaccurate information when asked, "What new discoveries from the James Webb Space Telescope can I tell my 9 year old about?" The AI answered that the telescope took the first picture of a planet outside of our solar system. This is incorrect, but the AI isn't necessarily making things up: the telescope did take its own first image of a planet outside our solar system in September 2022, and others pointed out that NASA's "ambiguous" press release may have contributed to the AI's confusion.

Read more about this incident.

This is the type of example librarians use when teaching information literacy and "fake news" in the research classroom. We can ask students to search online for sources that disprove the AI's claim, or to identify whether other journalists made the same mistake when reporting on NASA's press release.

The Information Literacy tutorial created by Cline Library's Mary Dejong walks students through activities that help them decide which types of information to use in their research.

Here are some more interesting reads and tools for using AI to teach about misinformation and information literacy.