March 9, 2026
Check your AI literacy
Tools like ChatGPT, Copilot, and Claude can be amazing helpers for saving time, gaining understanding, and doing further research. But what should students know to use these tools effectively and ethically?
To find out, we talked to Leanne Morrow, Associate University Librarian and Director of the Centre for Artificial Intelligence Ethics, Literacy and Integrity at UCalgary.
Leanne Morrow at the Doucette Library
Ana DuCristea
LLMs are not search engines (or academic sources)
Large Language Models (LLMs) such as OpenAI's GPT, Anthropic's Claude, and Google's Gemini models are AI systems trained on massive amounts of text data to predict and generate human-like text by identifying complex patterns. They can sound like they're confidently presenting facts, but what they're really doing is predicting the next most likely word based on patterns in their training data. That means you can get a smooth, assured answer with no guarantee that it's correct.
Even if you hit the "web search" or equivalent button in ChatGPT, Claude, or any other platform, your results may not be accurate. Web search will pull in pages, but the AI can misrepresent, oversimplify, or lean on low-quality sources. Plus, the open web is not the same as academic sources. As Morrow points out, strong research often depends on library databases and scholarly sources, which are not easily scrapable online.
Part of what makes LLMs convincing is a phenomenon called sycophancy: they're designed to please you. "It's made to give you what you want," says Morrow. "It will sometimes overconfidently provide you with inaccurate information because it wants you to be happy."
This is especially dangerous when it comes to citations. ChatGPT and similar tools have been known to generate completely fake references that seem plausible, with real authors and journal names, because they are algorithmically piecing together patterns. "Faculty members know the citations in their area of discipline," Morrow warns. "They'll see it a mile away." Submitting fabricated citations is an academic misconduct issue, so always trace a citation back to its actual source.
Know the expectations before you start
Perspectives and approaches to the use of gen AI differ across campus, and rules and expectations will vary from course to course. Some instructors are fine with it, some have restrictions, and others may prohibit it entirely. Before you use an AI tool to help with an assignment, check the course outline, or ask your professor directly if you are unsure. Getting clarity upfront is much easier than explaining yourself after the fact.
Your data is the product
Free AI tools are not charities. "[Their purpose is] to make money," says Morrow. "Every time you go and do something in there, you're sharing more data with them, and they're building better tools to sell it right back to you." Anything you type into a public AI tool could be used to train the next version of that model.
Think twice before sharing anything sensitive (personal info, private docs, copyrighted works) with an AI chatbot. This also applies to uploading PDFs of library articles or lecture slides to get a quick summary. Even if your intentions are good, uploading copyrighted material to third-party tools may violate institutional policies or licensing agreements.
A better, safer approach is to use your own notes: type up what you took away from a lecture, then ask AI to explain further or turn it into a quiz.
Just because you can, doesn't mean you should
This is Morrow's most important point: AI literacy is not just about how to use these tools; it's knowing when not to. Offloading too much of your thinking to AI, especially early in your studies, means you never build the foundational knowledge that later learning depends on.
"Unless you have foundational knowledge in a discipline to actually evaluate something that AI has produced, you're going to copy and paste something that's inaccurate and you're going to end up with a misconduct."
AI works best as a support, not a replacement. Use it to brainstorm, organize, or stress-test ideas you already have, not to generate ideas you haven't yet thought about.
Where to go from here
The Centre for Artificial Intelligence Ethics, Literacy and Integrity, located in the Doucette Library (Education Block 370), is your starting point for navigating AI use, whether that's workshops, a self-paced AI literacy certificate, or a one-on-one appointment with a subject librarian who can point you in the right direction for your discipline. Starting this fall, the centre will also host monthly student drop-in mixers where you can bring a project, ask questions, and connect with other students across disciplines.
The library has also been trialling AI-powered research tools like Consensus and scite.ai, which pull information from scholarly datasets.
If you're looking to get more technical with AI, there are communities on campus for that as well: student groups offer structured dives into practical AI work and host workshops, project opportunities, and more.
"I would like to see all our students graduate with some AI fluency," says Morrow, "and able to critically evaluate their AI use."
As she puts it: AI supports, but integrity should lead.