The Educational Designer

Empowering students with skills for safe and ethical AI use

Students are using AI tools for assignments, research, and as study buddies, but many of them may not have the foundational critical thinking skills required to understand the tools they are using and the ethical and safety implications of those tools. Education needs to shift towards purposeful, scaffolded integration of AI alongside the development of digital literacy skills.

Prohibition vs. Education: The case for education over restriction

I like to use a ‘drugs’ analogy in relation to AI. If you insist on telling your kids to ‘say no to drugs,’ then they go out into the world armed with zero knowledge about the effects of drugs on their bodies. There’s a good chance they will try drugs, and they will have no idea what is happening or how to keep themselves safe.

AI is drugs. This analogy works for higher education in multiple ways.

I asked Perplexity for feedback on this blog post and was told, “The analogy, while creative, may seem provocative to some.” I took this as a compliment.

We need to think of ways to scaffold and explicitly develop our students’ professional skills, including digital literacy. Students who have the skills to use a tool appropriately, and who understand its ethical implications, are less likely to misuse it. When students collude or plagiarise, it is generally because they lack the skills to complete the task, not because of an innate desire to cheat the system. We need to spend time in class teaching students what AI is, what its drawbacks are, and how to use it safely and ethically.

This doesn’t have to be onerous and can be integrated into existing teaching activities or offered as an add-on workshop. In some cases, a single workshop can improve student confidence in generative AI use, improve policy understanding, and help students identify purposeful study uses of AI and evaluate its output (Sullivan et al., 2024). Teachers increasingly want to change their teaching practices and “teach students how AI works, how to use AI, and the critical thinking skills and the ethical values needed for working in an AI-saturated world” (Bower et al., 2024).

You want to draw students’ attention to the weaknesses and biases of AI and engage their critical thinking skills. 

Scaffolding students’ AI literacy and critical thinking

Evaluate the style, content and evidence

Analyse the work for bias

Role play with an AI chatbot

Get feedback from AI

Note: The feedback Perplexity regularly gives me is that my voice switches between academic and informal styles too frequently, and that my humorous asides are not appropriate. Alternatively, not everyone wants to sound like the majority of written information on the internet, and each unique voice should be celebrated. 🎉 Seriously, why does Grammarly constantly want to change ‘collaboration’ to ‘cooperation’? It literally doesn’t describe what I wanted to describe.

Additional note: When I got feedback on this blog post, I may have hurt Perplexity’s feelings: “Comments about Perplexity’s feedback on your writing style, while relatable, detract from your authority on the subject.” Thanks, I’d rather be relatable. 🫶

Documenting and Defending Your AI Use

Have students gather evidence for how they have engaged with AI over the process. Having students document their process can help in multiple ways. 

Articulate and reflect on the process

First and foremost, it helps students unpack and refine their process. We often plod through tasks the same way over and over, but by analysing our process we can find issues, bottlenecks and gaps. It’s important not to spend too much time on this, so students can unpack it with a few guided questions.

Embed reflective tasks in the process so that students articulate how they have used AI, how it assisted them and even how it hindered their process. This could be a summative task aligned with your professional skills and graduate attributes. 

Capture the process systematically

Secondly, capturing their process with documentation can help if they are ever accused of misusing AI. As above, it’s important not to spend too much time on this, but rather to design the process around capturing evidence. Several strategies can help.

The biggest difficulty is that we need to be comfortable with AI ourselves before we teach our students. Post-COVID, many educators are burnt out, afraid of redundancies, overloaded, and even systemically underpaid. On top of this, we must grapple with one of the biggest education shake-ups of all time. We need to know, for ourselves, why AI can be unethical, unsafe, and extremely damaging. Importantly, we don’t need to know more than every student in our class. We just need to lead with transparency and be open about our own knowledge and use of AI. If we tell our students they can’t use AI to write their assessments but use it ourselves to create those assessments, we’re not leading by example. Learn in iterations. Don’t try to have all the answers. Learn together with your students.

References

Bower, M., Torrington, J., Lai, J. W. M., Petocz, P., & Alfano, M. (2024). How should we change teaching and assessment in response to increasingly powerful generative Artificial Intelligence? Outcomes of the ChatGPT teacher survey. Education and Information Technologies, 29(12), 15403–15440. https://doi.org/10.1007/s10639-023-12405-0

Sullivan, M., McAuley, M., Degiorgio, D., & McLaughlan, P. (2024). Improving students’ generative AI literacy: A single workshop can improve confidence and understanding. Journal of Applied Learning and Teaching, 7(2), 88–97. https://doi.org/10.37074/jalt.2024.7.2.7
