How can we use Generative AI tools to help with planning, teaching and learning?

15 December 2023

In last week’s blog I focused on teaching about AI – what should be in the taught curriculum. This week I will be looking at the parallel topic, teaching with AI: specifically, how can we use the plethora of new Generative AI tools to help with planning, teaching and learning?

Teacher use of Generative AI

The “generative” in Generative AI (GAI) refers to the creation of new content, which can be text, images or even video. ChatGPT and Google’s Bard are built on large language models (LLMs), which can process natural-language prompts and respond with natural-sounding text.

Using the right prompt can generate useful content, and you can refine your prompt repeatedly to get incrementally better results. This can help with common tasks such as creating teaching materials, writing reports, giving feedback and drafting policies and procedures.

Tons of prompt guides are out there, but one of my favourites is The Little Book of Generative AI Prompts from Mark Anderson (also known as ICTEvangelist), which suggests prompts for the basic LLMs like ChatGPT and Bard, such as “Create a set of flashcards with key terms and definitions related to 'Cybersecurity and Ethical Hacking'”.

I recently used ChatGPT when creating programming questions for Craig 'n' Dave’s SmartRevise platform, because it can “learn” and quickly convert between exam-board pseudocode styles and common programming languages.
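
To give a flavour of the kind of conversion I mean, here is a rough sketch of my own (not actual SmartRevise content): an exam-board-style pseudocode routine, shown as comments, and the sort of Python translation ChatGPT might produce when prompted to convert it.

    # Exam-board-style pseudocode given to ChatGPT:
    #   function isEven(n)
    #       if n MOD 2 == 0 then
    #           return true
    #       else
    #           return false
    #       endif
    #   endfunction

    # A Python translation ChatGPT might return:
    def is_even(n):
        # the % operator is Python's equivalent of the pseudocode MOD
        return n % 2 == 0

    print(is_even(4))   # True
    print(is_even(7))   # False

Either direction works: you can just as easily paste in a working Python function and ask for it in your exam board’s pseudocode style.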

More specialist applications for teachers are available, often built on top of the LLMs, and here are a few worth checking out:

  • Curipod – generates interactive lessons including quizzes and polls from just a few inputs.
  • TeachmateAI – over 50 tools in one product including a lesson plan writer, slideshow generator and text simplifier.
  • Magicschool – another large suite of tools including rubric and marking tools, and a tool that claims to “make learning relevant to your class”.

It’s vital that you, as the subject specialist, check any content generated by GAI. As I pointed out last week, the output is not guaranteed to be accurate, and GAI can sometimes “hallucinate” nonsense output.

You also need to work within your school’s acceptable use and data protection policies. You can read more about suitable tools in the article by Rachel Arthur of TeachFirst here.

Learners’ use of GAI

Learners' use of GAI really falls into two categories: use that we intend and design for, and use that we don’t. The latter is often called “cheating” with GAI, which I will explore first:

“Cheating” with GAI

ChatGPT drew a wail of grief from academics complaining they would no longer be able to assess students’ work. I believe that Kevin Roose of the New York Times was correct in his claim that teachers should “assume that 100 percent of their students are using [GAI] on every assignment, in every subject, unless they’re being physically supervised.” So-called GPT detectors claim to help but frequently flag false positives and exhibit bias against non-native English speakers (see this Washington Post article).

Thus I believe that, as a tool for assessment, the essay is dead. But we must remember that it was always, like all assessments, a flawed proxy for what’s in learners’ heads, and that “cheating” on homework assignments has always been possible in myriad ways, even before GAI.

If we need reliable data, we must control the conditions, as we do in exams and supervised tests. But first we should ask ourselves: do we need all this intrinsically flawed data?

My personal hope is that, with the death of the “homework assignment”, teachers will think more deeply about the purpose of assignments and of assessment more generally. Most assessment should be formative and not require written submissions marked at great cost by the teacher.

For this I use questioning, observation, multiple-choice quizzes, exit tickets, and peer and self-marking, all of which incur minimal “out-of-lesson” costs. The Teach Computing resources contain lots of low-cost formative and summative assessment opportunities, including short end-of-unit tests and the Secondary question banks. And if you need more questions, you can always ask ChatGPT to write them!

Including GAI in learning

If you’re wondering why I wrote “cheating” in quotes throughout the above section, it’s because using tech to automate boring tasks is almost the entire point of tech, so it feels wrong to criticise a computing learner for using computers to achieve a task.

Better to change the task to a more valuable one than to impose arbitrary restrictions on how it can be done, restrictions that don’t exist in the real world.

Co-founder of educating4ai.com, Stefan Bauschard, suggests moving towards “alternative assessment methods such as authentic learning, project-based learning, portfolios, and speech and debate” – is this the driver schools need to explore culturally-relevant pedagogy and Universal Design for Learning?

And where better to learn AI Literacy than in the Computing classroom? Last week I listed some resources to get you started teaching about AI. But why not include GAI in a range of learning activities, both in the Computing classroom and across the school?

Here are some ideas to get you started, and remember to abide by any website’s Terms and Conditions including age restrictions and parental permissions:

  • Show learners how to ask ChatGPT to debug their programs. Ensure they copy and paste before-and-after images of their code into their digital workbooks, and then get them to explain the fix offered by ChatGPT in their own words (see the sketch after this list). This could even work within Pair Programming: a GAI-augmented “navigator” instructing the non-GAI-enabled “driver”!
  • Get them to interact with and critique their own work and that of their classmates by asking ChatGPT to summarise and “mark” it, with prompts like “list the main points made in this essay” or “explain what this program does.”
  • Teach them about reliability by asking ChatGPT to solve maths and logic problems (it’s very bad at this) or choose a prompt from Giuseppe Venuto’s chatgpt-failures repository.
  • Teach them about gender, race and age bias by asking Craiyon to draw “surgeons at work” and “cleaners in a hotel lobby” (see the accompanying images).
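
To illustrate the first idea in the list above, here is a minimal, made-up sketch of the kind of before-and-after a learner might paste into their digital workbook after asking ChatGPT to debug a short Python program (the bug and the fix are my own invention, for illustration only).

    # BEFORE: the learner's buggy program – the loop misses the last mark
    marks = [56, 72, 64, 81]
    total = 0
    for i in range(len(marks) - 1):
        total = total + marks[i]
    print("Average:", total / len(marks))   # prints 48.0, which is wrong

    # AFTER: the fix ChatGPT might suggest, which the learner should then
    # explain in their own words: range(len(marks)) covers every index
    marks = [56, 72, 64, 81]
    total = 0
    for i in range(len(marks)):
        total = total + marks[i]
    print("Average:", total / len(marks))   # prints 68.25, as expected

The value here is not the fix itself but the learner’s explanation of why the original loop was wrong.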

Intentionally including GAI in your teaching in such ways, even occasionally, is a great way to build AI literacy in your school. More ideas are available online, including the University of Kent AI Guide and the Prompt Templates from Edge Hill University.

Safe and Responsible Use

I’ve suggested ways you can incorporate GAI into teaching and learning in these two blogs, but what about the risks and ethical issues it poses? Responsibility for creating a whole-school policy should not fall upon the Computing lead, but you will inevitably be involved. The blog by Stefan Bauschard, whom I mentioned previously, is a good starting point, as is this presentation by Miles Berry. Why not discuss it with your school IT manager or Edtech lead and create a GAI policy together?

Summary

GAI poses new risks and opportunities, but an outright ban is unenforceable and runs counter to the purpose of digital literacy education, which now must encompass AI literacy. Let’s take the initiative as Computing leads and give our pupils a world-class AI literacy education.


About the author

Alan Harrison is a National Specialist for Secondary Computing Leadership at the National Centre for Computing Education.