Tradition meets innovation: Generative AI in McMaster classrooms


Prof. Matt Sibbald stands in a classroom, in front of an AI-generated image that was meant to resemble him. The prompt used was: “Hyper-realistic white man with brown hair and glasses in a business suit pointing to and looking at a projector screen in a McMaster University classroom - he should be on the right hand side and any text burgundy.” (Photo by Georgia Kirkos, McMaster University)


Professional football players need to prevent and reduce injuries throughout the playing season. Adults living with osteoporosis need to exercise, but hesitate out of fear of breaking a bone. People recovering from heart attacks need an education program that includes family members, particularly those who do the cooking, so they can have healthier meals.

Rehabilitation health professionals developed real-life education programs for these very real situations in a course this fall — and they used generative artificial intelligence (AI) to help.

“The students used ChatGPT to begin considering different learning theories that might inform the development of an education program,” says Mary Clark, an assistant clinical professor (adjunct) in the Faculty of Health Sciences, who taught Facilitating Learning in Rehabilitation Contexts.

Mary Clark

Education is embedded in all the work practising rehab professionals do to help individuals understand their treatment and manage their condition. But developing and updating more formal education programs for groups may not be prioritized over direct patient care.

ChatGPT can speed up the process by suggesting learning theories and methods that are new to some health professionals and reminding them of lesser-used ones that may be applicable.

In Clark’s course, students are assessed on their critical appraisal of ChatGPT’s output, for example by comparing it with more reliable, peer-reviewed sources and verifying its accuracy.

Clark modified her course after an interactive workshop on using AI in teaching and learning led by the MacPherson Institute this past summer.

“When the use of generative AI became widespread last spring, we heard concerns about academic integrity,” says Kim Dej, vice-provost, Teaching and Learning.

“We also heard from faculty members across campus who said they were inspired by the potential benefits of incorporating generative AI into their teaching to enhance student engagement, to deepen learning opportunities, and to enable students to use AI tools in ethical and creative ways that empower them to contribute to solving the difficult problems we face in society.”

Dej co-chaired McMaster’s Task Force on Generative Artificial Intelligence in Teaching and Learning, which started work on May 1. Members produced usage guidelines and a comprehensive report with recommendations that was published on Oct. 5.

AI as a teaching tool

Ben Taylor joined the MacPherson Institute as a postdoctoral fellow in June. His research looks at how instructors in Canadian higher education are adapting the way they assess students in response to the widespread availability and presumed use of generative AI tools such as ChatGPT.

“There are ongoing discussions about generative AI’s potential for cognitive offloading, which is a fancy way of describing how our brains only have so much capacity, so we tend to use tools, like note taking, to move or offload certain knowledge or processes to free up space,” Taylor says.

“If students focused on remembering every single detail from a lesson, there wouldn’t be cognitive capacity for the higher processes of learning, such as analyzing or evaluating the course material.”

Programs and instructors need to have conversations about what is appropriate to offload, Taylor suggests, to determine what competencies or skills are not directly linked or aligned to learning outcomes.

Students might use generative AI, for example, to create a study plan or to outline the structure of an assignment, so long as the assessment isn’t directly evaluating students’ ability to do these tasks.

Understanding the role of generative AI in the classroom is increasingly important as its use spreads across McMaster’s campus.

In the capstone course Science Communication in the Behavioural Sciences, students are writing a blog series called Coding Tales: Generative AI’s Impact on Science Writing and Storytelling.

“Students can use generative AI for brainstorming, editing and revising their course work if they correctly reference the information,” says Ayesha Khan, associate professor, Psychology, Neuroscience & Behaviour.

Where tradition meets innovation

In the School of Medicine, with its history of problem-based learning, generative AI is where “tradition meets innovation,” says associate professor Matt Sibbald.

Self-directed learning and group work are two cornerstones of teaching and learning here, “and generative AI has the potential to enhance both,” Sibbald says.

Generative AI can engage with students as a virtual patient and even introduce a greater variety of viewpoints and approaches to problems than a problem-based learning group of seven to 10 people could come up with.

Chanel Morrison, who served as a student representative on McMaster’s Task Force on Generative AI in Teaching and Learning, says she feels “privileged” to use generative AI for assignments, yet remains hesitant.

Morrison uses ChatGPT for ideation, usually when selecting assignment topics.

“It helps me to explore various perspectives and it sparks my creativity to think more broadly. It’s like having a brainstorming partner available 24/7,” she says.

“However, there is a lingering hesitation. It comes from the uncertainty about the ethics of using these tools and the concern of overreliance. I want to ensure that my academic achievements are truly my own.”