Generative AI and teaching, learning and research at McMaster

Meena Andiappan, left, an associate professor in Human Resources and Management, and Michael Welland, an associate professor in Engineering Physics, are exploring how generative AI can support and challenge teaching practices, research approaches and student learning across disciplines. (Photo by Georgia Kirkos, McMaster University)
From reshaping how instructors teach technical concepts to helping students better manage exam stress, generative Artificial Intelligence (Gen AI) is already influencing how the McMaster community teaches, learns and works. But as the technology evolves rapidly, so do the questions and concerns it raises.
A snapshot of how Gen AI is being used across McMaster would show a community experimenting with new approaches, rethinking old ones, and grappling with the implications of rapid change.
Michael Welland, an associate professor of Engineering Physics, used Gen AI in designing his first course at McMaster, Numerical Methods for Engineering (ENGPHYS 3NM4).
Even so, he approached the tool with caution and thoughtfulness.
Drawing on his industry experience at Canadian Nuclear Laboratories, where he previously hired McMaster co-op students, Welland reimagined the course to reflect modern engineering practice, including the use of AI.
He moved away from teaching students to code numerical methods from scratch, a task Gen AI can now perform in seconds. Instead, the course focuses on understanding, validating and interpreting AI-generated results using open-source tools such as Google Colab integrated with Gemini, Google's suite of AI models and tools.
“The course is now less about rote memorization and more about critical thinking, collaboration and analysis — the kinds of skills students will need in industry,” Welland says.
“Gen AI helped me design a course that’s more relevant, accessible and aligned with where the profession is headed.”
Students were encouraged to share their AI experiences and tips through weekly reviews, and assessments blended project-based work with oral exams to ensure students could explain and defend their decisions.
“Students reported being happy and excited to have an AI-positive environment,” says Welland.
“And midway through the course, one piece of feedback stood out: The course was hard, not because of math, but because it emphasized critical thinking.”
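Welland's emphasis on validating rather than hand-writing numerical code can be illustrated with a minimal sketch. The routine and test problem below are hypothetical, not taken from the course: the idea is that a student checks an AI-generated method against a problem with a known analytic answer, then confirms the method converges at its expected rate.

```python
import math

# Hypothetical example of an "AI-generated" composite trapezoid rule.
def trapezoid(f, a, b, n):
    """Approximate the integral of f on [a, b] with n subintervals."""
    h = (b - a) / n
    interior = sum(f(a + i * h) for i in range(1, n))
    return h * (0.5 * f(a) + interior + 0.5 * f(b))

# Validate against a known answer: the integral of sin(x) on [0, pi] is exactly 2.
approx = trapezoid(math.sin, 0.0, math.pi, 1000)
assert abs(approx - 2.0) < 1e-5

# Interpret the result: doubling n should cut the error by roughly 4x,
# consistent with the trapezoid rule's second-order convergence.
err_coarse = abs(trapezoid(math.sin, 0.0, math.pi, 100) - 2.0)
err_fine = abs(trapezoid(math.sin, 0.0, math.pi, 200) - 2.0)
print(round(err_coarse / err_fine, 1))  # close to 4.0
```

The point of the exercise is the last few lines: whether the code was written by a student or a model, the student must still know what a correct answer looks like and why the error behaves the way it does.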
Better prompts and critical thinking
While Gen AI is opening new possibilities for teaching and learning, it is also raising important questions across disciplines. These range from academic integrity to what should, and should not, be done with AI when the technology can generate essays, solve problems or produce code in seconds.
These tensions reflect the broader challenge of balancing innovation with academic rigour, says Matheus Grasselli, deputy provost and co-chair of McMaster’s AI Advisory Committee.
“At McMaster, we’re working to ensure our response is rigorous, inclusive and future-focused,” says Grasselli.
“That means listening to our community, sharing knowledge across disciplines, and helping our students and faculty use these tools in ways that are both effective and responsible.”
Mina Al-Barak, a McMaster student who uses ChatGPT to plan her study and wellness schedule during exams, says effective AI use demands a new kind of digital fluency: prompt engineering.
“Students might think they can talk to AI tools the way they talk to a friend, but that doesn’t get the best results,” says Al-Barak.
“With more focused prompts and follow-ups, AI becomes much more helpful. We should be teaching those skills.”
The idea of using Gen AI as a gateway for students to learn critical thinking, adaptability and ethical decision-making is exciting, says Stephanie Verkoeyen, special advisor on Gen AI.
“If we do this right at McMaster, we’re not just responding to a tech trend, we’re using it to strengthen the university’s core commitments,” says Verkoeyen.
“Gen AI opens the door to long-overdue conversations about equity, digital literacies and innovation. But it also forces us to confront the discomfort and the risk that come with major change.”
Across campus, more instructors are rethinking how they design and deliver their courses, and researchers are exploring how AI might enhance or challenge their work.
Meena Andiappan, an associate professor in Human Resources and Management at the DeGroote School of Business, found that large language models (LLMs) such as ChatGPT performed well on questions with objectively correct answers but struggled to demonstrate rational decision-making on more subjective or ambiguous tasks.
Her study, A Manager and an AI Walk into a Bar, highlights the importance of transparency and bias awareness in how these tools are used in business contexts.
The research underscores the complex interplay between AI tools and human judgment, and the importance of thoughtful, equitable implementation.
The same principles guided the Office of the Registrar in a recent pilot project at the Faculty of Health Sciences spring convocation, where AI was used to help read graduates’ names.
The pilot tested whether the system could improve accessibility and pronunciation accuracy, particularly for names that human announcers may misread. Students could also review and approve the pronunciation of their names before the ceremony.
If students were unhappy with the system's pronunciation, they could record their name themselves and the system would use that recording to regenerate the audio.
Feedback from the pilot will help inform whether the technology is used more broadly in fall convocation ceremonies, says University Registrar Darran Fernandez.
“Our team is evaluating the tool’s effectiveness, the student experience, as well as the cultural and technical considerations involved,” Fernandez says.
Members of the McMaster community looking to learn more and to get involved can visit the Generative Artificial Intelligence webpage and join the Generative AI at McMaster Team.