A recent KPMG study in Canada found that 63 per cent of students use generative AI tools a few times a week, mainly to generate ideas, conduct research, and edit assignments. (Photo: David Stobbe / University of Saskatchewan and University of Regina)
AI IN EDUCATION

AI in higher education: Sask. universities grapple with the future of learning

Feb 3, 2025 | 5:00 PM

As artificial intelligence (AI) becomes increasingly integrated into student learning, universities in Saskatchewan have begun implementing guidelines to help students use AI tools ethically while ensuring academic integrity.

Dr. Nancy Turner, associate vice-provost of Teaching and Learning at the University of Saskatchewan, said AI has already become a fundamental part of the student experience.

“There are studies suggesting anywhere from 77 per cent to 86 per cent of students in Canada are using AI in some capacity,” she said. “That doesn’t mean they are using it inappropriately or to cheat – it simply means they are utilizing it to support their learning.”

A recent KPMG study in Canada found that 63 per cent of students use generative AI tools a few times a week, mainly to generate ideas, conduct research, and edit assignments.

KPMG International Limited, commonly known as KPMG, is a global professional services network and one of the Big Four accounting firms, along with EY, Deloitte, and PwC.

At the University of Regina, Dr. Alec Couros, director of the Centre for Teaching and Learning and professor of educational technology and media, agrees that AI is rapidly transforming education.

“We have seen a huge uptake in AI use among students since November 2022 with the introduction of ChatGPT,” he said.

“Some students are using it as a learning aid, while others might be relying on it too heavily, potentially bypassing critical thinking and problem-solving skills.”

Navigating the pros and cons

While both acknowledge the positive potential AI brings to students’ development, they also caution against its pitfalls.

“AI can be a powerful thought partner,” Turner said. “For instance, it can function as a tutor, providing immediate feedback to students or helping them work through a problem.”

She also noted its potential for personalized learning, allowing students to receive assignments and materials tailored to their knowledge level rather than a one-size-fits-all approach.

However, Turner also pointed out the risks of overreliance on AI.

“We’ve seen students who depend too much on Google or Wikipedia for information, and AI is just another step in that,” she said. “The challenge for us as educators is to ensure students develop the skills to critically assess AI-generated content rather than accepting it as truth.”

Couros echoed this concern, emphasizing that the ethical dilemma surrounding AI lies in its ability to fabricate information, generate false citations, and potentially produce biased results.

“It is not just a shortcut; it changes how students interact with knowledge,” he said. “Students need to learn that while AI can be useful, it should not replace fundamental research skills.”

How universities are adapting

In response to these challenges, both USask and U of R have implemented guidelines to support responsible AI use.

“At USask, we have AI principles that help students understand when and how AI should be used,” Turner said. “We provide specific guidelines for students, ensuring they know which AI tools are approved, how to protect their data and how to use it ethically.”

Couros said the U of R has adopted a similar approach.

“Ultimately, it’s up to the instructor to decide how AI should be used in their course, whether fully integrated, partially allowed, or completely restricted.”

One concrete example of AI integration is Microsoft’s Copilot, available to students at both campuses.

Turner explained that Copilot is institution-approved because, unlike many other tools, it aligns with university privacy and security policies.

“It’s a safer alternative to publicly available AI tools that may misuse students’ data,” she said.

Despite these efforts, Couros said students will likely use a wide variety of AI tools beyond the officially approved ones.

“The reality is students are using everything from ChatGPT to Perplexity AI to Claude,” he said. “Institutions need to be adaptable rather than restrictive.”

Ensuring academic integrity

One primary concern surrounding the use of AI in education is how institutions can prevent academic misconduct.

Turner and Couros noted that if students submit AI-generated work, it is considered plagiarism.

“We have academic misconduct policies in place, and consequences can range from redoing the assignment to more severe academic penalties,” Turner noted.

Couros believes that rather than focusing solely on policing AI use, universities should rethink assessments to encourage integrity.

“We need to design assignments that require critical thinking and process-oriented learning,” he said. “Instead of banning AI, we should ask students to engage with it critically, maybe by analyzing AI-generated content for biases or inaccuracies.”

Turner went on to share that some instructors are already incorporating AI critique exercises into their coursework.

“Faculties would have students analyze the strengths, weaknesses, and biases in AI-generated outputs as a way of developing their critical thinking skills,” she said.

The future

With the technology advancing rapidly, both agree that AI will play a transformative role in education.

“AI won’t replace educators, but it will change the way we teach,” Turner said. “We need to focus on how to leverage it to enhance learning rather than seeing it as a threat.”

Couros agrees, yet remains cautious, pointing out the need for careful oversight rather than blind adoption.

“We’re heading toward an era of Artificial General Intelligence (AGI), and while AI isn’t quite there yet, it’s advancing quickly,” he said.

“The challenge for universities is to prepare students for a world where AI is everywhere while still ensuring they develop essential skills like critical thinking and ethical decision-making.”

Kenneth.Cheung@pattisonmedia.com
