While artificial intelligence (AI) and machine learning have been components of teaching and learning in the classroom for several years, generative AI is changing the game. And students, teachers and administrators are adapting to this ever-evolving technology.
“At The York School, we’ve made intentional efforts to intelligently and purposefully integrate AI into our curriculum across all grade levels. From junior kindergarten to Grade 12, we’ve introduced AI concepts that align with each stage of our students’ learning journey,” says Justin Medved, associate head of academic innovation at The York School in Toronto.
For instance, in the junior school, students focus on building curiosity around how technology and AI work. In the middle and senior schools, students explore more complex AI concepts in subjects such as mathematics, computer science and even in the humanities, where AI tools like natural language processing assist in research projects and creative writing.
The school has also partnered with Flint K12, an organization specializing in AI education for kindergarten to Grade 12 schools with custom-built AI modules. “Through this partnership, our students gain hands-on experience with AI models, understanding machine learning algorithms and applying AI solutions to real-world problems,” Medved says. But students also learn about the ethics of AI: the school’s Theory of Knowledge course, for example, dives into discussions about bias in AI, data privacy and the societal impacts of automation.
“Incorporating AI into education is not just about preparing students for future careers; it’s about equipping them with the critical thinking skills they’ll need to navigate a world increasingly influenced by AI,” he says. “AI literacy is crucial for helping students understand how the technology they encounter every day works, its potential and, importantly, its limitations and potential for misuse.”
Holy Name of Mary College School in Mississauga, Ont., is intentionally integrating generative AI into its academic program in “age-and-stage” appropriate ways. Students use AI at least once in each of their classes, “so that they can learn how to use it responsibly, enhance their critical thinking skills by analyzing AI’s limitations and shortcomings, and foster their creativity and higher-order thinking skills by looking to do things differently than generative AI,” says Ryan Baker, director of academics at Holy Name of Mary College School.
Baker says it’s important to incorporate generative AI into the educational process, both to teach students how to interact with it in educational ways and to prepare them for a world in which generative AI plays an increasing part. “To ignore generative AI is to create opportunities for it to be used in less-controlled settings,” Baker says. “This could lead to its inappropriate use, resulting in decreased academic skill acquisition because generative AI completes the work for students.”
Holy Name of Mary is a member of the CIS (Conference of Independent Schools), a networking body that fosters independent school sharing and co-operation. The CIS has developed an AI framework that member schools can use to help shape best practices in individual schools. Holy Name of Mary has adopted this framework and crafted it to fit the unique circumstances of its school.
“We have spent this past year researching and developing Havergal’s Generative AI Framework Version 1.0 and accompanying guidelines for both our faculty and students,” says Garth Nichols, vice-principal of experiential education and innovation at Havergal College, a school for girls in Toronto. “The focus is first and foremost on safety, and then on effective learning. For example, we are not having a dedicated course on generative AI; rather, we are exploring intentional ways that it can be integrated into our liberal arts approach to teaching and learning.”
At Trafalgar Castle School, a girls’ school in Whitby, Ont., east of Toronto, students explore and experiment with AI tools hand-in-hand with their teachers. In visual arts classes, for instance, students experiment with AI tools to support the creative process with compositions, visual elements and narrative visualization. In science, students use AI to deepen their critical thinking and problem-solving through personalized feedback and supported scaffolding. While learning to use AI tools, students engage with constructive doubt about the role and limitations of this technology and explore how to protect authentic thought and detect bias.
“Personalized learning is wonderful, but we want to make sure that we have oversight,” says Laurie Kuchirka, dean of academics at Trafalgar. “We want to make sure we are always in the loop of what our students are doing in the classroom … whether we’re teaching them how to prompt engineer or to think critically.”
For example, students in Grades 4 to 12 are using an AI tool called SchoolAI, which provides a high level of data security and allows teachers to set up personalized GenAI experiences for students while maintaining live oversight of the interaction. Teachers can design the experience, such as a conversation with a character from a novel or a reflection task where the bot can’t provide answers, only questions. The teacher can watch the student’s interaction in real time.
Teachers also use an AI tool called BRISK to differentiate a text for different reading levels, create ideas for an engaging activity and play back the progression of student writing on a Google Doc, supporting teachers in ensuring academic integrity.
“We also have other ways that we ensure academic integrity, with regular conversations throughout the research and writing process, in-class discussions during work time and a visual of a stop light of red, green and yellow that is used to … guide students in the degree of AI allowed for an assignment or assessment,” Kuchirka says.
“We also use exam.net, a tool that locks a student’s screen so they complete an activity or assessment in the classroom without any other digital support.”
When incorporating GenAI, getting faculty on board is a critical part of the process. Appleby College in Oakville, Ont., is incorporating AI tools such as Microsoft Copilot, Google Gemini, Claude.ai and Perplexity into the classroom to enhance students’ personalized learning, but also to support teachers in delivering educational experiences.
“Last winter, teams of teachers began an extensive assessment of work across all subject areas designed to establish policies and guidelines for how AI is to be used in course development and assessment creation. The goal, in all cases, is to work with the technology so that it is helping improve student learning, while also equipping students with the skills and understanding to prepare them for future careers,” says Dr. Carlos Heleno, assistant head of school, academics, with Appleby College. For example, math teachers are using the AI agent chatOAME, which provides Ontario-centric responses to pedagogical and assessment inquiries.
“Our approach to AI last year was all about learning and understanding. This year it is about application and how AI tools can support our academic program, but equally as important are the different areas of the school such as boarding, HR and student well-being,” says Heleno.
The school also stresses augmentation over automation. “AI tools and services should empower all members of the Appleby learning community, ensuring a safe, secure and equitable environment. AI should augment interactions and not replace human connections.”
Lakefield College School north of Peterborough, Ont., has also been diving into the impacts of AI on education. Its initiatives include an AI Working Group, a partnership with App Direct (including access to a virtual AI playground within a data protected space) and the Summer AI Institute, in which it hosted 19 Canadian Accredited Independent Schools from across the country in July to discuss different approaches to AI.
Over the past year, Lakefield teachers built their skills around AI. This year, they’re taking that one step further; all teachers are bringing AI into the classroom in some capacity, so students can start to use it in their normal workflows, “to get a sense of how it works, and then also to explore in their classes when it’s not working, such as when it’s hallucinating and when bias is showing up,” says Dean Van Doleweerd, associate head of programs at Lakefield.
Without this exploration, students and teachers could be missing out on opportunities. “If all we do is fixate on this as a tool to cheat, we’re ignoring the fact that life has changed, and we need to get to that place and discover the new processes and the new opportunities,” Van Doleweerd says.
For example, students are using AI in a creative writing class to help them fine-tune their creativity.
“We’re trying to educate [students] broadly in all sorts of different subjects,” says Joe McRae, director of academic operations at Lakefield, “but also trying to build skills that they can transfer to be successful in universities, that they can transfer to be successful in the workplace and beyond.”
Advertising feature produced by Globe Content Studio. The Globe’s editorial department was not involved.