Throughout the year, artificial intelligence in learning management systems (LMS) shifted in both technology and perception. Large LMS vendors — Anthology, D2L, Google and Instructure — all introduced generative AI features this year and said that education partners adopted these tools more quickly than anticipated.
ADOPTION AND CHANGING OPINIONS: FEAR TO OPTIMISM
Early apprehension surrounding AI in education — largely driven by ethical concerns and fears of misuse — has given way to cautious optimism, according to John Baker, founder and CEO of the global software company D2L, which makes the LMS Brightspace. Baker said his conversations with educators around the world about the instructor-facing AI tools D2L launched this year have been positive. In July, D2L launched Lumi, an AI-powered assistant that can generate quiz questions and discussion ideas within Brightspace. In October, the company added a generative AI import feature to Creator+, a Brightspace-integrated platform in which instructors can create interactive course content.
“I think in the past, when we released these models, they would have been thought of as kind of scary,” he said. “Whereas the approach that we’ve taken this go-around, in terms of making sure we got the principles right out of the gate, we do the beta testing and then we launch … I think, has resulted in the faculty embracing it, clients really embracing it.”
This change was driven by a combination of vendor transparency and the demonstrable impact of AI-enhanced tools, he said.
Anthology’s Chief Product Officer Nicolaas Matthijs described a similar phenomenon. Anthology announced its AI Design Assistant, which helps educators structure their courses within the Blackboard LMS, in July 2023. Since then, Matthijs said, Anthology has focused on refining that tool based on feedback from users. Changes include the ability to generate rubrics and questions in different languages and a context picker, which allows a course instructor to select teaching materials and generate a different kind of material — say, a quiz — based on it.
“When we released these at the end of ’23, adoption was really slow initially, and we saw this slow trickle of institutions that decided to turn it on, and then a slow trickle of instructors that started leveraging these tools,” he said. “That’s completely turned around in 2024, so the adoption of these tools has really accelerated.”
As of July this year, the design assistant had contributed to 310,000 unique learning tasks. Now, Matthijs said, 650 of Anthology’s clients have enabled the design assistant within Blackboard, and it recently passed 1 million unique uses.
He said the passage of time helped people feel more comfortable with AI in education, as did efforts by the education and tech communities to share positive use cases. For example, Anthology conducted an Ethical AI in Action World Tour that brought together global educators to discuss ethical AI use.
Instructure, which announced a partnership with Khan Academy’s AI teaching assistant Khanmigo in July, made efforts to demystify AI through initiatives such as AI Nutrition Facts. The fact sheet gives an at-a-glance overview of data privacy practices, human involvement and other considerations for Instructure’s AI tools and those of its partners. Instructure’s Chief Academic Officer Melissa Loble said the company also conducted research comparing the effectiveness of AI methodologies to traditional techniques.
“We’ve actually had really good feedback, because we’ve been so transparent about what we’re using,” she said.
Another major company in the ed-tech space, Google for Education, leveraged its 10-year AI school pilot program to refine AI applications, namely Gemini. In May the company announced LearnLM, a collection of education-focused AI models and capabilities that integrate with existing Google tools like the search engine, YouTube and the Google Classroom LMS.
PROBLEM-BASED TOOLS
Some ed-tech leaders said a big shift this year was a focus on solving pedagogical and operational challenges that could benefit from AI, rather than on the technology itself.
“What we don’t want is an AI button just to check a box,” Jennifer Holland, director of program management at Google for Education, wrote in an email to Government Technology.
It can be time-consuming for instructors to create course content and put it into an LMS, and ed-tech vendors put AI to the task this year to expedite the process.
For example, Blackboard’s AI Design Assistant and D2L’s Lumi AI engine were both created with this in mind; the latter uses an educator’s content to design assessments that align with Bloom’s Taxonomy, a framework of educational goals. According to Baker, the Lumi AI engine has reduced instructor workload and increased student engagement, with early efficacy studies showing promising results.
Within Canvas, Khanmigo Teacher Tools and Google Gemini Learning Tools can help instructors with tasks such as creating visual aids and personalizing explanations.
There are also AI tools to help instructors and administrators save time by aggregating data and automating tasks. In Canvas, AI-powered discussion summaries provide faculty with insights into student interactions, and generative AI-driven institutional dashboards in Blackboard allow administrators to derive actionable insights from complex data sets.
“It’s not just within Blackboard, but also our student information system, our data platform, our CRM, all of them have an increasing number of AI projects in flight,” Matthijs said. “Again, these are not AI for the sake of AI projects, but really using AI to solve real problems.”
Companies have also deployed AI personalization tools to help educators address the needs and preferences of dozens to hundreds of students. For example, Instructure integrated AI tools such as intelligent translation and Smart Search into Canvas, letting students read in their preferred language and find tailored resources. Within Blackboard, the AI Conversation tool allows students to converse with AI personas and ask questions to fill their learning gaps.
And Google’s NotebookLM, which added podcast-style audio summaries in September, offers students a novel way to engage with course materials — it generates podcasts from whatever course content they upload.
NEXT YEAR
Some vendors saw 2024 as an important year for laying the groundwork for AI governance and technical frameworks. Many new tools focused on lightening the load for educators and administrators, and leaders see student-facing tools as a focus for 2025, particularly those that personalize learning.
“We’ll continue to refine our existing tools and explore new applications of AI to address diverse learning needs and support educators in creating more engaging and inclusive learning environments,” Holland wrote in an email.
Another facet of personalization is using AI to expand career-readiness education. Because AI can be helpful in simulating experiences and identifying where individual students need improvement, Holland said, it can help prepare them for key fields such as medicine.
“In the future, we’ll be doing much more of that as we also look at this being an upskilling opportunity to tackle, how do we educate the whole workforce now to embrace new ways of doing engineering or nursing or medicine, leveraging these new technologies?” she said.
While schools nationwide increasingly adopted AI tools in 2024, Matthijs said only a small segment of them has expressed interest in pushing it “really far,” and he expects those schools to stand out next year. Technology vendors that work with them will need to continue refining their governance strategies to ensure ethical use.
“2025 is going to be an interesting year, where we need to find that balance between continuing to push the envelope, but also staying within the bounds of the responsible use of AI,” he said.