2025 AI Trends in Higher Education

5 AI trends to expect in higher education in 2025

Important information to consider before reading

Think of AI as a hammer. It’s a tool. One tool. Assuming you have lumber and nails, can you build a house with a hammer? Technically, yes? Maybe? But would you want to live in it? Doubt it.

You can gather every tool and material imaginable, but it takes skills, knowledge, and expertise to use them effectively—and safely—that’s where you come in.

AI can help improve accessibility (discussed later), but the tools themselves need to be accessible. Each AI tool is built for a specific task or set of tasks—like generating text. Some tools can help improve accessibility by generating text to better support students with specific disabilities, like using decodable words and formatting to help students with dyslexia. It may do that well, but that doesn’t mean it works with screen readers or other assistive technologies.

In other words, a tool that helps improve accessibility isn’t necessarily accessible itself, and it may not meet accessibility compliance standards.

Whether you’re implementing AI across the institution or just looking for some course content, one thing is non-negotiable: always, always, always have the right people review and approve it for data privacy and security before using it.

Use AI purposefully, not just because it’s there. It should be a supplemental tool that genuinely helps improve teaching and learning. Otherwise, it’s just another thing students have to figure out how to use without adding real value.

Also, consider whether you’re using each tool for its true purpose. Have you ever seen LinkedIn “experts” bragging that they’ve stumped a chatbot… with a math problem? Pretty impressive—as long as you ignore the fact that chatbots aren’t designed to do math.

Would you use a calculator to write an essay? Hopefully not.

While some AI tools can handle tasks outside their main functionality, they work best when used for what they’re actually designed to do.

1. More schools investing in secure AI chatbots

The benefits of AI chatbots in higher education are clear, but so are the data privacy and security risks. That’s why you can expect to see more colleges and universities offering AI chatbots while taking a more secure approach: either self-hosting them or using them only under strict agreements where the provider (Google, Amazon, Microsoft Azure) does not use the data to train larger public models.

In other words, students, faculty, and staff can interact with the chatbot and the information won’t be used publicly.

Plus, when colleges offer these and other similar tools, all students can access them, not just those who can afford a subscription.

Benefits of AI chatbots in higher education

Colleges and universities can enjoy most of these benefits regardless of the approach they take, but a secure approach adds peace of mind and aligns with the ethical responsibility to protect sensitive information.

Commonly used chatbots: ChatGPT, Microsoft Copilot, Google Gemini, Claude

Creating, developing, and repurposing course content

Faculty can use chatbots for just about any course-related task, from drafting comprehensive exam rules and rewriting test questions to creating discussion prompts and generating mock datasets for assignments.

Below are templated prompt examples and screenshots.

Prompt template: Write a discussion prompt for a [type/level (e.g., online graduate)] course within a module about [topics/subjects]. Use open-ended questions that encourage students to share specific personal experiences and challenge them to analyze and relate their experiences to [add subject details] to demonstrate their understanding and critical thinking skills.

Example prompt output for an accounting course: Think about a time when you had to create or manage a personal budget. How did your approach compare to the principles of accrual accounting discussed in this module? Reflect on whether you followed a cash-based approach (tracking money when it is received or spent) or an accrual-based approach (tracking income and expenses when they are earned or incurred). What challenges did you face, and what does this reveal about the benefits and limitations of each method? Use specific examples from your experience and connect them to the concepts covered in this module.

Example prompt templates:

“Rewrite the following test question(s) to assess the same concepts. Maintain the same difficulty level and word count, but alter the phrasing and structure.”

“Convert these [existing format (e.g., true or false, matching, etc.)] test questions into [new format (e.g., multiple-choice, fill in the blank, etc.)]”

For better results, include an example of a test question with a revised version.

Prompt template: Create a mock dataset for [type/level, e.g., undergraduate accounting] students to analyze. The dataset should represent a [specific scenario (e.g., a small online retail business or a personal budget)]. Include data spanning [time period, e.g., 12 months] with the following categories: [list categories (e.g., expenses, income, etc.)]

Prompt template: Adapt this [activity, resource, etc.] about [subject/topic] in a [type/level] course to be repurposed for a [new use case]: [insert text from original activity]

Prompt to create the table below: Write HTML code for a table I can embed into my course that has 4 rows and 4 columns. Use placeholder text that I can edit as needed (for example, “Column 1 Title,” “Column 1 Text”). Make the background color of the first row #ababab. Make the background color of the first column #d9d9d9.

The table below is the result of the code provided by Claude.

           | Column 1 Title  | Column 2 Title  | Column 3 Title
Row 1 Text | Row 1, Column 1 | Row 1, Column 2 | Row 1, Column 3
Row 2 Text | Row 2, Column 1 | Row 2, Column 2 | Row 2, Column 3
Row 3 Text | Row 3, Column 1 | Row 3, Column 2 | Row 3, Column 3
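For reference, the generated markup looked roughly like the sketch below. This is a simplified reconstruction, not Claude’s exact output; the inline styles and placeholder text are illustrative, and a chatbot’s response will vary from run to run.

```html
<table>
  <!-- First row: background #ababab; the corner cell doubles as the
       first column, so it gets the first-column color #d9d9d9 -->
  <tr style="background-color: #ababab;">
    <td style="background-color: #d9d9d9;"></td>
    <td>Column 1 Title</td>
    <td>Column 2 Title</td>
    <td>Column 3 Title</td>
  </tr>
  <tr>
    <td style="background-color: #d9d9d9;">Row 1 Text</td>
    <td>Row 1, Column 1</td>
    <td>Row 1, Column 2</td>
    <td>Row 1, Column 3</td>
  </tr>
  <tr>
    <td style="background-color: #d9d9d9;">Row 2 Text</td>
    <td>Row 2, Column 1</td>
    <td>Row 2, Column 2</td>
    <td>Row 2, Column 3</td>
  </tr>
  <tr>
    <td style="background-color: #d9d9d9;">Row 3 Text</td>
    <td>Row 3, Column 1</td>
    <td>Row 3, Column 2</td>
    <td>Row 3, Column 3</td>
  </tr>
</table>
```

You can paste markup like this into your LMS’s HTML editor and swap in your own text and colors.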

Learn more about AI prompting techniques and find more templates and examples.

2. Improving accessibility

AI has the potential to make education more accessible and inclusive in some unexpected ways, like recognizing accents, localizing content, and detecting emotions. On top of that, it can automatically check for compliance and make changes to meet accessibility standards—almost like a set-it-and-forget-it solution for specific areas.

Languages

AI is making communication smoother, not just with basic translation tools but also through localization, accent recognition, and name pronunciation. Here are some AI tools to consider, how they work, and why they’re important.

Accent recognition AI does exactly what you’d expect—it recognizes and understands accents. It’s like Siri, but more fluent (and cultured?).

Why does that matter for online learning? Think about students who rely on speech-to-text because using a keyboard isn’t an option. Or a faculty member with an accent whose live lecture captions are filled with errors. It’s a frustrating and limiting experience.

Accent recognition AI can help fix this by making captions and transcripts more accurate and voice-controlled assistive technology more usable.

Check out these accent recognition tools: Lokalise, Smartling, Fluent AI

Did you know there are around 160 different English dialects? Some have only slight variations, while others differ significantly in pronunciation, grammar, spelling, and even word meanings.

For example, what Americans and Canadians call french fries or fries, Brits and Australians call chips. And Americans write center and localize, while Canadians, Brits, and Australians often use centre and localise.

Most of the spelling differences don’t affect understanding. But it can be confusing when words have entirely different meanings. What if a term with different meanings were used on an exam or assignment and the student didn’t know?

Localization AI helps by adapting your content to specific dialects and cultures. You can also use chatbots to help localize your content and review it for language that may be confusing for some people.

AI sign language translation software can translate speech into sign language in real time.

The AI chooses from a huge collection of real videos of people signing to create smooth and accurate translations. Some solutions integrate into your websites and can be added to videos or other content in a simple picture-in-picture format.

Check out Signapse AI and Slait AI. They provide real videos of people signing, unlike previously existing tools that provide cartoon-like avatars.

If your name is even remotely unique, you’ve probably heard it mispronounced. The really unfortunate part is that rather than correcting people, some just accept it and answer to all kinds of mispronunciations.

That’s where name pronunciation AI tools like Namecoach come into the picture. Namecoach integrates with the LMS, SIS, and most other platforms and browsers.

Students create a voice recording of their names or use audio databases for accurate pronunciation. These recordings and phonetic spellings are accessible to instructors and other students so that whether it’s a virtual event or support call, names are pronounced correctly.

Adapting content to support students with learning disabilities and diverse learning needs

Chatbots can assist faculty with adapting content to better support students with learning disabilities and diverse learning needs.

For example, faculty can ask the chatbot to adjust exam questions to meet a wider range of cognitive abilities and language comprehension levels. Chatbots can further support accessibility by creating descriptive elements for images and completing other accessibility-related tasks.

Below are templated prompt examples.

Shorter prompt: Please provide alternative text (125 words or less) and an image description for this photo: [attach photo]

Detailed prompt: I’m using this image on a [course page, email, etc.] about [subject/topic]. Please generate: descriptive alt text (125 words or less); detailed image description; suggested file name; and caption for this photo: [attach photo]

You can also try out this step-by-step alt-text creator.

Prompt example: Rewrite these [object (e.g., instructions, questions, etc.)] to make them clearer and more accessible for students with dyslexia. Simplify the language by using shorter sentences, straightforward vocabulary, and decodable words where possible. Break the text into smaller sections, use bullet points or numbered lists, and highlight key terms or steps. Keep the instructions concise and avoid unnecessary jargon. Here are the instructions: [add your instructions]

Example using the prompt:

Original instructions: Write a 500-word essay analyzing the themes in the assigned text. Your essay must include an introduction with a thesis statement, at least three body paragraphs with textual evidence, and a conclusion that summarizes your argument. Cite all sources using APA formatting. Submit your essay in a Word document via the course portal by Friday at 11:59 PM. Late submissions will incur a penalty of 10% per day.

Revised version for students with dyslexia (Note: This is a generic example. You’ll need to adapt the prompt based on your students and the specific activity and provide any necessary context.)

Write a 500-word essay about the themes in the assigned text. Follow these steps:

  • Introduction: Start with a short paragraph that states your main idea (thesis statement).
  • Body Paragraphs: Write at least three paragraphs. Use examples from the text to support your ideas.
  • Conclusion: End with a short paragraph that sums up your main points.

Remember:

  • Use APA format to include references for any sources you use.
  • Check your spelling and grammar before submitting.

How to Submit:

  • Save your essay as a Word document.
  • Upload it to the course portal by Friday at 11:59 PM.

Important: If you turn it in late, 10% will be taken off for each day it’s late.

Emotion detection AI

Emotion AI detects and interprets emotions by analyzing text, voice (audio), and video (facial expressions and gestures) and associating specific components with related emotions.

This technology can help:

  • Students with cognitive and/or physical disabilities communicate
  • Detect signs of stress, confusion, frustration, happiness, etc.
  • Understand which course activities students are most interested or uninterested in

The images below show how video emotion AI could work to give instructors insight into learner engagement during specific activities.

  • High Attention: Learner attention increases with eye contact.
  • Low Attention: Attention drops when they look away.

While eye contact doesn’t always mean focus, and looking away doesn’t mean they’re not listening, combining this with other data—like duration, facial expressions, and body language—gives a clearer picture of behavior. Educators can use this to identify course activities that increase engagement, create confusion and frustration, and more.


But these tools aren’t perfect, and they probably never will be. That’s because of how they work. With facial expression recognition, for example—the AI looks at things like raised eyebrows and lip movements and tries to match them to emotions it’s been trained to recognize. In reality, it’s just grouping facial features into preset emotion categories, not actually figuring out the person’s true emotional state. Plus, we all express emotion differently, and some emotions are expressed in pretty similar ways—how easy would it be for happiness to be misinterpreted as surprise?

AI accessibility checkers

AccessiBe and Equally AI are two AI-powered tools that help institutions, faculty, and staff check their websites, courses, and content for accessibility. These solutions continuously monitor resources and automatically make adjustments to improve accessibility and meet compliance standards.

This isn’t AI, but Accessibility Desk provides a hub of free accessibility tools to check websites and content for accessibility.

Learn more about web accessibility compliance standards and guidelines.

3. AI will incrementally become a staple for studying

AI can be the ultimate study partner because of its diverse capabilities and interactiveness. After providing it with text, notes, files, or images, here’s how students use AI to study smarter.

Creating study guides​ and outlines

Students can create study guides and outlines with either a general-purpose chatbot or a dedicated tool like NotebookLM. Here’s how both approaches work.

Chatbots

Example prompts: Based on the sources I’ve provided, create a detailed [outline or study guide] for [sections/chapters e.g., chapters 2-4]. [Add follow up text from the examples below].

Organize the outline using [preferred format e.g., headers, bullet points, etc.]. Focus on [specific information/goals (e.g., identifying connections between X and Y, comparing and contrasting X and Y)].

Include the following information for each chapter: a high-level summary paragraph, a section explaining key concepts and why they’re important, a bulleted list of terms and definitions that are easy to understand, and a list of potential exam questions.

NotebookLM

With NotebookLM, students can upload PDFs, websites, YouTube videos, audio files, Google Docs, or Slides. NotebookLM quickly turns these into detailed outlines, study guides, FAQs, or even conversational “podcasts” using all or some of the uploaded sources.

NotebookLM interface for students creating study guides

Students upload the sources here, which can range from PDFs and websites to videos and audio files. We picked a video, Google Doc, website article, and three PDFs.

Students can ask the AI questions based on the sources. The AI provides answers based on the sources and cites which source(s) the information is from.

Always check the accuracy of anything AI generates. This is their disclaimer: "NotebookLM can be inaccurate, please double check its responses."

An audio overview can be created where two AI voices discuss the sources in a podcast-like format that actually sounds realistic.

Students simply click the buttons below for comprehensive study guides, briefing docs, FAQs, and timelines.


Flashcards

Here’s an example prompt students can use to engage in an interactive flashcards activity with a chatbot: Create 10 flashcards to test my knowledge of the different parts of a cell based on their function. Present the “front” of the flashcard first, then I’ll answer, and then you present the “back” of the card so I know if I’m correct or not. If my answer is incorrect, please provide the correct answer and 1-2 sentences of additional context.

Summarize, simplify, paraphrase, or rephrase content

Whether students provide an entire study or a paragraph of text, chatbots can help summarize and simplify the information. Here’s how a few chatbots responded to a basic prompt to simplify a few sentences of overly complicated text (text & prompt below).

Text provided to the chatbot*: “Educational technology influenced in many and often centrifugal ways educational innovation as part of societal development. Successive behaviorist, cognitive, constructivist, and socio-constructivist approaches to learning and the concomitant use of technologies suggest a clear, straightforward contribution to education based on the internal dynamics of that field.”

Prompt used: Shorten and simplify this text into two sentences that are very easy to understand. Use practical, simple language instead of complex terms.

*Text is from Lowyck, J. (2014). Bridging learning theories and technology-enhanced environments: A critical appraisal of its history. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (pp. 3–20). Springer New York. https://doi.org/10.1007/978-1-4614-3185-5_1 

Prompt response: Educational technology has helped education grow and adapt over time. Approaches like using rewards to encourage learning, understanding how people think, and learning through real-world experiences have guided how technology is used in teaching.

Prompt response: Educational technology has driven many changes and improvements in education as society develops. Different learning approaches and the use of technology have consistently made education better.

Prompt response: Technology has played a big role in how education has changed over time. Different ways of thinking about learning, like focusing on how people behave, how they think, how they learn by doing, and how they learn from others, have all been shaped by technology and have helped improve education.

Prompt response: Educational technology has helped drive changes in how we teach and learn as society has evolved over time. Different theories about how people learn have shaped how we use technology in education, from simple cause-and-effect approaches to more collaborative learning methods.

4. More faculty will proctor written assignments

With the right prompts, chatbots can generate human-like text on basically any topic from any perspective. The downside is that students can use them to complete written assignments. But the good news is that written assignments can be proctored just like online exams.

During proctored written assignments, students WON’T be able to do the following:

  • Use chatbot applications, websites, or browser extensions.
  • Visit websites, unless the instructor permits access to specific resources like links to particular studies or articles.
  • Refer to books or notes unless approved by the instructor.
  • Paste pre-copied content into the word processor.
  • Search for information on their cell phones.
  • Ask for help from voice assistants like Siri or Alexa.

5. Engaging students with assignments that involve AI

Some instructors will design assignments that actively incorporate chatbots. For example, students might be asked to:

  • Use a chatbot to generate an essay on a specific topic, then analyze its response by identifying errors, making corrections, and explaining their reasoning.
  • Compare responses from different chatbots on the same topic to evaluate their accuracy and identify inconsistencies.
  • Teach the AI about a concept by starting with a prompt like, “Pretend you don’t know much about [topic/subject] other than [specific aspect of the topic]. I’ll explain how it works. Ask me clarifying questions throughout.”

AI is slowly being embraced in higher education—sometimes as a helpful tool, other times as something students try to sneak past their professors. As schools and faculty rethink assignment and assessment strategies, they also need to consider the challenges around privacy, security, and academic integrity at every stage. But when used the right way, AI can make learning more effective, engaging, and fair.

More resources for educators


Web Accessibility Guidelines & Compliance Standards In Online Learning

Article on what web accessibility is and compliance standards

Just as your campus buildings have braille signs, wheelchair ramps, and elevators to make them accessible to everyone, think of web accessibility in the same way: all users, regardless of disabilities or conditions, should be able to access and interact with your online course content and technologies.

Did you know: 

More than 1 in 4 people in the US have a disability. Disabilities can include cognitive, neurological, psychological, speech, auditory, mobility, and visual, among others.

Color blindness (color vision deficiency) affects about 4.5% of the world’s population. This means a school with 5,000 students has about 225, on average, with color vision deficiency.

Color Blindness Simulator​

The comparison images below show how different types of color blindness, also called color vision deficiency, can impact a student’s ability to interpret your online course content.

Each comparison pairs three brightly colored parrots sitting on branches, as seen with normal color vision, against a simulation of:

  • Blue-blind color vision (tritanopia)
  • Green-blind color vision (deuteranopia)
  • Red-blind color vision (protanopia)
  • Total color blindness (monochromacy), shown in black and white



What is web accessibility?

Web accessibility means designing digital resources and content that everyone can access, use, and interact with, regardless of disabilities and socioeconomic factors, like internet bandwidth and speed, that can restrict access.

It involves understanding how students with disabilities interact with learning content, recognizing the challenges they face, and designing learning materials, activities, and resources to address those challenges.

Why is accessibility important in online learning​?

Accessibility in online learning is important because it gives every student a fair chance to learn, participate, navigate, and interact with course content and activities. Without it, many students could face challenges that prevent learning or even accessing courses altogether.

POUR principles

There are four principles of accessibility that are known as POUR principles. POUR is an acronym that stands for perceivable, operable, understandable, and robust.

Perceivable

Students can sense and access online elements through vision, hearing, or assistive technologies.

Operable

Students can navigate and interact with content using controls, buttons, or navigation tools.

Understandable

Students can easily comprehend and use the content and interface.

Robust

Content works across devices, browsers, and assistive technologies.
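The POUR principles map directly onto everyday markup choices. The snippet below is a hypothetical course-page fragment (all file names and labels are made up) showing one small example of each principle:

```html
<!-- Perceivable: give non-text content a text alternative -->
<img src="reading-load-chart.png"
     alt="Bar chart of weekly reading load for weeks 1 through 10">

<!-- Operable: native links and buttons are keyboard-accessible by default -->
<a href="module-2.html">Go to Module 2</a>
<button type="button">Check answer</button>

<!-- Understandable: label every form field so its purpose is clear -->
<label for="student-email">Email address</label>
<input id="student-email" type="email" autocomplete="email">

<!-- Robust: semantic landmarks work across browsers and assistive tech -->
<nav aria-label="Course navigation">
  <a href="syllabus.html">Syllabus</a>
</nav>
<main>
  <h1>Week 3: Course Overview</h1>
</main>
```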

Web accessibility guidelines and standards

There are several guidelines specific to web accessibility, as well as broader compliance standards that we’ll cover in this article:

  • Guidelines: WCAG, Mobile, UAAG, ATAG, EPUB
  • Standards: ADA, Section 508, EN 301549

Are guidelines and standards the same thing? No, web accessibility guidelines and standards are different. Guidelines are best practices and recommendations, while standards are legally required. BUT some standards require compliance with guidelines.

Web accessibility guidelines

Web Content Accessibility Guidelines (WCAG)

WCAG provides technical standards and best practices to create more accessible digital content that supports every user, regardless of disabilities, content type, or device. Instructional designers and instructors can use WCAG to create accessible course content for people with disabilities, such as physical and mobility conditions, blindness, deafness, speech difficulties, and learning disabilities.

Examples of WCAG recommendations:

  • Consistent course format: Use a consistent layout across course materials to make navigation easy (e.g., keeping headers, footers, and navigation in the same place).
  • Easy to read text: Use accessible font sizes, high color contrast, and appropriate spacing to improve readability.
  • Text alternatives for non-text content: Add text descriptions to all multimedia used in your course. For example, alt-text for images, captions and subtitles for videos, transcripts for audio, etc.
  • Keyboard accessibility: Make all course elements usable and navigable by keyboard (e.g., navigation, reading materials, quizzes, etc.).
  • Avoid flashing elements: Don’t use any elements in your course that flash more than three times per second.
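To make a few of these recommendations concrete, here’s a hypothetical course-page snippet (the file names are placeholders) showing text alternatives for non-text content and a keyboard-friendly control:

```html
<!-- Text alternative for an image -->
<img src="cell-diagram.png"
     alt="Diagram of an animal cell with the nucleus, mitochondria, and membrane labeled">

<!-- Captions for a video, provided as a WebVTT track -->
<video controls>
  <source src="lecture-week1.mp4" type="video/mp4">
  <track kind="captions" src="lecture-week1.vtt" srclang="en" label="English">
</video>

<!-- Keyboard accessibility: a real <button> is focusable and activates
     with Enter/Space; a clickable <div> is not -->
<button type="button">Start quiz</button>
```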

Mobile Accessibility Guidelines

W3C, the same standards organization that created the other guidelines in this article, provides broad guidelines for designing content that is accessible and works well on mobile devices.

Mobile accessibility tips for online courses:

  • Optimize for small screens: Minimize the information on each page (e.g., fewer images and modules) and use responsive design to adjust the layout.
  • Support zoom/magnification: Allow students to zoom in on content.
  • Use high color contrast: High contrast is important for readability on mobile devices, especially in bright or changing light conditions like outdoor sun glare.
  • Simplify touchscreen gestures: Use simple gestures, like single or double tapping or swiping, and provide an on-screen indicator or advisement of the gestures.
  • Easy-to-access buttons: Place buttons and interactive elements in locations where they’re easy to reach and usable with one hand.
  • Key information at the top: Place important information at the top of the page to assist users with low vision and cognitive impairments with locating content without extra interaction.
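A few of these tips translate directly into markup and styles. The sketch below is illustrative only (the .course-grid class and the 600px breakpoint are made-up examples, not values from any standard):

```html
<!-- Responsive layout and zoom: let the page scale to the device, and
     avoid user-scalable=no, which blocks pinch-to-zoom magnification -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Touch targets large enough to tap reliably with one hand */
  button { min-width: 44px; min-height: 44px; }

  /* High contrast text for bright or changing light conditions */
  body { color: #1a1a1a; background: #ffffff; }

  /* Collapse multi-column layouts to a single column on small screens */
  @media (max-width: 600px) {
    .course-grid { display: block; }
  }
</style>
```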

User Agent Accessibility Guidelines (UAAG)

UAAG makes internet browsers and media players (YouTube, Vimeo, Spotify, etc.) easy to use and compatible with assistive technologies. These guidelines help create a more accessible, consistent experience regardless of the browser or media player used in your courses. UAAG is important for online courses because it allows all users—students, faculty, and staff—to access and interact with learning materials regardless of their accessibility needs, devices, and platforms.

Authoring Tool Accessibility Guidelines (ATAG)

ATAG makes website builders and content editors more accessible for designers and developers with disabilities. This is important for online courses because it allows instructional designers, e-learning staff, and instructors to create e-learning materials that all students can access and use.

ATAG is relevant for authoring tools used to:

  • Create web pages (WYSIWYG HTML editors)
  • Generate websites (learning management systems (LMS), courseware tools, content management systems)
  • Convert documents to web technologies (word processors and HTML converters)
  • Save content in web formats (word processors and desktop publishing apps)
  • Quickly update portions of web pages (blogs, forums, etc.)
  • Produce digital multimedia content (video editors)
  • Send emails with messages using web content technologies (Mailchimp, Hubspot, etc.)
  • Create mobile applications (Adobe XD, Figma, etc.)

Ebook Accessibility (EPUB)

EPUB accessibility guidelines help make ebooks and other digital publications accessible to people with disabilities by supporting screen readers, making the content easy to navigate, and including alt text for images. In online learning, these guidelines allow all students, including those with visual or cognitive impairments, to access and engage with ebooks and other digital publications.

What makes an ebook accessible?

Accessible ebooks make reading easier for everyone, like screen reader compatibility for students with vision impairments, read-aloud text for learners with dyslexia, and switch control support for those with motor disabilities. Some also offer extra features like sign language videos for deaf users or simplified text for learners with intellectual disabilities.

Accessibility Compliance Standards

Americans with Disabilities Act (ADA)

What is ADA compliance?

The Americans with Disabilities Act (ADA) is a federal civil rights law, enforced in part by the Department of Justice, that protects people with disabilities by prohibiting discrimination. The ADA grants equal access to public spaces, jobs, transportation, telecommunications, and online services, like education. It requires schools, businesses, and other organizations to make reasonable accommodations so everyone can participate fully.

ADA includes web accessibility guidelines like making text and links easy to read, adding alt text to describe images, and providing captions or transcripts for audio and video content to ensure clarity and consistency.

For colleges and universities, ADA compliance—including accessible web content—provides equal access to programs, resources, and facilities for all students, faculty, and staff, which builds a more inclusive environment that meets compliance standards.

Who should follow ADA compliance standards?

The ADA impacts many industries and organizations, including private employers, public organizations (such as schools and universities), and government agencies at various levels. It expects organizations to self-regulate their web accessibility and encourages them to follow WCAG guidelines as well.

Section 508

What is Section 508?

Section 508 is a law that requires federal agencies to make their electronic and information technology—websites, software, and multimedia—accessible to people with disabilities. Complying with Section 508 also requires agencies to make online training accessible.

How does Section 508 apply to higher education?

While Section 508 primarily applies to federal agencies, its standards influence higher education institutions, especially those that receive federal funding and must therefore meet accessibility requirements like Section 508. This alignment with other accessibility standards helps provide all students equal access to e-learning resources and educational opportunities.

EN 301 549

EN 301 549 is a European standard that sets accessibility requirements for digital products and services so they’re usable by people with disabilities. It builds on WCAG but also covers requirements for making biometric technologies, such as facial recognition and fingerprint scanning, accessible. While EN 301 549 is primarily used by European Union (EU) countries, other countries, including Canada, Australia, Sweden, and Norway, use it as a framework for their accessibility efforts.

Accessibility Checkers and Compliance Testing

These web accessibility checkers can help you create online courses that are accessible for everyone:

WCAG compliance checker

Accessibility color checker

Website accessibility checker

Video captioning tools

Microsoft Office

Functional web accessibility evaluator

Finish the 3-Part Accessibility Series
Sign up for more resources

Future AI Trends in Online Education

AI tools to improve online education

While some of the AI tools you’ll see aren’t widely used in online learning yet (they’re mostly used by businesses), we’ll explain how they work and how they can be adapted for online learning.

1. Emotion AI

Understanding emotions is complex because we all feel and express them in different ways. While emotion AI isn’t perfect, it can help instructors improve engagement and support learners where they need it most.

What is emotion AI?

Emotion AI, sometimes called affective computing, detects and interprets human emotions by analyzing text, voice (audio), and video and associating specific components with related emotions.

How can emotion AI be used in online learning?

Emotion AI technology can help learners with cognitive and/or physical disabilities communicate; detect signs of confusion and frustration; and pinpoint which course activities learners find interesting or uninteresting.

What are the types of emotion AI?

Text Emotion AI

Analyzes written language to identify the sentiment and emotional tone of the content.

Example uses: analyzing written responses, like forum posts and course evaluations, to identify, understand, and address emotional states.
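To make the idea concrete, here’s a minimal sketch of sentiment scoring. Real text emotion AI uses trained language models, not word lists; the tiny lexicon below is an invented example that only illustrates the core idea of mapping words in a forum post to an emotional signal.

```python
# Toy lexicon-based sentiment scorer (illustrative only).
# Production tools learn these associations from large labeled datasets.

POSITIVE = {"love", "great", "clear", "helpful", "excited"}
NEGATIVE = {"confused", "frustrated", "lost", "unclear", "stuck"}

def sentiment(post: str) -> str:
    """Classify a post as positive, negative, or neutral from word matches."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I'm confused and stuck on this week's reading."))  # negative
print(sentiment("Great lecture, the examples were really clear!"))  # positive
```

An instructor-facing tool would run something like this over every forum post and flag sustained negative trends for follow-up.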

Voice Emotion AI

Identifies emotions based on vocal characteristics like volume, tone, pitch, and speed.

Example uses: monitoring learners’ spoken responses during online classes and virtual presentations to detect nuances in learner interest or distress.

Video Emotion AI

Observes body language, facial expressions, and gestures to determine emotional states.

Example uses: observing facial expressions and body language during video conversations and online exams to understand learners’ confusion and levels of interest or disinterest.

The images below illustrate how video emotion AI software might appear from an instructor’s perspective when reviewing recordings, offering insights into learners’ interest and engagement.

“High Attention” highlights that learner attention increases when making eye contact.

“Low Attention” shows a decreased level of attention when they look away from the screen.

Eye contact with a webcam doesn’t necessarily mean a learner is really paying attention, and looking away doesn’t mean they aren’t listening. But when paired with other metrics, like how long they looked away, facial expressions, and other body language, it can help build a broader understanding of behavior and emotion.

Educators can use these individual insights or similar data from all learners to understand certain activities that boost attention and engagement, which topics confuse learners, and more.
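Here’s a hypothetical sketch of how several of those signals might be combined into a single attention estimate. The signal names, weights, and thresholds are illustrative assumptions, not any vendor’s actual model.

```python
# Combine simple engagement signals into a 0-1 attention estimate.
# Weights and signals are invented for illustration.

def attention_score(eye_contact_ratio: float,
                    look_away_seconds: float,
                    expression_engaged: bool) -> float:
    """Heuristic attention estimate from gaze, glance duration, and expression."""
    score = 0.6 * eye_contact_ratio                     # fraction of time facing screen
    score += 0.2 if expression_engaged else 0.0         # engaged facial expression
    score += max(0.0, 0.2 - 0.02 * look_away_seconds)   # penalize long glances away
    return round(min(score, 1.0), 2)

print(attention_score(0.9, 2, True))    # 0.9  -> high attention
print(attention_score(0.3, 30, False))  # 0.18 -> low attention
```

The point is that no single signal decides the score: a brief glance away barely moves it, while low eye contact plus a disengaged expression does.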

High Attention: example of high attention detected by video emotion AI

Low Attention: example of low levels of attention in video emotion software


2. AI language tools that go beyond translations

Newer language-related AI can understand the nuances of languages like slang, accents, and dialects to build truly global dialogues in online courses.

Accent Recognition AI

“Sorry, I didn’t quite catch that.”

Even if you’re a native English speaker, you’ve probably heard this or something similar from voice assistants. They’re about 95% accurate, sometimes more or less depending on which one you’re using, which is pretty good, right?

Sure, but that still means every 20th word is wrong, which is the exact length of the sentence you’re reading.

However, the accuracy varies depending on your accent… even if you’re a native English speaker. 

Question

Which two U.S. accents are voice assistants more likely to understand?

  1. Southern
  2. Midwest
  3. Western
  4. Eastern

Answer

1 & 3: Southern and Western accents were understood more often than Eastern and Midwest accents.

Source: Washington Post, 2018 research

The research also found that people with non-native accents experienced 30% more inaccuracies when using voice assistants, with Spanish and Chinese accents being the least accurate. Imagine how frustrating that would be.

But there’s good news: accent recognition AI tools trained on extremely diverse data can better understand accents. They analyze and interpret speech patterns, intonations, and pronunciations specific to different accents.

Accent recognition AI can help improve the operability of voice-controlled technologies and generate more accurate live captioning and transcriptions.

Name Pronunciation AI

What’s an easy way to stifle a sense of inclusion and belonging? Mispronouncing someone’s name.

Even if it’s an innocent mistake, it’s probably something that person encounters daily. But there’s a solution that can help: name pronunciation AI.

How does name pronunciation AI work?

  • Integrates throughout your online courses within the LMS, SIS, and other platforms to ensure that names are accurately pronounced.
  • Uses databases of audio name pronunciations and algorithms that recommend correct pronunciations.
  • Learners voice-record their names, and the recording is available throughout the platforms.

Name pronunciation AI can be used in:

  • Online class discussions
  • One-on-one advising sessions
  • Virtual information sessions
  • Recruitment conversations
  • Student support

Realistic Sign Language Interpretation AI

Instead of using cartoon-like avatars, newer sign language interpretation AI creates a more realistic experience by splicing together videos of real people signing.

Some sign language AI can also provide real-time translation of spoken language and text to sign language and vice versa, making content accessible to deaf or hard-of-hearing learners.


3. Automatically finding leaked test content

Have you ever found your test questions leaked on the internet?

“Homework help” sites like Chegg and discussion forums like Reddit and Quora make it easy to find and share your test questions and answers. 

You have 3 ways to tackle leaked test content:

1. Manually searching the internet on your own

You search the internet for individual test questions and send takedown requests if you find any.

2. Manually searching but with AI’s help

You select individual questions that the AI will search for and send your own takedown requests.

3. Automatically searching with AI

The AI does all the work by automatically searching the internet for all of your test content in a few minutes and giving you the ability to send one-click takedown requests.

Automating this process is the best way, so here’s how it works with Search & Destroy™:
  • Search & Destroy™ automatically searches for all of your exam questions
  • Search results show where any of your questions are leaked
  • Send one-click takedown requests to sites displaying your questions

That’s it. No more leaked content concerns.

4. Large Language Models can help review college admissions essays without bias

Completely eliminating biases isn’t realistic, but they can and should be acknowledged and addressed, and AI can help, even though it can be biased too.

How can AI be biased?

AI reflects the biases of the people using it, the data it’s trained on, and the ways it’s used.

For example, if AI is trained on biased data, such as data that underrepresents certain groups, there’s a ripple effect that can impact algorithms, outputs, and future models.

The good news is that AI biases, similar to human biases, can be recognized and addressed to help reduce them.

Which would you pick?

If you were in charge of admissions—and let’s pretend time constraints don’t exist—would you:

  • Only review objective data, like standardized test scores and GPAs
  • Review objective data and understand other areas of who they are, like their personal qualities

Generally speaking, most would choose to consider personal qualities like personality, character, leadership, and life experiences.

While personal qualities are more subjective than test scores, research shows that they can predict success in school and life. Reviewing these qualities takes more time than reviewing objective information, like test scores and GPAs. But Large Language Models can help.

Large Language Models (LLMs)

What are large language models?

Large Language Models are a type of AI that can understand, interpret, and generate human language by analyzing and learning from extensive datasets.

How do LLMs work?

LLMs are trained by “reading” billions of pieces of text from various sources, like internet articles and forums, scientific research, textbooks, newspapers and magazines, and more.

This training helps them learn patterns and understand how words and sentences are formed in different formats and contexts.

LLMs don’t actually understand language; they’re just really good at predicting which word should come next. The two model types we’ll discuss are unidirectional and bidirectional:

  • Unidirectional: predicts the next word based on previous words
  • Bidirectional: analyzes text from both directions to predict a word in context
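The difference can be shown with a toy word-count model over a tiny made-up corpus. Real LLMs learn these patterns from billions of documents with neural networks, not counts; this sketch only illustrates what "left context only" versus "context from both sides" means.

```python
# Toy demo of unidirectional vs. bidirectional prediction (illustrative only).
from collections import Counter

corpus = [
    "students submit the essay online",
    "students submit the quiz online",
    "students review the essay offline",
]

def unidirectional(prev_word: str) -> str:
    """Predict the next word from the previous word only (left context)."""
    nexts = Counter()
    for sentence in corpus:
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            if a == prev_word:
                nexts[b] += 1
    return nexts.most_common(1)[0][0] if nexts else ""

def bidirectional(left: str, right: str) -> str:
    """Predict a masked word using the words on both sides of it."""
    middles = Counter()
    for sentence in corpus:
        words = sentence.split()
        for a, b, c in zip(words, words[1:], words[2:]):
            if a == left and c == right:
                middles[b] += 1
    return middles.most_common(1)[0][0] if middles else ""

print(unidirectional("the"))             # most frequent word after "the"
print(bidirectional("the", "offline"))   # word seen between "the" and "offline"
```

Unidirectional prediction only sees what came before the blank; bidirectional prediction can use the word after it too, which is why models like RoBERTa excel at understanding a word in context.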

RoBERTa vs. ChatGPT

Both are LLMs that share the same architecture but excel in specific tasks, similar to cars with the same frame but different tires and suspension systems for certain terrains.

  • RoBERTa (bidirectional) drives better in the city (understanding language nuances and context), but it can still make it on certain off-road trails (creating content).
  • GPT (unidirectional) drives best on off-road trails, but can navigate some city streets.

The University of Pennsylvania used an LLM, RoBERTa, to review college admissions essays for personal qualities

Research published in October 2023 by the University of Pennsylvania indicates that certain LLMs, if trained properly and thoroughly, can review admissions essays for personal qualities that predict college graduation on par with human admissions staff.

The researchers and their team analyzed over 300,000 college essays and scored them on the absence or presence of seven traits: prosocial purpose (helping others), leadership, learning, goal pursuit, intrinsic motivation, teamwork, and perseverance.

Then they trained RoBERTa to recognize and evaluate similar qualities and characteristics in essay submissions without showing biases toward race, gender, or socioeconomic status. RoBERTa was used because it excels at understanding the context and meaning of language, which makes it an effective tool for understanding emotions, text classification, and translations.

Research results and takeaways

RoBERTa recognized personal qualities without bias

It recognized qualities like teamwork and intrinsic motivation in applicants from diverse backgrounds, without showing bias towards race, gender, or socioeconomic status.

RoBERTa’s predictions were accurate

Its predictions of students’ likelihood of graduating were slightly more accurate than human reviewers’, but not by much.

The researchers recommend using AI with optimism and caution

“An AI approach to measuring personal qualities warrants both optimism and caution… We recommend AI be used to augment, not replace, human judgment. No algorithm can decide what the goals of a university’s admissions process should be or what personal qualities matter most for that community.”

5. Preventing remote access software contract cheating

Have you ever had a support technician take over your desktop and fix your computer from a remote location?

That’s basically how remote access software is used to cheat on exams.

A person pays a test-taking service to have one of their experts control their computer and take the exam from a remote location.

Even though the person getting credit appears to be sitting in front of the camera during the test, it’s the off-camera expert who is actually answering the questions.

And since the person getting credit stays on screen during the exam, ID verification methods won’t help.

How can you stop remote access cheating?

Honorlock’s remote proctoring platform has a few ways to help:

  • Recording the desktop and requiring specific keyboard commands immediately before starting the exam: Exam admins use Honorlock’s exam settings or test rules to require test takers to use keyboard commands, such as Ctrl+Alt+Del (Windows) or Cmd+Opt+Esc (Mac), to display the applications and processes running on the device.
  • Displaying countries: Honorlock’s Analytics Hub™ shows the countries that tests were taken in based on IP address. If any tests are taken in countries with no known test takers, it may indicate the use of remote access test-taking services.
  • Blocking applications: Honorlock’s proctoring platform gives exam administrators the ability to block specific applications that can be used for remote access.

6. Using AI for on-demand tutoring

Whether intentionally built for tutoring or adapted for it, AI tools such as intelligent tutoring systems, chatbots, and writing assistants offer 24/7 interactive tutoring and support. This enhances learning while building a more diverse and inclusive educational environment that accommodates various learning styles and needs.

They also offer real-time feedback, which is crucial for learners with intellectual disabilities because it helps them make connections between their work and the instructor’s feedback.

Intelligent Tutoring Systems (ITS)

Intelligent Tutoring Systems simulate one-on-one human tutoring, offering tailored feedback and adapting course materials to meet each learner’s needs. 

They can guide learners through problem-solving steps, offer hints, break down complex topics, and recommend additional relevant content.

While an ITS can benefit any subject, it’s particularly beneficial for subjects like math, which require—for the most part—a lot of repetition.

Here’s how an ITS could work for an algebra course:

  • The ITS assesses the learner’s understanding of basic algebraic concepts and their ability to solve problems.
  • After assessing knowledge, it customizes learning activities and offers additional content based on their needs. 
  • If the learner struggles in a specific area, it provides extra help, like step-by-step explanations and practice problems.
  • If the learner excels in a topic, the ITS gradually progresses to more advanced concepts and activities.
  • The ITS provides immediate feedback and additional context, allowing learners to recognize errors and learn to correct them.
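The adaptive loop in the steps above can be sketched as a simple rule for choosing the next difficulty level. The levels, streak lengths, and cap are invented for illustration; a real ITS uses much richer learner models.

```python
# Hedged sketch of ITS difficulty adaptation (illustrative rules only).

def next_difficulty(level: int, recent_correct: list[bool]) -> int:
    """Move up after 3 straight correct answers, down after 2 straight misses."""
    if len(recent_correct) >= 3 and all(recent_correct[-3:]):
        return min(level + 1, 5)   # advance, capped at the hardest level
    if len(recent_correct) >= 2 and not any(recent_correct[-2:]):
        return max(level - 1, 1)   # drop back and offer step-by-step help
    return level                   # otherwise stay and keep practicing

print(next_difficulty(2, [True, True, True]))    # 3: learner advances
print(next_difficulty(2, [True, False, False]))  # 1: extra help needed
print(next_difficulty(2, [True, False, True]))   # 2: stay at current level
```

In a real system, this decision would also draw on response times, hint usage, and the specific concepts missed, not just right/wrong streaks.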

Chatbots

ChatGPT, Google Bard, and other chatbots can provide learners with instant tutoring and support anytime they need it.

Learners can use chatbots to dig deeper into complex subjects, brainstorm ideas, provide feedback on written work, translate text, and check writing quality.

AI writing assistants

Writing tools like Grammarly and Quillbot have been around for a few years now, but they’re evolving.

Initially, they just helped improve writing by correcting grammar, spelling, and style issues.

But now, they’re incorporating AI that can instantly make writing more concise, easier to understand, or sound a specific way, such as being more assertive.

These tools are particularly useful for individuals writing in a second language and for those with learning disabilities, as they can boost confidence and deliver different forms of corrections and feedback immediately.

Regardless of which AI tools you use, make sure that they’re used purposefully, ethically, and with full transparency. Always keep in mind that AI tools are just that—tools. They aren’t a replacement for the people using them.

Sign up for more online learning resources