How to use AI to create accessible & equitable online courses

AI isn’t new to online learning (or in-person). Educators use it to automate tasks that eat up time, analyze data, and even customize learning paths. While some of the AI tools you’ll see aren’t widely used in online education (yet), they could—and probably should—be soon.

But keep this in mind: AI can help educators create equitable learning environments and accessible online courses, but it can’t replace educators.

AI to improve equity & accessibility in online courses

Emotion AI

Emotion AI analyzes our behavior, how we speak, and what we write, then associates these with specific emotions. There are three primary types of emotion AI: video, text, and voice.

Video Emotion AI

Analyzes body language, facial expressions, gestures, and other movements to determine emotional states.

Text Emotion AI

Identifies emotions based on the tone, word choices, patterns, and other aspects of written text.

Voice Emotion AI

Listens to vocal characteristics like volume, pitch, and pace to determine emotional states.

Considerations before using emotion AI

Emotions are complex, and how we express them varies, so keep these considerations in mind to use emotion AI effectively and ethically:

  • Each person expresses emotions differently.
  • Physical conditions or disabilities may limit facial expressions or cause unintended body movements, which could give inaccurate feedback on emotional states.
  • Cultural differences can impact behavior, gestures, speech, etc.
    • For example, eye contact is a sign of interest and respect in some cultures, while other cultures avoid eye contact to show respect.

Video Emotion AI

Video emotion AI uses a combination of technologies to analyze and understand facial expressions and body language, like posture, gestures, and eye contact. 

  • Computer vision AI helps computers see the world as humans do by analyzing and interpreting the environment and objects around them.
  • Facial expression recognition detects facial expressions and microexpressions by analyzing facial landmarks.
  • Body language and gesture recognition identify emotions based on body movements and gestures. For example, touching the face can indicate stress, and tilting the head can indicate confusion.

How to use video emotion AI in online courses:

  • Understand and accommodate nonverbal communication: helps learners with conditions or disabilities that affect verbal communication express themselves through gestures, for example.
  • Measure engagement: detects signs of disinterest and confusion, which can allow educators to tailor course activities in real time.

The example images below show how video emotion AI may appear to instructors using it to understand learners’ attention levels during specific course activities.

Attention is high while making eye contact, but it decreases when they look away.

[Images: "High Attention" and "Low Attention" examples from video emotion AI software]

A learner looking away, especially from a computer screen, doesn’t mean they aren’t paying attention. And looking at the screen doesn’t mean they are paying attention (or interested).

But considering that information with other aspects, like body language, facial expressions, and how long they looked away, can help instructors begin to understand emotions to a certain extent.

Instructors can use these insights to identify engaging topics and content formats, areas to revamp, and where learners are struggling. 
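As a rough illustration of combining these signals, the sketch below blends gaze and expression data into a single engagement estimate. The signal names, weights, and thresholds are purely illustrative assumptions, not how any particular product works:

```python
# Hypothetical sketch: blending video emotion AI signals into one
# 0-100 engagement score. Weights and penalties are illustrative.
def engagement_score(eye_contact_pct: float,
                     seconds_looking_away: float,
                     confused_expression: bool) -> float:
    """Combine gaze and expression signals into a rough engagement estimate."""
    score = eye_contact_pct                 # baseline: % of time gaze is on screen
    score -= min(seconds_looking_away, 30)  # cap the look-away penalty
    if confused_expression:
        score -= 10                         # a confused expression lowers the score
    return max(0.0, min(100.0, score))      # clamp to the 0-100 range

print(engagement_score(80.0, 5.0, False))  # 75.0
```

The point of the sketch is that no single signal decides the score; looking away briefly only nudges the estimate down.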

Text Emotion AI

Text emotion AI analyzes text to determine emotions based on two primary approaches: lexicon (our vocabulary) and machine learning.

  • Lexicon-based: matches the text to a database of keywords and phrases associated with specific emotions.
  • Machine learning: compares the text to databases of text that it was trained on and continually learns from to recognize patterns and context. This goes beyond simple keyword matching, as seen in the lexicon-based approach.
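As a toy illustration of the lexicon-based approach, the sketch below matches a learner's words against a tiny, made-up emotion lexicon. A real system would use a large, validated emotion database:

```python
# Minimal sketch of a lexicon-based text emotion classifier.
# The lexicon below is a toy example, not a real emotion database.
EMOTION_LEXICON = {
    "frustration": {"stuck", "confusing", "impossible", "annoying"},
    "confidence": {"easy", "clear", "understand", "confident"},
}

def score_emotions(text: str) -> dict:
    """Count lexicon hits per emotion in a learner's written response."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return {emotion: len(words & keywords)
            for emotion, keywords in EMOTION_LEXICON.items()}

response = "This assignment is confusing and I am stuck."
print(score_emotions(response))  # frustration scores higher than confidence
```

Machine-learning approaches replace the fixed keyword sets with learned patterns, which is why they handle context and phrasing that a lexicon misses.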

How to use text emotion AI in online courses:

  • Identifying where learners are struggling: detecting signs of confusion or frustration in written submissions, like discussion responses, can help provide timely support and course materials that address learners’ needs.
  • Personalizing feedback: understanding the emotional context of learners’ writing can help educators provide more personalized, empathetic, and supportive feedback in some cases.
  • Understanding course feedback in-depth: analyzing learners’ course feedback can provide a deeper understanding of their learning experiences and feelings about the course, which can be used to refine course content and activities.

Voice Emotion AI

Voice emotion AI listens to vocal characteristics—very carefully!—to understand emotions.

Vocal characteristics can include how fast and loud you speak, speech patterns, enunciation, inflection, etc.

Business use cases can translate to online courses

Companies use voice emotion AI to provide customer service employees with real-time insights that indicate when customers are growing frustrated, confused, and angry. If integrated and used appropriately, it can offer educators similar insights about learners in their courses.

How to use voice emotion AI in online courses:

  • Presentation feedback: providing learners with in-depth feedback on their presentations based on pacing, volume, pitch, and other vocal qualities.
  • Adapt course activities based on interest: analyzing vocal characteristics such as tone and intensity can help gauge interest levels in real-time during a lecture, allowing instructors to switch to more interactive activities if disinterest is detected to re-engage learners.
  • Identify signs of stress and anxiety: detecting stress and anxiety through vocal characteristics allows educators to immediately intervene and offer support.

Challenges of using voice emotion AI

  • Subjectivity: interpreting speech is subjective, and it can be difficult to accurately identify emotions. For example, yelling is a sign of anger, but it can also express excitement.
  • Slang: recognizing and interpreting slang words and phrases is difficult if the AI hasn’t been trained on them.
  • Dialect and accents: the accuracy of voice emotion AI can be impacted by dialects and accents because they change speech patterns, pronunciation, and even the meaning of words in the same language.

These challenges highlight the importance of training voice emotion AI on diverse datasets that include a wide range of accents, dialects, and cultural speech patterns. This training helps the system become more adaptable and accurate in interpreting emotions from speech with different accents.

Get the 3 part ebook
AI for DEI in Online Education

Part 1: AI Tools for DEI

Part 2: AI to Develop Equitable Admissions

Part 3: How to Build Accessible Online Courses

AI for Tutoring & Support

Whether intentionally built for tutoring or adapted, AI tools can help create genuine tutoring experiences similar to one-on-one tutoring sessions. These tools offer a tailored experience and real-time feedback, which is crucial for learners with intellectual disabilities because it helps them make connections between their work and the instructor’s feedback.1

Intelligent Tutoring Systems (ITS)

ITS use AI to replicate human tutoring by providing learners with immediate feedback and content adapted to their needs—whether they’re struggling or excelling in a topic—based on their performance, learning styles, and even their preferences. In other words, ITS meet learners where they are in the learning process and give them what they need, when they need it.

They help learners incrementally digest complex course content by breaking information into smaller chunks, offering hints, and recommending extra content in formats that benefit them the most.

Here’s how an ITS could work for an algebra course

Initial assessment: the ITS assesses the learner’s understanding of basic algebraic concepts and ability to solve problems.

Customized learning: after the assessment, it tailors learning activities and offers additional content based on the learner’s performance.
  • If the learner struggles in a specific area, the ITS provides extra help, like step-by-step guides and practice problems.
  • If the learner excels in a topic, the ITS gradually progresses to more advanced concepts and activities.

Immediate feedback: the ITS provides instant feedback and additional context, allowing learners to recognize errors and learn to correct them.
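The adaptive branching in that flow can be sketched in a few lines. The score thresholds below are illustrative assumptions; a real ITS models learner knowledge far more richly:

```python
# Sketch of ITS-style adaptation, assuming a 0-100 assessment score
# and illustrative thresholds (the real logic would be far richer).
def next_step(topic: str, score: int) -> str:
    """Pick the next activity from an initial-assessment score."""
    if score < 60:
        # Struggling: provide extra help before moving on
        return f"{topic}: step-by-step guide + practice problems"
    if score > 85:
        # Excelling: progress to harder material
        return f"{topic}: advance to more challenging concepts"
    # On track: keep practicing with instant feedback
    return f"{topic}: continue current activities with instant feedback"

print(next_step("linear equations", 45))
```

The key design idea is that every branch ends with content, not a dead end: the struggling learner gets scaffolding rather than a failing grade.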

Chatbots

Chatbots are great for interactive tutoring, helping to break down complex subjects, provide instant answers and feedback, brainstorm, and more. The only caveat is that chatbots can sometimes provide incorrect information, so learners need to double-check what they receive during these activities.

Chatbots can help with tutoring needs such as:

  • Interactive learning activities: engaging learners with real-time activities like Q&A sessions; role-playing, such as real-time debates; and memorization activities, like flashcard drills.
  • Instant feedback and response: providing feedback on a variety of assignments, such as long or short-form essays, and responding instantly to most questions.
  • Language practice: helping improve language by practicing writing and conversing in real-time, building vocabulary, improving grammar, and translating content.

Chatbot cheating concerns?

ChatGPT and other generative AI chatbots are pretty controversial in education because they can answer most questions—accurately, for the most part—in an instant. Equally concerning is that they can write just about anything, and plagiarism and AI detection tools often can’t catch them.

If these detection tools don’t work, how do you stop chatbot cheating?

Honorlock’s online proctoring platform has several tools that block and detect unauthorized AI tools during online exams and while test takers complete assignments, like writing essays.

AI Writing Assistants

Grammarly, QuillBot, and other AI writing assistants have evolved from spotting punctuation and spelling errors to suggesting improvements to writing style, clarity, organization, and tone.

They can also help boost confidence in people writing in a second language and individuals with learning disabilities by delivering feedback and corrections in different formats.

AI writing assistants help learners by:

  • Bridging language gaps: helps learners with language barriers or learning disabilities polish their written work, ensuring their ideas are clearly articulated.
  • Offering immediate feedback: checks writing instantly and provides real-time feedback in different formats.
  • Providing feedback in various formats: offers indirect feedback (underlines errors without corrections) and direct feedback (shows errors with corrections).

Platforms with AI assessment tools & features

The platforms in this section aren’t necessarily AI themselves, but they use AI features and integrate with other AI-powered platforms and tools that can make online assessments accessible and equitable.

Learning Management Systems

Learning Management Systems (LMS) are the bread and butter of online learning; they’re the central hub where online courses operate and where other tools plug in.

When used to the fullest, the LMS offers the ability to assess knowledge in different ways, customize settings and accommodate learners, and provide immediate scores.

Automated grading

Automated grading tools within the LMS save instructors time, but they also help support learners with intellectual disabilities1. The immediate feedback from automated grading can help these learners recognize correct answers, which reinforces what they’ve learned.
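As a minimal sketch, automated grading of multiple-choice items with instant feedback could look like the snippet below. The answer key and question IDs are illustrative; real LMS graders support many more question types:

```python
# Sketch of automated grading with immediate feedback, assuming simple
# multiple-choice items keyed by question ID.
ANSWER_KEY = {"q1": "b", "q2": "d"}

def grade(submission: dict) -> dict:
    """Return per-question feedback the moment a quiz is submitted."""
    return {
        q: "correct" if submission.get(q) == correct
        else f"incorrect (review: correct answer is {correct})"
        for q, correct in ANSWER_KEY.items()
    }

print(grade({"q1": "b", "q2": "a"}))
```

Because the feedback arrives immediately, the learner can connect each answer to the outcome while the question is still fresh, which is the reinforcement effect described above.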

Accessible assessments

Most learning platforms follow or comply with web accessibility guidelines and requirements, like the Web Content Accessibility Guidelines (WCAG) and Section 508, which helps create accessible online learning environments and equitable access for all learners.

Part of that effort means supporting assistive technologies, which are crucial for test takers with disabilities, whether they are devices such as assistive keyboards or AI like voice dictation.

For example, the LMS integrates with speech-to-text AI, allowing test takers with physical disabilities to ‘write’ essays by speaking. The LMS also enables them to record video and audio responses, giving those with disabilities more ways to show what they know.

Remote Proctoring AI Tools

While remote proctoring prevents cheating to level the playing field, it also makes online assessments more convenient and flexible.

AI proctoring tools to prevent cheating

If test takers can use unauthorized resources to complete their assessments, exam integrity and equity are out the window.

Here are a few of Honorlock’s AI-based remote proctoring tools:

  • Cell phone detection: detects phones and other devices so you don’t have to rely on a proctor seeing them and detects when test takers try to use them to look up answers.
  • Voice detection: listens for keywords (not ambient noise) that may indicate cheating, like “Hey Siri,” for test takers who may ask a voice assistant for answers.
  • Secure browser: prevents test takers from accessing other browsers and using keyboard shortcuts and records their desktops.
  • Content protection: automatically finds leaked test content on the web and gives admins a one-click option to send content removal requests.

These proctoring features (and others, like ID verification and video monitoring) monitor assessments, and if they detect potentially problematic behavior, they alert a live proctor to review the situation and intervene if necessary.

Hybrid proctoring offers more flexibility & convenience

No exam scheduling

Honorlock’s AI proctoring software, live proctors, and live support are available 24/7/365, which means exams can be taken anytime at home or from other comfortable, convenient locations. No scheduling hassles or travel costs.

Minimal system requirements

Many exam proctoring services end the exam session when potential issues occur, like when a test taker’s internet connection becomes unstable (a common problem in rural areas).

Honorlock, on the other hand, adjusts by capturing still images to compensate for slower internet connections, which allows test takers to complete their proctored exams without issue.

Reducing test anxiety

Honorlock’s hybrid proctoring (humans + AI) is less distracting & noninvasive

Honorlock’s AI-based remote proctoring tools monitor test takers and alert a live virtual proctor if they detect behavior that needs review. Once alerted, the proctor reviews the behavior in an analysis window and only intervenes if necessary to address it.

This hybrid proctoring approach highlights the importance of using AI as a tool that always requires human oversight.

Preparing test takers

  • Practice Exams: allows test takers to familiarize themselves with using Honorlock’s online proctoring platform before their real exams.
  • HonorPrep guided tutorial: prepares test takers for their first proctored test with Honorlock with a system check, authentication walkthrough, and sample room scan.
  • Honorlock Knowledge Base: provides in-depth information and detailed guides for using Honorlock’s proctoring platform.

AI for Languages & Culture

Translation tools and game-based platforms like Duolingo can help reduce language barriers. Still, they’re just the tip of the iceberg when it comes to language-related AI tools, which can address many nuances of language, like localization and culture.

Did you know?

Non-native accents are viewed as less trustworthy by native speakers2
Unfortunately, the unique differences in our speech, like accents and pronunciation, and our writing, such as word choice and spelling, can lead to biases. These biases, whether intentional or not, become associated with perceived intelligence, education, and abilities.3 And when non-native English speakers fall behind or score poorly on tests, educators may assume they have learning disabilities, which is inherently problematic because the real issue—a language barrier—isn’t addressed.4

Sign Language Interpretation AI

Sign language interpretation AI facilitates real-time translation of spoken language and text into sign language, and vice versa. This technology makes content accessible to deaf or hard-of-hearing learners.

Recent advancements in sign language interpretation AI have led to the creation of realistic videos. The AI splices together videos and images of real people signing, moving away from cartoon-like avatars. This helps enhance clarity and comprehension, offering learners a more authentic and engaging educational experience.


Name Pronunciation AI

Unfortunately, many people have come to accept their name being mispronounced, and they answer to various mispronunciations. But they shouldn’t have to, especially when there’s name pronunciation AI that can help.

Here’s how it works:

Integrates within your platforms

AI name pronunciation tools integrate within the LMS, SIS, and most other platforms and common browsers.

Learners record their names

Learners can voice-record their names or use audio databases for pronunciation. This allows instructors and peers to hear the correct pronunciation and see phonetic spellings of names.

Recordings are available across platforms

Audio recordings of name pronunciations are available across your platforms in areas like learner profiles and course rosters.

Examples of when name pronunciation AI can be used:
  • Online class discussions
  • Virtual welcome experiences
  • Online program information sessions 
  • Tutoring sessions 
  • Student support

Accent Recognition AI

AI Voice Assistant

Does Siri or Alexa ever misunderstand you?

It may not happen that often if you're a native English speaker because AI voice assistants, depending on which one you're using, are between 85-95% accurate (some more, some less).

95% is pretty accurate, right? 

Yep, but that still means every 20th word is incorrect, which is the same length as this sentence you’re reading.

But the accuracy of these tools decreases if you have an accent, even if you’re a native English speaker with a regional accent.

A study found that Americans with Southern and Western accents were more accurately understood by voice assistants compared to people with Eastern and Midwest accents.

But what about people in America with non-native accents? 

The study showed a 30% higher rate of inaccuracies for individuals with non-native accents, especially those with Spanish and Chinese accents. This is particularly concerning because Spanish and Chinese are the most common non-English languages in the U.S., per Census Bureau data.

Imagine how frustrating that experience is for individuals whose first language isn’t English when voice-related AI tools misunderstand a third of what they say. Now imagine they also rely on voice-related AI because they have a disability that impacts their ability to type.

Did you know?

22% (69.2 million people) of the U.S. population speak languages other than English.

Source: U.S. Census Bureau

The good news is that accent recognition AI tools are available, specifically designed to recognize and better understand accents. These tools analyze and interpret speech patterns, intonations, and pronunciations specific to different accents. If integrated with users’ devices, they could improve the operability of voice-controlled technologies and generate more accurate live captioning and transcriptions.

Localization AI

Even when people speak the same language, where they’re from can affect how well they are understood by others and by AI.

For instance, there are over 160 English dialects around the world, including American English, British English, and South African English, among others.

Typically, English speakers can understand one another despite their dialects, yet there are noticeable differences in pronunciation, grammar, and spelling.

Here’s how dialect can change the meaning of English words when comparing American English to British English: American “chips” are called “crisps” in the U.K., while British “chips” are what Americans call french fries.

So, if a U.S.-based company sells “chips” in the U.K., their sales may struggle because people are confused and disappointed they didn’t get what they expected.

Similarly, learning content can be confusing if it isn’t localized. However, localization AI can automatically localize (or localising for the Brits) content for specific dialects and provide important context based on cultures and preferences.
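As a toy sketch of the idea, localization can start with a simple dialect term mapping. The word list below is illustrative; real localization AI also handles spelling, grammar, and cultural context:

```python
# Toy sketch of dialect localization: swap American English terms for
# their British equivalents. Real localization AI goes far beyond
# word substitution (spelling, grammar, cultural context).
US_TO_UK = {"chips": "crisps", "fries": "chips", "cookie": "biscuit"}

def localize(text: str, mapping: dict = US_TO_UK) -> str:
    """Replace each mapped word, leaving everything else untouched."""
    return " ".join(mapping.get(word, word) for word in text.split())

print(localize("grab some chips and a cookie"))  # grab some crisps and a biscuit
```

Even this naive version shows why word-for-word translation isn’t enough: “chips” maps to different things depending on the direction of the swap, so context has to drive the mapping.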

Understanding AI before using it

Integrating AI in online courses starts with recognizing its limitations, such as biases and inaccuracies, and committing to ongoing efforts that ensure ethical, purposeful use of these tools with full transparency.

That means having a deep understanding of how the AI works, its training processes, the data it collects, how that data is protected, and open communication with learners about the purpose and how these tools tie to learning outcomes.

And the value of human oversight can’t be overstated here. Human oversight ensures that the nuances and complexities of different learners and situations are considered alongside other contextual factors that AI may overlook.


1 Reynolds, T., Zupanick, C.E., & Dombeck, M. (2013). Effective Teaching Methods for People with Intellectual Disabilities.

2 Lev-Ari, S., & Keysar, B. (2010, April). Why don’t we believe non-native speakers? The influence of accent on credibility.

3 Levon, E., Sharma, D., & Ilbury, C. (2022, November). Speaking up: accents and social mobility.

4 Ortiz, A. (2009). Bias against languages other than English hurts students.

Future AI Trends in Online Education

AI tools to improve online education

While some of the AI tools you’ll see aren’t widely used in online learning yet (they’re mostly used by businesses), we’ll explain how they work and the ways they can be adapted for online learning.

1. Emotion AI

Understanding emotions is complex because we all feel and express them in different ways. While emotion AI isn’t perfect, it can help instructors improve engagement and support learners where they need it most.

What is emotion AI?

Emotion AI, sometimes called affective computing, detects and interprets human emotions by analyzing text, voice (audio), and video and associating specific components with related emotions.

How can emotion AI be used in online learning?

Emotion AI technology can help learners with cognitive and/or physical disabilities communicate; detect signs of confusion and frustration; and pinpoint course activities that they are interested in or uninterested in.

What are the types of emotion AI?

Text Emotion AI

Analyzes written language to identify the sentiment and emotional tone of the content.

Example uses: analyzing written responses, like forum posts and course evaluations, to identify, understand, and address emotional states.

Voice Emotion AI

Identifies emotions based on vocal characteristics like volume, tone, pitch, and speed.

Example uses: monitoring learners’ spoken responses during online classes and virtual presentations to detect nuances in learner interest or distress.

Video Emotion AI

Observes body language, facial expressions, and gestures to determine emotional states.

Example uses: observing facial expressions and body language during video conversations and online exams to understand learners’ confusion and levels of interest or disinterest.

The images below illustrate how video emotion AI software might appear from an instructor’s perspective when reviewing, offering insights into learners’ interest and engagement.

“High Attention” highlights that learner attention increases when making eye contact.

“Low Attention” shows a decreased level of attention when they look away from the screen.

Eye contact with a webcam doesn’t necessarily mean a learner is really paying attention, and looking away doesn’t mean they aren’t listening. But when gaze data is paired with other metrics, like how long they looked away, facial expressions, and other body language, it can help build a broader understanding of behavior and emotion.

Educators can use these individual insights or similar data from all learners to understand certain activities that boost attention and engagement, which topics confuse learners, and more.

[Images: “High Attention” and “Low Attention” examples from video emotion AI software]

2. AI language tools that go beyond translations

Newer language-related AI can understand the nuances of languages like slang, accents, and dialects to build truly global dialogues in online courses.

Accent Recognition AI

“Sorry, I didn’t quite catch that.”

Even if you’re a native English speaker, you’ve probably heard this or something similar from voice assistants. They’re about 95% accurate, sometimes more or less depending on which one you’re using, which is pretty good, right?

Sure, but that still means every 20th word is wrong, which is the exact length of the sentence you’re reading.

However, the accuracy varies depending on your accent… even if you’re a native English speaker. 

Question: Which two U.S. accents are voice assistants more likely to understand?

  1. Southern
  2. Midwest
  3. Western
  4. Eastern

Answer: 1 & 3. Southern and Western accents were understood more often than Eastern and Midwest accents.

Source: Washington Post, 2018 research

The research also found that people with non-native accents experienced 30% more inaccuracies when using voice assistants, with Spanish and Chinese accents being the least accurate. Imagine how frustrating that would be.

But there’s good news: accent recognition AI tools are available that are trained on highly diverse data, allowing them to better understand accents. They analyze and interpret speech patterns, intonations, and pronunciations specific to different accents.

Accent recognition AI can help improve the operability of voice-controlled technologies and generate more accurate live captioning and transcriptions.

Name Pronunciation AI

What’s an easy way to stifle a sense of inclusion and belonging? Mispronouncing someone’s name.

Even if it’s an innocent mistake, it’s probably something that person encounters daily. But there’s a solution that can help: name pronunciation AI.

How does name pronunciation AI work?

  • Integrates throughout your online courses within the LMS, SIS, and other platforms to ensure that names are accurately pronounced.
  • Uses databases of audio name pronunciations and algorithms that recommend correct pronunciations.
  • Learners voice-record their names, and the recording is available throughout the platforms.
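A hypothetical sketch of that workflow: learners record once, and every integrated platform looks up the same recording. The class and method names below are illustrative, not a real product API:

```python
# Hypothetical sketch of a name pronunciation store. Learners record
# their names once; the LMS, SIS, and other platforms retrieve the
# same recording and phonetic spelling wherever the name appears.
class PronunciationStore:
    def __init__(self):
        self._recordings = {}

    def record(self, learner_id: str, audio_file: str, phonetic: str):
        """Save a learner's self-recorded pronunciation."""
        self._recordings[learner_id] = {"audio": audio_file, "phonetic": phonetic}

    def lookup(self, learner_id: str):
        """Used by rosters, profiles, discussion boards, etc."""
        return self._recordings.get(learner_id)

store = PronunciationStore()
store.record("lrn42", "siobhan.mp3", "shiv-AWN")
print(store.lookup("lrn42")["phonetic"])  # shiv-AWN
```

The single shared store is the important design point: the learner records their name once, and every surface that displays it stays consistent.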

Name pronunciation AI can be used in:

  • Online class discussions
  • One-on-one advising sessions
  • Virtual information sessions
  • Recruitment conversations
  • Student support

Realistic Sign Language Interpretation AI

Instead of using cartoon-like avatars, newer sign language interpretation AI creates a more realistic experience by splicing together videos of real people signing.

Some sign language AI can also provide real-time translation of spoken language and text to sign language and vice versa, making content accessible to deaf or hard-of-hearing learners.


3. Automatically finding leaked test content

Have you ever found your test questions leaked on the internet?

“Homework help” sites like Chegg and discussion forums like Reddit and Quora make it easy to find and share your test questions and answers. 

You have 3 ways to tackle leaked test content:

1. Manually searching the internet on your own

You search the internet for individual test questions and send takedown requests if you find any.

2. Manually searching but with AI’s help

You select individual questions that the AI will search for and send your own takedown requests.

3. Automatically searching with AI

The AI does all the work by automatically searching the internet for all of your test content in a few minutes and giving you the ability to send one-click takedown requests.

Automating this process is the best way, so here’s how it works with Search & Destroy™:
  • Search & Destroy™ automatically searches for all of your exam questions
  • Search results show where any of your questions are leaked
  • Send one-click takedown requests to sites displaying your questions

That’s it. No more leaked content concerns.
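A hypothetical sketch of the matching step: normalize question text and scraped page text so small formatting differences don’t hide a leak. Search & Destroy’s actual matching is proprietary, so this normalization approach is only an illustrative assumption:

```python
# Hypothetical sketch of leaked-question matching: normalize both the
# question bank and scraped page text before comparing, so casing,
# punctuation, and extra whitespace don't hide a leak.
import re

def normalize(text: str) -> str:
    """Lowercase, collapse whitespace, and strip punctuation."""
    collapsed = re.sub(r"\s+", " ", text.lower())
    return re.sub(r"[^a-z0-9 ]", "", collapsed).strip()

def find_leaks(questions: list, page_text: str) -> list:
    """Return every bank question that appears in the page text."""
    page = normalize(page_text)
    return [q for q in questions if normalize(q) in page]

bank = ["What is the capital of France?", "Define photosynthesis."]
page = "Q: what is the   capital of France? A: Paris"
print(find_leaks(bank, page))  # only the first question is flagged
```

Real systems would also need fuzzy matching (leaked questions are often paraphrased), but exact matching after normalization is the natural first pass.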

4. Large Language Models can help review college admissions essays without bias

Completely eliminating biases isn’t realistic, but they can and should be acknowledged and addressed, and AI can help, even though it can be biased too.

How can AI be biased?

AI reflects the biases of the people using it, the data it’s trained on, and the ways it’s used.

For example, if AI is trained on biased data, such as data that underrepresents certain groups, there’s a ripple effect that can impact algorithms, outputs, and future models.

The good news is that AI biases, similar to human biases, can be recognized and addressed to help reduce them.

Which would you pick?

If you were in charge of admissions—and let’s pretend time constraints don’t exist—would you:

  • Only review objective data, like standardized test scores and GPAs
  • Review objective data and also get a sense of who applicants are, like their personal qualities

Generally speaking, most would choose to consider personal qualities like personality, character, leadership, and life experiences.

While personal qualities are more subjective than test scores, research shows that they can predict success in school and life. Reviewing these qualities takes more time than reviewing objective information, like test scores and GPAs. But Large Language Models can help.

Large Language Models (LLMs)

What are large language models?

Large Language Models are a type of AI that can understand, interpret, and generate human language by analyzing and learning from extensive datasets.

How do LLMs work?

LLMs are trained by “reading” billions of pieces of text from various sources, like internet articles and forums, scientific research, textbooks, newspapers and magazines, and more.

This training helps them learn patterns and understand how words and sentences are formed in different formats and contexts.

LLMs don’t actually understand language; they’re just really good at predicting which word should come next. The two model types we’ll discuss are unidirectional and bidirectional:

  • Unidirectional: predicts the next word based on previous words
  • Bidirectional: analyzes text from both directions to predict a word in context
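The two prediction directions can be illustrated with a toy word-count model in Python. This is nothing like a real LLM (no neural network, no billion-word training corpus), but it shows what "predict from the left only" versus "predict from both sides" means in practice:

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Unidirectional: predict the next word from the previous word only.
next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def predict_next(prev):
    # Most frequent word seen after `prev` (assumes `prev` was in the corpus).
    return next_counts[prev].most_common(1)[0][0]

# Bidirectional: predict a masked word using the words on BOTH sides.
context_counts = defaultdict(Counter)
for left, word, right in zip(corpus, corpus[1:], corpus[2:]):
    context_counts[(left, right)][word] += 1

def predict_masked(left, right):
    # Most frequent word seen between `left` and `right`.
    return context_counts[(left, right)].most_common(1)[0][0]
```

With both neighbors available, `predict_masked("sat", "the")` can confidently fill in "on", while the unidirectional model only ever sees what came before the blank.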

RoBERTa vs. ChatGPT

Both are LLMs that share the same architecture but excel in specific tasks, similar to cars with the same frame but different tires and suspension systems for certain terrains.

  • RoBERTa (bidirectional) drives better in the city (understanding language nuances and context), but it can still make it on certain off-road trails (creating content).
  • ChatGPT (unidirectional) drives best on off-road trails (generating content), but can still navigate some city streets (understanding context).

The University of Pennsylvania used an LLM, RoBERTa, to review college admissions essays for personal qualities

Research published in October 2023 by the University of Pennsylvania indicates that certain LLMs, if trained properly and thoroughly, can review admissions essays for personal qualities that predict college graduation on par with human admissions staff.

The researchers and their team analyzed over 300,000 college essays and scored them on the absence or presence of seven traits: prosocial purpose (helping others), leadership, learning, goal pursuit, intrinsic motivation, teamwork, and perseverance.

Then they trained RoBERTa to recognize and evaluate similar qualities and characteristics in essay submissions without showing biases toward race, gender, or socioeconomic status. RoBERTa was used because it excels at understanding the context and meaning of language, which makes it an effective tool for understanding emotions, text classification, and translations.
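To see the shape of the task, here's a toy absence/presence labeler for the seven traits. A keyword lookup is nothing like the fine-tuned RoBERTa model the researchers built (the keywords below are invented for illustration), but the input/output contract is the same: essay text in, one binary label per trait out:

```python
# Invented keywords for illustration only -- the real study used a
# fine-tuned RoBERTa model trained on human-labeled essays, not a lookup.
TRAIT_KEYWORDS = {
    "prosocial purpose": ["volunteer", "helping others", "community"],
    "leadership": ["led", "captain", "organized"],
    "learning": ["learned", "curious", "studied"],
    "goal pursuit": ["goal", "aimed", "worked toward"],
    "intrinsic motivation": ["passion", "love for", "fascinated"],
    "teamwork": ["team", "together", "collaborated"],
    "perseverance": ["persisted", "kept trying", "despite"],
}

def label_traits(essay):
    """Return a present/absent label for each of the seven traits."""
    text = essay.lower()
    return {trait: any(kw in text for kw in kws)
            for trait, kws in TRAIT_KEYWORDS.items()}
```

The gap between this sketch and the real model is exactly why training data matters: a keyword list rewards vocabulary, while a properly trained model can recognize perseverance described in any words.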

Research results and takeaways

RoBERTa recognized personal qualities without bias

It recognized qualities like teamwork and intrinsic motivation in applicants from diverse backgrounds, without showing bias towards race, gender, or socioeconomic status.

RoBERTa’s predictions were accurate

Its predictions of how likely students were to graduate were slightly more accurate than those of human reviewers.
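One simple way to audit a scoring model for group-level bias is to compare its average scores across demographic groups. The sketch below is a generic fairness check of that kind, not the Penn team's actual methodology:

```python
def parity_gap(scores, groups):
    """Largest difference in mean predicted score between any two groups.

    A gap near zero is consistent with (though does not by itself prove)
    the model treating groups similarly on this metric.
    """
    by_group = {}
    for score, group in zip(scores, groups):
        by_group.setdefault(group, []).append(score)
    means = {g: sum(v) / len(v) for g, v in by_group.items()}
    return max(means.values()) - min(means.values())
```

In practice, auditors check several metrics like this (score parity, accuracy parity, calibration per group), since a model can pass one while failing another.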

The researchers recommend using AI with optimism and caution

“An AI approach to measuring personal qualities warrants both optimism and caution… We recommend AI be used to augment, not replace, human judgment. No algorithm can decide what the goals of a university’s admissions process should be or what personal qualities matter most for that community.”

5. Preventing remote access software contract cheating

Have you ever had a support technician take over your desktop and fix your computer from a remote location?

That’s basically how remote access software is used to cheat on exams.

A person pays a test-taking service to have one of their experts control their computer and take the exam from a remote location.

Even though the person getting credit appears to be sitting in front of the camera during the test, it’s the off-camera expert who is actually answering the questions.

And since the person getting credit stays on screen during the exam, ID verification methods won’t help.

How can you stop remote access cheating?

Honorlock’s remote proctoring platform has a few ways to help:

  • Recording the desktop and requiring specific keyboard commands immediately before starting the exam: Exam admins use Honorlock’s exam settings or test rules to require test takers to use keyboard commands, such as Ctrl+Alt+Del (Windows) or Cmd+Opt+Esc (Mac), to display the applications and processes running on the device.
  • Displaying countries: Honorlock’s Analytics Hub™ shows the countries that tests were taken in based on IP address. If any tests are taken in countries with no known test takers, it may indicate the use of remote access test-taking services.
  • Blocking applications: Honorlock’s proctoring platform gives exam administrators the ability to block specific applications that can be used for remote access.
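The application-blocking idea can be sketched as a simple blocklist check. The process names below are illustrative examples of remote access tools, not Honorlock's actual (non-public) blocklist, and a real implementation would enumerate live processes rather than accept a list:

```python
# Illustrative names only -- a real proctoring platform maintains
# its own vetted, regularly updated blocklist.
REMOTE_ACCESS_BLOCKLIST = {"teamviewer", "anydesk", "vncserver"}

def flag_remote_access(process_names):
    """Return process names that match the blocklist (case-insensitive)."""
    flagged = []
    for name in process_names:
        lowered = name.lower()
        if any(blocked in lowered for blocked in REMOTE_ACCESS_BLOCKLIST):
            flagged.append(name)
    return flagged
```

Substring matching (rather than exact equality) catches platform variants like `TeamViewer.exe` or `teamviewerd` with one blocklist entry.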

6. Using AI for on-demand tutoring

Whether intentionally built for tutoring or adapted to it, AI tools such as intelligent tutoring systems, chatbots, and writing assistants offer 24/7 interactive tutoring and support. That around-the-clock availability enhances learning while building a more diverse and inclusive educational environment that accommodates various learning styles and needs.

They also offer real-time feedback, which is crucial for learners with intellectual disabilities because it helps them make connections between their work and the instructor’s feedback.

Intelligent Tutoring Systems (ITS)

Intelligent Tutoring Systems simulate one-on-one human tutoring, offering tailored feedback and adapting course materials to meet each learner’s needs. 

They can guide learners through problem-solving steps, offer hints, break down complex topics, and recommend additional relevant content.

While an ITS can benefit any subject, it’s particularly beneficial for subjects like math, which require—for the most part—a lot of repetition.

Here’s how an ITS could work for an algebra course:

  • The ITS assesses the learner’s understanding of basic algebraic concepts and their ability to solve problems.
  • After assessing knowledge, it customizes learning activities and offers additional content based on their needs. 
  • If the learner struggles in a specific area, it provides extra help, like step-by-step explanations and practice problems.
  • If the learner excels in a topic, the ITS gradually progresses to more advanced concepts and activities.
  • The ITS provides immediate feedback and additional context, allowing learners to recognize errors and learn to correct them.
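The adaptive loop described above can be sketched in a few lines of Python. This is a minimal illustration, not any particular ITS product; the level names and the three-correct-answers threshold are invented for the example:

```python
# Minimal sketch of an adaptive tutoring loop: difficulty advances after a
# streak of correct answers and steps back after a miss, with feedback
# returned at every step. Levels and thresholds are illustrative.
class AlgebraTutor:
    def __init__(self, levels=("basics", "one-step", "two-step", "systems")):
        self.levels = levels
        self.level = 0      # start at the initially assessed level
        self.streak = 0     # consecutive correct answers at this level

    def record_answer(self, correct):
        if correct:
            self.streak += 1
            if self.streak >= 3 and self.level < len(self.levels) - 1:
                self.level += 1   # learner excels: advance to harder material
                self.streak = 0
                return "Correct! Moving on to harder problems."
            return "Correct! Keep going."
        self.streak = 0
        if self.level > 0:
            self.level -= 1       # learner struggles: step back, add practice
        return f"Not quite -- let's review {self.levels[self.level]} with extra practice."

    @property
    def current_topic(self):
        return self.levels[self.level]
```

A production ITS would track mastery per concept rather than a single level, but the core control flow (assess, adapt, give immediate feedback) is the same.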

Chatbots

ChatGPT, Google Bard, and other chatbots can provide learners with instant tutoring and support anytime they need it.

Learners can use chatbots to dig deeper into complex subjects, brainstorm ideas, get feedback on written work, translate text, and check writing quality.

AI writing assistants

Writing tools like Grammarly and Quillbot have been around for a few years now, but they’re evolving.

Initially, they just helped improve writing by correcting grammar, spelling, and style issues.

But now, they’re incorporating AI that can instantly make writing more concise, easier to understand, or sound a specific way, such as being more assertive.

These tools are particularly useful for individuals writing in a second language and for those with learning disabilities, as they can boost confidence and deliver different forms of corrections and feedback immediately.

Regardless of which AI tools you use, make sure that they’re used purposefully, ethically, and with full transparency. Always keep in mind that AI tools are just that—tools. They aren’t a replacement for the people using them.

Sign up for more online learning resources​