Is a Browser Lock Enough to Protect Online Exams?

This article shows you what browser lockdown software can and cannot do and offers additional proctoring solutions to cover the areas it may miss.

Browser lockdown software can help protect exams to a certain extent, but it isn’t enough to prevent cheating on exams. It can help block basic misconduct like accessing websites and opening new tabs, but it won’t stop test takers from using cell phones and other devices, which is half the battle.

If the goal is to check the box that your institution proctors exams, locking the browser works. But if your institution actually wants to prevent cheating and protect academic integrity, you’ll need a real proctoring solution.

What is browser lockdown software?

Browser lockdown software is an automated proctoring solution that can help prevent cheating on online exams by restricting access to other browser tabs and websites, and blocking keyboard shortcuts such as copy and paste.

Here’s what browser lockdown software can do:

  • Restricts access to other websites, browsers, and applications
  • Disables keyboard shortcuts and functions such as printing, and copy and paste
  • Removes toolbar and menu options such as browser refresh, back, forward, and stop
  • Blocks the ability to resize the browser window

Why browser lockdown software isn’t enough to protect online exams

The features of browser lockdown software highlighted above will help prevent cheating, but there are significant things it cannot do.

Browser lockdown software can’t see the student

While a browser lock can “see” if a student attempts to access another browser tab or computer application or if the student attempts to use keyboard shortcuts, it doesn’t offer the benefits of video proctoring, which uses the student’s webcam to view them during the exam.

Video proctoring often involves recording the student’s exam session for later review by the proctor and instructor. Video proctoring is especially useful because it allows ID verification and can tell if a student is using unauthorized resources or if another person is in the room.

Browser lockdown software can’t detect cell phones

A student can simply use their cell phone or another device to look up test questions and answers on the internet. 

Browser lockdown software can’t detect voices or audio

Browser lockdown software can’t hear when a student may be talking on the phone or asking a friend for an answer to a test question.

Some online proctoring software monitors the exam session and goes beyond simple audio detection by listening for specific keywords or phrases, such as “Hey Siri” or “OK Google.”

Browser lockdown software can’t verify student ID

ID verification is a priority before proctored exams, and a basic browser lockdown doesn’t solve it: without video proctoring, browser lockdown software simply can’t verify who is taking the exam.

Browser lockdown software lacks the human touch of online proctoring that students deserve

Because browser lockdown software is a purely automated proctoring solution, there’s no human involved to support students during the exam. Exams can cause anxiety, and technical issues can happen mid-test; that’s where a human proctor can help students successfully complete their exams. Online proctoring services that combine AI test monitoring with human proctors offer the best of both worlds.

Online proctoring options when browser lockdown software isn’t enough

The good news is that Honorlock includes a browser lock as part of its standard proctoring features, along with many more features and benefits, such as:

  • Video proctoring 
  • Detecting cell phone use and voices 
  • Blending AI test monitoring software and live proctors
  • Verifying student ID
  • Reducing unauthorized use of your test questions and content on the Internet 

Honorlock can detect cell phone use during proctored exams

Honorlock proctoring software can detect when students use a cell phone, tablet, or laptop to access test bank content during an online exam.

The proctoring software also captures a screen recording of specific websites that the student visited to provide evidence in the event of a violation. The recording can be reviewed by the instructor to determine if academic dishonesty occurred.

Advanced video proctoring

Early forms of live video proctoring were unsettling for students because many didn’t appreciate having a remote proctor’s face hovering on the screen during the entire exam.

Honorlock’s video proctoring allows:

  • 60-second ID verification
  • Room scans to make sure that no other people or test resources are present 
  • Second camera and side angle behavior monitoring 

Detects voices with Honorlock proctoring software

Honorlock’s smart Voice Detection feature listens for specific keywords or phrases to identify students who may be talking to another student or using Siri or Alexa to get the answers to test questions. If the proctoring software detects potential academic dishonesty, it alerts a live test proctor to review and intervene if cheating occurred. 

Helps protect your test questions and answers from unauthorized use on the Internet

You’ve probably heard of students finding and sharing test questions and answers on test banks and homework-help sites. Maybe you’ve even seen your own test content on the Internet. It’s frustrating, and finding that content is quick and simple for any test taker.

Fortunately, Honorlock’s remote proctoring technology searches for leaked test questions, answers, and content and can also request the removal of the material by filing DMCA copyright takedown notices.

Blends AI test monitoring with live test proctors

With Live Pop-in™, AI alerts a live remote proctor to review and, if necessary, intervene to address the situation. This protects online exams while reducing test anxiety and providing a less intimidating, noninvasive testing experience.

Choose to proctor online exams a better way

Browser lockdown software checks the box as a first line of defense against academic dishonesty, but given all its shortcomings, choose to proctor online exams a better way with Honorlock. Our approach to online proctoring creates a non-invasive and fair testing environment that benefits the student, instructor, and institution.

Google Homework Helper: What it is and How to Block it During Exams

Google’s homework help tool can be a useful resource for learning, but it also creates concerns for academic integrity. As more homework help tools like this inevitably appear, it’s important to understand what they can do and how to prevent them from being used during exams.

What is Google homework helper?

Google’s homework helper is a Chrome browser tool that combines Lens and Gemini to provide immediate answers when exam questions or other content are highlighted. It uses Google Lens to select the elements and Gemini to provide answers, explanations, step-by-step instructions, and additional resources to learn more about the topic.

How does Google homework help work?

With Google’s homework help, learners only need to click a button and highlight a question to get immediate answers.
  • The Google homework help tool works on tests, pages, and content within the LMS, test delivery platform, or other websites accessed in Chrome or the Google app.
  • The homework helper is part of Google Lens and launches when learners click the Google Lens icon, which appears in the Chrome search bar and browser navigation.
  • After launching it, learners highlight test questions to receive answers, or select other on-screen information they need help with.
  • Google’s homework helper uses Gemini to provide answers, explanations, videos, and other resources in a browser side panel.
  • Learners can also use their phones to access Google’s AI homework helper through the Google app or Lens app. To get help, they can point the camera at a question or upload a photo.
Note: The walkthrough below shows how the Google Lens homework help feature works within an LMS. It works the same way in any LMS, test delivery platform, or page accessed through Chrome.

How to prevent the use of Google homework helper during exams

Browser lockdown software isn’t enough to prevent the use of homework help features because test takers can still use other devices. Tests need to be remotely proctored to prevent test takers from using Google’s homework helper and other similar tools, such as Microsoft Edge’s Copilot.

The proctoring software should be able to:

  • Detect and flag the attempted use of any homework help features in the browser.
  • Pause the exam to inform the test taker that those features are not permitted.
  • Alert a live proctor to review the behavior and address the situation if necessary.

Additional proctoring tools to block other types of AI

Preventing AI cheating is tricky because so many different tools exist and they can be accessed in different ways across almost any device. For example: 

  • Test takers can open chatbots in any device with a browser or even use voice commands to navigate them. 
  • AI assistants like Cluely, which operate like transparent chatbot teleprompters, can be controlled with simple keyboard shortcuts on the test-taker’s device or run on a second device that stays out of the webcam’s view.

The examples could go on and on, but the point is that you’ll need proctoring software with tools to detect and prevent the use of AI, such as:

  • Cell phone detection: Detects attempts to look up questions on cell phones and other secondary devices like smartwatches and tablets, and the presence of Apple devices in the test environment.
  • Application blocking: Test admins can allow specific applications needed to take the test (e.g., Word, Excel, coding platform) while blocking all other applications, including invisible AI assistants, screen recorders, and remote access applications.
  • Smart speech detection: Listens for test takers to say phrases that activate AI assistants like Siri and Alexa.
  • Recording and monitoring: Desktop activity and behavior are recorded and monitored during exams and a second side camera can be used to provide a full view of the test environment.

While Google’s homework helper was created with good intentions to support learners, it also makes preventing cheating more difficult. Honorlock’s proctoring technology is built to stay ahead of tools like this (and everything else) and keep exams secure and fair.

Proctoring for Large Colleges and Universities: Comparison Considerations

When it comes to remote testing and academic integrity, large colleges and universities (those with over 5,000 full-time equivalent (FTE) students) have some pretty specific needs. With thousands of students taking exams across different departments and campuses, and sometimes states and continents, scalability is a must.

These schools aren’t just giving multiple-choice exams anymore, especially as they’re figuring out how to keep students from using AI. They assign scaffolded projects that students complete using real software, use hands-on coding assessments, and assess students through virtual presentations and oral exams. That means their remote assessments need a proctoring solution that’s flexible enough to handle all kinds of exam formats and work with learning management systems (LMS) like Canvas, Blackboard, and D2L.

Beyond functionality
But it’s not just about the functionality of the technologies used in online learning and testing. Campus leaders and faculty care deeply about protecting student data, making assessments accessible for all students, including those who have disabilities or face obstacles like unstable internet, and creating a fair and supportive assessment experience. Faculty also want the freedom to set their own rules, while administrators need clear reporting tools and real-time insights into academic integrity issues.

Scalable, not scammy
Cost obviously matters too. Large schools need pricing that’s clear and scalable. So why is it so convoluted and scammy? Are hidden fees part of the product or what are we doing here? It shouldn’t be that way, and it doesn’t have to be.

Find what you need

Find the right type of proctoring for exams and your students

When higher education institutions deliver proctored exams to thousands of students, every detail matters. It has to work everywhere, for everyone, at any time. But picking the right proctoring method is tricky.

Get it right, and students barely notice, faculty actually use it, and exams stay secure. Get it wrong, and everyone feels it, and administrators definitely hear about it.

If you choose the wrong one, your institution is likely to deal with at least two of the following headaches from a solution that:

  • Doesn’t really prevent cheating (a cell phone can beat it)
  • Ruins the test-taking and test-making experience for students and faculty members
  • Slips in last-second charges for implementation and support, and tacks on extra fees when exams take longer than expected

Remote proctoring options to secure online assessments

“We’re paying for a service that just works. We’re getting a full-time position on the distance education team back. Now that she doesn’t have to spend 100% of her time making proctoring work, she can focus on value-added work for our students. There’s so much more we can do.”

Stephanie Ungerank, Director of Distance Education, Arkansas State University-Beebe

How to control the cost of proctoring at large colleges and universities

Look for online proctoring companies that offer flat-rate pricing per exam or per student. When you pay by the hour, exams that run just a few minutes over will cost you.

Many proctoring providers round up by the hour, so if a student crosses the 60-minute mark, you could be charged for a full second hour. But even if they only add up the extra minutes, those extra minutes can still add up to hundreds of unexpected hours when you’re proctoring exams for thousands of students.
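
To get a rough sense of the math, the sketch below compares round-up hourly billing to billing on actual minutes used. The per-hour rate, exam count, and exam length are hypothetical examples for illustration, not any provider’s real pricing:

```python
# Hypothetical illustration of round-up hourly billing vs. actual minutes used.
# The rate and exam figures below are made-up examples, not real pricing.
import math

RATE_PER_HOUR = 10.00   # hypothetical proctoring rate per billed hour
EXAMS = 5000            # exams proctored in a semester
EXAM_MINUTES = 65       # each exam runs 5 minutes past the hour

# Billing that rounds each exam up to the next full hour
billed_hours_rounded = EXAMS * math.ceil(EXAM_MINUTES / 60)

# Billing on the actual minutes used
billed_hours_actual = EXAMS * EXAM_MINUTES / 60

# Extra cost caused by rounding alone
overage = (billed_hours_rounded - billed_hours_actual) * RATE_PER_HOUR

print(billed_hours_rounded)  # 10000 hours billed
print(round(billed_hours_actual, 1))  # roughly 5416.7 hours actually used
print(round(overage, 2))
```

Five minutes of overtime per exam nearly doubles the billed hours at this scale, which is why flat-rate pricing per exam or per student is easier to budget.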

And make sure the cost of proctoring services includes implementation and ongoing support for students and faculty. To gauge the value, ask whether support is live or automated, whether it’s available 24/7, and how quickly they respond. Then have them list what their implementation and training include and who’s involved, to make sure it meets your institution’s needs.

See how the University of Florida saved $1.14 million in remote proctoring costs over three semesters and gave faculty 8.5 hours back per course each semester to focus on teaching instead of proctoring.

Support that solves, not stalls

Make sure the proctoring service offers 24/7/365 live support for students and faculty. Students have busy lives and need the ability to take online exams anytime, day or night. On-demand support gives them the flexibility they need.

Ask the proctoring company:

  • How long it takes for support agents to respond and how quickly issues are resolved
  • Where the support team is located
  • If support is included or an added cost

These details have a big impact on the testing experience and your budget.

“Having access to customer support 24/7/365 is huge. Especially when students may be taking exams at any time of day. You want them to have access to someone who can troubleshoot with them.”

Daphnee St. Val, Senior Instructional Designer, Broward College

LMS integration

The integration between the proctoring platform and the LMS is the difference between a smooth testing experience and a constant flood of support tickets for large colleges and universities. It makes implementation quick and easy, secures data, and creates a better test experience for students and faculty.

Creates a familiar test environment

Students launch proctored online exams from the LMS just like they already do. There’s no new platform to learn. The only difference is that they show their ID to the webcam to verify who they are and complete a quick room scan. It takes about a minute and they’re good to go. 

Faculty create proctored exams the same way they build any other assessment in the LMS. After that, they pick the exam proctoring features to use and can add notes to provide specific accommodations for students.

No extra passwords or logins

Students and faculty shouldn’t need extra logins and passwords just to make or take a proctored exam. 

Why is this important?  

Imagine you’re about to take a big exam. You log into the LMS and click to start the proctored test, but you’re hit with a prompt to create an account or enter credentials you don’t remember.

So now you’re annoyed and wasting time. Creating a new account might seem like a small task, but delays and last-minute issues right before an exam can add to or trigger test anxiety, which is already common among college students (Gerwing et al., 2015; Kolski & Weible, 2018).

“You want an integrated system. You don’t want faculty and students to have separate sign-on. That was important for us.”

Steven Daniello, Director of Assessment Services at Indian River State College

Test anxiety

While we’re on the subject of test anxiety, you’ll want to make sure the remote proctoring software you choose doesn’t contribute to it.

When students are already anxious, any extra worry can amplify the negative effects of test anxiety, increasing cognitive load, interfering with concentration, and lowering test performance (Pekrun, 2024), especially in online settings where technical issues are common sources of stress (Alibak et al., 2019).

However, proctoring-related test anxiety often comes from technical concerns, like whether a student’s device will work with the software or their internet will stay stable, not from being monitored (Andreou et al., 2021; Woldeab & Brothen, 2019).

Ask how the proctoring company helps colleges and universities address test anxiety:

  • Are proctors trained to identify signs of physical stress and respond appropriately?
  • Can students take practice tests to get comfortable with the software? 
  • Does the company offer 24/7/365 live support? If so, how long does it take them to respond?
  • Do they have an easy way to test their system requirements? 
  • Are there other help resources available for students (e.g., articles, FAQ, video tutorials, etc.)?

“It’s the empathy that sets Honorlock apart. Our students are anxious enough; having someone actually respond and care made all the difference.”

Caryn Sever, Director of Instructional Design and Development, Northern Virginia Community College

The answers to these questions can affect student learning and test performance. But they can also affect academic integrity because test anxiety increases academic dishonesty. However, access to practice exams and system checks can help lower students’ anxiety, which can reduce cheating (Gribbins and Bonk, 2023).

“Honorlock was a lot less invasive and less glitchy. It felt way more seamless and creates an easier testing experience. It just lets you take your test without any interruptions.”

Madi Kuokos, University of Florida graduate

Remote proctoring features for online exams

When you’re evaluating remote proctoring services, it’s not enough to check the box that the software includes certain features. You also need to understand how those features and processes work, because the details behind things like ID verification, phone detection, and support access can impact the entire test experience for students, faculty, and staff.

Detecting cell phones and other secondary devices

Cell phones make it easy to cheat during exams, and nearly every student has one. That’s why some online proctoring companies say they offer “cell phone detection.”

It’s tough to keep a straight face and call it cell phone detection because, in reality, it’s just a live proctor squinting at a dozen video thumbnails of students taking their exams and hoping to spot a phone at some point.

In other words, they’re crossing their fingers and hoping for the best.

71% of violations during proctored exams involve cell phones or other secondary devices.

The good news is that some proctoring services offer technology that detects cell phones instead of wishful thinking. This technology detects the presence of cell phones and when students try to use them to look up answers.

Preventing students from using AI to cheat before the exam starts

Whether it’s exams or written assignments, faculty obviously want to prevent students from using AI, especially chatbots like ChatGPT and Claude, along with clever desktop assistants like Cluely. These tools can write and answer almost anything.

What is Cluely?

Cluely is an AI desktop assistant that helps students cheat by answering questions it sees and hears on-screen. It uses transparent overlays and bypasses keyboard shortcuts, which makes it almost impossible to detect during online assessments.
Student view of Cluely during an online exam.

With the right solution, you can proctor writing assignments (even handwritten) just like an exam. But you’ll need a combination of virtual proctoring tools to do it. These tools work together to block students from accessing unauthorized AI on computers, smartwatches, and tablets.

Listening for speech that actually matters

Some students talk to themselves while they work through exams. Some live in noisy households. With sound detection, faculty have to review flags for unimportant noises, like a student talking through a problem or coughing. 

Smart Speech Detection listens for certain keywords that may indicate cheating. For example, voice commands that activate virtual assistants like Siri and Alexa on devices that are off-screen.

Finding leaked exam content

There’s a lot of leaked test content available online. Some companies pose as “homework help” sites, but they’re really just big repositories for your content and other resources to cheat on exams.

Search & Destroy™ is a proctoring tool that automatically checks the internet for all of your exam content. If it finds any, you can send takedown requests with one click.

When you compare proctoring platforms, dig into how each one searches for leaked exam content, or you may end up manually selecting each question it searches for instead of the system automatically searching for all of them.

“There’s nothing more nerve-wracking than when you see exam questions have been leaked. Having that quick ability to make sure that when you’ve uploaded an exam, none of those questions have been compromised is important to the teaching and learning process.”

Naz Erenguc, Director of Admissions, Warrington College of Business, University of Florida

Securing third-party exams

The online test proctoring software should allow faculty to secure exams on third-party platforms like MyMathLab, Knewton Alta, WebAssign, and more. Ideally, instructors should only need to register the exam with a password-protected link before setting up the exam. Then students launch the proctored exam and can exit and remove the Honorlock extension when they finish.

Monitoring and recording behavior during exams

The software uses the webcam to monitor and record students’ behavior and desktops during proctored exams. Faculty can also require a second side camera for a wider field of view during exams where test takers show their work or have external resources available.

Verifying identity and scanning the room

Before taking an online proctored exam, faculty can have students confirm their identity and complete a quick room scan. This confirms it’s the right person and that there are no unauthorized resources or people in the room.

Using remote proctoring software can also help institutions meet state, federal, and accreditation requirements for identity and attendance verification.

“Honorlock helps us adhere to state and federal requirements, in addition to accreditation requirements, because it provides the opportunity to capture identity verification and attendance verification at the same time. Through Honorlock and Canvas, we were able to verify identity, the time and date stamp, and provide that documentation to the auditor easily.”

Dr. Lee Conerly, Director of Academic Instruction, Coastal Alabama Community College

Improving testing with reporting

Exam reports after the test offer a ton of useful information, but they need to be organized and easy to understand to be helpful. Otherwise, it’s just a lot of data that never gets used.

Look for a solution with an analytics dashboard that lets you:

  • Understand the most common exam violations
  • Compare the use of proctored testing from one semester to the next
  • Analyze the impact of specific exam features

Honorlock Analytics Hub™

Honorlock’s Analytics Hub provides extensive data that institutions and faculty can use through a single, easy-to-use portal that saves time, improves assessments, strengthens academic integrity, and supports students along the way.

“The Honorlock Analytics Hub™ has helped us tell our story. It truly tells the story of where our students are and what their needs are.”

Stephanie Ungerank, Director of Distance Education, Arkansas State University-Beebe

“Honorlock was more than a tool to guard or block students from using inappropriate information. It was also a means to detect and determine many different ways that students approach the exams. Because of access to the wealth of data and information, I became better able to utilize it.”

Ryan P. Mears PhD, Lecturer, University of Florida

Accessibility and accommodations

To be truly accessible, proctoring software should support a range of student needs. It should follow web accessibility standards, work with assistive technologies like screen readers, and allow instructors to set accommodations such as extended time, breaks, multiple attempts, or testing without a webcam. It should also include failsafes that keep the system working even when internet speeds drop below streaming thresholds. And to support students with busy or unpredictable schedules, it should offer 24/7 proctored exams with access to live support whenever they need it.

“Honorlock helps instructors ensure that students’ individual needs are met. An instructor can specify individual student accommodations, and then the live proctor will ensure that those accommodations are afforded.”

Dana Einfeld, Ed.D, Division Chairperson, Academic Transfer Mathematics & Engineering, Coastal Alabama Community College

“Honorlock helps us get to the core of our mission here at Warrington: being able to offer an MBA to just about anybody.”

Naz Erenguc, Director of Admissions
Warrington College of Business, University of Florida

Online proctoring services can prevent students from gaining an unfair advantage during online testing and written academic work, but it’s not just about catching students who cheat. 

That’s a big part of it, but not the whole point. It’s also about giving students the access and flexibility they need and making education possible for more people. As a result, it helps create a level playing field where online testing feels just as solid and valuable as it would in a classroom or testing center.

Schedule a Demo

How to Cite AI in APA, MLA, and Chicago

How to cite AI in APA, MLA, and Chicago with examples and templates

Using generative AI tools is pretty cool because they can help improve your writing or you can make images, videos, and audio of basically anything you can think of. But then APA, MLA, and Chicago citation styles* showed up with a bunch of rules and policies we have to follow.

*fun-ruiners

The good news? This article has examples and copy/paste templates for citing AI in each style.

We’ll update this page as the citation guidelines change (which is pretty often), so save the link or bookmark it for quick reference. To bookmark, press Ctrl+D (Windows) or Cmd+D (Mac).

Links to share specific sections with learners

Copy the link to a citation style to share with your learners.

  • APA: honorlock.com/blog/how-to-cite-ai/#apa
  • MLA: honorlock.com/blog/how-to-cite-ai/#mla
  • Chicago: honorlock.com/blog/how-to-cite-ai/#chicago

How to cite AI in APA

APA AI citation format examples

Citing AI-generated text in APA

Reference list entry
  • Template: Author. (Date). Title (Month Day version) [Additional Descriptions]. Source
  • Example: OpenAI. (2025). ChatGPT (May 14 version) [Large language model]. https://chatgpt.com/
In-text citations
  • Parenthetical citation: (Author, Year), e.g., (OpenAI, 2025)
  • Narrative citation: Author (Year), e.g., OpenAI (2025)
Context for each part of the APA AI citation template
  • Author: The company that owns the AI tool (e.g., OpenAI owns ChatGPT)
  • Date: The year of the version you used.
  • Title: The name of the model (e.g. ChatGPT, Claude, Gemini). The version number is included after the title in parentheses. According to APA, you don’t need to include the specific version, like ChatGPT-3 or ChatGPT-4, in the title because “ChatGPT” is the general name.
  • Bracketed text: Describe the type of AI model. We used "Large language model" in the example above because that's how OpenAI refers to ChatGPT. If it's an AI tool that creates images or videos, use "AI image generator" or "AI video generator"
  • Source: Use a link that takes readers directly to the tool or model, not just the company’s homepage. For example, link them to https://chatgpt.com/ instead of https://openai.com/.
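
To see how the parts above fit together, here’s a minimal sketch of a hypothetical helper (our own illustration, not an official APA tool) that assembles the APA reference string from those parts:

```python
# Hypothetical helper that assembles an APA-style AI reference entry
# from the parts described above. Field names are our own, not APA's.
def apa_ai_reference(author, year, title, version, description, source):
    """Return: Author. (Year). Title (Version) [Description]. Source"""
    return f"{author}. ({year}). {title} ({version}) [{description}]. {source}"

ref = apa_ai_reference(
    author="OpenAI",
    year=2025,
    title="ChatGPT",
    version="May 14 version",
    description="Large language model",
    source="https://chatgpt.com/",
)
print(ref)
# OpenAI. (2025). ChatGPT (May 14 version) [Large language model]. https://chatgpt.com/
```

The output matches the reference list example above; swap in the author, version, and source for whichever AI tool you used.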

How to cite AI generated images and videos in APA

APA hasn’t published specific guidelines for citing AI-generated images or videos, but you can cite them in a format similar to the examples below.

Example APA citations for AI images:
  • Reference: Krea. (2025). Krea (Ideogram 3.0) [AI image generator]. https://www.krea.ai/image
  • In-text citations: Krea (2025) or (Krea, 2025)
  • Add a note that’s similar to the one under this AI-generated image of a chameleon.
Example APA citations for AI videos
  • Reference: OpenAI. (2025). ChatGPT (Sora) [AI video generator]. https://sora.chatgpt.com/
  • In-text citations: OpenAI (2025) or (OpenAI, 2025)
AI-generated image of a chameleon walking on a green leaf covered in water droplets with a rainforest background.
Note. Image generated using the prompt “Create a photorealistic close-up image of a chameleon walking on a leaf,” by Krea, Ideogram 3.0, 2025 (https://www.krea.ai/image).

Key points to know about APA policies for citing AI

Here are important points to know about APA policy guidelines for citing AI-generated content by ChatGPT, Claude, and Gemini.

How does APA define AI?

APA defines AI as generative LLM tools, but it doesn’t include other forms of AI tools, such as grammar checkers, citation tools, or plagiarism detectors.

What to do if you cite AI in an APA publication

If you cite AI in an APA publication, you need to: 

  • Cite it using the software citation format
  • Disclose it in the Methods section
  • Upload the full output of the AI in an appendix or as supplemental material, including the prompts you used

You can use AI for editing, but you need to disclose how it was used.

AI can’t be listed as an author on research papers

According to APA guidelines, AI tools like ChatGPT and Gemini can’t be listed as authors because they don’t meet authorship criteria since they can’t take responsibility for the work or consent to its publication.

Author responsibility

Authors are entirely responsible for verifying the accuracy of all AI-generated content, which includes text and any citations it provides.

How to cite AI in MLA

MLA AI citation formats and examples

How to cite AI-generated text in MLA

MLA AI citation for the Works Cited list:
  • Template: “Prompt text” prompt. Title of Container, Day Month version, Publisher, Day Month Year, location.
  • Example: “Explain how Edgar Allan Poe used alliteration in his writing” prompt. ChatGPT, 14 May version, OpenAI, 20 May 2025, https://chatgpt.com/.
MLA AI in-text citation example:
  • (“Explain how Edgar Allan Poe used alliteration in his writing”)
Context for each part of the MLA AI citation template:
  • Prompt text (Title of Source): Add the prompt you used or briefly describe what the AI tool generated. 
  • Title of Container: The name of the AI tool (e.g., ChatGPT, Claude, etc.)
  • Version: Include the AI tool version and the release date, if it's available.
  • Publisher: Company that made the AI tool (e.g., OpenAI, Anthropic, etc.)
  • Date: The date the content was generated
  • Location: URL for the AI tool.
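The template above is mechanical enough to fill in with a short helper. Here’s a minimal Python sketch (the function name is our own invention, not an MLA tool) that assembles a works-cited entry from the template parts:

```python
def mla_ai_citation(prompt, container, version, publisher, date, location):
    """Fill the MLA template: "Prompt text" prompt. Container, version,
    Publisher, date, location."""
    return (f"\u201c{prompt}\u201d prompt. {container}, {version}, "
            f"{publisher}, {date}, {location}.")

entry = mla_ai_citation(
    "Explain how Edgar Allan Poe used alliteration in his writing",
    "ChatGPT", "14 May version", "OpenAI", "20 May 2025",
    "https://chatgpt.com/",
)
# → “Explain how Edgar Allan Poe used alliteration in his writing” prompt.
#   ChatGPT, 14 May version, OpenAI, 20 May 2025, https://chatgpt.com/.
```

As with any citation helper, verify the output against the current MLA guidance.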

How to cite AI-generated images in MLA

AI-generated images require a caption that includes a description of the prompt along with the AI tool, version, and date the image was created.

The text below the image of the napping golden retriever shows how to cite an image generated with ChatGPT’s image generation tool, DALL·E 3.

DALL-E lets users create a shareable link. If you use a shareable link, include that specific URL instead of the general site link.

Oil painting of a golden retriever sleeping on a brown couch.
Fig. 1. “Oil painting of a golden retriever napping on a couch” prompt, DALL·E, version 3, OpenAI, 20 May 2025, https://chat.openai.com/.

Key points to know about MLA guidelines for citing AI

Here are a few key points to know about citing generative AI in MLA:

How does MLA define AI?

MLA defines generative AI as a tool that analyzes and/or summarizes content from a set of information available on the internet (e.g., web pages, books, and other writing) and uses that information to generate original content.

Be transparent about how you use generative AI in your work

Always cite when and how you use generative AI tools, whether it’s paraphrasing, quoting, translating, or generating content like text, images, video, and even code or data.

However, if you use an AI tool to find a source, just cite the source itself; you don’t need to cite the AI tool you used to find it. For example, if Google Gemini directs you to a research study that you use in your work, you’d only cite the research study, not Google Gemini.

Do not treat the AI tool as an author

MLA guidelines are clear that AI tools can’t be listed as authors.

Author responsibility

Similar to APA guidelines, MLA indicates that it’s up to the author(s) to verify any AI-generated content used in their work.

How to cite AI in Chicago Style

Citing AI in Chicago style is less rigid than in APA and MLA. Chicago says to cite AI content in a note, not in the bibliography, because the chat can’t be shared like a source. But some AI tools now offer shareable links, so this rule might change.

Example AI citations in Chicago

In most cases, you can simply acknowledge the AI tool in your text (e.g., “The list of tropical fish was generated by ChatGPT”).

However, if you need a more formal citation for a research paper, for example, you’d add a numbered footnote or endnote like this: Text generated by Claude, Anthropic, May 20, 2025, https://claude.ai/.

If you’ve edited the AI-generated text, just mention it at the end of the note, like this: Text generated by Claude, Anthropic, May 20, 2025, https://claude.ai/. Edited for style and content.

Detecting and preventing AI cheating

If you have concerns about chatbot cheating, you aren’t alone. While AI detectors and plagiarism checkers won’t help much because they’re ineffective once AI-generated text is edited, remote proctoring software can detect and block AI tools during exams and written assignments.

ChatGPT vs. Honorlock Online Proctoring

Using online proctoring services to stop students from cheating with ChatGPT

ChatGPT has had educators nervous since it became available to the public in the fall of 2022. It wasn’t the first generative AI chatbot, but it was the one that got everyone’s attention. Since then, many similar AI chatbots have popped up, like Gemini and Claude, and other chatbot-like tools such as Perplexity and NotebookLM are gaining popularity.

And people use chatbots for just about everything now. They can write everything from long-form essays to punny jokes, solve math problems, summarize and simplify text, and write code. Some even use them to organize their schedule or just to vent about their day.

The problem is that ChatGPT and other chatbots have been used to cheat on exams (which is why you’re here). ChatGPT is highly capable and there are very few exams it can’t help with.

Early-ish versions of ChatGPT passed these exams

ChatGPT exam scores for UBE, CPA, LSAT, and GRE

2025 Guide to Preventing Chatbot Cheating

6-part guide on how chatbots work, ways to use them in courses, how to prevent cheating, and guidance to help start AI initiatives.


Can ChatGPT be detected as plagiarism?

Plagiarism tools can’t reliably detect ChatGPT because they look for text that’s copied exactly from a source or closely resembles it. That’s a big problem because ChatGPT doesn’t copy directly from the resources it was trained on, like websites, research studies, news, and code.

It’s similar to how you learn from all kinds of different sources. But if someone asks you a question, you don’t just recite answers word for word. You put it into your own words, make connections, and explain it differently depending on who you’re talking to. It isn’t a perfect analogy, but that’s the gist of how ChatGPT works and why plagiarism detection doesn’t.

Can AI detectors detect ChatGPT?

If a student copies text directly from ChatGPT without making any edits, AI detectors are usually accurate. But realistically, students will probably make a few changes before submitting. Once AI-generated text is altered, even if another AI tool is used to make the changes, the accuracy of AI detection decreases (Liu et al., 2024; Weber-Wulff et al., 2023).

How does AI detection work?

AI detection assesses how predictable the writing is (perplexity), how sentence length and structure vary, and whether it sounds a little too “cookie cutter.”

Human writing usually has more variety and specific details, so when text is predictable or the sentences are all a similar length, AI detectors might flag it as AI-generated.
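As a toy illustration of the sentence-variation cue (not a real detector, and far simpler than what commercial tools do), here’s a short Python sketch that scores how much sentence lengths vary:

```python
import re
from statistics import mean, pstdev

def burstiness(text):
    """Score how much sentence lengths vary (coefficient of variation).

    Uniform sentence lengths are one cue detectors associate with
    AI-generated text; this toy score is illustrative only.
    """
    sentences = [s for s in re.split(r"[.!?]+\s*", text) if s]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) < 2:
        return 0.0
    return pstdev(lengths) / mean(lengths)

uniform = "The cat sat down. The dog ran off. The bird flew away."
varied = "Stop. The dog ran off into the yard before anyone even noticed. Birds flew."
```

Here `burstiness(uniform)` is 0 (every sentence is four words), while `burstiness(varied)` is well above it, which is the kind of difference real detectors weigh alongside many other signals.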

What is an acceptable AI detection score?

That’s the million-dollar question, but there’s no real answer. Whether the AI detector says the text is 10%, 30%, or 85% AI-generated, educators have to make that call.

How can Honorlock’s online proctoring software prevent the use of ChatGPT for written assignments?

Honorlock’s hybrid proctoring solution prevents the use of ChatGPT using a combination of AI and live proctors. Some of the proctoring features we use to control AI chatbots are covered below.

Detect cell phones and other secondary devices

Several of Honorlock’s remote proctoring features work together to detect cell phones and other secondary devices (other laptops, tablets, smartwatches, etc.), so you don’t have to rely on a live proctor to catch it in real time.

Honorlock detects when test takers try to access test bank content using a cell phone or other device, and it uses exclusive AI-powered Apple Handoff technology to spot nearby Apple devices during the exam.

Lock the browser

With Honorlock’s BrowserGuard, instructors can block access to all external resources, such as websites, applications (like ChatGPT), and browsers, or allow access to specific resources, like research journals and case studies, software such as Word, or other online word processors like Google Docs.

This gives educators the flexibility to let learners access supplemental materials while still securing the exam against unauthorized use of chatbots and other AI writing tools.

Honorlock’s BrowserGuard also records the participants’ desktops and prevents keyboard shortcuts, like copying and pasting. If copy/paste is attempted, the participant receives a notification that it isn’t allowed and the AI marks it as a medium flag.

However, locking the browser is just one piece of the puzzle in preventing chatbot cheating. If you’re only using browser lockdown software, there’s no way to know if someone is:

  • Using a phone to access ChatGPT or browse the internet
  • Asking Siri or Alexa to access apps/sites
  • Having someone else write the essay or take the exam
  • Typing what a friend is reading to them from off-camera

Use smart voice detection

If your proctoring platform doesn’t have smart voice detection, participants can use voice assistants like Siri or Alexa to open ChatGPT, ask questions, and have the answers read aloud.

Honorlock’s Smart Voice Detection listens for specific keywords or phrases, like “Hey Siri” or “OK Google,” and it records and transcribes audio while the exam is in progress. If it detects potentially dishonest behavior, a live proctor is alerted to review the situation and intervene if necessary.

Scan the room and record behavior

Honorlock’s remote proctoring platform makes it quick to check the room for unauthorized resources, like cell phones, notes, and even other people. And our platform monitors and records behavior throughout the exam.

This way, you’ll know that test takers aren’t accessing ChatGPT from other devices, using their notes, or asking other people for help.

Set the amount of time they have to write the essay once they begin

Instructors can create rules that specify a certain number of hours participants have to complete the written assignment once they begin. Honorlock’s remote proctoring software and services are available 24/7/365, so they can write their essays anytime they’re ready within your set time frame, and you’ll know it’s being proctored.

Other online proctoring tools from Honorlock

Live Pop-In® (AI + live proctoring)

Honorlock’s Live Pop-In secures online assessments and assignments with a hybrid proctoring approach that uses AI to monitor the test environment and behavior, and a live proctor reviews anything the AI flags to determine if they need to intervene or not.

The AI monitors exams and assignments and alerts a live proctor if something looks suspicious. The proctor reviews the situation and only steps in if absolutely necessary, so test takers aren’t interrupted unless there’s a real concern. This blended proctoring approach delivers a less intimidating, non-invasive experience for participants.

Finds leaked test content on the internet

Honorlock’s Search and Destroy™ technology automatically scours the internet for leaked test content and lets instructors send takedown requests with one click. This way, you’ll know your test content isn’t available on sites like Quizlet, Chegg, Reddit, and Quora.

Verifies test takers’ identity

You need to know that the person taking the proctored exam is the same person getting credit in the class. Honorlock’s ID verification takes about 60 seconds. It captures a picture of the test-taker along with their photo ID.

Stay ahead of AI cheating

ChatGPT and other AI chatbots are powerful tools with advantages and disadvantages in online learning. Use Honorlock’s solution to stay ahead of the curve and in control of if, how, and when chatbots and other AI are used in your courses and assessments.

Fill out the form to see how Honorlock works.

More resources

What is Cluely & How to Block It

How to Block Cluely AI During Exams and Interviews to Prevent Cheating

Cluely, the AI application “built to cheat on everything,” as its founders Chungin “Roy” Lee and Neel Shanmugam describe it, has raised major concerns in higher education, professional education, and among employers by putting the integrity of exams and interviews at risk. Cluely’s manifesto even attempts to justify using the app for cheating by claiming that if everyone cheats, no one is really cheating. But what is it, and why are so many concerned?

What is Cluely AI?

Cluely AI is an application that sits on screen as a transparent overlay that feeds users real-time responses based on what it sees on screen and hears through audio during exams and interviews. It can answer questions on the screen during an exam, or it can hear what the interviewer asks and write responses quickly enough for the user to read aloud and sound natural.

In other words, Cluely is basically an invisible AI assistant that makes cheating really easy and detection really difficult.

Mockup example of what Cluely AI looks like during an online quiz. Displays the application’s AI overlay answering the quiz question.
Example of what Cluely's transparent overlay looks like during an exam

Why is Cluely AI so difficult to detect?

Cluely is difficult to detect because it hides activities by:

  • Bypassing keyboard logging with hidden global shortcuts
  • Masking tab activity with a transparent overlay that tricks the system into thinking the test taker never leaves the exam tab
  • Staying invisible during screen sharing

Cluely intentionally hides these activities, which makes it almost undetectable. Almost.

Honorlock proctoring platform blocks Cluely (and all other applications)

Honorlock makes it easy to detect and prevent the use of Cluely. Exam admins simply enable the Honorlock Application to block Cluely (and all other applications) during exams, assignments, interviews, presentations, and any other tasks you can think of.

Admins also have the flexibility to allow specific applications while keeping all others blocked. They can use a simple drop-down menu to allow tools like Microsoft Word for proctored essays or Excel for tasks like building balance sheets or analyzing data.

Simply put, Honorlock makes it easy for admins to block Cluely and all other applications/websites, or allow the ones they want based on their exam goals.

Other remote proctoring tools to secure integrity

In addition to blocking AI like Cluely, Honorlock’s hybrid proctoring solution, which combines AI with live proctoring, offers several tools to detect and prevent the unauthorized use of AI tools, such as ChatGPT, during online exams. Honorlock also provides extensive proctoring tools to protect exam integrity from every angle, such as:

  • Detecting cell phones and other devices: AI detects cell phones and other secondary devices like tablets and smartwatches without relying on a proctor to catch them in real time.
  • Identifying leaked exam content: Search & Destroy™ automatically searches the internet to identify leaked exam questions and provides a one-click option to send DMCA takedown requests.
  • Verifying ID and scanning the room: Before the exam begins, the webcam is used to verify identity and scan the room for unauthorized resources like books, notes, secondary devices, and even other people.
  • Monitoring behavior: Honorlock monitors and records test takers’ behavior and desktop activity during assessments, interviews, and other tasks.
  • Detecting speech: Listens for keywords and phrases that potentially indicate cheating, such as “Hey Siri,” instead of flagging unimportant sounds like coughing or even a test taker speaking aloud as they work through a problem.

Honorlock integrates with Canvas, Blackboard, Moodle, D2L, Open LMS, Intellum, and Docebo, and provides open APIs for custom integrations. Exam admins also have access to in-depth, filterable reports with time-stamped HD recordings, all in one easy-to-use portal.

Learn more about Honorlock


AI Prompting Examples, Templates, and Tips For Educators


Prompt engineering is a complex, delicate art form that requires sophisticated training and ongoing, dedicated effort. Even then, only a prestigious few can hope to master advanced AI prompting techniques… Sike.

Writing effective prompts is actually pretty simple if you’re clear, specific, and provide the chatbot with context. We cover the basics below, reveal the latest prompting technique, Quad-shotting, and you can skip ahead to the prompt examples, templates, and tips.

What is a prompt?

A prompt is the message you type into a chatbot or other generative AI to tell it what you want. A prompt can ask a question, request help, or give the AI a task like summarizing text, generating code, or translating text into another language.

What is prompt engineering?

Prompt engineering is just how you write your requests to get the response you’re looking for from a chatbot. “Engineering” makes it sound really technical (and sometimes it can be), but for most educators’ use cases, you just need to be clear and specific about what you want and provide details to help the chatbot tailor its response.

What is the best AI chatbot?

The best AI chatbot really depends on what you’re doing. If it’s a general writing task, ChatGPT and Claude are likely your best bet. If it’s technical writing, Gemini and DeepSeek Chat are solid options. You can compare AI chatbots here.

Types of prompting techniques + unnecessary terms

There are a ton of terms that make prompting seem like an extremely complex and technical process, but they mostly just describe things people naturally do when they use AI.

Take shot prompting, for example. It includes Zero-shot, One-shot, and Few-shot. It just refers to the number of examples you provide in your prompt that show the chatbot what you want.

  • Zero-shot = 0 examples
  • One-shot = 1 example
  • Few-shot = a few examples
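Since the “shots” are just examples pasted into the prompt, the difference is easy to show as plain strings. Here’s a small Python sketch (the task and examples are made up; you’d paste these into any chatbot):

```python
# The task and examples are invented for illustration.
task = "Classify the tone of this student email as formal or informal."

zero_shot = task  # 0 examples

one_shot = (      # 1 example
    task + "\n\n"
    "Email: 'Hey prof, gonna miss class lol' -> informal\n"
)

few_shot = (      # a few examples
    one_shot +
    "Email: 'Dear Professor Smith, I am writing to request an extension.' -> formal\n"
    "Email: 'yo is the quiz tmrw??' -> informal\n"
)
```

The only thing that changes between the three is how many labeled examples the chatbot sees before your real input.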

Then there’s prompt chaining and chain of thought prompting, which sound similar but are different, and neither is overly technical.

  • Prompt chaining means giving the chatbot one prompt at a time instead of everything at once. The idea is that each answer helps with the next step/response.
  • Chain of Thought (CoT) prompting asks a chatbot to show its reasoning before answering. It can be as simple as adding, “Explain your steps before answering,” to the end of a prompt.
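Both techniques above can be sketched in a few lines. The `send` function below is a stand-in for any chatbot call (it’s hypothetical, not a real API); here it just echoes its input so the sketch runs without an account:

```python
def chain(prompts, send):
    """Prompt chaining: send one prompt at a time, feeding each reply
    into the next prompt via the {previous} placeholder."""
    reply = ""
    for prompt in prompts:
        reply = send(prompt.format(previous=reply))
    return reply

def with_cot(prompt):
    """Chain-of-thought prompting is just a suffix on a normal prompt."""
    return prompt + " Explain your steps before answering."

# Toy send() that echoes, so the sketch is runnable without an API:
echo = lambda p: f"[reply to: {p}]"
result = chain([
    "Outline the causes of the 1929 stock market crash.",
    "Using this outline, draft three quiz questions: {previous}",
], echo)
```

In a real chat window, “chaining” is simply sending the second message after reading the first reply; the code just makes the hand-off explicit.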

Not so daunting, right? Not so fast. Here’s the real expert revealer…

Have you heard of Quadrillion-shot prompting, which is the latest and greatest prompting technique? Quadrillion-shot prompting, also called Quad-shotting, is when you use a quadrillion examples in your prompt. We just made it up to show how unnecessary some prompting terms are, especially the ones in the dropdown below.

Unnecessary prompting terms

Thread of Thought (ToT) prompts just include something like, “Walk me through how [topic/subject] works in a step-by-step manner while summarizing each one.”

Contrastive Chain of Thought (CCoT) prompting gives the chatbot correct and incorrect examples/reasoning.

Generated Knowledge prompting is when you ask the chatbot about a topic and then ask it to answer a question based on its response. 

  • Prompt 1: Explain how [topic/subject] works and create a list of key concepts.
  • Prompt 2: Using the information you provided, explain the steps to [related aspect of topic/subject].

Meta-prompting is basically just asking the chatbot to create its own prompt (or improve the one you provided), then following up with another prompt asking it to answer the prompt it created or improved.

Chatbot prompting tips for educators:

Provide specific context and clear instructions

Providing specific, relevant details can improve the chances of getting the response you want.

Think of writing a prompt like writing a grocery list for someone who’s shopping for you. The chatbot will usually get the items on your list, but if you want something specific—like a brand, flavor, or amount—you have to say so.

If you want vanilla ice cream, adding “snacks” to your list doesn’t really help. Even adding “ice cream” might get you the wrong flavor. Be specific: ½ gallon of Haagen-Dazs Vanilla Ice Cream.

Details to consider providing when writing prompts for AI chatbots:

  • Voice: Who should it write like? e.g., “Write as a historian” or “Write like [specific person]”
  • Tone: Casual, formal, humorous, academic, dramatic?
  • Audience: Who’s reading this? Students? What kind—third graders, college freshmen, PhD researchers? What’s their major? How much do they already know?
  • Format: How should the response be provided to you? Paragraphs or bullet points? If you want a chart or table, be specific about the type, how many rows and columns, titles, etc.
  • Length: How many words, sentences, paragraphs, bullet points, etc.?
  • Context: Any key details? What information and nuances are needed?
  • Examples: Provide good and bad examples so the chatbot understands what you want.
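One way to keep these details straight is to assemble the prompt from labeled fields. Here’s a minimal Python sketch (the field names mirror the list above and are our own convention, not a chatbot requirement):

```python
def build_prompt(task, **details):
    """Assemble a prompt from labeled detail fields.

    Field names are our own convention; skip any you don't need.
    """
    order = ("voice", "tone", "audience", "format", "length", "context")
    lines = [task] + [f"{f.capitalize()}: {details[f]}" for f in order if f in details]
    return "\n".join(lines)

prompt = build_prompt(
    "Summarize the causes of World War I.",
    tone="academic",
    audience="college freshmen",
    length="3 bullet points",
)
```

Any field you leave out is simply omitted, so the same helper works for quick one-line prompts and heavily specified ones.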

AI prompt examples and templates

Use these prompting examples and templates as a starting point, and tailor your prompt with information specific to your assignments and assessments.

Template: Based on the text I provided, create [number] [question type] questions to assess [audience] on their understanding of [topic(s)]. [Q&A requirements and specifications.]

Example: Based on the text I provided, create 30 multiple-choice questions to assess participants preparing for a Food Safety Manager Certification exam on their understanding of methods to prevent cross-contamination. Each question should be objective, concise, and written in plain language. Provide four answer choices per question, with only one correct answer.

More tips on how to write better exam questions.

Template: Based on the following [question type] questions, create a [alternative question type] question(s) that assesses the same underlying concepts or skills. The new question should encourage [objective/outcomes (e.g., critical thinking and practical application rather than recall)]. [Add questions here.]

Provide a clear explanation of your reasoning for how you adapted the original questions into this format. Then, include a concise description of what an appropriate student response should contain.

Example of the 1st paragraph from above: Based on the following multiple-choice questions, create an essay question that assesses the same underlying concepts or skills. The new essay question should encourage critical thinking and practical application rather than recall. [Add questions here.]

Note: If you’re repurposing the question for a different audience or subject, briefly describe them so the chatbot can adapt its response. 

Learn more about writing better exam questions.

Writing objective exam rules is waaaay trickier than it seems. Many times, rules that seem crystal clear can still be misinterpreted by test takers. Here’s a complete guide with examples of objective exam rules, and you can give the prompt below to a chatbot.

Prompt to use: Create objective online rules for test takers. The rules should comprehensively address:
*Behavior (e.g., no talking or using cell phones to look up answers)
*System requirements (e.g., computer with a functioning webcam and microphone)
*Test environment (e.g., quiet room, clear desk/surface, no other people present)

Use direct, unambiguous language that leaves no room for misinterpretation. Provide the rules in a numbered list that can be copied and pasted into the assessment platform/LMS. 

Here are two examples of basic test rules followed by improved versions. Use the improved versions as examples of the level of clarity and specificity expected as you create the set of rules:

*Rule 1 (basic): No talking during the test.
*Rule 1 (improved): No communicating with other individuals by any means, whether verbal, non-verbal, or electronic.

*Rule 2 (basic): Your desk must be clear of all items except for the device you use to take the test.
*Rule 2 (improved): The testing area and any surface your device is placed on must be clear of all items except the device used to complete the test. This includes books, papers, electronics, and other personal belongings.

Note: When you type an asterisk (*) at the start of a line, chatbots, generally speaking, get the gist that you’re trying to create a bullet point. 

Template: Based on the project instructions I provided below, create a rubric that evaluates [add what’s being assessed, e.g., specific skills and competencies]. Each item should include [type of performance scale, e.g., a five-point numeric scale from 1 to 5 (1 = Novice, 5 = Excellent) or a three-point scale with the levels: Excellent, Developing, and Needs Improvement].
Format the rubric in a [type, e.g., table or chart] to use in [format, e.g., Word Doc, Google Sheets, LMS, etc.]. Here are the assignment details and instructions: [add information]

Example: Based on the project instructions I provided below, create a rubric that evaluates practical application skills. Each item should include five performance levels using a numeric scale from 1 to 5 (1 = Novice, 5 = Excellent).

Format the rubric in a table to use in a Word Doc and include a column for written feedback. Here are the assignment details and instructions: [add information]

Note: This prompt template is just that: a template—a starting point. You’ll really need to fill in the blanks with as much information and context as possible—learning goals, topics/subjects, specific characters, realistic issues and scenarios—to help the chatbot tailor the case study to your course needs.

Create a [length]-word case study about [specific scenario/situation] for [audience, e.g., undergraduate business students, professionals in a certification program, etc.].

The case study should include:

*Comprehensive, realistic background information about [scenario/situation]

*Key characters, stakeholders, and/or groups involved

*Specific opportunities, challenges, and decision points for the learner to evaluate

*Five short-answer questions that require in-depth problem-solving and a proposed plan to address the situation

Write an engaging two-part discussion prompt for [audience, e.g., undergraduate business majors, adult learners in professional development programs, etc.] focused on [topic/subject/situation].

Part 1 (initial response): Prompt learners to connect their real-world experiences and/or prior knowledge to [topic/subject/situation] by sharing specific examples or situations that highlight how the topic applies to their personal, academic, or professional life.

Part 2 (respond to peers): Ask learners to review and analyze at least two of their peers’ posts and respond with [e.g., thoughtful questions, alternative viewpoints, etc.] to extend the discussion.

Prompts that summarize text—whether it’s an entire research paper or a paragraph from an article—can be as simple as, “Summarize this.”

But you can also ask the chatbot to summarize it in specific formats or focus on certain topics. Just copy and paste the text into the chatbot or upload the document, then start the prompt with something like “summarize” or “simplify” (or any other related terms), and then include details on the length, format, and focus. 

Here are a few example prompts you can try:

  • Summarize this research study into 2-3 concise sentences, then provide bullet points on the important information in each section, especially the sections about data collection methods and results. 
  • Simplify this text and explain it in practical, easy-to-understand language.
  • Explain what this text is saying in simple terms that [audience] can understand.
  • Shorten this text into a [resource type, e.g., study guide, FAQ, etc.] for [audience].

I’m a [job role] using this image in my [online course module, presentation, article, etc.] about [subject/topic]. Please write the following for the image:

  1. Descriptive alt text (approximately 125 characters or fewer)
  2. Detailed image description for accessibility
  3. Suggested file name (use lowercase letters and hyphens, no spaces)
  4. Text caption to display alongside the image

Note: You can upload images to most AI chatbots, such as ChatGPT, Gemini, and Claude.

Don’t overthink it

  • Don’t worry about order: Chatbots use machine learning to process your whole prompt at once, so the order of the information you provide usually doesn’t matter that much in most cases. Just focus on providing the right information.
  • Write naturally: Chatbots can “understand” everyday language (even our slang and sarcasm) thanks to a technology called natural language processing, which is part of how large language models (LLMs) work. So, you don’t need to use special wording or commands. Just write to it like you would a real person, and it’ll usually get the gist of what you’re saying.

Remember chatbots’ original purpose and fact-check everything

Chatbots are used for many things, but their original purpose is still the same: to chat with you. 

You might ask them to explain complex topics, check facts, or even solve math problems—but at their core, chatbots are built to converse. They aren’t inherently designed to conduct research, verify information, or solve math problems. They can help with those tasks, and some even have features like “Deep Research,” but those are just extensions of their original purpose.

That said, always, always, always fact-check chatbot responses because they don’t generate text based on facts. They just predict what word is most likely to come next.

Treat all conversations with AI like public conversations

Whether it’s a public or private chatbot, don’t include any personally identifiable information (PII) or sensitive organizational details in your prompts or in the files you upload (Excel, Word, etc.).

Anonymize information to protect learners, yourself, and your organization. For example, use “University/Company ABC” instead of your company name, or “Student/Employee A” instead of specific names.
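A simple find-and-replace pass before you paste text into a chatbot can handle this. Here’s a minimal Python sketch (the names and placeholders are made up for illustration):

```python
def anonymize(text, replacements):
    """Swap identifying strings for placeholders before sharing text
    with a chatbot. `replacements` maps real names to stand-ins."""
    for real, placeholder in replacements.items():
        text = text.replace(real, placeholder)
    return text

# Names below are invented for the example.
safe = anonymize(
    "Jane Doe failed the midterm at Acme University.",
    {"Jane Doe": "Student A", "Acme University": "University ABC"},
)
# → "Student A failed the midterm at University ABC."
```

Keep the mapping somewhere private so you can interpret the chatbot’s response, and remember that exact string matching won’t catch misspellings or indirect identifiers.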

Control AI use during exams and assignments 

Generative AI chatbots have plenty of benefits for educators and learners.

They can serve as a smart assistant that helps educators develop course materials, personalize learning experiences, and even improve accessibility. When learners use them appropriately—which means not overly relying on them or simply copying and pasting the responses as their own—chatbots can increase engagement, improve learning and retention of information, and sharpen critical thinking and problem-solving skills. That said, when they’re overused, the exact opposite is true.

The good news? You can control if, when, and how learners use AI during online exams and assignments by using hybrid virtual proctoring. It allows you to block AI entirely or allow it at certain times. So, whether it’s a traditional online exam, a written assignment, or even a math test (online or handwritten), you’ll know learners didn’t use AI.

How Students Justify Academic Dishonesty

5 ways students justify cheating

Students cheat for all sorts of reasons beyond wanting better grades or just because they can, but what’s really fascinating is how they justify it. Understanding how students justify misconduct can help faculty get to the root of academic dishonesty. This article breaks down common mindsets and shares practical ways to address them before they lead to bigger issues, like a “cheating culture.”


1. "Teeeeechnically, I didn’t cheat."

Some behaviors are unethical but not technically against the rules—and students know it.

On top of that, cheating may be unintentional because students don’t always realize that certain actions count as academic violations, especially because expectations vary across classes and disciplines (Waltzer & Dahl, 2023).

All of this blurs the line between what’s considered cheating and what’s simply unethical—and it’s been that way for years.

A student survey from 2010 about whether behaviors were cheating, unethical, or neither found predictable trends in students’ opinions but also highlighted inconsistencies that can make life difficult for educators.

Here are some interesting takeaways from the student survey:

96.4% believe copying exam answers is cheating, but only 73% think letting classmates copy your answers is cheating.

Nearly 40% say working with peers on an online exam without permission is cheating, while the other 60% are split between calling it unethical or not a problem at all.

Just over 9% think witnessing cheating without reporting it is cheating, almost 60% see it as unethical but not cheating, and about 30% say it’s neither.

Behavior | Cheating | Unethical (not cheating) | Neither
Copy classmates’ answers during an exam | 96.4% | 2.3% | 1.1%
Witnessing cheating but not reporting it | 9.2% | 59.6% | 30.3%
Allowing others to look at your answers during exams | 73.3% | 3.4% | 23.3%
Working with peers on online exams without permission | 39.8% | 29% | 30.2%
Asking classmates about exam questions before taking it | 26.7% | 45.6% | 26.6%
Falsely claiming you submitted an assignment | 61.1% | 33% | 4.5%
Copying homework from classmates | 72.9% | 22.6% | 3.9%
Copying work from other students’ past assignments | 52.3% | 31.1% | 16%
Storing answers/formulas in a calculator used in an exam | 74.5% | 15.6% | 9.8%

How to help

Writing clear, objective rules for tests and assignments helps reduce confusion and prevent academic dishonesty. The challenge is making them comprehensive enough to cover key details while keeping them concise—because what seems clear and straightforward often has just enough wiggle room for students to take advantage of.

Give students a detailed definition of what you consider cheating, and talk through example situations. The example situations should be specific because broad examples are ineffective (Awosoga et al., 2021). This makes your definition more relatable and helps them better understand how it applies in real scenarios. 

It’s easy to assume that talking to students about academic integrity will go in one ear and out the other, but it can actually make a difference in reducing cheating in some situations (Malesky et al., 2022; Tatum & Schwartz, 2017).

That said, it should be just one small part of a larger strategy—it helps, but it’s not nearly enough on its own.

2. “Professors don’t seem to care when people cheat.”

Most faculty members believe academic integrity is important, and most realize it’s an issue in higher education (MacLeod & Eaton, 2020; Awosoga et al., 2021), but unfortunately, academic dishonesty is often unreported*.

Why is academic dishonesty underreported if it’s so important? 

According to several faculty surveys, academic dishonesty is underreported because the reporting process is slow, evidence of cheating is rarely conclusive, and faculty have little faith that the institution will enforce consequences for cheating.*

Learn more about why faculty often avoid reporting cheating.

Some faculty survey responses indicated that the process took so long that they forgot the case details.

“When the feedback finally got back to me I’d actually forgotten what had happened… I thought oh jeez, yeah, I remember that now, that was a while ago.” 

“I think it takes too long… It does take weeks sometimes for a decision to come back down.”

What happens if a school allows cheating?

If students believe faculty don’t care about cheating—or don’t care enough to address it—it can quickly become the norm, creating a culture where academic dishonesty is accepted and even expected (Tolman, 2017). And when that happens, academic misconduct happens more often, and each time it happens, students see it as a less serious offense (Shu et al., 2011).

How to help

Talking to students about suspected cheating is awkward and stressful, but these conversations are important because addressing each instance of academic dishonesty helps decrease future cheating (Tatum & Schwartz, 2017).

These conversations don’t have to be as negative as they seem either. But what should you say? What shouldn’t you say? What evidence do you need? This guide to talking to students about cheating has everything you need.

A hybrid proctoring solution provides objective evidence—audio, video, images, clicks, and more—that can support misconduct cases. It offers clear, conclusive evidence that helps faculty and the teams involved in the reporting process.

Universities should invest in faculty training on academic dishonesty to create awareness of school policies and a shared definition of what counts as misconduct. This is particularly important for helping newer faculty members get acclimated and up to speed on integrity policies (Awosoga et al., 2021).

Make it easier to report cheating by asking faculty and staff where the process gets stuck and how it can improve. Use their feedback to cut out unnecessary steps, add helpful tools like electronic forms or scheduling software, and create a system that supports everyone better.

*(Carpenter et al. 2010; MacLeod & Eaton, 2020; Staats et al., 2009)

3. “At work, they’d call it being resourceful, not cheating.”

A student survey by Cole et al. (2014) found that many justify academic dishonesty, like using internet resources or sharing answers during online exams, by arguing that they’ll be able to do it in the workplace.

This rationale also influences students’ prioritization of grades over learning, as one survey response explained: “When you get to the workplace, they teach you what you want to know, your diploma is just your foot in the door for the most part.” (Cole et al., 2014, p. 43).

How to help

Faculty need to show how course material and activities matter in real life so students understand why they should care. When students feel they’re learning for themselves, especially when it’s practical information they’ll use in the future, they’re much less likely to cheat (Anderman & Won, 2018; Carpenter et al., 2010).

4. “I was paraphrasing, not plagiarizing.”

Sometimes, students knowingly plagiarize because they see it as a way to save time—especially when they don’t care about the subject or think it’s relevant to their futures (Moss et al., 2018).

Others may not even realize they’re plagiarizing, like the 70% of students who believe internet resources are public information that doesn’t need to be cited (Larkin & Mintu-Wimsatt, 2015). Students can also unintentionally plagiarize when they’re trying reeeeeally hard to sound overly academic, which is a struggle that some doctoral students face (Fatemi & Saito, 2020).

“Plagiarism can be particularly difficult for students to perceive. The lines between plagiarism and appropriate paraphrasing are often blurry, which can generate considerable disagreement among students and instructors about what constitutes plagiarism.” (Waltzer & Dahl, 2023, p. 131)

How to help

APA or MLA? Chicago, maybe? When do you need to cite a source? What counts as common knowledge? Questions like these can make plagiarism feel confusing, especially with all the gray areas. Make it easier for students by providing helpful resources in the course and teaching them how to cite properly.

Tie written assignments to students’ future work when you can. This can be challenging when teaching students who are taking the course as an elective, not out of interest.

For example, if a business administration major is in an art history class, offer flexibility to write a plan for a profitable art gallery business. While this kind of flexibility isn’t always realistic, even small connections to their future goals can make a difference.

Long- and short-form written assignments can be proctored just like online exams.

During proctored written assignments, students WON’T be able to:

  • Access unauthorized websites, applications, and extensions (including chatbots) unless faculty permit access to them (e.g., allowing students to visit specific articles and case studies).
  • Use their cell phones to look up information.
  • Use books or notes unless approved by the instructor.
  • Paste content into the word processor that was copied before the assignment started.
  • Ask voice assistants like Siri or Alexa for help.

Share links to helpful citation resources at your institution, like the writing center, and from reliable citation websites.

5. "I was so anxious."

Tests are inherently stressful; the more anxious a student gets, the more likely they are to cheat (Eshet et al., 2024). Test anxiety and the fear of failing the test make cheating hard to resist, even for students who typically wouldn’t cheat (Eshet et al., 2024; Salgado et al., 2022).

How to help

Offering practice exams gives students a chance to familiarize themselves with the testing platforms and check to make sure their device meets the minimum system requirements.

Talk to students about accommodations, like extended time or assistive technologies, and how to request them or get help if they need it.

Give students a clear idea of what the test will cover, its format, the types of questions, time limit, etc.

Knowing how and why cheating happens helps educators be proactive against academic dishonesty and create a culture of honesty and fairness. Hybrid online proctoring supports this effort by helping detect and prevent dishonest behavior so educators can focus on teaching.

Anderman, E. M., & Won, S. (2019). Academic Cheating in Disliked Classes. Ethics & Behavior, 29(1), 1–22. https://doi.org/10.1080/10508422.2017.1373648

Awosoga, O., Nord, C. M., Varsanyi, S., Barley, R., & Meadows, J. (2021). Student and faculty perceptions of, and experiences with, academic dishonesty at a medium-sized Canadian university. International Journal for Educational Integrity, 17(1), 1–26. https://doi.org/10.1007/s40979-021-00090-w

Carpenter, D. D., Harding, T. S., & Finelli, C. J. (n.d.). Using Research to Identify Academic Dishonesty Deterrents Among Engineering Undergraduates.

Cole, M. T., Swartz, L. B., & Shelley, D. J. (2014). Students’ Use of Technology in Learning Course Material: Is it Cheating? International Journal of Information and Communication Technology Education, 10(1), 35–48. https://doi.org/10.4018/ijicte.2014010104

Eshet, Y., Grinautsky, K., & Steinberger, P. (2024). To behave or not (un)ethically? The meditative effect of mindfulness on statistics anxiety and academic dishonesty moderated by risk aversion. International Journal for Educational Integrity, 20(1), 6–18. https://doi.org/10.1007/s40979-024-00151-w

Fatemi, G., & Saito, E. (2020). Unintentional plagiarism and academic integrity: The challenges and needs of postgraduate international students in Australia. Journal of Further and Higher Education, 44(10), 1305–1319. https://doi.org/10.1080/0309877X.2019.1683521

Larkin, C., & Mintu-Wimsatt, A. (2015). Comparing cheating behaviors among graduate and undergraduate online business students. Journal of Higher Education Theory and Practice, 15(7), 54–62.

MacLeod, P. D., & Eaton, S. E. (2020). The Paradox of Faculty Attitudes toward Student Violations of Academic Integrity. Journal of Academic Ethics, 18(4), 347–362. https://doi.org/10.1007/s10805-020-09363-4

Malesky, A., Grist, C., Poovey, K., & Dennis, N. (2022). The Effects of Peer Influence, Honor Codes, and Personality Traits on Cheating Behavior in a University Setting. Ethics & Behavior, 32(1), 12–21. https://doi.org/10.1080/10508422.2020.1869006

Moss, S. A., White, B., & Lee, J. (2018). A Systematic Review Into the Psychological Causes and Correlates of Plagiarism. Ethics & Behavior, 28(4), 261–283. https://doi.org/10.1080/10508422.2017.1341837

Salgado, J. F., Cuadrado, D., & Moscoso, S. (2022). Counterproductive Academic Behaviors and Academic Performance: A Meta-Analysis and a Path Analysis Model. Frontiers in Psychology, 13, 893775–893775. https://doi.org/10.3389/fpsyg.2022.893775

Shu, L. L., Gino, F., & Bazerman, M. H. (2011). Dishonest deed, clear conscience: When cheating leads to moral disengagement and motivated forgetting. Personality and Social Psychology Bulletin, 37(3), 330–349. https://doi.org/10.1177/0146167211398138

Staats, S., Hupp, J. M., Wallace, H., & Gresley, J. (2009). Heroes Don’t Cheat: An Examination of Academic Dishonesty and Students’ Views on Why Professors Don’t Report Cheating. Ethics & Behavior, 19(3), 171–183. https://doi.org/10.1080/10508420802623716

Tatum, H., & Schwartz, B. M. (2017). Honor Codes: Evidence Based Strategies for Improving Academic Integrity. Theory into Practice, 56(2), 129–135. https://doi.org/10.1080/00405841.2017.1308175

Tolman, S. (2017). Academic Dishonesty in Online Courses: Considerations for Graduate Preparatory Programs in Higher Education. College Student Journal, 51(4), 579–584.

Waltzer, T., & Dahl, A. (2023). Why do students cheat? Perceptions, evaluations, and motivations. Ethics & Behavior, 33(2), 130–150. https://doi.org/10.1080/10508422.2022.2026775


2025 AI Trends in Higher Education

5 AI trends to expect in higher education in 2025

Important information to consider before reading

Think of AI as a hammer. It’s a tool. One tool. Assuming you have lumber and nails, can you build a house with a hammer? Technically, yes? Maybe? But would you want to live in it? Doubt it.

You can gather every tool and material imaginable, but it takes skills, knowledge, and expertise to use them effectively—and safely—that’s where you come in.

AI can help improve accessibility (discussed later), but the tools themselves need to be accessible. Each AI tool is built for a specific task or set of tasks—like generating text. Some tools can help improve accessibility by generating text to better support students with specific disabilities, like using decodable words and formatting to help students with dyslexia. A tool may do that well, but that doesn’t mean it works with screen readers or other assistive technologies.

In other words, a tool that helps improve accessibility isn’t necessarily accessible itself or compliant with accessibility standards.

Whether you’re implementing AI across the institution or just looking for some course content, one thing is non-negotiable: always, always, always have the right people review and approve it for data privacy and security before using it.

Use AI purposefully, not just because it’s there. It should be a supplemental tool that genuinely helps improve teaching and learning. Otherwise, it’s just another thing students have to figure out how to use without adding real value.

Also, consider whether you’re using each tool for its true purpose. Have you ever seen LinkedIn “experts” bragging that they’ve stumped a chatbot… with a math problem? Pretty impressive—as long as you ignore the fact that chatbots aren’t designed to do math.

Would you use a calculator to write an essay? Hopefully not.

While some AI tools can handle tasks outside their main functionality, they work best when used for what they’re actually designed to do.

Faculty can use chatbots to:
  • Write long and short-form course content
  • Make writing sound more professional
  • Break down complex tasks to complete them step by step
  • Summarize and organize information in seconds
  • Double-check answers to test questions

Students can too. Download our 6-part guide to preventing chatbot cheating in higher education.

1. More schools investing in secure AI chatbots

The benefits of AI chatbots in higher education are clear, but so are the data privacy and security risks. That’s why you can expect to see more colleges and universities offering AI chatbots through a more secure approach: either self-hosting them or working only under strict agreements in which the provider (Google, Amazon, Microsoft Azure) does not use the data to train larger public models.

In other words, students, faculty, and staff can interact with the chatbot and the information won’t be used publicly.

Plus, when colleges offer these and other similar tools, all students can access them, not just those who can afford a subscription.

Benefits of AI chatbots in higher education

Colleges and universities can enjoy many of these benefits regardless of the approach they take, but a secure approach adds peace of mind and aligns with the ethical responsibility to protect sensitive information.

Commonly used chatbots: ChatGPT, Microsoft Copilot, Google Gemini, Claude

Creating, developing, and repurposing course content

Faculty can use chatbots for just about any course-related task, from drafting comprehensive exam rules and rewriting test questions to creating discussion prompts and generating mock datasets for assignments.

Below are templated prompt examples.

Prompt template: Write a discussion prompt for a [type/level (e.g., online graduate)] course within a module about [topics/subjects]. Use open-ended questions that encourage students to share specific personal experiences and challenges them to analyze and relate their experience to [add subject details] to demonstrate their understanding and critical thinking skills.

Example prompt output for an accounting course: Think about a time when you had to create or manage a personal budget. How did your approach compare to the principles of accrual accounting discussed in this module? Reflect on whether you followed a cash-based approach (tracking money when it is received or spent) or an accrual-based approach (tracking income and expenses when they are earned or incurred). What challenges did you face, and what does this reveal about the benefits and limitations of each method? Use specific examples from your experience and connect them to the concepts covered in this module.

Example prompt templates:

“Rewrite the following test question(s) to assess the same concepts. Maintain the same difficulty level and word count, but alter the phrasing and structure.”

“Convert these [existing format (e.g., true or false, matching, etc.)] test questions into [new format (e.g., multiple-choice, fill in the blank, etc.)].”

For better results, include an example of a test question with a revised version.

Prompt template: Create a mock dataset for [type/level, e.g., undergraduate accounting] students to analyze. The dataset should represent a [specific scenario (e.g., a small online retail business or a personal budget)]. Include data spanning [time period, e.g., 12 months] with the following categories: [list categories (e.g., expenses, income, etc.)].

Prompt template: Adapt this [activity, resource, etc.] about [subject/topic] in a [type/level] course to be repurposed for an [new use case]: [insert text from original activity]

Prompt to create the table below: Write HTML code for a table I can embed into my course that has 4 rows and 4 columns. Use placeholder text that I can edit as needed (for example, “Column 1 Title,” “Column 1 Text”). Make the background color of the first row #ababab and the background color of the first column #d9d9d9.

The table below is the result of the code provided by Claude.

 | Column 1 Title | Column 2 Title | Column 3 Title
Row 1 Text | Row 1, Column 1 | Row 1, Column 2 | Row 1, Column 3
Row 2 Text | Row 2, Column 1 | Row 2, Column 2 | Row 2, Column 3
Row 3 Text | Row 3, Column 1 | Row 3, Column 2 | Row 3, Column 3
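As a rough sketch, here’s the kind of HTML a chatbot might return for that prompt. The placeholder text and hex colors come from the prompt itself; the exact markup will vary by chatbot and by run:

```html
<!-- 4x4 table: first row shaded #ababab, first column shaded #d9d9d9 -->
<table border="1" cellpadding="8" style="border-collapse: collapse;">
  <tr style="background-color: #ababab;">
    <td style="background-color: #d9d9d9;"></td>
    <td>Column 1 Title</td>
    <td>Column 2 Title</td>
    <td>Column 3 Title</td>
  </tr>
  <tr>
    <td style="background-color: #d9d9d9;">Row 1 Text</td>
    <td>Row 1, Column 1</td>
    <td>Row 1, Column 2</td>
    <td>Row 1, Column 3</td>
  </tr>
  <tr>
    <td style="background-color: #d9d9d9;">Row 2 Text</td>
    <td>Row 2, Column 1</td>
    <td>Row 2, Column 2</td>
    <td>Row 2, Column 3</td>
  </tr>
  <tr>
    <td style="background-color: #d9d9d9;">Row 3 Text</td>
    <td>Row 3, Column 1</td>
    <td>Row 3, Column 2</td>
    <td>Row 3, Column 3</td>
  </tr>
</table>
```

Swap in your own labels before embedding it, and paste the markup into your LMS page’s HTML (source code) view rather than the rich-text editor so the styling isn’t stripped.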

Learn more about AI prompting techniques and find more templates and examples.

2. Improving accessibility

AI has the potential to make education more accessible and inclusive in some unexpected ways, like recognizing accents, localizing content, and detecting emotions. On top of that, it can automatically check for compliance and make changes to meet accessibility standards—almost like a set-it-and-forget-it solution for specific areas.

Languages

AI is making communication smoother, not just with basic translation tools but also through localization, accent recognition, and name pronunciation. Here are some AI tools to consider, how they work, and why they’re important.

Accent recognition AI does exactly what you’d expect—it recognizes and understands accents. It’s like Siri, but more fluent (and cultured?).

Why does that matter for online learning? Think about students who rely on speech-to-text because using a keyboard isn’t an option. Or a faculty member with an accent whose live lecture captions are filled with errors. It’s a frustrating and limiting experience.

Accent recognition AI can help fix this by making captions and transcripts more accurate and voice-controlled assistive technology more usable.

Check out these accent recognition tools: Lokalise, Smartling, Fluent AI

Did you know there are around 160 different English dialects? Some have only slight variations, while others differ significantly in pronunciation, grammar, spelling, and even word meanings.

For example, what Americans and Canadians call french fries or fries, Brits and Australians call chips. And Americans write center and localize, while Canadians, Brits, and Australians often use centre and localise.

Most of the spelling differences don’t affect understanding. But it can be confusing when words have entirely different meanings. What if a term with different meanings were used on an exam or assignment and the student didn’t know?

Localization AI helps by adapting your content to specific dialects and cultures. You can also use chatbots to help localize your content and review it for language that may be confusing for some people.

AI sign language translation software can translate speech into sign language in real time.

The AI chooses from a huge collection of real videos of people signing to create smooth and accurate translations. Some solutions integrate into your websites and can be added to videos or other content in a simple picture-in-picture format.

Check out Signapse AI and Slait AI. They provide real videos of people signing, unlike previously existing tools that provide cartoon-like avatars.

If your name is even remotely unique, you’ve probably heard it mispronounced. The really unfortunate part is that rather than correcting people, some just accept it and answer to all kinds of mispronunciations.

That’s where name pronunciation AI like Namecoach comes into the picture. Namecoach integrates with the LMS, SIS, and most other platforms and browsers.

Students create a voice recording of their names or use audio databases for accurate pronunciation. These recordings and phonetic spellings are accessible to instructors and other students so that whether it’s a virtual event or support call, names are pronounced correctly.

Adapting content to support students with learning disabilities and diverse learning needs

Chatbots can assist faculty with adapting content to better support students with learning disabilities and diverse learning needs.

For example, faculty can ask the chatbot to adjust exam questions to meet a wider range of cognitive abilities and language comprehension levels. Chatbots can further support accessibility by creating descriptive elements for images and completing other accessibility-related tasks.

Below are templated prompt examples.

Shorter prompt: Please provide alternative text (125 words or less) and an image description for this photo: [attach photo]

Detailed prompt: I’m using this image on a [course page, email, etc.] about [subject/topic]. Please generate: descriptive alt text (125 words or less); detailed image description; suggested file name; and caption for this photo: [attach photo]

You can also try out this step-by-step alt-text creator.

Prompt example: Rewrite these [object (e.g., instructions, questions, etc.)] to make them clearer and more accessible for students with dyslexia. Simplify the language by using shorter sentences, straightforward vocabulary, and decodable words where possible. Break the text into smaller sections, use bullet points or numbered lists, and highlight key terms or steps. Keep the instructions concise and avoid unnecessary jargon. Here are the instructions: [add your instructions]

Example using the prompt:

Original instructions: Write a 500 word essay analyzing the themes in the assigned text. Your essay must include an introduction with a thesis statement, at least three body paragraphs with textual evidence, and a conclusion that summarizes your argument. Cite all sources using APA formatting. Submit your essay in a Word document via the course portal by Friday at 11:59 PM. Late submissions will incur a penalty of 10% per day.

Revised version for students with dyslexia (Note: This is a generic example. You’ll need to adapt the prompt based on your students and the specific activity and provide any necessary context.)

Write a 500-word essay about the themes in the assigned text. Follow these steps:

  • Introduction: Start with a short paragraph that states your main idea (thesis statement).
  • Body Paragraphs: Write at least three paragraphs. Use examples from the text to support your ideas.
  • Conclusion: End with a short paragraph that sums up your main points.

Remember:

  • Use APA format to include references for any sources you use.
  • Check your spelling and grammar before submitting.

How to Submit:

  • Save your essay as a Word document.
  • Upload it to the course portal by Friday at 11:59 PM.

Important: If you turn it in late, 10% will be taken off for each day it’s late.

Emotion detection AI

Emotion AI detects and interprets emotions by analyzing text, voice (audio), and video (facial expressions and gestures) and associating specific components with related emotions.

This technology can help:

  • Students with cognitive and/or physical disabilities communicate
  • Detect signs of stress, confusion, frustration, happiness, etc.
  • Understand which course activities students are most interested or uninterested in

The images below show how video emotion AI could work to give instructors insight into learner engagement during specific activities.

  • High Attention: Learner attention increases with eye contact.
  • Low Attention: Attention drops when they look away.

While eye contact doesn’t always mean focus, and looking away doesn’t mean they’re not listening, combining this with other data—like duration, facial expressions, and body language—gives a clearer picture of behavior. Educators can use this to identify course activities that increase engagement, create confusion and frustration, and more.


But these tools aren’t perfect, and they probably never will be. That’s because of how they work. With facial expression recognition, for example—the AI looks at things like raised eyebrows and lip movements and tries to match them to emotions it’s been trained to recognize. In reality, it’s just grouping facial features into preset emotion categories, not actually figuring out the person’s true emotional state. Plus, we all express emotion differently, and some emotions are expressed in pretty similar ways—how easy would it be for happiness to be misinterpreted as surprise?

AI accessibility checkers

AccessiBe and Equally AI are two AI-powered tools that help institutions, faculty, and staff check their websites, courses, and content for accessibility. These solutions continuously monitor resources and automatically make adjustments to improve accessibility and meet compliance standards.

This isn’t AI, but Accessibility Desk provides a hub of free accessibility tools to check websites and content for accessibility.

Learn more about web accessibility compliance standards and guidelines.

3. AI will incrementally become a staple for studying

AI can be the ultimate study partner because of its diverse capabilities and interactivity. Once students provide it with text, notes, files, or images, here’s how they use AI to study smarter.

Creating study guides​ and outlines

Students can create study guides and outlines with either a chatbot or NotebookLM; here’s how each works.

Chatbots

Example prompts: Based on the sources I’ve provided, create a detailed [outline or study guide] for [sections/chapters, e.g., chapters 2-4]. [Add follow-up text from the examples below].

Organize the outline using [preferred format e.g., headers, bullet points, etc.]. Focus on [specific information/goals (e.g., identifying connections between X and Y, comparing and contrasting X and Y)].

Include the following information for each chapter: a high-level summary paragraph, a section explaining key concepts and why they’re important, a bulleted list of terms and definitions that are easy to understand, and a list of potential exam questions.

NotebookLM

With NotebookLM, students can upload PDFs, websites, YouTube videos, audio files, Google Docs, or Slides. NotebookLM quickly turns these into detailed outlines, study guides, FAQs, or even conversational “podcasts” using all or some of the uploaded sources.

NotebookLM interface for students creating study guides

Students upload the sources here, which can range from PDFs and websites to videos and audio files. We picked a video, Google Doc, website article, and three PDFs.

Students can ask the AI questions based on the sources. The AI provides answers based on the sources and cites which source(s) the information is from.

Always check the accuracy of anything AI generates. This is their disclaimer: "NotebookLM can be inaccurate, please double check its responses."

An audio overview can be created where two AI voices discuss the sources in a podcast-like format that actually sounds realistic.

Students simply click the buttons below for comprehensive study guides, briefing docs, FAQs, and timelines.


Flashcards

Here’s an example prompt students can use to engage in an interactive flashcards activity with a chatbot: Create 10 flashcards to test my knowledge of the different parts of a cell based on their function. Present the “front” of the flashcard first, then I’ll answer, and then you present the “back” of the card so I know if I’m correct or not. If my answer is incorrect, please provide the correct answer and 1-2 sentences of additional context.

Summarize, simplify, paraphrase, or rephrase content

Whether students provide an entire study or a paragraph of text, chatbots can help summarize and simplify the information. Here’s how a few chatbots responded to a basic prompt to simplify a few sentences of overly complicated text (text & prompt below).

Text provided to the chatbot*: “Educational technology influenced in many and often centrifugal ways educational innovation as part of societal development. Successive behaviorist, cognitive, constructivist, and socio-constructivist approaches to learning and the concomitant use of technologies suggest a clear, straightforward contribution to education based on the internal dynamics of that field.”

Prompt used: Shorten and simplify this text into two sentences that are very easy to understand. Use practical, simple language instead of complex terms.

*Text is from Lowyck, J. (2014). Bridging learning theories and technology-enhanced environments: A critical appraisal of its history. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (pp. 3–20). Springer New York. https://doi.org/10.1007/978-1-4614-3185-5_1 

Prompt response: Educational technology has helped education grow and adapt over time. Approaches like using rewards to encourage learning, understanding how people think, and learning through real-world experiences have guided how technology is used in teaching.

Prompt response: Educational technology has driven many changes and improvements in education as society develops. Different learning approaches and the use of technology have consistently made education better.

Prompt response: Technology has played a big role in how education has changed over time. Different ways of thinking about learning, like focusing on how people behave, how they think, how they learn by doing, and how they learn from others, have all been shaped by technology and have helped improve education.

Prompt response: Educational technology has helped drive changes in how we teach and learn as society has evolved over time. Different theories about how people learn have shaped how we use technology in education, from simple cause-and-effect approaches to more collaborative learning methods.

4. More faculty will proctor written assignments

With the right prompts, chatbots can generate human-like text on basically any topic from any perspective. The downside is that students can use them to complete written assignments. But the good news is that written assignments can be proctored just like online exams.

During proctored written assignments, students WON’T be able to do the following:

  • Use chatbot applications, websites, or browser extensions.
  • Visit unauthorized websites unless the instructor permits access to specific resources like links to specific studies or articles.
  • Refer to books or notes unless approved by the instructor.
  • Paste pre-copied content into the word processor.
  • Search for information on their cell phones.
  • Ask for help from voice assistants like Siri or Alexa.

5. More instructors will design assignments that involve AI

Some instructors will design assignments that actively incorporate chatbots. For example, students might be asked to:

  • Use a chatbot to generate an essay on a specific topic, then analyze its response by identifying errors, making corrections, and explaining their reasoning.
  • Compare responses from different chatbots on the same topic to evaluate their accuracy and identify inconsistencies.
  • Teach the AI about a concept by starting with a prompt like, “Pretend you don’t know much about [topic/subject] other than [specific aspect of the topic]. I’ll explain how it works. Ask me clarifying questions throughout.”

AI is slowly being embraced in higher education—sometimes as a helpful tool, other times as something students try to sneak past their professors. As schools and faculty rethink assignment and assessment strategies, they also need to consider the challenges around privacy, security, and academic integrity at every stage. But when used the right way, AI can make learning more effective, engaging, and fair.


How to Detect Cheating On Online Exams

Tips to prevent and detect cheating on online exams

It’s tough to detect and prevent cheating on exams, especially online. Students are smart, tech-savvy, and have plenty of tools that make it easier to cheat unless you have the right assessment strategies, proctoring technology, and a deeper understanding of why students cheat beyond obvious reasons like wanting better grades or thinking they can get away with it.

Why do college students cheat on exams?

College students often cheat on exams because they feel unprepared, want better grades, or just because they can get away with it. But beyond the obvious reasons, cheating is complex and has a lot of gray areas.

You can learn more about the real reasons students cheat, but here are three interesting ones:

Do attitudes and emotions impact cheating?

We like to think that when students strongly believe cheating is wrong, their moral compass will steer them toward academic integrity. But that usually isn't the case. Attitudes, morals, and values influence intentions to cheat, but ultimately they don't reduce cheating behavior (Ababneh et al., 2022; Johnson-Clements et al., 2024; Kasler et al., 2023).

It’s kind of like how we know that driving above the speed limit is illegal, so we don’t intend to speed to the airport… But if you’re running late for a flight, you’ll probably put the pedal to the metal even though you know it’s wrong.

Negative emotions, like anxiety, increase cheating, especially when students expect to perform poorly or aren't prepared (Eshet et al., 2024; Salgado et al., 2022).

Plagiarism is confusing

Students generally understand what plagiarism is, but there's still confusion about how to cite properly and what information needs to be cited (Larkin & Mintu-Wimsatt, 2015).

Some believe using internet content without citing it is acceptable because it’s publicly available. They also struggle with understanding what qualifies as “common knowledge.” Yale defines common knowledge as knowledge that most educated people already know or can find in an encyclopedia or dictionary. Obviously, there’s a ton of information in those two sources, and Yale acknowledges that the definition of common knowledge is ambiguous and explains that it changes based on the audience.

So, is it still cheating if students genuinely don’t know how or what to cite?

Regardless of whether it’s intentional behavior or an innocent mistake, it needs to be addressed.


They believe other students are cheating and faculty don't care

When students know their peers are cheating, it begins to feel like normal, acceptable behavior, which makes them more likely to cheat and makes misconduct easier to justify (Bath et al., 2014; O'Rourke et al., 2010; Tatum, 2017). The issue is amplified when faculty don't address cheating, and students begin to think that they don't care or won't do anything about it.

Faculty often avoid addressing and reporting cheating for several reasons (MacLeod & Eaton, 2020; Staats et al., 2009):

  • Reporting cheating is time-consuming, and faculty often doubt their institution handles misconduct consistently or effectively.
  • Evidence of cheating often isn't conclusive enough.
  • Accusing a student of cheating is stressful for both faculty and students.
  • They worry about the situation escalating, such as the student denying the accusation or the risk of legal repercussions.

Using assessment strategies and remote proctoring to prevent cheating on online exams

Protect test content from being leaked online

With forums like Reddit and websites like Chegg and Quizlet posing as “homework help,” it’s easy and common for your test content to be leaked online.

One university found that 56% of their exam content was leaked on Chegg. Engineers at the university said, “Alarmingly, we found over half of the audited units had cheating content on Chegg, [which] is broadly used to cheat and 50% of questions asked on Chegg are answered within 1.5 h[ours]. This makes Chegg an appealing tool for academic misconduct in both assignment tasks and online exams.”

Instead of manually searching for individual questions, Search & Destroy™ automatically scans the web for your leaked test content in a few minutes. If it finds any, it shows you where and allows you to send one-click takedown requests. Easy peasy.

Set strict time limits

Unless students require accommodations for additional time, setting strict time limits can help reduce cheating. Several studies found that students take about twice as long to complete unproctored exams compared to students taking proctored exams—which may be because they spend more time searching for answers online during unproctored tests (Alessio et al., 2017; Howard, 2020; Zhang, 2024).

Detect cell phones, smartwatches, and other secondary devices

Cell phones are a huge threat to exam integrity. In fact, they're the most common way students try to cheat on proctored exams.

Most proctoring services hope a live proctor, who usually watches 10-12 exams at once, will catch a student using a phone in real-time, which isn’t realistic. However, some services take a different approach that uses AI to detect phones and other devices, with a live proctor reviewing it once it’s flagged.

Stop students from using AI during assignments and exams

Students can use AI chatbots like ChatGPT, Claude, and Microsoft Copilot to write about pretty much anything. Unfortunately, using AI detection software won’t really help because once a student changes a few words in the AI-generated text or uses a paraphrasing tool, AI detection is ineffective and inaccurate.

However, there are a few proctoring tools that work together to block unauthorized AI during exams and assignments.

In addition to using proctoring to block AI chatbots, you can also use:

  • Authentic assessments for students to prove they can apply what they’ve learned to complete projects and tasks in real-world situations.

For example, instead of asking nursing students to answer a series of multiple-choice questions, an authentic assessment may ask them to record a short video where they identify the parts of a stethoscope and demonstrate how to use it while explaining what they’re doing.

  • Assignments tied directly to in-class content, such as creating a short essay that compares and contrasts two peer discussion posts or a personal reflection on the results of an in-class survey.

Example assignment connected to class: Select two peer responses from last week’s discussion posts, each focusing on a different learning theory. Write a 2-page comparison of their similarities and differences, then create a simple visual (mind map, concept map, or infographic) to show how your own response aligns with or differs from the two you chose.

  • Scaffolded assignments that break larger assignments into a series of smaller ones. For example, replace a single, long-form research paper with smaller assignments like brainstorming topics, submitting a proposal and outline, and writing a draft for peer review before writing the final research paper. You can also add a reflective component and ask students to submit each step in a different format, such as a mind map for brainstorming, a concept map for the proposal/outline, and a voice-over presentation for the reflection.

Help reduce students’ test anxiety

Tests are naturally stressful for most people, and they can increase anxiety. And when anxiety increases, some students are more likely to cheat (Eshet et al., 2024).

Practical tips for faculty to help reduce student test anxiety:

  • Clearly explain what the test covers, question types, time limit, grading criteria, etc.
  • Show how to use any technologies (include a demonstration if possible).
  • Provide practice exams to test technology and devices.
  • Review the rules and answer any questions.
  • Share technical support contact information and details on how to access it.
  • Explain available accommodations and how to request them.

Monitor behavior from all angles

Another thing to consider is how you’ll monitor student behavior during online exams, such as:

  • Video monitoring, which can include a second/side camera, so you can see if they’re trying to look at their notes and use other resources during the test.
  • Smart Speech detection (not sound detection) so you can tell if they’re talking to someone else in the room or asking Siri for help, but not have to review flags for unimportant noises like coughing or a doorbell ringing.
  • Locking the browser and recording their screen to see if they try to access other websites or attempt to copy and paste questions into a document.

Verify that it’s the right person taking the test

Online proctoring systems can help verify identity by capturing a photo of both the test taker and their ID. This way, you'll know that the person taking the proctored exam is the same person getting credit for the online course.

Ababneh, K. I., Ahmed, K., & Dedousis, E. (2022). Predictors of cheating in online exams among business students during the Covid pandemic: Testing the theory of planned behavior. The International Journal of Management Education, 20(3), 100713. https://doi.org/10.1016/j.ijme.2022.100713

Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the Effect of Proctoring on Online Test Scores. Online Learning, 21(1). https://doi.org/10.24059/olj.v21i1.885

Bath, M., Hovde, P., George, E., Schulz, K., Larson, E., & Brunvatne, E. (2014). Academic integrity and community ties at a small, religious-affiliated liberal arts college. International Journal for Educational Integrity, 10(2), 31–43. https://doi.org/10.21913/IJEI.v10i2.1005

Eshet, Y., Grinautsky, K., & Steinberger, P. (2024). To behave or not (un)ethically? The meditative effect of mindfulness on statistics anxiety and academic dishonesty moderated by risk aversion. International Journal for Educational Integrity, 20(1), 6–18. https://doi.org/10.1007/s40979-024-00151-w

Howard, D. (2020). Comparison of exam scores and time taken on exams between proctored on-campus and unproctored online students. Online Learning, 24(4), 204–228. https://doi.org/10.24059/olj.v24i4.2148

Johnson-Clements, T. P., Curtis, G. J., & Clare, J. (2024). Testing a Psychological Model of Post-Pandemic Academic Cheating. Journal of Academic Ethics. https://doi.org/10.1007/s10805-024-09561-4

Kasler, J., Sharabi-Nov, A., Shinwell, E. S., & Hen, M. (2023). Who cheats? Do prosocial values make a difference? International Journal for Educational Integrity, 19(1), 6–16. https://doi.org/10.1007/s40979-023-00128-1

Larkin, C., & Mintu-Wimsatt, A. (2015). Comparing cheating behaviors among graduate and undergraduate online business students. Journal of Higher Education Theory and Practice, 15(7), 54–62.

MacLeod, P. D., & Eaton, S. E. (2020). The Paradox of Faculty Attitudes toward Student Violations of Academic Integrity. Journal of Academic Ethics, 18(4), 347–362. https://doi.org/10.1007/s10805-020-09363-4

O’Rourke, J., Barnes, J., Deaton, A., Fulks, K., Ryan, K., & Rettinger, D. A. (2010). Imitation Is the Sincerest Form of Cheating: The Influence of Direct Knowledge and Attitudes on Academic Dishonesty. Ethics & Behavior, 20(1), 47–64. https://doi.org/10.1080/10508420903482616

Salgado, J. F., Cuadrado, D., & Moscoso, S. (2022). Counterproductive Academic Behaviors and Academic Performance: A Meta-Analysis and a Path Analysis Model. Frontiers in Psychology, 13, 893775. https://doi.org/10.3389/fpsyg.2022.893775

Staats, S., Hupp, J. M., Wallace, H., & Gresley, J. (2009). Heroes Don’t Cheat: An Examination of Academic Dishonesty and Students’ Views on Why Professors Don’t Report Cheating. Ethics & Behavior, 19(3), 171–183. https://doi.org/10.1080/10508420802623716

Tatum, H., & Schwartz, B. M. (2017). Honor Codes: Evidence Based Strategies for Improving Academic Integrity. Theory into Practice, 56(2), 129–135. https://doi.org/10.1080/00405841.2017.1308175

Zhang, N., Larose, J., & Franklin, M. (2024). Effect of unproctored versus proctored examinations on student performance and long-term retention of knowledge. The Journal of Chiropractic Education. https://doi.org/10.7899/JCE-23-16