AI Mitigation Resources

While the unauthorized use of various tools and resources in academic work is far from new, GAI has made it easier for students to take shortcuts and harder for instructors to prove it. The integration of AI into widely used programs, from Grammarly and Google Docs to Microsoft Word and Adobe Acrobat, has made the technology ubiquitous and difficult to opt out of; students may not always realize when they are using it, and not all uses constitute violations of academic integrity.

As campus-wide bans are technically impracticable (and would be contrary to academic freedom and classroom autonomy), educators have had to adapt quickly and often experimentally to mitigate the unauthorized use of GAI. The efficacy and applicability of any given approach depends to some extent on disciplinary distinctions, course goals, available support, and pedagogical style. The following represent a brief selection of tactics and strategies that will continue to change as the technology, students, and institutions of higher education continue to evolve. Few if any are absolutely AI-proof.

General Recommendations

  • explain to students your position on GAI. What is the class trying to accomplish, and how or why would using GAI interfere with or contradict that purpose, in the near and/or long term?
  • emphasize process rather than (or as well as) product
  • include specific language regarding GAI on your syllabus, rubrics, exams, and assignments
  • periodically collect examples of in-class writing as baselines of student facility with rhetoric/composition and subject knowledge
  • be honest and direct about the broader implications of using or relying on GAI: ethical, environmental, legal, political, cognitive, economic, etc.

Policy Shop

Faculty from the Computer Science department received a Hill Grant to devise a set of policies for AI usage in their area; the following are general policies adapted from their work. Consider which might be best applied or adapted to your specific courses or assignments, and be sure to include them on syllabi, rubrics, and prompts. We highly recommend that you incorporate specific examples from your course or discipline alongside the language below.

  1.  No AI Use Permitted
    Students are strictly prohibited from using any AI tools or resources for any aspect of assignments, projects, exams, or other course-related work. This includes, but is not limited to, content creation, idea generation, analysis, problem solving, or information retrieval.
  2. AI Use for Learning and Exploration Only (No Submission)
    Students may use AI tools only as a tutor: to ask for explanations of concepts, methods, and terminology. AI may not be used to generate any content used for submission. (This prohibition also applies to the submission of paraphrased or altered AI-generated content.) All submitted work must be produced entirely by the student without copying from AI.
  3. AI Use for Specific Approved Tasks (Limited Use)
    AI tools are permitted for specific, clearly defined tasks within assignments, provided that the AI's contribution is explicitly cited and the student demonstrates a clear understanding of the AI-generated content.
  4. AI Use with Citation (Comprehensive Use)
    Students may use AI tools as a substantial aid in completing assignments. All AI assistance must be clearly cited, specific details documented, and the student must demonstrate a clear understanding of the AI-generated content.
  5. Unrestricted AI Use (Exploratory/Research Contexts)
    Students are free to use AI tools as they see fit to complete assignments and projects, with minimal or no restrictions on their application. The focus is on the outcome and the student's ability to achieve objectives. However, citation is encouraged.

Mitigation: No-Tech/Low-Tech

Some of us are old enough to remember when the internet didn’t exist, and you could have a personal computer in any color you wanted so long as you wanted beige. True classics, though, never go out of style, and several tools and techniques from analog times are again finding favor in the classroom.

  • Blue Books: don't call it a comeback; they've been here for years, and they are available for purchase at the Campus Bookstore. You can instruct students to purchase them as needed ($0.55 apiece), and/or request that your department order a communal supply.
  • Course Readers: it's not just the writing we have to worry about. The NAEP report card for 2024 indicates that 12th graders’ reading scores have dropped to their lowest level in more than 20 years. CliffsNotes and the like have long offered summaries and study guides, but Adobe products now feature built-in AI that prompts users to request summaries of PDFs. Hard copies won't prevent students from asking AIs for the short, short version, but they make it a little less convenient. Instructors can order course packets during the adoption period by emailing a PDF copy to the Campus Bookstore's Follett contact person (haneyj@rhodes.edu or j.haney@follett.com) and specifying print preferences (e.g., double sided, 3 hole-punch, or spiral bound). Be sure to plan ahead for course packets, as it takes time to get permissions for course materials (and we want to be ethical in our usage of intellectual property).
  • Photocopying: the faculty handbook specifies page and content limits for Photocopying for Academic Use. Please bear these limits in mind when printing materials for courses, if you choose not to order a course reader through the Campus Bookstore.
  • Project Portfolios: requiring students to create, retain, and turn in evidence of intellectual process in the form of low-stakes scaffolding assignments (notes, outlines, statements, drafts, etc.) can increase personal investment in scholarly and creative work and provide samples of authorial style and mechanics against which to measure final versions.  
  • Oral Assessments: Plato and Socrates were fans, and once upon a time all the exams at Oxford and Cambridge were viva voce—"by the living voice.” Though oral exams are harder to circumvent with AI, instructors may find them less effective for demonstrating depth of knowledge, and students may find them more anxiety-provoking. The McGraw Center for Teaching and Learning at Princeton has useful guidelines for devising and administering oral exams.

Mitigation: Digital/High-Tech

There are online and digital tools available for AI resistance and mitigation, but using them can feel like fighting fire with weaker, more cumbersome fire, and they aren't necessarily supported by Rhodes' IT department or compatible with programs like Outlook and Canvas.

  • LockDown Browser: this Respondus product is meant to prevent students taking a quiz or exam within a Learning Management System (LMS) from "switching applications, capturing screen content, or going elsewhere on the internet (unless the instructor allows it)." The Respondus Monitor add-on operates as "a fully-automated proctoring tool" for use outside the classroom. Neither, however, can prevent students working at home from accessing the internet on additional devices. The browser is compatible with Canvas, but Rhodes has found that it is easy for students to work around and has disabled the add-on capability. 30-day free trials are available. (Google Forms has a "locked mode" that operates in a similar way, but it requires all students to have a school-managed Chromebook.)
  • Revision History: this Chrome Browser extension allows users to review the writing and editing process of documents composed in Google Docs. The extension tracks editing sessions, time spent writing, and revision patterns; highlights cut/pasted material; and provides a video replay of the entire writing process. The free version caps the number of documents you can review per month; unlimited document analysis requires a paid subscription.
  • LLM/AI Detection Programs: numerous sites and platforms offer what they market as AI detection tools. GPTZero and Copyleaks are stand-alone ventures while sites including Turnitin, Grammarly, and Quillbot have AI detection features built in (ironically, some of these sites may also be used to generate the content they promise to detect). Pangram is a system that claims significantly greater accuracy than other models. The accuracy of AI detectors is generally inconsistent, and other sites offer “humanization” algorithms designed to defeat them (though their efficacy is also questionable). False positives and negatives, as well as indeterminate results, are likely; results from AI detection should be used only in conjunction with other forms of evidence or mitigation tactics.

The Letter and the Spirit of the Law

If you believe a student has used AI and violated class or college policy, you can:

  • Resolve the issue independently and submit a report to the Associate Dean of Residential Experience & Community that explains the case and states that no further action is required
  • Submit a report to the Associate Dean of Residential Experience & Community requesting the case be referred to the Honor Council  

Please do not simply address the issue without a report; those reports are reviewed and become useful for cases of repeat offenses. When filing your report, please try to be thorough: the more information, the better the response.

Additional Resources