AI can be friend, foe, neither, either, or both, depending on user and use. While some instructors are alarmed by its potential to hinder learning and facilitate cheating, others are excited by its ability to provide useful information, help generate ideas, and improve efficiency. Students might have reservations about using AI, but they may also be concerned that not understanding or using it could adversely affect their employment prospects.
The following are general recommendations for using AI in the classroom, examples of discipline- and course-specific assignments and exercises administered by Rhodes instructors, and links to additional resources. There are many guides, sites, and books available on both teaching AI and teaching with AI; the tips and links below are offered only as examples.
General Recommendations
- Learn the Basics: familiarize yourself with the fundamentals of the AI tools you plan to use before committing to an approach for your course. Consider consulting an AI primer (see additional resources below), then experiment with exercises or assignments using different platforms or programs.
- Policy: discuss your course AI policy on the first day of classes. Some instructors choose to devise policy in consultation with students; you might even use GAI to help write and refine it as a class. In any case, include the finished policy on your syllabus.
- Trust, but Verify: LLMs and other tools don't lie, but only because they don't know when they're giving you inaccurate or misleading information. Remain cognizant of the knowledge gaps between you and your students; you might see problems in AI outputs that they don't (and vice versa). Emphasize maintaining a reasonably skeptical stance even (or especially) when outputs seem consistently reliable.
- Data Safety First: never enter personal or identifying student information into any GAI--that violates FERPA and college policy.
Policy Shop
Faculty from the Computer Science department received a Hill Grant to devise a set of policies for AI usage in their area; the following are general policies adapted from their work. Consider which might be best applied or adapted to your specific courses or assignments, and be sure to include them on syllabuses, rubrics, and prompts.
- No AI Use Permitted
Students are strictly prohibited from using any AI tools or resources for any aspect of assignments, projects, exams, or other course-related work. This includes, but is not limited to, content creation, idea generation, analysis, problem solving, or information retrieval.
- AI Use for Learning and Exploration Only (No Submission)
Students may only use AI tools as a tutor: to ask for explanations of concepts, methods, and terminology. AI may not be used to generate any content for submission. (This prohibition also applies to the submission of paraphrased or altered AI-generated content.) All submitted work must be produced entirely by the student without copying from AI.
- AI Use for Specific Approved Tasks (Limited Use)
AI tools are permitted for specific, clearly defined tasks within assignments, provided that the AI's contribution is explicitly cited and the student demonstrates a clear understanding of the AI-generated content.
- AI Use with Citation (Comprehensive Use)
Students may use AI tools as a substantial aid in completing assignments. All AI assistance must be clearly cited and its specific details documented, and the student must demonstrate a clear understanding of the AI-generated content.
- Unrestricted AI Use (Exploratory/Research Contexts)
Students are free to use AI tools as they see fit to complete assignments and projects, with minimal or no restrictions on their application. The focus is on the outcome and the student's ability to achieve objectives. However, citation is still encouraged.
Classroom Exercises
- Turing Test: run Turing's Imitation Game in class to demonstrate the features of LLMs and discuss the importance of tone, style, and developing an individual authorial voice
- Sandbox: give students unstructured time to experiment with GAIs and report/discuss their experience
- Split Screen: give identical prompts to different GAIs (ChatGPT, Claude, Perplexity, etc.) and compare outputs
- Quality Control: generate text, images, or video with GAI and ask students to evaluate its accuracy--what did it get right/wrong?
- Two Truths and an AI: an LLM-powered version of the social ice-breaker. Generate (or have students generate) one inaccurate and two accurate outputs about a subject relevant to the course (or themselves) and ask the class or group to identify the inaccurate output. This demonstrates the uniform and often convincing confidence with which LLMs deliver both accurate and inaccurate information.
Additional Resources
- Jisc AI maturity toolkit for tertiary education
- Stanford AI Lab Blog
- MIT Raise Initiative
- Sample collection of AI Policies for different classes/disciplines
- NSW CESE Teacher's Prompt Guide to ChatGPT
- AI Alliance Guide to Essential Competencies for AI
- Georgetown Center for New Designs in Learning and Scholarship AI Toolkit