AI Resources for Teaching and Learning

Artificial intelligence is no longer a futuristic concept—it's a practical tool that is reshaping the future of education. For educators, AI can automate administrative tasks, offer new ways to create content, and provide insights into student progress. For students, it acts as a personalized tutor, an instant study aid, and a tool for deeper inquiry. This resource hub is your starting point for exploring innovative AI tools that can enrich the teaching and learning journey.

AI's Impact on Teaching

AI’s impact on teaching will differ depending on factors like the field of study, classroom context, and disciplinary norms. Due to this, there isn’t a single way to address AI usage in the classroom or develop a comprehensive response for all courses.

Below, you will find some information on different ways to think about how AI might fit into your course.


Consider changes to your course policies and assessment practices.  

Include clear policies around AI writing on your syllabus. Tell students how (or if) they will be allowed to use generative AI and let them know why.

While there is no single way to completely eliminate the possibility of students using ChatGPT or other generative AI in coursework, a variety of strategies can help instructors adjust their course curricula to adapt to generative AI. Generative AI detection tools, like Turnitin's, can produce both false positives and false negatives, so instructors should not rely too heavily on these platforms when determining whether students are using AI.

Generative AI encourages instructors to reevaluate their assessment practices. If possible, instructors should consider either changing assignments to help sidestep AI usage or integrating generative AI into their assignments. For example, turning to in-class exams or presentations can help ensure students are utilizing their own embodied knowledge when completing assignments. While students could potentially use generative AI for study strategies or topic brainstorming, they will not have access to those tools at the time of the assessment.

Instructors should ask: is the writing portion of this assignment simply a way for students to demonstrate content knowledge, or is the writing process itself important to the course objectives? If the writing is simply a way to display knowledge, try having students record a short video instead. If the writing process is important to your course goals, consider having students incorporate generative AI.

Engaging in an open conversation with students about the advantages, disadvantages, and ethical implications of generative AI tools is one of the best ways to prevent misuse. For example, students might not realize that large language models can generate incorrect information, so relying on them to learn new material is risky. Students need to know that ethical AI usage involves practices like fact-checking the information from LLMs before incorporating it into their work.

Discussing why you are prohibiting students from using AI on specific assignments or in certain courses will also help students use these tools purposefully. If learning certain discipline-based skills without the aid of AI tools is important, let students know why. After students learn these skills without AI, consider showing them how AI can help complete the task more efficiently.

It is important to consider professional, departmental, and academic norms within your field when crafting an AI syllabus policy.

Because AI can be used in so many different ways in the classroom, there is no "one size fits all" syllabus statement for all courses. Syllabus statements on AI usage should be crafted with the specific goals and content of your course in mind. Given that, some suggestions:

  • You could completely ban AI usage in your course. Syllabus language outlining this may read something like, “Students are prohibited from using all forms of AI when completing work in this course. Any use of AI when completing coursework will be considered to be an instance of academic misconduct.”

  • Alternatively, you could allow students to use AI on specific assignments or as part of a larger lesson. In your syllabus, you might let students know by writing, “AI usage is generally prohibited in this course. However, you may use AI on assignments where it is explicitly permitted in the assignment description.”

  • Finally, you could allow AI usage as you would any other tool or source. In this case, students would be allowed to use AI in any assignment, but they would need to acknowledge how and why they used it. In this scenario, you may have syllabus language like, “You are allowed to use AI in this course. If you decide to use AI when completing an assignment, you will be required to cite the AI tools as you would any other source. Failure to reveal AI usage when completing an assignment will be considered to be an instance of academic misconduct.”

Workshops on AI & Teaching

UC regularly holds workshops on AI and teaching. To sign up for one of our generative AI workshops, please visit Faculty OneStop.

Recordings of these workshops are also available on CET&L’s Bearcats Landing and below.

Academic Integrity & AI

Near the top of the list of most educators’ questions about AI is, "How can I prevent students from using AI to cheat?"

To address this question, let’s break it down into three relevant questions:

  • Can I create an AI-proof assignment?

  • Can AI-generated writing be detected?

  • What should I do if I suspect a student is misusing AI?

Unfortunately, as with many academic integrity questions, there are no simple and concise answers. Instead, we must reflect on our individual expectations, context, and values to develop a well-defined policy for AI and academic integrity.

Creating a truly AI-proof assignment may not be feasible, especially given the rapid and impressive growth of AI tools. Our goal is not to create a barrier to learning but to encourage students to learn both with and without the help of AI tools. Therefore, we recommend creating authentic assessments, emphasizing process over product, and establishing guidelines that discourage accidental AI misuse.

  • Additional guidance for developing process-focused assessments can be found in the AI Assessment section of the Center for the Enhancement of Teaching & Learning's Bearcats Landing site (UC login may be required).  

  • Turnitin provides a useful checklist to help prevent AI misuse.

  • An AI misuse rubric can help guide the design of AI-aware assignments.

The research suggests that whether AI-generated writing can be detected is still up for debate. Some research has concluded that it is impossible to accurately detect AI-generated writing; OpenAI abandoned its own detection tool, saying it does not believe reliable detection is possible. Other research has highlighted the impressive results of some academic AI checkers, most notably Turnitin’s AI detection tool. Although there is uncertainty, we can draw two helpful conclusions from the growing body of research:

  1. AI detection tools designed for specific populations and types of writing can perform with high accuracy.

  2. Even if we cannot be 100% certain that text is AI-generated, we should approach potential academic integrity violations with the same care and through the same procedures as before.

If you suspect a student submitted AI-generated work without permission, it's important to follow the same procedures you would with other academic integrity violations. However, given the relative newness of these tools and the ongoing confusion and disagreement on their proper use, it's suggested to approach potential AI misuse with more intentionality and grace, fostering a culture of understanding and empathy. 

  1. Carefully analyze the student's work: what other irregularities, commonly the result of AI, become apparent? If using Turnitin, familiarize yourself with its AI detector using this guide.

  2. If AI misuse is suspected, we highly recommend reaching out to the student. It is prudent not to accuse the student of AI misuse but to inquire about the irregularities in the assignment. Here is a useful guide on how to approach students about potential AI misuse.

  3. If you feel that the infraction violates UC Conduct Policy (PDF), treat the academic misconduct as any other violation and follow college policies for reporting misconduct.

Ethical Considerations

Generative AI is a powerful tool, but it also raises important ethical considerations that should be addressed when using it in education.

It is important to discuss the ethics of incorporating AI-generated text into student writing. While the ethics of this may seem obvious to some instructors, students might view AI writing as just another technology tool. If your field, discipline, or department has determined that AI writing violates its ethical standards, it is important to let students know why and what ethical use of AI looks like.

For some disciplines, it can be helpful to turn to prominent journals or the preferred citation format in your field for help.

For example, both MLA and APA have guidelines for citing AI usage in academic work. By turning to these guidelines, instructors can show students that it is possible to use these AI tools in responsible ways.

Additionally, many journals across disciplines now have specific frameworks in place for submitting work created with the help of AI. If you are designing an AI policy for your course, consider using the journals in your field as guidance. It can also be helpful to have a discussion with students about how the journals in your field are addressing AI usage. This conversation can help students understand the specific AI ethics your field has developed.

Many AI models are trained on copyrighted texts without attribution. While some attempt to cite sources, the accuracy of these citations can vary widely.

Students may not realize how these models generate text, making it important to explain the process and its limitations.

AI output can reproduce errors or biases from its training data or user prompts, which may mislead students who lack the background knowledge to identify them.

Other Resources

Resources on UC websites (Faculty and staff UC login may be required).

Other resources in higher education.

Collaborate with Us

For all general questions, support requests, or feedback, please contact us via email or phone. We aim to respond to all inquiries within one business day.

Contact Form
