Teaching with Generative AI

Many SF State faculty are reimagining their classrooms to address new technologies: embedding AI literacy skills and knowledge into their curriculum, talking with students about how tools shape their academic work, and redesigning assignments to ensure students are prepared for the future. CEETL is pleased to support faculty innovation and to partner with faculty as we work through the implications of new technology for our students. As part of that support, we have developed what we are calling Three Laws of Curriculum in the Age of AI, a play on Isaac Asimov’s “Laws of Robotics,” written to ensure that humans remained in control of technology. 

Our Three Laws are not laws per se; they are a framework for addressing the use of AI tools in the curriculum at all levels, from the individual classroom to degree roadmaps for disciplinary programs, and from general education through graduate courses. The framework is designed to support curricular innovation as faculty incorporate emerging tools and technologies into their teaching.

The first law encourages faculty to teach students about generative AI in ways appropriate to their discipline and course level. Teaching about AI means teaching students how the technology works, its biases and limitations, and its potential harms. The second law encourages faculty to develop strategies for teaching students to work with AI tools where appropriate, both to enhance student learning and to prepare students with the AI skills and familiarity they may need in the future. The third law encourages faculty to protect student learning from AI shortcuts by redesigning assignments to make them AI-resistant where possible.

For more on the Three Laws framework, look for our forthcoming essay in Inside Higher Ed.

Below, we use this framework to highlight SF State faculty curricular innovation and share resources for further reading and exploration. 

 

Guiding principles about AI

  • At CEETL, we approach AI and emerging technologies as tools that may be used to enhance teaching and scholarship, and to increase equity, accessibility, and engagement for students. 
  • We center faculty’s expertise and students’ experiences; we recognize that disciplines may vary in their approaches to technology.
  • We center critical approaches to technology rooted in SF State’s shared principles of equity and social justice. 


Law One: Teaching Students About AI

"Let us not just employ AI in education, or conversely, completely write off the use of AI in the classroom, but consider its uses and limitations with intention and empathy." – Dr. Kira Donnell, JEDI Faculty Director at CEETL and Lecturer Faculty in the Department of Asian American Studies

Students need to build knowledge about AI, including how the tools work as well as their social, cultural, environmental, and labor impacts, potential biases, tendencies toward hallucinations and misinformation, and propensity to center Western European ways of knowing, reasoning, and writing. Teaching students about AI aligns with core equity values at our university, and our JEDI (justice, equity, diversity, inclusion) framework.

To take a JEDI approach to AI, it is important to first understand the ways that AI reinforces systemic biases and harms. This list introduces an array of ethical concerns relevant to higher education that AI scholars have identified.  We encourage you to partner with CEETL as you redesign assignments and activities to teach students about how AI tools work, and to educate them about the issues AI raises, from environmental justice to copyright violations to bias and hallucinations. 


Law Two: Teaching With AI

“Talking with my students helps me learn more about how they use AI tools to accomplish aspects of the writing process, from brainstorming to editing. I want to teach students how to use AI to support but not overtake their writing process -- to edit, for example, without losing their meaning or voice. It turns out that teaching students to edit with AI requires teaching them to critically read, consider their audience and purpose, and compare the rhetorical impact of different drafts -- skills my curriculum was designed to teach in the first place.” -- Jennifer Trainor, English Professor and CEETL Faculty Director


What do students need to know to work ethically and equitably with AI as these tools become increasingly embedded in the platforms and programs we already use, as they are marketed aggressively, and as they are integrated into the jobs and careers our students hope to enter? As Kathleen Landy recently wrote, now is the time to move toward "…collaborative articulation of learning outcomes relative to AI." What do we want the students in our academic programs to be able to do with generative AI? What AI skills do they need to be successful in our classes and beyond? This overview of AI classroom innovations across the CSU provides a sense of how our curriculum may evolve in the age of AI.

Faculty at SF State are exploring ways to use AI to support existing learning goals, such as the development of critical and analytic thinking and rhetorical awareness. 

For example, faculty have created lessons that:

  • Ask students to use Microsoft Copilot to generate discussion questions, then rank and rewrite the questions as needed, to develop analytic thinking;
  • Ask students to debate Microsoft Copilot to develop critical thinking;
  • Engage students in AI-generated scenarios to build engagement and critical thinking;
  • Ask students to evaluate AI-generated solutions to a case study to support critical thinking;
  • Ask students to compare two AI-generated essays, analyzing the strengths and weaknesses of each, to develop rhetorical awareness.

Faculty are also flipping the classroom with the aid of AI (see Jason Johnston's work on AI Assignment Flipping). For more, consider this crowdsourced slide deck showing how teachers around the world use AI with their students.

Students are often early adopters of new technology. Your students may already be using AI tools to support them in a variety of academic and non-academic tasks. To hear how students are using AI tools, view the Fall 23 Workshop on AI Tools for Students recording.  

To see how SF State faculty are innovating with and exploring AI in their classrooms, check out these recordings from SF State's Symposium on AI, hosted by CEETL in Spring 2024.

Each recording features an SF State faculty member who shared how they use AI in their course, as well as lessons learned.

Faculty are also exploring the potential for AI to assist them in curricular design. CEETL’s Fall 23 discussion circle focused on exploring AI tools for faculty use.  Participants shared how they used Generative AI to draft outlines for lessons, generate thank-you notes or recommendation letters, and summarize complex information.

Takeaways included: 

  • AI is a Starting Point: Faculty found ChatGPT useful for breaking creative blocks and drafting outlines.
  • Customization is Essential: AI-generated content lacks personal voice and context, and AI is prone to hallucinations, requiring instructors to check for misinformation and personalize to their course and students. 
  • Privacy Concerns: Participants raised concerns about inputting personal information or student data into AI systems.
  • Learning vs. Efficiency: Faculty worried that over-reliance on AI might undermine the learning process by prioritizing fast outcomes over deeper engagement with material.
  • Ethical Guidelines Needed: Faculty emphasized the need for clear ethical guidelines in AI use, including ensuring students can opt out of AI-based assignments and acknowledging AI contributions in any scholarly or educational context.

CEETL's Fall 24 workshop on "AI Tools for Faculty" looks at the benefits and limitations of using Copilot to enhance teaching materials and increase student engagement. Facilitator Jennifer Trainor shared guiding principles, demonstrated how to use and critique AI output in course materials, and led a discussion in which faculty shared how they use AI in their teaching.


Law Three: Protecting Student Learning

“Just because AI can be used for an assignment or in a course does not mean that it should be. We have to consider whether the use of AI supports existing learning goals or merely provides an easy shortcut around the intentional difficulty we know is at the heart of learning. As a writing teacher, I have to ask: do these tools support the development of students’ rhetorical skills, their sense of agency and confidence in their own voice? These considerations are essential as AI has the potential to harm the development of foundational skills and learning processes.” --Jennifer Trainor, English Professor and CEETL Faculty Director

Protecting student learning from AI shortcuts is an essential challenge. We like this quote from Washington University’s Center for Teaching and Learning: “Sometimes students must first learn the basics of a field in order to achieve long-term success, even if they might later use shortcuts when working on more advanced material. We still teach basic mathematics to children, for example, even though as adults, we all have access to a calculator on our smartphones. GenAI can also produce “hallucinations” and often only a user who understands the fundamental concepts at play can recognize this when it happens. … Sometimes, the end product isn’t even the focus in education per se. Many assignments, for example, are much more about learning how to think and to make arguments than about simply creating a block of text on a particular topic. Like a runner training for a marathon, the point isn’t to just get from point A to point B, but to transform oneself along the way. A car might be faster, but it won’t build the muscle and endurance we seek. In these cases, it may make sense to adopt some teaching strategies that avoid AI and assignments that are more AI resistant.” 

AI tools can produce authoritative-sounding outputs, and because these outputs sound so plausible, students can be convinced by them, letting the output override or displace their own thinking. AI tools may also curtail opportunities for students to develop and practice the kinds of thinking that undergird many learning goals. It is understandable that faculty are clamoring for AI detectors and prohibitive AI policies. But a focus on protecting student learning shifts us away from policing student behavior as a way to ensure learning, and instead invites faculty to think about how they might redesign assignments to provide spaces that protect learning and encourage students to do their own thinking.

Providing and protecting such spaces undoubtedly poses new challenges for faculty. But we also know that protecting student learning from easy shortcuts is at the heart of formal education. Consider the planning that goes into determining whether an assessment should be open-book or open-note, take-home or in-class. These decisions are rooted in the third law: what would most protect student learning from shortcuts (textbooks; access to help) that undermine their learning?

To see how faculty are working to protect student learning, view the Fall 2023 Workshop on Writing with Integrity recording.

Read about protecting academic integrity in first-year writing: "How I Use AI to Teach Academic Integrity," by Jennifer Trainor.

Join us for the Second Annual AI Symposium, "Embracing and Resisting AI: Best Practices in the Classroom."