
Incorporating Generative AI in the Chemical Engineering Classroom


A perspective for CACHE by Rebecca K. Lindsey
Dow Early Career Assistant Professor of Chemical Engineering
University of Michigan, Ann Arbor

In less than a decade, generative AI (GenAI) has transitioned from niche novelty to mainstream productivity tool. Within the context of this perspective, I focus on large language models (LLMs) such as ChatGPT [1], Gemini [2], and Claude [3], which are trained on massive volumes of internet data and are reductively viewed by many generalist users as an “oracle” that communicates in plain English. These models can perform complex tasks from very simple prompts, such as explaining a dense technical passage in simple language or extracting and analyzing data from it. Increasingly, educators are under pressure to acknowledge and integrate these tools in the classroom, driven both by students’ curiosity and by growing expectations from academic institutions and industry that graduates be prepared to work in AI-enabled environments. Yet how to do so responsibly and effectively remains contested, and many rightly worry that these tools might erode foundational learning and critical-thinking skills.

My perspective on this issue is colored by the calculator controversy that shaped STEM education from the 1970s through the 1990s [4]. Early critics argued that calculator use would weaken arithmetic fluency and other foundational skills. After much debate, the field settled on a structured integration strategy, now well accepted, with clear limits: teaching and assessments are designed so that they still require independent demonstration of core competence, with calculators integrated only where they enhance problem solving and modeling. Ultimately, I believe it is our responsibility as engineering educators to ensure we are adding all the newest tools to our students’ toolboxes. But to understand how and when to use them, students must also understand the job at hand, how the tools work, and the limitations of those tools. In what follows, I provide my thoughts on some of the current challenges and strategies for responsibly and effectively incorporating GenAI in the chemical engineering classroom.

 

GenAI in the Classroom: Challenges & Considerations

Before delving deeper, I must address the following:

  1. The technology is evolving rapidly: Public GenAI models are updated frequently, and model performance can change substantially over short time scales [5, 6]. As a result, examples of what “works” and what “fails” can age quickly. Versioned model names and release notes help, but persistence is still not guaranteed: providers routinely deprecate or retire models [7, 8], and model availability can differ across products and subscription tiers [9–11].
  2. Student access is uneven: Students do not all have access to the same GenAI capabilities. Free tiers cap the number of queries users can make, while paid tiers provide broader and more powerful functionality that can materially affect how students engage with these tools. This creates fairness challenges for instruction and assessment when assignments implicitly assume uniform access. At the University of Michigan, I am fortunate that institutionally supported tools (including U-M GPT) are available at no cost to active students, faculty, and staff, which makes classroom integration more feasible than it may be at institutions without comparable infrastructure [12].
  3. The cost of this technology extends beyond subscription prices: AI’s material footprint is tied to a strained hardware supply chain, where advanced AI chips depend on scarce, geopolitically concentrated manufacturing capacity, and on critical minerals that face rising demand and supply-risk pressure as deployment scales [13]. In the U.S., data centers consumed approximately 4.4% of national electricity in 2023 and could reach 6.7 to 12% by 2028 [14]. Moreover, siting decisions can shift energy, water, and infrastructure burdens onto local communities, introducing uneven risk for those in water-stressed or socioeconomically disadvantaged regions [15].

From an engineering education perspective, this means “responsible AI use” should extend beyond plagiarism policies to include sustainability-aware decision making. With these constraints in mind, we now focus on the question: what might responsible, effective GenAI integration look like in day-to-day teaching?

 

GenAI in the Classroom: Strategies

A common sentiment in discussions of GenAI in teaching is that it should be prohibited, with courses redesigned to prevent students from using these tools on assignments. I initially shared that view when I began teaching in 2022. However, after four years of classroom experimentation, my approach has changed substantially. I now find that strict no-GenAI policies are difficult to enforce consistently, and, more importantly, they do little to teach students how to use these tools responsibly in professional settings. In this section, I outline why prohibition alone is not an effective long-term strategy, how I frame GenAI use in my courses, and provide practical examples that may be adaptable to other chemical engineering classrooms.

On Forbidding GenAI:

One of the most effective course configurations for ensuring no GenAI is used in assessments is a flipped format, which moves content delivery outside class and reserves class time for problem solving and assessment. In many upper-level engineering courses, however, relevant problems are too complex to fit reliably within a single lecture period. Moreover, strict prohibition of GenAI does not build the judgment students will need in AI-enabled workplaces. Hence, I believe it is our job as chemical engineering educators to teach students how to use these tools responsibly. What that looks like will vary from class to class, depending on that class’s content; I will provide some explicit examples below.

My Recommendations for Every Chemical Engineering Course:

At the beginning of each semester, I set aside time to discuss GenAI explicitly with students. I start with a brief demonstration of what these tools are and how they work. I then emphasize a core point: GenAI is highly effective for some tasks and unreliable for others, and many of its most consequential weaknesses (particularly abstract reasoning under physical constraints) fall directly within the competencies engineering courses are designed to build.

I also remind students that they are in the course to learn, and that this has two immediate implications. First, they do not yet have the expertise required to reliably judge whether GenAI outputs are correct. Second, unapproved or uncritical use can undermine long-term learning by bypassing foundational knowledge development, reinforcing incorrect concepts and weakening problem-solving skills that are quintessential to engineering. I also acknowledge that detecting use of GenAI is becoming increasingly difficult – this means I will do my best to teach them how to use these tools responsibly, but the onus is on them to follow the guidelines I provide.

Before each assignment or in-class activity, I revisit expectations for appropriate use in that specific context. I have found that the most effective guidance is concrete and assignment-specific. For example: “Responsible use for this assignment includes asking how to generate a plot in Excel using your data. Irresponsible use includes asking GenAI to interpret what your pressure-volume-temperature trends mean.” In class, I review these examples and explain why they are considered appropriate or inappropriate for that learning objective.

Finally, I include a short reflection prompt at the end of each assignment asking whether students used GenAI, how they used it, and what they felt worked well or failed. This has been one of the most useful components of my approach. Students frequently identify creative use cases I had not considered, allowing me to either incorporate those uses in future assignments or build safeguards where needed. Notably, I frame this as an opportunity for them to teach me, which I have found makes them excited and honest about reporting their experiences.  

Selected Examples from My Courses and from Colleagues:

In what follows, I provide examples from my own classes and those of colleagues that I believe exemplify teaching responsible use of AI in three different chemical engineering courses.

 

Example 1: Undergraduate Unit Operations Laboratory

This course covers experiment design, execution, and data analysis. Students enter with competency in plotting and basic statistical analysis using Microsoft Excel. A potential pain point is applying these skills to the much larger datasets generated during their Unit Operations labs. This provides an opportunity to teach how GenAI can be used to transition from familiar tools (e.g., Excel) to those more compatible with scalable workflows (e.g., Python), while preserving accountability for interpretation and scientific reasoning.

Sample Problem: Using Excel, determine the mean, standard deviation, and variance of each temperature-series dataset provided. Generate a bar plot of the mean flow rate as a function of temperature, with the standard deviation shown as error bars. Then, using U-M GPT to assist, repeat the analysis in Python; to do so, provide the following prompt to U-M GPT. Compare the results from your by-hand Excel analysis with those produced by the U-M GPT-generated Python script. Comment on whether any aspect of the results (i.e., the reported statistics or the plot) differed between the two, and whether that was expected or unexpected. If it was unexpected, speculate as to why and how it could be fixed.

U-M GPT Prompt: Write the simplest possible Python script for beginners that reads three CSV files (T300K.csv, T350K.csv, T400K.csv) where column 2 contains flow rate (m/s). Use only NumPy and Matplotlib. Loops are allowed but avoid advanced features. Compute mean, standard deviation, and variance for each file, print the results, and make a bar chart of mean flow rate vs temperature with standard-deviation error bars. Add very clear comments for someone new to Python.
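For reference, below is a minimal sketch of the kind of script this prompt tends to elicit. The file names and column layout are the assumptions stated in the prompt, and any particular GenAI response will differ in its details; the point is to have a concrete baseline students can compare their own generated code against.

```python
# Sketch of a beginner-level script in the spirit of the prompt above.
# File names (T300K.csv, ...) and the column layout are the assumptions
# stated in the prompt; adapt both to your actual lab data.
import os

import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; the plot is saved to a file
import matplotlib.pyplot as plt


def summarize(values):
    """Return (mean, sample standard deviation, sample variance) of a 1-D array."""
    return values.mean(), values.std(ddof=1), values.var(ddof=1)


def analyze(files, temperatures, out_png="mean_flow_vs_T.png"):
    means, stds = [], []
    for fname in files:
        # Column 2 (index 1) holds flow rate in m/s, per the prompt.
        data = np.loadtxt(fname, delimiter=",", usecols=1)
        mean, std, var = summarize(data)
        print(f"{fname}: mean={mean:.3f}  std={std:.3f}  var={var:.3f}")
        means.append(mean)
        stds.append(std)
    # Bar chart of mean flow rate vs temperature with std-dev error bars.
    plt.bar(temperatures, means, yerr=stds, width=20.0, capsize=4)
    plt.xlabel("Temperature (K)")
    plt.ylabel("Mean flow rate (m/s)")
    plt.savefig(out_png)


if __name__ == "__main__":
    data_files = ["T300K.csv", "T350K.csv", "T400K.csv"]
    if all(os.path.exists(f) for f in data_files):  # run only if data are present
        analyze(data_files, [300, 350, 400])
```

One discrepancy students commonly surface in the comparison step: NumPy's `std` and `var` default to the population formulas (ddof=0), while Excel's STDEV.S uses the sample formula, so an unguarded GenAI-generated script can report statistics that differ slightly from the Excel baseline. That is exactly the kind of difference the reflection asks students to notice and explain.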

This assignment teaches students to use GenAI in settings where outputs can be explicitly validated against a known baseline. It also introduces core prompt-engineering habits by requiring students to request specific outputs and explanatory detail, so the generated code becomes something they can interrogate and learn from rather than copy. Finally, the reflection component requires students to evaluate discrepancies, articulate why a result may be wrong, and propose concrete fixes, reinforcing critical evaluation as part of the workflow.

 

Example 2: Graduate Transport Phenomena (Provided by Prof. Ronald Larson, University of Michigan)

This course focuses on heat and mass transfer with chemical reaction in three dimensions. Students in this class can struggle to apply the mathematical tools they learn in class to complex real-world problems and can be tempted to simply plug them into GenAI models. This provides an opportunity to show students that these models can fail when faced with complex engineering tasks, and teach them how to assess work by others, while helping reinforce concepts and problem-solving strategies learned in class.

 

Sample Problem: Consider the following problem: A sphere at a uniform initial temperature of 1000°C is exposed to air at 30°C with an outside convective heat-transfer coefficient of 10 W/m²·K. How long does it take for the surface temperature to drop to 200°C? What would be the center temperature at this time? The ball is 20 cm in diameter and is made of cast iron with the following properties: k = 55 W/m·°C; cp = 630 J/(kg·K); ρ = 7,800 kg/m³.

Inspect the two proposed solutions (attached). One of these solutions has been generated by U-M GPT. Assess the correctness of each step of the proposed solutions given. 

This assignment teaches students to critically evaluate GenAI-generated work rather than treating the model as an authority. By auditing a pre-generated solution, students practice checking assumptions, logic, and physical consistency. It also exposes a practical limitation of GenAI as a learning aid: model-generated solution paths can be technically valid yet pedagogically misaligned with course methods, making them difficult for novices to learn from directly. Finally, because the task centers on critique and justification, students must demonstrate their own reasoning, which reduces the utility of using GenAI to bypass the assignment.
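One quantitative check students can apply to either proposed solution, before auditing it line by line, is to compute the Biot number and, if it is small, a lumped-capacitance estimate. The sketch below uses the property values from the problem statement together with the standard textbook lumped-capacitance relations; it is a sanity check, not a replacement for the full solution audit.

```python
import math

# Property values from the problem statement.
h = 10.0        # convective heat-transfer coefficient, W/m2-K
k = 55.0        # thermal conductivity, W/m-C
cp = 630.0      # specific heat, J/(kg-K)
rho = 7800.0    # density, kg/m3
R = 0.10        # sphere radius, m (20 cm diameter)
T_inf, T_init, T_target = 30.0, 1000.0, 200.0  # temperatures, deg C

# Characteristic length of a sphere: Lc = V/A = R/3.
Lc = R / 3.0
Bi = h * Lc / k  # Biot number; Bi << 0.1 justifies the lumped-capacitance model

# Lumped-capacitance cooling: (T - T_inf)/(T_init - T_inf) = exp(-t/tau)
tau = rho * cp * Lc / h                        # time constant, s
theta = (T_target - T_inf) / (T_init - T_inf)  # dimensionless temperature
t = -tau * math.log(theta)                     # time for surface to reach 200 C

print(f"Bi = {Bi:.4f}")
print(f"t  = {t:.0f} s ({t / 3600:.1f} h)")
# With Bi this small, internal temperature gradients are negligible, so the
# center temperature at this time is essentially equal to the surface value.
```

Any proposed solution whose answer disagrees wildly with this estimate, or that invokes a finite-Biot (Heisler-chart) analysis without checking whether it is needed, is immediately suspect, which is precisely the kind of judgment the assignment is meant to exercise.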

 

Example 3: Undergraduate Thermodynamics

This course focuses on the forms energy can take, how it can be converted between them, and how it can be manipulated to extract work from a system. Students often struggle to reconcile the abstract nature of thermodynamics with the more tangible nature of most STEM classes they have previously taken. This provides an opportunity to teach students how GenAI can help them assess their own understanding, so that they can seek additional guidance if needed.

Sample Problem: A well-insulated rocket is traveling to the moon in space. At regular time intervals it burns 1 m3 of fuel to maintain speed but does not accelerate or decelerate. Consider the system of the entire rocket, including the combustion chamber and fuel storage, but not the gas by-products that go into space after burning the fuel. Which of the following describe the system: Steady state, equilibrium, isolated, adiabatic. Explain your answer for each.

Input the problem statement and your response into U-M Gemini, then ask it to evaluate each classification (steady state, equilibrium, isolated, adiabatic) and provide source-backed justification for every claim. Verify each claim against the cited references yourself, then submit a short reflection identifying where your reasoning agreed or differed, with corrected explanations and citations. Include screenshots of the specific source passages used to validate your final answers.

This assignment teaches students how to use GenAI to strengthen critical thinking and evaluate their own understanding, while avoiding situations where GenAI introduces or reinforces misconceptions. It also teaches practical strategies for validating AI-generated responses when students are not yet experts in the topic. Most importantly, it emphasizes that GenAI should be treated as a tool to interrogate and refine reasoning, not as an authority to trust without verification.

 

Concluding Thoughts

GenAI capabilities and access are evolving at an extraordinary pace. Instructional strategies should be responsive to these changes, but the rapid pace can be at odds with the time required to update course content. For these reasons, I do not believe there is a single correct way to integrate GenAI into chemical engineering classrooms. What we need is sustained, rigorous dialogue among colleagues, along with transparent sharing of assignments, failures, and refinements, so we can collectively identify what preserves foundational learning and problem-solving skill development while preparing students for responsible, AI-enabled practice.

 

References

  1. chatgpt.com (Accessed: Feb. 17, 2026)
  2. gemini.google.com (Accessed: Feb. 17, 2026)
  3. claude.ai (Accessed: Feb. 17, 2026)
  4. Jackson, Allyn. "The Math Wars: California Battles it out Over Mathematics Education Reform (Part I)." Notices of the AMS 44, 695 (1997)
  5. ai.google.dev/gemini-api/docs/models (Accessed: Feb. 17, 2026)
  6. help.openai.com/en/articles/9624314-model-release-notes (Accessed: Feb. 17, 2026)
  7. ai.google.dev/gemini-api/docs/deprecations (Accessed: Feb. 17, 2026)
  8. help.openai.com/en/articles/6825453-chatgpt-release-notes (Accessed: Feb. 17, 2026)
  9. support.google.com/gemini/answer/16275805?hl=en (Accessed: Feb. 17, 2026)
  10. help.openai.com/en/articles/9275245-chatgpt-free-tier-faq (Accessed: Feb. 17, 2026)
  11. help.openai.com/en/articles/11909943-gpt-52-in-chatgpt (Accessed: Feb. 17, 2026)
  12. genai.umich.edu (Accessed: Feb. 17, 2026)
  13. Mineral Commodity Summaries 2025. Version 1.2, March 2025. Reston, VA: U.S. Geological Survey, 2025. https://doi.org/10.3133/mcs2025
  14. 2024 United States Data Center Energy Usage Report. Berkeley, CA: Lawrence Berkeley National Laboratory, 2024. https://doi.org/10.71468/P1WC7Q
  15. Li, Pengfei, Jianyi Yang, Mohammad A. Islam, and Shaolei Ren. "Making AI Less ‘Thirsty’." Communications of the ACM 68, 54 (2025)
