Featured faculty: Jose Luis García del Castillo y López
Teaching Professor of Computational Design
College of Arts, Media and Design
Why try it? Same instructor, same tool, opposite policies. The distinction isn't comfort level: it's whether AI bypasses the skill being taught.
What he's doing: Jose Luis teaches two technology-focused courses with opposite AI policies because each course's learning outcomes dictate how the tool should be used. The contrast illustrates how the same tool can be appropriate or inappropriate depending on what students need to learn.
In the undergraduate course Prototyping with Code, students learn to code from scratch and then apply those skills to creative visual projects. Developing foundational coding ability is a core learning outcome, so vibe coding is not appropriate here. Jose Luis compares it to teaching math: we still teach it by hand even though calculators exist, because the thinking and problem-solving processes have value beyond the calculations themselves. Students may use AI as a coding coach, asking it to explain concepts, debug broken code, generate practice problems, or help with syntax errors at midnight when Jose Luis is not available. Essentially, any AI use that helps students learn is fair game; anything that replaces their learning is not. For Jose Luis, code has personal style, like handwriting, so when students submit work in week two using advanced techniques never covered in class, the source is evident. He tests each assignment himself with Claude to see what AI-generated submissions might look like, but he is still working out how best to discuss overuse of AI with his students.
In contrast, the graduate seminar Human-Centered AI is a non-technical course that prepares future leaders to make decisions about AI implementation and management in organizations. Here, vibe coding is welcomed because the curriculum doesn't include learning to code. Students use AI to generate working prototypes; it's simply another tool, and a valuable professional skill for students who will lead AI initiatives rather than build them. In addition, every assignment includes a comparison component: students first complete work independently, then use Claude for the same task, and finally analyze how their thinking differs from the AI's output. This structured comparison helps future leaders understand what AI can and cannot do, which is essential knowledge for the decisions they'll face in industry.

A self-portrait by Sara Dassanayake, a student in Prototyping with Code who used Figma to map coordinates before translating them to p5.js, demonstrating a bridge between visual planning and code. Reproduced with permission.
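The workflow in that caption, plotting anchor points in a design tool and then transferring them to code, can be as simple as reading coordinates off a Figma frame and rescaling them onto the p5.js canvas. The sketch below is a minimal, hypothetical illustration of that bridge; the points, frame size, and shape are invented for demonstration and are not taken from Sara's portrait.

```javascript
// Hypothetical p5.js sketch: coordinates planned in a Figma frame
// are copied into an array, rescaled to the canvas, and drawn.

// Points read off an (invented) 800x800 Figma frame.
const figmaPoints = [
  [120, 300], [200, 180], [310, 160],
  [400, 240], [380, 380], [260, 430],
];

const FIGMA_SIZE = 800; // side length of the Figma frame

function setup() {
  createCanvas(400, 400);
  noLoop(); // static drawing, so render once
}

function draw() {
  background(240);
  noFill();
  stroke(30);
  beginShape();
  for (const [fx, fy] of figmaPoints) {
    // map() rescales each Figma coordinate into canvas space
    vertex(map(fx, 0, FIGMA_SIZE, 0, width),
           map(fy, 0, FIGMA_SIZE, 0, height));
  }
  endShape(CLOSE);
}
```

Keeping the Figma measurements in one array and rescaling with map() means the visual plan stays editable in the design tool while the code handles the translation, which is the kind of bridge the assignment rewards.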
This framework transfers to any discipline. A writing instructor might limit AI drafting in a composition course where students learn to generate ideas, but welcome it in an advanced editing course focused on revision skills. A science instructor might limit AI-generated lab reports in introductory courses where students learn to interpret data and draw conclusions, but welcome it in advanced research methods where synthesizing literature is the focus. Law faculty might restrict AI in legal writing courses but embrace it in clinical simulations where students practice client counseling.
For Jose Luis, the question is: What are students supposed to learn? If AI use bypasses the skill or thinking process they're trying to develop, set boundaries. If it supports or extends learning without replacing it, embrace it.
What's next: Jose Luis plans to continue refining his syllabus guidelines, which he calls each course's "Generative AI Constitution." There, he explicitly names productive and unproductive AI uses based on that course's specific learning outcomes. He's also exploring whether AI can offer useful critique of student-created visual work, though he remains uncertain whether AI judgment on design and art is trustworthy enough to stand in when an instructor's feedback isn't immediately available.
