Using AI To Teach in College

I’m beginning to discover what thousands of other professors have already discovered: that artificial intelligence programs can probably teach better than I can. By teach I mean design and give lectures, plan activities (including discussion questions and research projects), and create tests and quizzes. Not only can AI programs do these things for me so I don’t have to; they can do them better than I can.

I finally looked to AI for help last week while trying to invent word problems for my behavioral research students. Every one of my examples concerned outcome measures for a hypothetical depression treatment. My imagination couldn’t stretch any further, no matter how long or hard I stared at my hands. I finally asked ChatGPT to give me an example of an interaction effect in psychology, and it did, describing the relationship between caffeine consumption and reaction time. Then I asked it for another, and it described the impact of sleep on attention. It was amazing. I had two examples without squeezing the juices out of any of my neurons.

Then I wondered if AI could be used to teach my students how to apply scientific reasoning to solve word problems. It could, I decided, but there’s a learning curve, and students have to know how to feed the AI the right prompts. I describe it all here. If students could use AI for tutoring, they could freely ask as many questions as they wished and have an inexhaustible mentor. I estimate that a ChatGPT analytic-thinking tutor would be about five times as good as me on my best day, since I typically have only about 50 minutes of sincere tutoring in me at any given time, and I’m easily distracted. Thankfully one of my students admitted that she preferred my lectures and examples to anything ChatGPT could provide. I told her that I was honored by her observation.

Lectures

AI can lecture, too. I learned this through an alleged fan of my work, who submitted my dissertation to Google’s NotebookLM. It produced a 16-minute conversation between what seemed to be two knowledgeable and eminently down-to-earth podcast hosts. It was excellent. Were I to attempt the same thing, it would have taken about 30 hours of my time, and my version would be full of irrelevant comments about how I discovered such-and-such an idea or philosopher, and so on. In other words, the whole Patrick-created podcast would be covered with a distinct Patrick-sheen. That makes the AI version about 360 times more efficient (it took only 5 minutes, versus my 30 hours, or 1,800 minutes) and probably twice as clear. And it wouldn’t say “Um…” 200 times like I would.

And I’m significantly worse when it comes to lectures about, say, neonatal development or biopsychology, so why not outsource those to AI, too? After all, if AI can produce a better, more thorough, more engaging, and shorter version than I could, then wouldn’t it be somehow unethical to require that students limit themselves to Patrick-created lectures?

I tested this approach and asked NotebookLM to make me a lecture from a general psychology textbook chapter. Then I created a YouTube video by pairing it with PowerPoint slides. Again, it was better than anything I could have done on my own in twice the time, and I have a PhD in psychology.

[Still, I prefer my peculiar and off-the-cuff lectures to the AI version, which probably speaks to my narcissism. I have decided to continue recording my own lectures for students, and I created a how-to video for enterprising students who prefer the AI version to the Patrick version.]

But what about soft skills? 

I suspect that AI can empathize with students better than I can, too, not that perfect empathy is really the goal. When humans empathize, they have to suspend their own worldview in order to understand what an experience is like for another person. I can sometimes do this briefly, but it’s hard to really shed my own biases, values, and beliefs. AI doesn’t have any of this human stuff to get in its way.

But maybe the key isn’t perfect empathy; maybe it’s imperfect empathy. Young adults need to learn that other people, college professors included, are incapable of perfect empathy. This stings at first, because students must realize that professors are not only in their (the students’) corner of the boxing ring, so to speak. Professors and teachers and coaches might be there occasionally, supporting individual students, but they must also occupy the corners of their spouses, children, nieces and nephews, and so on. When students realize this, it will help them grow.

In any event, I’m sure AI could communicate all of this through a short story, podcast, or PowerPoint better than I can in writing right now.

Takeaway

I think this means that college students have all they need in AI to learn whatever there is to learn, provided they take the time to learn it. But I don’t feel any pressure to up my game to compete with AI as a professor. I can invite students to use AI if that’s what they prefer; I can even teach them how to build those skills. This frees me up to give my imperfect and nuanced lectures about topics to which AI is ill-suited, or to emphasize my own vantage point with all of its neuroses and biases. I don’t have to be an encyclopedia anymore. That’s what AI is for. I can be me instead.
