Is Higher Ed Going to Collapse from A.I.?
Description
Steve Pearlman: Today on Actual Intelligence, we have a very important and timely discussion with Dr. Robert Niebuhr of ASU, whose recent opinion piece in Inside Higher Ed is titled "AI and Higher Ed: An Impending Collapse." Robert is a teaching professor and honors faculty fellow at the Barrett Honors College at ASU.
The reason I invited him to speak with us today on Actual Intelligence is his perspective on artificial intelligence and education, and his contention, roughly, that higher ed's rush to embrace artificial intelligence is going to lead us to some rather troubling places. So let's get to it with Dr. Robert Niebuhr.
Robert, we talked a little bit about this on our pre-call, and I don't usually start a podcast like this, but what you said to me was so striking, so nauseating, so infuriating, that I think it's a good place to begin. [00:01:00] Maybe some of our listeners who value actual intelligence will also find it as appalling as I do, or at least a point of interest that needs to be talked about.
You were in a meeting, and we're not going to talk about exactly what that meeting was, but you were in a meeting with a number of other faculty members, and something interesting arose. I'll allow you to share that experience with us, and we'll use it as a springboard for this discussion.
Robert Niebuhr: Yeah, sure. Obviously, as you can imagine, faculty are trying to cope with the perception that students are using AI to create essays. And where I'm at, one of the backbones of assessed work in my unit is the argumentative essay, so that essay is a backbone of grading and assessment. [00:02:00] If we suspect students are using AI, well, faculty said: why should we bother grading essays if they're written by bots? There's a lot to unpack there, and a lot of things that are problematic with that.
But yeah, the idea was that to combat the perceived threat of student misuse of AI, we'll just forego critical assessment. And that was not a lone voice in the room; it seemed to be something that was reasonably popular.
Steve Pearlman: Was there any recognition of what might be sacrificed by never having students write another essay just to avoid them using AI? Of course we don't want them to just have AI write their essays; that's not getting us anywhere. But was there any conception that there might be some loss in terms of that policy? [00:03:00]
Robert Niebuhr: I think so. I imagine my colleagues come from a place where they're trying to figure out and cope with a change in reality, right? But there is also a subtext, I think, across faculties in the United States, of being overworked. And especially with the mantra among administration that AI will help us ramp up or scale up our class sizes so we can do more, all this extra stuff seems like an ask for more of faculty's time and more of their effort. I think that may have been part of it.
I don't know that they thought through the logical implication of this: if we no longer exercise students' brains, if we no longer have them go through a process that encourages critical [00:04:00] thinking and articulating it through writing, what that means. I don't know that they thought beyond "well, we could try it and see," which was kind of the mentality I gauged from the room.
But it's a bigger problem, right? The larger question is: what do we do, what can we do as faculty, in this broad push for AI all over the place? And then there are the mixed messages students get. Students get this idea: this is the future, and if you don't learn how to use it, if you don't understand it, you're going to be left behind. And at the same time it's, well, don't use it in my class. Learn it, but don't use it here. That's super unclear for students, and it's unclear for faculty too. So it's one of those things that I don't think works in the short term. And as you implied, the long-term solution here of getting rid of essay [00:05:00] assignments in a discussion-based seminar that relies on essays as a critical component is not viable, right?
We gut the entire purpose of the program in this case.
Steve Pearlman: And yet a lot of faculty, from what you described and from what I've read as well, are also moving toward having AI grade students' work, not just on simple tests but on essays. And as you point out in your article, that's potentially moving us to a place where kids are using AI to write the essays and faculty are using AI to grade the essays.
So where, and when, does the human being get involved in between, in terms of any intellectual growth?
Robert Niebuhr: Yeah, I think it's a really big problem, because, again, those long-term implications are clear, as you laid out. But there's also this notion that [00:06:00] here is a tool that obviously can help us, that there are multiple avenues where AI can help us be more efficient, and all that sort of thing. That's true.
So it's there, and we should gauge and understand it. But that doesn't mean you just use it everywhere. You can buy alcohol at the grocery store; it doesn't mean you have it with your Cheerios, right? There's a time and a place. Polite society says you can consume this at these times, with these meals, or in this company. It's not everything, everywhere.
So the message, I think, comes down to a level of respect. If we don't respect the students, if we don't lay out clear guidelines, if we don't show them respect and don't ask for respect back, if we use bots to grade, then the whole thing just becomes a charade.
And I think the system [00:07:00] begins to break down, and people wind up losing the point of what the exercise is all about anyway. I mean not just the assignment or the class, but higher education. The point is to teach us how to be better thinkers, to gauge and evaluate information, to use evidence, to apply it in our lives as we see fit.
And if we're not prepped for that, then what did they prep us for? From the student's perspective, it's: well, what did I just do? What did I pay for? That's a huge long-term problem.
Steve Pearlman: It seems like that "what did I pay for?" question is going to come to bear heavily on higher education in the near future, because students are able to use AI to accomplish some of their work, faculty are using AI to grade some of that [00:08:00] work, and so on, while these degrees are costing hundreds of thousands of dollars.
In effect, it becomes a piece of paper that loses value, in part because the students didn't really get anything from the process, or didn't get as much as they used to, because they're using AI. Is this moving toward some kind of gross reassessment of the value of higher education, or of its role in our society entirely?
Robert Niebuhr: I think it certainly has the potential, right? I would even look back and think of a steady decline; this is one of many pieces that have gone down. And you mentioned in your question just now the sense of the student as client or customer, and how that has changed the interface and [00:09:00] how we think of this whole endeavor.
This leads to things like retention numbers and all the mental gymnastics that happen around them. And, truth be told, there are different paths for different people, right?
There's not a single path; you don't have to get the degree in physics to be successful. But the student as customer, I think, has also solidified this notion that we lean on student feedback, right? And student feedback is important, so I'll qualify that. But standards have been lowered.
I know from my own example, even 20 years ago, that undergraduates would have to produce a capstone thesis as part of their bachelor's degree. And I know firsthand that from the time that [00:10:00] the history department had looked at exit