AI Cheating and the Undergraduate Experience

The rise of students cheating with AI is having a disastrous effect at StFX and across academia, fundamentally changing the way people experience university education, whose goal, for many, has been reduced to getting good grades in order to get a good job. Some students use AI to keep their academic workload light. To them, even if it is cheating, it's justified; they're doing what they need to do to succeed. I interviewed a business student who admitted to using AI to cheat. When asked whether they believe cheating makes the class experience worse, they argued that using AI to summarize readings allows people to engage in conversations they wouldn’t otherwise participate in, which improves discussion. However, they also said that students should not be allowed to pass off AI writing as their own. “I feel like you should only be able to use it as a reference,” they said, adding that “anything over 25% [written by] AI is quite bad, and not professional, not beneficial to you or the school … if it's more than that, I’m gonna feel bad about it”. They later added that when it comes to writing, “authenticity is important”.

Professors argue that AI is damaging to the structure of the university and that cheating stunts key practical skills that students are meant to develop. For Dr Steven Baldner, a professor of philosophy, these skills include “writing, being able to understand a problem independently, explain what the problem is and give a solution to the problem in a clear and coherent way”. The student I talked to argued that AI makes the university experience easier and described it as “so beneficial … especially for time saving”. They also said that “sometimes the way professors explain things is so advanced because they're so advanced and AI can really dumb things down”. But they did not hesitate to say that cheaters should be punished: “If you're getting an assignment and 100% generating it with AI and handing it in, then you should be punished, that’s not how school works”.

Professors also argue that cheating erodes their relationships with students by undermining their ability to trust them. Dr Baldner believes that “if you forbid the use of AI and a student has done that, it’s a real rupture in our relationship, and I have had that experience already, sadly. It's upsetting”. A lack of trust inevitably strains the professor-student relationship; if you don’t trust your students, the way you treat them changes, and likely for the worse. When asked if good professor-student relationships improve the class environment, the student answered that they make the experience “just incrementally better” but that AI use should not be seen as a form of betrayal. “I would say it's kind of the opposite of a betrayal, I feel like a betrayal would be me listening to the prof, not understanding what he says, and also never trying to understand … but AI allows me to - if I don’t understand something, actually dive deeper into it and care about what the prof is saying”. These two views of AI are incompatible, yet it is not difficult to sympathize with both.

In our conversation, Dr Baldner argued that AI falls short as an academic tool because that’s not what it was designed to be: “I don’t think it's intelligence … it is just a very elaborate way of producing word association and I think for that reason it will never achieve what the human author can do”. AI can produce relevant answers in proper grammar, but it is not concerned with truth or the quality of an argument. This is not to deny that AI can be useful; Dr Baldner allowed that it “certainly has a lot of very helpful applications in a number of fields,” but he was clear that he is concerned about its use among undergraduates.

Dr Baldner used the analogy of a tennis ball machine to explain his view of AI: these machines are used to practice, not to compete. “The point is for you to learn how to hit…we don’t put the machines in the game”, he said. ChatGPT might be able to show you how to use a semicolon or when an exclamation mark is effective, but its output cannot be passed off as the work of a student, because the student did not participate in generating the arguments, often the most important part of an essay. The school considers AI writing to be plagiarism, not just because it produces sentences that are not one’s own, but because those sentences do not belong to the AI model either. A model’s training data is not created by the machine; it is fed to it. AI output is compiled from human writing, and a student who submits that output is plagiarizing the work it was built from.

Cheating is dishonest, and dishonesty is often seen as bad in and of itself. But the problems do not stop there; professors at StFX are compelled to implement preventative measures against cheating. These measures take different forms, such as banning technology in the classroom or replacing take-home essays with in-class exams. But these changes also hurt students, who lose the chance to learn how to manage take-home assignments or prepare research papers. In addition, in-class essays disfavour certain students; Dr Baldner recognized this, mentioning that some “students have anxiety in those quiz situations”. In-class essays also take away from lecture time, resulting in either rushed examinations squeezed in to leave room for a lecture or topics dropped that should be part of the course. “You're actually cutting down the amount that you can accomplish”, as Dr Baldner told me. These measures exist to preserve fairness in the classroom, yet they disadvantage certain students, which undermines that same fairness.

The ‘real world’ and the careers we are preparing for at university are competitive and complex; to excel in them, you need to be trained and prepared. When a university does not set students up for success, a good transcript becomes the sole purpose of their education. Many see a university diploma as an expensive addition to a job resume, so why not make that addition as easy as possible? During our conversation, the student told me that if there were no risk of punishment for using AI, they “would use it always”. Being able to do your future job well does not come naturally; it has long been the role of universities to prepare students for that transition, but now it seems that AI can step in instead. The problem is that AI does not provide its users with an education, only answers to prompts. AI's work is done not alongside a user but for them, cheating them out of the chance to learn how to do those tasks themselves. When preparing for a career, the choice is between an education and a reliance on AI.