Artificial intelligence is advancing, at TU/e as well. Although AI tools like ChatGPT can be useful, it’s not wise to rely on them completely. When asked if it’s a good idea to use ChatGPT to answer exam questions, ChatGPT itself says no.

Over the past weeks, I was technically responsible for an exam for the basic course Data Analytics for Engineers. That course is no longer taught this year, and the lecturer left TU/e last year. Unfortunately, it turned out that all references to figures had changed after the final check and before the exams were printed. It could’ve been worse – the students still understood which figure went with which question – but it did cause stress during the exam, and that’s not ideal. I understand.

What I don’t understand, and hadn’t counted on, is the huge debate about the exam contents I had to engage in with the students afterwards. They literally scrutinized and criticized each and every sentence in the exam. Sometimes I saw their point and had to admit I would’ve worded the question differently myself (in which case this was taken into account in the grading), but sometimes their comments didn’t make any sense. What bugged me is that sometimes it seemed like I was debating the entire internet rather than a student.

Take, for example, this comment by a student: “The exam is wrong, because there’s this internet forum that says you can calculate an average of a dichotomous variable, and I used that forum to prepare.” To my remark that the forum isn’t part of the course material, the student responded that the site in question is used as such in Delft. (For the sake of clarity: the site in question explained that you can invoke the mean function in R, but that the outcome actually has a different interpretation.)
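The point that site was making can be illustrated with a short sketch (in Python rather than R, purely as an illustration): computing the mean of a dichotomous 0/1 variable is perfectly legal, but the result is a proportion, not an average in the usual sense of a “typical value”.

```python
# A dichotomous (0/1) variable: 1 = "yes", 0 = "no"
responses = [1, 0, 1, 1, 0, 1, 0, 1]

# The arithmetic mean of a 0/1 variable is simply the fraction
# of ones -- a proportion, not a "typical value" of the data.
average = sum(responses) / len(responses)

print(average)  # 0.625, i.e. 62.5% answered "yes"
```

So the calculation goes through without complaint; the interpretation of the outcome is what changes.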

It’s even worse if students involve ChatGPT: “The exam is wrong, because ChatGPT says the answer is B and not A.” It would appear we have a generation of students on our hands who are convinced ChatGPT knows the answer to every question. But when asked if it’s a good idea to use ChatGPT to answer exam questions, ChatGPT itself says no.

These comments show that students no longer study the course material provided, but mainly use old exams, practice assignments and the internet in the broadest sense of the word – which includes ChatGPT. This is a definite concern for the Examination Committees at TU/e. How far can students go in using ChatGPT, and at which point does it become fraud?

To see how intelligent ChatGPT really is, I asked it this question: “What is the longest five-letter word in English?” ChatGPT’s answer: “twelfth”. This led to a somewhat hilarious discussion, which ended with the conclusion that “longer” is a longer word than “shorter”. I think students would do well to reconsider using ChatGPT, and they definitely ought to think twice before sending a lecturer an email saying that ChatGPT knows best.

Boudewijn van Dongen is a professor of Process Analytics at TU/e. The views expressed in this column are his own. 
