
Does ChatGPT make you smarter? Or lazier and more dependent?
More and more students are using ChatGPT for their studies, but does it actually help them move forward?
ChatGPT and other generative AI models can be useful tools for students. Nonetheless, their use also raises questions about dependence, critical thinking, creativity and the line between original work and plagiarism. Associate professor Joris Remmers explores the pros and cons of this technology in education.
When ChatGPT suddenly exploded onto the scene in 2022, Joris Remmers was preparing a new programming course: Scientific Computing at the Department of Mechanical Engineering. Remmers, an associate professor with expertise in materials science – in particular 3D-printed materials and composites – decided to start experimenting with the AI tool right away.
He had the AI model complete a few assignments, and the results were far from disappointing. “That’s when I thought: I have to do something with this,” he says. “I asked the department: can I let my students use it?” They agreed, on the condition that he linked it to a study so they could learn from it as well.
Less interaction
Remmers allowed his students to use ChatGPT, provided that they were transparent about its use. In collaboration with the Eindhoven School of Education, the tool’s use was evaluated through interviews. Two findings emerged: students knew how to use ChatGPT, but it did not lead to better learning results. “This was partly because they became dependent on the tool’s help, which reduced their own critical thinking,” says Remmers.
“They also had less contact with each other,” he continues. According to Remmers, the social aspect faded into the background, even though it’s an important part of the course. “ChatGPT partly replaced cooperation between students, which resulted in less interaction and lower learning efficiency, while the students themselves felt they had done a good job.”
Opening the dialogue
On the other hand, the program also offered clear benefits. For example, it proved useful for explaining code, detecting errors and correcting them. But having it generate an entire program? “That’s not a good idea because you don’t learn anything from that,” says the lecturer.
The study was the first of its kind and made a big impression, but it didn’t stop there, Remmers says. “I delved even further into the topic and, almost naturally, ended up becoming the ambassador of AI in education within the Academy for Learning and Teaching (ALT). In this community, we explore the best ways to integrate AI into education, among other topics.”
He stresses that they are neither for nor against AI, but rather critically examine its possibilities and limitations. “We have to face the fact that students are already making extensive use of it – there’s simply no way around that anymore. Some lecturers are strictly opposed to the use of ChatGPT, but that’s neither realistic nor sensible. What we want to do is open the dialogue on how to use AI responsibly so that this technology supports and accelerates the learning process.”
Guidelines
There are currently no national guidelines for the use of ChatGPT and other generative AI in education. Instead, universities and universities of applied sciences, including TU/e, draw up their own regulations. “It’s up to the lecturers to determine if and how AI may be used,” says Remmers.
Most lecturers allow limited use of the tool, but this varies greatly by discipline. “At Mechanical Engineering, programming is mainly a tool for research. At Mathematics & Computer Science, it’s a core skill, so more critical consideration must be given to what kinds of AI use are allowed. This makes it difficult to draft general guidelines for the entire university,” says Remmers.
Assessment
How do you check that students aren’t overusing ChatGPT? “It’s becoming increasingly important to think carefully about appropriate assessment methods that actually measure whether students understand the material,” argues Remmers. “For example, a report is often not the best option, because it’s easy to have ChatGPT write it for you. Oral exams and handwritten tests might be more reliable ways to assess students. What also helps in this respect is to ask targeted questions that can only be answered if the student truly understands the material. We need to keep finding more creative ways to deal with that.”
And how big is the risk that students will actually commit fraud? And should the university keep a closer eye on that? “The university has every right to expect students to act responsibly,” Remmers responds. “Master’s students, in particular, understand that they’re here to learn, but I think it’s more challenging for bachelor’s students. They’re often not as mature in this regard. Moreover, they often take courses with hundreds of students at once. The smaller the groups, the easier it is to detect fraud. So personal contact between students and lecturers remains of the utmost importance.”
Chatbot as tutor
Together with colleagues, Remmers examined whether a customized version of ChatGPT would be able to provide students with better support in their learning. “The latest version allows you to create your own chatbot and set specific rules for it,” Remmers explains. “For example, you can prohibit it from generating answers or code, and instead instruct it to point out errors or ask questions in return.” Such a customized chatbot acts as a sort of tutor: it does not take over the work, but guides students, gives them personal feedback and encourages critical thinking. “That way, ChatGPT can strengthen education rather than undermine it,” he argues.
This tutor chatbot was tested in the same programming course with around one hundred students. A third of them were allowed to use ChatGPT freely, another third worked with the customized version, and the remaining students were not allowed to use AI tools at all. They were given an assignment that was intentionally a bit too difficult, with the goal of seeing how far they would get and what they would understand from it. The results are not in yet, so whether the hypothesis is correct is still uncertain, says Remmers. “I expect that the students who used ChatGPT got further along with the assignment, but that the students who used the customized chatbot understood it better.” If that is confirmed, it may offer a new approach to using ChatGPT in a way that facilitates learning.
Fundamentally different
“I can imagine that ChatGPT will change what we teach on a fundamental level,” says Remmers. “Certain skills may become obsolete and disappear from the curriculum. Maybe we won’t need a programming course anymore in the future.” For courses that remain, the learning objectives may shift and the emphasis may move elsewhere. Since ChatGPT and similar models are evolving at lightning speed, this requires a continuous process of evaluation and adaptation, he argues.
His advice to lecturers? “Stay critical, look at how students are using ChatGPT and adjust your learning objectives and assessment accordingly. For example, I know that anyone can use ChatGPT to write error-free code, so in my programming course, I pay much more attention to the design now, or I ask very targeted questions. I’ve started looking at things from a completely different perspective.”
Would you like to learn more about this topic or engage in discussion with your colleagues? ALT regularly organizes lunch sessions for lecturers and other interested parties. Check out ALT’s upcoming events.