Technology has become an indispensable part of higher education: students, researchers and administrative staff increasingly rely on digital platforms. But universities and universities of applied sciences often outsource their digital infrastructure to external companies. This entails risks, warns The Young Academy. This association of relatively young leading scientists therefore advocates a “fundamental change of course”.
These tech companies collect personal and other data, and it is not always clear where exactly that data ends up. Students must also regularly share personal data when using educational and examination software.
Losing control
Because institutions often outsource the development and management of digital systems, they are in danger of losing control. They have too little technological knowledge in-house and are therefore unable to intervene when something goes wrong. This makes them vulnerable to so-called vendor lock-in: switching to another supplier becomes virtually impossible.
This goes against core values of science such as autonomy, responsibility and academic freedom, according to The Young Academy. In its view, educational institutions have a duty of care “to guarantee a safe digital ecosystem for their employees and students”.
Ethical
The Young Academy therefore advises educational institutions to weigh costs and ease of use less heavily when choosing ICT applications and to look primarily at the ethical consequences. “Individual freedom, the right to privacy and the sustainable autonomy of scientific institutions” should be central. Institutions should also collaborate more and be transparent about important decisions.
The manifesto also warns against the use and development of AI technologies. When AI models are trained, it is often unclear which data is used to feed them and how that process works. The Young Academy is therefore calling for clear rules.
Open letter
The Young Academy is not the only one concerned about the rise of AI: last week an open letter was published calling on higher education to be critical when introducing AI technologies. It now has around 500 signatures, including from lecturers, researchers and professors.
The signatories are concerned about students’ critical thinking skills, but also about “the values of ecological sustainability, human dignity, pedagogical safeguards, data privacy, scientific integrity, and democracy”.