AI in schools: Dutch study warns algorithms may discriminate against students
A new study commissioned by the Dutch government raises concerns about the potential for algorithmic bias in educational technology. The research, released on Monday, suggests that learning systems that make use of artificial intelligence (AI) can put some groups of students at a disadvantage, even though the systems are intended to make education more personalized.
The study examined how algorithms are used in both primary and higher education, the Netherlands Institute for Human Rights said. Researchers found that many schools utilize adaptive learning systems, which automatically adjust exercises based on student performance. While these systems can provide targeted support, they may also reinforce existing biases when poorly designed.
“Unfortunately, prejudices about groups of students already play a role in education,” the report stated. Students from lower-income families, or whose parents or grandparents immigrated to the Netherlands, are already more likely to receive lower initial school recommendations despite their test scores. AI systems that are not rigorously tested on diverse student populations risk amplifying these inequalities.
The study also highlighted concerns about how algorithms assess student learning styles. A system might struggle to accurately evaluate a child with attention deficit hyperactivity disorder or dyslexia, potentially labeling them as weak learners and hindering their progress.
“Every student has the right to develop and be treated equally in education,” the report emphasized. Digital resources can be valuable tools, offering additional support to students who need it. However, the research underscores the importance of critical evaluation before implementing AI-powered learning systems.
“Schools must and can impose requirements on software suppliers in the areas of equal treatment, privacy, autonomy and transparency,” the Netherlands Institute for Human Rights said. At the same time, schools cannot be left to do this without support from the government.
The report called on the Ministry of Education, Culture and Science to play a more active role in the process. The researchers argued that the ministry should encourage further studies on the impact of educational technology and provide schools and educators with resources on the potential risks of bias. Additionally, the report suggested developing a framework with clear standards for educational software developers to ensure fairness and transparency in their products.