
Anti-cheating webcam software may have discriminated based on skin color
The Netherlands Institute for Human Rights issued its first-ever determination that it is plausible an algorithm discriminated against a person. A student at the Vrije Universiteit in Amsterdam (VU) filed a complaint alleging that the software used to proctor a remotely administered exam failed to recognize her because of her skin color when she tried to log in and take the exam. The VU must now demonstrate that the anti-cheating software it used does not discriminate, the institute's board wrote in an interim decision.
During the height of the coronavirus crisis, exams at the VU were often taken online. To prevent fraud, students had to install anti-cheating software on their computers. Before they were given access to the exam questions, they had to go through several checks, including one which used a webcam.
According to the Master's student, the software often did not recognize her face. She then received error messages, such as "face not found" or "room too dark." The software also threw her out of the exam several times, forcing her to log back in again.
The situation caused her substantial stress "and a feeling of insecurity," according to the board of the independent institute. The woman said she believes the software could not function properly because of her skin color. "It is generally known that face detection algorithms perform worse on people with a darker skin color," the institute wrote. The organization ruled that the woman "has succeeded in providing sufficient facts for a suspicion of discrimination."
The VU inquired with the software's supplier and claimed that, after an investigation, the supplier found no indication of discrimination by the algorithm. According to the VU, the student did not need an exceptionally long time to log in and was not kicked out of exams very frequently. Any problems that arose may instead have been the result of a bad internet connection, the university argued.
However, the institute said that the VU has not provided enough verifiable data showing that the software is not biased. The university has ten weeks to prove that the software did not in fact discriminate.
The verdict is an "important moment," said Jacobine Geel, the chair of the Netherlands Institute for Human Rights. "Someone managed to say, 'Hey, what this algorithm does is strange. I suspect that the algorithm is discriminatory.'"
Naomi Appelman, the chair of the Racism and Technology Center, said she hopes the ruling is "a lesson" for organizations that want to use facial recognition. "Guarantees from the supplier of the technology are not enough."
A VU spokesperson said the university is "going to work" with the verdict but that, in the meantime, it "cannot give a substantive response."