At least 25 Dutch municipalities using algorithms to profile residents: report
At least 25 Dutch municipalities use predictive systems and algorithms to profile their citizens, for example to detect social assistance fraud, to predict where undermining crime could occur, and to help citizens tackle their debts in time, NOS reports after investigating the matter in collaboration with the regional broadcasters.
In practice, almost all governments and companies use algorithms, often for simple processes like registering speeders or sending letters automatically. But the 25 municipalities the broadcasters identified classify citizens according to profiles.
The municipalities of Nissewaard, Brielle, Deventer and Goirle created risk profiles for social assistance recipients, which determine whether a case file is investigated further. In Deventer this is still an experiment.
The municipality of Twenterand also creates profiles of citizens, but to map out debts and "care problems". The municipalities of Leudal, Nederweert, and Maastricht use such systems to track down undermining crime. Dronten is experimenting with this. Breda uses algorithms to determine what type of homes to build or demolish in order to improve the quality of life.
Risk profiling is a controversial topic in the Netherlands, especially since the childcare allowance scandal. The Tax Authority used dual nationality as one of its criteria to determine whether someone was likely to commit fraud. Last year the court also banned the use of the government's SyRI profiling system, partly because it was insufficiently transparent and entailed a risk of discrimination.
Anton Ekker, the lawyer who successfully litigated against the SyRI system, called the broadcasters' findings cause for concern. "The cabinet made a solemn promise that there will be no new allowance affair, but there is no clear picture of what municipalities are doing," he said to NOS. "I wonder whether local governments have enough in-house knowledge to do this properly. They are often highly dependent on external IT suppliers."
Critics warn that algorithms always entail a risk of discrimination, as they are usually fed with data by people, whose biases are then built in. "It is difficult to rule out discrimination completely," Nadia Benaissa of civil rights organization Bits of Freedom said to NOS. "I believe that municipalities mean well, but as a citizen you are quickly pigeonholed."