Image: Workplace discrimination (Credit: AndrewLozovyi / DepositPhotos)
Monday, 1 November 2021 - 19:50

Staffing agencies caution against self-learning algorithms due to discrimination potential

Employment agencies are avoiding the use of self-learning algorithms to match employers with employees as much as possible, in order to prevent potential adverse effects such as discrimination. The initiative comes from Randstad, ManpowerGroup Netherlands, and YoungCapital.

The staffing agencies say they deliberately choose not to use self-learning algorithms in the selection process, because these algorithms can lead to the systematic exclusion of certain groups based on age, postal code, or other factors without the agency even realizing it.

The companies are also cautious about predictive algorithms, the kind used by the Tax and Customs Administration that went awry in the recent childcare benefits scandal. The prominent temporary employment agencies want to retain control over the selection process rather than leave it to technology.

"With self-learning algorithms, even the creators can't discern what processes lead to the choices," explains Peter Pernet of YoungCapital. "Therefore, we cannot work with a self-learning algorithm, and the selection of candidates will lie in the hands of our recruiters." Randstad Netherlands is also aware of their company's social impact. That is why discriminating factors such as age, nationality, and postal code are not entered into the system, says Labor Market Director Marjolein ten Hoonte. Exactly the same thing happened at ManpowerGroup Nederland, which removed privacy-sensitive data from the system.

Erik Janse, responsible for digital developments at ManpowerGroup, noticed that there was too much privacy-sensitive information in the systems. "That's where it often goes wrong with the police and the tax authorities; they enter a lot of private data, which algorithms have access to." ManpowerGroup now only uses data that does not encourage discrimination.

Janse emphasizes a big difference between employment agency platforms and other 'supply and demand platforms' such as Bol.com and Booking.com. "An employment agency deals with people and not hotel rooms, so the technology has to be used differently. That is why there are always consultants involved in the selection process." Together with several universities, Randstad regularly tests its existing algorithms against a code of ethics and believes it is always essential to keep people involved in oversight.

Reporting by ANP
