Clearview AI fined €30.5 million for “illegal” facial recognition database
The Dutch Data Protection Authority (Dutch DPA) has fined Clearview AI 30.5 million euros for building an “illegal” facial recognition database with the photos of billions of people, including Dutch people, without their knowledge or consent. The Dutch DPA also warned that using Clearview AI’s services is illegal.
Clearview is an American commercial business that offers facial recognition services to intelligence and investigative services. Customers can provide camera images to Clearview to find out the identity of people shown in the images. Clearview has a database with over 30 billion photos of people scraped off the internet without the involved people’s knowledge or consent. Clearview then converts the photos into a unique biometric code per face.
The American company says it only provides services to intelligence and investigative services outside the European Union, in jurisdictions that often lack the EU’s level of privacy protection. According to the Dutch DPA, however, Clearview’s data collection is nonetheless a clear and serious violation of the General Data Protection Regulation (GDPR).
“Facial recognition is a highly intrusive technology, that you cannot simply unleash on anyone in the world,” said Dutch DPA chairman Aleid Wolfsen. “If there is a photo of you on the Internet – and doesn't that apply to all of us? – then you can end up in the database of Clearview and be tracked.” He added, “This really shouldn't go any further. We have to draw a very clear line at incorrect use of this sort of technology.”
Wolfsen acknowledged that techniques like facial recognition can contribute to safety and the detection of criminals by official authorities. “But certainly not by a commercial business. And by competent authorities in highly exceptional cases only. The police, for example, have to manage the software and database themselves in that case, subject to strict conditions and under the watchful eye of the Dutch DPA and other supervisory authorities.”
Clearview should never have built this database with photos and the unique biometric codes linked to them. The company also failed to inform people in its database about the fact that it is using their photo and biometric data. People in a database have the right to access their data, but Clearview does not cooperate in requests for access, the Dutch DPA said.
The Dutch DPA has ordered Clearview to stop these violations. “If Clearview fails to do this, the company will have to pay penalties for non-compliance in a total maximum amount of 5.1 million euros on top of the fine,” the Dutch DPA said.
Other data protection authorities have already fined Clearview on several earlier occasions, but the company has failed to adapt its conduct so far. “Such a company cannot continue to violate the rights of Europeans and get away with it. Certainly not in this serious manner and on this massive scale. We are now going to investigate if we can hold the management of the company personally liable and fine them for directing those violations. That liability already exists if directors know that the GDPR is being violated, have the authority to stop that, but omit to do so, and in this way consciously accept those violations,” Wolfsen said.
The company did not release a response to the Dutch DPA’s statement. “Our platform, powered by facial recognition technology, includes the largest known database of 50+ billion facial images sourced from public-only web sources, including news media, mugshot websites, public social media, and other open sources,” Clearview writes on its website.
Its products have mainly been used by government authorities and law enforcement, but the company has also counted banks and retailers among its clients. Aside from identifying persons of interest in investigations, the facial recognition data has also been used to identify victims of online child sexual abuse.
The company is headquartered in New York and is privately held. Only a few of its investors, such as Peter Thiel and Naval Ravikant, have ever been made public.