The rise and fall of Clearview AI and the evolution of facial recognition
Facial recognition software has gone from tech darling to tech disaster.
Now, its rise and fall have been chronicled by Kashmir Hill (pictured below), a technology reporter for the New York Times who has tracked its potential and problems through one of its major innovators, Clearview AI Inc. Her new book, “Your Face Belongs to Us: A Secretive Startup’s Quest to End Privacy as We Know It,” came out last month, and I met up with her recently on her book tour.
Some of her tale reaches far back in time, such as this quote: “Instantaneous photographs and newspaper enterprise have invaded the sacred precincts of private and domestic life.” You might be surprised to find out that this was written more than 130 years ago, taken from a law review article co-authored by Louis Brandeis and inspired by the invention of Kodak film.
Hill’s book focuses on Clearview’s evolution from scrappy startup to a powerful player in the field, exposing its many missteps and failures along with its successful inroads into becoming a potent law enforcement tool. SiliconANGLE has tracked some of these moments, including the fine levied on the company by U.K. regulators in May 2022 and a 2020 data breach in which customer data was stolen.
What made Clearview a market leader and such a potent force was its philosophy: It was determined to “scrape” the web for personal photos, no matter where and how they were posted. Today, various sources claim it has accumulated more than 30 billion images.
All of these images, as Hill and others have pointed out, were collected without anyone’s explicit permission. This collection would become infamous and exemplify a world “in which people are prejudged based on choices they’ve made in the past, not their behavior in the present,” she writes. You could say that on the internet, everyone knows you once were a dog.
Clearview wasn’t the only tech firm to develop facial recognition software: Google, Facebook, Microsoft, IBM, Apple and Amazon all had projects that they either developed internally or acquired, for example Google with Pittsburgh Pattern Recognition and Apple with Polar Rose. In each case, as Hill writes, these projects were eventually shelved because the companies were afraid to deploy them.
Facebook, for example, had face recognition projects as early as 2010 “but could afford to bide its time until some other company broke through.” It eventually shut down its facial recognition system in 2021, though it didn’t delete the code but merely turned it off, leaving the door open for a future in which the technology might be more accepted. IBM withdrew from the market back in 2020.
Hill documents one of the biggest technical challenges: identifying people in candid poses, in dim lighting, on poor-resolution street surveillance cameras and when they are looking away from the ever-seeing lens. Another challenge is legal, with lawsuits coming at Clearview from all corners of the globe. Leading the charge are ACLU lawyer James Ferg-Cadima and the state of Illinois, which enacted an early biometric privacy law back in 2008. Since then, 10 other states have enacted similar legislation, but there are still no federal regulations that specifically govern biometric data such as this.
Clearview has also prompted many activists to protest and lobby for restrictions. One said that “face recognition should be thought about in the same way we do about nuclear or biological weapons.” Clearview soon “became a punching bag for global privacy regulators,” Hill writes, describing several efforts in Europe during the early 2020s that resulted in fines and restrictions being placed on the company.
Police departments were early adopters of Clearview, whose database was built on the photos that today’s smartphone users post about every corner of their lives. That adoption has led to a series of legal challenges that were largely self-inflicted. Hill documents many cases where the wrong person was identified and then arrested, such as Robert Williams and Randall Reid. “It wasn’t a simple matter of an algorithm making a mistake,” she writes. “It was a series of human beings making bad decisions, aided by fallible technology.”
She covered one of those cases in a 2020 Times article entitled “Wrongly Accused by an Algorithm.” In many of these wrongful-arrest cases, the accused were Black men, a problem that can be traced back to training data that underrepresented nonwhite faces. Facebook acknowledged this problem with its own image recognition algorithm for many years, which contributed to its decision to terminate the project.
Some of Clearview’s story is inextricably bound to Hill’s own investigations, where early on she tipped off the company about her interests and was initially blocked from learning more about its technology. Eventually, she would interview Clearview Chief Executive Hoan Ton-That numerous times to connect the dots. “It was astonishing that Ton-That had gone from building banal Facebook apps to creating world-changing software,” she wrote in summing up his career in her book.
The current facial recognition dilemma
Hill’s book ends without a tidy resolution, and that is partly by design: Clearview is still very much in business, selling mostly to law enforcement and other government customers. Its promises to be a more careful curator of its huge image repository have largely gone unfulfilled, and legal challenges continue to mount around the world as new biometric regulations are considered. Other examples, such as the massive investment Chinese authorities have made in their own facial surveillance systems, also come to mind.
Technology will always be ahead of the regulators, and the story of facial recognition and the rise and fall of Clearview is just the latest example of this push-and-pull tension. That makes Hill’s book all the more compelling, and it is why the book should be required reading for both business and government managers weighing whether to deploy this technology and how to control its use and, more importantly, its misuse.
One of the current legal flashpoints is the right to be left alone, or, as she notes in her book, “being unseen is a privilege.” Unfortunately, it is getting harder and harder to be unseen, because even if people petition Clearview to remove their images from its searches and from public web sources, the company still has copies buried deep within its massive database. Hill also finds that Clearview created a “red list” that, by government edict, keeps certain VIPs from being tracked by its software.
“Your Face Belongs to Us” is an essential chronicle of how this technology has evolved and what we as citizens must do to protect ourselves.
Image: Pixabay; photo: Kashmir Hill