
Law Enforcement Using Database with Billions of Images for Facial Recognition

Feb 6, 2020 8:30:00 AM

Hundreds of police departments across the nation have signed up for a service that is now crossing into what could be considered a legal grey area. What is the service? A database of billions of faces.

The End of Privacy

Clearview AI is a facial recognition system and database. It comprises billions of images scraped from millions of publicly accessible websites, including Facebook, Twitter, Instagram, YouTube, and even Venmo. Clearview AI's work could end privacy as we know it.

The use of certain security devices and facial recognition systems by police is already a growing concern. However, the sheer size of Clearview's database is troubling, as is the method used to obtain the images, which may violate many websites' terms of service.

Who’s Using It

This software has been provided to agencies ranging from local police departments in Florida to the FBI and the Department of Homeland Security. It's as simple as taking a picture of a person and uploading it; all public data and photos of that person then appear before you, including links to where those photos were posted. More than 600 law enforcement agencies that have started using Clearview within the last year have declined to provide their names to the public.

Wholesale alarm monitoring services use video verification to help prevent false alarms. But it is nowhere near as invasive as what some of these law enforcement agencies are using; wholesale monitoring doesn't use facial recognition.

While the upside to this program is obvious (some cases have been solved within seconds), the downside is a loss of privacy, even for those without social media. One crime this year was solved using a gym photo posted by a random social media user; the suspect happened to be in the background of the image. Clearview AI has already helped police solve crimes including shoplifting, identity theft, credit fraud, murder, and child sexual exploitation.

The Argument

According to a study by the National Institute of Standards and Technology, certain groups within a population can be misidentified up to 100 times more frequently than others. NIST says it found evidence that traits such as age, gender, and race all affect these algorithms. This bias is something that needs to be worked out before the technology is used by law enforcement.

Facial recognition is being used more and more frequently in law enforcement today, but regulations on it are few and far between. Even improvements to the technology won't matter if there are no standards defining its appropriate uses.

During uncertain times, it's best to do business with a company you can trust. Reach out to Avantguard today to find out what we can do for you, your family, and your business.

