American Government Is Not Ready for Facial Recognition & Artificial Intelligence

Today, the recurring story in my social media feeds is that of police body worn cameras outfitted with facial recognition software.  And I have but one thought: American government is not ready for this.

In the last decade, nothing has complicated American policing as much as the deployment of body worn cameras.
  • When are they turned ON?
  • When are they turned OFF?
  • What about during civil protests?
  • Can police record First Amendment activities?
  • What are the Fourth Amendment concerns? 
  • Can officers watch the video prior to report writing?
  • How about before an officer is interrogated about alleged wrongdoing?
  • What if the officer is involved in a citizen's death?
  • When can the videos be released to the public?
  • Who determines what is edited out for privacy?
  • What about pre-event buffers that reach back in time and capture footage from before the camera is activated?
  • How is innocent persons' privacy protected?
  • When can video be used in courts of law? 
  • How about for police internal investigations?
  • What if _____?
The list goes on and on. Body worn cameras have certainly created more questions than answers. Some are legal. Some are technical. Some are ethical. All of them have competing interests.

Now there is talk about adding facial recognition software to body worn cameras. Most of the news articles discuss the application to finding lost or missing children. Imagine: uploading an image of a lost child into the system. As officers walk a crowd, they get an alert that the child has been "spotted" by the camera. This is an emotional play by manufacturers and developers to get support for such technology. Only a few of the articles have touched upon a deeper concern.
  • What other sorts of hot lists or hit lists will be uploaded?
  • Wanted criminal fugitives to apprehend them?
  • Reportedly suicidal persons to get them help?
  • Registered child sex offenders to ensure they're not with kids?
  • Documented gang members?
Again, these sound like reasonable uses to maintain order in our society. But...
  • Who can be detained to confirm identity?
  • What level of suspicion is necessary or reasonable to detain?
  • For how long?
  • How much force can be used to make that detention?
  • To what level of inconvenience to the innocent doppelgänger?
  • How will police handle misidentification detentions?
These are only the questions that arise from comparisons to uploaded hot lists - lists of persons whose images are compared to the faces of the public who find themselves in front of police cameras.
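The hot-list comparison above can be sketched in miniature. Everything here is invented for illustration: the toy three-number "embeddings," the names, the threshold. Real systems use high-dimensional vectors produced by a neural network, but the policy problem is the same: a single match threshold decides how often an innocent doppelgänger triggers an alert.

```python
import math

# Hypothetical face "embeddings": in a real system a neural network maps each
# face image to a vector, and similar faces land close together. These toy
# three-number vectors and the names on this list are invented for illustration.
HOT_LIST = {
    "missing_child": [0.10, 0.80, 0.30],
    "wanted_fugitive": [0.90, 0.20, 0.50],
}

def distance(a, b):
    """Euclidean distance between two embeddings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_hot_list(face, threshold=0.25):
    """Return every hot-list entry whose embedding is within `threshold`.

    The threshold is the policy lever: loosen it and innocent lookalikes
    trigger alerts (false positives); tighten it and genuine matches slip
    through (false negatives). Someone has to pick the number.
    """
    return [name for name, emb in HOT_LIST.items()
            if distance(face, emb) <= threshold]

# A passerby who merely resembles the missing child still trips the alert...
lookalike = [0.12, 0.78, 0.33]
print(match_against_hot_list(lookalike))        # alert fires on the lookalike
print(match_against_hot_list(lookalike, 0.01))  # stricter threshold: no alert
```

The point of the sketch is that "match" is never a yes/no fact; it is a tunable probability, and every tuning choice trades false alarms against missed persons.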

A more complicated venture is that of historical data...much like current technology with vehicle license plate readers. Cameras scan car registration plates against hot lists - which include arrest warrants, stolen cars, Amber Alerts for kidnapped children, owners with suspended driver's licenses, and parking ticket violators.

Whether these cameras are mounted on police meter maids' cars, auto repossessors' tow trucks, red light signals, utility poles along the roadway, entrances to private parking garages...the data is saved for later use. This sharing of private and public data is a cooperative effort seen in few other places.

Investigators can query these databases to find criminals fleeing a crime scene. Or learn where a car frequents in order to find its owner (who might be the target of a criminal investigation...or a suicidal person in need of emergency help).

As a police detective, I've used this sort of technology to solve crimes and find people who did not want to be found. It's an amazing advancement in criminal justice technology. It catches bad guys. As a private citizen, I've been equally amazed at where my car has been photographed. It's scary. I felt somewhat violated when I queried my own car's license plate history.

Now imagine putting similar technology into police body worn cameras. Software that not only compares faces to hot lists, but also collects your face for future use...for when someone wants to know:
  • Where you are?
  • Where you've been?
  • When you were there?
  • Who you were with?
  • What you were wearing?
  • Whether you had a bad hair day?
  • What car you got into?
Then, beyond police body worn cameras, where else will we see this facial recognition software used:
  • Entrances to jewelry stores?
  • Banks?
  • Schools?
  • Government buildings?
  • Hospitals?
Sure, we can make a case for safety in each of the places above...just like the emotional sales pitch for finding lost children. But what about a music concert? An outdoor festival? A college speech? A public park?

How will we handle the release of video, editing pictures, curating databases, data storage, security of information, and false positive alerts?

The legal and ethical dilemmas are growing almost as quickly as technology grows.

As a policeman, I crave data and information and intelligence. It keeps society safe.

As a human, I appreciate my family's privacy and freedoms. It keeps me happy.

I'm convinced that American government is not ready to tackle the troubles associated with video, predictive policing, big data, facial recognition, or artificial intelligence.

What about the rest of humanity? Are you ready?

At any rate, we need to slow down.


Lou Hayes, Jr. is a police training unit supervisor in suburban Chicago. He studies human performance & decision-making, creativity, emotional intelligence, and adaptability. Follow Lou on Twitter at @LouHayesJr

