MARCANO: Don’t adopt facial recognition technology until error rates become equal across racial groups


Randal Quran Reid in Atlanta on March 13, 2023. Because of a bad facial recognition match and other hidden technology, Reid spent nearly a week in jail, falsely accused of stealing purses in a state he said he had never even visited. (Nicole Craine/The New York Times)

Lebanon may become the latest city in the area to adopt license plate readers as a tool to help police better serve the community.

Plate readers aren’t a bad thing. They help police spot vehicles tied to criminal suspects, missing persons, and Amber Alerts. Lebanon officials, in a story in this newspaper, appear to be working hard to balance security and privacy by taking steps to ensure the technology isn’t misused. Most departments don’t keep data longer than 30 days or tie the cameras to third-party systems that could map where cars travel.

Opponents will note that plate-reading technology is ripe for abuse, allowing police to track law-abiding citizens going to the grocery store, visiting parents, or taking children on a family jaunt.

There’s an easy answer to that concern. If an employee abuses the system, fire and prosecute them to the fullest extent possible.

But there is a bigger issue.

In that same story, the city said that adopting plate readers could be the first step toward using facial recognition technology.

That’s going too far for any community, and citizens should stand up and yell, “No.”

That technology, which we’ll call “FRT” for short, has several issues that make it unreliable — at least for now.

Two nonpartisan reports, by Harvard University and the Gender Shades Project, concluded that people with darker skin tones, especially women, suffer far higher FRT error rates. The results were so jarring that IBM and Microsoft, which built their own FRT systems, made technical changes to help account for bias. Then, in 2020, MIT reported that those two companies and Amazon, maker of the Rekognition system, either placed a moratorium on the technology or stopped selling it.

The technology is so flawed that Black and Asian men are misidentified up to 100 times more often than white men, according to a federal study conducted during the Trump administration.

Critics point to a number of reasons for the discrepancy, including poor lighting and picture quality. An MIT paper released in 2021 identified another problem: programmers train the algorithms mostly on white faces, which limits the range of skin shades and facial features the systems learn to recognize. That, the researchers said, builds a “demographic bias” into FRT.

Then there are stories like this one.

Detroit police arrested Robert Williams, a 43-year-old father, based on FRT that identified him as a suspected shoplifter. He was handcuffed at home, in front of his daughters, and held for 30 hours before the charges were dropped for “insufficient evidence.” The Detroit police chief criticized his department’s sloppy work and told Vice News that its FRT misidentifies suspects 96% of the time.

Williams has filed a civil lawsuit against Detroit police.

Similar misidentifications have occurred in New Orleans and New Jersey, and in California, where FRT matched 26 state lawmakers to criminal suspects. In Atlanta, Randal Quran Reid was arrested for stealing purses in Louisiana, a state he said he had never visited. The case collapsed, but not before he spent nearly a week in jail.

Local communities shouldn’t consider using facial recognition technology until error rates become equal across all racial groups. In the meantime, communities can develop common-sense rules governing FRT. For example, police departments should use FRT as one part of an investigation, not as the sole basis for an arrest. It’s easy for two different people to look somewhat alike, and it’s easy for even a trained eye to miss the nuances.

Just ask Robert Williams.

The photo police used to arrest Williams showed a man with a mole on his face. Williams doesn’t have a mole. No one noticed until after he was interrogated.

Ray Marcano’s column appears on these pages each Sunday. He can be reached at raymarcanoddn@gmail.com.
