Facial Recognition Technology and the Future of Policing

The replacement of systemic, interpersonal racism with hard-coded, digital racism is not the reform policing needs.
By Benjamin Verdi • Oct 21, 2020


The Black Lives Matter movement has moved millions of Americans to question their understandings of race, policing, justice and the true nature of history. It represents, perhaps, the most hopeful global development in this most tumultuous year. Encouragingly, some cities and states have already taken steps toward reforming or altogether eliminating their police departments, translating passion into action with unprecedented speed. This, too, has served to inspire, but for reasons both political and logistical, a national approach to police reform has yet to materialize.




Policing in America is, by definition, a state and local responsibility. But this has not stopped the federal government from weighing in on police matters or funding them in significant ways. New national-level reforms to law enforcement will present obvious challenges, but the area where they would make the greatest impact is the regulation of the technologies and tools available to police. Police have never been more powerful on this front, and particularly worrisome is the availability of invasive — and often malfunctioning — facial recognition software. Intelligent regulation of tools like these must navigate some problematic global developments regarding these technologies and, where appropriate, take cues from the industry responsible for their production.

Global Trends

Globally, there appear to be two trends developing regarding government regulation and use of facial recognition software. While both are underwhelming, they each underscore truths about the nature of these tools with which any better alternatives must also contend.   

First is the more immediately worrisome approach to facial recognition, which comes in the form of a familiar Faustian bargain: the exchange of privacy for safety. This approach involves a full embrace of facial recognition’s capabilities in the name of national security, now including essential and expansive COVID-19 contact tracing. While straightforward, this rationale for facial recognition is far too easy to exploit. For example, it is widely reported and understood that facial recognition software is indispensable to China’s ongoing efforts to target Uighur Muslims. Similarly, the use of facial recognition in India is fertile ground for privacy and civil liberties concerns that show no signs of abating.

The second approach appears to be, for lack of a better term, punting. It was best epitomized by the European Commission’s omission of any meaningful reference to facial recognition in its latest white paper on artificial intelligence. This will certainly be remembered as a moment the EU missed. While the bloc might still articulate a comprehensive regional approach to the use of such tools, it is clear that attempting to do so in the age of coronavirus was too much to ask.


While the EU may have punted such a decision into the future, the US federal government has seemingly punted it to local authorities. What has arisen instead of a coherent national strategy is a patchwork in which some states and cities ban certain uses of certain tools, while a swarm of other jurisdictions use them all with gusto, ban them completely or have no facial recognition policy at all.

Fortunately, global demonstrations against the overuse of police power have, for the time being, held further expansion of these tools at bay. On this front, the most meaningful reform has come from technology companies themselves. IBM suspended its development of facial recognition software, citing its problematic implications in the hands of law enforcement. Similarly, Amazon suspended the sale of its facial recognition platform, Rekognition, to law enforcement agencies for one year, reasoning that lawmakers would need that time to regulate facial recognition programs in a fair, transparent way. Microsoft and Google have also announced that they will not provide facial recognition software to law enforcement agencies for the time being.

Line of Business

When IBM, Amazon, Microsoft and Google all decide that a line of business is not worth the risk or the money, lawmakers need to pay attention. That said, concerned citizens must note that most of these encouraging company policies are designed to be temporary, and even those not billed as such can change as quickly as one can schedule a board meeting. Just as critically, these firms mainly cite hypothetical, improper uses of their software in the hands of others as the primary reason it is currently unfit for sale. One must squint harder to discern whether these firms believe facial recognition tools are problematic in and of themselves.

To miss this distinction is to miss the entire need for government action and the limits of corporate self-regulation as a national strategy. Though unmentioned in these tech titans’ official statements, the full technical context for why such tools are concerning in any user’s hands is neatly summarized in reports like a 2018 study from the MIT Media Lab and Microsoft Research that lays out specific ways in which facial recognition programs routinely misidentify non-white faces, particularly those of non-white women. These findings demonstrate that such technologies, even when used competently, will not make policing any smarter, less discriminatory or less capable of encroaching on people’s freedoms. The replacement of systemic, interpersonal racism with hard-coded, digital racism is no reform at all.
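The kind of disparity such audits measure can be made concrete with a short sketch. The snippet below is purely illustrative: the groups, counts and error rates are synthetic and are not taken from the MIT study. It simply shows the basic computation behind a per-group error-rate comparison, which is how researchers demonstrate that a system errs far more often on some faces than on others.

```python
# Illustrative sketch: measuring per-group error-rate disparity in a
# face-matching system. All data here is synthetic; real audits use
# benchmark image sets, not toy lists like the one below.

from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, predicted_match, actual_match) tuples.
    Returns {group: error_rate}, counting any misclassification as an error."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Synthetic predictions: the hypothetical system errs far more often
# on group "B" (35 of 100 wrong) than on group "A" (5 of 100 wrong).
records = (
    [("A", True, True)] * 95 + [("A", True, False)] * 5 +
    [("B", True, True)] * 65 + [("B", True, False)] * 35
)

rates = error_rates_by_group(records)
disparity = rates["B"] / rates["A"]  # roughly a 7x error-rate gap here
```

An aggregate accuracy figure for this toy system (80% overall) hides the gap entirely; only the per-group breakdown reveals it, which is why audits report error rates by demographic group rather than a single headline number.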

People need to know what tools are currently in use to police their communities. And they are, indeed, in use. In the fall of 2016, Georgetown Law’s Center on Privacy & Technology published a study claiming that 50% of American adults already had images of their faces captured in at least one police database. Additionally, as recently as 2019, the Department of Homeland Security announced plans to use facial recognition software to monitor 97% of air travelers. A lack of transparency and further delays to meaningful regulation only allow the programs these tools support to render themselves more essential to the day-to-day operations of law enforcement.

The effects of national incoherence on this critical issue are not neutral. Self-imposed bans, incongruous local decisions lacking clear federal guidance, and the steady march of more authoritarian uses of such technologies around the world are insufficient responses to the questions this moment of global reckoning has thrust upon the world’s police and security agencies.

*[The views expressed in this column are the author’s own and not those of his employer, Grant Thornton International Ltd. Fair Observer is a media partner of Young Professionals in Foreign Policy.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.
