Facial Recognition Technology Disputes Resolved

Facial recognition technology has steadily worked its way into everyday life, from things as basic and fun as tagging Facebook photos to things not so funny as being named a crime suspect. Placed side by side with other biometric systems such as fingerprint and iris recognition, facial recognition is notably less accurate. It is still widely employed, however, because it is convenient and contactless.

For years, facial recognition algorithms have been accused of bias, in particular of recognizing white faces more accurately than non-white faces. Recent instances of abuse by law enforcement have heightened interest among activists, scholars, state lawmakers, and members of Congress.

The wrongful arrest of Robert Williams, an African-American man from Detroit, has sparked new concerns about the extent to which these machines are making decisions. Mutale Nkonde, a fellow at Stanford University’s Digital Society Lab, said, ‘what is different at this moment is, we have explainability and people are beginning to realize the way these algorithms are used for decision-making.’

The algorithms are trained on data sets that exclude a wide array of people. Since these systems are only as good as the data they are trained on, building them around a single group of people produces a system that performs well for that group and poorly, often unpredictably, for any group it has rarely seen.
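To make the point concrete, here is a minimal, hypothetical Python sketch of the kind of audit researchers run: compute accuracy separately for each demographic group and compare the gap. The data and group labels below are invented for illustration and are not drawn from any real system or study.

from collections import defaultdict

# Hypothetical evaluation records: (demographic group, was the match correct?)
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals, correct = defaultdict(int), defaultdict(int)
for group, ok in results:
    totals[group] += 1
    correct[group] += ok  # True counts as 1, False as 0

for group in sorted(totals):
    print(f"{group}: {correct[group] / totals[group]:.0%} accuracy "
          f"({correct[group]}/{totals[group]})")

# A wide gap (here 75% vs. 25%) is the data-driven bias described above:
# the model does best on the group it saw most during training.

Audits like the 2018 study discussed below compare exactly these kinds of per-group error rates.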

Neutrality vs brutality

Poet of code and founder of the Algorithmic Justice League, Joy Buolamwini has been, in her own words, on a mission to stop an unseen force that is rising. In her fight to ensure the ethical use of technology, she has shown through research how facial recognition applications from tech giants IBM, Microsoft, Amazon, and China’s Megvii exhibit prejudice towards non-white people.

A 2018 study by MIT researchers, led by Buolamwini, revealed shortcomings in facial recognition technology and prompted reactions from Microsoft, IBM, and Amazon. While Microsoft, IBM, and Megvii pursued improvements, Amazon criticized Buolamwini’s research methods and dismissed her claims as erroneous and misleading.

Deuces

In June 2020, the big software companies finally put their feet down, limiting the use of facial recognition. The move followed the killing of George Floyd, a 46-year-old Black man in Minneapolis, Minnesota, who died after a police officer knelt on his neck for more than eight minutes.

IBM, in a letter to Congress, disclosed that it will no longer offer facial recognition services. The 109-year-old company has decided to exit the facial recognition business entirely, choosing instead to support Congress in ensuring justice and equality. Amazon followed two days later: the e-commerce giant said it had placed a one-year moratorium on police use of Rekognition, its facial recognition tool, though it will continue to offer the technology to rights organizations focused on finding missing children and fighting human trafficking.

Microsoft was next in line, saying it will not sell its facial recognition technology to police departments across the U.S. until proper legislation that protects human rights is put in place.

This notwithstanding, the race to get prejudice out of our machines is still on. Buolamwini insists that there must be choice in how these innovations are used and that the technologies must be developed with far greater oversight.

Other Schools of Thought

Not everyone agrees, however, that U.S. law enforcement agencies should be barred from using facial recognition technology until proper legislation is in place. Daniel Castro, Vice President at the Information Technology and Innovation Foundation (ITIF), says banning facial recognition makes little sense and will not advance efforts at police reform. He insists that what is needed instead is more testing and transparency, since accurate systems do exist. In Castro’s view, the bias problem should be solved by improving policy rather than uprooting the technology.
