Before we get started, we want to be as fair as possible: Google is not developing this technology as a digital extension of the critical race theory curriculum, and we are not claiming that it is. Even so, there are a number of concerns that need to be addressed.
The new system will “measure skin tone,” and not only for white people; it will also gauge how black or brown a person is. According to Google, the goal is a noble one: improving on the facial recognition software currently on the market.
Google also wants to know whether the current software is biased against people who are not white. The idea of a software program with inherent racial bias may sound silly on its face, but that does not mean the algorithms cannot be flawed. When they are, they can produce truly awful results.
If they are able to pull this off, we find ourselves wondering where we go from there. Will the product be used for applications focused on racial justice? Google did its best to describe the project to Reuters:
“Google told Reuters this week it is developing an alternative to the industry-standard method for classifying skin tones, which a growing chorus of technology researchers and dermatologists says is inadequate for assessing whether products are biased against people of color.
At issue is a six-color scale known as Fitzpatrick Skin Type (FST), which dermatologists have used since the 1970s. Tech companies now rely on it to categorize people and measure whether products such as facial recognition systems or smartwatch heart-rate sensors perform equally well across skin tones.
Critics say FST, which includes four categories for “white” skin and one apiece for “black” and “brown,” disregards diversity among people of color.”
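The lopsided breakdown the critics describe is easy to see when the scale is written out. The sketch below is purely illustrative: the category descriptions are paraphrased from the quote above, not taken from any official dermatological reference or vendor code.

```python
# Illustrative sketch only: the six Fitzpatrick Skin Type (FST) categories,
# grouped the way the critics quoted above describe them (four "white"
# categories, one "brown", one "black"). Descriptions are paraphrased.
FST_CATEGORIES = {
    1: "very light ('white')",
    2: "light ('white')",
    3: "light-medium ('white')",
    4: "medium ('white')",
    5: "brown",
    6: "black",
}

def categories_for(label: str) -> list[int]:
    """Return the FST types whose description mentions the given label."""
    return [t for t, desc in FST_CATEGORIES.items() if label in desc]

print(categories_for("white"))  # -> [1, 2, 3, 4]: four of the six types
print(categories_for("brown"))  # -> [5]: a single type
print(categories_for("black"))  # -> [6]: a single type
```

In other words, two-thirds of the scale's resolution is spent on lighter skin, which is exactly the imbalance the critics say makes it a poor yardstick for measuring bias against people of color.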
Facial recognition software is definitely rife with flaws, as we have seen in the past. Amazon’s Rekognition software has already been called out for being laughably terrible on racial benchmarks. The success rates for white males and Hispanic males were high enough.
Oddly, the software also struggled with white females, who were identified as males at least 7 percent of the time. Black females fared even worse, with a success rate of less than 50 percent. The results are a bit disturbing, but we can’t lie, they are also pretty funny.
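The kind of benchmark behind numbers like these is straightforward: tally correct and incorrect classifications per demographic group, then compare the resulting accuracies. A minimal sketch follows; the counts are invented for illustration and are not Amazon's actual test data.

```python
# Minimal sketch of a per-group bias benchmark. The counts below are
# made up for illustration; they are NOT the actual Rekognition results.
from collections import namedtuple

GroupResult = namedtuple("GroupResult", ["correct", "total"])

# Hypothetical classification outcomes for three demographic groups.
results = {
    "white male":   GroupResult(correct=990, total=1000),
    "white female": GroupResult(correct=930, total=1000),
    "black female": GroupResult(correct=690, total=1000),
}

def accuracy(r: GroupResult) -> float:
    """Fraction of subjects the system classified correctly."""
    return r.correct / r.total

# The disparity a fairness audit flags: the gap between the best- and
# worst-served groups.
accuracies = {group: accuracy(r) for group, r in results.items()}
gap = max(accuracies.values()) - min(accuracies.values())

for group, acc in accuracies.items():
    print(f"{group}: {acc:.1%}")
print(f"accuracy gap: {gap:.1%}")
```

Studies like the one that embarrassed Rekognition boil down to exactly this arithmetic: the headline finding is the size of that gap, not any single group's score.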
The ACLU tested the software by scanning images of California legislators and comparing them against a mugshot database. Over two dozen elected officials were falsely matched as criminals. Emotionless software should not struggle this badly. The Fitzpatrick Skin Type scale is said to be the culprit, but we are not so sure. The problem may be fixable, but we imagine it would take a sizable amount of work.
Google’s new solution just might make its way into our nation’s ongoing racial justice debate. Whether that is a good idea is hard to say until we have seen the software in action. Our skepticism stems from the number of people in the world who would be described as racially ambiguous. With so many mixed-race people in the world, how can any piece of software properly capture that?
This is just the latest scheme from the far left to keep the country divided along racial lines, with the Democrats and the mainstream media at the head of the list. Computer algorithms should not be making these types of decisions, regardless of which side of the aisle you find yourself on. Gone are the days when we could judge people by the content of their character, we suppose.