Big Tech Is Now Scanning Your Phones for Illegal Content

This is one of those things that seems inarguable when you first come across it, but take a closer look and it's not what it seems to be. That's how the liberals roll. Apple announced that its users' phones are going to be scanned for child pornography and evidence of child sexual abuse, and anyone found to have engaged in such crimes will be turned over to the authorities.

Of course, this is where most people stop reading. It seems simple enough: why shouldn't criminals this heinous be brought to justice? But the concerns being raised are easy to justify once you stop to examine what is actually taking place.

For starters, who is going to be responsible for determining what illegal activity is taking place? Will they have any experience in law enforcement? Apple's answer is disconcerting: an artificial intelligence algorithm makes the initial call, a decision that will affect people's futures (and freedom), before a human ever gets involved.

If you think that encrypted messages are safe, nothing could be further from the truth. Those messages are also going to be scanned, so best of luck. NPR has more about this plan, which is sure to be controversial as the rest of America slowly learns the awful truth:

“Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool designed to detect known images of child sexual abuse, called “neuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.”
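To make the reported flow a little more concrete, here is a rough sketch in Python of how that kind of pipeline generally works. Every name in it is hypothetical (the fingerprint function, the KNOWN_HASHES set, check_before_upload); this is not Apple's code, and it uses an ordinary cryptographic hash as a stand-in for the perceptual “neuralMatch” hash, which is reportedly designed to also catch re-encoded or lightly edited copies.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of known-image fingerprints
# (the real system reportedly matches against hashes tied to the
# National Center for Missing and Exploited Children's catalog).
KNOWN_HASHES = {
    "placeholder-hash-value",  # not a real entry
}

def fingerprint(image_path: Path) -> str:
    """Stand-in fingerprint: a cryptographic hash of the file bytes.
    The reported system uses a perceptual hash instead, so that
    resized or re-saved copies still match; this sketch does not."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def check_before_upload(image_path: Path) -> str:
    """Rough shape of the reported flow: the algorithm only flags;
    a human reviewer is supposed to confirm before the account is
    disabled and the match is reported."""
    if fingerprint(image_path) in KNOWN_HASHES:
        return "queued for human review"
    return "uploaded normally"
```

The point of the sketch is simply that the algorithm does the flagging on its own, and everything downstream depends on what ends up in that database of known hashes.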

Digital privacy advocates are already pointing out the most obvious concern: what happens to people who have innocent photos on their phones that could be misread as something worse? For example, plenty of parents are sure to be worried because they have pictures of their children in the bath on their phones.

Apple claims that this is not going to be a problem because the system only looks for images that match the National Center for Missing and Exploited Children database. That sounds imperfect on two counts. First, children who are currently being abused, and whose images have yet to be discovered and catalogued, will not be helped at all. Second, if the matches are supposedly that exact, why would a human need to review the image any further?

This is certainly going to be a wake-up call for those who believe that they are safe from all forms of digital surveillance because they are using encrypted messaging apps. Apple can claim that they are only doing this in service of a noble goal, but that hardly settles it. If they can do this now, what is stopping them from doing it all of the time?

NPR also spoke with a cryptography researcher from Johns Hopkins University, who raised another possibility that has yet to be fully considered: researchers there tested systems like this by sending innocent images to one another, and those images were still flagged as matches. Apple claims that this cannot and will not happen, but you will have to excuse everyone for being skeptical on this one.
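For anyone wondering how an "innocent" image could be flagged at all, the weak point is that perceptual hashes deliberately throw away detail so that edited copies still match, and that is exactly what makes collisions possible. The toy "average hash" below is a deliberately crude stand-in, not the algorithm Apple uses, but it shows the failure mode: two clearly different images landing on the same hash.

```python
# Toy "average hash": one bit per pixel, set when the pixel is brighter
# than the image's mean. Real perceptual hashes are more elaborate, but
# the failure mode is the same: very different images can share a hash.

def average_hash(pixels: list[list[int]]) -> str:
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# Two different 2x2 "images" that nevertheless collide, because only
# each pixel's relationship to the mean brightness is captured.
image_a = [[200, 10], [10, 200]]
image_b = [[255, 90], [90, 255]]

print(average_hash(image_a))  # "1001"
print(average_hash(image_b))  # "1001" -- same hash, different image
```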

There’s no real way to stop this, either. Apple is not beholden to the American government; it is a corporation more than able to call its own shots. Since there is not much we can do at this point, all we can say is this: Arizona Democratic State Senator Tony Navarrete should have been caught that much sooner!