Among the notable announcements tech giant Apple made recently is one aimed at protecting children.
We completely understand and support cracking down on child trafficking, sexual abuse, and more. However, where we start to get wary is when we learn about the scope of how Apple is going about this.
Apple has laid out plans to implement this across its operating systems, from phones to computers.
On one hand, the program adds a parental-control option to Messages, hiding sexually explicit images for users under the age of 18 and alerting parents if a child under the age of 12 views or sends such images.
On the other hand, Apple will scan incoming messages for this content and report it to the necessary parties. The last new function scans iCloud Photos for child sexual abuse material (CSAM) and reports matches to Apple reviewers, who can then forward them to the National Center for Missing and Exploited Children (NCMEC). Apple claims this feature was designed specifically to preserve users' privacy while searching for unlawful content.
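To make the mechanism concrete: Apple says photos are matched against a database of hashes of known CSAM rather than being "looked at" directly. Here is a heavily simplified sketch of that hash-list matching idea. (This uses a plain cryptographic hash for illustration; Apple's actual system uses a perceptual hash it calls NeuralHash plus cryptographic blinding, which this example does not implement, and the image bytes below are hypothetical placeholders.)

```python
import hashlib

# Hypothetical database of hashes of known flagged images.
# In the real system, the device never sees the raw hash list.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def scan(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(scan(b"known-flagged-image-bytes"))  # True: matches the database
print(scan(b"your-vacation-photo"))        # False: no match
```

Notice what this design implies: the scanner only ever answers "is this on the list?" It never needs to understand a photo, but whoever controls the list controls what gets flagged, and that is exactly the concern raised below.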
And this is where it gets scary.
Because who is to say this won't be used for other, more "big brother" purposes in the future? One day it's these photos; the next it's targeting people flagged for unpaid parking tickets, or worse. If they can stop a photo in its tracks, are texts next?
Right now this stops at Apple, but how can we be certain that government involvement isn't a possibility down the line? And if you read our blog regularly, you already know the question: are tech giants like Apple, Google, or Amazon really acting in our best interest?
Thankfully, there are ways to protect yourself against possible future crackdowns.