Apple started out saying that they can't decrypt data from anyone's phone. They fought the FBI in court over unlocking the San Bernardino shooter's iPhone. This is one of the reasons why I went all-in on Apple: they were willing to fight the government over our privacy.
Now, years later, they don't end-to-end encrypt iCloud backups because the FBI asked them not to.
Google used to have humans review all copyright claims on YouTube. Fast forward a few years, and claims are approved automatically, and the content creator has to prove they didn't infringe. Look at the copyright claims filed over white noise. Google doesn't even care anymore; they just let the claims go through and hit creators with no review at all.
To believe that Apple employees will still be reviewing CSAM-flagged photos five years from now is so naive it's actually funny. You can bet they're working on AI in Cupertino right now to handle it for them.
And then it will be random chance whether we get flagged and labelled as pedophiles, or a government will tag us as "problematic" and demand access to our phones because we are journalists and it wants to see our anonymous sources.
It's naive to think that it won't go in this direction.
If you're a pedophile, after this announcement you will delete all the photos off your iPhone and never use it again. After the first few rounds of arrests, it will be well known within the pedophile community not to use iPhones. And then the only people getting their photos scanned will be innocent ones. So the entire feature doesn't make sense at all. It's a ruse.