
To be fair - I think reasonable people can disagree on this.

I don't think it's a rationalization to point out that it only occurs when the same baseline conditions are met (using the cloud). I think those constraints/specifics matter. I wouldn't be in favor of the policy if they were different (and I'm not even sure I'm in favor of it now).

My preferred outcome would be E2EE by default for everything, without any of this, but I also understand NCMEC's concerns, the general tradeoffs and laws around this stuff (and the future regulatory risk around CSAM), and just the general goal of reducing child sexual abuse.



I largely agree with your point fossuser.

I am also in favour of E2EE by default for everything, without any device- or cloud-based scanning. However, Apple doesn't want to be seen as having built a service that enables child exploitation. Doing nothing may invite even more invasive requirements legally forced on them by governments, so Apple is stuck with a dilemma. Also, let's not forget that Apple genuinely should not want child exploitation to occur, and therefore should do something.

The question I have for drenvuk is: how else can Apple prevent or detect child exploitation and the storage or distribution of such content on Apple's services?


They can rely on users reporting it. They are not obligated by law to do this.


For now, but if they don't do something like this, laws can change, and that change could make things a lot worse for privacy than this implementation.

I suspect they’re trying to get ahead of that and solve this in the most privacy protecting way possible.



