To be fair - I think reasonable people can disagree on this.
I don't think it's a rationalization to point out that it only occurs when the same baseline conditions are met (using the cloud). I think those constraints/specifics matter. I wouldn't be in favor of the policy if they were different (and I'm not even sure I'm in favor of it now).
My personally preferred outcome would be e2ee by default for everything without any of this, but I also understand the concerns of NCMEC and the general tradeoffs/laws around this stuff (and future regulatory risk of CSAM) - and just the general issue of reducing child sexual abuse.
I am also in favour of E2E by default for everything, without any device- or cloud-based scanning. However, Apple doesn't want to be caught having developed a service that enables child exploitation. Doing nothing may invite even more invasive requirements legally forced by government, so Apple is stuck with a dilemma. Also, let's not forget that Apple itself should not want child exploitation to occur, and therefore should do something.
The question I have for drenvuk is: how else is Apple able to prevent or detect child exploitation and the storage or distribution of such content on Apple's services?