The difference is that Apple has built the infrastructure for _global_, continuous image scanning. There's no need to hack individual phones or even know who your targets are; all you have to do is get some images added to the list. If you think the Chinese (or, just as likely, American) government wouldn't love to find out everyone who has copies of certain images, say from protests or from inside camps, you're far too optimistic.
This is Apple creating a mechanism for worldwide surveillance and declaring they'd never compromise it because they're good people whom we can trust. Note that they're not making any actual promises that might land them in court if the system is abused; they're just repeating that they're good people who'd never cooperate with authoritarian regimes.
- Order Apple to scan all images, not just ones being uploaded to iCloud
- Order Apple to lower the reporting threshold or remove the safety voucher system, unless you only care about users with many matching images
- Compromise or order Apple to skip the human review and pass the reports directly to your government agency
Really, you have to compromise or order Apple to change every step in the pipeline except "compare images to hashes", which is the part an intern could build in a weekend.
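To show just how little is left once every other step is stripped away, here's a minimal sketch of "compare images to hashes with a reporting threshold". All names here are hypothetical; this is an ordinary cryptographic hash for illustration, not Apple's actual NeuralHash perceptual hashing or its private set intersection protocol:

```python
# Simplified sketch of threshold-based hash matching -- NOT Apple's actual
# NeuralHash/PSI system, just an illustration of the trivial core step.

import hashlib

# Hypothetical blocklist of image hashes. Who controls this list is the
# whole point of the argument above.
BLOCKLIST = {hashlib.sha256(b"forbidden-image-bytes").hexdigest()}

# Hypothetical threshold: only users with at least this many matches
# get reported for human review.
REPORT_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real system must tolerate
    re-encoding and resizing, which SHA-256 does not."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_report(images: list[bytes]) -> bool:
    """Return True if the match count crosses the reporting threshold."""
    matches = sum(1 for img in images if image_hash(img) in BLOCKLIST)
    return matches >= REPORT_THRESHOLD


# A library with 30 copies of a blocklisted image trips the threshold;
# a library of innocuous photos does not.
print(should_report([b"forbidden-image-bytes"] * 30))  # True
print(should_report([b"cat-photo", b"dog-photo"]))     # False
```

Everything that makes the real system defensible, on-device matching scope, the voucher threshold, and human review, sits outside this trivial loop, which is exactly why those are the steps a government would target.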
I'm sure governments would love to know which users have "undesirable" images on their devices. I'm just saying that instead of going through the steps above to repurpose the CSAM system, they could simply order Apple to scan for whatever images they want. Apple's choice is the same either way: comply, or risk its access to that market.