
I don't love it but at least it's not that disastrous on-device "CSAM" scanning they were planning to roll out with this. That must have been one of Apple's biggest blunders in history. Apple announcing they would scan our phones for government banned content. Pants on head crazy.


> Apple announcing they would scan our phones for government banned content

Until you realise Apple has been scanning your iCloud photos for government banned content for years.

And that the on device option was better for your privacy not worse.


>>And that the on device option was better for your privacy not worse.

No it wasn't. Previously pictures weren't scanned directly on my phone; then they were about to be. That was a downgrade of privacy, not an improvement. The fact that it was only going to be applied to photos due to be uploaded to iCloud is irrelevant - you are still using my own device to scan my own photos, and if it fails some completely opaque check that I have no control over, you will report me to law enforcement (who will then do god knows what). Not to mention that during the analysis process my pictures will be shown directly to a human in a centre somewhere, who will judge whether they are legal or not. Again, I'm really struggling to see how this is an improvement in privacy.

>>Until you realise Apple has been scanning your iCloud photos for government banned content for years.

Source? Apple has maintained for years that they don't scan pictures already uploaded to iCloud, since they are encrypted and Apple doesn't have access to them.


Jane Horvath, Apple’s chief privacy officer, said at a tech conference that the company uses screening technology to look for the illegal images. The company says it disables accounts if Apple finds evidence of child exploitation material, although it does not specify how it discovers it.

https://www.telegraph.co.uk/technology/2020/01/08/apple-scan...

And of course the whole issue with backups not being encrypted.


Don't have time to look for a source, but iCloud Photos are encrypted with keys that Apple does have access to. They hand them over for law enforcement.


Right, but they don't actively scan everything uploaded to iCloud, is my understanding - they will decrypt your iCloud storage when asked, but they don't scan the contents by default. At least that's how I understand it.


>Previously pictures weren't scanned directly on my phone, then they were about to be. That was a downgrade of privacy, not an improvement.

Absolutely not true.

When you scan on the server, like Google does, that information is open to abuse by anyone who issues a warrant, and I wouldn't be willing to bet that the warrant is a hard requirement.

>Google’s Nest Will Provide Data to Police Without a Warrant

https://petapixel.com/2022/07/27/googles-nest-will-provide-d...


>>Absolutely not true.

Which part isn't true? Pictures aren't scanned directly on the phone anywhere, that's true of both Android and iOS. Apple's implementation would change that, again, making their privacy implementation worse not better.

>>When you scan on the server, like Google does

I don't see how that's relevant. Apple wasn't scanning on their servers (at least that's what they say publicly; I don't doubt that the NSA has access to the data anyway).


Apple designed their system so that they don't have access to the scan data or the scan results.

Google is scanning everything in your Google Drive on server, and that data can be accessed and abused by anyone willing to issue a warrant.

Google's system is worse for privacy.


>>Apple designed their system so that they don't have access to the scan data or the scan results.

That's explicitly not true - your phone would scan the photos; if a photo failed the scan, it would be uploaded to their verification centre, where a human would look at the pictures in person and decide whether to forward them to law enforcement or not. We can argue whether the "verification centre" counts as Apple having access to scan data/results, but I feel like that's splitting hairs. It's a downgrade for the privacy that you used to have with Apple, not an improvement.

>>Google's system is worse for privacy.

I have zero idea why you keep bringing Google up. I'm saying that Apple's proposed system is a downgrade of their privacy implementation, not an improvement. What Google does or does not do is not relevant to that argument.


> That's explicitly not true - your phone would scan the photos, if it failed the scan, then the photo would be uploaded to their verification centre where a human would look at the pictures in person and decide to forward them to law enforcement or not.

You seem to be discussing the system Apple never implemented.

I am discussing the system that has just been announced.

Yes, in the system just announced, it's an opt-in parental control where Apple has no access to the images being scanned and doesn't know the scan results, while Google is busily scanning all the contents of everything in your Google account.

How do Google customers opt out of that?


>>You seem to be discussing the system Apple never implemented.

What gave it away? The fact that I originally replied to the comment which was discussing that never released CSAM scanning system?

>>I am discussing the system that has just been announced.

So as I suspected, you are arguing against an argument I have never made.

>>How do Google customers opt out of that?

How do I opt out of you asking about Google if literally no part of my argument was about Google?


Every major cloud provider scans their hosting for CSAM automatically. It's the cost of doing business. Either you do it or the feds do it for you. The automated way is a lot cheaper and easier.

If Apple could do it on-device reliably, they could encrypt their storage and there would be no "think of the children" -reason to allow government agencies access to the files anymore.
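The server-side scanning described above is generally hash matching against a database of known images. The production systems (Microsoft's PhotoDNA, Apple's NeuralHash) are proprietary perceptual hashes, so as a rough illustration only, here is a toy average-hash matcher in Python; the hash size, threshold, and function names are all hypothetical:

```python
def average_hash(pixels):
    """Toy perceptual hash. `pixels` is an 8x8 grid of grayscale values.
    Each bit records whether a pixel is brighter than the grid's mean,
    so small brightness or compression changes usually leave most bits
    unchanged (unlike a cryptographic hash, which would flip completely)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    h = 0
    for p in flat:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(img_hash, blocklist, max_distance=5):
    """Flag an image whose hash is within max_distance bits of any
    known-bad hash. The threshold here is illustrative only."""
    return any(hamming(img_hash, bad) <= max_distance for bad in blocklist)
```

The design point is that matching happens on hashes, not plaintext images, which is why moving the match client-side can (in principle) coexist with encrypting the uploaded photos themselves.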


Again, as I said in my other comments - Apple has explicitly said many times that they don't scan your iCloud photos by default, because they are encrypted already. They will hand your encryption keys to law enforcement when presented with a warrant, but by default nothing is scanned.


Apple can scan whatever it likes on their servers. That is worlds away from scanning content on my phone. How on earth can you claim that is better for privacy? It's the exact opposite.


Their proposed system was to scan stuff on your phone as part of the upload-to-iCloud process. I.e. only stuff that was about to be on their servers (where it could be scanned) anyway would get scanned, and you could thus completely opt out by turning off their iCloud Photo Library stuff.
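The opt-out property described above comes down to a simple gate: the scanner only ever sees photos queued for upload. A minimal sketch, assuming a hypothetical `photos_to_scan` helper and field names that are purely illustrative:

```python
def photos_to_scan(photos, icloud_photos_enabled):
    """Hypothetical gating logic: nothing is ever scanned unless it is
    queued for upload to iCloud Photos, so disabling the feature is a
    complete opt-out of on-device scanning."""
    if not icloud_photos_enabled:
        return []
    return [p for p in photos if p["queued_for_upload"]]
```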

To me this winds up feeling totally fine as a trade-off, particularly if it wound up paving the way for actual end-to-end encryption of the photos, though I see that it bothers some people.


If Apple announced end-to-end encryption, you might have a point. But they didn't, they still haven't, and there are zero indications they will ever offer end-to-end encryption for photos and videos. Given that, this would have been a clear step backwards for privacy, and it was Apple leading the charge this time.


Oh, I totally agree that the end-to-end part is speculation and tea-leaves reading. If it happened it'd take this approach from completely neutral to a net positive. (Again, in my eyes.)


Didn't we have yet another thread on Google scanning all the files in your Google Drive just the other day?


What does Google have to do with any of this? Isn't Apple supposed to be better at privacy than Google?


What does Google have to do with scanning all a customer's data? It's something they have already been doing for years.

For instance, this article from 2014:

>Google scans everyone's email for child porn, and it just got a man arrested

https://www.theverge.com/2014/8/5/5970141/how-google-scans-y...



