Hacker News

I think the "confusion" was 100% intentional. That the two features (iMessage scanning & on-device spying pre-upload to iCloud) were intentionally released at the same time to make the whole thing harder to criticize in a soundbite.

Confusion is the best-case scenario for Apple because people will tune it out. If they had released just the on-device spying, public outcry and backlash would have been laser targeted on a single issue.



Fanatics also have a tendency to try to latch onto whatever details may offer a respite from the narrative. The core problem here is that Apple is effectively putting code designed to inform the government of criminal activity on the device. It’s a bad precedent.

Apple gave its legendary fan base a fair few facts to latch onto. The first is that it’s a measure against child abuse, which can be used to equate detractors with pedophile apologists, or simply with pedophiles (these days, more likely the latter). Thankfully this seems cliché enough not to have become a dominant take. Then there’s the fact that, right now, it only runs in situations where the data would currently be unencrypted anyway. This is extremely interesting, because if they start using E2EE for these things in the future it will be uncharted territory; but what they’re doing now is merely lining up the capability to do that, not actually doing it. Not to mention, features like this have a tendency to expand in scope over the long term. I wouldn’t call it a slippery slope; it’s more like an Overton window of how much surveillance state people are OK with. I’d say Americans on the whole are actually pretty strongly averse to this, despite everything, and it seems like this was too creepy for many people. Then there’s definitely the confusion, because of course Apple isn’t doing anything wrong; everyone is just confused about what these features do and their long-term implications.

Here’s where I think it backfired: because it runs on the device, psychologically it feels like the phone doesn’t trust you. And because of that, using anti-CSAM measures as a starting point was a terrible misfire, because to users it feels like the phone is constantly assuming they could be a pedophile who needs to be monitored. It feels much more impersonal when a cloud service does it off in the distance for all content.

In practice, the current short-term outcome matters less than the precedent of what can be done with features like this. And it feels like pure hypocrisy coming from a company whose CEO once claimed they couldn’t build surveillance features into their phones because of the pressure there would be to abuse them. That was only around 5 years ago. Did something change?

I feel like it is really important to Apple that their employees and fans believe it is a genuinely principled company that makes tough decisions with disregard for “haters” and luddites. In reality, though, I think it’s only fair to recognize that this is too idealistic. Between this, the situation with iCloud in China, and how it all sits alongside their fight with the U.S. government, one can only conclude that Apple is, after all, just another company, albeit one whose direction and public relations resonated with a lot of consumers.

A PR misfire from Apple of this size is rare, but I think what it means for Apple is big, as it shakes the faith of even some of the company’s most loyal. For Google, this kind of misfire would’ve just been another Tuesday. And I gotta say, between this and Safari, I’m definitely not planning on my next phone coming from Cupertino.


> I’d say Americans on the whole are actually pretty strongly averse to this, despite everything, and it seems like this was too creepy for many people.

You mean the country that doesn't give a damn about privacy at all because all those fancy corporations give them toys to play with? You know, the companies that feed on the world's population's data as a business model. The country where people have a camera on their front door filming their neighbourhood 24/7? The country with listening devices all over their homes in useless gadgets?

You have to be joking, or the scale you're applying here is useless.

This whole thing will blow over fast and there won't be much damage on the sales side. Apple is the luxury brand. People don't buy it for privacy. Most customers probably won't even understand the problem here.

The only thing we might be rid of is the singing of Apple's praises in technical circles.


> Apple is the luxury brand. People don't buy it for privacy.

Privacy is the main selling point Apple is pushing in their current PR campaigns. They've been slowly building up a brand around privacy with new privacy features.

They've just sunk that entire brand/campaign. Instead of "iPhone, the phone that keeps all your data private", it's "iPhone, the phone that looks through your pictures and actively rats you out to police to ruin your life".


Maybe in the future people will realize that it's a mistake to market your company as being unequivocally on the side of privacy. Nothing was ever truly private once cloud companies were forced to comply with the government to some extent or face extinction through legal pressure.

It isn't like the ultimate goal of protecting children isn't worth fighting for, and the ICMEC considers half the countries in the world having no laws against CSAM to be "simply unacceptable." But companies that insist that everything they host can remain private to everyone are lying to their users, and will have to align their marketing claims with the reality of the law, or this kind of backlash will result.

But in general, there are a lot of other descriptors besides "private" that are nothing more than baseless Corporate Memphis copy.


The reason they pushed privacy was the media attention Android's bad privacy record was getting. Please don't tell me you believe privacy was on the average consumer's mind when they bought their devices... that's ridiculous, or you don't meet many normal users. It's marketing. They'll find something new. You can put anything in front of a white background...


From https://www.sellcell.com/blog/apple-privacy-survey/

>A majority (72%) of iPhone & iPad users are aware of new privacy changes in recent software updates. When asked how well they understand Apple’s new privacy policies, these were the responses: Extremely well (13%), Very well (29%), Moderately well (21%), Slightly well (9%), and Not well at all (28%). Two in three (65%) users are “extremely” or “very” concerned about their activities being tracked as they use certain websites and apps, while only 14% said they were not at all concerned.

And from https://www.androidauthority.com/android-app-tracking-transp...

>You told us: You really want an Apple-like anti-app tracking feature on Android... Over 30,000 people voted in favor of an App Tracking Transparency feature on Android.

People are becoming extremely conscious of online (and on-phone) privacy issues. Where have you been?

Your statement "don't tell me you believe privacy was on the average consumer's mind when they bought their devices... that's ridiculous, or you don't meet many normal users." is itself ridiculous.


Besides the obvious and continued popularity of apps with bad privacy reputations, those surveys were all conducted after the campaign launched. All that shows is that the advertising message has been delivered. Nothing else.


So, the marketing is working and people are increasingly aware of privacy issues? In that case, would you say this statement:

>don't tell me you believe privacy was on the average consumer's mind when they bought their devices... that's ridiculous, or you don't meet many normal users.

Is true or false? Because you seem to be contradicting yourself. Are people aware of privacy or not? Is the marketing working or not?

I suspect you were trying to say that people weren't aware previously. Is that correct? Because I don't think anyone would disagree with that.


You just quote some survey stats. They are bullshit. What people want and what they say they want are two entirely different things.

Small example: I run a small app. The number of GDPR-related requests is zero. The number of emails like "can my deleted account and deleted data still be recovered?" is like 1 per month or so.

WhatsApp is also a good counter example. People want privacy, but what they want more is utility and network effects. Most people I know didn’t abandon WhatsApp despite numerous privacy mishaps and I think the same will happen here with Apple. This will blow over – unfortunately.


People don't care about privacy, got it. No one is quitting Facebook, no one is de-Googling their lives, installing privacy extensions/apps and making purchases based on privacy issues.

The world's largest online companies, which are increasingly making privacy a priority in their marketing, are all wrong. Got it.

>Most people I know didn’t abandon WhatsApp despite numerous privacy mishaps

I never said privacy was the top issue, trumping all else. You're absolutely correct that other issues like convenience and network effects are important.


I bought my first iPhone this year, and privacy was the reason.


Congratulations.

How did that work out for you?


Can't say I'm pleased right now. Apple can still change course.


Could we not pretend, please, that the US is the only country with a lot of pervasive surveillance. Because that's clearly laughable.


Could we not build straw men, please.

I never did that.

Americans were the topic here. See quote.


> The core problem here is that Apple is effectively putting code designed to inform the government of criminal activity on the device. It’s a bad precedent.

This is wildly disingenuous.

Apple is putting code on the device which generates a hash, compares hashes, and creates a token out of that comparison. That is 100% of what happens on the device.

Once the images and tokens are uploaded to iCloud Photos, if 30+ of those safety tokens show a match, Apple's review team is alerted and gets access to only those matching photos. They manually review those photos, and if they then discover that you are indeed hoarding known child pornography, they report you to the authorities.

Thus, it would be more accurate to say that Apple is putting code on your device which can detect known child pornographic images.
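The flow described above can be sketched in a few lines. This is a toy illustration only, with invented names: Apple's real system uses a perceptual NeuralHash, private set intersection, and threshold secret sharing, so neither the device nor the server learns individual match results below the threshold. The plain SHA-256 lookup and match counter here are crude stand-ins for that cryptography.

```python
import hashlib

MATCH_THRESHOLD = 30  # number of matches before human review is triggered

# Stand-in for the known-CSAM hash database (the real one holds blinded
# perceptual hashes, not SHA-256 digests).
known_hashes = {hashlib.sha256(b"known-image-%d" % i).hexdigest() for i in range(100)}

def device_generate_token(image_bytes: bytes) -> dict:
    """On-device step: hash the image and attach a match token.
    (In the real system the device cannot see the match result; it is
    hidden inside the safety voucher. Shown in the clear here.)"""
    h = hashlib.sha256(image_bytes).hexdigest()
    return {"hash": h, "match": h in known_hashes}

def server_review_triggered(tokens: list) -> bool:
    """Server step: count matching tokens; only at 30+ does review begin."""
    matches = sum(1 for t in tokens if t["match"])
    return matches >= MATCH_THRESHOLD

# A library of 40 uploaded photos, 31 of which are on the blocklist.
library = [b"known-image-%d" % i for i in range(31)] + [b"holiday-%d" % i for i in range(9)]
tokens = [device_generate_token(img) for img in library]
print(server_review_triggered(tokens))  # True: 31 matches >= 30
```

The point of the threshold design is visible even in this sketch: 29 matching photos trigger nothing, while the 30th flips the whole account into review.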

> And it feels like pure hypocrisy coming from a company whose CEO once claimed they couldn’t build surveillance features into their phones because of pressures for it to be abused.

This isn't a surveillance feature. If you don't like it, disable iCloud Photos. Yes, it could theoretically be abused if Apple went to the dark side, but we'll have to see what this 'auditability' that he was talking about is all about.

Honestly, with all of the hoops that Apple has jumped through to promote privacy, and to call out people who are violating privacy, it feels as though we should give Apple the benefit of the doubt at least until we have all the facts. At the moment, we have very few of the facts.


Describing the implementation details does nothing to change the reality that the device is acting as an informant against its owner. The number of hoops literally changes nothing. Adding an AI model versus using SHA sums changes nothing. Adding some convoluted cryptography system to implement some additional policy changes nothing. In trivial cases like anti-piracy measures or anti-cheat in games, we tolerate that the device will sometimes act against our best interest, but at least in those cases the stakes are low and the intentions are transparent.

We have every fact we need to know that this shouldn’t be done, and I’m glad that privacy orgs like the EFF have already spoken at length to this effect.


> Yes, it could theoretically be abused if Apple went to the dark side, but we'll have to see what this 'auditability' that he was talking about is all about.

Or we can just short circuit the entire issue by deciding firmly we don't want this and punish Apple's behaviour accordingly. Which is what appears to be happening.

> it feels as though we should give Apple the benefit of the doubt

It really doesn't feel like this to me at all. Users have clearly stated: we don't want this. It's time for Apple to simply pull it all back and apologize.


> Once the images and tokens are uploaded to iCloud photos, iCloud will alert if 30+ of those security tokens show a match, it will alert Apple's team, and they will get access to only those 30+ photos. They will manually review those photos, and if they then discover that you are indeed hoarding known child pornography then they report you to the authorities.

Apple: We have 29 matches on your device of "known CSAM"[0]. However, despite our confidence level being very high, we won't report it to the authorities because we value your privacy!

[0] for varying definitions of 'known CSAM'.


> This isn't a surveillance feature.

> Thus, it would be more accurate to say that apple is putting on your device code which can detect known child pornographic images

> If you don't like it, disable iCloud Photos.

> Yes, it could theoretically be abused if Apple went to the dark side [...]

> [...] it feels as though we should give Apple the benefit of the doubt at least until we have all the facts.

No, nobody gets "the benefit of the doubt". The very use of that phrase admits that you are being put into a situation where you could be screwed in the future.

There is zero transparency or oversight into the code that does the scanning, the in-person review process, or the database of images being scanned for.


They created a tool that, in principle, lets a government ask about certain hash matches that are on the iPhone but not necessarily on iCloud, correct?

There is no way to determine whether the hashes are about CP or about HK protests.


> Yes, it could theoretically be abused if Apple went to the dark side, but...

> ...we should give Apple the benefit of the doubt...

You have to take off your Apple-branded rose-tinted glasses, my friend.

Any company as big as apple needs to be scrutinized as harshly and critically as possible.

Their influence on the world is so big that a botched roll out of this sort of tech could be absolutely devastating for so many people, for so many reasons.

I don't care if it's hashed tokens or carrier pigeons. We should only allow companies to act in ways that improve our lives. Full stop.


With enough eyeballs, all disinformation/bugs are shallow.


Do you have a source on the iMessage thing? I don’t remember seeing anything about iMessage but maybe I failed to adequately read the press release.


It's a feature that only applies to kids under 18 who're in a family group, whose parents turn it on. It warns the kid before letting them see an image which machine-learning thinks is nudity. If the kid is 12 or under, their parents can be notified if they choose to see it. It apparently does no reporting to anyone apart from that parental notification.

Check the section "WHAT IS APPLE DOING WITH MESSAGES?" in this article: https://www.theverge.com/2021/8/10/22613225/apple-csam-scann...


I shudder to think of the thousands of gay children this will out to their unaccepting parents, some resulting in physical, verbal or emotional abuse -- or worse.

This feature is a catastrophe.


The thousands of gay children under the age of 13, whose parents have opted into this program, who are sending or receiving pornographic images via iMessage and have chosen when prompted to notify their parents of that?

I can’t imagine that’s a big issue…



