Private Cloud Compute Security Guide

(security.apple.com)

Comments

lukev 6 November 2024
There's something missing from this discussion.

What really matters isn't how secure this is on an absolute scale, or how much one can trust Apple.

Rather, we should weigh this against what other cloud providers offer.

The status quo for every other provider is: "this data is just lying around on our servers. The only thing preventing an employee from accessing it is that it would be a violation of policy (and might be caught in an internal audit)." Most providers also carve out several cases where they can look at your data, for support, debugging, or analytics purposes.

So even though the punchline of "you still need to trust Apple" is technically true, this is qualitatively different, because what would need to occur for Apple to break their promises here is so much more drastic. For other services to leak your data, all it takes is one employee doing something they shouldn't. For Apple, it would require a deliberate compromise of the entire stack, down to the hardware level.

That is much harder to pull off and much harder to hide, and therefore Apple's security posture is qualitatively better than Google's, Meta's, or Microsoft's.

If you want to keep your data local and trust no one, sure, fine: then you don't need to trust anyone at all. But presuming (a) you are going to use cloud services and (b) you care about privacy, Apple has a compelling value proposition.

solarkraft 6 November 2024
Sibling comments point out (and I believe; corrections are welcome) that all that theater is still no protection against Apple themselves, should they want to subvert the system in an organized way. They’re still fully in control. As far as I understand it, there is, for example, still plenty of attack surface that would let them run different software than they say they do.

What they are doing with this is, of course, making any kind of subversion a hell of a lot harder, and I welcome that. It serves as a strong signal that they want to protect my data. To me this definitely makes them by far the most trusted AI vendor at the moment.

lxgr 6 November 2024
This is probably the best way to do cloud computation offloading, if one chooses to do it at all.

What's desperately missing on the client side is a switch to turn this off. At the moment, it's really opaque which Apple Intelligence requests are processed locally and which are sent to the cloud.

The only sure way to know/prevent it a priori is to... enter flight mode, as far as I can tell?

Retroactively, there's a request log in the privacy section of System Preferences, but it's really convoluted to read (due to all of the cryptographic proofs, which I have absolutely no tools to verify at the moment, and honestly no interest in verifying).
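
For reference, the heart of one of those proofs is most likely just a Merkle inclusion check against a transparency log. Here's a minimal sketch assuming a simplified SHA-256 tree; the data layout is invented for illustration, and real PCC log entries use domain-separated hashing and a different structure:

    import CryptoKit
    import Foundation

    // Verify that a leaf (e.g. a logged release measurement) is included
    // in a transparency log, given the audit path of sibling hashes.
    func verifyInclusion(leaf: Data,
                         auditPath: [(sibling: Data, siblingIsLeft: Bool)],
                         expectedRoot: Data) -> Bool {
        // Hash the leaf, then fold in each sibling along the path.
        var node = Data(SHA256.hash(data: leaf))
        for step in auditPath {
            // Order matters: concatenate in the position the sibling holds.
            let combined = step.siblingIsLeft ? step.sibling + node
                                              : node + step.sibling
            node = Data(SHA256.hash(data: combined))
        }
        // The recomputed root must match the log's published, signed root.
        return node == expectedRoot
    }

The math itself is simple; the missing piece is tooling that fetches the audit path and signed root and does this check for you.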

jagrsw 6 November 2024
If Apple controls the root of trust, like the private keys in the CPU or security processor used to check the enclave (similar to how Intel and AMD do it with SEV-SNP and TDX), then technically it's a "trust us" situation, since they presumably use their own ARM silicon for that?

Harder to attack, sure, but no outside validation. Apple's not saying "we can't access your data," just "we're making it way harder for bad guys (and rogue employees) to get at it."
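
To make that concrete, here's a rough sketch of what the client-side check amounts to; the types, key handling, and published-measurement set are invented for illustration, and the real PCC protocol adds certificate chains, an OHTTP relay, and more:

    import CryptoKit
    import Foundation

    // Hypothetical attestation bundle; a real PCC attestation carries much more.
    struct NodeAttestation {
        let measurement: Data   // hash of the software image the node booted
        let signature: Data     // produced by the node's hardware root of trust
    }

    // Refuse to send a request unless the node proves it is running a
    // publicly logged build, signed by the root-of-trust key.
    func shouldTrustNode(_ att: NodeAttestation,
                         rootOfTrustKey: P256.Signing.PublicKey,
                         publishedMeasurements: Set<Data>) -> Bool {
        // Step 1: the signature must verify under the silicon vendor's key.
        // This is the "trust us" part: whoever holds this key decides
        // what counts as a valid measurement.
        guard let sig = try? P256.Signing.ECDSASignature(
                  rawRepresentation: att.signature),
              rootOfTrustKey.isValidSignature(sig, for: att.measurement)
        else { return false }
        // Step 2: the booted image must match a build published for
        // outside inspection; unpublished images are rejected.
        return publishedMeasurements.contains(att.measurement)
    }

Both checks can be airtight and it's still "trust us," because Apple controls the key that step 1 anchors to.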

h1fra 6 November 2024
Love this, but as an engineer I would hate to get a bug report from that prod environment: 100% "doesn't work on my machine" and 0% reproducibility.

sourcepluck 7 November 2024
Does anyone have any links to serious security researchers discussing this adversarially or critically? Or, if it's too soon because this is such a recent release, pointers to the kinds of serious security researchers who publish that sort of thing?

The discussion here would seem to suggest there's definitely a need for such a thing. Bruce Schneier comes to mind, and doing a search of:

"cloud" site:https://www.schneier.com/

did turn up a few results. I'd be interested in more trustworthy figures to read.

m3kw9 6 November 2024
I will just use it; it's Apple, and all I need is the verifiable privacy mechanism, and I'll let the researchers flag any red flags. You go on Copilot and it says your code is private? Good luck.

majestik 7 November 2024
PCC is a highly secure transport system for routing user queries to Siri, which then fails over to ChatGPT over the public internet.

curt15 6 November 2024
For the experts out there, how does this compare with AWS Nitro?

gigel82 6 November 2024
I'm glad that more and more people (in these comments) are starting to see through the thick Apple BS. I don't expect Apple to back down from this, but I hope there is enough pushback that they'll be forced to add a big opt-out for all cloud compute, however "private" they make it out to be.
natch 6 November 2024
>No privileged runtime access: PCC must not contain privileged interfaces that might enable Apple site reliability staff to bypass PCC privacy guarantees.

What about other staff and partners and other entities? Why do they always insert qualifiers?

Edit: Yeah, we know why. But my point is they should spell it out, not use wording that is on its face misleading or outright deceptive.

_boffin_ 6 November 2024
I really don’t care about this at all, since the only interaction I’d have is speech-to-text, which sends all transcripts to Apple without the ability to opt out.