Can Apple Intelligence (or Any AI, Really) Be Private?

You already know the answer: yes, and no.

  • Apple Intelligence privacy is supported by all kinds of promises.
  • Apple's end of the deal looks surprisingly robust.
  • But you should be wary of cloud platforms in general. 
A colorful brain next to a guitar.
Would you sacrifice your privacy for this?

Apple

Cloud computing is pretty much the opposite of private, and yet Apple assures us that it has managed to do the impossible, using its usual tight integration of hardware and software. The result is surprisingly robust, but there are still a few dangers.

Apple has made a lot of promises about Apple Intelligence privacy in iOS 18, including lots of on-device work, Private Cloud Compute, and warnings before it kicks you out into the wild west of ChatGPT. But can AI ever really be private?

"Based on [Apple's white paper on Private Cloud Compute], I would say that they appear to have designed a very tightly restricted architecture at both the hardware and software level, from the client/endpoint all the way to processing. Given how the architecture has been described, it appears likely that it would be extremely difficult for an attacker to steal data within the PCC space," Clyde Williamson, senior product security architect at data-security company Protegrity, told Lifewire via email. "The remaining risk, really, is how much we trust Apple. When we pass information to someone else, at the bottom of the security model, there is always an assumption of trust."

Secure By Design

There are three parts to Apple's AI products, with increasing levels of trust required as you move up. First are operations that happen on-device; for example, Siri may draw on your contacts and calendar events to make suggestions. Then there is Apple's "Private Cloud Compute," which sends queries off to the cloud and executes them on servers running on Apple Silicon chips. The third level is where all bets are off: this is where your data is sent to ChatGPT (or other future partners) to use those services.
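As a rough conceptual sketch only (these names are ours, not Apple's), you could model those three tiers in Swift like this:

```swift
// Illustrative only: a conceptual model of the three trust tiers.
// These names are invented for this article, not Apple's API.
enum AppleIntelligenceTier {
    case onDevice            // data never leaves your iPhone, iPad, or Mac
    case privateCloudCompute // processed on Apple Silicon servers Apple says it cannot inspect
    case thirdParty          // handed to an external service such as ChatGPT, with your consent
}
```

Each step down that list trades a little more privacy for extra capability.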

Screenshot of a grid of pictures of a person with stickers on their face.
Apple Intelligence is going to revolutionize computing.

Apple

In many ways, these security levels are similar to the security tradeoffs of cloud computing in general. Most of us don't worry about what we keep in Dropbox, for example, even though Dropbox itself can access all of that data. AI in its current form is new, so we're giving it more scrutiny than the services we already use, like Google search, email, Amazon, and so on, none of which is private.

Of Apple's options, on-device is clearly the most private, and Apple has a great track record in this regard. For years, the Photos app has used AI to recognize faces, and this data does not leave your devices. Your Health data is similarly protected and has only recently been allowed to sync between your devices.

One possible hole in Apple's on-device AI privacy is your iCloud backup, where otherwise secure data, your iMessage history, for example, can be accessed. This is especially true if you don't take advantage of the security features baked into iOS. Fortunately, Apple's optional Advanced Data Protection feature extends end-to-end encryption to almost everything in your iCloud backup.

The third option, ChatGPT, is so obviously a privacy nightmare that we won't go deep into it. While Apple does offer some ChatGPT-specific protections, if you're concerned about a company taking your data, simply never choose to use a third-party AI service in iOS 18.

This brings us to Private Cloud Compute, perhaps the most interesting part of this whole story.

Private Cloud Compute

With a messaging app like iMessage or Signal, you can encrypt the message "end-to-end," so the servers used to send it cannot ever see the message itself. But because cloud AI has to actually see your request, it cannot be end-to-end encrypted.

Apple's answer is to end-to-end encrypt the data on its way to the server but decrypt it before feeding it into its new private compute "nodes," which are built in a similar way to your iPhone, with a Secure Enclave and so on. Nothing that takes place inside these nodes is accessible to anyone, not even Apple employees. Once the node has worked out that your pizza needs more glue or has generated your picture of shrimp Jesus, it encrypts the result and sends it back to you.
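To make the shape of that round trip concrete, here is a minimal, hypothetical sketch in Swift using Apple's CryptoKit framework. The `PCCNode` type and its methods are our own illustrative inventions, not Apple's actual architecture or API; the point is simply that the query is encrypted in transit, decrypted only inside the node, and the reply is encrypted again before it leaves.

```swift
import Foundation
import CryptoKit

// Hypothetical model of the round trip described above. `PCCNode`,
// `handle(request:from:)`, and `runModel(on:)` are illustrative names.
struct PCCNode {
    // The node's key pair. In Apple's design, private key material
    // stays inside the server's secure hardware.
    let privateKey = Curve25519.KeyAgreement.PrivateKey()
    var publicKey: Curve25519.KeyAgreement.PublicKey { privateKey.publicKey }

    // Derive a shared symmetric key from the node's private key and the
    // client's public key (standard ECDH key agreement plus HKDF).
    func sessionKey(with peer: Curve25519.KeyAgreement.PublicKey) throws -> SymmetricKey {
        let secret = try privateKey.sharedSecretFromKeyAgreement(with: peer)
        return secret.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                              sharedInfo: Data(), outputByteCount: 32)
    }

    // Decrypt the request, run the model, and re-encrypt the reply.
    // The plaintext query exists only inside this function.
    func handle(request: Data, from clientKey: Curve25519.KeyAgreement.PublicKey) throws -> Data {
        let key = try sessionKey(with: clientKey)
        let query = try ChaChaPoly.open(ChaChaPoly.SealedBox(combined: request), using: key)
        let answer = runModel(on: query)
        return try ChaChaPoly.seal(answer, using: key).combined
    }

    // Stand-in for the actual AI workload.
    private func runModel(on query: Data) -> Data {
        Data("Answer to: \(String(decoding: query, as: UTF8.self))".utf8)
    }
}

do {
    let node = PCCNode()

    // Client side: derive the same session key from the node's public key,
    // encrypt the query, and send only ciphertext over the wire.
    let clientKey = Curve25519.KeyAgreement.PrivateKey()
    let secret = try clientKey.sharedSecretFromKeyAgreement(with: node.publicKey)
    let sessionKey = secret.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                                    sharedInfo: Data(), outputByteCount: 32)
    let ciphertext = try ChaChaPoly.seal(Data("Summarize my notes".utf8),
                                         using: sessionKey).combined

    // Server side, then back to the client, which decrypts the reply.
    let encryptedReply = try node.handle(request: ciphertext, from: clientKey.publicKey)
    let reply = try ChaChaPoly.open(ChaChaPoly.SealedBox(combined: encryptedReply),
                                    using: sessionKey)
    print(String(decoding: reply, as: UTF8.self)) // "Answer to: Summarize my notes"
} catch {
    print("Crypto error:", error)
}
```

The crucial difference from end-to-end encrypted messaging is that decryption step in the middle: the node must see your query in plaintext to answer it, so everything rests on what (and whom) can get inside the node.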

Screenshot of the new writing tools in Apple Intelligence.
When writing, you really want to be sure your work stays private.

Apple

"Apple's safeguards—encryption, secure hardware, independent audits—are certainly commendable steps toward securing data in the cloud. However, it's important we remain clear-eyed about limitations. No cloud infrastructure is impervious by design. While exceedingly unlikely, remote possibilities of compromise through cyber intrusion can never be fully eliminated," Alexander De Ridder, CTO for AI company SmythOS, told Lifewire via email.

The takeaway is that Apple has come up with a typically Apple-like solution that uses insane engineering to protect your privacy, although if you're totally paranoid, you should probably not only avoid these new AI toys but also stay away from cloud computing in general.

Apple contacted us after this article was published and asked us to point out that “independent experts can inspect the code that runs on Private Cloud Compute servers.”

If you're moderately paranoid, and you trust Apple (which, as the operating system vendor, you kind of have to), then stick to the on-device and Private Cloud Compute features. And even if you're not one tiny bit paranoid, you should probably still avoid the third-party AI stuff. It's the only way to be sure you really achieve Apple Intelligence privacy.

Update 08/02/2024: Updated information throughout the article to clarify some of Apple's data protection features.
