
Apple Intelligence Promises Better AI Privacy. Here’s How It Actually Works

Private Cloud Compute is an entirely new kind of infrastructure that, Apple’s Craig Federighi tells WIRED, allows your personal data to be “hermetically sealed inside of a privacy bubble.”


The generative AI boom has, in many ways, been a privacy bust thus far, as services slurp up web data to train their machine learning models and users’ personal information faces a new era of potential threats and exposures. With the release of Apple’s iOS 18 and macOS Sequoia this month, the company is joining the fray, debuting Apple Intelligence, which it says will ultimately be a foundational service in its ecosystem. But Apple has a reputation for prioritizing privacy and security to uphold, so the company took a big swing. It has developed extensive custom infrastructure and transparency features, known as Private Cloud Compute (PCC), for the cloud services Apple Intelligence uses when the system can’t fulfill a query locally on a user’s device.

The beauty of on-device data processing, or “local” processing, is that it limits the paths an attacker can take to steal a user’s data. The data never leaves the computer or phone, so that’s what the attacker has to target. It doesn’t mean an attack will never be successful, but the battleground is defined and constrained. Giving data to a company to process in the cloud isn’t inherently a security issue—an unfathomable amount of data moves through global cloud infrastructure safely every day. But it expands that battlefield immensely and also creates more opportunities for mistakes that inadvertently expose data. The latter has particularly been an issue with generative AI given the unintended ways that a system tasked with generating content may access and share information.

With Private Cloud Compute, Apple has developed an array of innovative cloud security technologies. But the service is also significant for pushing the limits of what is an acceptable business proposition for a cloud service, seemingly prioritizing secure architecture over what would be most technically efficient or economical.

“We set out from the beginning with a goal of how can we extend the kinds of privacy guarantees that we’ve established with processing on-device with iPhone to the cloud—that was the mission statement,” Craig Federighi, senior vice president of software engineering at Apple, tells WIRED. “It took breakthroughs on every level to pull this together, but what we’ve done is achieve our goal. I think this sets a new standard for processing in the cloud in the industry.”

To remove many of the potential attack points and pitfalls that cloud computing can introduce, Apple says its developers focused on the idea that “security and privacy guarantees are strongest when they are entirely technically enforceable” rather than implemented through policies.

In other words, you can have a plate of cupcakes on the counter and make a policy for yourself that you’re not going to eat any of them. Or you could have a policy that you never make or buy cupcakes. But the Private Cloud Compute approach would be to move to a town with no bakeries, rip out your kitchen, and close your credit cards to prevent yourself from buying Easy Bake Ovens. That way there’s no question about your cupcake access or the possibility of accidental cupcake hoarding.

Starting Fresh

Apple created purpose-built servers running Apple processors for PCC and developed a custom PCC server operating system that’s a stripped-down, hybrid version of iOS and macOS. The scheme incorporates hardware and software security features the company has developed for Macs and iPhones over the past two decades.

Unlike these consumer devices, though, PCC servers are as bare-bones as possible. For example, they don’t include “persistent storage,” meaning that they don’t have a hard drive that can keep processed data long-term. They do incorporate Apple’s dedicated hardware encryption key manager, known as the Secure Enclave, and they randomize each file system’s encryption key at every boot-up. This means that once a PCC server is rebooted, no data is retained and, as an additional precaution, the entire system volume is cryptographically unrecoverable. At that point, all the server can do is start fresh with a new encryption key.
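
A minimal Swift sketch of that property, using CryptoKit (the `EphemeralVolume` type and its flow are illustrative assumptions, not Apple’s actual PCC code): the volume key is generated fresh in memory at every boot and never persisted, so anything sealed under the previous key becomes undecryptable after a restart.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only: a volume whose encryption key exists solely in
// memory and is regenerated on every "boot," so prior ciphertext becomes
// cryptographically unrecoverable after a restart.
final class EphemeralVolume {
    // Fresh 256-bit key per boot; never written to disk.
    private var key = SymmetricKey(size: .bits256)

    func write(_ plaintext: Data) throws -> Data {
        // Default AES-GCM nonce is 12 bytes, so `combined` is always non-nil.
        try AES.GCM.seal(plaintext, using: key).combined!
    }

    func read(_ ciphertext: Data) throws -> Data {
        try AES.GCM.open(AES.GCM.SealedBox(combined: ciphertext), using: key)
    }

    // Simulates a reboot: the old key is discarded and replaced.
    func reboot() {
        key = SymmetricKey(size: .bits256)
    }
}

let volume = EphemeralVolume()
let sealed = try volume.write(Data("user request".utf8))
volume.reboot()
// After the "reboot" the old key is gone, so decryption fails.
print((try? volume.read(sealed)) == nil)  // true
```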

PCC servers also use Apple’s Secure Boot to validate the integrity of the operating system and use a code verification feature the company debuted with iOS 17, known as Trusted Execution Monitor. Instead of using Trusted Execution Monitor in the usual way for oversight, though, PCC runs it in a much stricter mode where once the server restarts and completes the boot sequence, the system locks down and can’t load any new code whatsoever. Essentially, all the software the server needs to run gets pummeled with checks and validation and then goes into an envelope that’s sealed before user requests and data can begin flowing through.
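
One way to picture that sealing step is the hedged Swift sketch below (the `CodeLoader` type is hypothetical; the real Trusted Execution Monitor operates at the kernel and hardware level): binaries are signature-checked only while the system is still booting, and after the one-way lockdown even correctly signed code is refused.

```swift
import CryptoKit
import Foundation

// Conceptual model, not Apple's implementation: every binary is
// signature-checked during boot; once boot completes, the loader is
// sealed and refuses all new code, valid signature or not.
struct CodeLoader {
    let trustedKey: Curve25519.Signing.PublicKey
    private(set) var sealed = false
    private(set) var loaded: [Data] = []

    mutating func load(binary: Data, signature: Data) -> Bool {
        guard !sealed else { return false }  // locked down: nothing loads
        guard trustedKey.isValidSignature(signature, for: binary) else {
            return false                     // reject unsigned or tampered code
        }
        loaded.append(binary)
        return true
    }

    mutating func finishBoot() { sealed = true }  // one-way transition
}

let vendorKey = Curve25519.Signing.PrivateKey()
var loader = CodeLoader(trustedKey: vendorKey.publicKey)

// During boot: validated software loads normally.
let inferenceStack = Data("inference-stack".utf8)
_ = loader.load(binary: inferenceStack,
                signature: try vendorKey.signature(for: inferenceStack))
loader.finishBoot()

// After boot: even correctly signed code is rejected.
let lateModule = Data("post-boot-module".utf8)
print(loader.load(binary: lateModule,
                  signature: try vendorKey.signature(for: lateModule)))  // false
```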

More broadly, Apple says it completely replaced its normal server management tools for PCC. For example, most cloud platforms have policies and controls to prevent unauthorized access, but they also build “break in case of emergency”-type options so highly trusted system administrator accounts can take quick action in case of a bug or failure. In keeping with Apple’s focus on technically enforceable guarantees versus policy guarantees, PCC doesn’t allow privileged access and drastically limits remote management options.

In recent years, Apple took a major security step by offering its users end-to-end encryption for iCloud backups, in which the company simply holds data in its cloud infrastructure for its customers and doesn’t have the technical capability to decrypt and read that data. With current technology, such a scheme is impossible to implement for generative AI because the system needs to process the inputs to give an output. For example, if you want Apple Intelligence to give you a summary of all the text messages and emails you’ve received in the past three hours, the system needs access to those messages. End-to-end encryption would make that access virtually impossible.

Apple says it is still committed to doing as much Apple Intelligence processing as possible on-device, and a brand new iPhone 16 with its A18 chip, for example, will be able to do more AI processing locally than an iPhone 15 with an A16 chip. Still, the reality seems to be that Apple will need to do a substantial amount of Apple Intelligence processing in the cloud—hence the investment in developing PCC. (In iOS 18.1, users can go to Settings > Privacy & Security > Apple Intelligence Report to view a log of which requests are processed on device versus in the cloud.)

“What was really unique about the problem of doing large language model inference in the cloud was that the data had to at some level be readable by the server so it could perform the inference. And yet, we needed to make sure that that processing was hermetically sealed inside of a privacy bubble with your phone,” Federighi says. “So we had to do something new there. The technique of end-to-end encryption—where the server knows nothing—wasn’t possible here, so we had to come up with another solution to achieve a similar level of security.”

Still, Apple says that it offers “end-to-end encryption from the user’s device to the validated PCC nodes, ensuring the request cannot be accessed in transit by anything outside those highly protected PCC nodes.” The system is architected so Apple Intelligence data is cryptographically unavailable to standard data center services like load balancers and logging devices. Inside a PCC cluster, data is decrypted and processed, but Apple emphasizes that once a response is encrypted and sent on its journey to the user, no data is retained or logged and none of it is ever accessible to Apple or its individual employees.
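
A hedged CryptoKit sketch of the shape of that guarantee (not Apple’s actual wire protocol; the key-agreement flow and labels here are assumptions): the device encrypts its request to a key held only by the validated node, so a load balancer in the middle can route the ciphertext but never read it.

```swift
import CryptoKit
import Foundation

// Hedged sketch, not Apple's wire protocol: the device encrypts directly to
// a key held by the validated PCC node, so intermediaries such as load
// balancers can route the request but never decrypt it.

// The node's key-agreement key (in PCC this would be bound to attestation;
// here it is simply generated for the demo).
let nodeKey = Curve25519.KeyAgreement.PrivateKey()

// Device side: ephemeral ECDH against the node's public key, then AES-GCM.
let deviceKey = Curve25519.KeyAgreement.PrivateKey()
let info = Data("pcc-request".utf8)
let deviceSecret = try deviceKey.sharedSecretFromKeyAgreement(with: nodeKey.publicKey)
let sendKey = deviceSecret.hkdfDerivedSymmetricKey(
    using: SHA256.self, salt: Data(), sharedInfo: info, outputByteCount: 32)
let ciphertext = try AES.GCM.seal(
    Data("summarize my last three hours of messages".utf8),
    using: sendKey).combined!

// A load balancer forwards `ciphertext` (plus the device's ephemeral public
// key) without being able to read any of it.

// Node side: derive the same key inside the trusted boundary and decrypt.
let nodeSecret = try nodeKey.sharedSecretFromKeyAgreement(with: deviceKey.publicKey)
let recvKey = nodeSecret.hkdfDerivedSymmetricKey(
    using: SHA256.self, salt: Data(), sharedInfo: info, outputByteCount: 32)
let plaintext = try AES.GCM.open(
    AES.GCM.SealedBox(combined: ciphertext), using: recvKey)
print(String(decoding: plaintext, as: UTF8.self))
```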

Open Book, Closed System

Apple says the overarching vision for PCC is that an attacker should have to compromise the entire system—a difficult thing to do at all, much less without being detected—in order to target a specific user’s personal data. Even if an attacker could physically compromise an individual live PCC node, the system is devised with an anonymous relay feature so the queries and data on any one node can’t be connected to individual users.
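
The relay idea reduces to a toy Swift illustration (Apple’s real design is more involved; these types are hypothetical): the relay sees who is connecting but not what they ask, while the node sees the query but not who asked it.

```swift
import Foundation

// Toy illustration of the relay idea; the types are hypothetical.
struct InboundRequest {
    let clientAddress: String   // visible to the relay only
    let encryptedQuery: Data    // opaque blob the relay cannot read
}

struct AnonymizingRelay {
    // Forwards the payload with all client identity stripped off.
    func forward(_ request: InboundRequest) -> Data {
        request.encryptedQuery
    }
}

let relay = AnonymizingRelay()
let anonymized = relay.forward(InboundRequest(
    clientAddress: "203.0.113.7",
    encryptedQuery: Data("opaque ciphertext".utf8)))
// A PCC node receives `anonymized` with no way to tie it back to the user.
print(anonymized.count)
```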

It all sounds pretty groovy, but the notoriously secretive company seems to be aware that professing to do all of these things and claiming to offer technical guarantees is ultimately only compelling with proof and transparency. So PCC includes an external auditing mechanism that serves a crucial dual purpose.

Apple is making every production PCC server build publicly available for inspection so people unaffiliated with Apple can verify that PCC is doing (and not doing) what the company claims, and that everything is implemented correctly. All of the PCC server images are recorded in a cryptographic attestation log, essentially an indelible record of signed claims, and each entry includes a URL for where to download that individual build. PCC is designed so Apple can’t put a server into production without logging it. And in addition to offering transparency, the system works as a crucial enforcement mechanism to prevent bad actors from setting up rogue PCC nodes and diverting traffic. If a server build hasn’t been logged, iPhones will not send Apple Intelligence queries or data to it.
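
As a hedged Swift sketch of that trust model (the log-entry fields and the example.com URL are assumptions, not Apple’s schema): the device checks that a server’s attested build hash appears in a correctly signed transparency-log entry before it will send a query.

```swift
import CryptoKit
import Foundation

// Hedged sketch of the trust model; field names are assumptions.
struct LogEntry {
    let buildHash: Data       // hash of a released PCC server image
    let downloadURL: String   // where that build can be fetched for audit
    let signature: Data       // log operator's signature over the entry
}

// The device refuses any server whose attested build is not in the log.
func canSendQuery(to attestedBuildHash: Data,
                  log: [LogEntry],
                  logKey: Curve25519.Signing.PublicKey) -> Bool {
    log.contains { entry in
        var signedBytes = entry.buildHash
        signedBytes.append(Data(entry.downloadURL.utf8))
        return logKey.isValidSignature(entry.signature, for: signedBytes)
            && entry.buildHash == attestedBuildHash
    }
}

// Demo: a logged build is accepted; an unlogged (rogue) build is refused.
let logOperator = Curve25519.Signing.PrivateKey()
let build = Data(SHA256.hash(data: Data("pcc-release-1".utf8)))
var payload = build
payload.append(Data("https://example.com/pcc-release-1".utf8))
let entry = LogEntry(buildHash: build,
                     downloadURL: "https://example.com/pcc-release-1",
                     signature: try logOperator.signature(for: payload))

print(canSendQuery(to: build, log: [entry],
                   logKey: logOperator.publicKey))                       // true
print(canSendQuery(to: Data(SHA256.hash(data: Data("rogue".utf8))),
                   log: [entry], logKey: logOperator.publicKey))         // false
```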

PCC is part of Apple’s bug bounty program, and vulnerabilities or misconfigurations researchers find could be eligible for cash rewards. Apple says, though, that no one has found any flaws in PCC since the iOS 18.1 beta became available in late July. The company acknowledges that it has so far made the tools to evaluate PCC available only to a select group of researchers.

Multiple security researchers and cryptographers tell WIRED that Private Cloud Compute looks promising, but they haven’t spent significant time digging into it yet.

“Building Apple silicon servers in the data center when we didn’t have any before, building a custom OS to run in the data center was huge,” Federighi says. He adds that “creating the trust model where your device will refuse to issue a request to a server unless the signature of all the software the server is running has been published to a transparency log was certainly one of the most unique elements of the solution—and totally critical to the trust model.”

To questions about Apple’s partnership with OpenAI and integration of ChatGPT, the company emphasizes that partnerships are not covered by PCC and operate separately. ChatGPT and other integrations are turned off by default, and users must manually enable them. Then, if Apple Intelligence determines that a request would be better fulfilled by ChatGPT or another partner platform, it notifies the user each time and asks whether to proceed. Additionally, people can use these integrations while logged in to their account for a partner service like ChatGPT, or they can use them through Apple without logging in separately. Apple said in June that another integration, with Google’s Gemini, is also in the works.

Apple said this week that beyond launching in United States English, Apple Intelligence is coming to Australia, Canada, New Zealand, South Africa, and the United Kingdom in December. The company also said that additional language support—including for Chinese, French, Japanese, and Spanish—will drop next year. Whether Apple Intelligence will be permitted under the European Union’s AI Act, and whether Apple will be able to offer PCC in its current form in China, is another question.

“Our goal is to bring ideally everything we can to provide the best capabilities to our customers everywhere we can,” Federighi says. “But we do have to comply with regulations, and there is uncertainty in certain environments we’re trying to sort out so we can bring these features to our customers as soon as possible. So, we’re trying.”

He adds that as the company expands its ability to do more Apple Intelligence computation on-device, it may be able to use this as a workaround in some markets.

Those who do get access to Apple Intelligence will have the ability to do far more than they could with past versions of iOS, from writing tools to photo analysis. Federighi says that his family celebrated their dog’s recent birthday with an Apple Intelligence–generated GenMoji (viewed and confirmed to be very cute by WIRED). But while Apple’s AI is meant to be as helpful and invisible as possible, the stakes are incredibly high for the security of the infrastructure underpinning it. So how are things going so far? Federighi sums it up without hesitation: “The rollout of Private Cloud Compute has been delightfully uneventful.”
