Apple is having its generative AI cake and eating it, too. Apple Intelligence, which was unveiled today at WWDC 2024, is largely a product of local generative models of varying sizes, but even the power of the A17 Pro chip is not always enough to handle every one of your substantive queries.
Sometimes, Apple will have to go out to the cloud. Not any cloud, mind you, but its own Private Cloud Compute, where, according to Apple, your data is protected in ways it might not be on other cloud-based generative AI systems.
In a deep-dive session following the WWDC 2024 keynote, Apple Senior Vice President of Software Engineering Craig Federighi and Apple Senior Vice President of Machine Learning and AI Strategy John Giannandrea explained exactly how Apple Intelligence and the systems it will support, like the all-new Siri, decide when to keep your queries on device, when to reach out to Apple’s Private Cloud Compute, and what to share with that cloud.
“It is very early innings here,” said Federighi while explaining the AI journey, the challenges Apple faced, how they solved them, and the road ahead.
What Apple is doing here is no small thing, and it could be said that Apple dug the hole in which it sits. Apple Intelligence is essentially a series of generative AI models of varying sizes that see deep inside your iPhone to know you. Knowing you means they can help you in ways other LLMs and generative AI systems probably cannot. It’s like how your partner or parent can soothe you because they know everything about you, whereas a stranger can only guess what might comfort you and is just as likely to get it wrong. Knowing you and all the data on your phone is Apple Intelligence’s superpower and its potential weakness, especially when it comes to privacy.
Federighi explained that Apple created a two-part solution to mitigate this issue and avoid disaster.
First, the onboard intelligence decides which bits of your data are crucial to deriving the right answer. It then sends only that data, encrypted and anonymized, to Private Cloud Compute.
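In rough terms, that flow might look something like the Swift sketch below. Every type and function here is hypothetical, meant only to illustrate the decision Federighi describes, not Apple’s actual API.

```swift
import Foundation

// Hypothetical illustration of the on-device routing step described above.
// None of these types or functions are Apple's actual APIs.

struct UserQuery {
    let text: String
    let relevantContext: [String]  // only the bits the local model deems necessary
}

enum RoutingDecision {
    case onDevice                   // the local model can answer alone
    case privateCloudCompute(Data)  // encrypted minimal payload goes to the cloud
}

func encryptForCloud(_ data: Data) -> Data {
    // Placeholder: real encryption would use keys tied to a verified server,
    // which is beyond the scope of this sketch.
    return data
}

func route(_ query: UserQuery, localModelCanHandle: Bool) throws -> RoutingDecision {
    if localModelCanHandle {
        return .onDevice
    }
    // Serialize only the minimal context, then encrypt it before it leaves the device.
    let payload = try JSONEncoder().encode(query.relevantContext)
    return .privateCloudCompute(encryptForCloud(payload))
}
```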
The second part of the solution is how the cloud itself is built and how it handles your data. It runs on efficient Apple silicon but has no permanent storage. Security researchers have access to the server software, though not to your data, so they can conduct privacy audits. The iPhone will not send these bits of data to a server whose software has not been publicly verified. Federighi likened the mechanism to the keys and tokens found on cryptocurrency servers.
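Conceptually, that verification step resembles an attestation check: the device compares the server’s software measurement against a publicly published record before uploading anything. The following is a minimal hypothetical sketch of that idea in Swift, not Apple’s real protocol.

```swift
import Foundation

// A minimal sketch, assuming a hypothetical attestation flow: the device
// refuses to upload to any server whose software measurement has not been
// published for inspection. Illustrative only, not Apple's protocol.

struct ServerAttestation {
    let softwareMeasurement: Data  // hash of the signed server software image
}

func loadPublishedMeasurements() -> Set<Data> {
    // Placeholder: in practice this would come from a public, auditable log.
    return []
}

func mayUpload(to server: ServerAttestation,
               publishedMeasurements: Set<Data>) -> Bool {
    // Send the (already encrypted) payload only if the server is running
    // software whose measurement has been publicly verified.
    return publishedMeasurements.contains(server.softwareMeasurement)
}
```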
“No one, not Apple or anyone else, would have access to your data,” added Federighi.
To be clear, your on-device data is at the heart of what Apple is doing with Apple Intelligence and the new Siri. It’s a “rich understanding of what’s on your device,” and that knowledge base is one “that will only get richer over time,” said Giannandrea.
We also got some insight into how Siri’s semantic index, which can look at data from across the phone, including metadata in photos and videos, gets supercharged when combined with the Apple Intelligence models. All of this helps pull together an understanding of what you’re referring to, said Federighi.
Apple has been working on the semantic index for years. “So it’s really a story of us building over many, many years toward a really powerful capability on device.”
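To make the idea concrete: a semantic index typically pairs items from across the device with embedding vectors, so a query can retrieve the most semantically relevant ones. The Swift sketch below is a hypothetical illustration of that general technique, not Apple’s implementation.

```swift
import Foundation

// Hypothetical sketch of an embedding-based semantic index. All names
// are illustrative; this is not Apple's implementation.

struct IndexedItem {
    let source: String       // e.g., "Photos", "Messages"
    let summary: String      // extracted text or metadata
    let embedding: [Float]   // vector produced by an on-device model
}

// Cosine similarity between two equal-length vectors.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    let dot = zip(a, b).map(*).reduce(0, +)
    let magA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let magB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (magA * magB + 1e-9)
}

// Return the items most semantically similar to the query embedding.
func topMatches(for queryEmbedding: [Float],
                in index: [IndexedItem],
                limit: Int = 5) -> [IndexedItem] {
    return index
        .sorted {
            cosineSimilarity($0.embedding, queryEmbedding) >
            cosineSimilarity($1.embedding, queryEmbedding)
        }
        .prefix(limit)
        .map { $0 }
}
```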
The pair also clarified whose models you’ll be using and when. It turns out that, unless you specifically request, say, ChatGPT, the models are all Apple’s.
“It’s important to reemphasize that Apple Intelligence and the experiences we talk about are built on top of the Apple-built models,” added Federighi.
As one does, Apple trained these models on data. Some of it comes from the public web (gathered through Apple’s ongoing work in web-based search), though Giannandrea said publishers can opt out of having their data included. Apple also licensed news archive data and even applied some in-house data to its diffusion model.
The duo also confirmed that Apple Intelligence will only work on iPhones running the A17 Pro chip. By way of explanation, Giannandrea said, “the core foundational models require a huge amount of computing.” Federighi added that the latest A17 Pro neural engine is “twice as powerful as the generation before” and has an advanced architecture to support Apple’s AI. All of which is probably cold comfort for iPhone 15 (A16 Bionic) and iPhone 14 (Pro and standard) owners.
As for how Apple Intelligence will work with third-party models, Federighi pointed out that some of them offer expertise you might not find in Apple’s own models, like answering the question, “What can I make with these ingredients?” Federighi then added something that might unintentionally cast OpenAI’s platform in an unflattering light: “Even hallucinations are useful; you end up with a bizarre meal.”