Apple Intelligence heralds a new era of privacy — outshining Microsoft’s aspirations for unparalleled security with Private Cloud Compute

(Image credit: Apple)

So, WWDC 2024. Apple just held its long-anticipated event, packed with a ton of AI announcements, just as earlier speculation suggested. Apple Intelligence and Apple's new partnership with OpenAI to bring ChatGPT to the ecosystem arguably stood out the most.

Apple can be considered a late bloomer in the AI race compared to competitors like Microsoft, which has invested billions of dollars in OpenAI and integrated the technology across most of its products and services.

Granted, the Redmond giant is seemingly riding high on the AI wave, which has turned it into the world's most valuable company. Market analysts attribute its immense success to its early investment in and adoption of AI. Trends also suggest that Microsoft could be on the verge of its iPhone moment with AI.

This isn't to say the journey has been smooth sailing either. Microsoft's big AI ambitions have placed it under fire, with antitrust watchdogs breathing down its neck over privacy and security issues. This might be among the main reasons Apple has shied away from AI advances, at least until this WWDC event.

What's Apple Intelligence?


(Image credit: Apple)

Apple has seemingly covered its security and privacy bases with Apple Intelligence — a new AI-powered system integrated across iOS 18, iPadOS 18, and macOS Sequoia. It works much like other AI-powered chatbots such as Microsoft Copilot or ChatGPT, but with a personalized touch.

According to Apple:

"It harnesses the power of Apple silicon to understand and create language and images, take action across apps, and draw from personal context to simplify and accelerate everyday tasks."

But unlike other AI-powered chatbots, Apple Intelligence ships with its own foundation model, one powerful enough to support most experiences yet small enough to run on-device.

No compromises on quality

Apple Intelligence

(Image credit: Apple)

The company states that the AI model has undergone rigorous training to fine-tune its capabilities, allowing it to handle tasks like text generation and summarization well. The foundation model also leverages a technique dubbed adapters: small modules fine-tuned for specific tasks that plug into the base model, letting it handle tasks it wasn't originally trained on.

This is why you may spot performance disparities when using tools like Copilot to generate images: the tool excels at detailed structural designs, to the extent that professionals worry about it claiming architecture and interior design jobs, yet it fails at simple tasks like creating a plain white image. Apple says Apple Intelligence ships with a broad set of adapters, each fine-tuned for a specific feature, ultimately scaling its capabilities.

As you may know, Apple Intelligence is limited to a handful of devices because it requires sophisticated hardware and software support. Compressing a 3-billion-parameter LLM to run on the iPhone 15 Pro or later without compromising quality is no small feat.
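Apple hasn't published its adapter implementation, but the general idea resembles low-rank adapters (LoRA): a frozen base model is augmented with tiny task-specific matrices, so each feature ships only a small add-on rather than a whole fine-tuned model. Here's a minimal NumPy sketch of that idea; all names, sizes, and the rank are illustrative assumptions, not Apple's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not Apple's): a single frozen weight matrix
# stands in for the base foundation model.
d_in, d_out, rank = 64, 64, 4          # rank << d_in keeps the adapter tiny

W_base = rng.normal(size=(d_in, d_out))    # frozen base weight (never updated)
A = rng.normal(size=(d_in, rank)) * 0.01   # trainable down-projection
B = np.zeros((rank, d_out))                # trainable up-projection, zero-initialized

def forward(x, scale=1.0):
    """Base output plus a low-rank, task-specific correction x @ A @ B."""
    return x @ W_base + scale * (x @ A @ B)

x = rng.normal(size=(1, d_in))

# With B zero-initialized, the adapter starts as a no-op on the base model:
assert np.allclose(forward(x), x @ W_base)

# Parameter cost of shipping one adapter vs. a full copy of the weight:
full_params = W_base.size         # 64 * 64 = 4096
adapter_params = A.size + B.size  # 64 * 4 + 4 * 64 = 512
```

Only `A` and `B` would be trained per feature, which is why a device can hold many specialized adapters while storing the compressed base model just once.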

ChatGPT and Copilot offer paid subscription tiers that 'guarantee' faster performance and an enhanced user experience even during peak times, but this isn't always the case. Apple Intelligence focuses on inference performance and optimization, promising shorter times to process prompts and generate responses.

Private Cloud Compute will do most of the heavy lifting

WWDC Developer app on iPhone

(Image credit: Future)

While Apple Intelligence handles most tasks on-device, there are complex tasks and queries it can't. This is where Private Cloud Compute comes in, running larger foundation models in the cloud. But isn't this just like using any other cloud service? And where do privacy and security stand in all this?

Private Cloud Compute is designed to process AI requests privately, preserving the user's privacy and security while interacting with the models. It runs a new operating system built on a hardened subset of iOS foundations and Apple's operating system security model.

It's worth noting that Private Cloud Compute servers omit features like persistent storage that could pose risks, ultimately preventing unauthorized access to data. They also ship with Secure Enclave, Secure Boot, Trusted Execution Monitor, and more as extra layers of security against data breaches.

Apple isn't afraid of a little criticism 

Apple logo at the London Battersea Apple headquarters

(Image credit: Future)

The user's data remains private and isn't stored by Apple or used to train its models, preventing access by unauthorized parties. In addition, Apple promises to make software images of its Private Cloud Compute production builds publicly available for in-depth analysis and critique by security researchers and other concerned parties.

A lesson from Microsoft's book of privacy and security transgressions fueled by AI ambitions


WWDC 2024

(Image credit: Apple)

1. iOS 18 — what's next for iPhone?
2. iPadOS 18 — will Apple finally unleash the power of iPad?
3. macOS 15 — what's new for Mac?
4. Apple Intelligence — what will it be able to do?

Apple boasts vast experience running machine learning tasks on-device across its platforms while leveraging Apple silicon's power, allowing it to deliver a better user experience with low latency while ensuring the user's data remains private and secure.

Privacy and security are among the major challenges facing AI and holding back its progress. For instance, Microsoft has faced backlash and criticism over its controversial Windows Recall feature. Windows Recall, part of the next-gen AI features unveiled during Microsoft's special Windows and Surface event in May, captures snapshots of everything you do on your device and stores them locally. Users can run a semantic search on their PC to retrace their steps and find something that might have slipped their minds.

While useful and impressive, the tool has raised alarm among users, who refer to it as a "privacy nightmare" and a hacker's paradise. The controversial AI feature is supposed to ship exclusively to Copilot+ PCs later this month but is already causing enough trouble to attract the UK data watchdog's attention.

The security concerns appear valid: a security researcher was able to bypass the tool's security layers and access its data, despite Microsoft's privacy-focused promises. The company has since added stringent security measures, like mandatory Windows Hello enrollment to enable Windows Recall, and made the feature an opt-in experience.

We're still in the early phase of Apple's debut in the AI landscape. It'll be interesting to see how the company lives up to its privacy and security promises and how users receive them. Besides, Apple's new partnership with OpenAI presents an opportunity for the company to succeed in the category, especially given that OpenAI signaled it's prioritizing where its users are by launching its desktop ChatGPT app exclusively for Mac users, snubbing Windows.

Kevin Okemwa

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya with lots of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You'll also catch him contributing at iMore on occasion about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.