The world thought it would be called iAI, but it’s called Apple Intelligence instead

Jun 13, 2024 07:20 AM IST

Our conversation this week transcends time zones, as I try to help you navigate everything Apple announced at the WWDC keynote in Cupertino, California. I must start by saying that, contrary to general opinion, it turns out Apple was never really falling behind in the AI battles. It was simply preparing, and in one fine move with what it calls Apple Intelligence, has taken a significant lead over anything Android, Windows or standalone AI tools have in the ring. So much so that Apple has entered the fray with significant superiority in its AI models, and OpenAI’s latest and greatest GPT-4o is simply an optional layer within the grand scheme of things. On the broader spectrum, Apple’s integration of machine learning and generative AI touches almost every task, or app, you may be using.

There are many improvements landing on the iOS 18 platform for iPhones.

Deep dive into Apple Intelligence models, optional ChatGPT and privacy approach

Phone calls recorded and transcribed. Email replies drafted after you’ve answered contextual questions based on what the sender wrote. Writing tools to proofread or adjust the language of the text you’ve written. The Notes app improving your handwriting. Scheduling messages. A conversational Siri that holds context during follow-ups, too. Let me quickly summarise a few things for you, before talking more about the AI models.

  • Apple CEO Tim Cook describes it as a “personal intelligence system”. He hopes Apple Intelligence, and the company’s approach to AI, will “set a new standard for privacy and AI”.
  • Core to Apple Intelligence are multiple models built in-house, on-device processing, a smarter Siri, the privacy-focused Private Cloud Compute for server-side processing, updates to the App Intents framework that’ll allow third-party apps to plug into the intelligence features, and an optional OpenAI GPT-4o integration.
  • Specifically on Private Cloud Compute, which is a unique solution for consumer-facing AI, Craig Federighi, Senior Vice President for Software Engineering at Apple, explained, “what's happening there is your request to our servers is actually first of all anonymized, so your IP address is masked. Then it talks to a server that has no permanent storage, cannot log, but most importantly is running software where the image is publicized for security researchers to audit, or your iPhone won't even talk to that server. It is a clever kind of blockchain issue.”
  • The thinking behind Apple Intelligence’s structure is clear. “We think that the right approach to this is to have a series of different models and different sizes, for different use cases. We put a lot of time and effort on the 3 billion parameter model that will run on your iPhone, and it is one of the most capable models today,” explains John Giannandrea, Senior Vice President for Machine Learning and AI Strategy, Apple. That was our first confirmation of the specifics of ‘Apple on-device’.
  • OpenAI’s GPT-4o-powered ChatGPT in iOS 18, iPadOS 18, and macOS Sequoia isn’t the default AI model at work. In fact, reaching ChatGPT will be optional for any queries or tasks a user may have, and it will be invoked only if Apple’s own models determine they do not have the necessary expertise or response to offer.
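The tiered routing Apple describes above — an on-device model first, Private Cloud Compute for heavier work, and ChatGPT only as an opt-in fallback — can be sketched roughly as follows. This is purely illustrative pseudologic in Python; none of these function names or the "complexity" heuristic are Apple's actual API.

```python
# Illustrative sketch of the request routing Apple Intelligence describes:
# on-device first, Private Cloud Compute for heavier requests, ChatGPT only
# with the user's consent. All names here are hypothetical, not Apple's API.

def route_request(request: dict, user_allows_chatgpt: bool) -> str:
    """Decide where a given request would be handled."""
    if fits_on_device(request):
        return "on-device model"
    if apple_models_can_handle(request):
        return "Private Cloud Compute"
    if user_allows_chatgpt:
        return "ChatGPT (optional, user-approved)"
    return "declined"

def fits_on_device(request: dict) -> bool:
    # Hypothetical proxy: short, common tasks stay on the ~3B-parameter
    # on-device model mentioned by Giannandrea.
    return request["complexity"] <= 1

def apple_models_can_handle(request: dict) -> bool:
    # Heavier tasks go to Apple's larger server-side models.
    return request["complexity"] <= 2
```

The key design point the sketch captures is that ChatGPT sits at the end of the chain, not the start: Apple's own models get first refusal, and the third-party model is only reachable if the user has opted in.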

OpenAI has also detailed the privacy measures put in place for queries generated from Apple devices. They say privacy protections are built in when accessing ChatGPT within Siri and Writing Tools, to the extent that the user requests are not stored by OpenAI, and users’ IP addresses will be obscured.

My attention was drawn to a technical paper which Apple released a few hours after the WWDC keynote. It details the core tenets, that is, how Apple Intelligence has been fine-tuned for user experiences such as writing and refining text, prioritizing and summarizing notifications, creating playful images for conversations with family and friends, and taking in-app actions to simplify interactions across apps. To that point, Apple has detailed its Responsible AI Principles: identify areas where AI can be used responsibly to create tools for addressing specific user needs; work continuously to avoid perpetuating stereotypes and systemic biases across AI tools and models; check at every stage, including design, model training, feature development, and quality evaluation, how AI tools may be misused or lead to potential harm; and, most importantly, do not use users' private personal data or user interactions when training foundation models.

Apple, time and again, made it a point to clarify that Apple Intelligence has a significant advantage because it will know you better, have access to your data for query responses, and have context from your calendar, location, and conversations, for instance. Responses to daily queries such as “show me the document which XYZ sent me a few days ago” or “can I finish the 4pm meeting and be in time for the 5:15pm meeting” will have a lot more context to work with. The sort that a standalone tool such as ChatGPT wouldn’t have. Add to that, if Apple’s previous record of pursuing user data privacy is anything to go by, they’ll keep Apple Intelligence data absolutely under control. Models won’t be trained on your data, period. That is a degree of control that perhaps hasn’t been seen thus far with consumer-facing generative AI.

UNFOLD

Some of you may remember, last week we had a conversation about a new chapter for tech that’s being crafted in the summer of 2024. Smartphones are a critical element of that, and I have a feeling the point we are talking about today is just the tip of a much bigger iceberg waiting to be revealed (which I shall, in the coming weeks too). Foldable phones are catching everyone’s attention. The data points to that too, with as many as 17.7 million foldable phones expected to be shipped by the time the year draws to a close. That’s no surprise, because generation by generation, slowly but surely, the combination is coming together well – the foldable form factor is finding utility for users, and these phones themselves are getting better. Last year’s OnePlus Open left me quite impressed, and quite a few months later, we are now experiencing Vivo’s good habit of learning with time. Which brings us to the Vivo X Fold3 Pro, their first foldable phone to go on sale in India, though it is actually a third-generation device.

I would credit the passing of time, and the feedback that came with it, as the reason why Vivo has stretched technical limitations to achieve what it has – the slimmest foldable phone, the largest display on a foldable phone, the lightest foldable phone and so on. In my piece, you’d find some comparative points that define how this phone builds on the sort of incremental improvements the OnePlus Open had given us, and how it compares with Samsung’s foldable, which released prior. You’d be parting with around ₹1,59,999 for the X Fold3 Pro, and that’s in the same ballpark as a Samsung Galaxy Z Fold5 (around ₹1,54,999), though a OnePlus Open (around ₹1,39,999) still represents good value in my book.

Unlike Samsung’s approach to cameras on foldables (at least thus far), OnePlus didn’t really hold back on its Hasselblad trump card (and to a large extent, the photos it clicked are really likeable), and Vivo most certainly wants its Zeiss partnership to make a significant difference. It has worked. Vivo’s troika of a 50-megapixel wide, a 64-megapixel periscope and a 50-megapixel ultra-wide gives Zeiss’s behind-the-scenes smarts a solid foundation to work with. I find a lot of similarities between the photos the Vivo X Fold3 Pro’s camera outputs and those from the company’s flagship, the X100 Pro. That’s just the ticket for this price tag. It is quite remarkable that most of Vivo’s recent phones have, more often than not, hit the sweet balance between pricing, value and experience. This one seems destined in that direction (value being subjective, depending on how the price tag weighs on you). That sort of strike rate isn’t easy to achieve.

The new Samsung Galaxy Z Fold 5. (HT Photo)

OVERVIEW

Gemini Live will figure prominently in Google’s Messages app, with the added context of your chats.

A couple of weeks ago, we discussed some reported inaccuracies by Google’s AI Overviews feature for Search. That was just days after the company had widened the scope of this functionality at the I/O 2024 keynote (our detailed coverage from the time). Worrying? Google has, in the time that’s passed since, reached out and detailed what it found after some urgent investigations were mounted. While Google isn’t entirely ruling out the use of fake or morphed screenshots on social media, it points to two things: certain queries are designed to elicit an erroneous response, or, as Liz Reid, vice president and head of Google Search, says while not mincing any words, “nonsensical new searches”. That said, Google has identified a few things that can be improved, based on millions of users putting AI Overviews through their paces.

  • “One area we identified was our ability to interpret nonsensical queries and satirical content. Let’s take a look at an example: “How many rocks should I eat?” Prior to these screenshots going viral, practically no one asked Google that question.” They reference this as a data void, because when queried, Search pointed the user to the only website that tackled this sort of a query.
  • “We saw AI Overviews that featured sarcastic or troll-y content from discussion forums. Forums are often a great source of authentic, first-hand information, but in some cases can lead to less-than-helpful advice, like using glue to get cheese to stick to pizza”. This has been identified as a misinterpretation of the language on webpages that have inaccurate information, which is then amplified. Algorithms have received updates to change this behaviour.
  • There are now more query categories or specific search types which will not trigger AI Overviews. This was already in place for anything related to health or news.

PLAY

A few days ago, game developer Ubisoft confirmed a number of its titles are coming to Apple devices, in different phases. In terms of the here and now, Assassin’s Creed Mirage can already be downloaded from the App Store for the iPhone 15 Pro, iPhone 15 Pro Max, and iPad Air and iPad Pro (with an M1 chip, or later). Another title, Rabbids: Legends of the Multiverse, lands on the Apple Arcade subscription bundle and is playable on iPhone, iPad, Mac, Apple TV, and Apple Vision Pro. Meanwhile, Prince of Persia: The Lost Crown is coming to Macs at some point this winter, and preorders are open. Finally, Apple is getting gaming right for the iPad and Mac in particular, which, if this momentum continues, will mount some pressure on Microsoft’s gaming foundations with Windows PCs. It’s a long road ahead.
