Last update: Monday 6/17/24
Welcome to our 17Jun24 TL;DR summary of the past week's top AI stories on our "Useful AI News" page ➡ (1) Apple Intelligence is boring and practical, (2) Why Apple is taking a small-model approach, and (3) Elon Musk drops lawsuit against OpenAI
A. TL;DR summary of Top 3 stories
1. Apple Intelligence | 2. Small Models | 3. Musk lawsuit
1) Apple Intelligence is boring and practical
Our first two stories for last week were really one Big Event viewed at two levels. The Big Event was Apple's 2024 developer conference. Apple was the last of the five biggest Big Tech companies to enter the generative artificial intelligence race. Apple called its AI entry "Apple Intelligence" -- a name that seemed, at first, to be way too cute. It would have been, had Apple's late entry embodied the same unlimited scope and depth as its predecessors' offerings; but it didn't.
Prior to Apple's announcements, generative AI was driven by models designed to know everything about everything. These ever more massive models had been trained on all of the information on the Internet, often obtained without its owners' permission. In stark contrast, Apple Intelligence is primarily based on the limited body of information on a user's smartphone and personal computer. Right after Apple's keynote on the first day of the conference, most tech pundits posted opinions expressing their disappointment with this limited focus.
But within 12 to 24 hours there was a tectonic shift in these evaluations. Perhaps Apple was right. Perhaps Apple had redefined the generative AI race. Perhaps Apple was now the leading provider of the kinds of genAI services that most users really wanted, specifically most of the world's billion-plus iPhone and Mac users. Note that Apple wasn't very specific about when most of these services would be delivered, other than later in 2024 on iPhones, iPads, and Macs. According to TechCrunch:
- "Outside of some sillier additions like AI emoji, Apple Intelligence is coming to everyday apps and features, with additions like writing help and proofreading tools, AI summaries and transcripts, prioritized notifications, smart replies, better search, photo editing, smarter Siri, and a version of “Do Not Disturb” that automatically understands what important messages need to come through, among other things ...
... The company [Apple] says it focused on use cases where it could identify specific problems that are much more solvable, rather than dealing with the complications that come with working with an AI chatbot. By narrowing its focus, Apple is more assured of providing users with expected results, not hallucinations, and can limit the dangers and safety concerns that come from AI misuse and prompt engineering."
2) Why Apple is taking a small-model approach
As noted above, Apple Intelligence is primarily based on the limited body of information on a user's iPhones, Macs, and iPads. Its proposed services should therefore be within the scope of well-designed small language models (SLMs), which are cost-effective, rather than requiring expensive cloud-based large language models (LLMs).
More importantly, small models that process a user's most personal data can run entirely on the user's iPhone, Mac, or iPad, provided the device contains one of Apple's latest, most powerful chips. Apple claims that its small language models are as powerful as Microsoft's Phi-3 models.
Whatever happens on a user's iPhone, Mac, or iPad can stay securely on that personal device, guaranteeing that the privacy of the user's data is never compromised and thereby conforming to Apple's deep corporate commitment to protecting the privacy of its users' data.
Well, not quite ... Some complex queries might be handled faster and more satisfactorily if they were processed by large language models running on more powerful chips in a cloud. So Apple designed a new kind of cloud called PCC ("Private Cloud Compute") that operates like Web servers did in the good old days before cookies. In other words, the PCC is stateless; it does not remember a user after it responds to the user's query. Indeed, it does not identify its users, and its internal workings are not accessible to Apple, just as Apple does not have access to its users' passwords.
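For readers unfamiliar with the term, here is a minimal sketch of what "stateless" means in this context. The class and method names are purely illustrative, not Apple's actual software; the point is only the contrast between a server that retains user history and one that cannot.

```python
# Illustrative sketch of "stateful" vs. "stateless" servers.
# Names are hypothetical; this is not Apple's actual PCC code.

class StatefulServer:
    """Cookie-era style: remembers who asked what."""
    def __init__(self):
        self.history = {}  # user_id -> list of past queries (retained)

    def handle(self, user_id, query):
        self.history.setdefault(user_id, []).append(query)  # stored!
        return f"answer to {query!r}"

class StatelessServer:
    """PCC style: no user identifier, nothing retained after the reply."""
    def handle(self, query):  # note: no user_id parameter at all
        answer = f"answer to {query!r}"
        return answer  # request data is discarded once this returns
```

After `StatelessServer.handle` returns, no record exists linking the query to any user, which is what makes the "it does not remember a user" guarantee possible by design rather than by policy.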
Finally, for queries that cannot be handled by Apple's PCC, Siri will ask users if they want to pose their queries to OpenAI's ChatGPT running GPT-4o ... free of charge ... but at their own risk. Sometime in the future, Apple may pass users to other AI services in addition to ChatGPT, but, again, at their own risk. Note that Apple is not paying OpenAI for this free service, nor is OpenAI paying Apple for access to some of Apple's billion-plus users.
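The three-tier flow described above can be summarized in a short routing sketch. Everything here is an assumption for illustration: the function names, the predicate parameters, and the tier labels are ours, not Apple's.

```python
# Hypothetical sketch of the fallback flow described in this story:
# on-device SLM first, then Apple's PCC, then (only with the user's
# consent) OpenAI's ChatGPT. All names are illustrative.

def route_query(query, device_can_handle, pcc_can_handle, user_consents):
    """Return which tier answers the query, per the described flow."""
    if device_can_handle(query):
        return "on-device SLM"        # private: never leaves the device
    if pcc_can_handle(query):
        return "PCC"                  # stateless cloud, no user identity
    if user_consents(query):          # Siri asks before going external
        return "ChatGPT (GPT-4o)"     # free of charge, at the user's risk
    return "declined"                 # user said no; query goes nowhere
```

The ordering encodes the privacy story: each step down the chain trades a little privacy for capability, and the only step that leaves Apple's control requires explicit consent.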
3) Elon Musk drops lawsuit against OpenAI
Musk dropped his case this week, but he did not say why. However, he did so in a manner that would permit him to reopen the case at a future date.
B. Top 3 stories in past week ...
- Other Models
"Apple’s AI, Apple Intelligence, is boring and practical — that’s why it works", Sarah Perez, TechCrunch, 6/11/24 ... and a video on X/Twitter ***
-- This story also covered by Engadget
- SLM News
"Why Apple is taking a small-model approach to generative AI", Brian Heater, TechCrunch, 6/11/24 ***
-- Apple's small language model strategy also discussed by VentureBeat, Wired, The Verge, ... and Apple (SLM) plus Apple (PCC "Private Cloud Compute" extension of SLM)
- OpenAI
"Elon Musk drops lawsuit against OpenAI", Emma Roth, The Verge, 6/11/24 ***
-- This story also covered by CNBC, Bloomberg, Reuters, Engadget, Wall Street Journal
Your comments will be greatly appreciated ... Or just click the "Like" button above the comments section if you enjoyed this blog note.