With the popularity of ChatGPT, AI tools and assistants from the big Silicon Valley companies have burst onto the scene: Microsoft invested in OpenAI and launched the new Bing, Google launched Bard, Adobe launched Firefly, and Amazon launched Amazon Titan.

Only Apple, tucked away in its corner of California, appears to the outside world to be a world undisturbed by AI.

Silence

After the AI frenzy of March, companies of all sizes have been rushing into the AIGC field, their products labeled as betas and gated behind waitlists.

These products may get things wrong, fail outright, or even spark public controversy, but Silicon Valley, and the rest of us, are still happy to take part.
Apple, however, has remained relatively silent about AIGC, and no executive or employee has publicly expressed Apple’s true opinion of it.

Apple has always been very careful about its image and brand, and it rarely announces immature products or technologies to the public.
This also explains why Apple is often slower than other companies to adopt new technology.
Apple tries to avoid the growing pains of applying a technology to its products before that technology has matured.

At present, most products in the AIGC industry still revolve around web-based chat, and a killer application has yet to emerge.
It makes sense that Apple, which is focused on its products, would stay quiet.

However, according to available reports, Apple engineers are working on combining large language models with Siri to create a smarter Siri, which would logically arrive in next year’s iOS.
Just like in 2011, when Apple first announced Siri, the intelligent assistant that came with the iPhone 4s, it was a major functional upgrade to the iPhone.

High Profile

Viewed from today, Apple has added a Neural Engine to its chips, branded its SoCs “Bionic,” and applied various AI models and algorithms to photography, biometrics, writing, and more.

These significant optimizations and enhancements to the user experience are easy to overlook; Apple’s use of AI has been quite low-key.
But in 2011, Siri was Apple’s highest profile AI technology debut.

I still remember Scott Forstall introducing Siri by saying that Siri could understand natural language without requiring users to remember a specific format or grammar, and could be customized to fit the user’s habits.
In the promotional videos that followed, Siri really did come across as a built-in assistant that always responded correctly and seemed remarkably intelligent and human.

It’s just that Siri’s technological iteration seemed to stall after 2011. More than a decade later, it still gives the same “naive” answers it did back then.
As people are continually impressed by other companies’ more advanced voice assistants, Siri seems a bit clunky and overly cute.
Especially now that Siri covers almost all of Apple’s devices, iPhones, iPads, Macs, and even AirPods, its slightly old-fashioned way of processing and answering makes it seem even more outdated.

Even Apple’s internal team that developed the XR device was not too keen on using Siri to control devices and functions on the XR device, citing that it was not smart enough.
For this reason, Mike Rockwell, the head of the XR device, considered an alternate solution to replace Siri’s voice control, but it didn’t work out in the end.
The rumored Apple XR headset will still be similar to Apple’s other devices, allowing for simple controls with Siri and, of course, a simple conversation with it if you want.

Climbing

The release of Siri is considered a turning point in smartphone history. After Siri, almost every smartphone manufacturer introduced a similar smart assistant to avoid falling behind.

Apple spent $200 million to acquire Siri Inc. in 2010 and integrated Siri into the iPhone a year later.
After that, Apple assembled a dedicated Siri team, though between Siri’s release in 2011 and 2018 the team was in a funk, with internal debates over management and overall direction.

Apple’s solution was to “hire from the top”: in 2018 it poached John Giannandrea from Google to become Apple’s senior vice president for machine learning and artificial intelligence strategy.
At Google he had been in charge of search and artificial intelligence, and he was regarded as one of its senior AI experts.
With John Giannandrea on board, Apple hoped to leverage his experience to steadily improve Siri and help it catch up with competitors.

John Giannandrea also brought a Google-like culture to the Siri team. When Apple executives demanded immediate changes to Siri, he used the metaphor of “climbing a mountain” to explain how challenges in AI get solved.
For Siri, and for Apple’s AI strategy as a whole, there needed to be a long-term goal, with small optimizations and changes accumulating gradually over time; none of it could be rushed.
In other words, John Giannandrea believed Apple’s foundation in AI was too thin to be built up overnight.
He convinced Apple executives to focus on building the team for the time being, retaining talent and giving researchers more freedom to pursue what interested them.
Most importantly, John Giannandrea raised the AI team’s salaries to industry standards.
In three waves of hiring, Apple’s AI team was bolstered by many former Google AI experts, and Apple acquired the machine learning startup Laserlike for $150 million.

Laserlike’s three founders, Srinivasan Venkatachary, Steven Baker, and Anand Shukla, later became key figures on the Siri team and in Apple’s LLM and search efforts.
The purpose of acquiring Laserlike was to improve Siri’s search capabilities, and Venkatachary logically became the head of Apple’s search team.
In 2019, Apple gave Siri the ability to answer user questions using information from the web, and Siri’s functionality was gradually expanded and enhanced.
However, these changes proved to be more of a “flash in the pan.”

Apple’s artificial intelligence team has run a number of internal projects, from BlackBird, which aimed to run Siri on the iPhone itself, to Siri X, marking Siri’s tenth anniversary.
But beyond internal competition, Apple executives were slow to decide on the overall direction of AI and too conservative about new technologies like LLMs.

In the fall of 2022, Srinivasan Venkatachary, Steven Baker, and Anand Shukla left Apple for Google.
Interestingly, Google CEO Sundar Pichai personally recruited the three, while Tim Cook tried to retain them.
But they felt Google was a better place to work on LLMs, where the technology would quickly make it into products.
Today they work at Google on reducing the training costs of large language models and improving their accuracy.
Not only the Laserlike team, but most of the other experts and teams Giannandrea personally recruited have also left Apple, because Apple seemed to be less focused on AI research.
After John Giannandrea’s arrival, Apple’s AI effort struggled in its climb up the mountain, perhaps because of differences over the company’s strategic direction.

Struggle

Privacy protection has long been a company-level strategy at Apple.
When privacy is at stake, everything else gives way, and there is no compromise.

John Giannandrea joined Apple with a clear goal: to make Siri smarter by optimizing how Apple uses user data to train its algorithms.
Within companies like Google and Amazon, improving algorithmic models by collecting and analyzing user data is routine.
It is precisely this process that makes the algorithms, and the AI, smarter.

Apple had previously collected data on Siri’s conversations with users without tying it to identities, but it did so haphazardly and never used the data to improve Siri.
With John Giannandrea’s arrival, Apple brought in a number of outsourcing firms to review the data and finally put Siri through a proper optimization process.
But in 2019, The Guardian caused an uproar when it revealed that Apple’s outsourced teams were listening to users’ conversations with Siri without their consent, a particularly damaging story given that Apple has always been known for its focus on privacy.
In response, Apple eventually replaced the outsourcers with full-time employees and changed its internal processes and policies so that it became almost impossible for ordinary employees to hear Siri’s recorded conversations.
Such rules also make it harder for the AI team to optimize and iterate promptly, which is one reason Siri seems so dated today.
“The downside of what they’re doing is going to become more and more obvious,” argues Pedro Domingos, a professor of computer science at the University of Washington and author of the machine learning book The Master Algorithm, “and they’re going to have to mine more and more private data to be more competitive with everyone else.”

Some of Siri’s odd answers, which frequently made headlines, also caught Tim Cook’s attention. He often bypassed process and directly asked the Siri team to revise “embarrassing” answers.
Apple is very conscious of its corporate image: it limits data collection in the name of privacy, and it avoids awkward answers by correcting them manually.
As a result, many former members of Apple’s AI team believe it will be difficult for Apple to deploy an LLM-based Siri any time soon, even with the massive funding and resources it now has.
In addition, Apple has set a number of rules within Siri; for example, when asked the price of an iPhone, Siri points users to the Apple website rather than giving a direct answer.
Apple is not a technology-first company; all of its services and technologies serve the product, which means selling more iPhones, iPads, and Macs.
So for a long time, the design team, which believed everything should be 100% perfect, had a big say.
But no algorithm can be 100% accurate. Mistakes are inevitable, and it is by exposing them that a model can be better optimized.
The gap between these two pursuits also made the AI team’s job stressful. After Giannandrea’s wrangling, the software design team agreed to add a feedback button to Siri so users could report whether an answer was accurate.
Whether over privacy, workflow, or the demand for a 100% perfect product, Siri’s AI team went through a series of struggles, competing with giants like Google and Amazon in AI with its hands tied.

Unknown

This “unknown” can mean many things.
Apple’s AI team has run many projects to improve Siri, such as Siri X, BlackBird, and Pegasus.
Although some project members have left, these projects are finally nearing completion, ready to replace or be folded into the current Siri.
However, as John Giannandrea has argued, an artificial intelligence model is a rather complex system involving a great deal of work.
Blind modifications and replacements can cause unpredictable problems.

In addition, compared with other voice assistants such as Amazon’s Alexa or Google Assistant, Siri’s answers rely heavily on human involvement.
That is, Siri’s database carries a decade’s worth of human-imposed limits, interventions, and tweaks, making it complex and unwieldy.
It also works quite differently from the databases behind today’s large language models; Apple cannot simply bolt on an API to run something like ChatGPT smoothly with similar functionality.
What Apple has to decide now is whether it wants a smart Siri, and whether Siri should be rebuilt from scratch or optimized layer by layer.
At the same time, no one can deny Apple’s profitability, cash flow, and vast ability to mobilize resources, along with its hardware assets and expertise in chips and devices.
If Apple makes up its mind to commit to AIGC, it can train complex large language models and build its own generative AI.

But Apple still has to decide whether it needs to “reinvent the wheel” or instead put its vision and resources into applying AIGC on devices and embedding it into its ecosystem.
After all, the LLMs that power complex services like ChatGPT still run in the cloud, and there is still a gap on the device side.
In terms of hiring, Apple is currently focusing more on visual recognition specialists, which suggests a focus on areas suited to XR.
For Apple, the AIGC technology explosion is less than a year old, and applying it to Apple products remains a very preliminary effort.
Apple has been preparing for years for the XR device it hopes will succeed the iPhone as the next big trend; perhaps that is what Apple should care about most right now, and where it should focus its product development.
As for turning Siri into a Smart Siri or a Siri Copilot, let the AI team led by John Giannandrea climb the mountain a little longer.
