In the early hours of yesterday morning, at the much-anticipated WWDC, Apple finally unveiled the MR headset it had reportedly been building for years, and it instantly set the tech world alight. In Cook's words, the introduction of Apple Vision Pro is revolutionary: it marks the official start of the era of spatial computing and opens a whole new chapter for personal technology.
Vision Pro is undeniably cool. The overwhelmingly positive first impressions underscore Apple's strength in hardware and UI design, and point to its ambitions for a next-generation computing platform. But amid the excitement, Silicon Star was also watching another important question throughout the event: in the current wave of AI, what is Apple doing, and what does it intend to do?
Judged purely on AI, this event was a big disappointment mixed with a little hope. On the hopeful side, although Apple never uttered the word "AI" on stage, key AI concepts such as machine learning and Transformer models still surfaced from time to time, a number of its products gained new AI-powered features, and the launch of the MR headset may open up fresh room to imagine Apple's AI applications.
On the disappointing side, the entire conference produced no self-developed generative AI product. While Microsoft next door has already begun enabling AI assistants at the system level, Apple is still playing the old game of speech-to-text and keyboard autocorrect, which is inevitably a letdown. Don't forget, it was Apple's Siri that ushered in the era of voice assistants. Yet in a world swept up by ChatGPT, the biggest news for Siri is that it merely lost the "Hey" in "Hey Siri."
At least judging from this year's WWDC, Apple is making an effort on AI, just not a big one.
So what AI updates did Apple actually unveil at this year's WWDC?
Unlike Google I/O and Microsoft Build, which wrapped up in May and revolved almost entirely around AI, AI at Apple's developer conference was more like an Easter egg, buried among the hardware and software updates. It brought some product optimizations, but nothing that truly stood out.
First, at the hardware level, the newly released M2 Ultra chip combines two M2 Max chips to support up to 192GB of unified memory, 50 percent more than the M1 Ultra, enough to smoothly handle large-model workloads and accomplish tasks that other chips cannot. Apple said, for example, that the M2 Ultra holds an obvious advantage over other PC hardware when running large Transformer models.
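To see why that memory headroom matters, here is a back-of-the-envelope sketch (not Apple's methodology; the model sizes and the fp16 assumption are illustrative) of whether a Transformer's weights alone would fit in 192GB of unified memory:

```python
# Rough check: do a Transformer's weights fit in unified memory?
# Illustrative only; real usage also includes activations, KV cache,
# and framework overhead.

def model_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate weight footprint in GB (fp16 = 2 bytes per parameter)."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

UNIFIED_MEMORY_GB = 192  # M2 Ultra's maximum unified memory

for size in (7, 13, 70, 180):  # hypothetical model sizes, in billions of params
    need = model_memory_gb(size)
    verdict = "fits" if need <= UNIFIED_MEMORY_GB else "does not fit"
    print(f"{size}B params ~= {need:.0f} GB in fp16 -> {verdict} in {UNIFIED_MEMORY_GB} GB")
```

Because the memory is unified, CPU and GPU share that single pool, which is the basis of Apple's claim that the chip can hold models too large for typical discrete-GPU setups.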
Beyond the chip, AirPods also quietly gained some new AI-powered features. "Adaptive Audio," backed by machine learning, automatically adjusts media playback volume based on your surroundings and learns your preferences at different times, so you can stay focused on your content or talk with the people around you.
In addition, along with the OS upgrades, some useful little AI features are making their way to Apple users. To be honest, though, none of them stands out, and they all give a feeling of déjà vu.
First, iOS 17 adds a new native app called Journal, reminiscent of the earlier Memories upgrade. Using machine learning, the iPhone can intelligently flag and record significant events or interesting moments in a user's life, and automatically suggest details like photos, music, and recordings for any entry, making it easier to look back.
Apple also launched Live Voicemail: voicemails, as well as audio messages in iMessage, can now be transcribed to text in real time. But this feature will feel all too familiar to most Chinese users; after all, WeChat users have had it for years, and not long ago OpenAI open-sourced its speech-to-text model Whisper, which can accurately recognize 98 languages. Apple did not specify what model powers the feature, saying only that it is built on a powerful Neural Engine.
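For a sense of how accessible open-source speech-to-text has become, here is a minimal transcription sketch using OpenAI's openai-whisper package (the audio file name is a placeholder; this is not how Apple's feature is implemented):

```python
# Minimal speech-to-text sketch with OpenAI's open-source Whisper model.
# Requires: pip install openai-whisper (plus ffmpeg installed on the system).
import whisper

# "base" is a small general-purpose checkpoint; larger ones are more accurate.
model = whisper.load_model("base")

# "voicemail.mp3" is a placeholder path for any audio file.
result = model.transcribe("voicemail.mp3")
print(result["text"])
```

Whisper also auto-detects the spoken language, which is what makes its multilingual coverage possible in a single model.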
The iPhone's built-in keyboard can now predict the next word and autocorrect as you type. Apple says that every time you press a key, the iPhone runs a Transformer language model, powered by the compute of Apple's own chips.
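Apple has not published details of its on-device keyboard model. The sketch below only illustrates the general technique of using a Transformer language model to rank next-word candidates, here with the public gpt2 checkpoint from Hugging Face standing in for Apple's (far smaller, on-device) model:

```python
# Illustrative next-word prediction with a Transformer language model.
# Requires: pip install transformers torch. Uses the public "gpt2"
# checkpoint; Apple's keyboard model is not public.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "I'll meet you at the"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: [batch, seq_len, vocab_size]

# Distribution over the next token; the top entries are the same kind of
# ranked suggestions a predictive keyboard surfaces above the keys.
next_token_probs = logits[0, -1].softmax(dim=-1)
top = torch.topk(next_token_probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id):>12}  p={prob:.3f}")
```

Running this kind of model on every keystroke is exactly the latency problem Apple's claim about its chips' compute is addressing.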