May 2024

Strategic forecasts to prepare for Apple’s WWDC24

At Loomery we strive to stay on top of new developments in the technology industry, both to build out our in-house capabilities and to find great use cases for new technologies for our clients.

One of the biggest annual events in tech is Apple’s WWDC, and as is tradition at Loomery we will be watching with great interest to see what the tech giant has in store for its platforms.

Naturally we have a few ideas for what might be announced at this year’s event and what this would mean for Apple’s platforms and the wider industry, in areas such as generative AI and spatial computing. This post will detail our predictions...

Siri + Apple Watch = rabbit R1 done right?

Products like the Humane Ai Pin and the rabbit r1 show the appeal of interacting naturally with a wearable device that can execute tasks for you, such as researching and booking flights, even if the execution so far has been poor. They are also brand-new hardware devices that you have to buy and add to an ever-growing ecosystem of gadgets. Yet there is a device most iPhone users already own, one that has been around for years and has all the same hardware capabilities as these newer products: the Apple Watch.

I believe Apple is in a prime position to leverage the groundwork it has been laying over the past several years, with features such as Siri Intents and Shortcuts, to deliver what the rabbit r1 and the Humane Ai Pin promise, but executed in a far superior way and without expensive extra hardware or subscriptions.

iOS apps have supported exposing their functions to Shortcuts for a few years now, allowing actions to be automated; it has just required the user to manually program those automations. Apple could upgrade Siri to use these exposed functions as a kind of highway for an AI to interact with your apps, with no need to touch any UI.

All of this could, of course, be done with the Apple Watch you already own, which already has the hardware to take voice instructions. I would be surprised if this was not part of the rumoured Siri updates this year.

Making your native app support Siri Intents and Shortcuts could prove fruitful come iOS 18 for driving strong usage of your service!
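
To make that concrete, here is a minimal App Intents sketch for a hypothetical flight-search action. The intent, parameter and phrases below are our own invention for illustration, not anything Apple has announced:

```swift
import AppIntents

// Hypothetical example: a flight-search action exposed to Shortcuts and Siri.
// The intent, parameter and phrases are invented for illustration only.
struct SearchFlightsIntent: AppIntent {
    static var title: LocalizedStringResource = "Search Flights"
    static var description = IntentDescription("Finds flights to a destination.")

    @Parameter(title: "Destination")
    var destination: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would call its flight-search service here;
        // a canned response keeps the sketch self-contained.
        return .result(dialog: "Searching flights to \(destination)…")
    }
}

// Registering an App Shortcut makes the action discoverable in the Shortcuts
// app and invocable by voice, with no need to open the app's UI at all.
struct FlightAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SearchFlightsIntent(),
            phrases: ["Search flights in \(.applicationName)"],
            shortTitle: "Search Flights",
            systemImageName: "airplane"
        )
    }
}
```

An intent exposed this way is exactly the kind of hook an upgraded Siri could call on your behalf, whether from the iPhone or the Watch.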

AI-powered user experience

Apple has in fact been adding AI features to its OSs for years. A recent example came in iOS 16, which gave the Photos app the ability to automatically mask people and objects in pictures; this leans on the same family of technology as tools like GPT-4 with Vision. Another is Dictation, which uses AI for speech-to-text, much like OpenAI’s Whisper. And a feature that shipped with iOS 17 last year and got nowhere near enough attention, Personal Voice, leans on the same techniques used to train text-to-speech models.

Automatic image masking in the Photos app on iOS. It even includes AI features like Visual Lookup (iOS 15), which uses image classification: iOS infers the subject is a food item (see the info button in the bottom toolbar).
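
Developers can already tap into some of this on-device ML directly. As a rough sketch, using Vision’s public person-segmentation API rather than whatever Photos uses internally, a mask like the one above can be generated in a few lines:

```swift
import Vision
import CoreVideo
import CoreGraphics

// On-device ML that is already available to developers: Vision's
// person-segmentation request returns a greyscale mask of any people
// in the image, in the same spirit as the Photos subject lift.
func personMask(for image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8  // 8-bit greyscale mask

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // White pixels in the returned buffer mark the detected person.
    return request.results?.first?.pixelBuffer
}
```

Everything here runs on device, which is exactly the kind of quiet, built-in ML Apple tends to favour over headline “AI” branding.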

Apple prefers to classify all these features as ML (machine learning) rather than AI, which is arguably the more accurate term for what is going on under the hood. It also likes to differentiate itself from the rest of the industry. I believe that when most people think of GenAI they think of chatbots like ChatGPT and diffusion models like DALL·E, which is quite a narrow band of what GenAI (or GenML) is capable of; those are merely products built on top of AI, not AI itself, just like the OS features listed above.

Expect more "intelligence features" (as Apple calls them) seamlessly integrated across Apple’s OSs at WWDC24, creating a "magic" user experience where AI is used without noticeable effort. This approach differs from that of competitors, and from what many people and some investors expect, namely a standalone AI chatbot app like an "AppleGPT".

The thing I am most excited about for this year’s WWDC is how Apple sees GenAI being useful to most (non-technical) users, not just tech bros, with the UX made as seamless as possible. Which of these features are made accessible to developers via API will also matter a great deal; my guess is that Apple will want as many apps as possible to adopt them.

It’s likely that Apple has already partnered with third-party apps for early access to intelligence APIs, similar to how they showcased Uber and Nike with Live Activities at WWDC22.

Strategic alliances in GenAI

It's no secret that Apple appears to be behind OpenAI, Google and Anthropic when it comes to training an LLM, so the rumour that it could partner with one of the frontrunners is not far-fetched. Training its own LLM would be very expensive and time-consuming, not to mention a PR hazard if the model hallucinates consistently or has been fine-tuned to follow a particular political leaning, like the issues Gemini had when it first launched.

Apple has always had great on-device ML features for handling user data, but to keep up with its competitors on LLM features it would need to process those tasks server-side. At the moment it is tricky to predict what these features might be; Apple has always been privacy-centric, and sending user data to Google is probably not in its best interest. That makes me think a partnership might be used more in its services, for example Apple Music making AI playlists (similar to the feature Spotify has just launched).

At this point it's hard to tell what impact this partnership (if true) would have on third-party apps, and whether Apple would open up any API that leverages Gemini or other server-side GenAI. It is likely that the GenAI features developers can use will be restricted to Apple’s own on-device intelligence features.

visionOS’s first major update

I would expect a hefty visionOS update this year too, beyond all the AI and Siri updates. For Apple’s past OSs, v2 has always been a big one: iOS got the App Store and watchOS got fully native app support. I’d bet visionOS 2 will fold in a lot of what Apple has learnt from users over recent months, as well as features it couldn’t get ready and tested in time for v1.

Apple will hopefully open up more APIs for taking advantage of the Vision Pro’s hardware, although most of the update will probably be in the UX of the OS itself: improving notifications, and perhaps more customisation features like rearranging apps on the home screen and dark mode… or I guess light mode, as the default windows already have a dark appearance?

Hopefully more Apple apps will be adapted to spatial environments, such as Logic Pro and Final Cut, setting the standard for what professional software should look like according to Apple.

Final thoughts

Predicting the future is always difficult, and when it comes to Apple’s software updates, reliable information and leaks are hard to come by.

However, I believe this year is slightly more predictable. Apple has kept a tight lid on anything GenAI-related for the 18 or so months since ChatGPT was first released, but it will have to show its hand eventually, and it seems all but confirmed that it will do so at WWDC24.

As outlined in this post, there are a few things you can do as a developer to prepare for what is likely to come from the event. Loomery can also provide advice and technical support to get your product ready for the new features that are coming.
