Apple has revealed a lot about the Vision Pro, its upcoming mixed-reality headset, even though it won't ship until 2024. It devoted nearly an hour of the WWDC keynote to the device and has since released developer tools for visionOS, which tell us a good deal about what the operating system can and can't do.
Still, I don't think Apple has told the whole story. The WWDC keynote was a chance to make a first impression, and with months to go before launch, Apple chose to highlight the features it considered most representative. Chief among them: positioning the product as an augmented-reality device that keeps you connected to the people around you, a deliberate counter to any criticism that Apple wants to use its technology to isolate people from one another.
More is yet to come with the Vision Pro. Apple has a bigger story to tell.
By default, the Vision Pro presents itself as an augmented-reality headset, encouraging users to engage with the world around them and letting them dial reality in and out with the Digital Crown. But in truth it's a virtual-reality headset that simulates augmented reality, and that simulation can simply be turned off. Users who don't want to work in the airy rooms of Apple's demo video can just as easily opt for a fully virtual environment.
App developers, meanwhile, can build entire VR experiences inside their apps. Apple will strongly encourage apps to work in augmented reality, and will require it when users are up and walking around, but that won't stop developers from shipping traditional virtual-reality experiences.
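The visionOS developer tools back this up: a SwiftUI app can declare an `ImmersiveSpace` scene and request the fully immersive style, which replaces the passthrough view entirely. Here's a minimal sketch using the SDK's `ImmersiveSpace` and `immersionStyle` APIs (the app, scene ID, and content are illustrative placeholders):

```swift
import SwiftUI

@main
struct ExampleVRApp: App {
    // Tracks which immersion style the space is currently using.
    @State private var immersionStyle: ImmersionStyle = .full

    var body: some Scene {
        // A fully immersive space hides the real-world passthrough,
        // giving the app a traditional VR presentation.
        ImmersiveSpace(id: "vr-scene") {
            // RealityKit or other 3D content would go here.
        }
        .immersionStyle(selection: $immersionStyle, in: .full)
    }
}
```

An app could instead allow `.mixed` or `.progressive` styles in the same modifier, letting users blend virtual content with their surroundings rather than replacing them outright.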
Apple didn't mention VR games during its presentation, even though they're the most popular category of software on today's VR devices. But VR games will certainly be part of the Vision Pro experience. Whether Apple's hand- and eye-tracking will meet the demands of different VR games remains to be seen, but we'll find out soon enough.
One surprise: Apple revealed no integration between the Vision Pro and its Fitness+ service, even though fitness-focused apps like Supernatural and Beat Saber have built substantial followings on Meta's headsets by making routine workouts fun.
Bloomberg's Mark Gurman has suggested Fitness+ may not be on the Vision Pro at launch. But if Apple doesn't fill that gap, someone else will; clever app developers will be racing to become the next Beat Saber. However heavy or awkward the Vision Pro may be, and however much sweat its 3D-knit headband soaks up, people are going to try to use it for fitness. It's inevitable.
At WWDC, Apple wanted to impress with its Digital Persona feature: a realistic representation of a user's face, generated from an initial scan and animated by Vision Pro sensors that track eye movements, facial expressions, and gestures. But I, along with several others who tried the Vision Pro at WWDC, found those animations unsettling and a bit eerie.
I don't think Apple ever intended the Digital Persona to be the only way to represent yourself while wearing the Vision Pro. It's technology Apple is proud of, so it served double duty in the demo: a bragging point, and an explanation of how the EyeSight feature works.
But the Digital Persona can't be the only option. Not everyone will be comfortable using an exact replica of themselves, given the choice. Apple's Memoji creation interface, by contrast, offers a broad range of characteristics, including some well beyond human appearance, and users can even choose an animated emoji character as their stand-in.
Honestly, it seems likely that the Memoji project was a preliminary step toward avatars for the Vision Pro. Apple forcing Digital Personas on everyone, including people who would rather present as an elephant? That's just not going to happen.
Apple's Vision Pro film includes a scene in which a user speaks a website address aloud, a brief glimpse of Apple's voice assistant working with the headset. But it's almost inconceivable that voice control won't be a major part of the device's interface.
During my time with the Vision Pro, there were several moments when I realized that a voice command would have been faster than manipulating a virtual interface with hand gestures. But Siri wasn't active on the demo headsets, most likely because it wasn't yet up to Apple's standards, and perhaps because a heavy emphasis on Siri would have undercut the novelty of the other interface work the company has done for the Vision Pro.
There's also this: showing lots of people sitting alone on couches and chairs, relying on voice control, could reinforce the perception of these devices as antisocial. Constantly talking to Siri implies there's nobody nearby to hear you, and that's not the impression Apple wants to give right now.
But voice is baked into visionOS, and on a platform you look at and talk to, it just makes sense.
Other Apple Devices
The Vision Pro appears to be a peer to the iPad and iPhone, able to run iOS apps on its own without depending on those devices. In its presentation, Apple showed very little direct integration between the Vision Pro and its most popular products.
It may not happen at launch, but I find it hard to believe the iPhone won't eventually integrate closely with the Vision Pro. In macOS Sonoma, Apple is introducing iPhone widgets that can be projected onto your Mac desktop. Why not extend that to the Vision Pro? Imagine launching and viewing iPhone apps directly in the headset, even several at once.
More broadly, Apple is surely hard at work using Continuity features to connect all of its devices to the Vision Pro, with the iPhone, as its most important product, at the top of the list.
The Vision Pro's release is still some time away. We've heard Apple's opening sales pitch, but it's nowhere near time for the company to close the deal. There's more to this story than has been revealed so far, and it has yet to fully unfold.