Vision Pro AR/VR: The Future of Spatial Computing

Apple is gearing up for the release of its highly anticipated Vision Pro headset, set to debut in the US on February 2, 2024. The tech giant has recently published detailed app development requirements for the device, shedding light on its unique features and capabilities.

In the run-up to the launch, Apple has emphasized that developers must accurately introduce and describe their creations before submitting them to the device’s App Store. The company has set forth new guidelines for apps intended to run on visionOS, the operating system tailored for the Vision Pro headset.

According to Apple’s new guidelines, apps designed for visionOS should be labeled as “spatial computing” experiences rather than traditional “virtual reality” apps. Developers are also instructed to refrain from describing their apps as augmented reality (AR), virtual reality (VR), extended reality (XR), or mixed reality (MR) apps, even if that’s their intended functionality.

The Vision Pro headset, priced at $3,499, is available for pre-order starting January 19, marking a significant step in the company’s foray into the AR/VR space. Notably, visionOS maintains broad compatibility with iPadOS and iOS apps, making it easier for developers to bring existing titles to the new platform.

Furthermore, Apple’s guidelines note that apps created for iPhone and iPad can run on the Vision Pro headset without modification, underscoring how tightly the new platform integrates with the existing iOS ecosystem.

The submission guidelines for Vision Pro apps also cover disclosures that end users might otherwise overlook. Developers are instructed to convey accurate “motion information,” particularly in the new entertainment category, and to flag any visually intense experiences in the app’s description.

Additionally, Apple has mandated specific privacy labels for visionOS apps, covering elements such as environment scanning, scene classification, image detection, and body tracking. Any scanning practices carried out by third-party code within an app must also be clearly declared to users.
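In practice, these declarations surface as usage-description strings in an app’s Info.plist, which Apple shows to the user when the app first requests access. A minimal sketch of such a fragment follows; the two keys are visionOS privacy keys for world sensing and hand tracking, while the wording of the strings is purely illustrative:

```xml
<!-- Info.plist fragment: user-facing reasons shown when the app
     requests world sensing (scene understanding, plane and image
     detection) or hand tracking on visionOS. -->
<key>NSWorldSensingUsageDescription</key>
<string>Used to detect surfaces and images so content can be placed in your room.</string>
<key>NSHandsTrackingUsageDescription</key>
<string>Used to let you interact with objects using hand gestures.</string>
```

If these strings are missing, the corresponding sensing requests fail, so the declaration requirement is enforced at the API level as well as in review.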

Developers are encouraged to use the visionOS SDK in Xcode 15.2 to create spatial computing experiences tailored for the Vision Pro headset; Apple highlights the “unique” and immersive capabilities of visionOS in additional development resources. The company has also emphasized that existing apps can be upgraded to take full advantage of platform-specific capabilities, giving developers ample opportunity to capitalize on Apple’s latest innovation.
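As a rough illustration of the starting point the SDK provides, a new visionOS target boils down to a SwiftUI scene. The names below (SpatialDemoApp, ContentView) are hypothetical, and the volumetric window style is one of the visionOS-specific options; this is a sketch, not a complete app:

```swift
import SwiftUI

// Hypothetical minimal visionOS app: a single SwiftUI window.
@main
struct SpatialDemoApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // Volumetric windows are a visionOS-specific style that gives
        // the window 3D depth; omit this line for a standard 2D window.
        .windowStyle(.volumetric)
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, spatial computing")
            .font(.largeTitle)
            .padding()
    }
}
```

Because the entry point is ordinary SwiftUI, an existing iPad app can often add a visionOS destination in Xcode and then layer on platform-specific features incrementally.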

George