Oh, wow. OK. Apple really is making a $3.5K VR ski-mask. Dev tools are now out for it

They weren't kidding about this facial, er, spatial computing thing

Apple's Vision Pro goggles won't be available until next year, though registered developers can now explore the iGiant's tools for making apps for the virtual-reality headset.

On Wednesday, the tech titan released Xcode 15 beta 2, which includes the visionOS software development kit (SDK), a 3D content design tool called Reality Composer Pro, and a visionOS simulator – visionOS being the operating system powering the ski-mask-like Vision Pro gadget.

Together, these tools give software developers a way to begin creating and testing augmented-reality apps in preparation for actual hardware availability.

"Apple Vision Pro redefines what’s possible on a computing platform," said Susan Prescott, Apple’s VP of worldwide developer relations, in a statement. "Developers can get started building visionOS apps using the powerful frameworks they already know, and take their development even further with new innovative tools and technologies like Reality Composer Pro, to design all-new experiences for their users.

"By taking advantage of the space around the user, spatial computing unlocks new opportunities for our developers, and enables them to imagine new ways to help their users connect, be productive, and enjoy new types of entertainment."

To acclimate developers to the unusual challenges of virtual environment interaction, Apple next month plans to open developer labs in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo. This will provide an opportunity to see how apps actually perform on a facetop. The company is also planning to let teams apply to receive developer kits – presumably prototype hardware of some sort.

A decade ago, Google adopted a similar approach, distributing its unloved Google Glass augmented reality headset through Basecamp stores in San Francisco, London, Los Angeles, and New York. But it shuttered those outlets in late 2014, then in 2019 shut down the Google+ social network where the ad biz published the Basecamp decampment announcement.

Starting next month, developers who have created apps using the Unity development framework will have a shortcut into the world of spatial computing: Apple will provide a way to port Unity games and apps to visionOS.

Figuring out what's possible in visionOS and how to create an appealing experience may not come easily, so it's helpful that devs have at least half a year to hash things out. Interacting with on-screen objects requires reading standard hand gestures supported in the SwiftUI framework or creating custom gestures using ARKit, Apple's augmented reality framework.
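For the curious, here's roughly what that looks like in practice – a minimal, untested sketch of wiring SwiftUI's standard tap gesture to a RealityKit entity in a visionOS view, based on Apple's documented RealityView and gesture APIs (the view name and radius are our own invention):

```swift
import SwiftUI
import RealityKit

// A minimal sketch, not a production visionOS app: one entity in a
// RealityView, receiving SwiftUI's standard tap gesture.
struct TapDemoView: View {
    @State private var tapCount = 0

    var body: some View {
        RealityView { content in
            // The entity needs collision shapes and an input target
            // component before it can receive gesture input.
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            sphere.generateCollisionShapes(recursive: false)
            sphere.components.set(InputTargetComponent())
            content.add(sphere)
        }
        // targetedToAnyEntity() routes the tap – an indirect pinch on
        // Vision Pro – to whichever entity the user is looking at.
        .gesture(TapGesture().targetedToAnyEntity().onEnded { _ in
            tapCount += 1
        })
    }
}
```

Custom gestures beyond the standard set mean dropping down to ARKit's hand-tracking APIs, which is where things get properly fiddly.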

Apple has provided four sample apps that can help developers understand its new computing paradigm. There's Hello World, a demonstration of windows and 3D space; Destination Video, which shows off spatial audio and 3D video; Happy Beam, which shows how to use ARKit for 3D entertainment; and Diorama, which illustrates how to use Reality Composer Pro to create and preview RealityKit content.

Initially, developers will have to get used to how 2D interfaces and associated gestures should behave in a 3D environment. Then there's figuring out how to take advantage of spatial audio, which locates sounds in 3D space. And eventually, devs will have to grapple with how multiple people should interact in a 3D environment, using SharePlay, the Group Activities framework, and Spatial Personas.

There's a lot to think about, like how many people plan to pony up $3,500 for a headset and hand waving. At that price point, it doesn't have to be that many to make it worthwhile. ®
