Apple Vision Pro labs wow Apple developers

Developers’ Responses to the Introduction of the Apple Vision Pro Labs

Following the unveiling of Apple Vision Pro at WWDC 2023, Apple has invited the developer community to its newly established Apple Vision Pro labs. The labs give developers a venue to discuss how their apps integrate with the device and what adjustments are needed for those apps to perform well on it. While this process is routine for iOS, iPadOS, macOS, and watchOS, visionOS stands apart: it transforms how users interact with the platform through a visually immersive AR/VR experience.

As a result, developers visiting the Vision Pro labs were surprised and impressed to experience the technology firsthand and to imagine how they could optimize their applications for its 3D visual interface.

Developers experience Apple Vision Pro

As CEO of Flexibits, the team behind successful apps like Fantastical and Cardhop, Michael Simmons has spent over a decade thinking about every aspect of how his team works. But when he brought Fantastical to the Apple Vision Pro lab in Cupertino this summer and experienced it on the device for the first time, he felt something he didn’t expect.

“It was like seeing Fantastical for the first time,” he says. “It felt like I was part of the app.”

Developers around the world have echoed this sentiment. At the labs, they can test their apps, get hands-on experience with the device, and work with Apple experts to get answers to their questions. Developers can apply if they have a running visionOS app, or an existing iPadOS or iOS app they want to test on Apple Vision Pro.

Simmons, for his part, found that Fantastical worked well right out of the box. He describes the labs as a “testing ground” for future exploration and an opportunity to push software beyond its current limits. “A bordered screen can be limiting. Sure, you can scroll or use multiple screens, but in general you’re limited to the edges,” he says. “Experiencing spatial computing not only strengthened the designs we were coming up with; it helped us think beyond left-to-right or up-and-down orientation, and generally beyond borders.”

And not just as CEO, but as lead product designer (and the guy who “still comes up with all these crazy ideas”), he left the lab with a fresh batch of spatial ideas. “Could people view an entire week spatially? Could they compare the current day to the next week? If a day is less busy, could they make that day wider? And then, what if you had the whole week wrapped 360 degrees around you?” he says. “I could probably, no kidding, talk about this for two hours.”

“An audible gasp”

Just before his first visit to the Apple Vision Pro developer lab in London, David Smith, a developer, prominent podcaster, and self-described designer, prepared everything he needed for the day: a MacBook, an Xcode project, and a checklist (on paper!) of what he hoped to accomplish.

All the planning paid off. When he was working with the Apple Vision Pro, “I checked everything off my list,” Smith says. “From there I just pretended I was at home developing the next feature.”

Smith began working on a spatial computing version of his Widgetsmith app almost immediately after the release of the visionOS SDK. While the visionOS simulator provides a solid foundation for testing the experience, the labs offer a unique opportunity for a full day of hands-on time with Apple Vision Pro before its public release. “I had been staring at this thing in the simulator for weeks and had a general sense of how it worked, but it was in a box,” says Smith. “The first time you see your app actually running, there’s an audible gasp.”

Smith wanted to start working on the device as soon as possible so he could get the “full experience” and start refining the app. “I could say, ‘Oh, that didn’t work. Why didn’t it work?’ These are questions you can only really answer with a device.”

“We understand where we’re going”

When testing Spool, Pixite’s video creator and editor, experience manager Ben Guerrette prioritized exploring interactions. “What makes our editor different is that you tap the video to the beat,” he says. “Spool is great on touchscreens because the instrument is right in front of you, but with Apple Vision Pro you’re looking at the interface you choose; in our case, that means you’re looking at the video while touching the interface.”

The team spent time in the lab exploring different interaction models to address this core challenge. “At first, we didn’t know if it would work in our application,” says Guerrette. “But now we understand where we’re going. This kind of learning experience is incredibly valuable: it allows us to say, ‘OK, now we understand what we’re working with, what the interaction is, and how we can create a stronger connection.’”

Chris Delbuck, principal design technologist at Slack, set out to test the iPadOS version of the company’s app on Apple Vision Pro. Once he spent time with the device, “it immediately got me thinking about how 3D offerings and visuals could play out in our experiences,” he says. “I couldn’t have done that without the device in hand.”

“It helps us make better apps”

Simmons says the labs provided not only a playground, but also a way to shape and streamline his team’s thinking about what the spatial experience could really be. “With Apple Vision Pro and spatial computing, I’ve really seen how to start building a limitless canvas, how to stop thinking about what fits on the screen,” he says. “And that helps us make better apps.”
