At WWDC 2023, Apple unveiled its long-awaited augmented reality headset, the Apple Vision Pro, marking the tech giant’s first new product line since the Apple Watch in 2015. According to Apple, the headset is a “spatial computer” that seamlessly blends digital content with the physical world while keeping the user aware of their surroundings.
“Today marks the beginning of a new era in computing,” said Apple CEO Tim Cook. He added, “Just as the Mac introduced us to personal computing and the iPhone introduced us to mobile computing, Apple Vision Pro introduces us to spatial computing.”
The Apple Vision Pro will go on sale in 2024 and will retail at $3,499 (roughly Rs. 2,88,000) in the US.
The Vision Pro has two micro-OLED displays with a combined 23 million pixels. Apple claimed that “this technological breakthrough, combined with custom curved lenses that enable incredible sharpness and clarity, delivers incredible experiences”.
The headset also comes with an external battery pack that’s designed to fit in your pocket and can power the device for up to two hours. For longer sessions, users can continue to use the headset as long as it stays connected to power.
It also has a Digital Crown, similar to the crown found on the Apple Watch, that lets users adjust their level of immersion. “The development of the Digital Crown allows the user to control the degree to which they are present in the environment,” the company said.
The Apple Vision Pro is powered by the M2, Apple’s second-generation in-house silicon. Apple also included a new R1 chip dedicated to real-time sensor processing, which it says “processes inputs from 12 cameras, five sensors and six microphones to ensure that content feels like it is appearing right before the user’s eyes in real time.”
The headset also features a “3D camera” that Apple says allows “users to capture, relive and immerse themselves in their favorite memories with Spatial Audio.”
The headset runs on visionOS, the iPhone maker’s new 3D user interface “that makes digital content appear to exist in the user’s physical world.”
“By dynamically reacting to natural light and creating shadows, it helps the user understand scale and distance. To enable users to navigate and interact with spatial content, Apple Vision Pro introduces a completely new input system controlled by human eyes, hands and voice.
“Users can browse apps simply by looking at them, tapping their fingers to select, flicking their wrist to scroll, or using their voice to dictate,” Apple said.