Apple's Vision Pro mixed-reality headset will be officially released this week. Analysts expect it to make a significant impact on the market, predicting the company will sell 400,000 devices in the first year and generate $1.4 billion in sales.

The Vision Pro, which starts at $3,499 for the 256GB storage model, offers "spatial computing," the term Apple uses for an experience that lets users arrange different programs strategically for different goals and fully immerse themselves in a digital world. Ahead of its official launch, some tech experts had the chance to try the highly anticipated wearable.

Sharper View Compared to Rivals

Equipped with sharp displays and the same M2 processor found in Macs, the Vision Pro lives up to the performance expectations set by Apple devices. It has a dedicated App Store for Vision Pro apps and supports installing over a million iPhone and iPad apps. Additionally, users can seamlessly pair the device with their Mac and use a virtual 4K display inside the goggles.

According to The Verge, the Vision Pro's strengths lie in its exceptionally sharp and vibrant screens, "passthrough" technology that provides a clear view of the surrounding environment, and a high-performance processor. Notably, the displays eliminate the "screen door" effect seen in lower-cost headsets, enhancing the overall visual experience.

Compared with competitors like the Meta Quest 3, the Vision Pro's passthrough stands out, providing a clearer, sharper view of real-world surroundings in full color with minimal lag. The device features a small digital crown, similar to the controls on the Apple Watch or AirPods Max, that adjusts the volume and dials users into a 3D landscape. Virtual travel is another notable feature, allowing users to simulate working or watching movies in various scenic environments.

(Photo: Justin Sullivan/Getty Images)

Superb Eye and Hand Tracking, But There Are Challenges Too

Apple takes pride in the advanced eye and hand tracking control system featured in the Vision Pro, surpassing other consumer systems on the market. Navigation relies on sensors tracking users' eyes, which eliminates the need for controllers, a CNBC article pointed out.

Deputy tech editor Todd Haselton lauded the new product's easy navigation, describing the setup process as "incredibly accurate" and quick. "You just look where you want to go and then tap your thumb and index finger to select a button or app," he noted.

Initially, the hand and eye tracking system on the Vision Pro is awe-inspiring, offering a sense of superpowers. The external cameras track hands effectively within a considerable zone around the user, so there is no need to hold gestures up in the air. With continued use, however, the novelty diminishes, and certain aspects make the device operationally challenging. The requirement to constantly look at the target for control becomes distracting and, in some cases, counterproductive.

Because the Vision Pro demands the user's focus on an element before executing a command, it creates potential distractions as attention shifts from the task at hand to locating the specific button or control. Plus, the visionOS interface appears tailored for slightly more precise eye tracking than the headset delivers, with controls often placed too closely together, leading to inadvertent clicks.


Notably, the relationship between the eyes and hands controlling the Vision Pro is not direct; cameras observe and translate their movements into input. This distinction becomes apparent in scenarios where interpretation flaws occur. A notable example is the on-screen keyboard, activated by staring at letters and pinching fingers for selection. While suitable for short inputs, its impracticality for longer text entries highlights the need for alternative input methods like dictation or a Bluetooth keyboard for direct control.

Furthermore, the Vision Pro's ability to detect hands is not omnipresent; certain positions or environments may obstruct visibility.

Not a Lot of Mixed Reality?

CNET's Scott Stein noted that using the Vision Pro as a primary computing device creates an immersive environment with virtual monitors surrounding the user. Paired with accessories like the Magic Keyboard and Magic Trackpad, the experience feels powerful. However, he observed anomalies: the smart-type toolbar that hovers virtually above the physical keyboard occasionally triggers unintended actions, and some iPad-like Page options are hard to select, creating a subtle struggle between the virtual and the real.

When aiming for a seamless computing experience, though, he noted that the system's propensity for extra inputs can become frustrating. Balancing the magical aspects of the system with reliable input mechanisms remains crucial for the widespread adoption and effective use of the Apple Vision Pro.

The author also observed that apps and environments need more blending of the virtual and the real, even though Apple's ARKit on iOS already offers that capability. "I haven't seen virtual things hop on my sofa, or run behind furniture. I haven't placed apps on walls or turned windows into aquariums," Stein remarked.

Ultimately, the Vision Pro has the potential to pioneer the move into modern spatial computing for mixed reality. Its futuristic feel comes not just from the display and apps but also from the subtle, intuitive integration of eye and hand tracking, which surpasses other AR headsets in this respect.



