A team of researchers from NYU Tandon School of Engineering is taking a step closer to advanced augmented reality (AR) assistants by introducing the Augmented Reality Guidance and User-Modeling System, or ARGUS. 

(Photo: JEAN-FRANCOIS MONIER/AFP via Getty Images) A visitor uses a virtual reality (VR) headset during the "Laval Virtual" virtual reality, augmented reality, and 3D technology show, which ran from April 3 to April 8, 2018, in Laval, northwestern France.

All About ARGUS

ARGUS is an interactive visual analytics tool tailored to bolster the creation of intelligent AR assistants compatible with devices like Microsoft HoloLens 2 or MagicLeap. Its capabilities empower developers to gather and scrutinize data, model human task performance, and identify and rectify issues within the AR assistants they are constructing.

Claudio Silva, an NYU Tandon Institute Professor of Computer Science and Engineering and a Professor of Data Science at the NYU Center for Data Science, spearheaded the research team behind ARGUS. 

The team is slated to present their findings on this tool at IEEE VIS 2023 on October 26 in Melbourne, Australia, where their paper will receive an Honorable Mention in the Best Paper Awards.

"Imagine you're developing an AR AI assistant to help home cooks prepare meals. Using ARGUS, a developer can monitor a cook working with the ingredients, so they can assess how well the AI is performing in understanding the environment and user actions," Silva noted

"Also, how the system is providing relevant instructions and feedback to the user. It is meant to be used by developers of such AR systems," he added.


ARGUS' Modes

ARGUS operates in two distinct modes: online and offline. The online mode is designed for real-time monitoring and debugging while an AR system is actively running.

Developers gain insights into what the AR system perceives and how it interprets user actions and its environment. Moreover, they can fine-tune settings and capture data for future analysis.

Conversely, the offline mode is geared towards examining historical data generated by the AR system. It equips developers with tools to delve into and visualize this data, aiding their comprehension of the system's past behavior. 

The offline mode of ARGUS comprises three essential elements: the Data Manager, which aids in organizing and filtering AR session data; the Spatial View, which provides a 3D depiction of spatial interactions within the AR environment; and the Temporal View, focusing on the chronological sequence of actions and objects during AR sessions. This combined functionality simplifies thorough data analysis and debugging. 
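To make the three offline-mode components more concrete, here is a rough, illustrative sketch of how recorded AR session data might be organized, filtered, and viewed chronologically and spatially. All class and method names below are hypothetical and do not reflect ARGUS's actual API; they only illustrate the division of labor the paper describes.

```python
from dataclasses import dataclass

@dataclass
class SessionEvent:
    """A single recorded interaction from an AR session."""
    timestamp: float            # seconds since the session started
    action: str                 # e.g. "grab_ingredient" (hypothetical label)
    position: tuple             # (x, y, z) location in the AR scene

class DataManager:
    """Organizes and filters recorded AR session events (Data Manager idea)."""
    def __init__(self, events):
        # Store events in time order regardless of recording order.
        self.events = sorted(events, key=lambda e: e.timestamp)

    def filter_by_action(self, action):
        return [e for e in self.events if e.action == action]

def temporal_view(events):
    """Chronological sequence of actions (Temporal View idea)."""
    return [(e.timestamp, e.action) for e in events]

def spatial_view(events):
    """3D positions of interactions in the environment (Spatial View idea)."""
    return [e.position for e in events]

# Example session with events recorded out of order.
session = DataManager([
    SessionEvent(2.0, "place_pan", (0.1, 0.9, 0.3)),
    SessionEvent(0.5, "grab_ingredient", (0.4, 0.9, 0.2)),
])
print(temporal_view(session.events))
print(spatial_view(session.filter_by_action("place_pan")))
```

The point of the sketch is the separation of concerns: one component owns storage and filtering, while the temporal and spatial views are read-only projections of the same event stream, which is what lets a developer cross-reference when something happened with where it happened.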

"ARGUS is unique in its ability to provide comprehensive real-time monitoring and retrospective analysis of complex multimodal data in the development of systems," said Silva. 

"Its integration of spatial and temporal visualization tools sets it apart as a solution for improving intelligent assistive AR systems, offering capabilities not found together in other tools," he added.

The team's findings were published on arXiv.



ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.