Tandon team wins $5 Million DARPA contract to develop AI-driven augmented reality assistant


BROOKLYN, New York, Monday, December 20, 2021 – The Visualization, Imaging and Data Analysis (VIDA) Center at the New York University Tandon School of Engineering is leading an NYU initiative to develop an artificial intelligence (AI) “virtual assistant” that provides just-in-time visual and audio feedback to help with task execution.

The project is part of a national effort involving eight other institutional teams, funded by the Defense Advanced Research Projects Agency (DARPA) Perceptually-enabled Task Guidance (PTG) program. With the support of a $5 million DARPA contract, the NYU group, led by Claudio Silva, director of the VIDA Center and professor of computer science and engineering and data science at NYU Tandon and NYU CDS, aims to develop AI technologies that help people perform complex tasks, making users more versatile by expanding their skill sets and more proficient by reducing their errors.

The NYU group, including investigators at NYU Tandon’s Department of Computer Science and Engineering, the NYU Center for Data Science and the Music and Audio Research Laboratory (MARL), will conduct fundamental research on knowledge transfer, perceptual grounding, perceptual attention and user modeling to create a dynamic intelligent agent that engages with the user, responding not only to circumstances but also to the user’s emotional state, location, surrounding conditions and more.

Dubbing it a “checklist on steroids,” Silva says the project aims to develop the Transparent, Interpretable, and Multimodal Personal Assistant (TIM), a system that can “see” and “hear” what users see and hear, interpret spatiotemporal contexts and provide feedback through speech, sound and graphics.

While the project’s initial use cases, chosen for evaluation purposes, focus on military applications such as assisting medics and helicopter pilots, countless other scenarios can benefit from this research, effectively any physical task.

“The vision is that when someone is performing a certain operation, this intelligent agent would not only guide them through the procedural steps for the task at hand, but also be able to automatically track the process, and sense both what is happening in the environment, and the cognitive state of the user, while being as unobtrusive as possible,” said Silva.

He noted that the DARPA project, focused as it is on human-centered and data-intensive computing, is right at the center of what VIDA does. “Most of our current projects have an AI component and we tend to build systems — such as the ARt Image Exploration Space (ARIES) in collaboration with the Frick Collection, the VisTrails data exploration system, or the OpenSpace project for astrographics, which is deployed at planetariums around the world. What we make is really designed for real-world applications, systems for people to use, rather than as theoretical exercises,” said Silva.

The project brings together a team of researchers from across computing, including visualization, human-computer interaction, augmented reality, graphics, computer vision, natural language processing, and machine listening. It includes 14 NYU faculty and students, with co-PIs Juan Bello, director of the Center for Urban Science and Progress (CUSP) at NYU Tandon; Kyunghyun Cho and He He, associate and assistant professors (respectively) of computer science and data science at NYU Courant and CDS; and Qi Sun, assistant professor of computer science and engineering at NYU Tandon and a member of CUSP.

The project will use the Microsoft HoloLens 2 augmented reality system as its hardware platform testbed. Silva said that, because of its array of cameras, microphones, lidar scanners, and inertial measurement unit (IMU) sensors, the HoloLens 2 headset is an ideal experimental platform for Tandon’s proposed TIM system.

“Integrating HoloLens will allow us to deliver massive amounts of input data to the intelligent agent we are developing, allowing it to ‘understand’ the static and dynamic environment,” explained Silva, adding that the volume of data generated by the HoloLens sensor array requires integration with a remote AI system over a very high-speed, low-latency wireless connection between the headset and remote cloud computing.

To hone TIM’s capabilities, Silva’s team will train it on a process that is at once mundane and highly dependent on the correct, step-by-step performance of discrete tasks: cooking. A critical element of this video-based training process is to “teach” the system to locate, by interpreting video frames, the starting and ending points of each action in the demonstration.

“It’s conceivable that in five to ten years these ideas will be integrated into almost everything we do,” Silva said.

About the New York University Tandon School of Engineering

The NYU Tandon School of Engineering dates to 1854, the founding date for both the New York University School of Civil Engineering and Architecture and the Brooklyn Collegiate and Polytechnic Institute. A January 2014 merger created a comprehensive school of education and research in engineering and applied sciences as part of a global university, with close connections to engineering programs at NYU Abu Dhabi and NYU Shanghai. NYU Tandon is rooted in a vibrant tradition of entrepreneurship, intellectual curiosity, and innovative solutions to humanity’s most pressing global challenges. Research at Tandon focuses on vital intersections between communications/IT, cybersecurity, and data science/AI/robotics systems and tools and critical areas of society that they influence, including emerging media, health, sustainability, and urban living. We believe diversity is integral to excellence, and are creating a vibrant, inclusive, and equitable environment for all of our students, faculty and staff. For more information, visit engineering.nyu.edu.