Simulating Interactions with Visually Impaired | NYU Tandon School of Engineering


Health & Wellness


Project Sponsor:



Project Abstract

Urban environments present particularly dire challenges to the mobility of the visually impaired, who must travel complex routes in often crowded and noisy conditions with little or no assistance. To help visually impaired people regain their independence, they are offered orientation and mobility (O&M) training. However, O&M training poses a risk to the visually impaired, as it exposes them to dangerous situations and falls. We seek to overcome this issue by simulating O&M training in virtual and augmented reality (VR/AR), in which trainers and trainees interact within a safe, controlled environment that simulates part of a city.


Project Description & Overview

Visual impairments will become a preeminent public health issue as more baby boomers turn 65 and older. To reduce the impact of these disabilities on mobility, visually impaired people attend orientation and mobility (O&M) training sessions, in which they learn techniques to travel safely within their community. These techniques include using a white cane, walking in a straight line, and crossing an intersection in urban environments. Clearly, O&M training exposes visually impaired trainees to potentially serious harm, including accidental falls and unwanted contact with people and objects.

Our previous work demonstrated that a virtual/augmented reality (VR/AR) platform can help overcome these dangers. Trainees can learn and practice new O&M techniques in a completely safe environment before transferring them to the real world. However, our previous study focused on a single-player platform, which did not allow virtual interactions between trainers and trainees.

In this Capstone Project, students will extend our previous work by implementing a VR/AR multiplayer platform in which two users (trainer and trainee) interact in a virtual environment. VR/AR will be exploited to simulate visual impairments in the trainee. Students will design an O&M training scenario in a realistic, high-risk environment, such as a busy intersection in NYC, and implement it in VR/AR. They will formulate and perform hypothesis-driven experiments with human subjects to investigate technology-mediated interaction in training sessions. Ultimately, we aim to demonstrate the potential of a multiplayer VR/AR platform to train visually impaired persons in O&M techniques in a controlled, safe environment.


Datasets

No datasets required.


Competencies

  • Programming (preferably Unity, C#, and Python)
  • Data analysis and visualization (using R, Python, or MATLAB)
  • Statistics

We are looking for highly motivated students with a passion for exploring and learning new concepts that span engineering and medical science. Students should also show a keen interest in rehabilitation and human-computer interaction.


Learning Outcomes & Deliverables

  1. Students will learn how to develop advanced VR/AR platforms for future experiments;
  2. Students will learn to design experiments involving humans and their technology-mediated interaction;
  3. Students will learn to formulate and test research hypotheses in a statistical framework.
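To make the third outcome concrete, the sketch below shows one common way to test a research hypothesis of the kind this project could generate: do trainees complete a virtual route faster when a trainer is present in the shared environment? All numbers are hypothetical placeholders, and the choice of a two-sample permutation test is only an illustrative assumption; the actual experimental design and analysis are part of the students' work.

```python
import random
import statistics

# Hypothetical route-completion times (seconds) for trainees practicing
# alone vs. with a trainer present in the shared virtual environment.
# These values are illustrative, not real experimental data.
solo    = [92.1, 88.4, 101.3, 95.0, 99.8, 90.2, 97.5, 93.6]
trainer = [81.5, 84.2, 79.9, 88.0, 82.7, 86.1, 80.3, 85.4]

# Test statistic: difference in mean completion time between groups.
observed = statistics.mean(solo) - statistics.mean(trainer)

# Two-sample permutation test: under the null hypothesis (trainer
# presence has no effect), group labels are exchangeable, so we shuffle
# the pooled data and recompute the statistic many times.
rng = random.Random(0)          # fixed seed for reproducibility
pooled = solo + trainer
n_solo = len(solo)
n_perm = 10_000
count = 0
for _ in range(n_perm):
    rng.shuffle(pooled)
    diff = statistics.mean(pooled[:n_solo]) - statistics.mean(pooled[n_solo:])
    if diff >= observed:        # one-sided: trainer group is faster
        count += 1

p_value = count / n_perm
print(f"observed difference = {observed:.1f} s, p = {p_value:.4f}")
```

A small p-value here would argue against the null hypothesis of no trainer effect; in practice the students would also need to account for sample size, within-subject designs, and multiple comparisons.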

Students

Jianjun Chen, Andrew Lan, Ruyi Quan, Hanyu Tian