We make the world more accessible — and more exciting
We make life more user-friendly
Industry Assistant Professor Regine Gilbert and her students are now partnering with the Advanced Research In STEAM Accessibility (ARISA) Lab, an organization dedicated to making science enriching and open to all. The Tandon team will be participating in ARISA's Eclipse Soundscapes Project, an initiative inspired by anecdotal accounts suggesting that animal behavior may change during a total solar eclipse. According to one story, at the moment of totality, when the moon blocked out the sun, a chorus of crickets began chirping. As soon as the light returned, the crickets stopped. The project's creators realized that if eclipses affect the earth in ways that can be experienced and measured using a variety of senses, not merely sight, then people with visual impairments could share in the excitement of eclipse research. Now, through a series of workshops led by subject matter experts from NASA's Science Mission Directorate, citizen scientists will collect audio recordings from eclipses and analyze acoustic data to determine how disruptions in light and circadian rhythms may affect ecosystems.
All workshops, materials, and learning interfaces will be designed to the highest degree of accessibility, with an emphasis on physical, social, and cognitive inclusion, and that’s where Gilbert and her students come in: they’ll be designing, implementing, and testing all citizen-scientist web interfaces for the project, which was recently approved for a five-year cooperative agreement from the NASA Science Mission Directorate’s Science Activation Program.
The VIP Consortium — a network of 45 schools in 12 countries that have implemented an initiative called Vertically Integrated Projects — holds an annual competition to choose the best projects and student teams, and this year, a group of Tandon students took home the second-place prize.
Sixth Sense, the only American-based team to place in the competition, is developing technology aimed at helping people with visual impairments navigate their surroundings. The team is building a machine learning system that can detect physical obstacles faster than the human eye. The user simply dons a backpack containing all the needed electronics, as well as a belt containing several actuators, each of which touches a specific spot on the wearer's abdomen. When a camera attached to the backpack detects an obstacle, a signal is sent to the appropriate actuator, which vibrates to let the user know. (If the actuator on the upper right abdomen vibrates, for example, that signals an obstacle to the upper right of the wearer's position.) The students point out that the system can be especially helpful for those who live in densely populated areas, where navigation is difficult. They foresee it being more cost-effective and convenient than existing methods, which generally rely on bulky handheld devices or on headphones; the latter pose a particular danger, since people with visual impairments often depend on auditory cues to navigate.
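The spatial mapping the students describe, from a region of the camera frame to a specific belt actuator, can be sketched roughly as follows. The grid size, frame dimensions, and function name here are illustrative assumptions, not details of the team's actual implementation:

```python
def actuator_for_obstacle(cx, cy, frame_w=640, frame_h=480, cols=3, rows=2):
    """Map an obstacle's center point (cx, cy) in the camera frame to a
    (row, col) actuator index on the wearer's belt.

    The frame is divided into a rows-by-cols grid; each cell corresponds
    to one actuator, so an obstacle in the upper right of the frame
    triggers the upper-right actuator.
    """
    col = min(int(cx / frame_w * cols), cols - 1)
    row = min(int(cy / frame_h * rows), rows - 1)
    return row, col

# An obstacle detected near the top-right of a 640x480 frame:
print(actuator_for_obstacle(600, 100))  # (0, 2): top row, rightmost column
```

In a real system, the (cx, cy) point would come from the bounding box of an object detector running on the camera feed, and the returned index would select which vibration motor to pulse.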
We make our real and virtual worlds more expansive
Associate Professor of Civil and Urban Engineering Semiha Ergan points out that people spend more than 90% of their time indoors, making it imperative to understand how the built environment influences human experience and to assess how architectural design features affect occupants. Ergan, who also has an appointment in Tandon's Department of Computer Science and Engineering and is an associated faculty member at CUSP, is placing subjects in virtual reality environments and using noninvasive sensors to measure physiological metrics such as skin conductance, brain activity, and heart rate to find out. Do larger windows lead to lower blood pressure? Do narrow doors provoke anxiety? The quantitative data she is gathering is proving very interesting to architects and designers, so if the next building you enter induces a marked feeling of well-being, you just might have Ergan to thank.
The concept of a metaverse, a shared online space made possible by augmented and virtual reality technology, is capturing the popular imagination, with real-life pop stars holding concerts on virtual platforms and collectors paying large sums of money for one-of-a-kind digital objects that confer status in the richly detailed 3D environments of the new universe.
Industry Professor of Integrated Design and Media Carla Gannis, a veteran of creating such environments, warns, however, that the metaverse can pose just as many dangers as the real world, and she's teaching her students to consider the implications of their work. "We really need to think, as we're contemplating the metaverse, about accountability," she has told interviewers. "Are we just going to perpetuate the problems we have in grounded or base reality? The ills of human culture and society can be persistent, unless we're actually thinking ethically about these things."
Alum Benjamin Williams is leveraging his Management of Technology degree as the CEO of his own company, ARsome Technology, which designs custom augmented reality (AR) software and systems for a variety of clients and strategic partners — many in the education and healthcare sectors. (The latter is especially vital, he has pointed out, at a time when an estimated 40% of all Americans exhibit poor health literacy.)
Williams launched ARsome in 2016, while he was still a student, and among his first products was an interactive AR scavenger hunt for a Boston museum. He later developed an interactive experience outside the public library in Hartford, Connecticut, home of Mark Twain, to make a statue of the famed 19th-century writer come to life when patrons pointed their smartphones or tablets at it. Mystic Aquarium, one of the most popular aquatic museums in the country, has been another partner: ARsome has developed an AR experience that allows young subscribers to personalize an avatar and learn about a new creature each month through interactive games, videos, and activities. With AR technology now being used in a wide variety of sectors — from retailing and manufacturing to theme-park design and book publishing — ARsome is poised to grow even further.
We make “Epic” realities
Disney Imagineering, Industrial Light & Magic, Amazon Web Services, T-Mobile, the New York Times — it's an impressive list of companies, and they're all flocking to NYU Tandon's online course for digital filmmakers, content directors, producers, technologists, and students who want to work with the Unreal Engine, a suite of tools that allows for the creation of cutting-edge entertainment, compelling visualizations, and immersive virtual worlds. Considered the most advanced and powerful platform of its kind, it has been used by the makers of blockbusters like The Mandalorian and mega-selling games like Fortnite, and by countless architects and car designers, among other professionals, to support their work.
The course was developed by Todd Bryant, Visiting Industry Assistant Professor, alongside colleagues in the Tandon Integrated Design & Media (IDM) program — co-directed by Technology, Culture & Society Department Co-Chair R. Luke DuBois and Industry Associate Professor Scott Fitzgerald — with the help of a grant from Epic Games (makers of Fortnite).
Up next? A worldbuilding course in which participants will explore relationships to nature in virtual worlds, and another focused on mixed reality filmmaking and the key issues, challenges, and best practices of visual storytelling. By training the existing workforce and students in real-time virtual production pipelines, IDM faculty hope to bridge the gap between talented individuals working in real-time 3D technologies and the studios and agencies using those pipelines to develop projects.
In addition to a cutting-edge curriculum, IDM boasts state-of-the-art facilities, some located in Building 22 of the Brooklyn Navy Yard. These include a 44-camera OptiTrack motion capture studio that has already been used by the American Ballet Theatre, and a volumetric studio that can handle the real-time feeds from 20 Intel RealSense cameras. (Volumetric video captures objects or spaces three-dimensionally in real time, and the resulting captures can then be delivered to web, mobile, or virtual worlds for viewing.)
But what’s the use of having access to all this tech if you’re not going to dive into production yourself? IDM faculty have created Planet Real-time, a variety and interview show entirely about real-time rendering made inside the Unreal Engine. The format gives guests the ability to be interviewed from home while controlling an avatar in the virtual environment with motion and facial capture and to share their experience with a range of projects, from purely art-driven non-commercial pieces to AAA games.
With the help of Epic and Unreal Engine, NYU IDM is definitely ready for its close-up now.
Design students win big
For design students in the Integrated Design and Media program, getting a degree at an engineering school offers a unique opportunity to explore the technology behind great art and design. That approach helped put these students over the top in DesignRush's Global Student Design Competition 2021. Four Tandon students — Sally Lee, Cassandra Liau, Thao Minh Nguyen, and Claudia Shao — were recognized for their work at the intersection of art and engineering. Their projects — ranging from a mobile application that tracks users' physical and mental health by using a 5G medical kit to a digital film-festival ad campaign — offered a glimpse into the unique way the department approaches design as a discipline.