In the air
Assistant Professor of Electrical & Computer and Mechanical & Aerospace Engineering Giuseppe Loianno is exploring ways of making drones more stable mid-flight while carrying swinging loads. Typically, engineering aerial drones to maintain stability while lofting cargo is an expensive and complex endeavor. Research by Loianno and his team at the Agile Robotics and Perception Lab (ARPL) has made such capabilities possible in smaller, more affordable drones. The investigators can keep the drone in control of its payload, which dangles underneath, using only onboard equipment. “We want drones to be capable of operating efficiently in less-than-optimal conditions, such as crowded urban areas with lots of physical obstacles or locales where GPS is difficult to access,” graduate student Guanrui Li says. “Drones can be sent into situations no person would want to be in, such as inspecting a nuclear reactor or undertaking a perilous search-and-rescue mission. One day you may even look up and see a team of drones delivering a heavy package to your door.”
A drone that can operate safely near dangerous power lines? That vision is becoming a reality thanks to Manifold Robotics, a company launched by alum Jeffrey Laut (then a student) and Institute Professor Maurizio Porfiri. Manifold recently partnered with the U.S. Army Research Laboratory, which granted the company a license for its power line detect-and-avoid technology, and Manifold will use it to keep unmanned aerial vehicles safe while inspecting power lines (or while flying anywhere near them).
On the ground
Robots capable of the sophisticated motions that define advanced physical actions like walking, jumping, and navigating terrain can cost $50,000 or more, making real-world experimentation prohibitively expensive for many. Last year, a team at NYU Tandon helped design a relatively low-cost, easy-and-fast-to-assemble quadruped robot called “Solo 8” that could be upgraded and modified, opening the door to sophisticated research and development for teams on limited budgets, including those at startups, smaller labs, or teaching institutions. This year, the team has refined the robot, now called “Solo 12,” which has even more capabilities. For organizations without the resources to undertake the open-source project on their own, it’s now possible to buy Solo 12 either fully assembled or in kit form, thanks to a partnership with Barcelona-based PAL Robotics. Solo 12 is not the only robot perambulating around Tandon’s Machines in Motion Laboratory, which is run by Associate Professor of Electrical & Computer and Mechanical & Aerospace Engineering Ludovic Righetti. His team also recently developed Bolt, a biped robot capable of dynamic walking thanks to a novel algorithm. “If you understand how humans and animals move, you can devise algorithms that will allow biped and quadruped robots to move in a similar fashion,” Avadesh Meduri, a graduate student who works in the lab, explains. “If you can figure out a way for robots to walk on uneven terrain without falling or grasp delicate objects without breaking them, they could be used in endless ways. There are too many practical applications to list completely here: everything from better-performing exoskeletons for those without use of their limbs to robots capable of climbing scaffolding to aid construction workers, entering burning buildings to save inhabitants, assisting in surgery, and deactivating landmines. We’re in an exciting and transformative period. Robots are going to be making people safer, healthier, and more productive.”
Like toddlers, bipedal robots are inherently unstable. Sometimes the problem comes from the terrain a robot is navigating: is it uneven or rocky? Sometimes the robot itself has insufficient control or an unwieldy design. Other times, outside interactions are a factor: a legged robot that is pushed hard enough will inevitably topple. Associate Professor of Mechanical and Aerospace Engineering Joo H. Kim, who heads Tandon’s Applied Dynamics and Optimization Laboratory, and his team of graduate student researchers have introduced an algorithmic framework that can estimate the balance stability states of a biped system, and they have used it to demonstrate walking and push-recovery control. The goal, he has explained, is to determine computationally, in advance, whether a robot is well-balanced enough, rather than finding out in the field, where a toppling robot could damage itself or its surroundings.
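Kim’s framework is considerably more general, but the flavor of such a precomputed balance check can be illustrated with the classic “capture point” test from the linear inverted pendulum model of bipedal balance: a standing robot can come to rest without taking a step only if its capture point lies inside its support polygon. The one-dimensional sketch below is purely illustrative (the numbers and function names are invented, not the lab’s actual code):

```python
import math

def capture_point(com_pos, com_vel, com_height, g=9.81):
    """Instantaneous capture point of a linear inverted pendulum.

    com_pos, com_vel: horizontal center-of-mass position (m) and velocity (m/s)
    com_height: center-of-mass height above the ground (m)
    """
    omega = math.sqrt(g / com_height)   # pendulum natural frequency (1/s)
    return com_pos + com_vel / omega    # where the foot must be to stop

def is_balanced(com_pos, com_vel, com_height, support_min, support_max):
    """True if the capture point lies inside the (1-D) support polygon,
    i.e. the robot can come to rest without stepping."""
    cp = capture_point(com_pos, com_vel, com_height)
    return support_min <= cp <= support_max

# A robot standing still over its feet is balanced...
print(is_balanced(com_pos=0.0, com_vel=0.0, com_height=0.9,
                  support_min=-0.1, support_max=0.1))   # True
# ...but a hard push (large CoM velocity) drives the capture point
# outside the support polygon, so the robot must step or fall.
print(is_balanced(com_pos=0.0, com_vel=0.5, com_height=0.9,
                  support_min=-0.1, support_max=0.1))   # False
```

The appeal of checks like this is exactly what Kim describes: they can run ahead of time, flagging an impending fall before the hardware ever leaves the lab.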
In the cloud
There are many real-world — and, someday, off-world — applications for lightweight, energy-efficient, fully autonomous robots. Yet the more autonomous a robot is, the greater its computational requirements. Adding onboard components to handle that computation increases weight and cost and limits the potential for applications in hostile environments. It can thus be desirable to offload intensive computation — not only sensing and planning, but also low-level whole-body control — to remote servers in order to reduce onboard computational needs. 5G wireless cellular technology, with its low latency and high bandwidth, has the potential to unlock cloud-based high-performance control of complex robots. However, state-of-the-art control algorithms for legged robots can tolerate only very small control delays, which even ultra-low-latency 5G edge computing can sometimes fail to achieve. A multidisciplinary team of Tandon researchers — including Ludovic Righetti as well as Siddharth Garg, Sundeep Rangan, and Elza Erkip of the Department of Electrical and Computer Engineering and members of NYU WIRELESS — is now investigating cloud-based whole-body control of legged robots over a 5G link. In simulations, their novel approach has significantly reduced onboard computational needs while increasing robustness.
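The core constraint is easy to see in a toy model. The sketch below is not the NYU team’s method; it simply simulates a 1 kHz control loop in which a cloud-computed command is usable only if it returns within a latency deadline, with any late steps left to a simpler onboard fallback controller (the jitter model, deadline, and rates are all invented for illustration):

```python
import random

CONTROL_DT_MS = 1.0   # hypothetical 1 kHz whole-body control loop
DEADLINE_MS   = 2.0   # a command older than this is considered stale

def simulate(round_trip_ms, n_steps=1000, seed=0):
    """Estimate how often a cloud-computed command arrives in time.

    round_trip_ms: mean 5G round-trip latency (crude jitter model).
    Returns the fraction of control steps served by the cloud; the
    remaining steps would fall back to an onboard safety controller.
    """
    rng = random.Random(seed)
    served = 0
    for _ in range(n_steps):
        # crude jitter model: latency varies +/-50% around the mean
        latency = round_trip_ms * rng.uniform(0.5, 1.5)
        if latency <= DEADLINE_MS:
            served += 1
    return served / n_steps

print(f"1 ms mean RTT: {simulate(1.0):.0%} of steps served from the cloud")
print(f"3 ms mean RTT: {simulate(3.0):.0%} of steps served from the cloud")
```

Even this crude model shows why millisecond-scale round trips matter: once the mean latency exceeds the control deadline, most commands arrive too late to be useful, which is why the researchers pair 5G edge computing with control schemes that tolerate missed updates.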
In outer space
What do Coney Island and Mars have in common? Ask the NYU Robotics Design Team.
It takes a lot to capture the attention of New Yorkers, but carrying an autonomous Mars rover on the subway from Downtown Brooklyn to Coney Island will generally do the trick, according to the members of the NYU Robotics Design Team.
The team used the iconic beach while preparing to take part in NASA’s annual Robotic Mining Competition, which challenges students from across the country to design and build a fully autonomous robot capable of mining regolith, the powdery substance that covers the surface of Mars. While fun, the contest has a serious purpose: In order to sustain human life on Mars and enable a return trip to Earth, NASA is researching In-Situ Resource Utilization (ISRU). If robotic excavators can be developed to extract raw resources, those resources can then be converted into water, fuel, building media, and other useful materials.
The sand at Coney Island, augmented with a base layer of gravel, stood in for Martian regolith as the team tested PIPER (Project Inter-Planetary Excavation Rover), drawing awed crowds as the autonomous machine excavated the material and deposited it at a collection point — with the goal being to gather as much as possible, as quickly and efficiently as possible.
The long subway rides and testing paid off: the team won the top prize in the innovation category, thanks in large part to an unusual set of legs that could raise and lower the excavator as needed; placed first in the live video presentation category, which had been instituted during COVID because competitors could not travel to the Kennedy Space Center to demonstrate their work; and was named one of the top teams overall. It also placed third in the public outreach category, because of the work it had done throughout the year with the city’s K-12 teachers and students — particularly challenging at a time of remote learning.
Presenting on a Global Stage
The IEEE International Conference on Robotics and Automation is widely acknowledged to be the world’s largest and most important gathering devoted to those fields, and in 2021 NYU Tandon researchers made their presence felt, conducting multiple workshops, speaking on panels, and presenting 18 papers on a variety of topics, including advancements in human-robot interfaces, multi-target visual tracking, and robotic stability.