
Shape Estimation and Data-driven Intelligent Control of Soft Robotic Upper-limb Exoskeletons for In-home Telerehabilitation

Health & Wellness, Urban

Project Abstract

In our aging society, neurological disorders such as stroke are becoming more prevalent, and with them comes a growing need for labor-intensive physical therapy. That therapy is prohibitively expensive for many patients in need, and the lack of appropriate care can result in long-term paralysis. Soft robotic exoskeletons can deliver safe, in-home, and quantifiable teletherapy for these patients. We are building a soft exoskeleton to control the hand, wrist, and elbow, and we are fabricating, sensorizing, and controlling the soft modular actuators that drive it. Shape estimation and control of soft robots is nontrivial. In this project, CUSP students will work with us to fabricate soft robots and train machine learning models for shape estimation and data-driven control.


Project Description & Overview

In the MERIIT lab, we are building a soft robotic exoskeleton for telerehabilitation with over 15 degrees of freedom (DOF). Each DOF is driven by a pneumatically controlled soft module, custom-made in our lab with 3D printing and casting.

Soft robotic actuators are continuum robots. Whereas the kinematics of a rigid robot can be captured with simple joint encoders, the kinematics of a soft robot must be described by many more parameters, and the resulting models are unknown, uncertain, and subject to unmodeled dynamics. Controlling these complex systems is therefore challenging. For a rigid robot, reference commands can be calculated analytically; for a soft robot, machine learning and data-driven modeling, combined with analytical computation, can be used to map reference commands to the resulting shape.
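
As a rough illustration of what such a data-driven forward model can look like, the sketch below trains a small multilayer perceptron that maps pneumatic pressure commands to shape parameters. The data is synthetic, and the array shapes, units, and variable names are assumptions for illustration, not the lab's actual pipeline.

```python
# Minimal sketch (not the lab's actual pipeline): learn a forward model that
# maps pneumatic pressure commands to shape parameters of one soft module.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder data: N trials, each with a 3-chamber pressure command (kPa)
# and a 2-parameter shape label, which in practice would come from the
# vision-based shape estimation step.
rng = np.random.default_rng(0)
pressures = rng.uniform(0.0, 80.0, size=(500, 3))            # inputs  (N, 3)
shapes = np.stack([pressures.sum(axis=1) * 0.01,              # toy labels (N, 2)
                   pressures[:, 0] - pressures[:, 1]], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    pressures, shapes, test_size=0.2, random_state=0)

# Small multilayer perceptron as the data-driven forward model.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0))
model.fit(X_train, y_train)

print("held-out R^2:", model.score(X_test, y_test))
print("predicted shape for a new command:", model.predict([[20.0, 40.0, 10.0]]))
```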

Students will be given experimental data from our soft robotic modules, including pressure inputs from the pump station and optical data from cameras. They will clean and process this data and then build machine learning algorithms: computer vision will be used to perform shape estimation, and these self-supervised shape labels will be used to train a model that takes pneumatic pressure commands as input and outputs the resulting shape of the soft module. Students will also get involved with soft robotic fabrication, controls, sensorization, processing of other biosignals in our lab, and the corresponding learning techniques.
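
The sketch below illustrates one way the vision side of this pipeline could produce self-supervised shape labels: it extracts the actuator's silhouette centerline from a camera frame and fits a quadratic to obtain a tip deflection and a curvature proxy. The file name, threshold value, and the dark-silhouette-on-light-background assumption are hypothetical, not the lab's actual procedure.

```python
# Illustrative sketch of vision-based shape labeling, assuming the actuator
# appears as a dark silhouette against a light background in each frame.
import cv2
import numpy as np

def estimate_centerline(frame_bgr, thresh=100):
    """Return (x, y) centerline points of the actuator silhouette in pixels."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Dark actuator on light background -> invert the binary mask.
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    xs, ys = [], []
    for col in range(mask.shape[1]):
        rows = np.flatnonzero(mask[:, col])
        if rows.size:                       # column intersects the actuator
            xs.append(col)
            ys.append(rows.mean())          # mid-height of the silhouette
    return np.array(xs, dtype=float), np.array(ys, dtype=float)

def shape_label(frame_bgr):
    """Fit a quadratic to the centerline; return (tip deflection, curvature proxy)."""
    xs, ys = estimate_centerline(frame_bgr)
    coeffs = np.polyfit(xs, ys, deg=2)      # y(x) ~= a*x^2 + b*x + c
    tip_deflection = ys[-1] - ys[0]         # pixels; convert via camera calibration
    curvature_proxy = 2.0 * coeffs[0]       # second derivative of the fit
    return tip_deflection, curvature_proxy

frame = cv2.imread("actuator_frame_0001.png")   # hypothetical recorded frame
if frame is not None:
    print(shape_label(frame))
```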


Datasets

We have collected data on the kinematic behavior of a large number of soft robotic actuators custom-made in our lab. The data includes optical/vision sensing, force sensing, and pressure/voltage readings from pneumatic pumps. These actuators differ in size, form, function, and material properties, resulting in very rich data. We also have simulated data from Finite Element Analysis (FEA) models of our actuators, which can be fused with our real data for hybrid learning approaches. We continue to collect new data from other biosignals in our lab, including Electromyography (EMG), Mechanomyography (MMG), microphones, and more.
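
As a hedged sketch of how real measurements and FEA output might be combined for hybrid learning, the example below merges two tables and tags each row with its provenance so simulated samples can be weighted differently during training. The file names and column names are assumptions for illustration only.

```python
# Hedged sketch: combine real actuator measurements with FEA simulation
# output into one table for hybrid learning. File and column names are
# hypothetical (pressure_kpa, bend_deg, force_n).
import pandas as pd

real = pd.read_csv("real_actuator_trials.csv")   # e.g. pressure_kpa, bend_deg, force_n
sim = pd.read_csv("fea_simulation_sweep.csv")    # e.g. pressure_kpa, bend_deg

# Tag each row with its provenance so a model (or a sample-weighting scheme)
# can treat simulated and measured examples differently.
real["source"] = "real"
sim["source"] = "sim"

combined = pd.concat([real, sim], ignore_index=True, sort=False)

# Example: down-weight simulated rows during training.
combined["sample_weight"] = combined["source"].map({"real": 1.0, "sim": 0.3})
print(combined.groupby("source").size())
```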


Competencies

  • Computer vision
  • Deep/Machine learning
  • Sensor fusion
  • FEM modeling
  • Mechatronics
  • Robotics

Learning Outcomes & Deliverables

  1. A learned model to estimate the kinematics of soft robotic continuum actuators.
  2. Experience with the modeling, control, and design of soft robots.
  3. Experience with computer vision, deep learning, and self-supervised learning applied to robotic systems.
  4. Experience with the fabrication of soft robots.

Students

Chen Liu, Xiaojia Pan, Jiazhi Xu, Yinuo Zhao