Rose Faghih
Associate Professor of Biomedical Engineering
Education
Massachusetts Institute of Technology
Postdoctoral, Brain and Cognitive Sciences
Massachusetts Institute of Technology
Ph.D., Electrical Engineering and Computer Science
Massachusetts Institute of Technology
S.M., Electrical Engineering and Computer Science
University of Maryland
B.S. (Summa cum Laude), Electrical Engineering (Honors Program)
Awards & Distinctions
- MIT Technology Review Innovator Under 35, 2020
- National Science Foundation CAREER Award, 2020
- Junior Faculty Research Excellence Award, Cullen College of Engineering, University of Houston, 2020
- Teaching Excellence Award, Cullen College of Engineering, University of Houston, 2020
- Featured in IEEE Women in Engineering Magazine as a 'Woman To Watch', 2020
- Selected for the National Academy of Engineering's Frontiers of Engineering Symposium, 2019
- IEEE-USA’s New Face of Engineering, 2016
- National Science Foundation Graduate Research Fellowship, 2009-2012
- Massachusetts Institute of Technology Graduate Fellowship in Control, 2008
- Department of Electrical and Computer Engineering Chair’s Award, University of Maryland, 2008
- Phi Kappa Phi Honor Society, Inducted in 2008
- Tau Beta Pi, The Engineering Honor Society, Inducted in 2008
- Eta Kappa Nu, The Honor Society of IEEE, Inducted in 2008
- University of Maryland President's Scholarship, 2006-2008
Research News
Researchers Quantify Intensity of Emotional Response to Sound, Images and Touch Through Skin Conductance
When we listen to a moving piece of music or feel the gentle pulse of a haptic vibration, our bodies react before we consciously register the feeling. The heart may quicken, and palms may sweat, producing subtle variations in the skin’s electrical resistance. These changes, though often imperceptible, reflect the brain’s engagement with the world. A recent study by researchers at NYU Tandon and the Icahn School of Medicine at Mount Sinai, published in PLOS Mental Health, explores how such physiological signals can reveal cognitive arousal — the level of mental alertness and emotional activation — without the need for subjective reporting.
The researchers, led by Associate Professor of Biomedical Engineering Rose Faghih at NYU Tandon, focused on skin conductance, a well-established indicator of autonomic nervous system activity. When sweat glands are stimulated, even minutely, the skin’s ability to conduct electricity changes. This process, known as electrodermal activity, has long been associated with emotional and cognitive states. What distinguishes this study is the combination of physiological modeling and advanced statistical methods to interpret these subtle electrical fluctuations in response to different sensory experiences.
This research began as a course project for student authors Suzanne Oliver and Jinhan Zhang in Faghih's “Neural and Physiological Signal Processing” course. Research Scientist and co-author Vidya Raju mentored the students under Faghih's supervision. James W. Murrough, Professor of Psychiatry and Neuroscience and Director of the Depression and Anxiety Center for Discovery and Treatment at the Icahn School of Medicine at Mount Sinai, also collaborated on this research.
“Taking Prof. Faghih's class was a great experience and allowed me to combine coursework and research,” said Oliver. “It was very exciting to see that the work I did in class could help improve treatment of mental health conditions in the future."
The researchers analyzed a published dataset of participants’ skin conductance, recorded continuously while they were exposed to visual, auditory, and haptic stimuli. Participants also provided self-ratings of arousal using the Self-Assessment Manikin, a pictorial scale that quantifies emotional states. By applying a physiologically informed computational model, the team separated the slow and fast components of the skin’s electrical response and inferred when the autonomic nervous system was most active. Bayesian filtering and a marked point process algorithm were then used to estimate a continuous measure of cognitive arousal over time.
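The general shape of such a pipeline can be illustrated in simplified form. The sketch below is a rough stand-in rather than the authors' published code: it splits a skin-conductance trace into slow (tonic) and fast (phasic) components with a moving-average baseline, detects phasic peaks as a marked point process of sympathetic activations, and runs a crude random-walk Bayesian filter to turn those events into a continuous arousal estimate. All function names, thresholds, and noise parameters are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.signal import find_peaks

def decompose_eda(sc, fs, tonic_window_s=10.0):
    """Split a skin-conductance trace into slow (tonic) and fast (phasic)
    components with a moving-average baseline; a crude stand-in for
    model-based deconvolution."""
    tonic = uniform_filter1d(sc, size=int(tonic_window_s * fs))
    phasic = sc - tonic
    return tonic, phasic

def detect_scr_events(phasic, fs, min_amp=0.01):
    """Detect skin-conductance responses as peaks in the phasic component,
    yielding a marked point process: event times plus amplitudes."""
    peaks, props = find_peaks(phasic, height=min_amp, distance=int(1.0 * fs))
    return peaks / fs, props["peak_heights"]

def estimate_arousal(event_times, amps, duration_s, dt=0.25, process_var=1e-3):
    """Toy random-walk Bayesian filter: the hidden arousal state drifts slowly
    and is nudged upward whenever an SCR event (weighted by amplitude) occurs."""
    n = int(duration_s / dt)
    event_bins = np.zeros(n)
    idx = np.clip((np.asarray(event_times) / dt).astype(int), 0, n - 1)
    np.add.at(event_bins, idx, amps)          # bin event amplitudes in time
    x, p = 0.0, 1.0
    arousal = np.zeros(n)
    for k in range(n):
        p += process_var                      # predict: state uncertainty grows
        gain = p / (p + 0.1)                  # crude measurement update
        x = x + gain * (event_bins[k] - 0.05 * x)
        p *= (1.0 - gain)
        arousal[k] = x
    return np.linspace(0.0, duration_s, n), arousal

# Synthetic session standing in for a recorded skin-conductance trace
fs, T = 4.0, 120.0
t = np.arange(0.0, T, 1.0 / fs)
sc = 2.0 + 0.2 * np.sin(2 * np.pi * t / 60.0) + 0.02 * np.random.randn(t.size)
sc[200:240] += np.hanning(40) * 0.3           # injected response to a stimulus near 50 s
tonic, phasic = decompose_eda(sc, fs)
times, amps = detect_scr_events(phasic, fs)
t_state, arousal = estimate_arousal(times, amps, T)
```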
The analysis revealed a striking pattern: the nervous system responded most strongly within two seconds of a new stimulus, with haptic sensations eliciting the largest immediate activations. Yet when the researchers compared these physiological signals to participants’ own self-assessments, auditory stimuli — particularly sounds and music — were most often linked to high arousal states. This suggests that the brain’s perception of stimulation and the body’s raw autonomic responses, while related, may not always align perfectly. However, when the physiological signals were further processed into estimates of user arousal, the modeled arousal agreed with the participants’ assessments that auditory stimuli caused the highest arousal.
Interestingly, the model was able to track transitions in participants’ arousal levels as they moved from low- to high-intensity stimuli with an accuracy exceeding random chance. When participants who felt more stimulated by visual cues were analyzed separately from those more responsive to touch, the model’s predictions reflected the significant differences between the two groups’ self-reported responses to these stimuli, effectively capturing group-level trends.
The implications of this work extend beyond the laboratory. In clinical contexts, self-reported measures remain the gold standard for assessing mental states such as anxiety or stress, yet they are inherently subjective and often unreliable. Objective metrics derived from skin conductance could complement these reports, offering clinicians a more nuanced view of a patient’s emotional dynamics in real time. Such tools might one day aid in monitoring recovery from depression, anxiety, or post-traumatic stress disorder, where changes in physiological arousal often mirror symptom fluctuations.
The study also points to potential uses in virtual reality and human-computer interaction. By quantifying how users react to visual, auditory, or tactile elements, systems could adapt dynamically — heightening immersion, enhancing focus, or reducing stress depending on the goal. This closed-loop feedback between body and machine could make digital environments more responsive to human emotion.
Still, the authors acknowledge the complexity of translating sweat and associated signals into precise emotional understanding. Factors such as stimulus duration, individual variability, and prior experience complicate the interpretation. The correlation between computed arousal and self-reported ratings was modest overall, reflecting the intricate and personal nature of emotional experience. Yet the model’s consistency in identifying moments of heightened engagement underscores its promise as a complementary measure of internal states.
In essence, the study bridges a subtle gap between physiology and perception. By grounding emotion in the body’s own electrical rhythms, it invites a more continuous, data-driven view of how humans experience the world — one that may eventually inform both mental health care and the design of emotionally intelligent technologies.
New study tracks leptin pulse patterns, a potential clue to understanding obesity
Obesity has become a global epidemic, and the need for treatment and monitoring for people with obesity is growing. Researchers aiming to understand relevant biomarkers for the condition have fixed their gaze on leptin, a hormone that regulates energy intake and induces feelings of fullness, to eventually help improve treatments for obesity. Understanding the patterns of leptin secretion from fat cells throughout the body could help scientists identify hidden health issues in patients with obesity, monitor health development after treatment, and test drug effectiveness.
Now, in a new study in the Journal of the Endocrine Society, NYU Tandon Associate Professor of Biomedical Engineering Rose Faghih and her PhD students Qing Xiang and Revanth Reddy have utilized a probabilistic physiological modeling approach to investigate the pulse events underlying leptin secretion.
“Instead of only relying on visual inspection of leptin, by statistical modeling of leptin secretion events we quantify the underlying pulsatile physiological signaling to enable extending investigations of leptin signaling in health and disease and in response to medications,” said Faghih.
Leptin, produced primarily by fat cells, typically signals the brain when enough food has been consumed, helping to regulate energy and appetite. But in some individuals with obesity, this signaling system seems to malfunction — a condition known as leptin resistance, where high levels of leptin fail to curb hunger. Why this resistance develops remains unclear, but recent findings on leptin’s high-frequency, wave-like release patterns offer intriguing possibilities for future research.
Hormones like cortisol and insulin, which manage stress and blood sugar, are known to follow rhythmic release patterns, adapting to daily cycles. Leptin follows a similar, but largely unknown, rhythm. Faghih’s lab set out to analyze leptin’s pulses, examining how often these pulses occur and their relative strength. By breaking down the pulses into their timings and amplitudes, researchers could better understand these patterns, which could lead to breakthroughs in treating leptin resistance and, by extension, obesity.
The researchers applied statistical models of varying complexity to fit the data and compared the performance of these distribution models for each subject using pre-established metrics. They aimed to find the best model for these distributions, which would be important in monitoring leptin secretion and detecting deviations from a normal secretion pattern.
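As a rough illustration of that model-comparison step, the sketch below fits a few candidate distributions to a set of pulse amplitudes and ranks them by AIC, one commonly used pre-established metric. The gamma, lognormal, and Weibull candidate set and the synthetic data are assumptions for illustration, not the exact models or data used in the paper.

```python
import numpy as np
from scipy import stats

def compare_pulse_models(samples, candidates=("gamma", "lognorm", "weibull_min")):
    """Fit each candidate distribution to the pulse data and rank by AIC."""
    results = {}
    for name in candidates:
        dist = getattr(stats, name)
        params = dist.fit(samples, floc=0)       # fix location at zero for positive data
        loglik = np.sum(dist.logpdf(samples, *params))
        k = len(params) - 1                      # free parameters (location is fixed)
        results[name] = {"params": params, "aic": 2 * k - 2 * loglik}
    return dict(sorted(results.items(), key=lambda kv: kv[1]["aic"]))

# Synthetic pulse amplitudes standing in for one subject's detected leptin pulses
rng = np.random.default_rng(0)
amplitudes = rng.gamma(shape=2.0, scale=1.5, size=60)
ranking = compare_pulse_models(amplitudes)
for name, res in ranking.items():
    print(f"{name:12s} AIC = {res['aic']:.1f}")
```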
The results show that each of these models on its own can capture the general shape of the distribution for each subject, despite the complexity of leptin secretion, which can be affected by various factors. These distribution models suggest several possible features of leptin secretion, such as the rarity of extremely small or large pulses.
They also allow researchers to see the effects of drugs by identifying changes in the model parameters before and after treatment. They compared leptin behavior before and after treatment with bromocriptine, a medication that affects neuroendocrine signaling. After treatment, subtle shifts in one model of leptin’s pulse timings (the diffusion model) suggested that it might be possible to influence these patterns with medication. Such findings open the door to exploring hormonal treatments for obesity that work by reestablishing leptin’s natural rhythms.
The research is a key first step toward reliable long-term leptin monitoring that could help doctors and patients fight obesity.
This work was supported by the National Institutes of Health (NIH) under grant R35GM151353: MESH: Multimodal Estimators for Sensing Health.
Qing Xiang, Revanth Reddy, Rose T Faghih, Marked Point Process Secretory Events Statistically Characterize Leptin Pulsatile Dynamics, Journal of the Endocrine Society, Volume 8, Issue 10, October 2024, bvae149, https://doi.org/10.1210/jendso/bvae149
Harnessing the power of eye tracking in brain-machine interfaces
In recent years, eye tracking technology has advanced rapidly, suggesting that our eyes deserve greater attention within the evolving brain-machine interface (BMI) landscape. One particularly intriguing area is the connection between eye movements and internal brain states—a link that is becoming increasingly difficult to ignore. Eye tracking systems can function in a completely contactless manner, integrated into devices like screens, laptops, tablets, and smartphones. In contrast, wearable-based systems rely on body-worn sensors to monitor and even influence brain states, presenting a more hands-on approach to BMI development.
However, a promising alternative lies in the development of a framework that decodes hidden brain states, such as interoceptive awareness, directly from eye tracking data. This advance could help create safer, more efficient closed-loop systems that monitor and modulate the brain-body connection. Those are the findings of a new study from the lab of Rose Faghih, Associate Professor of Biomedical Engineering at NYU Tandon.
Decoding the Brain’s Hidden Signals
Interoceptive awareness represents the brain’s ability to interpret bodily sensations—signals that arise in response to internal or external stimuli. However, these states are difficult to observe and must be decoded through physiological indicators. Tracking and understanding these internal brain states is critical for optimizing the brain-body connection, yet the challenge lies in how to access them.
One potential solution is to study interoception in the context of fear conditioning, a process where heightened arousal correlates with heightened interoception. In Pavlovian fear conditioning, subjects learn to anticipate aversive events—such as a mild electric shock—creating an ideal model for observing interoceptive signals. In a recent experiment, participants underwent fear conditioning and extinction, with mild electric shocks used as the aversive stimulus. Given the strong association between arousal and interoceptive awareness, researchers anticipated synchronized responses between these two states.
In this study, the research team decoded interoceptive awareness by analyzing neural activity linked to eye tracking data—specifically, measurements from pupillometry and eye gaze patterns. In parallel, they decoded arousal states from skin conductance data. While it was expected that the two states would show similar responses to the electric shock, the interoceptive awareness state, as inferred from eye tracking data, showed greater sensitivity to the mild shocks than the arousal state decoded from skin conductance.
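A highly simplified version of that kind of decoding is sketched below: a scalar Kalman filter tracks a slowly varying hidden state from a continuous eye-tracking feature such as pupil diameter. The random-walk filter form, the noise variances, and the synthetic pupil trace are illustrative assumptions, not the study's actual state-space decoder.

```python
import numpy as np

def kalman_1d(observations, process_var=1e-3, obs_var=0.05):
    """Scalar Kalman filter for a random-walk state: x_k = x_{k-1} + w,
    observed as y_k = x_k + v. Returns the filtered state estimates."""
    x, p = observations[0], 1.0
    estimates = np.empty_like(observations)
    for k, y in enumerate(observations):
        p = p + process_var            # predict: uncertainty grows
        gain = p / (p + obs_var)       # Kalman gain
        x = x + gain * (y - x)         # update with the new observation
        p = (1.0 - gain) * p
        estimates[k] = x
    return estimates

# Synthetic pupil-diameter trace: a baseline plus transient dilations after two "shocks"
fs, T = 60.0, 30.0
t = np.arange(0.0, T, 1.0 / fs)
pupil = 3.0 + 0.02 * np.random.randn(t.size)
for onset in (10.0, 20.0):
    mask = (t >= onset) & (t < onset + 2.0)
    pupil[mask] += 0.4 * np.exp(-(t[mask] - onset))   # dilation response

hidden_state = kalman_1d(pupil)
```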
This finding underscores the potential of eye tracking technology as a powerful psychophysiological tool for decoding interoceptive awareness, a signal that could offer significant insight into brain-body interactions.
Towards Future Closed-Loop Systems: The Dawn of ‘MINDCAM’
The discovery that eye tracking signals can serve as sensitive indicators of interoceptive awareness opens up exciting possibilities. These findings could pave the way for new therapeutic approaches to treating neuropsychiatric and neurodegenerative disorders. By decoding interoceptive awareness, future closed-loop systems may be able to restore and enhance the brain-body connection, offering safer, more personalized interventions.
One particularly promising application is the development of ‘MINDCAM’—a system that integrates eye-tracking-enabled cameras into devices like smartphones, tablets, and monitors. This technology could potentially monitor a user’s interoceptive awareness in real time, helping individuals regulate their mood and cognitive performance. However, while this research represents an exciting first step, much more work is needed to develop safe and effective closed-loop systems that can reliably decode and modulate interoceptive states.
Faghih’s previous research on wearables includes the development of ‘MINDWATCH,’ which uses electrical signals measured from the skin to assess brain states. MINDCAM could complement that technology to provide even better data on how the brain reacts to stress.
The integration of eye tracking technology into brain-machine interfaces may hold the key to unlocking deeper insights into the mind, offering new hope for improving mental health and cognitive function in the years to come.
This research was supported in part by National Institutes of Health (NIH) grant R35GM151353 - Maximizing Investigators' Research Award (MIRA) for Early Stage Investigators (ESI): MESH: Multimodal Estimators for Sensing Health and in part by National Science Foundation (NSF) under Grant 2226123 - Faculty Early Career Development Program (CAREER): MINDWATCH: Multimodal Intelligent Noninvasive brain state Decoder for Wearable AdapTive Closed-loop arcHitectures.
Saman Khazaei, Rose T Faghih, Eye tracking is more sensitive than skin conductance response in detecting mild environmental stimuli, PNAS Nexus, Volume 3, Issue 9, September 2024, pgae370
New research develops algorithm to track cognitive arousal for optimizing remote work
In the ever-evolving landscape of workplace dynamics, the intricate dance between stress and productivity takes center stage. A recent study, spanning various disciplines and delving into the depths of neuroscience, sheds light on this complex relationship, challenging conventional wisdom and opening new pathways for understanding how to improve productivity.
At the heart of this exploration lies the Yerkes-Dodson law, a theory proposing an optimal level of stress conducive to peak productivity. Yet, as researchers unveil, the universality of this law remains under scrutiny, prompting deeper dives into the nuances of stress-response across different contexts and populations.
Drawing from neuroscience, researchers from NYU Tandon led by Rose Faghih, Associate Professor of Biomedical Engineering, have published a study illuminating the role of the autonomic nervous system, which is directly influenced by key brain regions — like the amygdala, prefrontal cortex, and hippocampus — in shaping our responses to stress and influencing cognitive functions. These insights not only deepen our comprehension of stress but also offer pathways to enhance cognitive performance.
The researchers’ approach is innovative in that it concurrently tracks cognitive arousal and expressive typing states, employing sophisticated multi-state Bayesian filtering techniques. This allows them to paint a picture of how physiological responses and cognitive states interplay to influence productivity.
One particularly innovative aspect of the study involves using typing dynamics as a measure of cognitive engagement and emotional expression. By examining typing patterns alongside autonomic nervous system activation, researchers gain insights into individuals' cognitive states, which is especially relevant in remote work environments. The integration of typing dynamics into the analysis provides a tangible link between internal cognitive processes and externalized behaviors.
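To make the idea concrete, the sketch below reduces keystroke timestamps to per-window typing-rate features and tracks a hidden engagement-like state with a simple Bernoulli state-space filter. The window length, the median threshold, and the filter form are illustrative assumptions rather than the paper's multi-state model.

```python
import numpy as np

def typing_features(key_times, window_s=10.0, duration_s=None):
    """Per-window keystroke counts from a list of keypress timestamps (seconds)."""
    duration_s = duration_s or key_times[-1]
    edges = np.arange(0.0, duration_s + window_s, window_s)
    counts, _ = np.histogram(key_times, bins=edges)
    return edges[:-1], counts

def bernoulli_state_filter(binary_obs, process_var=0.005):
    """Forward filter for a random-walk state x_k observed through
    P(obs=1) = sigmoid(x_k); a single Newton step approximates each posterior mode."""
    x, p = 0.0, 1.0
    states = np.empty(len(binary_obs), dtype=float)
    for k, o in enumerate(binary_obs):
        p += process_var                               # predict
        prob = 1.0 / (1.0 + np.exp(-x))
        x = x + p * (float(o) - prob)                  # posterior-mode update
        p = 1.0 / (1.0 / p + prob * (1.0 - prob))      # posterior variance
        states[k] = 1.0 / (1.0 + np.exp(-x))           # probability-scale state
    return states

# Synthetic keystroke times: a fast-typing burst in the middle of the session
rng = np.random.default_rng(1)
slow = np.cumsum(rng.exponential(0.8, 200))
fast = slow[-1] + np.cumsum(rng.exponential(0.2, 400))
tail = fast[-1] + np.cumsum(rng.exponential(0.8, 200))
key_times = np.concatenate([slow, fast, tail])

windows, counts = typing_features(key_times)
engagement = bernoulli_state_filter(counts > np.median(counts))
```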
“With the rise of remote work, understanding how stress impacts productivity takes on newfound significance,” Faghih says. “By uncovering the mechanisms at play, we’re paving the way for developing tools and strategies to eventually optimize performance and well-being in remote settings.”
Moreover, the study's methodology, grounded in sophisticated Bayesian models, promises not only to validate existing theories but also to unveil new patterns and insights. As the discussion turns to practical applications, the potential for integrating these findings into ergonomic workspaces and mental health support systems becomes apparent.
This research offers a glimpse into the intricate web of stress, productivity, and cognition. As we navigate the evolving landscape of work, understanding these dynamics becomes paramount, paving the way for a more productive, resilient workforce.
Alam, S., Khazaei, S., & Faghih, R. T. (2024). Unveiling productivity: The interplay of cognitive arousal and expressive typing in remote work. PLOS ONE, 19(5). https://doi.org/10.1371/journal.pone.0300786