Research News
AI produces Connections puzzles that rival human-created ones
Can artificial intelligence (AI) create word puzzles as engaging and challenging as those crafted by human experts?
A new study suggests the answer may be yes — at least when it comes to The New York Times' popular Connections game.
Researchers from NYU Tandon School of Engineering and Jester Labs have developed an AI system capable of generating novel Connections puzzles that often rival those created by Times puzzle designers.
In a user study, participants played both AI-generated and official Times puzzles without knowing their source. In roughly half of head-to-head comparisons, players judged the AI puzzles to be as enjoyable, creative, and difficult as their human-created counterparts, or more so.
Their findings shed light on the creative capabilities of large language models like GPT-4.
Connections, which debuted in June 2023, challenges players to sort 16 words into four thematically linked groups of four. The game quickly became one of the Times' most popular online offerings, second only to Wordle, with billions of plays per year.
To create AI-generated puzzles, the researchers employed an "agentic workflow" approach. This method involves using GPT-4 in multiple specialized roles throughout the puzzle creation process.
Rather than asking the AI to generate an entire puzzle at once, researchers broke down the task into smaller, more focused steps. For each step, they prompted GPT-4 with specific instructions, effectively having it play different roles such as puzzle creator, editor, and difficulty assessor.
This approach allowed the team to leverage the AI's capabilities more effectively by guiding it through a process that mimics how human designers might approach puzzle creation.
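The division of labor described above can be sketched as a simple pipeline. The role prompts and the `ask_llm` helper below are hypothetical stand-ins for real GPT-4 API calls (canned responses are returned so the control flow runs end to end); this is not the authors' actual code.

```python
# Minimal sketch of an "agentic workflow" for puzzle-group generation.
# `ask_llm` is a hypothetical placeholder for a chat-completion call with a
# role-specific system prompt; canned answers let the pipeline run offline.

def ask_llm(role: str, prompt: str) -> str:
    """Placeholder for a role-conditioned GPT-4 call."""
    canned = {
        "creator": "ABBEY, MYSTERY, PEPPER, WHITE",
        "editor": "approve",
        "assessor": "medium",
    }
    return canned[role]

def generate_group(theme: str) -> dict:
    # Step 1: a "creator" role proposes four words for a theme.
    words = ask_llm("creator", f"Propose 4 words for the theme: {theme}").split(", ")
    # Step 2: an "editor" role checks the group for coherence.
    verdict = ask_llm("editor", f"Do these words fit '{theme}'? {words}")
    # Step 3: an "assessor" role rates how hard the group is to guess.
    difficulty = ask_llm("assessor", f"Rate the difficulty of guessing {words}")
    return {"theme": theme, "words": words,
            "approved": verdict == "approve", "difficulty": difficulty}

group = generate_group("Beatles Album Words")
```

A full puzzle would repeat this loop for four groups and add further editor passes, but the point is the structure: many small, focused prompts rather than one monolithic "write me a puzzle" request.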
"We found that solving a complex problem like generating a Connections puzzle requires more than just asking an AI to do it," said Timothy Merino, a Ph.D. student in NYU Tandon’s Game Innovation Lab who is the lead author of the study. "By breaking the task into smaller, more manageable steps and using the LLM as a tool in various ways, we achieved better results."
The paper’s senior author, Julian Togelius — NYU Tandon associate professor of computer science and engineering, and the Director of the Game Innovation Lab — emphasized the importance of this approach. "The LLM is crucial to our system, but it's not in the driving seat. We use it in different parts of the system for specific tasks, like asking for the best concept that would apply to a particular list of words."
The researchers also identified two key ways puzzles introduce difficulty: "intentional overlap" and "false groups." They analyzed word similarity in relation to difficulty levels, finding that easier word groups tend to contain more similar words, while trickier groups contain less similar ones.
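One way to make the similarity-difficulty relationship concrete is to score a word group by the mean pairwise cosine similarity of its word embeddings. The 3-dimensional vectors below are toy values for illustration; a real analysis would use embeddings from an actual language model.

```python
import itertools
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def group_similarity(vectors):
    """Mean pairwise cosine similarity of a word group's embeddings."""
    pairs = list(itertools.combinations(vectors, 2))
    return sum(cosine(u, v) for u, v in pairs) / len(pairs)

# Toy 3-d "embeddings": an easy group clusters tightly, a tricky one does not.
easy = [(1.0, 0.1, 0.0), (0.9, 0.2, 0.1), (1.0, 0.0, 0.1), (0.95, 0.15, 0.05)]
tricky = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (0.5, 0.5, 0.5)]
```

Under this measure, the tightly clustered group scores near 1, while a group of deliberately dissimilar words scores much lower.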
“I was consistently surprised at how good GPT was at creating a clever word group,” said Merino. “One of my favorites the AI generated is ‘Beatles Album Words’: ‘Abbey,’ ‘Mystery,’ ‘Pepper,’ and ‘White.’”
The research has implications beyond word games, according to the researchers. It is a step toward better understanding both AI capabilities and human creativity.
"This work isn't just about generating puzzles," Togelius said. "It's about using AI to test and refine our theories of what makes a good puzzle in the first place. Connections is a worthy area of research because what makes a good game isn’t easy to define. We can refine our understanding of game design by creating theories of what makes for good games, implement them into algorithms, and see whether the games that are generated by the algorithms are actually good."
This recent paper builds upon the Game Innovation Lab's ongoing research into AI and Connections. In a study published earlier this year, the lab's researchers evaluated various AI models' proficiency in solving Connections puzzles. Their findings revealed that while GPT-4 outperformed other models, it still fell short of mastering the game, successfully solving only about 29 percent of the puzzles presented.
Merino, T., Earle, S., Sudhakaran, R., Sudhakaran, S., & Togelius, J. (2024). Making New Connections: LLMs as Puzzle Generators for The New York Times' Connections Word Game.
Subway air pollution disproportionately impacts New York City's minority and low-income commuters
A comprehensive study on New York City's subway air quality has revealed that longer commute times lead to higher exposure to hazardous air pollutants, a problem that disproportionately affects minority and low-income communities who endure more prolonged and frequent travel through the system.
In a paper published in PLOS ONE, NYU Tandon School of Engineering researchers modeled subway riders' typical daily commutes to determine exposure to particulate matter pollution (PM2.5). This was done by integrating home-to-work commute data with pollutant measurements the researchers collected from all platforms and within a typical car in all train lines throughout the NYC subway system.
Masoud Ghandehari -- a professor in NYU Tandon’s Civil and Urban Engineering Department and a member of C2SMARTER, a Tier 1 University Transportation Center designated by the U.S. Department of Transportation -- led the research team. Other researchers on the paper include its first author, Shams Azad, who earned a Ph.D. in Transportation Engineering from NYU Tandon in 2023, and Pau Ferrer-Cid, a machine learning researcher at Universitat Politècnica de Catalunya in Spain who was an NYU Tandon visiting scholar in 2022.
PM2.5 refers to tiny particles suspended in the air that, when inhaled, can enter the lungs and potentially the bloodstream, causing a range of short- and long-term health complications. These include respiratory and cardiovascular diseases, and some components have also been identified as neurotoxins.
PM2.5 is traditionally a byproduct of fossil fuel combustion. In the subway system, however, the particles are introduced by abrasion of brakes, rails, and wheels, which accounts for the very high iron content of the collected and analyzed particles.
An NYC Air Pollution Exposure Map the researchers created can be used to calculate personal exposure for any origin and destination within New York City.
The burden of disease due to exposure to poor air quality in the subway system does not fall equally, the study found. Black and Hispanic workers face PM2.5 exposure levels 35% and 23% higher, respectively, compared to their Asian and white counterparts, according to the study.
This disparity stems from differences in commuting trends, duration of subway travel, and the varying pollution levels across stations and lines. Minority workers residing in low-income communities often endure longer commutes, transferring through stations (which are more polluted than subway cars) in order to reach job hubs like downtown Manhattan.
Workers from economically disadvantaged communities are generally exposed to more pollutants than their affluent counterparts. The study identified a positive correlation between the percentage of residents below the poverty line and PM2.5 exposure levels.
This discrepancy is partly attributed to the reliance on the subway system among lower-income populations, who have limited access to alternative transportation options like private vehicles or carpool services. Conversely, many affluent workers can avoid lengthy subway commutes by living in proximity to their workplaces.
In fact, residents in upper Manhattan neighborhoods, including Washington Heights and Inwood — two communities with poverty rates above citywide averages — have the highest per capita levels of subway pollutant commuting exposure, the study shows. This is due to a combination of a large number of commuters and longer commute time. Midtown Manhattan — where many people live close to workplaces — and portions of Queens without easily accessible subway stations have some of the lowest per capita exposure levels.
Measurements were carried out in December 2021 and June 2022, sampling 19 distinct subway lines and 368 stations. The researchers took end-to-end round trips on each of the lines studied, measuring the PM2.5 concentration at one-second sampling intervals. In one direction they stayed on the train from start to end. On the return trip they got off at each station and waited until the arrival of the next train, measuring the platform concentrations at the same sampling interval.
Sampling of the platform air at one-second intervals confirmed that pollutant concentration peaks when a train arrives and churns up the particles deposited in the tunnel over years of service. The concentration values from the 2021 study can be found in a paper published in Atmospheric Pollution Research.
To calculate PM2.5 exposure, researchers used origin-destination records from the U.S. Census Bureau (specifically the 2019 LEHD Origin-Destination (OD) dataset) to simulate home-to-job commutes of over 3 million workers in Manhattan, Brooklyn, the Bronx, and Queens in 2019, calculating the average per capita daily PM2.5 exposure in 34,000 census blocks.
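A minimal sketch of the exposure calculation: each commute contributes time spent on platforms and in train cars, weighted by the respective PM2.5 concentrations, and the per-block figure averages over workers. All concentrations, durations, and worker counts below are invented for illustration, not the study's measurements.

```python
# Toy per-capita PM2.5 exposure aggregation for one census block.
# Concentration units: micrograms per cubic meter; durations in minutes.

def trip_exposure(platform_min, train_min, c_platform, c_train):
    """Time-weighted PM2.5 dose (ug/m^3 * min) for one one-way commute."""
    return platform_min * c_platform + train_min * c_train

def per_capita_exposure(trips):
    """Average round-trip exposure per worker across a block's commutes."""
    total = sum(2 * trip_exposure(t["platform_min"], t["train_min"],
                                  t["c_platform"], t["c_train"]) * t["workers"]
                for t in trips)
    workers = sum(t["workers"] for t in trips)
    return total / workers

block_trips = [
    {"workers": 120, "platform_min": 8, "train_min": 25,
     "c_platform": 250.0, "c_train": 140.0},  # longer commute with a transfer
    {"workers": 80, "platform_min": 4, "train_min": 12,
     "c_platform": 250.0, "c_train": 140.0},  # short direct commute
]
```

Because platform concentrations exceed in-car concentrations, commutes with transfers (more platform waiting) drive a block's average up, which is the pattern the study found for longer minority and low-income commutes.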
This study was a collaboration between the NYU Tandon School of Engineering and the NYU Grossman School of Medicine, in partnership with the Universitat Politècnica de Catalunya. The work was funded by NYU Tandon's C2SMART Center, with a grant from the U.S. Department of Transportation’s University Transportation Centers Program under Grant Number 69A3551747124, and by the National Science Foundation (award number 1856032). For more information, see also A Comprehensive Analysis of the Air Quality in the NYC Subway System (September 2022).
Azad, S., Ferrer-Cid, P., & Ghandehari, M. (2024). Exposure to fine particulate matter in the New York City subway system during home-work commute. PLOS ONE, 19(8), e0307096. https://doi.org/10.1371/journal.pone.0307096
New research explores how ant colonies regulate group behaviors
In the world of social creatures, from humans to ants, the spread of behaviors through a group — known as social contagion — is a well-documented phenomenon. This process, driven by social imitation and pressure, causes individuals to adopt behaviors observed in their peers, often resulting in synchronized mass actions: think of stampedes or standing ovations.
Social contagion is a double-edged sword in highly integrated societies. While it facilitates cohesion and collective efficiency, unchecked contagion can lead to detrimental mass behaviors, such as mass panic. Thus, nature has evolved regulatory mechanisms to keep such behaviors in check.
One such mechanism is reverse social contagion. In reverse social contagions, increased interactions between individuals engaged in a behavior lead to a higher likelihood of them stopping that behavior, rather than engaging in it.
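A toy agent-based sketch of the mechanism: under reverse social contagion, an active agent's chance of stopping grows with the number of active agents it encounters, while idle agents are still recruited as in ordinary contagion. The update rules and parameters are illustrative assumptions, not the paper's model.

```python
import random

def simulate(steps, n, k, reverse=True, seed=0):
    """Toy colony of n agents, each active (1) or idle (0). Each step every
    agent samples k random colony members and updates stochastically."""
    rng = random.Random(seed)
    state = [1] * (n // 2) + [0] * (n - n // 2)
    for _ in range(steps):
        for i in range(n):
            met_active = sum(state[rng.randrange(n)] for _ in range(k))
            if state[i] and reverse:
                # Reverse contagion: more active contacts -> more likely to stop.
                if rng.random() < met_active / (k + 1):
                    state[i] = 0
            elif not state[i]:
                # Ordinary contagion: more active contacts -> more likely to start.
                if rng.random() < met_active / (2 * (k + 1)):
                    state[i] = 1
    return sum(state) / n  # fraction of the colony that is active

with_reverse = simulate(steps=50, n=200, k=4)
without_reverse = simulate(steps=50, n=200, k=4, reverse=False)
```

With the stopping rule in place, activity settles at a fraction of the colony; without it, recruitment alone drives nearly everyone active, which is the runaway mass behavior the regulatory mechanism prevents.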
In a new paper published in PNAS Nexus, researchers led by Maurizio Porfiri — NYU Tandon Institute Professor of Biomedical Engineering, Mechanical and Aerospace Engineering, and Civil and Urban Engineering, as well as the director of its Center for Urban Science and Progress (CUSP) — describe this unique phenomenon in colonies of harvester ants (Pogonomyrmex californicus) in order to understand the energetic consequences of highly integrated social behavior.
“Ant colonies reduce their energy spending per individual as the colony grows, similar to the size-dependent scaling of metabolic costs in birds and mammals discovered by Kleiber almost a century ago,” said Porfiri. “To date, a convincing explanation of how this collective response emerges is lacking.”
Utilizing tracked video recordings of several colonies, they discovered that individual ants did not increase their activity levels in proportion to colony size. This was a curious finding, because larger colonies mean more interactions among their members, and more opportunities for reinforcing behaviors.
To decode this behavior, the team — which also includes Pietro De Lellis from the University of Naples, Eighdi Aung and (Tandon alum) Nicole Abaid from Virginia Tech, Jane S. Waters from Providence College, and Santiago Meneses and Simon Garnier from the New Jersey Institute of Technology — applied scaling theories typically used to study human settlements. They derived relationships linking colony size to interaction networks and activity levels, hypothesizing that reverse social contagion was at play. Their hypothesis was supported by respirometry data, which revealed a potential connection between ant activity and metabolism.
Imagine you are an ant, and you see one of your fellow workers foraging for food. If you are governed by social contagion, you might also begin foraging so you don't look lazy. But the energy you expend foraging might not be worth it if one ant can efficiently gather the food. In this case, reverse social contagion tells you to kick your feet up and relax while you let your compatriot do the work, because you’ll need your energy later for another task. In this way, restraining social contagion makes the colony more efficient.
The study draws a fascinating parallel between insect colonies and human cities. In both systems, social interactions influence energy expenditure, but in opposite directions. Insect colonies exhibit hypometric scaling—activity levels do not increase proportionally with colony size. In contrast, human cities show hypermetric scaling, where energy expenditure grows faster than the population size.
“Human behavior is often driven by personal gain,” said Simon Garnier, Associate Professor of Biological Sciences at NJIT and senior author on the paper. “Ants, on the other hand, tend to prioritize the needs of the colony over their own. This has huge implications for understanding the differences between the organization of human and social insect societies.”
Unlike humans, ants manage their energy as a colony rather than individually, somehow displaying a cooperative response. This study shows that ants use reverse social contagion to regulate their overall activity and energy use. Essentially, when many ants are busy with a task, some will stop to prevent the entire colony from overworking. This behavior aligns with scaling laws and metabolic patterns seen in other biological systems.
In simpler terms, think of an ant colony as one big organism where every ant's actions are coordinated for the colony's benefit, not just their own. Future research will look into how exactly these ants communicate and manage their energy so efficiently.
This research not only sheds light on the regulatory mechanisms in ant colonies but also offers insights into the broader principles of social regulation across species. As we continue to explore these parallels, we may uncover more about the fundamental dynamics that govern both natural and human-made systems.
“This is the first step we are taking to understand and model energy regulation in ant colonies,” said Porfiri. “Is energy regulation accompanied by improved performance for the collective? Can we design algorithms for robot teams inspired by ants that can maximize performance and minimize energy costs? Can we learn some lessons for our city transportation networks? These are just some of the questions we would like to address next.”
This work was funded in part by a $3 million grant from the National Science Foundation, which over five years will aim to create a new paradigm for a better understanding of how loosely connected units can nonetheless collectively maintain function and homeostasis.
Porfiri, M., De Lellis, P., Aung, E., Meneses, S., Abaid, N., Waters, J. S., & Garnier, S. (2024). Reverse social contagion as a mechanism for regulating mass behaviors in highly integrated social systems. PNAS Nexus, 3(7). https://doi.org/10.1093/pnasnexus/pgae246
Arizona’s chief election officer endured more Twitter attacks than any other similar state official during 2022 midterm elections, new study reveals
In the lead-up to the 2022 U.S. midterm elections, Arizona's chief election officer Katie Hobbs received far more harassing messages on Twitter than any of her counterparts in other states. Over 30 percent of all tweets directed at her and at commenters on her posts fell into the "most aggressive" category of attacks.
That is a finding from researchers at NYU Tandon School of Engineering, Universidad del Rosario in Colombia, and the University of Murcia in Spain, in a paper published in Information Fusion that examines the phenomenon of online intimidation targeting state election administrators.
The research team used a machine learning model from the Perspective API – a tool developed by Google to identify abusive online comments – to analyze nearly 600,000 tweets mentioning chief state election officials nationwide during the weeks surrounding November 8, 2022. These tweets were rated based on six attributes: toxicity, severe toxicity, identity attack, insult, profanity, and threat.
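Conceptually, each tweet receives a score in [0, 1] for each of the six attributes, and tweets can then be bucketed by severity. The sketch below shows one plausible bucketing rule over such scores; the 0.8 and 0.5 thresholds are assumptions for illustration, not the paper's cutoffs, and real scores would come from Perspective API requests rather than hand-written dictionaries.

```python
# Bucket a tweet by Perspective-style attribute scores (each in [0, 1]).
# The attribute list mirrors the study; thresholds are illustrative only.

ATTRIBUTES = ["toxicity", "severe_toxicity", "identity_attack",
              "insult", "profanity", "threat"]

def bucket(scores: dict) -> str:
    """Classify one tweet by its maximum attribute score."""
    missing = [a for a in ATTRIBUTES if a not in scores]
    if missing:
        raise ValueError(f"missing attributes: {missing}")
    top = max(scores[a] for a in ATTRIBUTES)
    if top >= 0.8:
        return "most aggressive"
    if top >= 0.5:
        return "moderately aggressive"
    return "benign"

example = {"toxicity": 0.91, "severe_toxicity": 0.55, "identity_attack": 0.30,
           "insult": 0.85, "profanity": 0.40, "threat": 0.22}
```

Aggregating bucket counts per election official is then a simple tally, which is how per-state comparisons like the Arizona finding can be computed.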
Arizona produced the most Twitter activity in the study, representing almost 90 percent of all collected tweets, and had by far the highest volume of toxic language directed at its top election official, who was also running for governor. Sentiment analysis revealed these messages exhibited high rates of overt "attacks on the author" and "attacks on commenters," as well as generalized toxicity and inflammatory rhetoric.
"Many of the harassing messages made connections to the 2020 presidential election and baseless conspiracy theories about electoral fraud," said Damon McCoy, the paper’s senior author, a professor of Computer Science and Engineering at NYU Tandon. McCoy is co-director of Cybersecurity for Democracy, a multi-university research project of the Center for Cybersecurity at NYU Tandon and the Cybersecurity and Privacy Institute at the Northeastern University Khoury College of Computer Sciences that aims to expose online threats and recommend how to counter them.
To further investigate, the researchers employed entity recognition software to automatically detect references within the hateful messages. It flagged prevalent mentions of "Watergate" and inflammatory phrases like being "at ground zero" for election integrity issues when discussing Arizona.
Clustering analysis based on semantic similarities within the messages also allowed the researchers to identify distinct communities promoting hate speech and map their interactions.
While political speech is constitutionally protected, the researchers warn that abuse and intimidation of election workers could have a chilling effect, deterring qualified professionals from overseeing voting and eroding public trust.
"If we want to safeguard democracy, we must find ways to promote civil discourse and protect those ensuring fair elections from harassment and threats," said McCoy.
The study proposes using the data pipeline developed by the researchers to automatically detect abusive accounts and content for faster content moderation. It also calls for clearer policies around harassment of election officials and cultural shifts to uphold democratic norms.
The research adds to McCoy’s body of work that delves into identifying and combating online threats and misinformation that can harm democracy and civic life. Other studies have investigated the monetization of YouTube political conspiracy channels, looked at how well Facebook identifies and manages political ads on its platform, and explored U.S. political Facebook ads in Spanish during the 2020 presidential election.
Zapata Rozo, A., Campo-Archbold, A., Díaz-López, D., Gray, I., Pastor-Galindo, J., Nespoli, P., Gómez Mármol, F., & McCoy, D. (2024). Cyber democracy in the digital age: Characterizing hate networks in the 2022 US midterm elections. Information Fusion, 110, 102459. https://doi.org/10.1016/j.inffus.2024.102459
Synthetic data holds the key to determining best statewide transit investments
Synthetically generated population data can reveal the equity impacts of distributing transportation resources and funding across diverse regions, according to new research from NYU's Tandon School of Engineering that uses New York State as a case study.
Relying on an artificial dataset representing 19.5 million New York residents and over 120,000 modeled origin-destination trips, researchers from NYU Tandon's C2SMARTER, a Tier 1 U.S. Department of Transportation-funded University Transportation Center, determined how best to invest in transportation services when equitable benefits are an objective.
They presented the findings in a paper published in Transportation Research Part D: Transport and Environment.
"Policymakers often use surveys to allocate transportation resources, but these surveys frequently underrepresent low-income and marginalized communities," said Joseph Chow, Institute Associate Professor of Civil and Urban Engineering, who led the study. "We developed a completely new approach for transportation planning, showing that synthetic data can consistently assess equity impacts across large regions like New York State. Our statewide model parameters are available to any agency to study the multiple effects of new service designs, something previously impossible."
The research team developed what they call an “equity-aware choice-based decision support tool.”
Given a budget level, the proposed tool selects optimal service regions for one or two new mobility services according to four objectives: (1) maximizing total revenue; (2) maximizing total increased consumer surplus, meaning delivering cost savings to consumers; (3) minimizing consumer surplus disparity, meaning distributing benefits fairly across different groups; and (4) minimizing consumer surplus insufficiency, meaning ensuring baseline benefits even in areas that are less profitable.
The first two objectives focus on making the transportation system more efficient and profitable overall. The last two objectives emphasize making sure the benefits are distributed more equitably among different consumer groups and regions.
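A toy version of the selection problem makes the four objectives concrete: enumerate region sets within budget and score each against one objective. The three regions and all their figures below are invented for illustration; the actual tool optimizes over statewide synthetic trip data.

```python
from itertools import combinations

# Hypothetical candidate regions: cost to serve, revenue, and consumer
# surplus delivered to two illustrative population groups.
regions = {
    "metro":  {"cost": 60, "revenue": 100, "surplus": {"low_income": 10, "high_income": 40}},
    "suburb": {"cost": 30, "revenue": 35,  "surplus": {"low_income": 15, "high_income": 20}},
    "rural":  {"cost": 25, "revenue": 10,  "surplus": {"low_income": 25, "high_income": 5}},
}

def evaluate(selected):
    """Score a set of regions on all four objectives."""
    rev = sum(regions[r]["revenue"] for r in selected)
    groups = {"low_income": 0, "high_income": 0}
    for r in selected:
        for g, s in regions[r]["surplus"].items():
            groups[g] += s
    return {
        "revenue": rev,
        "surplus": sum(groups.values()),
        "disparity": max(groups.values()) - min(groups.values()),
        # Shortfall below an assumed baseline surplus of 20 per group.
        "insufficiency": max(0, 20 - min(groups.values())),
    }

def best(budget, objective, minimize=False):
    """Brute-force the best feasible region set for one objective."""
    feasible = [set(c) for k in range(1, len(regions) + 1)
                for c in combinations(regions, k)
                if sum(regions[r]["cost"] for r in c) <= budget]
    key = lambda s: evaluate(s)[objective]
    return min(feasible, key=key) if minimize else max(feasible, key=key)
```

Even in this miniature example the tension appears: the revenue-maximizing choice concentrates service in the metro region, while minimizing disparity favors regions whose benefits are spread more evenly across groups.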
Using the tool with New York State synthetic data, researchers focused on two hypothetical mobility services: ride-hailing services that offer shorter travel times but higher trip fares, and on-demand microtransit services that provide longer travel times with lower trip fares. The results showed that:
- Investing mostly in ride-hailing services, focusing on longer trips in metropolitan areas like New York City, maximized revenue.
- Also prioritizing ride-hailing services but covering shorter trips in metropolitan areas maximized consumer surplus.
- Investing mainly in on-demand microtransit service, targeting disadvantaged communities, minimized consumer surplus disparity.
- Splitting the budget between ride-hailing and microtransit services, covering both urban and rural areas, balanced equity and efficiency.
"Microtransit played an outsized role boosting equity, proving more viable in disadvantaged areas. But it needed subsidies to offset lower productivity than ride-hailing,” said Chow, who is also Deputy Director of C2SMARTER. “We hope this study is a step towards creating a way to analyze and allocate transportation resources nationally, to produce equitable outcomes throughout the U.S.”
Replica, a transportation data and analytics firm, provided the synthetic data for the study. The dataset combines real mobility, demographic, and built environment information with mathematical models, providing details like travel demand patterns, transportation network characteristics, and mode choices for a given region.
"The work Dr. Chow and the team at NYU Tandon are doing is precisely what we had in mind when making Replica data available," said Robert Regué, Director of Research and Development at Replica. "We believe synthetic data is the key to taking a more data-driven approach to creating more equitable, sustainable, and economically resilient cities, while protecting personal privacy. We are always excited to see our data contribute to such thoughtful, impactful research."
Along with Chow, the paper’s authors are NYU Tandon PhD candidate Xiyuan Ren and ChengHe Guan, Assistant Professor of Urban Science and Policy at NYU Shanghai and Global Network Assistant Professor at NYU. The researchers received funding support from C2SMARTER (U.S. Department of Transportation).
Ren, X., Chow, J. Y. J., & Guan, C. (2024). Mobility service design with equity-aware choice-based decision-support tool: New York case study. Transportation Research Part D: Transport and Environment, 132, 104255.
Privacy-enhancing browser extensions fail to meet user needs, new study finds
Popular web browser extensions designed to protect user privacy and block online ads are falling short, according to NYU Tandon School of Engineering researchers, who are proposing new measurement methodologies to better uncover and quantify these shortcomings.
Led by Rachel Greenstadt, professor in the NYU Tandon Computer Science and Engineering (CSE) Department, the team will present its study at the 19th ACM ASIA Conference on Computer and Communications Security, taking place July 1–5, 2024 in Singapore.
Through an analysis of over 40,000 user reviews of seven of the most popular privacy-preserving Chrome extensions, the researchers identified five key concerns among users: Performance, referring to the extent to which the extensions slowed down the system; Web compatibility, indicating how much they disrupted websites or caused substantial rendering delays; Data and Privacy Policy, pertaining to how the extensions handled user data; Effectiveness, evaluating how well they fulfilled their advertised purpose; and Default Configurations, assessing users' trust in the default settings.
"Our study found that there's a disconnect between what users want and what these extensions are actually providing," said Ritik Roongta, CSE PhD student who is the lead author of the study. "Developers need to do a much better job of understanding and addressing the real-world pain points."
The researchers analyzed extensions that fall into two main groups. The first category, dubbed “Ad-Blockers & Privacy Protection,” comprises extensions that block advertisements and third-party trackers, including AdBlock Plus (ABP), uBlock Origin, Adguard, and Ghostery.
The second category, called “Privacy Protection,” encompasses extensions primarily focused on enhancing user privacy by blocking trackers and other privacy-invasive elements. This category includes Privacy Badger, Decentraleyes, and Disconnect.
The research team found that existing academic studies and benchmarking efforts had comprehensively explored just 4 out of the 14 key metrics underlying these five main user concerns. Crucial aspects like RAM usage overhead, ad-blocker detection likelihood, privacy policy soundness and adequacy of filtering rules were overlooked.
To bridge these research gaps, the researchers designed novel measurement methodologies and conducted extensive evaluation of the extensions against the unexplored metrics, providing a new benchmarking framework for evaluating the strengths and shortcomings of these privacy tools.
Their experiments involved smart crawlers visiting over 1,500 websites to analyze performance hits, compatibility issues, privacy policy strengths, ad-blocking capabilities and filter list configurations.
"The goal of this study is not to compare extensions specifically but to come up with a standardized benchmarking framework that addresses all user concerns so that the user can make informed decisions,” said Roongta. “As extensions evolve with every update, they might over- or underperform in different metrics at different times.”
The new measurement methodologies the researchers applied painted a mixed picture of the extensions they studied. While extensions like uBlock Origin kept performance overheads low, most others, like ABP, exhibited significant CPU and memory overheads. Privacy Badger blocked ads and third-party trackers effectively, while Ghostery struggled with them.
“Most of our analysis shows ABP needs to improve on metrics,” said Roongta. “That’s because it whitelists certain ads to show to the users. While this new dimension is often perceived critically by the users, it is important to sustain a free Internet. It will be interesting to see how user preferences change as these standards evolve with the advertiser policies over time and the system gets better so that the overhead caused by the extensions is negligible."
The study highlighted instances of potential permission abuse and non-compliance with data protection regulations by some of the evaluated extensions. It provided recommendations for extension developers to enhance transparency around data practices.
The research underscores the pressing need for more rigorous analysis and systematic benchmarking of privacy-preserving browser additions that millions entrust with their online data and browsing experience daily. It contributes to Greenstadt’s body of research that explores what happens when people try to use privacy-enhancing technologies and how the Internet responds.
The following table shows the new and re-assessed metrics the NYU Tandon researchers introduced to evaluate browser extensions, as presented in From User Insights to Actionable Metrics: A User-Focused Evaluation of Privacy-Preserving Browser Extensions.
| User Concern | Measurements |
|---|---|
| Performance | RAM usage: measure of the RAM used by the extensions during website loading. |
| | CPU usage: measure of the CPU cycles used by the extensions during website loading (studied before, but re-assessed with enhanced measurement methods). |
| | Data usage: measure of the disk space used by the extensions during website loading (studied before, but re-assessed with enhanced measurement methods). |
| Web compatibility | Ad-blocker detection prompt: the number of websites that either employ JavaScript to detect the presence of an ad-blocker extension, or display a prompt asking the user to disable their ad-blocker. |
| | Unable to load: websites taking longer than 60 seconds to load when the extensions are present. |
| Data and Privacy Policy | Permissions: evaluation of the extra permissions requested beyond those needed for the actual functioning of the extension. |
| | Privacy policy: evaluation of the privacy policies of the extensions. |
| Extension Effectiveness | Ads: the extension's ability to block ads and third-party trackers. |
| Default Configurations | Default filter lists: the set of rules used to identify and block web content such as advertisements, trackers, and other unwanted elements. |
| | Acceptable Ads: the ABP filter list that allows certain advertisements adhering to acceptable-ads standards to appear. |
NYU Tandon researchers develop technology that may allow stroke patients to undergo rehab at home
For survivors of strokes, which afflict nearly 800,000 Americans each year, regaining fine motor skills like writing and using utensils is critical for recovering independence and quality of life. But getting intensive, frequent rehabilitation therapy can be challenging and expensive.
Now, researchers at NYU Tandon School of Engineering are developing a new technology that could allow stroke patients to undergo rehabilitation exercises at home by tracking their wrist movements through a simple setup: a smartphone strapped to the forearm and a low-cost gaming controller called the Novint Falcon.
The Novint Falcon, a desktop robot typically used for video games, can guide users through specific arm motions and track the trajectory of its controller. But it cannot directly measure the angle of the user's wrist, which is essential data for therapists providing remote rehabilitation.
In a paper presented at SPIE Smart Structures + Nondestructive Evaluation 2024, the researchers proposed using the Falcon in tandem with a smartphone's built-in motion sensors to precisely monitor wrist angles during rehab exercises.
"Patients would strap their phone to their forearm and manipulate this robot," said Maurizio Porfiri, NYU Tandon Institute Professor and director of its Center for Urban Science + Progress (CUSP), who is the paper’s senior author. "Data from the phone's inertial sensors can then be combined with the robot's measurements through machine learning to infer the patient's wrist angle."
The researchers collected data from a healthy subject performing tasks with the Falcon while wearing motion sensors on the forearm and hand to capture the true wrist angle. They then trained an algorithm to predict the wrist angles based on the sensor data and Falcon controller movements.
The resulting algorithm could predict wrist angles with over 90% accuracy, a promising initial step toward enabling remote therapy with real-time feedback in the absence of an in-person therapist.
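The idea of learning a mapping from combined sensor and robot measurements to wrist angle can be illustrated with a minimal sketch on synthetic stand-in data. The feature count, noise level, and ordinary-least-squares model below are illustrative assumptions, not the study's actual sensors or algorithm:

```python
import numpy as np

# Minimal sketch of regressing wrist angle from combined inertial and
# robot measurements. All data here is synthetic; the study's real
# features, sensors, and learned model are not reproduced.
rng = np.random.default_rng(0)

n_train, n_feat = 500, 9   # e.g., 6 IMU channels + 3 Falcon coordinates
X = rng.normal(size=(n_train, n_feat))
true_w = rng.normal(size=n_feat)            # unknown "ground truth" mapping
y = X @ true_w + rng.normal(scale=0.05, size=n_train)  # noisy wrist angles

# Ordinary least squares stands in for the paper's learned model.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Score on held-out synthetic samples with the coefficient of determination.
X_test = rng.normal(size=(100, n_feat))
y_test = X_test @ true_w
pred = X_test @ w
r2 = 1 - np.sum((y_test - pred) ** 2) / np.sum((y_test - y_test.mean()) ** 2)
print(f"held-out R^2: {r2:.3f}")
```

In practice a nonlinear model would likely be needed for real wrist kinematics; the linear fit simply shows the supervised-learning setup of training on labeled wrist angles and predicting on new motion data.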
"This technology could allow patients to undergo rehabilitation exercises at home while providing detailed data to therapists remotely assessing their progress," said Roni Barak Ventura, the paper’s lead author who was an NYU Tandon postdoctoral fellow at the time of the study. "It's a low-cost, user-friendly approach to increasing access to crucial post-stroke care."
The researchers plan to further refine the algorithm using data from more subjects. Ultimately, they hope the system could help stroke survivors stick to intensive rehab regimens from the comfort of their homes.
"The ability to do rehabilitation exercises at home with automatic tracking could dramatically improve quality of life for stroke patients," said Barak Ventura. "This portable, affordable technology has great potential for making a difficult recovery process much more accessible."
This study adds to NYU Tandon’s body of work that aims to improve stroke recovery. In 2022, researchers from NYU Tandon began collaborating with the FDA to design a regulatory science tool based on biomarkers to objectively assess the efficacy of rehabilitation devices for post-stroke motor recovery and guide their optimal usage. A study from earlier this year unveiled advances in technology that uses implanted brain electrodes to recreate the speaking voice of someone who has lost speech ability, which can be an outcome of stroke.
In addition to Porfiri and Barak Ventura, the study’s authors are Angelo Catalano, who earned an MS from NYU Tandon in 2024, and Rayan Succar, an NYU Tandon PhD candidate. The study was funded by grants from the National Science Foundation.
Roni Barak Ventura, Angelo Catalano, Rayan Succar, and Maurizio Porfiri, "Automating the assessment of wrist motion in telerehabilitation with haptic devices", Proc. SPIE 12948, Soft Mechatronics and Wearable Systems, 129480F (9 May 2024); https://doi.org/10.1117/12.3010545
Cutting-edge cancer treatments research promises more effective interventions
Cancer is one of the most devastating diagnoses that a person can receive, and it is a society-wide problem. According to the National Cancer Institute, an estimated 2,001,140 new cases of cancer will be diagnosed in the United States and 611,720 people will die from the disease this year. While cancer treatments have improved substantially over recent decades, researchers remain laser-focused on developing strategies to defeat cancers, especially those that have been resistant to traditional interventions.
New research from Jin Kim Montclare, Professor of Chemical and Biomolecular Engineering, may lend hope to cancer patients in the future. Montclare’s lab, which blends chemistry and genetic engineering to create customized artificial proteins for targeting human disorders, drug delivery, and tissue regeneration, has recently published two papers that take aim at cancers that have been difficult to treat.
MAPping a course to recovery
In the ongoing quest to develop more effective cancer therapies, these researchers have unveiled a promising approach utilizing protein-based targeting agents. A recent study published in Biomaterials Science introduces a novel strategy that harnesses the power of multivalent assembled proteins (MAPs) to target hypoxic tumors with unprecedented precision and efficacy.
Traditional cancer treatments often rely on passive or active targeting mechanisms to deliver therapeutic agents to tumor sites. However, these approaches have limitations, particularly in overcoming physiological and pathological barriers. To address these challenges, the research team focused on exploiting the unique features of the tumor microenvironment (TME), specifically its hypoxic conditions.
Hypoxia, a characteristic feature of many solid tumors, is known to play a crucial role in tumor progression and resistance to therapy. By targeting hypoxia-inducible factor 1 alpha (HIF1α), a key regulator of cellular response to low oxygen levels, the researchers aimed to develop a more effective strategy for tumor-specific drug delivery.
Previous efforts to target HIF1α have been hindered by the instability and limited binding abilities of the peptide-based molecules used. To overcome these obstacles, the team turned to MAPs, which offer the advantages of high stability and multivalency.
Drawing inspiration from their successful development of MAPs targeting COVID-19, the researchers engineered HIF1α-MAPs (H-MAPs) by grafting critical residues of HIF1α onto the MAP scaffold. This innovative design resulted in H-MAPs with picomolar binding affinities, significantly surpassing previous approaches. In vivo studies showed promising results, with H-MAPs effectively homing in on hypoxic tumors.
“This is a very promising result, using a material that is degradable within the body and likely will limit side effects to treatments,” said Montclare. “We’re taking advantage of the building blocks of our own bodies and using those protein compositions to treat the body — and that’s where we’re making big progress.”
Montclare’s findings suggest that H-MAPs hold great potential as targeted therapeutic agents for cancer treatment. With further refinement and exploration, H-MAPs could offer a new avenue for precision medicine, providing clinicians with a powerful tool to combat cancer while minimizing side effects and maximizing therapeutic efficacy.
The development of H-MAPs represents a significant advancement in the field of cancer therapy, highlighting the importance of innovative approaches that leverage the intricacies of the tumor microenvironment. As researchers continue to unravel the complexities of cancer biology, protein-based targeting agents like H-MAPs offer hope for improved outcomes and better quality of life for cancer patients.
Targeting stubborn breast cancer subtypes
Not all tumors are hypoxic, however, and some forms of cancers are much harder to target than others.
Triple-negative breast cancer (TNBC) poses a significant challenge in the realm of oncology due to its resistance to conventional targeted therapies. Unlike other breast cancer subtypes, TNBC lacks the receptor biomarkers that standard treatments target, such as estrogen receptors and human epidermal growth factor receptor 2, making it unresponsive to those therapies. Consequently, chemotherapy remains the primary option for TNBC patients. However, the efficacy of chemotherapy is often hindered by the development of drug resistance, necessitating innovative approaches to enhance treatment outcomes.
In recent years, there has been a surge of interest in improving the efficacy of chemotherapy for TNBC through enhanced drug delivery systems. One promising avenue involves the use of biocompatible materials, including lipids, polymers, and proteins, as carriers to encapsulate chemotherapeutic agents. Among these materials, protein-based hydrogels have emerged as a particularly attractive option due to their biocompatibility, tunable properties, and ability to achieve controlled drug release.
A recent breakthrough in the field comes from the development of a novel protein-based hydrogel, known as Q8, which demonstrates remarkable improvements over previous materials.
By fine-tuning the molecular characteristics of the hydrogel using a machine learning algorithm, the researchers were able to engineer Q8 to exhibit a two-fold increase in gelation rate and mechanical strength. These enhancements pave the way for Q8 to serve as a promising platform for sustained chemotherapeutic delivery.
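The general pattern of using a learned model to tune a material property can be sketched with a generic surrogate-model loop: fit a simple model to a few measured formulations, then pick the formulation the model predicts performs best. Every number and function below is invented for illustration; the Q8 study's actual algorithm, features, and data are not reproduced here:

```python
import numpy as np

# Generic surrogate-model sketch, in the spirit of machine-learning-guided
# tuning of a material property. All values are invented stand-ins.
rng = np.random.default_rng(1)

def measured_gelation_rate(x):
    """Stand-in 'experiment': rate peaks at a formulation parameter of 0.6."""
    return 1.0 - (x - 0.6) ** 2 + rng.normal(scale=0.01)

# A handful of 'measured' formulations across the parameter range.
xs = np.linspace(0.0, 1.0, 8)
ys = np.array([measured_gelation_rate(x) for x in xs])

# Fit a quadratic surrogate, then pick the formulation it predicts is best.
coeffs = np.polyfit(xs, ys, deg=2)
candidates = np.linspace(0.0, 1.0, 101)
best = candidates[np.argmax(np.polyval(coeffs, candidates))]
print(f"surrogate-optimal formulation parameter: {best:.2f}")
```

The appeal of this loop is that each round of model fitting and candidate selection replaces many wet-lab experiments with cheap predictions, reserving bench time for the most promising formulations.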
In a groundbreaking study, the researchers investigated the therapeutic potential of Q8 for the treatment of TNBC in vivo using a mouse model. Remarkably, the delivery of doxorubicin encapsulated in Q8 resulted in significantly improved tumor suppression compared to conventional doxorubicin treatment alone. This achievement marks a significant milestone in the development of non-invasive and targeted therapies for TNBC, offering new hope for patients facing this aggressive form of breast cancer.
The success of Q8 underscores the immense potential of protein-based hydrogels as versatile platforms for drug delivery in cancer therapy. By leveraging the unique properties of these materials, researchers can overcome longstanding challenges associated with chemotherapy, including poor drug bioavailability and resistance. Moving forward, further advancements in protein engineering and hydrogel design hold the promise of revolutionizing cancer treatment paradigms, offering renewed optimism for patients battling TNBC and other challenging malignancies.
NYU Tandon researchers help bring the fossil record into the digital age with low-cost 3D scanning
Inside Museu de Paleontologia Plácido Cidade Nuvens (MPPCN), a museum in rural northeastern Brazil, sits an unassuming piece of equipment that promises to digitize one of the world's most prized collections of ancient fossils.
PaleoScan, designed by NYU Tandon School of Engineering researchers as part of a collaboration between Brazilian paleontologists and American computer scientists, is a first-of-its-kind 3D fossil scanner that allows museum staff to rapidly and easily scan and digitally archive their vast fossil collection for global accessibility. In the process, the technology could help democratize paleontology and archaeology.
The research team presented a paper about PaleoScan’s development at the ACM CHI Conference on Human Factors in Computing Systems this month in Honolulu, Hawaii.
"PaleoScan was created specifically to enable museums with limited resources to digitize vital fossil collections," said Cláudio Silva, NYU Tandon Institute Professor of Computer Science and Engineering and Co-Director of the Visualization and Data Analytics Research Center (VIDA), who led PaleoScan’s development. "This low-cost and easy-to-assemble technology delivers results that rival expensive CT scanners while working at a much faster pace. Affordability and high-volume output were paramount goals. We achieved them."
MPPCN’s 11,000 diverse and well-preserved fossils from Brazil’s Araripe Basin, one of the planet's most fossil-rich regions, hold immense scientific value.
Digitizing the museum's world-class collection has proved enormously challenging, however. Because the museum is situated in the remote city of Santana do Cariri, transporting the fragile fossils over two hours to the nearest urban center risked damaging the specimens. Yet the museum’s lack of technologically trained personnel, reliable internet, computing resources and funds for expensive scanning technology made on-site digitizing unviable.
"PaleoScan emerges as a powerful system for fossil investigation," said Naiara Cipriano Oliveira, visiting researcher at the MPPCN at Universidade Regional do Cariri in Brazil, who worked with Silva on installing the device and oversees the digitization of the collection. "Thousands of irreplaceable fossils will soon be readily available to scientists anywhere in the world, ensuring that evidence of ancient life is studied and preserved like never before."
PaleoScan is a fully self-contained system with innovative hardware and software components. At its core is a compact, automated camera rig that can slide along two axes above fossils placed on a calibrated surface. An intuitive touchscreen interface allows an operator to simply select which fossils to scan and the desired resolution.
The scanner's integrated camera, a mirrorless digital camera, then automatically captures a series of overlapping photos from different angles while LED lights ensure consistent illumination. This raw image data, stored on an SD card, is periodically uploaded to a cloud-based software pipeline developed at NYU Tandon.
The "PaleoDP" software processes the images using photogrammetry techniques to discern precise color, texture and geometry for each fossil down to sub-millimeter accuracy. Paleontologists can then easily annotate the 3D fossil data with metadata before it is added to an online database for other researchers to study remotely.
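A key requirement for photogrammetry is that successive photos overlap enough for features to be matched across images. The capture-planning step for a two-axis rig like PaleoScan's can be sketched as a small grid computation; the tray size, camera footprint, and overlap fraction below are hypothetical values, not the device's published settings:

```python
import numpy as np

def capture_waypoints(width_mm, height_mm, footprint_mm, overlap=0.5):
    """Plan a grid of camera stops over the scan bed so adjacent photos
    overlap by the given fraction (as photogrammetry requires).

    All dimensions are illustrative assumptions; PaleoScan's real bed
    size, camera footprint, and overlap settings are not given here.
    """
    step = footprint_mm * (1 - overlap)          # rig advance per shot
    xs = np.arange(0, width_mm + step, step)     # stops along the first axis
    ys = np.arange(0, height_mm + step, step)    # stops along the second axis
    return [(float(x), float(y)) for y in ys for x in xs]

# A hypothetical 300 mm x 200 mm tray, 50 mm camera footprint, 50% overlap.
waypoints = capture_waypoints(300, 200, footprint_mm=50, overlap=0.5)
print(f"{len(waypoints)} photos per pass")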
The PaleoScan device itself was assembled for less than $3,500, making it a low-cost solution for museums and fossil repositories to rapidly digitize their collections and make their specimens available for anyone to study.
In a pilot deployment last year, PaleoScan digitized more than 200 fossils at MPPCN in just weeks. At that rate, the entire specimen collection could be digitized and backed up in about a year.
High-resolution PaleoScan images revealed remnants of the ganoid scales of an exquisitely preserved fossil fish species, offering insights about its evolutionary biology and environment.
“The incredible preservation and abundance of these fossils from Brazil offer a rare opportunity to study the ecology of ancient environments. Now with PaleoScan, a scanner can come to the fossils rather than having fragile specimens sent to expensive scanners, allowing high-quality scans to be easily shared with the global scientific community,” said Akinobu Watanabe, Ph.D., Associate Professor of Anatomy at New York Institute of Technology and Research Associate at the American Museum of Natural History, who assisted NYU researchers in developing and deploying the device technology.
Looking ahead, scientists envision fleets of PaleoScan devices deploying to digitize neglected fossil collections globally, especially at underfunded museums and remote field sites lacking modern equipment. The portable, low-cost scanner could democratize access to the world’s archived fossil record for educators and researchers.
PaleoScan builds on Silva’s track record of developing new technologies for museums. He led an international team that worked with The Frick Collection to create ARIES - ARt Image Exploration Space, an interactive image manipulation system that enables the exploration and organization of fine digital art. The PaleoScan project has been partially supported through funding from the National Science Foundation in the U.S. and Fundação Cearense de Apoio ao Desenvolvimento (Funcap) in Brazil.
CHI '24: Proceedings of the CHI Conference on Human Factors in Computing Systems; May 2024; Article No.: 708; Pages 1–16 https://doi.org/10.1145/3613904.3642020
Deep-sea sponge's “zero-energy” flow control could inspire new energy-efficient designs
The Venus flower basket sponge, with its delicate glass-like lattice outer skeleton, has long intrigued researchers seeking to explain how this fragile-seeming creature’s body can withstand the harsh conditions of the deep sea where it lives.
Now, new research reveals yet another engineering feat of this ancient animal’s structure: its ability to filter feed using only the faint ambient currents of the ocean depths, no pumping required.
This discovery of natural “zero-energy” flow control by an international research team co-led by University of Rome Tor Vergata and NYU Tandon School of Engineering could help engineers design more efficient chemical reactors, air purification systems, heat exchangers, hydraulic systems, and aerodynamic surfaces.
In a study published in Physical Review Letters, the team used extremely high-resolution computer simulations to show how the skeletal structure of the Venus flower basket sponge (Euplectella aspergillum) diverts very slow deep-sea currents to flow upwards into its central body cavity, so it can feed on plankton and other marine detritus it filters out of the water.
The sponge pulls this off via the helical ridges on its outer surface, which function like a spiral staircase. These allow it to passively draw water upwards through its porous, lattice-like frame, all without the energy demands of pumping.
"Our research settles a debate that has emerged in recent years: the Venus flower basket sponge may be able to draw in nutrients passively, without any active pumping mechanism," said Maurizio Porfiri, NYU Tandon Institute Professor and director of its Center for Urban Science + Progress (CUSP), who co-led the study and co-supervised the research. "It's an incredible adaptation allowing this filter feeder to thrive in currents normally unsuitable for suspension feeding."
At higher flow speeds, the lattice structure helps reduce drag on the organism. But it is in the near-stillness of the deep ocean floors that this natural ventilation system is most remarkable, and demonstrates just how well the sponge accommodates its harsh environment. The study found that the sponge’s ability to passively draw in food works only at the very slow current speeds – just centimeters per second – of its habitat.
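The "centimeters per second" regime can be put in dimensionless terms with a quick Reynolds-number estimate. The input values below are rough illustrative assumptions, not figures from the study:

```python
# Back-of-the-envelope estimate of the flow regime around the sponge.
# All three inputs are assumptions for illustration only:
U = 0.05      # ambient current speed, a few centimeters per second (m/s)
L = 0.25      # assumed characteristic body length of the sponge (m)
nu = 1.8e-6   # kinematic viscosity of cold seawater (m^2/s)

Re = U * L / nu   # Reynolds number: ratio of inertial to viscous effects
print(f"Reynolds number ~ {Re:.0f}")
```

Under these assumed values the Reynolds number comes out in the low thousands, characterizing the gentle flows in which the simulations found the sponge's passive intake to operate.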
"From an engineering perspective, the skeletal system of the sponge shows remarkable adaptations to its environment, not only from the structural point of view, but also for what concerns its fluid dynamic performance," said Giacomo Falcucci of Tor Vergata University of Rome and Harvard University, the paper’s first author. Along with Porfiri, Falcucci co-led the study, co-supervised the research and designed the computer simulations. "The sponge has arrived at an elegant solution for maximizing nutrient supply while operating entirely through passive mechanisms."
Researchers used the powerful Leonardo supercomputer at CINECA, a supercomputing center in Italy, to create a highly realistic 3D replica of the sponge, containing around 100 billion individual points that recreate the sponge's complex helical ridge structure. This “digital twin” allows experimentation that is impossible on live sponges, which cannot survive outside their deep-sea environment.
The team performed highly detailed simulations of water flow around and inside the computer model of the skeleton of the Venus flower basket sponge. With Leonardo's massive computing power, allowing quadrillions of calculations per second, they could simulate a wide range of water flow speeds and conditions.
The researchers say the biomimetic engineering insights they uncovered could help guide the design of more efficient reactors by optimizing flow patterns inside while minimizing drag outside. Similar ridged, porous surfaces could enhance air filtration and ventilation systems in skyscrapers and other structures. The asymmetric, helical ridges may even inspire low-drag hulls or fuselages that stay streamlined while promoting interior air flows.
The study builds upon the team’s prior Euplectella aspergillum research published in Nature in 2021, which presented a first-ever simulation of the deep-sea sponge and how it responds to and influences the flow of nearby water.
In addition to Porfiri and Falcucci, the current study’s authors are Giorgio Amati of CINECA; Gino Bella of Niccolò Cusano University; Andrea Luigi Facci of University of Tuscia; Vesselin K. Krastev of University of Rome Tor Vergata; Giovanni Polverino of University of Tuscia, Monash University, and University of Western Australia; and Sauro Succi of the Italian Institute of Technology.
A grant from the National Science Foundation supported the research. Other funding came from CINECA, Next Generation EU, European Research Council, Monash University and University of Tuscia.