Spotlighting the Future

Perspectives from the Tandon community on quantum, connectivity, climate, and healthcare.

What’s Next for Quantum Computing?

With Incoming Assistant Professor Aziza Almanakly

Quantum computing has long occupied that space between scientific curiosity and futuristic promise. But in the past few years, that promise has begun to look more like a plan. I work on quantum interconnects — technologies that help different quantum chips talk to each other — and in my view, the field is entering a critical new phase.

Let’s start with the basics. Traditional computers process information using bits — zeros and ones. But quantum computers use qubits, which are governed by the laws of quantum mechanics. That means they can exist in a state called superposition, where a qubit is both zero and one at the same time, at least until it’s measured. Qubits can also become entangled, so that the measurement outcome of one is correlated with the outcome of another, even across distance.
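
Both properties can be sketched numerically. The toy below (Python with NumPy; the vectors and the Hadamard gate are standard textbook objects, not tied to any particular quantum hardware or library) puts a simulated qubit into superposition, shows that repeated measurements split roughly evenly between zero and one, and shows the perfect correlation of an entangled pair:

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit is a 2-vector of complex amplitudes over the basis states |0>, |1>.
zero = np.array([1.0, 0.0])

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
plus = H @ zero

# Measuring collapses the state; outcome probabilities are the squared
# amplitude magnitudes (the Born rule).
probs = np.abs(plus) ** 2                      # [0.5, 0.5]
samples = rng.choice([0, 1], size=10_000, p=probs)
print(samples.mean())                          # close to 0.5: half 0s, half 1s

# Entanglement: the two-qubit Bell state (|00> + |11>) / sqrt(2).
bell = np.array([1.0, 0, 0, 1]) / np.sqrt(2)   # basis order: 00, 01, 10, 11
pair_probs = np.abs(bell) ** 2                 # only 00 or 11 ever occurs,
print(pair_probs)                              # so the two outcomes always agree
```

Simulating n qubits this way takes a state vector of 2ⁿ amplitudes, which is precisely why classical machines cannot keep up past a few dozen qubits.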

These properties open up new possibilities for solving certain types of problems. For example, factoring large numbers is notoriously hard for classical computers, and that difficulty underpins much of modern encryption. A quantum algorithm known as Shor’s algorithm could, in theory, factor those same numbers much faster than any known classical algorithm, making it a potential game-changer for cybersecurity. While we’re not there yet, the threat has spurred the development of post-quantum cryptography: algorithms designed to resist attacks even from quantum computers.
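
The classical scaffolding of Shor's algorithm is easy to sketch: the quantum speedup lies entirely in finding the period r of aˣ mod N, and everything after that is ordinary arithmetic. The toy below (Python; it finds the period by brute force, so it gains nothing over trial division and only illustrates the reduction) recovers a factor of 15:

```python
from math import gcd

def find_order(a, n):
    """Smallest r > 0 with a**r % n == 1 (brute force; the quantum part
    of Shor's algorithm does this step exponentially faster)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical pre- and post-processing around the period-finding step."""
    g = gcd(a, n)
    if g != 1:
        return g                  # lucky guess: a already shares a factor
    r = find_order(a, n)
    if r % 2 == 1:
        return None               # odd period: try another base a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: try another base
    return gcd(y - 1, n)          # guaranteed nontrivial factor of n

print(shor_classical(15, 7))      # → 3, since 15 = 3 × 5
```

For the 2048-bit numbers used in real encryption, the brute-force loop above is hopeless; a quantum computer running the period-finding step is what would change the picture.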

Another exciting application is quantum simulation. In this area, we use qubits to model other quantum systems, such as molecules or materials. This could drastically improve how we design new pharmaceuticals or study protein folding — tasks that are challenging on classical computers because of the complexity of quantum interactions.

So what’s holding us back? Simply put: errors. Quantum systems are fragile. A state-of-the-art classical processor operates with an error rate of around 10⁻²⁰; today's quantum computers sit at about 10⁻³, a gap of seventeen orders of magnitude. To do anything useful, we need millions of qubits working with extremely high fidelity. And to get there, we need quantum error correction.

Instead of relying on a single qubit to represent one quantum bit of information, error correction schemes encode one “logical qubit” into many physical qubits. The idea is that while individual qubits might fail, the system as a whole can still function reliably. In 2024, Google Quantum AI demonstrated a proof-of-principle version of this using superconducting qubits, showing that adding more noisy qubits could still reduce the overall error rate. That’s a major milestone.
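
Google's demonstration used the surface code, which is far beyond a short sketch, but the underlying idea — redundancy plus majority voting — already appears in the classical three-bit repetition code. The simulation below (Python with NumPy; the parameters are illustrative, and a real quantum code must also correct phase errors, which repetition alone cannot) shows the logical error rate dropping below the physical one whenever the physical rate is under 50%:

```python
import numpy as np

rng = np.random.default_rng(1)

def logical_error_rate(p, n_qubits=3, trials=100_000):
    """Encode a logical 0 as n_qubits physical 0s, flip each bit
    independently with probability p, then decode by majority vote."""
    flips = rng.random((trials, n_qubits)) < p
    wrong = flips.sum(axis=1) > n_qubits // 2   # majority got flipped
    return wrong.mean()

p = 0.01                          # physical error rate per qubit
print(logical_error_rate(p))      # roughly 3e-4: well below the physical rate
```

For three qubits the logical rate scales as roughly 3p², so halving the physical error rate quarters the logical one — the same leverage that makes adding more (still noisy) qubits pay off.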

And it’s not just superconducting qubits. Other groups, like one at Harvard, are working on neutral atom qubits — where actual atoms are used as information carriers and moved around with lasers to implement quantum logic gates. They too are beginning to demonstrate early forms of error correction. We don’t yet know which type of qubit will “win.” It’s entirely possible that the most practical quantum computers will be hybrids, using one type for memory and another for computation.

This brings us to scalability. Today’s most advanced processors, from industry leaders like IBM Quantum and Google Quantum AI, contain up to a few hundred qubits. But we’ll need far more than that. To make quantum computers practical, we’re starting to think in terms of distributed systems — multiple quantum chips working together. That’s where my research comes in: developing quantum interconnects to move quantum information between processors efficiently.

Looking five to ten years ahead, we may see quantum processors with a few thousand qubits operating in tandem, enabled by this modular, networked architecture. We’re still early, but the roadmap is becoming clearer. While quantum computing faces enormous technical challenges, the research community remains optimistic that with time, effort, and investment, these hurdles can be overcome.

6G's Missing Piece: The Killer App That Could Drive Next-Gen Wireless

With Professor Sundeep Rangan, Director, NYU WIRELESS

6G is in flux. Network operators—companies like T-Mobile, AT&T, and Verizon—are asking what can create the market pull for the extra wireless capacity 6G will provide. That's unusual. 

Unlike in previous generations, we don't have an immediate problem that more bandwidth can solve. The big question isn't whether engineers can make 6G work. They will. It's what transformative use case will generate the demand.

Networks evolve in roughly ten-year cycles. In 2007, the transition to smartphones fundamentally changed global communication – requiring networks that could handle vastly more data – fueling 4G's rollout around 2010.

The situation is markedly different today. Current wireless networks already provide reliable connectivity. 5G succeeded technologically when it arrived a decade after 4G. Yet some skeptics still call it a “failure” because it lacks a killer application that fully exploits its advances, let alone one that ignites the momentum for another generational leap.

So where might that groundswell of demand come from? 

Many expect virtual and augmented reality, or connected vehicles, to create it. Meta and Apple have invested hugely in VR. 6G will likely offer the very high data rates and low latency that both require. If either scales widely enough, it could be a 6G catalyst.

Another potential avenue is satellite integration. Starlink has revolutionized satellite communications by deploying over 8,000 active Low Earth Orbit (LEO) satellites, an unprecedented mega-constellation that is rapidly growing. Enabled by reusable launch vehicles, SpaceX has dramatically lowered the cost and frequency of satellite deployment, making global broadband coverage a reality and connecting areas where cellular infrastructure does not exist.  

One 6G vision is to connect phones directly to satellites the way they do to towers today. But because satellites orbit hundreds of kilometers above Earth, far beyond the short hops to nearby towers, delivering reliable, high-bandwidth service is a daunting challenge. Solving it could be a defining feature of 6G.
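
A back-of-envelope calculation shows the scale of the problem. Assuming a 550 km orbital altitude (typical of current Starlink shells) and a 1 km hop to a terrestrial tower — both figures illustrative — straight-line propagation delay alone differs by a factor of 550:

```python
C = 299_792_458                       # speed of light in vacuum, m/s

def one_way_delay_ms(distance_m):
    """Straight-line propagation delay, ignoring processing and queuing."""
    return distance_m / C * 1e3

tower = one_way_delay_ms(1_000)       # ~0.003 ms to a nearby tower
leo = one_way_delay_ms(550_000)       # ~1.8 ms straight up to a LEO satellite
print(f"tower: {tower:.4f} ms, LEO: {leo:.2f} ms ({leo / tower:.0f}x farther)")
```

And that is the best case: a satellite near the horizon, or a route bounced between satellites, adds considerably more distance, on top of the signal-strength loss that grows with the square of the range.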

Security and cryptography could also be key demand drivers. Today's networks weren't designed for resiliency against hostile attacks, a gap that 6G could address. A major security breach or cyberattack could make people suddenly prioritize protected communications, creating the urgency that drives 6G adoption.

Further afield, so-called “privacy-preserving computation” techniques — such as homomorphic encryption and secure multi-party computation — offer a transformative shift in how sensitive data is handled. Instead of moving raw data to centralized servers, these methods enable processing while keeping personal information encrypted.

If deployed, they could redefine user privacy by allowing organizations to train models or perform analytics without ever accessing the underlying data, dramatically reducing exposure risks and enabling compliance with stringent data protection laws. These methods, however, explode the size of data, potentially creating a significant new demand for network capacity. 
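
Homomorphic encryption itself is heavyweight, but a close cousin, additive secret sharing (a basic building block of secure multi-party computation), fits in a few lines and illustrates both points: no single party sees the raw data, and the data gets bigger. The sketch below (Python; the three-hospital scenario and the counts are invented for illustration) computes a joint total without pooling inputs:

```python
import secrets

P = 2**61 - 1         # public prime modulus; all arithmetic is done mod P

def share(value, n_parties=3):
    """Split a secret into n random-looking shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Three hospitals each split a private patient count into three shares;
# any single share -- or any two of the three -- reveals nothing.
inputs = [120, 75, 230]
all_shares = [share(x) for x in inputs]

# Each party sums the one share it received from every hospital; combining
# the partial sums recovers only the total, never the individual counts.
partials = [sum(col) % P for col in zip(*all_shares)]
print(sum(partials) % P)    # → 425, computed without pooling raw counts

# The cost: 3 inputs became 9 shares, the data expansion noted above.
```

Here the expansion is a modest constant factor; fully homomorphic schemes can inflate ciphertexts by orders of magnitude, which is exactly where the new demand for network capacity would come from.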

Meanwhile, engineers are preparing the technical foundation for 6G. One key area is finding new radio frequencies to use. Just as 5G opened up very high frequencies called millimeter waves, 6G will likely tap into what's called upper mid-band spectrum, mid-range frequencies that fall between today's cellular bands and 5G's highest frequencies.

The complication is that satellite companies already use these frequencies. Making both systems work without interfering with each other will require careful coordination.  

A second bottleneck that needs to be addressed is the computational fabric – the microchips in handsets and network infrastructure on which wireless systems run. Wireless systems are supporting ever larger numbers of antennas, wider bandwidths, and greater data rates, all of which need more processing at lower power.   

The situation is similar to AI, with its continually growing models, and wireless can benefit from the strides in digital processors built to support AI. AI itself is also being widely used to improve wireless communication design and operation, and could be a piece of the 6G puzzle.

These technical preparations won't create demand on their own, but they'll be essential when new applications finally emerge. 

The NYU WIRELESS center is well-positioned to continue its role in developing future wireless systems. Most advances for 6G will come from collaboration across multiple areas of expertise, including wireless communications, circuit design, AI, networking, and security, as well as from prospective users of wireless technology such as virtual reality, vehicular systems, robotics, and healthcare.

NYU WIRELESS brings these diverse talents together, along with collaborations with leading industry partners.

6G will likely roll out around 2030, consistent with the ten-year cycle. Whether 6G’s transformative use case is AR/VR, connected cars, advanced cryptography, or something no one has yet imagined remains to be seen. There's no technical reason 6G can't happen. It's just a matter of finding what people will actually need it for.

Student Spotlight: Pulkita Jain

Pulkita Jain does not want to belabor the obvious, but, as she points out, earning a Ph.D. is hard work, requiring years of intense dedication, intellectual rigor, and perseverance in the face of setbacks. Now a doctoral candidate in Tandon’s Department of Chemical and Biomolecular Engineering, she says, “If you’re going to spend up to five years earning an advanced degree, you should make sure ahead of time that you are going to enjoy the research involved.”

Jain, a native of Delhi, India, had an inkling early on that her future would involve a great deal of time in a lab. “Like many kids, I initially dreamt of becoming an astronaut or surgeon, and in my early teens, I began thinking of being a software developer because I loved video games,” she recalls. “Then, in my senior year of high school, I had a wonderful chemistry teacher, Hema Madasani, who opened my eyes to how chemistry related to everyday life by introducing us to questions like why coffee stains some people’s teeth.” That year, Jain conducted her own research experiment, exploring the relationship between hair dyes and ammonia content, and became hooked.

Eager to start serious research as quickly as possible, she decided to apply to U.S. schools, believing that more opportunities might be open to undergraduates there. Her parents, Sudha, a psychologist, and Manish, an electrical engineer, were understandably nervous about allowing their only child to attend a college thousands of miles from home; still, they had always encouraged her intellectual curiosity and were supportive when she researched her choices and discovered that some two-year community colleges in the U.S. were not only more affordable than traditional four-year institutions but highly ranked. They also tended to be smaller, so that students – even if they were somewhat shy and uncertain, as she then was – ran little risk of feeling lost or isolated. 

She decided upon Santa Barbara City College, and during her very first semester there, she was recruited to work as a math tutor, boosting her confidence considerably; by the time she earned an associate’s degree with honors in Physics, Chemistry, and Mathematics in 2020, she felt more than ready for a larger stage. 

Thanks to a scholarship, she found that larger stage at the Illinois Institute of Technology, where she majored in chemical engineering. “I knew that chemistry would present a wealth of opportunities, since it’s applicable in so many sectors,” she explains. “I felt confident that I would find the right niche for me.” 

“Failure is just an opportunity to try again, and overcoming challenges can be the most interesting part of a project.”

While her first semester coincided with the height of the COVID-19 pandemic and all her courses took place online, Jain nonetheless moved to Chicago, hoping that physical proximity would make it easier to get involved in research. Accepted into an Illinois Tech program called RES-MATCH, she spent her first semester working remotely to analyze glucose-level data from patients with diabetes and create a predictive model to help prevent hypoglycemia. 

She admits that it could be frustrating to work while isolated and that she sometimes felt she lacked enough experience with the required algorithmic tools, but she says that the experience taught her resilience. “Failure is just an opportunity to try again, and overcoming challenges can be the most interesting part of a project,” she asserts. (She also notes that even when the situation seemed less than optimal, she had the strong support of research advisor Dr. Mudassir Rashid and a graduate-student mentor, Andrew Shahidehpour, who later helped her decide on her next steps.) 

By the end of the semester, Jain had successfully created the model, and she applied for another Illinois Tech program that would allow her to develop it fully. By the time she was done, the tool not only predicted hypoglycemic episodes but also estimated their magnitude, personalized results according to a person’s body composition and age, and integrated machine learning to help speed the process. 

While the algorithmic work had been gratifying, during her final undergraduate year, she realized that she needed hands-on experience in a wet lab, so she arranged to study with a chemical engineering professor developing an ointment to help patients with diabetes heal from wounds. 

She left Chicago in 2022 with a B.S. in Chemical Engineering, a summa cum laude distinction, and a strong conviction that she could thrive in a doctoral program. Faced with choosing among the multiple schools that had admitted her, she decided upon NYU Tandon. “There is such a variety of work being done in Tandon’s Department of Chemical and Biomolecular Engineering that I knew I would never get bored or pigeonholed,” she recalls. 

Another factor also played a large role: “You’re going to be spending a lot of time in someone’s lab, so it’s important that you find their work compelling and that you feel they would be a good mentor,” she explains. “When I met Eray Aydil, something just clicked, and I knew we would work well together.”

Aydil, who was then chairing the department but now serves as the senior vice dean of the school, was engaged in trying to improve the efficiency of solar cells. Solar cells can produce clean energy from nature’s most abundant source, but replacing coal and gas as energy sources still requires technological advances that raise efficiency while lowering cost.

Aydil and his team were focused on one facet of solar cells’ inefficiency — the nature of light itself. The issue with silicon solar cells is that they are not the best match for the solar spectrum. Only certain wavelengths can be efficiently utilized with existing cells. For example, ultraviolet and blue light aren’t converted to electrical power as efficiently as infrared light. This means that a great deal of the potential energy that could be captured is wasted.

“You’re going to be spending a lot of time in someone’s lab, so it’s important that you find their work compelling and that you feel they would be a good mentor.”

In Aydil’s lab, Jain helped develop a film that could be used in the solar cells to shift the light spectrum, turning ultraviolet and blue light (from the less efficient band of the spectrum) into near-infrared light (the more efficient source for solar cells). Changing the light spectrum also had other benefits: UV rays can cause the cells to degrade more quickly, which would require them to be replaced more frequently, increasing the cost of electricity. Additionally, UV rays can cause overheating due to the excess energy they carry, decreasing their efficiency and contributing to their premature degradation. By shifting these rays into the near-infrared part of the spectrum, the new film solved multiple issues with a single fix.
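
Rough numbers make the mismatch concrete. A photon's energy is E = hc/λ, and silicon's bandgap is about 1.12 eV; any photon energy above the gap is lost as heat. The sketch below (Python; the specific wavelengths are illustrative picks from the UV, blue, and near-infrared bands) shows why downshifting UV toward the near-infrared wastes far less energy:

```python
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electron-volt
SI_GAP = 1.12        # silicon bandgap in eV; energy above this becomes heat

def photon_ev(wavelength_nm):
    """Photon energy E = hc / wavelength, in electron-volts."""
    return H * C / (wavelength_nm * 1e-9) / EV

for label, nm in [("UV", 350), ("blue", 450), ("near-IR", 1000)]:
    e = photon_ev(nm)
    print(f"{label:>7} {nm:4d} nm: {e:.2f} eV, excess {max(e - SI_GAP, 0.0):.2f} eV")
```

A 350 nm UV photon carries about 3.5 eV, so roughly two-thirds of its energy is dumped as heat; a near-infrared photon at 1000 nm arrives just above the gap, with almost nothing wasted.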

Minh Tran, who earned her Ph.D. in 2023 and went on to found the start-up Heliotrope Photonics based on the technology, was one of many senior doctoral candidates who offered guidance to Jain during her early years at Tandon. Today, Jain actively mentors others, guiding new doctoral students through Tandon’s Womentorship program, working with participants in the Undergraduate Research Summer Program, and engaging with the engineering honor society Tau Beta Pi’s “Ambition in Motion” mentoring initiative.

Does she see following in Tran’s entrepreneurial footsteps after her projected 2027 graduation? The possibility has occurred to her, she admits, pointing out that Miguel Modestino, another professor with whom she works, has a solid track record in encouraging student start-ups: it was two of his students who launched the highly successful Sunthetics, an enterprise aimed at making the chemical manufacturing industry greener. 

Jain, who recently won a graduate research award from the American Vacuum Society for “engineering lead-free halide double perovskites [a class of materials with a crystalline structure] via vapor-phase synthesis and optical characterization,” ultimately aims to bridge the gap between industry and possible sustainability solutions – whatever form that work takes. 

Her husband, Ahmed, is currently a graduate student at the City University of New York, so there will eventually be multiple career decisions to make. In the meantime, however, there are workshops to attend, students to mentor, and lab work to conduct – all the things that make life as a doctoral candidate both challenging and rewarding.

Engineering Health:
A Conversation
with Jeffrey Hubbell

When Jeffrey Hubbell arrived in New York City to lead the new NYU Institute for Engineering Health, he brought a vision that merges molecular science, computation, and clinical insight to rethink how therapies are discovered and deployed. In this interview, Hubbell — who is also Vice President of Bioengineering Strategy at the University — discusses the new initiative, why physiology is the ultimate systems concept, and how building new medical technologies is like planting a garden.

One of the main reasons you came to NYU was to launch the Institute for Engineering Health. What’s the vision behind the Institute?

We’re bringing together NYU Tandon’s strengths in engineering — computation, data science, AI, molecular and device engineering — with deep expertise in the biological and medical sciences at NYU Langone Health, the College of Arts and Science, Dentistry, and others. The goal is to create an ecosystem that links fundamental understanding of disease biology and immunology to the design of new therapies.

Much of modern medicine focuses on blocking or inhibiting harmful pathways. We’re asking the inverse question: how do we promote beneficial processes? If we can drive positive biological cascades, we might counteract several disease mechanisms simultaneously, rather than inhibit one molecule or pathway at a time. 

Is that a big shift from traditional approaches?

It’s not unheard of, but it’s still relatively unexplored. Antibody therapies, for example, are great at shutting down a single target, but they don’t change the system’s overall behavior. We’re exploring different approaches, using biological molecules, polymers, or nanoscale materials, to bias the system toward tolerance in autoimmune disease or toward localized inflammation in cancer. It’s a physiological, systems-level approach.

That’s because physiology is the ultimate systems concept. We’re not just changing a molecule; we’re asking how to bias entire biological systems toward desired states, including immune, metabolic, neural and other systems. In inflammatory disease, for example, we want to encourage immunological tolerance. In cancer, we want to trigger a focused immune response, but only in the tumor’s microenvironment.

This approach draws on engineering principles like control systems, transport, and feedback to think about the body as an integrated, adaptive system rather than a set of isolated targets.

The Institute’s name suggests a wide scope. What areas will you focus on first?

Immunology is central, because the immune system connects to so many conditions, from infection and cancer to healthy aging. There’s even a term, inflammaging, describing how low-grade inflammation accumulates over time. Understanding how to counteract that is a key question for us.

We’re also interested in the microbiome — the complex relationship between humans and our microbial communities in the gut, on the skin, in the eyes. Immunology governs that balance too, so learning how to bias those interactions toward health is crucial. But that’s only the start of our areas of focus.

How does engineering fit in—beyond biology and medicine?

We’re trying to bring new modalities of therapy based on new mechanistic understandings of underlying conditions. Those new modalities are sometimes challenging to develop, because if no one's developed them, you can't say, “Great, I know how to do that. People have turned that crank before, and I can turn that crank in a different way.” Rather, you're asking, “What is the crank, and how do I turn it?”

Engineering tools expand what we can sense and control. Think of wearable and implantable electronics that measure molecular signals or deliver targeted stimulation. You need advanced wireless communications to transmit signals reliably and quickly. And we’re developing materials that not only serve mechanical roles, but also biological ones, blending the interface between a therapeutic material and the body. That’s where materials science becomes a vehicle for new classes of medicine.

To do this work, you can’t just have a bioengineer “understand” biology. You need engineers who are biologists. The same goes for immunology or neuroscience. We’re creating an environment where students and faculty from different disciplines work side by side until the boundaries blur. That’s where the innovation happens — when ideas from distinct fields coexist in the same brain.

You’ve said the challenges ahead require a new structure for research. What does that look like?

The traditional model isolates disciplines — engineers here, chemists there, immunologists somewhere else. We’re replacing that with a unified structure and spaces: teams organized around shared scientific problems rather than departmental silos. That’s why NYU’s new Science and Technology Hub is so important. It will bring researchers in AI, data science, engineering, materials, and quantum science into one physical space. That co-location creates a true and necessary research milieu.

What guides your approach to translating research into practice?

We’re very intentional about thinking through the path from discovery to application. I often tell students, it’s a terrible thing to solve a problem nobody cares about. From the start, we ask: what’s the real need, and what would it take to make this work in the clinic or the field? We hold “translational exercises” where teams map out what success would look like, where and how things could fail, and how to test assumptions early.

It’s the “fail fast” mindset, but with a scientific purpose: identify the critical experiments that can disprove your idea quickly, then move forward smarter.

When my daughter and I planted a garden years ago, we planned the whole thing out. But when it came time to actually plant the garden on a hot Texas summer day, she said to me, “It’s much easier to think about something than to actually do it.” That’s true in science, too. We need to think deeply before acting. The more carefully we plan, the better our chance to grow something that lasts. That philosophy is built into Engineering Health: thoughtful design, deliberate collaboration, and the courage to rethink what medicine can be engineered to do.