Jeff (Jun) Zhang
Jeff (Jun) Zhang is the recipient of the Ernst Weber Fellowship for his Ph.D. in the Department of Electrical and Computer Engineering, New York University. He received his B.Eng. in Information Security and M.Eng. in Computer Engineering, magna cum laude, from Hunan University, Changsha, China. He also has research internship experience with Samsung Semiconductor and Microsoft Research.
His research interests are in Computer Architecture and EDA, with particular emphasis on energy-efficient and fault-tolerant design for deep learning systems and accelerators.
Research Interests: Computer Architecture and Systems, Deep Learning, Electronic Design Automation (EDA)
I'm joining the Harvard Architecture, Circuits, and Compilers Group as a Postdoctoral Fellow.
Later, I will be on the job market looking for academic positions!
2020/07/22: Attended 57th Design Automation Conference (DAC), and presented at DAC Ph.D. Forum.
2020/07/20: Successfully defended my Ph.D. thesis: Towards Energy Efficient and Reliable Deep Learning Inference. Thesis committee members: Prof. Siddharth Garg, Prof. Ramesh Karri, Prof. Tushar Krishna, Prof. Sundeep Rangan, and Prof. Brandon Reagen. [Slides][Video]
New York University, 2020
Ph.D., Electrical and Computer Engineering
Hunan University, 2013
M.Eng., Computer Engineering
Hunan University, 2011
B.Eng., Information Security
Microsoft Research, Redmond WA
Research Intern, August 2018 to November 2018
- Architecture optimizations for deep convolutional neural networks.
Samsung Semiconductor Inc., San Jose CA
Research Intern, May 2018 to August 2018
- Reinforcement-learning-based SSD storage performance optimization.
A nearly complete list of my publications can be found via my Google Scholar page.
[HotCloud'20] Model-Switching: Dealing with Fluctuating Workloads in Machine-Learning-as-a-Service Systems. (Acceptance rate: 22/95=23%)
J. Zhang, S. Elnikety, S. Zarar, A. Gupta, S. Garg.
[ESWeek'19] CompAct: On-chip Compression of Activations for Low Power Systolic Array Based CNN Acceleration. (Acceptance rate: 16/75=21%)
J. Zhang, P. Raj, S. Zarar, A. Ambardekar, S. Garg.
[DAC'19] Building Robust Machine Learning Systems: Current Progress, Research Challenges, and Opportunities. (Special Session)
J. Zhang, K. Liu, et al.
[ICCAD'18] FATE: Fast and Accurate Timing Error Prediction Framework for Low Power DNN Accelerator Design. (Acceptance rate: 98/396=24.7%)
J. Zhang, S. Garg.
[DAC'18] ThUnderVolt: Enabling Aggressive Voltage Underscaling and Timing Error Resilience for Energy-Efficient DNN Accelerators. (Acceptance rate: 168/691=24.3%)
J. Zhang, Z. Ghodsi, S. Garg.
[VTS'18] Analyzing and Mitigating the Impact of Permanent Faults on a Systolic Array Based Neural Network Accelerator. (Best Paper Nominee)
J. Zhang, T. Gu, K. Basu, S. Garg.
[DATE'17] BandiTS: Dynamic Timing Speculation Using Multi-Armed Bandit Based Optimization.
J. Zhang, S. Garg.
[DAC'16] Synergistic Timing Speculation for Multi-threaded Programs. (Acceptance rate: 152/876=17%)
A. Yasin, J. Zhang, H. Chen, S. Chakraborty, S. Garg, K. Chakraborty.
- Best Presentation Award Nomination, ACM SIGDA DATE PhD Forum, 2020
- Best Paper Award Nomination, IEEE VLSI Test Symposium, 2018
- Ernst Weber Ph.D. Fellowship, New York University, 2015, 2016
- National Scholarship, Ministry of Education of China, 2008, 2010, 2012