Scalable Probabilistic Inference For Complex Dynamical Models
Speaker: Lei Li, UC Berkeley
Time series data arise in numerous applications, such as data center monitoring, tracking web user activity, and health care. Detecting patterns and learning features in collections of data sequences are crucial for solving real-world, domain-specific problems, for example tracking moving objects in videos, spotting nefarious online activities, and forecasting patients' health states.
In this talk, I will introduce a new tensor dynamical model for multivariate data, along with an efficient algorithm for learning such models from data. In addition, I will present an efficient approach to jointly estimating parameters and latent states for a large class of models, including nonlinear dynamical systems. Finally, I will present my work on efficient inference for a probabilistic declarative programming language, which aims to democratize machine learning and enable practitioners to solve their domain-specific problems.
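To give a concrete flavor of what jointly estimating parameters and latent states involves, here is a minimal sketch (not the speaker's method; the model, variable names, and parameter values are all illustrative assumptions) of a bootstrap particle filter on a toy linear-Gaussian state-space model, where each particle is augmented with a sample of the unknown transition coefficient:

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate data from a toy linear-Gaussian state-space model:
    #   x_t = a * x_{t-1} + process noise,   y_t = x_t + observation noise
    true_a, q, r, T = 0.9, 0.1, 0.5, 200
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = true_a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
    y = x + rng.normal(0.0, np.sqrt(r), size=T)

    # Bootstrap particle filter with parameter augmentation: each particle
    # carries both a latent-state sample and a sample of the unknown
    # coefficient `a`; small jitter on `a` keeps the parameter samples diverse.
    N = 2000
    particles_x = rng.normal(0.0, 1.0, N)
    particles_a = rng.uniform(0.0, 1.0, N)

    for t in range(1, T):
        # Propagate: jitter the parameter, then advance the state.
        particles_a = particles_a + rng.normal(0.0, 0.01, N)
        particles_x = particles_a * particles_x + rng.normal(0.0, np.sqrt(q), N)

        # Weight by the likelihood of the new observation (constants dropped).
        log_w = -0.5 * (y[t] - particles_x) ** 2 / r
        w = np.exp(log_w - log_w.max())
        w /= w.sum()

        # Resample to concentrate on plausible state/parameter pairs.
        idx = rng.choice(N, size=N, p=w)
        particles_x, particles_a = particles_x[idx], particles_a[idx]

    print("posterior mean of a:", particles_a.mean(), "(true value:", true_a, ")")

Naive parameter augmentation like this can degenerate on long sequences; the jitter on the parameter samples is a common heuristic to keep the filter exploring, and more sophisticated joint estimation schemes address the same problem more rigorously.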
Lei Li is a Postdoctoral Researcher in the EECS Department at UC Berkeley. His research interests lie at the intersection of machine learning, statistical inference, and database systems. Specifically, he has been working on Bayesian inference in open-universe probabilistic models, probabilistic programming languages, large-scale learning, time series, communication, and social networks. He served on the program committees for ICML 2014, SDM 2013/2014, and IJCAI 2011/2013, and has been an invited reviewer for TOMCCAP, DAMI, TKDE, TOSN, Neurocomputing, KDD, SIGMOD, VLDB, PKDD, and WWW. He was invited to review NSF proposals in 2010 and to attend DARPA's Information Science and Technology (ISAT) probabilistic programming workshop in 2013.
Lei received his B.S. in Computer Science and Engineering from Shanghai Jiao Tong University in 2006, and his Ph.D. in Computer Science from Carnegie Mellon University in 2011. His dissertation work on fast algorithms for mining co-evolving time series received an ACM KDD Dissertation Award (runner-up).