I am an Assistant Professor of Statistics at the University of South Carolina (USC). Before joining USC in 2020, I was a postdoc at the University of Pennsylvania for two years. I earned my PhD in Statistics from the University of Florida, my MS in Applied Mathematics from the University of Massachusetts Amherst, and my BA from Cornell University. Prior to my career in academia, I worked in industry for five years as an engineer and as a financial software analyst. Here are the links to my CV and Google Scholar. If my work interests you, you are also welcome to connect with me on LinkedIn and Twitter.

My research group works in the following areas of statistics and machine learning:
  • methods and scalable algorithms for high-dimensional data
  • deep learning and deep generative models
  • Bayesian methodology and computation
  • matrix completion and recommender systems
  • survival analysis and causal inference
In Fall 2024, I am teaching STAT 515: Statistical Methods I and STAT 517: Advanced Statistical Models.

Prospective Students

To work with me, you must be currently enrolled as a student at USC. PhD students who are interested in my supervision must also first pass the USC Statistics PhD Qualifying Exam (offered every August).

If you are interested in applying to the Statistics graduate program at USC but are not yet enrolled, please visit the department webpage for information about applying. Note that graduate students are admitted by the department's Graduate Admissions Committee, not by me or by individual faculty. I am also unable to provide specific feedback on anyone's application or to recommend for admission applicants whom I do not know.

Recent News

  • September 2024: “VCBART: Bayesian trees for varying coefficients” (with Sameer Deshpande, Cecilia Balocchi, Jennifer Starling, and Jordan Weiss) has been accepted by Bayesian Analysis. Paper

  • June 2024: “Bayesian modal regression based on mixture distributions” (with Qingyang Liu and Xianzheng Huang) has been accepted by Computational Statistics & Data Analysis. Paper

  • June 2024: New preprint “Neural-g: A deep learning framework for mixing density estimation” (with Shijie Wang, Saptarshi Chakraborty, and Qian Qin). arXiv

  • May 2024: My student Shijie Wang has defended his PhD dissertation “New Deep Learning Approaches to Classical Statistical Problems” and will join Gauss Labs as an Applied Scientist. Congratulations, Shijie!

  • March 2024: “Fast bootstrapping nonparametric maximum likelihood for latent mixture models” (with Shijie Wang and Minsuk Shin) has been published in IEEE Signal Processing Letters. Paper

  • February 2024: “Generative quantile regression with variability penalty” (with Shijie Wang and Minsuk Shin) has been accepted by Journal of Computational and Graphical Statistics. Paper

  • January 2024: New preprint “A unified three-state model framework for analysis of treatment crossover in survival trials” (with Zile Zhao, Ye Li, and Xiaodong Luo). arXiv

  • December 2023: The first chapter of my PhD student Shijie Wang's dissertation, “Generative multi-purpose sampler for weighted M-estimation” (co-authored with Minsuk Shin and Jun Liu), has been accepted for publication in Journal of Computational and Graphical Statistics. Paper

  • October 2023: “Scalable high-dimensional Bayesian varying coefficient models with unknown within-subject covariance” (with Mary R. Boland and Yong Chen) has been published in Journal of Machine Learning Research. Paper