I am an Assistant Professor of Statistics at the University of South Carolina, where I am also a McCausland Faculty Fellow. Before joining USC in 2020, I was a postdoc at the University of Pennsylvania for two years. I earned my PhD in Statistics from the University of Florida, my MS in Applied Mathematics from the University of Massachusetts Amherst, and my BA from Cornell University. Before my academic career, I worked in industry for five years as an engineer and as a financial software analyst. Here are links to my CV and Google Scholar. If my work interests you, you are also welcome to connect with me on LinkedIn and Twitter.

My research group works in the following areas of statistics and machine learning:
  • methods and scalable algorithms for high-dimensional data
  • deep learning, including generative models and reinforcement learning
  • Bayesian methodology and computation
  • survival analysis and causal inference
  • clinical trials, drug discovery, GWAS, and other biomedical applications
In Spring 2024, I am teaching STAT 721: Stochastic Processes.


I will be able to take two or three new PhD students in Fall 2024. To work under my supervision, you must first pass the Statistics PhD Qualifying Exam (details here) in August 2024; I cannot advise students who have not passed this exam. My hope is to have (at least) one new student work on Bayesian methods/algorithms and (at least) one new student work on deep learning problems. Potential projects within these areas include scalable Bayesian computation, generative modeling for non-IID data, causal inference, and competing risks.
Information for prospective students

Recent News

  • February 2024: “Generative quantile regression with variability penalty” (with Shijie Wang and Minsuk Shin) has been accepted by Journal of Computational and Graphical Statistics. Paper

  • January 2024: New preprint “A unified three-state model framework for analysis of treatment crossover in survival trials” (with Zile Zhao, Ye Li, and Xiaodong Luo). arXiv

  • January 2024: New preprint “Fast bootstrapping nonparametric maximum likelihood for latent mixture models” (with Shijie Wang and Minsuk Shin). arXiv

  • December 2023: The first chapter of my PhD student Shijie Wang's dissertation, “Generative multi-purpose sampler for weighted M-estimation” (co-authored with Minsuk Shin and Jun Liu), has been accepted for publication in Journal of Computational and Graphical Statistics. Paper

  • October 2023: “Scalable high-dimensional Bayesian varying coefficient models with unknown within-subject covariance” (with Mary R. Boland and Yong Chen) has been published in Journal of Machine Learning Research. Paper

  • October 2023: New preprint “Sparse high-dimensional linear mixed modeling with a partitioned empirical Bayes ECM algorithm” (with Anja Zgodic, Jiajia Zhang, and Alexander McLain). arXiv

  • September 2023: New preprint “Quantifying predictive uncertainty of aphasia severity in stroke patients with sparse heteroscedastic Bayesian high-dimensional regression” (with Anja Zgodic, Jiajia Zhang, Yuan Wang, Chris Rorden, and Alexander McLain). arXiv

  • June 2023: New preprint “Bayesian group regularization in generalized linear models with a continuous spike-and-slab prior.” arXiv

  • June 2023: I received a Big Data Health Science Center Pilot Project grant to develop new methods for matrix completion with applications to computational drug repositioning.

  • May 2023: New preprint “Two-step mixed-type multivariate Bayesian sparse variable selection with shrinkage priors” (with Shao-Hsuan Wang and Hsin-Hsiung Huang). arXiv

Past news