Quick Bio:
I will be joining the SfS at ETH D-MATH in Spring 2024!
I am an assistant professor in the Department of Statistical Science at Duke University. Previously, I was a postdoctoral fellow at ETH Foundations of Data Science (ETH-FDS) at ETH Zürich under the supervision of Prof. Peter Bühlmann. I obtained my PhD in the Department of Statistics at UC Berkeley in 2019, where I was very fortunate to be advised by Prof. Bin Yu. During my PhD, I was also fortunate to work with Prof. Martin Wainwright and Prof. Jack Gallant. Before my PhD studies, I obtained my Diplôme d'Ingénieur (Engineering Degree in Applied Mathematics) at École Polytechnique in France.
My main research interests lie in statistical machine learning, MCMC sampling, optimization, domain adaptation, and statistical challenges that arise in computational neuroscience. If you are curious about the main theory directions I plan to pursue over the next five years, you may take a look at this NSF CAREER abstract.
yuansi.chen at duke.edu
"It is necessary and true that all of the things we say in science, all of the conclusions, are uncertain, because they are only conclusions. They are guesses as to what is going to happen, and you cannot know what will happen, because you have not made the most complete experiments."
-- Richard P. Feynman
Recent Papers:
When does Metropolized Hamiltonian Monte Carlo provably outperform Metropolis-adjusted Langevin algorithm?, [arXiv]
Yuansi Chen, Khashayar Gatmiry.
A Simple Proof of the Mixing of Metropolis-Adjusted Langevin Algorithm under Smoothness and Isoperimetry, [arXiv]
Yuansi Chen, Khashayar Gatmiry.
Hit-and-run mixing via localization schemes, [arXiv]
Yuansi Chen, Ronen Eldan.
Localization Schemes: A Framework for Proving Mixing Bounds for Markov Chains, [arXiv]
Yuansi Chen, Ronen Eldan.
Minimax Mixing Time of the Metropolis-Adjusted Langevin Algorithm for Log-Concave Sampling, [JMLR][arXiv]
Keru Wu, Scott Schmidler, Yuansi Chen.
Journal of Machine Learning Research (JMLR) 2022
An Almost Constant Lower Bound of the Isoperimetric Coefficient in the KLS Conjecture, [GAFA][arXiv]
Yuansi Chen.
Geometric And Functional Analysis (GAFA) 2021
Domain Adaptation Under Structural Causal Models, [JMLR][arXiv][Code on GitHub]
Yuansi Chen, Peter Bühlmann.
Journal of Machine Learning Research (JMLR) 2021
Fast mixing of Metropolized Hamiltonian Monte Carlo: Benefits of multi-step gradients, [JMLR][arXiv]
Yuansi Chen, Raaz Dwivedi, Martin Wainwright, Bin Yu.
Journal of Machine Learning Research (JMLR) 2020
The DeepTune Framework for Modeling and Characterizing Neurons in Visual Cortex Area V4, [bioRxiv]
Reza Abbasi-Asl*, Yuansi Chen*, Adam Bloniarz, Michael Oliver, Ben Willmore, Jack Gallant, Bin Yu.
Submitted to Proceedings of the National Academy of Sciences (PNAS)
Papers with code:
Log-concave sampling: Metropolis-Hastings algorithms are fast, [JMLR][arXiv][Code on GitHub]
Yuansi Chen*, Raaz Dwivedi*, Martin Wainwright, Bin Yu.
Journal of Machine Learning Research (JMLR) 2019
Fast and Robust Archetypal Analysis for Representation Learning, [CVPR][arXiv][Code & Demo]
Yuansi Chen, Julien Mairal, Zaid Harchaoui.
IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2014