Nicholas H. Nelsen
NSF Mathematical Sciences Postdoctoral Research Fellow
Department of Mathematics
Massachusetts Institute of Technology
About Me
Welcome! I am an NSF Mathematical Sciences Postdoctoral Research Fellow in the Department of Mathematics at MIT, where I am hosted by Prof. Philippe Rigollet. I am also affiliated with LIDS and the UQ Group, where I collaborate with Prof. Youssef Marzouk. Broadly, my research interests lie at the intersection of computational mathematics and statistics. Using rigorous analysis and domain-specific insight, I develop novel data-driven machine learning methods for high- and infinite-dimensional problems, establish theoretical guarantees on the reliability and trustworthiness of these methods, and apply them in the physical and data sciences. My current work blends operator learning with ideas from inverse problems, generative modeling, and uncertainty quantification.
I received my Ph.D. from Caltech in 2024, where I was fortunate to be advised by Prof. Andrew M. Stuart and supported by the Amazon AI4Science Fellows Program and an NSF Graduate Research Fellowship. My doctoral dissertation was awarded two "best thesis" prizes, one in applied mathematics and another in engineering. I obtained my M.Sc. from Caltech in 2020 and my B.Sc. (Mathematics), B.S.M.E., and B.S.A.E. degrees from Oklahoma State University in 2018.
nnelsen [at] mit [dot] edu
Recent News
2024/09 (new): I am attending a workshop on the Statistical Aspects of Non-Linear Inverse Problems at the University of Cambridge.
2024/08 (new): I am happy to announce that Andrew Stuart and I received a 2024 SIGEST Award from SIAM for our 2021 paper on operator learning using random features. This award recognizes an exceptional paper of general interest published in the SIAM Journal on Scientific Computing within the last few years. An expanded version of the article is now published online in the SIGEST section of SIAM Review.
2024/08 (new): I had a productive research visit to the Department of Applied Mathematics at the University of Washington. Thanks to Bamdad Hosseini for hosting!
2024/08 (new): I am giving an invited talk on operator learning for parameter-to-observable maps at the Machine Learning in Infinite Dimensions Workshop at the University of Bath in Bath, UK. Our paper on this topic has been accepted for publication in the AIMS journal "Foundations of Data Science".
2024/07: Our new preprint, Hyperparameter Optimization for Randomized Algorithms (A Case Study for Random Features), develops a framework for tuning the hyperparameters of randomized algorithms using black-box, derivative-free, particle-based optimizers. We demonstrate that random features can be a robust and practical replacement for Gaussian processes in high-dimensional regression problems. Thanks to Oliver Dunbar and Maya Mutic for the great collaboration!
2024/07: I am joining the Department of Mathematics at MIT as an NSF Mathematical Sciences Postdoctoral Research Fellow. The following year, I will join the Department of Mathematics at Cornell University as a Klarman Fellow.
2024/06: My Caltech Ph.D. thesis, "Statistical Foundations of Operator Learning," won the W.P. Carey and Co. Prize for Best Thesis in Applied Mathematics and the Centennial Prize for Best Thesis in Mechanical and Civil Engineering (MCE).
2024/06 [In the media]: I was interviewed about my research program and how it connects with the public sphere in Caltech Magazine's #SoCaltech section, and I was also featured in SIAM's student spotlight video, where I discuss my graduate research and associated skills.