Nicholas H. Nelsen
NSF Graduate Research Fellow and Ph.D. Candidate
California Institute of Technology
About Me
Welcome! I am a fifth-year graduate student in the Division of Engineering and Applied Science at Caltech, where I work with my advisor, Prof. Andrew M. Stuart. My research interests are in theory and algorithms for high-dimensional scientific and data-driven computation. Within applied and computational mathematics, some particular areas that I work in include scientific machine learning, inverse problems, uncertainty quantification, and statistical inference.
My current work centers on operator learning—regressing, from (noisy) data, operators that map between infinite-dimensional (function) spaces—with application to forward and inverse problems, especially those arising from parametric partial differential equations (PDEs) that model complex physical systems. To this end, I develop and utilize tools from machine learning, model reduction, numerical analysis, and statistics. Please refer to my curriculum vitae and my publications page to learn more about my background and research experience.
I am fortunate to be supported by an NSF Graduate Research Fellowship. In 2020, I obtained my M.Sc. from Caltech, and before starting doctoral study in the fall of 2018, I worked on Lagrangian particle methods for PDEs as a summer research intern in the Center for Computing Research at Sandia National Laboratories. I obtained my B.Sc. (Mathematics), B.S.M.E., and B.S.A.E. degrees from Oklahoma State University in 2018.
nnelsen [at] caltech [dot] edu
Recent News
2023/04: I am speaking at the Oden Institute's inaugural Workshop on Scientific Machine Learning at UT Austin, the Workshop on Establishing Benchmarks for Data-Driven Modeling of Physical Systems at USC, and the Southern California Applied Mathematics Symposium (SoCAMS) at UC Irvine.
2023/02: I am giving an invited talk about "Learning the Electrical Impedance Tomography Inversion Operator" in MS46: Goal-Oriented and Context-Aware Scientific Machine Learning, part of SIAM CSE23 in Amsterdam, The Netherlands. There, I am also co-organizing MS370 and MS406: Operator Learning in the Physical and Data Sciences, Parts I & II.
2022/11: Our paper on the theory of linear operator learning has been accepted for publication in the SIAM/ASA Journal on Uncertainty Quantification.
2022/09: Professor Joel Tropp's course lecture notes on "Matrix Analysis" are now publicly available; they include Chapter III.8, which I wrote on the topic of "Operator-Valued Kernels."