I am a Senior Research Scientist in the Center for Computational Mathematics (CCM) at the Flatiron Institute. I joined CCM in July 2022 as a group leader in machine learning.
My research focuses on high dimensional data analysis, latent variable modeling, kernel methods, and deep learning. Before joining Flatiron, I was a research scientist at AT&T Labs and a faculty member at UPenn and UC San Diego.
I also served previously as Editor-in-Chief of JMLR and as program chair of the NeurIPS conference.
I obtained my PhD in Physics from MIT, with a thesis on exact computational methods in
the statistical mechanics of disordered systems.
High dimensional data analysis
How can we discover low dimensional structure in high dimensional data?
How does this question change (if at all) when the data is sparse?
These are fascinating problems that lie at the intersection of statistics, geometry, and related fields.
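As a toy illustration of the first question (a minimal sketch in Python with NumPy, not a method from my papers; the sizes and rank below are made up), the singular value decomposition can recover an approximate low dimensional subspace from high dimensional data:

    import numpy as np

    # Toy data: n points in D dimensions that lie near a d-dimensional subspace.
    rng = np.random.default_rng(0)
    n, D, d = 1000, 500, 10                    # illustrative sizes only
    latent = rng.normal(size=(n, d))           # hidden low dimensional coordinates
    mixing = rng.normal(size=(d, D))           # linear map into the high dimensional space
    X = latent @ mixing + 0.01 * rng.normal(size=(n, D))

    # PCA via the SVD of the centered data matrix.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    print(S[:d + 2] / S[0])                    # the spectrum drops sharply after d values
    Z = Xc @ Vt[:d].T                          # low dimensional coordinates of each point

Sparse data complicates even this simple picture: centering destroys the zeros, which is one reason the sparse case calls for different tools.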
Latent variable modeling
Many types of structure in high dimensional data can be modeled via a smaller number of latent variables. An ongoing project is to discover new and richer latent variable models in which exact probabilistic inference remains tractable.
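For concreteness, a textbook example of a latent variable model with exact inference is factor analysis, where the posterior over the latent variables is Gaussian and available in closed form (this sketch is illustrative only; it is not one of the new models described above):

    import numpy as np

    # Factor analysis: x = W z + noise, with z ~ N(0, I) and noise ~ N(0, Psi).
    # Everything is jointly Gaussian, so p(z|x) has a closed form: N(m, C) with
    #   C = (I + W' Psi^{-1} W)^{-1}   and   m = C W' Psi^{-1} x.
    rng = np.random.default_rng(1)
    D, d = 20, 3                                  # observed and latent dimensions (illustrative)
    W = rng.normal(size=(D, d))                   # factor loadings
    Psi = np.diag(rng.uniform(0.5, 1.5, size=D))  # diagonal noise covariance

    x = W @ rng.normal(size=d) + rng.multivariate_normal(np.zeros(D), Psi)

    Psi_inv = np.linalg.inv(Psi)
    C = np.linalg.inv(np.eye(d) + W.T @ Psi_inv @ W)   # exact posterior covariance
    m = C @ W.T @ Psi_inv @ x                          # exact posterior mean
    print("posterior mean of the latent variables:", m)

Such closed-form posteriors are what keep EM-style learning exact in this model; the aim described above is to preserve that tractability in richer, nonlinear settings.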
The distributed representations in deep neural nets are induced
by the nonlinearities (e.g., ReLU) at each layer of processing.
The effectiveness of these nonlinearities can be studied in
layerwise models of unsupervised learning.
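To make the last point concrete, here is a minimal sketch (random weights only, again in Python with NumPy; it is not one of the layerwise models referred to above) showing how a stack of ReLU layers produces sparse, distributed representations, with a different pattern of zeros for each input:

    import numpy as np

    def relu(a):
        return np.maximum(a, 0.0)

    # Propagate toy inputs through random-weight ReLU layers and track how many
    # units are zeroed out at each layer.
    rng = np.random.default_rng(2)
    n, widths = 200, [64, 64, 64]              # illustrative sizes only
    h = rng.normal(size=(n, 32))               # toy inputs
    for layer, width in enumerate(widths, start=1):
        W = rng.normal(size=(h.shape[1], width)) / np.sqrt(h.shape[1])
        h = relu(h @ W)
        print(f"layer {layer}: fraction of zero activations = {np.mean(h == 0.0):.2f}")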
Selected publications

L. K. Saul (2022). A nonlinear matrix decomposition for mining the zeros of sparse data. SIAM Journal on Mathematics of Data Science 4(2):431-463.
L. K. Saul (2021). An online passive-aggressive algorithm for difference-of-squares classification. In M. A. Ranzato, A. Beygelzimer, Y. N. Dauphin, P. Liang, and J. W. Vaughan (eds.), Advances in Neural Information Processing Systems 34: Proceedings of the Conference on Neural Information Processing Systems (NeurIPS 2021).
L. K. Saul (2021). An EM algorithm for capsule regression. Neural Computation 33(1):194-226.
L. K. Saul (2020). A tractable latent variable model for nonlinear dimensionality reduction.
Proceedings of the National Academy of Sciences USA 117(27):15403-15408.