Current Work
Kokot & Luedtke (2025) · Coreset selection for the Sinkhorn divergence and generic smooth divergences
A framework for selecting representative coresets with respect to arbitrary losses, with particular application to the Sinkhorn divergence (an illustrative sketch follows the BibTeX entry below). PDF
BibTeX
@misc{kokot2025coresetselectionsinkhorndivergence,
  title={Coreset selection for the Sinkhorn divergence and generic smooth divergences},
  author={Alex Kokot and Alex Luedtke},
  year={2025},
  eprint={2504.20194},
  archivePrefix={arXiv},
  primaryClass={stat.ML},
  url={https://arxiv.org/abs/2504.20194},
}
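
For context, the sketch below shows the kind of objective such a coreset targets: the debiased, entropy-regularized optimal-transport cost (Sinkhorn divergence) between the full sample and a candidate subset, paired with a naive greedy search. This is only an illustration under my own choices (uniform weights, regularization eps, squared Euclidean costs, greedy selection), not the procedure developed in the paper.

# Illustrative only: a log-domain Sinkhorn divergence between two point clouds,
# followed by a naive greedy coreset search. This is NOT the algorithm of the
# paper above; it just shows the objective such a coreset is judged against.
import numpy as np
from scipy.special import logsumexp

def sinkhorn_cost(x, y, eps=0.05, n_iter=200):
    """Entropy-regularized OT cost between uniform measures on the rows of x and y."""
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)    # squared Euclidean costs
    log_a = np.full(len(x), -np.log(len(x)))               # uniform weights, log scale
    log_b = np.full(len(y), -np.log(len(y)))
    f, g = np.zeros(len(x)), np.zeros(len(y))
    for _ in range(n_iter):                                 # log-domain Sinkhorn updates
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    P = np.exp((f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :])
    return float((P * C).sum())                             # transport cost of the regularized plan

def sinkhorn_divergence(x, y, eps=0.05):
    """Debiased divergence S_eps(x, y) = OT_eps(x, y) - (OT_eps(x, x) + OT_eps(y, y)) / 2."""
    return sinkhorn_cost(x, y, eps) - 0.5 * (sinkhorn_cost(x, x, eps) + sinkhorn_cost(y, y, eps))

def greedy_coreset(x, k, eps=0.05):
    """Naive greedy search: repeatedly add the point that most lowers S_eps(x, coreset)."""
    chosen, remaining = [], list(range(len(x)))
    for _ in range(k):
        scores = [sinkhorn_divergence(x, x[chosen + [i]], eps) for i in remaining]
        best = remaining[int(np.argmin(scores))]
        chosen.append(best)
        remaining.remove(best)
    return np.array(chosen)

Each greedy step here re-solves a transport problem per candidate point, which quickly becomes expensive; that cost is part of what makes principled coreset constructions attractive.
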
Kokot, Murad, and Meila (2025) · The Noisy Laplacian: a threshold phenomenon for non-linear dimension reduction
A study of the effect of noise on common dimension reduction algorithms, and of the geometry that can be recovered in these settings. To appear at ICML. Preprint
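
As a rough illustration of the setting (not the paper's analysis), the snippet below samples a one-dimensional manifold with ambient Gaussian noise of level sigma and computes a standard Laplacian eigenmaps embedding; sweeping sigma exposes the sort of threshold behavior the paper characterizes. The circle example, kernel bandwidth, and noise model are my own choices.

# Illustrative setup only: a noisy circle in high ambient dimension, embedded
# with standard Laplacian eigenmaps. Sweep sigma to see the embedding degrade.
import numpy as np
from scipy.linalg import eigh

def noisy_circle(n=400, ambient_dim=20, sigma=0.1, seed=0):
    """Sample a circle (a 1-d manifold) in R^ambient_dim with ambient Gaussian noise."""
    rng = np.random.default_rng(seed)
    t = rng.uniform(0.0, 2.0 * np.pi, n)
    x = np.zeros((n, ambient_dim))
    x[:, 0], x[:, 1] = np.cos(t), np.sin(t)
    return x + sigma * rng.standard_normal(x.shape)

def laplacian_embedding(x, n_components=2, bandwidth=0.5):
    """Laplacian eigenmaps with a Gaussian kernel graph."""
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    deg = w.sum(axis=1)
    lap = np.diag(deg) - w                     # unnormalized graph Laplacian
    vals, vecs = eigh(lap, np.diag(deg))       # generalized problem L v = lambda D v
    return vecs[:, 1:n_components + 1]         # drop the constant eigenvector

embedding = laplacian_embedding(noisy_circle(sigma=0.05))   # try sweeping sigma
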
Kokot et al. (2025) · The EGOP Flow: Local features for Continuous Index Learning
We study a local kernel method motivated by recent advances related to the expected gradient outer product (EGOP). We show that, under a supervised noisy manifold hypothesis, it is possible to learn functions at intrinsic-dimensional rates under a suitable Wasserstein flow. Preprint
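
For background (this is not the EGOP flow itself), the sketch below estimates the expected gradient outer product E[∇f(X) ∇f(X)^T] from data by fitting a smooth regressor and averaging finite-difference gradients; its leading eigenvectors recover the index directions in a multi-index model. The kernel ridge fit, step size, and toy data are illustrative assumptions.

# Illustrative only: a finite-difference EGOP estimate. Background for the
# preprint above, not the method it proposes.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def egop_estimate(x, y, h=1e-2, alpha=1e-3, gamma=1.0):
    """Average outer product of finite-difference gradients of a fitted smooth regressor."""
    model = KernelRidge(kernel="rbf", alpha=alpha, gamma=gamma).fit(x, y)
    n, d = x.shape
    grads = np.zeros((n, d))
    for j in range(d):                          # central differences along each coordinate
        e = np.zeros(d)
        e[j] = h
        grads[:, j] = (model.predict(x + e) - model.predict(x - e)) / (2.0 * h)
    return grads.T @ grads / n

# Toy multi-index example: the response depends only on the first two coordinates.
rng = np.random.default_rng(0)
x = rng.standard_normal((400, 10))
y = np.sin(x[:, 0]) + x[:, 1] ** 2
eigvals, eigvecs = np.linalg.eigh(egop_estimate(x, y))
index_directions = eigvecs[:, -2:]              # top eigenvectors approximate span{e_1, e_2}
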
Talks
Meila, Kokot, and Murad (2025) · Noise level as a dimension -- spectral embeddings vs. generative models
JMM, AMS Special Session on Geometric and Combinatorial Methods in Deep Learning Theory, II. Abstract
Scientific Engagement and Mathematical Reports
These include works that I assisted with as a laboratory aide during my undergraduate studies, as well as research from REUs. I am forever indebted to the terrific opportunities I have been able to take part in, and to the incredible mentorship of so many esteemed researchers.