J. Lyle Kim

Quantum Computing Research Scientist

JPMorgan Chase, Global Technology Applied Research

Biography

I am a Quantum Computing Research Scientist – Sr. Associate at JPMorgan Chase, Global Technology Applied Research. I obtained a Ph.D. in Computer Science from Rice University, where I was advised by Prof. Anastasios Kyrillidis. I also worked closely with Prof. César A. Uribe and Prof. Nai-Hui Chia.

During graduate school, I interned at JPMorgan Chase hosted by Dr. Marco Pistoia, at MILA – Quebec Artificial Intelligence Institute hosted by Prof. Gauthier Gidel and Prof. Ioannis Mitliagkas, and at Meta (Fundamental AI Research) hosted by Dr. Aaron Defazio.

Before joining Rice, I was a research professional at the University of Chicago Booth School of Business supervised by Prof. Panos Toulis, after receiving bachelor’s degrees in Mathematics and Statistics from the University of Chicago.

Here are my CV and Google Scholar.

Interests

  • Optimization
  • Quantum computing / algorithms
  • Machine learning

Education

  • Ph.D. in Computer Science, 2019 - 2024

    Rice University

  • B.A. in Mathematics; B.A. in Statistics, 2017

    University of Chicago

News

[01 Oct 2024] I started working as a Quantum Computing Research Scientist – Sr. Associate at JPMorgan Chase, Global Technology Applied Research.

[31 Aug 2024] I obtained my Ph.D. in Computer Science from Rice University. I am extremely grateful to my advisor Prof. Anastasios Kyrillidis and my committee members Prof. César A. Uribe and Prof. Nai-Hui Chia.

[03 Jun 2024] I started my quantum computing research internship at JPMorgan Chase under the supervision of Dr. Marco Pistoia.

[01 May 2024] Our paper On the Error-Propagation of Inexact Deflation for Principal Component Analysis has been accepted at ICML 2024. Congratulations to all the co-authors!

[23 Apr 2024] I successfully defended! My defense slides are available here.

Publications

On the Error-Propagation of Inexact Deflation for Principal Component Analysis.

International Conference on Machine Learning (ICML) 2024.

Preprint

Smoothness-Adaptive Sharpness-Aware Minimization for Finding Flatter Minima.

Practical Machine Learning for Low Resource Settings Workshop, ICLR 2024.

OpenReview Poster

How Much Pre-training Is Enough to Discover a Good Subnetwork?

Transactions on Machine Learning Research (TMLR) 2024.

OpenReview Preprint

Adaptive Federated Learning with Auto-Tuned Clients.

International Conference on Learning Representations (ICLR) 2024.

Preprint Poster

When is Momentum Extragradient Optimal? A Polynomial-Based Analysis.

Transactions on Machine Learning Research (TMLR) 2024.

OpenReview Preprint

Adaptive Federated Learning with Auto-Tuned Clients via Local Smoothness.

Federated Learning and Analytics in Practice: Algorithms, Systems, Applications, and Opportunities Workshop, ICML 2023.

Pdf Poster

Fast Quantum State Tomography via Accelerated Non-convex Programming.

Photonics 2023, 10(2), 116 / Quantum Information Processing (QIP) 2023 (poster).

Pdf Preprint Code Slides Poster Blog Press

Momentum Extragradient is Optimal for Games with Cross-Shaped Spectrum.

Workshop on Optimization for Machine Learning, NeurIPS 2022.

Pdf Poster

Local Stochastic Factored Gradient Descent for Distributed Quantum State Tomography.

IEEE Control Systems Letters 2022 / Quantum Information Processing (QIP) 2023 (poster).

Pdf Preprint Poster

Convergence and Stability of the Stochastic Proximal Point Algorithm with Momentum.

Learning for Dynamics and Control (L4DC) 2022.

Pdf Preprint Blog Poster