J. Lyle Kim

Ph.D. student in Computer Science

Rice University

Biography

My legal name is Junhyung Kim (김준형), but I usually go by Lyle. I am a fifth-year Ph.D. student in Computer Science at Rice University, advised by Prof. Anastasios Kyrillidis. I also work closely with Prof. César A. Uribe and Prof. Nai-Hui Chia.

Before joining Rice, I was a research professional at the University of Chicago Booth School of Business, supervised by Prof. Panos Toulis.

I received bachelor’s degrees in Mathematics and Statistics from the University of Chicago in 2017. During my senior year, I also worked with Prof. Mikael Kuusela under the supervision of Prof. Michael L. Stein.

In summer 2023, I worked as a visiting student researcher at Mila – Quebec Artificial Intelligence Institute under the supervision of Prof. Gauthier Gidel and Prof. Ioannis Mitliagkas.

In summer 2022, I worked at Meta (Fundamental AI Research) as a research intern under the supervision of Dr. Aaron Defazio.

Here are my CV and Google Scholar.

Interests

  • Optimization
  • Distributed optimization
  • Machine learning
  • Quantum computing

Education

  • Ph.D. in Computer Science, 2019 - present

    Rice University

  • B.A. in Mathematics; B.A. in Statistics, 2017

    University of Chicago

News

[26 Jan 2024] Our paper How Much Pre-training Is Enough to Discover a Good Subnetwork? has been accepted at Transactions on Machine Learning Research (TMLR). Congratulations to all the co-authors!

[16 Jan 2024] Our paper Adaptive Federated Learning with Auto-Tuned Clients has been accepted at the International Conference on Learning Representations (ICLR), 2024. This is joint work with M. Taha Toghani, Prof. César A. Uribe, and Prof. Tasos Kyrillidis.

[12 Dec 2023] Our paper When is Momentum Extragradient Optimal? A Polynomial-Based Analysis has been accepted at Transactions on Machine Learning Research (TMLR). This is joint work with Prof. Gauthier Gidel, Prof. Tasos Kyrillidis, and Dr. Fabian Pedregosa.

[27 Oct 2023] I co-organized Quantum Information Processing Systems (QuantIPS 2023). Thank you to all the speakers and participants!

[29 Sep 2023] I co-organized Texas Colloquium on Distributed Learning (TL;DR 2023). Thank you to all the speakers and participants!

Publications

How Much Pre-training Is Enough to Discover a Good Subnetwork?

Transactions on Machine Learning Research (TMLR) 2024.

OpenReview Preprint

Adaptive Federated Learning with Auto-Tuned Clients.

International Conference on Learning Representations (ICLR) 2024.

Preprint Poster

When is Momentum Extragradient Optimal? A Polynomial-Based Analysis.

Transactions on Machine Learning Research (TMLR) 2024.

OpenReview Preprint

Adaptive Federated Learning with Auto-Tuned Clients via Local Smoothness.

Federated Learning and Analytics in Practice: Algorithms, Systems, Applications, and Opportunities Workshop, ICML 2023.

PDF Poster

Fast Quantum State Tomography via Accelerated Non-convex Programming.

Photonics 2023, 10(2), 116 / Quantum Information Processing (QIP) 2023 (poster).

PDF Preprint Code Slides Poster Blog Press

Momentum Extragradient is Optimal for Games with Cross-Shaped Spectrum.

Workshop on Optimization for Machine Learning, NeurIPS 2022.

PDF Poster

Local Stochastic Factored Gradient Descent for Distributed Quantum State Tomography.

IEEE Control Systems Letters 2022 / Quantum Information Processing (QIP) 2023 (poster).

PDF Preprint Poster

Convergence and Stability of the Stochastic Proximal Point Algorithm with Momentum.

Learning for Dynamics and Control (L4DC) 2022.

PDF Preprint Blog Poster

Acceleration and Stability of the Stochastic Proximal Point Algorithm.

Workshop on Optimization for Machine Learning, NeurIPS 2021 (spotlight).

PDF Blog Poster

Momentum-inspired low-rank coordinate descent for diagonally constrained SDPs.

Preprint.

Preprint

Contact