J. Lyle Kim

Ph.D. student in Computer Science

Rice University

Biography

My legal name is Junhyung Kim (김준형), but I usually go by Lyle. I am a fifth year Ph.D. student in Computer Science at Rice University, advised by Prof. Anastasios Kyrillidis. I also work closely with Prof. César A. Uribe and Prof. Nai-Hui Chia.

Before joining Rice, I was a research professional at the University of Chicago Booth School of Business supervised by Prof. Panos Toulis.

I received bachelor’s degrees in Mathematics and Statistics from the University of Chicago in 2017. During my senior year in the College, I also worked with Prof. Mikael Kuusela under the supervision of Prof. Michael L. Stein.

In summer 2023, I worked as a visiting student researcher at MILA – Quebec Artificial Intelligence Institute under the supervision of Prof. Gauthier Gidel and Prof. Ioannis Mitliagkas.

In summer 2022, I worked at Meta (Fundamental AI Research) as a research intern under the supervision of Dr. Aaron Defazio.

Here are my CV and Google Scholar.

Interests

  • Optimization
  • Distributed optimization
  • Machine learning
  • Quantum computing

Education

  • Ph.D. in Computer Science, 2019 - present

    Rice University

  • B.A. in Mathematics; B.A. in Statistics, 2017

    University of Chicago

News

[27 Oct 2023] I co-organized Quantum Information Processing Systems (QuantIPS 2023). Thank you to all the speakers and participants!

[29 Sep 2023] I co-organized Texas Colloquium on Distributed Learning (TL;DR 2023). Thank you to all the speakers and participants!

[19 June 2023] Adaptive Federated Learning with Auto-Tuned Clients via Local Smoothness was accepted at the ICML 2023 Federated Learning Workshop. My poster is available here.

[08 May 2023] I started my summer research visit at MILA under the supervision of Prof. Ioannis Mitliagkas and Prof. Gauthier Gidel.

[20 Apr 2023] I completed my Ph.D. proposal. Committee: Profs. Anastasios Kyrillidis (chair), César A. Uribe, and Nai-Hui Chia.

Publications

Adaptive Federated Learning with Auto-Tuned Clients via Local Smoothness.

Federated Learning and Analytics in Practice: Algorithms, Systems, Applications, and Opportunities Workshop, ICML 2023.

Pdf Poster

Adaptive Federated Learning with Auto-Tuned Clients.

Under review.

Preprint

When is Momentum Extragradient Optimal? A Polynomial-Based Analysis.

Under review.

Preprint

Fast Quantum State Tomography via Accelerated Non-convex Programming.

Photonics 2023, 10(2), 116 / Quantum Information Processing (QIP) 2023 (poster).

Pdf Preprint Code Slides Poster Blog Press

Momentum Extragradient is Optimal for Games with Cross-Shaped Spectrum.

Workshop on Optimization for Machine Learning, NeurIPS 2022.

Pdf Poster

Local Stochastic Factored Gradient Descent for Distributed Quantum State Tomography.

IEEE Control Systems Letters 2022 / Quantum Information Processing (QIP) 2023 (poster).

Pdf Preprint Poster

Convergence and Stability of the Stochastic Proximal Point Algorithm with Momentum.

Learning for Dynamics and Control (L4DC) 2022.

Pdf Preprint Blog Poster

Acceleration and Stability of the Stochastic Proximal Point Algorithm.

Workshop on Optimization for Machine Learning, NeurIPS 2021 (spotlight).

Pdf Blog Poster

Provably Efficient Lottery Ticket Discovery.

Preprint.

Preprint

Momentum-inspired low-rank coordinate descent for diagonally constrained SDPs.

Preprint.

Preprint

Contact