About me

I am a Master’s student in the Industrial and Systems Engineering department at KAIST, advised by Prof. Woo Chang Kim. My research interests lie in

  • Probabilistic Machine Learning
  • Bayesian Inference and Stochastic Optimization

I have a broad but focused interest in the theoretical aspects of Black-Box Variational Inference, GFlowNets, and Neural SDEs. I am involved in research with various groups, including the DiffEqML research group, where my interest in probabilistic machine learning has deepened. Additionally, I had the valuable opportunity to visit Carnegie Mellon University and work with Prof. Andrew A. Li last fall. Prior to KAIST, I completed my undergraduate studies at Yonsei University.

[News] I’m excited to announce that I’ll be joining the CS PhD program at UMass Amherst this fall, working with Professor Justin Domke.

Selected Publications

Provably Scalable Black-Box Variational Inference with Structured Variational Families
Joohwan Ko*, Kyurae Kim*, Woo Chang Kim, Jacob R. Gardner.
41st International Conference on Machine Learning (ICML 2024)

Learning to scale logits for temperature-conditional GFlowNets
Minsu Kim*, Joohwan Ko*, Taeyoung Yun*, Dinghuai Zhang, Ling Pan, Woo Chang Kim, Jinkyoo Park, Emmanuel Bengio, Yoshua Bengio
41st International Conference on Machine Learning (ICML 2024)

Demystifying Doubly Stochastic Gradient Descent
Kyurae Kim, Joohwan Ko, Yian Ma, Jacob R. Gardner
41st International Conference on Machine Learning (ICML 2024)

Multilevel approach to efficient gradient calculation in stochastic systems
Joohwan Ko, Michael Poli, Stefano Massaroli, Woo Chang Kim
Workshop on Physics for Machine Learning at the 11th International Conference on Learning Representations (ICLR 2023 Workshop)

( * denotes equal contribution)