About

I am Giung Nam, a Ph.D. student at the Kim Jaechul Graduate School of AI, Korea Advanced Institute of Science and Technology (KAIST AI), advised by Professor Juho Lee. I received my M.S. from KAIST AI and my B.S. from the Department of Computer Science and Engineering, Korea University.

Since March 2024, I have been serving as Technical Research Personnel, and I expect to complete my doctoral degree by February 2026.


Publications

(* denotes equal contribution)

Improving constrained language generation via self-distilled twisted sequential Monte Carlo
Sooyeon Kim, Giung Nam, Byoungwoo Park, Juho Lee
Frontiers in Probabilistic Inference: Learning Meets Sampling (FPI Workshop at NeurIPS), 2025 (To appear)

PANGEA: projection-based augmentation with non-relevant general data for enhanced domain adaptation in LLMs
Seungyoo Lee, Giung Nam, Hyungi Lee, Moonseok Choi, Juho Lee
Neural Information Processing Systems (NeurIPS), 2025 (To appear)

Ensemble distribution distillation via flow matching
Jonggeon Park*, Giung Nam*, Hyunsu Kim, Jongmin Yoon, Juho Lee
International Conference on Machine Learning (ICML), 2025 [pdf]

Parameter expanded stochastic gradient Markov chain Monte Carlo
Hyunsu Kim, Giung Nam, Chulhee Yun, Hongseok Yang, Juho Lee
International Conference on Learning Representations (ICLR), 2025 [pdf, arXiv]

Ex uno pluria: insights on ensembling in low precision number systems
Giung Nam, Juho Lee
Neural Information Processing Systems (NeurIPS), 2024 [pdf, arXiv]

Sparse weight averaging with multiple particles for iterative magnitude pruning
Moonseok Choi*, Hyungi Lee*, Giung Nam*, Juho Lee
International Conference on Learning Representations (ICLR), 2024 [pdf, arXiv]

Enhancing transfer learning with flexible nonparametric posterior sampling
Hyungi Lee*, Giung Nam*, Edwin Fong, Juho Lee
International Conference on Learning Representations (ICLR), 2024 [pdf, arXiv]

Lipsum-FT: robust fine-tuning of zero-shot models using random text guidance
Giung Nam, Byeongho Heo, Juho Lee
International Conference on Learning Representations (ICLR), 2024 [pdf, arXiv]

Traversing between modes in function space for fast ensembling
Eunggu Yun*, Hyungi Lee*, Giung Nam*, Juho Lee
International Conference on Machine Learning (ICML), 2023 [pdf, arXiv]

Martingale posterior neural processes
Hyungi Lee, Eunggu Yun, Giung Nam, Edwin Fong, Juho Lee
International Conference on Learning Representations (ICLR), 2023, Spotlight [pdf, arXiv]

Decoupled training for long-tailed classification with stochastic representations
Giung Nam*, Sunguk Jang*, Juho Lee
International Conference on Learning Representations (ICLR), 2023 [pdf, arXiv]

Benefits of stochastic weight averaging in developing neural network radiation scheme for numerical weather prediction
Hwan-Jin Song, Soonyoung Roh, Juho Lee, Giung Nam, Eunggu Yun, Jongmin Yoon, Park Sa Kim
Journal of Advances in Modeling Earth Systems (JAMES), October 2022 [pdf, ESSOAr]

Improving ensemble distillation with weight averaging and diversifying perturbation
Giung Nam, Hyungi Lee, Byeongho Heo, Juho Lee
International Conference on Machine Learning (ICML), 2022 [pdf, arXiv]

Diversity matters when learning from ensembles
Giung Nam*, Jongmin Yoon*, Yoonho Lee, Juho Lee
Neural Information Processing Systems (NeurIPS), 2021 [pdf, arXiv]