Donghyeok Shin

Ph.D. Student

AAI Lab, KAIST


CV | Google Scholar


I am a Ph.D. student in the Applied Artificial Intelligence (AAI) Lab at KAIST ISE, advised by Prof. Il-Chul Moon.

My research focuses on improving the efficiency of deep learning. I have studied efficiency from both the dataset and the network perspective, and I am currently interested in improving the inference efficiency of generative models.

I am currently looking for a research internship position. Please feel free to contact me!

Contact

  • tlsehdgur0@kaist.ac.kr

  • Bldg E2-2, 291 Daehak-ro, Yuseong-gu, Daejeon, Korea, 34141

Education

  • Ph.D. in ISE at KAIST, Present

    Advisor: Prof. Il-Chul Moon

  • M.S. in ISE at KAIST, Feb 2022

    Advisor: Prof. Il-Chul Moon

    Thesis: Dataset Distillation via Loss Approximation for Continual Learning

  • B.S. in Mathematical Sciences at KAIST, Feb 2020

  • B.S. in ISE at KAIST, Feb 2020

    Double Major

Latest News

  • [Nov 2025] 🏆 Won the Qualcomm Innovation Fellowship Korea 2025 (QIFK 2025).
  • [Oct 2025] 🏆 Recognized as a Top Reviewer of NeurIPS 2025.
  • [Sep 2025] 🎉 One paper (DATE) was accepted to NeurIPS 2025.
  • [Jan 2025] 🎉 One paper (DDiF) was accepted to ICLR 2025.

Preprints

  • AMiD: Knowledge Distillation for LLMs with α-mixture Assistant Distribution

    Donghyeok Shin, Yeongmin Kim, Suhyeon Jo, Byeonghu Na, and Il-Chul Moon

    Under review

    [ paper ]

  • Distillation of Large Language Models via Concrete Score Matching

    Yeongmin Kim, Donghyeok Shin, Mina Kang, Byeonghu Na, and Il-Chul Moon

    Under review

    [ paper ]

  • AC-Sampler: Accelerate and Correct Diffusion Sampling with Metropolis-Hastings Algorithm

    Minsang Park, Gyuwon Sim, Hyungho Na, Jiseok Kwak, Sumin Lee, Richard Lee Kim, Donghyeok Shin, Byeonghu Na, Yeongmin Kim, and Il-Chul Moon

    Under review

  • Towards Adversarially Robust VLMs with an Information-Theoretic Approach

    Jiseok Kwak, Donghyeok Shin, Richard Lee Kim, Minsang Park, Zeynep Altiner, and Il-Chul Moon

    Under review

  • Towards Pareto-Optimality for Test-Time Adaptation

    JoonHo Jang, DongHyeok Shin, Byeonghu Na, HeeSun Bae, and Il-Chul Moon

    Preprint, 2024.

Publications

(*: Equal contribution)

  • Diffusion Adaptive Text Embedding for Text-to-Image Diffusion Models

    Byeonghu Na, Minsang Park, Gyuwon Sim, Donghyeok Shin, HeeSun Bae, Mina Kang, Se Jung Kwon, Wanmo Kang, and Il-Chul Moon

    NeurIPS 2025

    [ paper / code ]

  • Distilling Dataset into Neural Field

    Donghyeok Shin, HeeSun Bae, Gyuwon Sim, Wanmo Kang, and Il-Chul Moon

    ICLR 2025

    [ paper / code ]

  • Diffusion Rejection Sampling

    Byeonghu Na, Yeongmin Kim, Minsang Park, Donghyeok Shin, Wanmo Kang, and Il-Chul Moon

    ICML 2024

    [ paper / code ]

  • Frequency Domain-based Dataset Distillation

    Donghyeok Shin*, Seungjae Shin*, and Il-Chul Moon

    NeurIPS 2023

    [ paper / code ]

  • Loss Curvature Matching for Dataset Selection and Condensation

    Seungjae Shin*, Heesun Bae*, DongHyeok Shin, Weonyoung Joo, and Il-Chul Moon

    AISTATS 2023

    [ paper / code ]

  • Hierarchical Multi-Label Classification with Partial Labels and Unknown Hierarchy

    Suhyeon Jo, DongHyeok Shin, Byeonghu Na, JoonHo Jang, and Il-Chul Moon

    CIKM 2023

    [ paper / code ]

  • Unknown-Aware Domain Adversarial Learning for Open-Set Domain Adaptation

    JoonHo Jang, Byeonghu Na, DongHyeok Shin, Mingi Ji, Kyungwoo Song, and Il-Chul Moon

    NeurIPS 2022

    [ paper / code ]

Awards & Honors

  • Winner, Qualcomm Innovation Fellowship Korea 2025 (QIFK 2025)
    • Awarded for: Distilling Dataset into Neural Field
  • Top Reviewer, NeurIPS 2025

Academic Activities

Reviewer

  • International Conference on Learning Representations (ICLR): 2025, 2026
  • Neural Information Processing Systems (NeurIPS): 2025