Jinuk Kim
Hello!
I am a third-year PhD student in the Machine Learning Lab, Department of Computer Science, Seoul National University, advised by Hyun Oh Song.
My research has focused on compression and efficiency of ML models (e.g., LLMs and diffusion models). Going forward, I aim to build domain-specific LLM agents and tools for hardware design, CUDA programming, and scientific workflows.
Refer to my Research Statement for more details.
CV / Scholar / Github / Twitter / Blog / LinkedIn
Updates
- Sep 2025 One paper was accepted to NeurIPS 2025 as an oral presentation (KVzip).
- Aug 2025 I was awarded the Youlchon AI Star Scholarship.
- May 2025 One paper was accepted to ICML 2025 (GuidedQuant).
- Aug 2024 I will be joining Google as a Student Researcher.
- May 2024 One paper was accepted to ICML 2024 (LayerMerge).
- Aug 2023 I will be joining Samsung Advanced Institute of Technology as a Research Intern.
Highlights
- GuidedQuant proposes an improved objective function and quantization method for large language models.
Publications
KVzip: Query-Agnostic KV Cache Compression with Context Reconstruction
Jang-Hyun Kim, Jinuk Kim, Sangwoo Kwon, Jae W. Lee, Sangdoo Yun, Hyun Oh Song
NeurIPS 2025 Oral Presentation (77/21575=0.35%)
ICML 2025 ES-FoMo-III Workshop
Paper | Code | Project page | Bibtex
GuidedQuant: Large Language Model Quantization via Exploiting End Loss Guidance
Jinuk Kim, Marwa El Halabi, Wonpyo Park, Clemens JS Schaefer, Deokjae Lee, Yeonhong Park, Jae W. Lee, Hyun Oh Song
ICML 2025
Paper | Code | Project page | Poster | Bibtex
LayerMerge: Neural Network Depth Compression through Layer Pruning and Merging
Jinuk Kim, Marwa El Halabi, Mingi Ji, Hyun Oh Song
ICML 2024
Paper | Code | Project page | Poster | Bibtex
Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming
Jinuk Kim*, Yeonwoo Jeong*, Deokjae Lee, Hyun Oh Song
ICML 2023
Paper | Code | Project page | Bibtex
Dataset Condensation via Efficient Synthetic-Data Parameterization
Jang-Hyun Kim, Jinuk Kim, Seong Joon Oh, Sangdoo Yun, Hwanjun Song, Joonhyun Jeong, Jung-Woo Ha, Hyun Oh Song
ICML 2022
Paper | Code | Bibtex
Projects
I own an SO100 robot, which I sometimes play with and teach to do things.
Talk-to-President
is a service that lets you chat with AI personas of the presidential candidates, in each candidate's own voice.
VibeCite
uses your Claude Code to search for papers and generate BibTeX entries from your high-level paper descriptions.
SNU Board
is an Android/iOS service that collects notices from SNU department websites and gathers them in one place
(Android / Aug 2021 / 100+ MAU / 80+ WAU / 1000+ Downloads).
Academic Services
- Reviewer: TMLR 2024-, ICML 2025-, NeurIPS 2025-, ICLR 2026-