About
Hello, I am a Ph.D. student at KAIST specializing in ML systems. My research focuses on optimizing
distributed training and inference performance, with a particular interest in saving energy in ML systems. My work
optimizes system performance by leveraging underlying hardware features. I am always looking
to connect with like-minded individuals who share my passion for sustainable and efficient ML system development.
Education
- Ph.D. Student in School of Computing, KAIST, current
- M.S. in School of Computing, KAIST, 2021
- B.S. in School of Computing, KAIST, 2019
Publications
- [ICCAD 2023] Jaehoon Heo, Yongwon Shin, Sangjin Choi, Sungwoong Yune, Jung-Hoon Kim, Hyojin Sung, Youngjin Kwon, Joo-Young Kim,
  PRIMO: A Full-Stack Processing-in-DRAM Emulation Framework for Machine Learning Workloads (Acceptance rate: 22.9%)
- [USENIX ATC 2023] Sangjin Choi, Inhoe Koo, Jeongseob Ahn, Myeongjae Jeon, Youngjin Kwon,
  EnvPipe: Performance-preserving DNN Training Framework for Saving Energy (Acceptance rate: 18.4%)
  [ Paper | Code | Talk ]
- [USENIX ATC 2022] Sangjin Choi*, Taeksoo Kim*, Jinwoo Jeong, Rachata Ausavarungnirun, Myeongjae Jeon, Youngjin Kwon, Jeongseob Ahn,
  Memory Harvesting in Multi-GPU Systems with Hierarchical Unified Virtual Memory (*Co-first author, Acceptance rate: 16.2%)
  [ Paper | Code | Talk ]