We are a group of people interested in machine learning and optimization. We develop theories and algorithms using computational and mathematical tools, and our ultimate goal is to provide robust and provable solutions to challenging problems in artificial intelligence, particularly those in large-scale settings. We are passionate about translating our findings into practical applications that can benefit society.
Modern neural network models demand considerable computation and memory. We develop advanced optimization algorithms that compress these models so they run more efficiently during training and inference without sacrificing performance. We apply our techniques across domains including vision and language.
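As a toy illustration of model compression, the sketch below implements magnitude pruning, one common compression baseline (not necessarily this group's own method): the smallest-magnitude weights are zeroed out, leaving a sparse weight matrix. The function name and sparsity level are illustrative assumptions.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity):
    """Return a copy of `weights` with the smallest `sparsity` fraction of
    entries (by absolute value) set to zero."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of entries to drop
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold     # keep strictly larger entries
    return weights * mask

# Illustrative usage on a random 8x8 weight matrix at 75% sparsity.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
W_sparse = prune_by_magnitude(W, 0.75)
```

In practice, pruned networks are usually fine-tuned afterwards to recover any lost accuracy; the pruning step itself only decides which weights to remove.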
The success of deep learning owes much to the unreasonable effectiveness of stochastic gradient descent and its variants. Why do these algorithms work so well? What are their convergence properties? We seek to characterize their convergence and generalization behavior through the lenses of overparameterization, sharpness, and implicit bias.
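For readers unfamiliar with the algorithm in question, here is a minimal sketch of stochastic gradient descent on a noiseless least-squares problem; the problem size, step size, and iteration count are illustrative assumptions, not parameters from our work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic noiseless regression data: y = X w_true.
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = X @ w_true

# Plain SGD: at each step, take a gradient step on one randomly sampled example.
w = np.zeros(5)
lr = 0.05
for step in range(2000):
    i = rng.integers(len(X))            # sample one training example
    grad = (X[i] @ w - y[i]) * X[i]     # gradient of 0.5 * (x_i @ w - y_i)**2
    w -= lr * grad                      # stochastic gradient step
```

On this interpolating (noiseless) problem, `w` approaches `w_true`; understanding when and how fast this happens for deep, non-convex models is exactly the kind of question we study.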
Deep neural networks are employed in nearly all modern intelligent systems, but their decision-making process is difficult to explain. We aim to improve the interpretability of complex neural networks by leveraging high-level, human-interpretable concepts from both data- and model-centric perspectives.
Beyond the highlights above, several research projects are under way, spanning theory to applications. Some are collaborations with partners in academia and industry, including POSTECH, Oxford, Google, and Amazon. Our research broadly revolves around the following themes:
Optimization: convex • non-convex • stochastic • distributed • online
Deep learning: compression • generalization • robustness • interpretability
Data science: learning from limited data • vision applications • language models
Jinseok Chung (2022-)
Donghyun Oh (2022-)
Sungbin Shin (2023-)
Dahun Shin (2023-)
Dongyeop Lee (2023-)
Seonghwan Park (2023-)
Jeeon Bae (2023-)
Jueun Mun (2023-)
Hyunjune Kim (2023 Summer-)
Chanyoung Maeng (2023 Summer-)
Yoojin Jang (2023), Jeeon Bae (2023), Kyunghun Nam (2023), Chanho Jung (2023), Sungbin Shin (2023)
We are always on the lookout for talented and motivated students to join us.
Application Please fill in this application form.
Vacancy We are hiring a few graduate students to work on optimization theory and algorithms in the upcoming terms; candidates are expected to have strong computational and mathematical skills. We may also hire interns who are currently enrolled at POSTECH or have an outstanding academic record.
Others Have a look at this epilogue (in Korean) written by our 2023 summer interns.