Student projects
We offer various student projects.
This is a great opportunity for anyone who wants to (1) gain research experience with optimization in general, (2) learn more about what we do before joining us as a graduate student, or (3) earn research credits as part of their undergraduate coursework (for POSTECH students).
Please find the (incomplete) list below, and if you're interested in getting involved, consider applying!
-
Establishing a benchmark for parameter-free methods
Although parameter-free methods have attracted considerable attention, their reported performance often differs significantly across papers.
In this project, we aim to establish a reliable benchmark where different parameter-free methods are evaluated and compared fairly.
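To make "fairly" concrete, here is a minimal sketch of the kind of fixed-seed comparison harness such a benchmark needs, assuming PyTorch and a toy regression task. The optimizer entries are illustrative stand-ins; actual parameter-free methods (e.g., DAdaptAdam from the dadaptation package) would be plugged in as additional entries in the same table.

```python
import torch

def make_task(seed: int):
    """Toy regression task: identical data for every optimizer under test."""
    g = torch.Generator().manual_seed(seed)
    X = torch.randn(512, 20, generator=g)
    w = torch.randn(20, 1, generator=g)
    y = X @ w + 0.1 * torch.randn(512, 1, generator=g)
    return X, y

def run(opt_factory, seed: int = 0, steps: int = 200) -> float:
    torch.manual_seed(seed)  # identical model init for every method
    X, y = make_task(seed)
    model = torch.nn.Linear(20, 1)
    opt = opt_factory(model.parameters())
    for _ in range(steps):
        loss = torch.nn.functional.mse_loss(model(X), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.nn.functional.mse_loss(model(X), y).item()

OPTIMIZERS = {
    "SGD (tuned lr)": lambda p: torch.optim.SGD(p, lr=0.1),
    "Adam (default)": lambda p: torch.optim.Adam(p),
    # "DAdaptAdam": lambda p: dadaptation.DAdaptAdam(p),  # assumes the dadaptation package
}

for name, factory in OPTIMIZERS.items():
    losses = [run(factory, seed=s) for s in range(3)]  # average over seeds
    print(f"{name}: mean final loss = {sum(losses) / len(losses):.4f}")
```

Fixing the task, initialization, and seeds across all methods, and averaging over multiple seeds, is what lets differences be attributed to the optimizers themselves rather than to the experimental setup.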
-
Mitigating catastrophic forgetting in continual learning of LLMs
Continual learning allows LLMs to evolve without retraining from scratch, but it often leads to catastrophic forgetting of previously acquired knowledge.
In this project, we aim to identify the key factors underlying forgetting and propose an optimization-based strategy that promotes stable updates.
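To illustrate what would be measured, here is a minimal sketch of quantifying forgetting in a toy two-task setup, assuming PyTorch; the synthetic, deliberately conflicting tasks stand in for the sequential fine-tuning stages an LLM would actually go through.

```python
import torch

def make_task(seed: int, flip: bool):
    """Synthetic binary classification; `flip` reverses the labeling rule."""
    g = torch.Generator().manual_seed(seed)
    X = torch.randn(256, 10, generator=g)
    y = ((X[:, 0] > 0) != flip).long()
    return X, y

def train(model, X, y, steps: int = 200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(steps):
        loss = torch.nn.functional.cross_entropy(model(X), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

def accuracy(model, X, y) -> float:
    with torch.no_grad():
        return (model(X).argmax(dim=1) == y).float().mean().item()

torch.manual_seed(0)
model = torch.nn.Sequential(torch.nn.Linear(10, 32), torch.nn.ReLU(),
                            torch.nn.Linear(32, 2))
XA, yA = make_task(seed=1, flip=False)  # task A: first training stage
XB, yB = make_task(seed=2, flip=True)   # task B: later, conflicting stage

train(model, XA, yA)
acc_before = accuracy(model, XA, yA)
train(model, XB, yB)  # continue training on task B only
acc_after = accuracy(model, XA, yA)
print(f"task A accuracy: {acc_before:.3f} -> {acc_after:.3f}")
```

The drop in task-A accuracy after training on task B is the forgetting we want to explain and mitigate; a stable-update strategy would keep that drop small without hurting task-B performance.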
-
Black-box training of large foundation models under relaxed assumptions
Recent studies have shown that large foundation models can be trained in black-box settings.
However, these approaches often rely on assumptions that are difficult to meet in practice, such as the ability to modify the model's input or output layers.
This project aims to develop effective strategies for training black-box models under more realistic and broadly applicable conditions.
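For background, one primitive that many black-box approaches build on is zeroth-order optimization, which queries only loss values and never backpropagates through the model. Below is a minimal sketch of a two-point (SPSA-style) gradient estimate on a stand-in objective, assuming PyTorch; it illustrates the primitive, not any particular published method.

```python
import torch

def loss_fn(params: torch.Tensor) -> torch.Tensor:
    """Stand-in for a black-box model: we may only query scalar loss values."""
    target = torch.arange(5, dtype=torch.float32)
    return ((params - target) ** 2).mean()

def zo_step(params: torch.Tensor, lr: float = 0.1, eps: float = 1e-3) -> torch.Tensor:
    z = torch.randn_like(params)  # random probe direction
    # Two forward queries estimate the directional derivative along z.
    g = (loss_fn(params + eps * z) - loss_fn(params - eps * z)) / (2 * eps)
    return params - lr * g * z    # descend along the probed direction

params = torch.zeros(5)
for _ in range(500):
    params = zo_step(params)
print("final loss:", loss_fn(params).item())  # approaches 0 without any backprop
```

In expectation the update direction matches the true gradient, which is why forward queries alone suffice; the question this project asks is what else can be relaxed when even such query access is constrained.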
-
Investigating the effect of calibration data on LLM compression
Most LLM compression techniques rely on a small, often arbitrarily chosen set of calibration data, yet there is limited understanding of how this data influences the compression result or how it should be constructed.
In this project, we plan to study various aspects of calibration data and develop a principled strategy for constructing it.
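For intuition, here is a minimal sketch of one common way calibration data enters post-training quantization, assuming PyTorch: a small batch of inputs fixes the observed activation range, which in turn fixes the quantization scale. (Real methods such as GPTQ or AWQ use calibration data in more elaborate ways.)

```python
import torch

torch.manual_seed(0)
layer = torch.nn.Linear(16, 16)
calib = torch.randn(32, 16)  # stand-in calibration set

with torch.no_grad():
    acts = layer(calib)               # observe activations on calibration data
scale = acts.abs().max() / 127.0      # symmetric int8 scale fixed by this data

def fake_quantize(x: torch.Tensor) -> torch.Tensor:
    """Round to the int8 grid implied by `scale`, then dequantize."""
    return torch.clamp(torch.round(x / scale), -127, 127) * scale

with torch.no_grad():
    test = torch.randn(64, 16)        # unseen inputs
    out = layer(test)
    err = (fake_quantize(out) - out).abs().mean()
print("mean abs quantization error:", err.item())
# A different calibration set yields a different scale (and different error):
# exactly the sensitivity this project sets out to study.
```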