AI & Fundamentals
Efficient Fine-tuning via Model Reprogramming - Feng Liu, Assistant Professor, University of Melbourne
DATE: Mon, December 16, 2024 - 11:30 am
LOCATION: UBC Vancouver Campus, ICCS X836
DETAILS
Abstract:
"Knowledge shouldn’t be limited to those who can pay," said Robert C. May, chair of UC's Academic Senate. In machine learning, this sentiment is particularly relevant: recent foundation models, pre-trained on massive datasets, have widened the gap between those who can afford to access and control these models and those who cannot. While these models hold promise for global challenges such as healthcare and education, their size and cost make fine-tuning prohibitive, especially for institutions in developing regions. Model reprogramming (MR) offers a cost-effective alternative by repurposing pre-trained models without expensive retraining. This talk will introduce MR's two core components, input reprogramming and output transformation, together with two corresponding state-of-the-art MR methods (ICML 2024 Spotlight, NeurIPS 2024 Oral).
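To make the two components concrete, here is a minimal illustrative sketch of model reprogramming, not the speaker's actual methods: a trainable input perturbation is added before a frozen pre-trained model, and a fixed label map translates source-class predictions into the target label space. All names and the toy "pretrained" model below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))  # frozen weights of a toy source model: 8-dim input -> 4 source classes

def pretrained(x):
    """Frozen pre-trained model: returns logits over the 4 source classes. Never updated."""
    return x @ W

# Input reprogramming: a trainable additive perturbation applied to target inputs
# before they reach the frozen model. In practice delta is learned by gradient descent;
# here it is left at zero for illustration.
delta = np.zeros(8)

# Output transformation: a many-to-one map from source classes to target classes
# (here, 4 source classes collapsed onto 2 target classes).
label_map = {0: 0, 1: 0, 2: 1, 3: 1}

def reprogrammed_predict(x):
    logits = pretrained(x + delta)      # only delta would be trained; W stays frozen
    source_class = int(np.argmax(logits))
    return label_map[source_class]      # prediction in the target label space
```

The key point the sketch captures is the cost profile: the pre-trained weights `W` are never touched, so only the small perturbation `delta` (and, in some variants, the output map) needs to be optimized for the new task.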
Bio:
Dr Feng Liu is a machine learning researcher whose interests include hypothesis testing and trustworthy machine learning. He is the recipient of an ARC DECRA Fellowship, a Lecturer at The University of Melbourne, Australia, and a Visiting Scientist at RIKEN-AIP, Japan. He has served as an Area Chair for AISTATS, ICLR, ICML, and NeurIPS, and as a senior program committee (SPC) member for AAAI and IJCAI. He has received the ARC Discovery Early Career Researcher Award and the NeurIPS 2022 Outstanding Paper Award.