TR2024-156
SuperLoRA: Parameter-Efficient Unified Adaptation of Large Foundation Models
- "SuperLoRA: Parameter-Efficient Unified Adaptation of Large Foundation Models", British Machine Vision Conference (BMVC), November 2024.BibTeX TR2024-156 PDF Presentation
@inproceedings{Chen2024nov,
  author = {Chen, Xiangyu and Liu, Jing and Wang, Ye and Wang, Pu and Brand, Matthew and Wang, Guanghui and Koike-Akino, Toshiaki},
  title = {SuperLoRA: Parameter-Efficient Unified Adaptation of Large Foundation Models},
  booktitle = {British Machine Vision Conference (BMVC)},
  year = 2024,
  month = nov,
  url = {https://www.merl.com/publications/TR2024-156}
}
Abstract:
Low-rank adaptation (LoRA) and its variants are widely employed in fine-tuning large models, including large language models for natural language processing and diffusion models for computer vision. This paper proposes a generalized framework called SuperLoRA that unifies and extends different LoRA variants, each of which can be realized under a particular hyperparameter setting. By introducing new options for grouping, folding, shuffling, projection, and tensor decomposition, SuperLoRA offers high flexibility and demonstrates superior performance, with up to a 10-fold gain in parameter efficiency for transfer-learning tasks.
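
The abstract describes the idea only at a high level. The sketch below (not the authors' code) illustrates how a grouped, folded low-rank update along the lines summarized above might contrast with standard per-layer LoRA; all layer shapes, the number of groups, the folding heuristic, and the variable names are illustrative assumptions.

```python
# Minimal sketch of per-layer LoRA vs. a SuperLoRA-style grouped update.
# Shapes, group count, and folding are illustrative assumptions only.
import torch

torch.manual_seed(0)
rank = 4

# --- Standard LoRA: one low-rank update per weight matrix, dW = B @ A ---
d_out, d_in = 64, 64
A = torch.randn(rank, d_in) * 0.01
B = torch.zeros(d_out, rank)          # zero init, so dW starts at zero
delta_w = B @ A                       # (d_out, d_in)

# --- SuperLoRA-style grouping: flatten the updates of several layers into
# one long vector, split it into groups, and generate each group from its
# own small low-rank factor pair (each group "folded" toward a square). ---
layer_shapes = [(64, 64), (64, 64), (128, 64)]
total = sum(o * i for o, i in layer_shapes)

groups = 2
group_len = -(-total // groups)       # ceil division
side = int(group_len ** 0.5) + 1      # fold each group into a ~square matrix

params, flat_updates = [], []
for g in range(groups):
    Ag = torch.randn(rank, side) * 0.01
    Bg = torch.zeros(side, rank)      # zero init, as in standard LoRA
    params += [Ag, Bg]
    flat_updates.append((Bg @ Ag).reshape(-1)[:group_len])

flat = torch.cat(flat_updates)[:total]

# Slice the flat update back into per-layer weight deltas.
deltas, offset = [], 0
for o, i in layer_shapes:
    deltas.append(flat[offset:offset + o * i].reshape(o, i))
    offset += o * i

n_lora = sum(rank * (o + i) for o, i in layer_shapes)
n_grouped = sum(p.numel() for p in params)
print(f"per-layer LoRA params: {n_lora}, grouped params: {n_grouped}")
# -> per-layer LoRA params: 1792, grouped params: 1456
```

In this toy setting the grouped variant already trains fewer parameters than per-layer LoRA at the same rank; the paper's further options (shuffling, a frozen projection from a smaller trainable vector, tensor decomposition) are omitted here for brevity.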