Numerical Optimizations for Weighted Low-rank Estimation on Language Models

Ting Hua, Yen-Chang Hsu, Felicity Wang, Qian Lou, Yilin Shen, Hongxia Jin


Abstract
Singular value decomposition (SVD) is one of the most popular compression methods that approximate a target matrix with smaller matrices. However, standard SVD treats all parameters in the matrix as equally important, a simple but unrealistic assumption: the parameters of a trained neural network may affect task performance unevenly, which suggests unequal importance among them. Compared to SVD, a decomposition method that is aware of parameter importance is the more practical choice in real cases. Unlike standard SVD, weighted value decomposition is a non-convex optimization problem that lacks a closed-form solution. We systematically investigated multiple optimization strategies to tackle the problem and examined our method by compressing Transformer-based language models. Further, we designed a metric to predict when SVD may introduce a significant performance drop, for which our method can serve as a rescue strategy. Extensive evaluations demonstrate that our method can perform better than current SOTA methods in compressing Transformer-based language models.
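To make the contrast in the abstract concrete, here is a minimal sketch comparing standard truncated SVD with a weighted low-rank fit. The importance weights W and the plain gradient-descent solver are illustrative assumptions for this sketch only; they are not the optimization strategies developed in the paper.

```python
# Sketch: truncated SVD vs. a weighted low-rank estimate.
# Assumptions: W is a hypothetical per-parameter importance matrix;
# gradient descent is a toy stand-in for the paper's numerical methods.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 32))        # target weight matrix to compress
W = rng.uniform(0.1, 1.0, size=A.shape)  # hypothetical importance weights
r = 8                                    # target rank

# Standard SVD: closed-form, treats every parameter equally.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_svd = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Weighted objective: minimize ||W * (P @ Q - A)||_F^2 over P (m x r), Q (r x n).
# Non-convex with no closed-form solution, so we iterate;
# the truncated SVD factors make a convenient warm start.
P = U[:, :r] * np.sqrt(s[:r])
Q = np.sqrt(s[:r])[:, None] * Vt[:r, :]
lr = 1e-2
for _ in range(2000):
    R = (W ** 2) * (P @ Q - A)  # weighted residual (shared gradient term)
    P, Q = P - lr * (R @ Q.T), Q - lr * (P.T @ R)

def weighted_err(X):
    return np.linalg.norm(W * (A - X))

print(f"weighted error, truncated SVD: {weighted_err(A_svd):.4f}")
print(f"weighted error, weighted fit:  {weighted_err(P @ Q):.4f}")
```

Warm-starting the iterative solver from the truncated SVD factors is a deliberate choice in this sketch: with a stable step size, gradient descent can only decrease the weighted objective from the unweighted baseline, so the comparison isolates the gain from modeling parameter importance.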
Anthology ID:
2022.emnlp-main.91
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1404–1416
URL:
https://aclanthology.org/2022.emnlp-main.91
DOI:
10.18653/v1/2022.emnlp-main.91
Cite (ACL):
Ting Hua, Yen-Chang Hsu, Felicity Wang, Qian Lou, Yilin Shen, and Hongxia Jin. 2022. Numerical Optimizations for Weighted Low-rank Estimation on Language Models. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 1404–1416, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Numerical Optimizations for Weighted Low-rank Estimation on Language Models (Hua et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.91.pdf
Note:
 2022.emnlp-main.91.note.pdf