Prompt-based Distribution Alignment for Domain Generalization in Text Classification

Chen Jia, Yue Zhang


Abstract
Prompt-based learning (a.k.a. prompting) achieves high performance by bridging the gap between the objectives of language modeling and downstream tasks. Domain generalization ability can be improved by prompting, since classification across different domains can be unified into the prediction of the same set of label words. The remaining challenge for domain generalization by prompting comes from discrepancies between the data distributions of different domains. To improve domain generalization with prompting, we learn distributional invariance across source domains via two alignment regularization loss functions. The first is vocabulary distribution alignment, which uses a Kullback-Leibler divergence regularization on source-domain vocabulary distributions. The second is feature distribution alignment, which uses a novel adversarial training strategy to learn domain-invariant representations across source domains. Experiments on sentiment analysis and natural language inference show the effectiveness of our method, which achieves state-of-the-art results on six datasets.
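Since the abstract is the only technical description given here, the following is a minimal PyTorch sketch of how the two regularizers might look. It assumes a masked-LM prompting setup; the helper names (`vocab_alignment_loss`, `adversarial_domain_loss`, `GradReverse`) are hypothetical, and the adversarial part uses the standard gradient-reversal (DANN-style) construction as a stand-in for the paper's novel strategy, which the abstract does not detail.

```python
# Illustrative sketch only; not the authors' released implementation.
import torch
import torch.nn.functional as F


def vocab_alignment_loss(mask_logits_a: torch.Tensor,
                         mask_logits_b: torch.Tensor) -> torch.Tensor:
    """Symmetric KL divergence between the [MASK]-position vocabulary
    distributions predicted for batches from two source domains.

    mask_logits_*: [batch, vocab_size] logits at the prompt's mask token.
    """
    log_p = F.log_softmax(mask_logits_a, dim=-1)
    log_q = F.log_softmax(mask_logits_b, dim=-1)
    # F.kl_div(input, target) computes KL(target || exp(input)).
    kl_pq = F.kl_div(log_q, log_p.exp(), reduction="batchmean")  # KL(p || q)
    kl_qp = F.kl_div(log_p, log_q.exp(), reduction="batchmean")  # KL(q || p)
    return 0.5 * (kl_pq + kl_qp)


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips (and scales) gradients in the
    backward pass, so the encoder learns to fool the domain classifier."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


def adversarial_domain_loss(features: torch.Tensor,
                            domain_labels: torch.Tensor,
                            domain_classifier: torch.nn.Module,
                            lam: float = 1.0) -> torch.Tensor:
    """Domain-classification loss on gradient-reversed features, pushing the
    encoder toward domain-invariant representations across source domains."""
    reversed_feats = GradReverse.apply(features, lam)
    domain_logits = domain_classifier(reversed_feats)
    return F.cross_entropy(domain_logits, domain_labels)
```

In such a setup, the two regularizers would be added to the usual prompt-based classification loss with hyperparameter weights, e.g. `loss = task_loss + alpha * vocab_loss + beta * adv_loss`.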
Anthology ID:
2022.emnlp-main.690
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10147–10157
URL:
https://aclanthology.org/2022.emnlp-main.690
DOI:
10.18653/v1/2022.emnlp-main.690
Cite (ACL):
Chen Jia and Yue Zhang. 2022. Prompt-based Distribution Alignment for Domain Generalization in Text Classification. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 10147–10157, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Prompt-based Distribution Alignment for Domain Generalization in Text Classification (Jia & Zhang, EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.690.pdf
Software:
2022.emnlp-main.690.software.zip