Pretrained Transformers for Text Ranking: BERT and Beyond

Suzan Verberne


Anthology ID: 2023.cl-1.8
Volume: Computational Linguistics, Volume 49, Issue 1 - March 2023
Month: March
Year: 2023
Address: Cambridge, MA
Venue: CL
Publisher: MIT Press
Pages: 253–255
URL: https://aclanthology.org/2023.cl-1.8
DOI: 10.1162/coli_r_00468
Cite (ACL): Suzan Verberne. 2023. Pretrained Transformers for Text Ranking: BERT and Beyond. Computational Linguistics, 49(1):253–255.
Cite (Informal): Pretrained Transformers for Text Ranking: BERT and Beyond (Verberne, CL 2023)
PDF: https://aclanthology.org/2023.cl-1.8.pdf