Release: MLM vs. CLM
We release "Should We Still Pretrain Encoders with Masked Language Modeling?", a large-scale study comparing causal (CLM) and bidirectional masked (MLM) pretraining objectives for text representation learning.
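For readers unfamiliar with the two objectives being compared, here is a minimal sketch of how the same batch is prepared for MLM versus CLM training, using the Hugging Face `transformers` data collator. The tokenizer checkpoint and example text are illustrative placeholders, not the study's actual setup.

```python
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # placeholder checkpoint
texts = ["Should we still pretrain encoders with masked language modeling?"]
batch = [tokenizer(t) for t in texts]

# MLM: randomly mask ~15% of tokens; labels hold the original ids at masked
# positions (-100 elsewhere), so the model learns bidirectional infilling.
mlm_collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)
mlm_batch = mlm_collator(batch)

# CLM: no masking; labels mirror the input ids (shifted inside the model), so
# the model learns left-to-right next-token prediction.
clm_collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)
clm_batch = clm_collator(batch)
```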