arXiv, Apr 4
BidirLM: From Text to Omnimodal Bidirectional Encoders by Adapting and Composing Causal LLMs
arXiv:2604.02045v1 Announce Type: new

Abstract: Transforming causal generative language models into bidirectional encoders offers a powerful alternative to BERT-style architectures. However, current approaches remain limited: they lack consensus on optimal training objectives, suffer from catastrophic