Towards Effective Time-Aware Language Representation: Exploring Enhanced Temporal Understanding in Language Models
Abstract: In the evolving field of NLP, understanding the temporal context of text is increasingly critical for applications requiring advanced temporal reasoning. Traditional pre-trained language models such as BERT, which rely on synchronic document collections like BookCorpus and Wikipedia, often fall short in capturing and leveraging temporal information. To address this limitation, we introduce BiTimeBERT 2.0, a novel time-aware language model pre-trained on a temporal news article collection. BiTimeBERT 2.0 incorporates temporal information through three innovative pre-training objectives: Extended Time-Aware Masked Language Modeling (ETAMLM), Document Dating (DD), and Time-Sensitive Entity Replacement (TSER). Each objective targets a distinct dimension of temporal information: ETAMLM enhances the model's understanding of temporal contexts and relations, DD integrates document timestamps as explicit chronological markers, and TSER focuses on the temporal dynamics of "Person" entities. Moreover, our refined corpus preprocessing strategy reduces training time by nearly 53%, making BiTimeBERT 2.0 significantly more efficient while maintaining high performance. Experimental results show that BiTimeBERT 2.0 achieves substantial improvements across a broad range of time-related tasks and excels on datasets spanning extensive temporal ranges. These findings underscore BiTimeBERT 2.0's potential as a powerful tool for advancing temporal reasoning in NLP.
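As a rough sketch of how the three objectives might be combined during pre-training (the abstract does not specify how the losses are aggregated, so the weighted sum and the mixing weights below are illustrative assumptions), the joint objective can be written as:

\[
\mathcal{L}_{\text{total}} \;=\; \lambda_{1}\,\mathcal{L}_{\text{ETAMLM}} \;+\; \lambda_{2}\,\mathcal{L}_{\text{DD}} \;+\; \lambda_{3}\,\mathcal{L}_{\text{TSER}},
\]

where \(\mathcal{L}_{\text{ETAMLM}}\) would be a masked-language-modeling cross-entropy emphasizing temporal expressions, \(\mathcal{L}_{\text{DD}}\) a classification loss predicting a document's timestamp, \(\mathcal{L}_{\text{TSER}}\) a loss for detecting replaced "Person" entities, and \(\lambda_{1}, \lambda_{2}, \lambda_{3}\) hypothetical weights balancing the three terms.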