Temporal Information Extraction

This project has two stages so far:

(1) We propose a set of simple LSTM-based models with a uniform architecture to recover different kinds of temporal relations from text. Using the shortest dependency path between entities as input, the same architecture extracts intra-sentence, cross-sentence, and document creation time relations. A “double-checking” technique reverses entity pairs during classification, boosting the recall of positive cases and reducing misclassifications between opposite classes, and an efficient pruning algorithm resolves conflicts globally. Evaluated on QA-TempEval (SemEval-2015 Task 5), the proposed technique outperforms state-of-the-art methods by a large margin. We also conduct an intrinsic evaluation and report state-of-the-art results on TimeBank-Dense.
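
As a rough illustration of the double-checking step, the sketch below classifies each entity pair in both orders and reconciles the two predictions. The classify callable stands in for the trained LSTM classifier; the label inventory and the reconciliation rules are illustrative assumptions, not the paper's exact code.

    # Minimal sketch of "double-checking": classify an entity pair in
    # both orders and reconcile the two predictions. The label set and
    # fallback rules below are illustrative assumptions.
    INVERSE = {
        "BEFORE": "AFTER",
        "AFTER": "BEFORE",
        "INCLUDES": "IS_INCLUDED",
        "IS_INCLUDED": "INCLUDES",
        "SIMULTANEOUS": "SIMULTANEOUS",
        "NO_RELATION": "NO_RELATION",
    }

    def double_check(classify, e1, e2):
        """Classify (e1, e2) and (e2, e1), then reconcile the results."""
        forward = classify(e1, e2)     # relation predicted for (e1, e2)
        backward = classify(e2, e1)    # relation predicted for (e2, e1)
        if INVERSE[backward] == forward:
            return forward             # both directions agree
        if forward == "NO_RELATION":
            # The reversed order recovered a relation the forward pass
            # missed; taking its inverse boosts recall of positive cases.
            return INVERSE[backward]
        return forward                 # otherwise keep the forward label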

(2) We propose a context-aware neural network model for temporal information extraction, with a uniform architecture for event-event, event-timex, and timex-timex pairs. A Global Context Layer (GCL), inspired by the Neural Turing Machine (NTM), stores processed temporal relations in narrative order and retrieves them when relevant entities reappear, so that relations are classified in context. The GCL's long-term memory and attention mechanisms resolve irregular long-distance dependencies that regular RNNs such as LSTMs cannot capture. The model requires no new input features, yet outperforms the existing models in the literature. To our knowledge, it is also the first model to use an NTM-like architecture to process global context in discourse-scale natural text processing. We plan to release the source code.
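
To make the GCL's read/write behavior concrete, here is a minimal sketch of an NTM-style external memory in Python with NumPy. The dot-product addressing, the dimensions, and the class interface are assumptions for illustration, not the paper's exact design.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    class GlobalContextLayer:
        """Illustrative NTM-style memory: relation representations are
        written in narrative order and read back with content-based
        attention (an assumed addressing scheme, not the exact model)."""

        def __init__(self, dim):
            self.dim = dim
            self.memory = np.zeros((0, dim))   # one row per stored relation

        def write(self, relation_vec):
            """Append the representation of a newly classified relation."""
            self.memory = np.vstack([self.memory, relation_vec])

        def read(self, query_vec):
            """Attend over stored relations given a query for the current pair."""
            if self.memory.shape[0] == 0:
                return np.zeros(self.dim)      # nothing stored yet
            scores = self.memory @ query_vec   # similarity to each memory slot
            weights = softmax(scores)          # attention over narrative history
            return weights @ self.memory       # context vector for classification

In the full model, the vector returned by read would be combined with the current pair's representation before the relation is classified in context.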

Model diagram

Fig. 1. Model diagram

Publications

Y. Meng, A. Rumshisky, and A. Romanov. Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 887-896, Copenhagen, Denmark. Association for Computational Linguistics, 2017.

@inproceedings{meng_temporal_2017,
  author    = "Meng, Yuanliang and Rumshisky, Anna and Romanov, Alexey",
  title     = "Temporal {Information} {Extraction} for {Question} {Answering} {Using} {Syntactic} {Dependencies} in an {LSTM}-based {Architecture}",
  booktitle = "Proceedings of the 2017 {Conference} on {Empirical} {Methods} in {Natural} {Language} {Processing}",
  address   = "Copenhagen, Denmark",
  publisher = "Association for Computational Linguistics",
  pages     = "887--896",
  year      = "2017"
}

Y. Meng and A. Rumshisky. Context-Aware Neural Model for Temporal Information Extraction. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 527-536, 2018.

@inproceedings{meng2018context,
  title     = {Context-Aware Neural Model for Temporal Information Extraction},
  author    = {Meng, Yuanliang and Rumshisky, Anna},
  booktitle = {Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
  volume    = {1},
  pages     = {527--536},
  year      = {2018}
}