Task-specific Pre-training and Prompt Decomposition for Knowledge Graph Population with Language Models

Published 26 Aug 2022 in cs.CL | arXiv:2208.12539v2

Abstract: We present a system for knowledge graph population with language models, evaluated on the Knowledge Base Construction from Pre-trained Language Models (LM-KBC) challenge at ISWC 2022. Our system uses task-specific pre-training to improve the LM's representation of masked object tokens, prompt decomposition for progressive generation of candidate objects, and other methods for higher-quality retrieval. Our system is the winner of Track 1 of the LM-KBC challenge, which is based on the BERT LM; it achieves a 55.0% F1 score on the hidden test set of the challenge.
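The prompt-decomposition idea described in the abstract — generating multi-token candidate objects progressively via masked prompts — can be sketched roughly as below. This is a hypothetical illustration of the cloze-style setup used in LM-KBC probing, not the authors' actual implementation; the template, function name, and mask-expansion strategy are assumptions for illustration only.

```python
# Hypothetical sketch: decompose a relation prompt into several cloze
# prompts, one per candidate object length, so a BERT-style LM can fill
# multi-token objects progressively. Template and names are illustrative.

def build_decomposed_prompts(subject, template, max_mask_tokens=3):
    """Return one cloze prompt per candidate object length (1..max_mask_tokens).

    `template` uses {subject} and {object} placeholders, e.g.
    "{subject} works for {object}."
    """
    prompts = []
    for n in range(1, max_mask_tokens + 1):
        # An object spanning n wordpieces gets n [MASK] slots, which a
        # masked LM can then fill (e.g. greedily or left-to-right).
        masks = " ".join(["[MASK]"] * n)
        prompts.append(template.format(subject=subject, object=masks))
    return prompts

prompts = build_decomposed_prompts("Vannevar Bush", "{subject} works for {object}.")
# prompts[0] probes single-token objects; prompts[2] allows three-token objects.
```

In a full system, each prompt would be scored by the masked LM and the per-length candidates merged and thresholded to produce the final object set.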

Citations (13)
