Ask2Transformers: Zero-Shot Domain labelling with Pre-trained Language Models
Published 7 Jan 2021 in cs.CL | (2101.02661v2)
Abstract: In this paper we present a system that exploits different pre-trained language models for assigning domain labels to WordNet synsets without any kind of supervision. Furthermore, the system is not restricted to using a particular set of domain labels. We exploit the knowledge encoded within different off-the-shelf pre-trained language models and task formulations to infer the domain label of a particular WordNet definition. The proposed zero-shot system achieves a new state of the art on the English dataset used in the evaluation.
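The kind of zero-shot labelling the abstract describes can be sketched with an off-the-shelf NLI-based classifier: a WordNet gloss is scored against an arbitrary set of candidate domain labels, with no task-specific training. The sketch below uses Hugging Face's `zero-shot-classification` pipeline; the label set, function name, and example gloss are illustrative assumptions, not the paper's exact setup.

```python
# Illustrative sketch of zero-shot domain labelling for WordNet glosses.
# The domain label set is a made-up example; the paper's point is that
# any label set can be plugged in without retraining.

DOMAIN_LABELS = ["sport", "medicine", "music", "military", "transport"]

def label_definition(definition: str, labels=DOMAIN_LABELS) -> str:
    """Return the highest-scoring candidate domain label for a gloss."""
    # Heavy optional dependency, imported lazily; downloads an NLI model.
    from transformers import pipeline
    classifier = pipeline("zero-shot-classification",
                          model="facebook/bart-large-mnli")
    result = classifier(definition, candidate_labels=labels)
    # result["labels"] is sorted by entailment score, highest first.
    return result["labels"][0]

if __name__ == "__main__":
    gloss = "a game played on a court with rackets and a shuttlecock"
    print(label_definition(gloss))
```

Because the candidate labels are passed in at inference time, swapping in a different domain inventory requires no change to the model, which mirrors the paper's claim that the system is not tied to one label set.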