Publication Details

KnowlyBERT: Hybrid Query Answering over Language Models and Knowledge Graphs

Authors
Jan-Christoph Kalo, Leandra Fichtel, Philipp Ehler, Wolf-Tilo Balke
Abstract

Providing a plethora of entity-centric information, knowledge graphs have become a vital building block for a variety of intelligent applications. Indeed, modern knowledge graphs like Wikidata already capture several billion RDF triples, yet they still lack good coverage for most relations. On the other hand, recent developments in NLP research show that neural language models can easily be queried for relational knowledge without requiring massive amounts of training data. In this work, we leverage this idea by building a hybrid query answering system that combines knowledge graphs with the masked language model BERT to complete query results. We thus combine valuable structural and semantic information from knowledge graphs with textual knowledge from language models to achieve high-precision query results. Standard techniques for dealing with incomplete knowledge graphs are either (1) relation extraction, which requires massive amounts of training data, or (2) knowledge graph embeddings, which struggle to succeed beyond simple baseline datasets. Our hybrid system KnowlyBERT requires only small amounts of training data while outperforming state-of-the-art techniques by more than 30% in precision in our large Wikidata experiment.
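The core mechanism the abstract describes, answering a relational query both against the knowledge graph and against a masked language model via a cloze-style statement, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's actual implementation: the cloze template, the example entity, and the plain set union at the end are hypothetical stand-ins, while the Wikidata SPARQL endpoint and the Hugging Face fill-mask pipeline are real, existing APIs.

```python
# Minimal sketch of hybrid query answering: combine answers from a
# (possibly incomplete) knowledge graph with cloze-style predictions
# from a masked language model. Illustrative only; the template and
# the merging strategy are NOT KnowlyBERT's actual ones.
from SPARQLWrapper import SPARQLWrapper, JSON
from transformers import pipeline

# 1) Query the knowledge graph (Wikidata) for the relation.
#    Q5879 = Johann Wolfgang von Goethe, P19 = place of birth.
sparql = SPARQLWrapper("https://query.wikidata.org/sparql",
                       agent="hybrid-qa-sketch/0.1")
sparql.setQuery("""
    SELECT ?cityLabel WHERE {
      wd:Q5879 wdt:P19 ?city .
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
""")
sparql.setReturnFormat(JSON)
kg_answers = {
    row["cityLabel"]["value"]
    for row in sparql.query().convert()["results"]["bindings"]
}

# 2) Query the masked language model with a cloze-style statement
#    expressing the same relation in natural language.
unmasker = pipeline("fill-mask", model="bert-base-cased")
lm_predictions = unmasker("Johann Wolfgang von Goethe was born in [MASK].")
lm_answers = {p["token_str"].strip() for p in lm_predictions}

# 3) Merge both answer sets; here a plain union, purely for illustration.
print("KG answers:", kg_answers)
print("LM answers:", lm_answers)
print("Hybrid result:", kg_answers | lm_answers)
```

KnowlyBERT's actual merging is more careful than the plain union above: per the abstract, the structural and semantic information from the knowledge graph is used to keep the combined results at high precision rather than simply accepting every language model prediction.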

External Organisation(s)
Technische Universität Braunschweig
Type
Conference contribution
Pages
294-310
No. of pages
17
Publication date
2020
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Theoretical Computer Science, Computer Science (all)
Electronic version(s)
https://doi.org/10.1007/978-3-030-62419-4_17 (Access: Closed)