Learning Concept Embeddings with a Transferable Deep Neural Reasoner
Deep Learning for and with Knowledge Graphs Track · 26 min
We present a novel approach for learning embeddings of concepts from knowledge bases expressed in the ALC description logic. The embeddings reflect the semantics in such a way that the embedding of a complex concept can be computed from the embeddings of its parts using appropriate neural constructors. Embeddings for different knowledge bases are vectors in a shared vector space, shaped so that approximate subsumption checking for arbitrarily complex concepts can be performed by a single neural network shared across all the knowledge bases.
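As a rough illustration of the idea described above, the sketch below shows how compositional "neural constructors" for ALC concepts and a shared subsumption scorer might look in PyTorch. All names, dimensions, and network shapes here are assumptions for illustration, not the authors' actual architecture.

```python
import torch
import torch.nn as nn

EMB_DIM = 64  # assumed embedding dimensionality

class NegationConstructor(nn.Module):
    """Maps the embedding of C to an embedding of NOT C."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(EMB_DIM, EMB_DIM), nn.ReLU(),
                                 nn.Linear(EMB_DIM, EMB_DIM))

    def forward(self, c):
        return self.net(c)

class BinaryConstructor(nn.Module):
    """Maps the embeddings of C and D to an embedding of C AND D (or C OR D)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * EMB_DIM, EMB_DIM), nn.ReLU(),
                                 nn.Linear(EMB_DIM, EMB_DIM))

    def forward(self, c, d):
        return self.net(torch.cat([c, d], dim=-1))

class RestrictionConstructor(nn.Module):
    """Maps a role embedding and a concept embedding to an embedding of
    EXISTS r.C (or FORALL r.C)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * EMB_DIM, EMB_DIM), nn.ReLU(),
                                 nn.Linear(EMB_DIM, EMB_DIM))

    def forward(self, role, c):
        return self.net(torch.cat([role, c], dim=-1))

class SubsumptionReasoner(nn.Module):
    """Shared scorer: approximates whether C is subsumed by D, operating only
    on embeddings in the shared vector space, for any knowledge base."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * EMB_DIM, EMB_DIM), nn.ReLU(),
                                 nn.Linear(EMB_DIM, 1))

    def forward(self, c, d):
        return torch.sigmoid(self.net(torch.cat([c, d], dim=-1)))

# Usage: embed the complex concept (Person AND EXISTS hasChild.Doctor) from its
# parts, then score its subsumption against Parent (hypothetical concepts).
atomic = nn.Embedding(3, EMB_DIM)           # assumed atomic-concept embeddings
person, doctor, parent = atomic(torch.arange(3))
has_child = nn.Embedding(1, EMB_DIM)(torch.tensor(0))  # assumed role embedding

conj = BinaryConstructor()
exists = RestrictionConstructor()
reasoner = SubsumptionReasoner()

complex_concept = conj(person, exists(has_child, doctor))
print(reasoner(complex_concept, parent))    # untrained score in (0, 1)
```

The key design point the abstract describes is that only the constructor networks and the reasoner carry learned structure, so embeddings from different knowledge bases can be composed and compared by the same transferable reasoner.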