Combining Knowledge Graph and Word Embeddings for Spherical Topic Modeling

Hafsa Ennajari, Nizar Bouguila, Jamal Bentahar

Research output: Contribution to journal › Article › peer-review


Abstract

Probabilistic topic models are an effective framework for text analysis that uncovers the main topics in an unlabeled set of documents. However, the topics inferred by traditional topic models are often unclear and hard to interpret because these models do not account for the semantic structure of language. Recently, a number of topic modeling approaches have leveraged domain knowledge to enhance the quality of the learned topics, but they still assume a multinomial or Gaussian document likelihood in Euclidean space, which often results in information loss and poor performance. In this article, we propose a Bayesian embedded spherical topic model (ESTM) that combines both knowledge graph and word embeddings in a non-Euclidean curved space, the hypersphere, for better topic interpretability and more discriminative text representations. Extensive experimental results show that the proposed model successfully uncovers interpretable topics and learns high-quality text representations that are useful for common natural language processing (NLP) tasks across multiple benchmark datasets.
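For readers unfamiliar with the spherical likelihood underlying the model, the sketch below illustrates the basic ingredient: scoring a unit-norm embedding under a von Mises-Fisher (vMF) distribution, the standard density on the hypersphere. This is a minimal illustration only, not the paper's actual model or inference code; the function vmf_log_density, the 50-dimensional embeddings, and the concentration value kappa=20.0 are all illustrative assumptions.

    import numpy as np
    from scipy.special import ive  # exponentially scaled modified Bessel function

    def vmf_log_density(x, mu, kappa):
        """Log-density of the von Mises-Fisher distribution on the unit
        hypersphere S^{p-1}: f(x; mu, kappa) = C_p(kappa) * exp(kappa * mu.x).
        Both x and mu must be unit-norm vectors of dimension p."""
        p = x.shape[-1]
        # log C_p(kappa), using the scaled Bessel function ive for numerical
        # stability: log I_v(kappa) = log ive(v, kappa) + kappa
        log_norm = ((p / 2 - 1) * np.log(kappa)
                    - (p / 2) * np.log(2 * np.pi)
                    - (np.log(ive(p / 2 - 1, kappa)) + kappa))
        return log_norm + kappa * np.dot(mu, x)

    # Hypothetical usage: score a word embedding against a topic direction.
    rng = np.random.default_rng(0)
    word_vec = rng.normal(size=50)
    word_vec /= np.linalg.norm(word_vec)    # project onto the hypersphere
    topic_dir = rng.normal(size=50)
    topic_dir /= np.linalg.norm(topic_dir)  # unit mean direction mu
    print(vmf_log_density(word_vec, topic_dir, kappa=20.0))

In a vMF-based topic model, each topic is a mean direction on the hypersphere and kappa controls how tightly the topic's words concentrate around it; modeling normalized embeddings this way avoids the mismatch of placing directional data in a Gaussian or multinomial likelihood.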

Original language: British English
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOIs:
State: Accepted/In press - 2021

Keywords

  • Analytical models
  • Data models
  • Integrated circuit modeling
  • Knowledge graph (KG) embedding
  • Mathematical models
  • Probabilistic logic
  • representation learning
  • Semantics
  • Task analysis
  • topic modeling
  • von Mises-Fisher (vMF) distribution
  • word embedding

