Integration of logic and probability in inductive and terminological reasoning
BELLODI, Elena
2014
Abstract
Representing uncertain information is crucial for modeling real-world domains. This has been fully recognized both in the field of Logic Programming and in that of Description Logics (DLs), with the introduction of probabilistic logic languages (PLLs) and of various probabilistic extensions of DLs, respectively. In this work, we adopt the distribution semantics and address the problems of learning PLLs and of expressing and reasoning with probabilistic DLs. For PLLs, we present a parameter learning algorithm (EMBLEM) and two structure learning algorithms (SLIPCASE and SLIPCOVER). EMBLEM is based on the Expectation Maximization method and efficiently computes the expectations directly on the Binary Decision Diagrams (BDDs) built for inference; it is embedded in both SLIPCASE and SLIPCOVER. The algorithms were tested on real-world relational datasets and showed superior performance with respect to the state of the art in almost all cases. We then transposed both the distribution semantics and the BDD-based inference techniques to DLs, developing a probabilistic semantics (DISPONTE) and an algorithm (BUNDLE) that computes the probability of queries over DISPONTE DLs by encoding their explanations in BDDs. We show that BUNDLE is competitive with the probabilistic reasoner PRONTO on a real probabilistic ontology.
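
To illustrate the inference step mentioned in the abstract, the following is a minimal sketch (not the thesis's implementation, which relies on dedicated BDD packages) of how a query probability can be computed under the distribution semantics once the query's explanations have been encoded in a BDD. Each internal node tests one probabilistic choice with its probability of being true; the Node class, field names, and the example facts a and b are hypothetical and only serve to show the bottom-up traversal.

```python
# Illustrative sketch: probability of a query from a BDD over probabilistic choices.
# Each internal node tests one probabilistic fact with P(true) = prob; leaves are
# True (query succeeds) or False. The query probability is obtained bottom-up as
#   Prob(node) = prob * Prob(high child) + (1 - prob) * Prob(low child).

from dataclasses import dataclass


@dataclass(frozen=True)
class Node:
    var: str              # name of the probabilistic fact tested at this node
    prob: float           # probability that the fact is true
    high: "Node | bool"   # child followed when the fact is true (bool = leaf)
    low: "Node | bool"    # child followed when the fact is false


def bdd_prob(node) -> float:
    """Return the probability mass of the paths reaching the True leaf."""
    if node is True:
        return 1.0
    if node is False:
        return 0.0
    return node.prob * bdd_prob(node.high) + (1.0 - node.prob) * bdd_prob(node.low)


# Hypothetical example: the query has two explanations, {a} and {b}.
# P(query) = P(a) + (1 - P(a)) * P(b) = 0.4 + 0.6 * 0.3 = 0.58
root = Node("a", 0.4, high=True, low=Node("b", 0.3, high=True, low=False))
print(bdd_prob(root))  # 0.58
```

In a real reasoner the traversal is memoized on shared sub-diagrams, which is what makes BDD-based inference efficient; the same structure also underlies EMBLEM's computation of expectations, which are accumulated during analogous passes over the diagram.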