Learning probabilities for noisy first-order rules (1997) by D. Koller and A. Pfeffer
First-order logic is the traditional basis for knowledge representation languages. However, its applicability to many real-world tasks is limited by its inability to represent uncertainty. Bayesian belief networks, on the other hand, are inadequate for complex KR tasks due to the limited expressivity of the underlying (propositional) language. The need to incorporate uncertainty into an expressive language has led to a resurgence of work on first-order probabilistic logic. This paper addresses one of the main objections to the incorporation of probabilities into the language: "Where do the numbers come from?" We present an approach that takes a knowledge base in an expressive rule-based first-order language, and learns the probabilistic parameters associated with those rules from data cases. Our approach, which is based on algorithms for learning in traditional Bayesian networks, can handle data cases where many of the relevant aspects of the situation are unobserved. It is also capable of utilizing a rich variety of data cases, including instances with varying causal structure, and even involving a varying number of individuals. These features allow the approach to be used for a wide range of tasks, such as learning genetic propagation models or learning first-order STRIPS planning operators with uncertain effects.
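To make the learning task concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithm) of EM-style parameter estimation for a noisy-OR rule, the kind of "noise" probability attached to a rule once it is instantiated for concrete individuals. Each data case observes two candidate causes and the effect; which active cause actually triggered the effect is unobserved, which is where the E-step comes in. All names and the dataset below are illustrative assumptions.

```python
def em_noisy_or(cases, iters=50):
    """Estimate per-cause trigger probabilities p = (p1, p2) by EM.

    Each case is ((c1, c2), e): two binary observed causes and the
    binary observed effect. Each active cause independently triggers
    the effect with probability p_i; the trigger indicators are latent.
    """
    p = [0.5, 0.5]  # initial guess for the rule noise parameters
    for _ in range(iters):
        # E-step: expected trigger indicator for each cause in each case
        exp_t = [[0.0, 0.0] for _ in cases]
        for k, (c, e) in enumerate(cases):
            if e == 0:
                continue  # effect absent => no cause can have triggered it
            prob_e = 1.0 - (1 - p[0] * c[0]) * (1 - p[1] * c[1])
            if prob_e == 0:
                continue  # inconsistent case under current parameters
            for i in range(2):
                # P(trigger_i = 1 | e = 1) = p_i * c_i / P(e = 1)
                exp_t[k][i] = p[i] * c[i] / prob_e
        # M-step: expected firing fraction among cases where the cause is active
        for i in range(2):
            num = sum(exp_t[k][i] for k in range(len(cases)))
            den = sum(c[i] for c, _ in cases)
            if den > 0:
                p[i] = num / den
    return p

# Illustrative data: cause 1 yields the effect 8 times out of 10,
# cause 2 only 2 times out of 10.
cases = ([((1, 0), 1)] * 8 + [((1, 0), 0)] * 2 +
         [((0, 1), 1)] * 2 + [((0, 1), 0)] * 8)
p = em_noisy_or(cases)
```

On this toy data each case has a single active cause, so EM converges to the observed frequencies (p1 = 0.8, p2 = 0.2); the latent-trigger machinery earns its keep on cases where several causes are active at once, or where some of the relevant variables are unobserved, as the paper emphasizes.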
D. Koller and A. Pfeffer (1997). "Learning probabilities for noisy first-order rules." Proceedings of the International Joint Conference on Artificial Intelligence (pp. 1316-1321).
@inproceedings{KollerPfeffer97,
  author = "D. Koller and A. Pfeffer",
  title = "Learning probabilities for noisy first-order rules",
  booktitle = "Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI)",
  publisher = "Morgan Kaufmann",
  address = "Nagoya, Japan",
  pages = "1316--1321",
  year = "1997",
}