Representation dependence in probabilistic inference (2004), by J. Y. Halpern and D. Koller
[older version, 1995]
Non-deductive reasoning systems are often representation dependent: representing the same situation in two different ways may cause such a system to return two different answers. Some have viewed this as a significant problem. For example, the principle of maximum entropy has been subjected to much criticism due to its representation dependence. There has, however, been almost no work investigating representation dependence. In this paper, we formalize this notion and show that it is not a problem specific to maximum entropy. In fact, we show that any representation-independent probabilistic inference procedure that ignores irrelevant information is essentially entailment, in a precise sense. Moreover, we show that representation independence is incompatible with even a weak default assumption of independence. We then show that invariance under a restricted class of representation changes can form a reasonable compromise between representation independence and other desiderata, and provide a construction of a family of inference procedures that provides such restricted representation independence, using relative entropy.
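As a toy illustration (not from the paper itself) of the representation dependence the abstract describes: with no constraints, maximum entropy assigns the uniform distribution over the outcome space, so the probability assigned to an event depends on how finely the space is carved up. The outcome names below are hypothetical, chosen only to make the refinement concrete.

```python
def maxent_uniform(outcomes):
    """Unconstrained maximum-entropy distribution: uniform over the given outcomes."""
    p = 1.0 / len(outcomes)
    return {o: p for o in outcomes}

# Coarse representation: the color is either "red" or "not-red".
coarse = maxent_uniform(["red", "not-red"])

# Refined representation of the same situation: "not-red" split into "blue" and "green".
fine = maxent_uniform(["red", "blue", "green"])

print(coarse["red"])  # 0.5
print(fine["red"])    # 0.333... -- same situation, different answer
```

Merely refining the vocabulary changes the answer from 1/2 to 1/3, which is the kind of behavior the paper formalizes and shows to be unavoidable for any non-trivial inference procedure that ignores irrelevant information.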
J. Y. Halpern and D. Koller (2004). "Representation dependence in probabilistic inference." Journal of Artificial Intelligence Research, 21, 319-356.
Full version of IJCAI-95 paper.
@article{HalpernKoller04,
  author  = "J.~Y. Halpern and D. Koller",
  title   = "Representation dependence in probabilistic inference",
  journal = "Journal of Artificial Intelligence Research",
  year    = "2004",
  volume  = 21,
  pages   = "319--356",
  note    = "Full version of IJCAI-95 paper",
}