License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/DagSemProc.05381.3
URN: urn:nbn:de:0030-drops-7561
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2006/756/


Pavlin, Gregor ; Nunnink, Jan ; Groen, Frans

Robustness and Accuracy of Bayesian Information Fusion Systems

pdf-format:
05381.PavlinGregor.Paper.756.pdf (0.5 MB)


Abstract

Modern situation assessment and control applications often require efficient
fusion of large amounts of heterogeneous and uncertain information. In addition,
fusion results are often mission critical.

It turns out that Bayesian networks (BNs) are suitable for a significant class of
such applications, since they facilitate the modeling of very heterogeneous types of
uncertain information and support efficient belief propagation techniques. BNs are
based on a rigorous theory which facilitates (i) analysis of the robustness of fusion
systems and (ii) monitoring of the fusion quality.
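
To make the fusion mechanism concrete, here is a minimal Python sketch (our own
illustration, not taken from the paper): several conditionally independent,
heterogeneous observations about one hidden binary event are combined with
Bayes' rule; the prior and likelihood values are invented for the example.

def fuse(prior, likelihoods):
    """Posterior P(H=1 | evidence) for conditionally independent evidence.

    prior       -- prior probability P(H=1)
    likelihoods -- list of (P(e_i | H=1), P(e_i | H=0)) pairs, one per observation
    """
    p1, p0 = prior, 1.0 - prior
    for l1, l0 in likelihoods:
        p1 *= l1
        p0 *= l0
    return p1 / (p1 + p0)

# Three heterogeneous sensors reporting evidence about the same event.
evidence = [(0.9, 0.2), (0.7, 0.4), (0.8, 0.3)]
print(fuse(prior=0.5, likelihoods=evidence))  # ~0.95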

We assume domains where situations can be described through sets of discrete random
variables. A situation corresponds to a set of hidden and observed states that
nature `sampled' from some true distribution over the combinations of possible states.
Thus, in a particular situation certain states materialized while others did not, which
corresponds to a point-mass distribution over the possible states. Consequently,
state estimation can be reduced to a classification of the possible combinations of
relevant states. We assume that there exist mappings between hidden states of interest
and optimal decisions/actions.

In this context, we consider a classification of the states accurate if it is equivalent
to the truth in the sense that knowing the truth would not change the action based
on the classification. Clearly, BNs provide a mapping between the observed symptoms
and hypotheses about hidden events; consequently, BNs have a critical impact on
fusion accuracy.
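
This decision-equivalence notion of accuracy can be illustrated with a small
sketch; the state names and the action mapping below are hypothetical.

ACTION = {"fire": "evacuate", "smouldering": "evacuate", "normal": "ignore"}

def map_class(posterior):
    # Most probable hidden state (MAP classification).
    return max(posterior, key=posterior.get)

def decision_accurate(posterior, true_state):
    # Accurate in the decision sense: knowing the truth would not change
    # the action taken on the basis of the classification.
    return ACTION[map_class(posterior)] == ACTION[true_state]

posterior = {"fire": 0.2, "smouldering": 0.5, "normal": 0.3}
print(decision_accurate(posterior, true_state="fire"))  # True: both map to "evacuate"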

We emphasize a fundamental difference between model accuracy and fusion
(i.e., classification) accuracy. A BN is a generalization over many possible situations
that captures probability distributions over the possible events in the observed
domain. However, even a perfect generalization does not necessarily support accurate
classification in a particular situation. We address this problem with the help of the
Inference Meta Model (IMM), which describes information fusion in BNs from a coarse,
runtime perspective.
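
The distinction can be seen in a tiny numerical example (illustrative numbers,
not the paper's): even when the model equals the true generating distribution
exactly, an individual situation sampled from that distribution can still be
misclassified.

# The model below is a perfect generalization: it equals the true distribution.
p_h1 = 0.5                       # P(H=1), true and modeled
p_e1_given_h = {1: 0.7, 0: 0.3}  # P(e=1 | H), true and modeled

# In this particular situation nature sampled H=1 but produced the less
# likely observation e=0.
h, e = 1, 0

num = p_h1 * (p_e1_given_h[1] if e else 1 - p_e1_given_h[1])
den = num + (1 - p_h1) * (p_e1_given_h[0] if e else 1 - p_e1_given_h[0])
posterior_h1 = num / den         # 0.3
map_estimate = 1 if posterior_h1 > 0.5 else 0
print(map_estimate == h)         # False: perfect model, inaccurate classification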

The IMM is based on a few realistic assumptions and exposes properties of BNs that are
relevant for the construction of inherently robust fusion systems. With the help of the IMM
we show that, in BNs featuring many conditionally independent network fragments, inference
can be very insensitive to the modeling parameter values. This implies that fusion can be
robust, which is especially relevant in many real-world applications where we cannot obtain
precise models due to a lack of sufficient training data or expertise. In addition, the
IMM introduces a reinforcement propagation algorithm that can be used as an alternative
to the common approaches to inference in BNs. We show that the classification accuracy
of this propagation algorithm asymptotically approaches 1 as the number of conditionally
independent network fragments increases. Because of these properties, the propagation
algorithm can be used as a basis for effective detection of misleading fusion results
as well as discovery of inadequate modeling components and erroneous information sources.
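
The asymptotic trend can be checked numerically with a simple Monte Carlo sketch
(our own illustration of the claimed behaviour, not the IMM or its reinforcement
propagation algorithm): even with deliberately mis-specified likelihoods, MAP
classification over a growing number of conditionally independent fragments
approaches perfect accuracy.

import math
import random

def accuracy(n_fragments, true_lik=0.7, model_lik=0.6, trials=2000):
    # Fraction of trials in which naive Bayesian fusion of n_fragments
    # conditionally independent observations recovers the true hidden state,
    # using a model likelihood that differs from the true one.
    correct = 0
    for _ in range(trials):
        h = random.randint(0, 1)              # hidden state sampled by "nature"
        log_odds = 0.0                        # log P(H=1|e)/P(H=0|e), uniform prior
        for _ in range(n_fragments):
            e = h if random.random() < true_lik else 1 - h   # noisy observation of h
            ratio = model_lik / (1 - model_lik) if e == 1 else (1 - model_lik) / model_lik
            log_odds += math.log(ratio)       # fuse with the mis-specified model
        correct += int((log_odds > 0) == (h == 1))
    return correct / trials

for n in (1, 5, 20, 50):
    print(n, accuracy(n))   # climbs toward 1.0 despite the wrong parameter values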

BibTeX - Entry

@InProceedings{pavlin_et_al:DagSemProc.05381.3,
  author =	{Pavlin, Gregor and Nunnink, Jan and Groen, Frans},
  title =	{{Robustness and Accuracy of Bayesian Information Fusion Systems}},
  booktitle =	{Form and Content in Sensor Networks},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5381},
  editor =	{Leonidas Guibas and Uwe D. Hanebeck and Thomas C. Henderson},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2006/756},
  URN =		{urn:nbn:de:0030-drops-7561},
  doi =		{10.4230/DagSemProc.05381.3},
  annote =	{Keywords: Robust Information Fusion, Bayesian Networks, Heterogeneous Information, Modeling Uncertainties}
}

Keywords: Robust Information Fusion, Bayesian Networks, Heterogeneous Information, Modeling Uncertainties
Collection: 05381 - Form and Content in Sensor Networks
Issue Date: 2006
Date of publication: 02.11.2006

