Ebook: Data Fusion for Situation Monitoring, Incident Detection, Alert and Response Management
Data Fusion is a very broad interdisciplinary technology domain. It provides techniques and methods for integrating information from multiple sources and using the complementarity of these sources to derive maximum information about the phenomenon being observed; for analyzing this information, deriving its meaning and predicting possible consequences of the observed state of the environment; for selecting the best course of action; and for controlling the actions. Here, the focus is on the more mature phase of data fusion, namely the detection and identification/classification of observed phenomena and the exploitation of the related methods for Security-Related Civil Science and Technology (SST) applications. To this end, the book expands on the data fusion methodology pertinent to Situation Monitoring, Incident Detection, Alert and Response Management; discusses related Cognitive Engineering and visualization issues; provides insight into the architectures and methodologies for building a data fusion system; discusses fusion approaches to image exploitation with emphasis on security applications; discusses novel distributed tracking approaches as a necessary step of situation monitoring and incident detection; and provides examples of real situations in which data fusion can enhance incident detection, prevention and response capability. To give a logical presentation of the data fusion material, the general concepts are covered first (Fusion Methodology, Human Computer Interactions, and Systems and Architectures), followed by several application areas (Data Fusion for Imagery, Tracking and Sensor Fusion, and Applications and Opportunities for Fusion).
An Advanced Study Institute (ASI) on “Data Fusion for Situation Monitoring, Incident Detection, Alert and Response Management” was held at Yerevan State University's Narek Hotel in Tsakhkadzor, Armenia, 19–29 August 2003. This ASI continued the exploration of the relatively young (less than 20 years old) discipline called Data Fusion, following the Multisensor Data Fusion ASI held in Pitlochry, Scotland, UK, 25 June – 7 July 2000. This publication is the Proceedings of the Institute.
An ASI is a high-level tutorial activity, one of many types of funded group support mechanisms established by the NATO Science Committee in support of the dissemination of knowledge and the formation of international scientific contacts. The NATO Science Committee was approved at a meeting of the Heads of Government of the Alliance in December 1957, subsequent to the 1956 recommendation of the “Three Wise Men” – Foreign Ministers Lange (Norway), Martino (Italy) and Pearson (Canada) – on Non-Military Cooperation in NATO. The NATO Science Committee established the NATO Science Programme in 1958 to encourage and support scientific collaboration between individual scientists and to foster scientific development in its member states. In 1999, following the end of the Cold War, the Science Programme was transformed so that support is now devoted to collaboration between Partner-country and NATO-country scientists or to contributing towards research support in Partner countries. In 2004, the Science Programme was further modified to focus exclusively on NATO Priority Research Topics (i.e., Defense Against Terrorism or Countering Other Threats to Security) and, preferably, on a Partner-country priority area.
Data Fusion is a very broad interdisciplinary technology domain. It provides techniques and methods for:
1. integrating information from multiple sources and using the complementarity of these sources to derive maximum information about the phenomenon being observed;
2. analyzing and deriving the meaning of this information and predicting possible consequences of the observed state of the environment;
3. selecting the best course of action; and
4. controlling the actions.
The Data Fusion ASI in Pitlochry provided a systematic high-level view of data fusion fundamental theory and the enabling technologies and presented a set of applications in an accessible manner. In that ASI, more emphasis was put on the first, more mature phase of data fusion, namely the detection and identification/classification of phenomena being observed and exploitation of the related methods for Security-Related Civil Science and Technology (SST) applications. The organizers felt that in this ASI it was necessary to:
– expand on the data fusion methodology pertinent to Situation Monitoring, Incident Detection, Alert and Response Management;
– discuss some related Cognitive Engineering and visualization issues;
– provide an insight into the architectures and methodologies for building a data fusion system;
– discuss fusion approaches to image exploitation with emphasis on security applications;
– discuss novel distributed tracking approaches as a necessary step of situation monitoring and incident detection;
– provide examples of real situations in which data fusion can enhance incident detection, prevention and response capability.
The theme of the Institute was scientific communication and exchange of ideas among academic, industrial, and government laboratory groups having a common interest in the development of information fusion based approaches to detection and prevention of safety and security risks.
The technical program was conceived to highlight general concepts (Fusion Methodology, Human Computer Interactions and Systems and Architectures) in the first week and applications (Data Fusion for Imagery, Tracking and Sensor Fusion and Applications and Opportunities for Fusion) in the second week, thus ensuring that the attendees were given a logical presentation of the data fusion material. In addition, at the request of some students as well as lecturer participants, a short tutorial on the Introduction to Data Fusion was repeated from the previous Data Fusion ASI in Pitlochry, on one of the evenings during the first week.
The Organizing Committee and the Directors encouraged informal discussion sessions, which were especially useful because of the interdisciplinary nature of the topics discussed, as well as the fact that most students did not have a formal education in Data Fusion, since the discipline is not part of the curricula of most universities in the world.
Fifty-nine participants and lecturers representing Armenia, Belgium, Bulgaria, Canada, Czech Republic, France, Norway, Portugal, Israel, Italy, Spain, Russia, Ukraine, and the United States attended the Institute. A distinguished faculty of lecturers was assembled and the technical program was organized with the generous and very capable assistance of the Organizing Committee composed of Prof. Ashot Akhperjanian (Armenia), Dr. Elisa Shahbazian (Canada), Dr. Eloi Bosse (Canada), Dr. Galina Rogova (USA), and Dr. Pierre Valin (Canada).
The value to be gained from any ASI depends on the faculty – the lecturers who devote so much of their time and talents to make an Institute successful. As the reader of these proceedings will see, this ASI was particularly honored with an exceptional group of lecturers to whom the organizers and participants offer their deep appreciation.
Due to the broad interdisciplinary nature of the subject, the editors of this volume were faced with difficult decisions, such as:
– considering the technical continuity more important than keeping lectures by the same author together, since some lecturers chose to talk on very disparate topics;
– accepting some redundancy of discussion between lectures, when the applications are different;
– accepting some minor differing interpretations of the data fusion model and taxonomy, since these are still debatable topics in the data fusion community; and
– choosing to include some lectures on applications related to Situation Monitoring, Incident Detection, Alert and Response Management that did not discuss fusion methodology but presented a great challenge for fusion applications.
We are grateful to a number of organizations for providing the financial assistance that made the Institute possible. Foremost among these is the NATO Security Through Science Programme, which provided not only important financial support for the Institute, but equally valuable organizational and moral support, both at the beginning, when the ASI was proposed, and during the Institute, through the participation of the NATO observer Dr. Steinar Hoibraten from Norway. In addition, the following sources made significant contributions: the U.S. Office of Naval Research, Defense Research and Development Canada in Valcartier, the Centre of Research in Mathematics of the University of Montreal, Canada, and Lockheed Martin Canada.
We would like to thank the management of Yerevan State University for allocating the Narek hotel for the ASI and ensuring that all the requirements of the ASI were fulfilled, and the staff of the Narek hotel for a truly enjoyable and memorable two weeks in Tsakhkadzor. We would like to thank the Yerevan Physics Institute for organizing an unforgettable picnic at their facility on the shores of mountainous Lake Sevan. We would also like to thank our local site manager, Karine Gasparian, for her dedicated efforts to address all communication, housing, transportation and entertainment requirements of the ASI participants, so that the Organizing Committee was able to fully concentrate on the technical program issues. Our special thanks go to our administrative assistants and interpreters Anna Galstyan, Artsvik Gevorkian, Armen Malkhasyan, Gayane Malkhasyan and Alana Mark, whose competence and warm friendliness made all the attendees feel welcomed at the ASI and comfortable in Armenia.
A very special acknowledgement goes to Ani Shahbazian, who undertook the very challenging task of English-language editing of all the lecturers' manuscripts, re-formatting all lectures after the technical editing was complete, and producing a camera-ready document for IOS Press. Thank you for your long hours and hard work.
And, finally, all of our thanks go to the people of Armenia, who certainly displayed, in every way, their warmth and hospitality.
Elisa Shahbazian, Montreal, Canada; Galina Rogova, Rochester, USA; Pierre Valin, Valcartier, Canada
December 2004
Humans explore their world by selecting sensor observations, fusing data, and deciding how to act. For the most part, the goal of sensory data fusion is to increase confidence in entity identification and location. Since the 4th century B.C., the fusion process has been acknowledged through sensed data association. However, in the past century, probability and statistics formalized the association problem with mathematical correlation. Together, association and correlation aid in reducing uncertainty for decision making. Fusion can have a variety of meanings, such as data fusion, sensor fusion, and information fusion. Thus, it is important to explore terminology. Data fusion is the correlation of raw data, whereas sensor fusion is the multimodal integration of transduced data into a common perception. Information fusion (IF) is the collection of sensory and knowledge data to create a unique understanding upon which a user performs response management. IF for decision making includes the situational context, sensor control, and the user-machine interaction. In this chapter, we will highlight many of the key aspects of fusion research to give the reader a flavor of what IF can and cannot do to aid a user. This chapter will discuss the fundamentals of information fusion: (1) who: decision makers and machines; (2) what: data, sensor, and information; (3) where: applications; (4) when: appropriate situations; (5) why: to reduce uncertainty and increase confidence; and (6) how: techniques. What you gain from reading this chapter is a perspective on how to integrate the fusion-machine and the cognitive-user, a taxonomy of information fusion algorithms, and an appreciation of the benefits and limitations of fusion designs. Sensor and data fusion books are useful for the reader who is unfamiliar with basic concepts. This chapter is not intended to repeat the explorations of these texts, but to summarize important aspects of IF relative to “cognitive fusion,” which integrates the user with the fusion machine for decision making.
In the field of pattern recognition, fusion of multiple classifiers is currently used for solving difficult recognition tasks and designing high-performance systems. This chapter is aimed at providing the reader with a gentle introduction to this fertile area of research. We open the chapter with a discussion of the motivations for the use of classifier fusion, and outline basic concepts of multiple classifier systems. The main concepts behind the methods and algorithms for creating and fusing multiple pattern classifiers are reviewed. The chapter closes with a critical discussion of current achievements and open issues.
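As a concrete illustration of the simplest fixed combination rules discussed in the multiple-classifier literature (sum, product and majority vote), the following minimal Python sketch fuses the class posteriors of three hypothetical classifiers; the classifiers, classes and numbers are illustrative assumptions, not taken from the chapter.

```python
# Minimal sketch of classifier fusion by fixed combination rules (sum, product,
# majority vote). The posteriors below stand in for three hypothetical
# classifiers scoring one input over three classes; numbers are illustrative.
import numpy as np

def fuse_sum(posteriors):
    """Sum rule: average the class posteriors of all classifiers."""
    return np.mean(posteriors, axis=0)

def fuse_product(posteriors):
    """Product rule: multiply posteriors, then renormalise."""
    p = np.prod(posteriors, axis=0)
    return p / p.sum()

def fuse_majority(posteriors):
    """Majority vote: each classifier votes for its most probable class."""
    votes = np.argmax(posteriors, axis=1)
    return np.bincount(votes, minlength=posteriors.shape[1]) / len(votes)

# Posteriors over 3 classes from 3 hypothetical classifiers for one sample.
P = np.array([[0.6, 0.3, 0.1],
              [0.5, 0.4, 0.1],
              [0.2, 0.7, 0.1]])

for rule in (fuse_sum, fuse_product, fuse_majority):
    print(rule.__name__, rule(P), "->", int(np.argmax(rule(P))))
```

In practice the choice between such fixed rules and trained combiners depends on how correlated the individual classifiers are and on how well their outputs approximate true posterior probabilities.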
Considerable concern has arisen regarding the quality of intelligence analysis. This has been in large part motivated by the task, prior to the Iraq war, of determining whether Iraq had weapons of mass destruction. One problem that made this analysis difficult was the uncertainty in much of the information available to the intelligence analysts. In this work we introduce some tools that can be of use to intelligence analysts for representing and processing uncertain information. We make considerable use of technologies based on fuzzy sets and related disciplines such as approximate reasoning.
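As a minimal, hedged sketch of the kind of representation fuzzy sets provide (not the chapter's own tools), the following fragment defines a trapezoidal membership function for a linguistic value and combines memberships with the standard min/max/complement connectives; all numbers are illustrative.

```python
# Minimal sketch of representing an uncertain quantity with fuzzy sets:
# trapezoidal membership functions for two linguistic values, combined with
# the standard fuzzy connectives (min for AND, max for OR, 1-x for NOT).
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function with support [a, d] and core [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Illustrative memberships for one observed quantity (x = 40, arbitrary units).
mu_small = trapezoid(40.0, 0, 0, 20, 50)      # degree to which 40 is "small"
mu_large = trapezoid(40.0, 30, 60, 90, 120)   # degree to which 40 is "large"

print("small AND large:", min(mu_small, mu_large))
print("small OR large :", max(mu_small, mu_large))
print("NOT small      :", 1.0 - mu_small)
```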
Situation Analysis, for which information fusion is a key enabler, has to deal with knowledge and uncertainty. This paper discusses the key notions of knowledge, belief and uncertainty in relation to information fusion. The aim is not to provide a theory of some sort but to help the information fusion practitioner to navigate and see the links among the numerous mathematical and logical models/tools that are available to process uncertainty-based information and knowledge.
An important step in the intelligence gathering process is the fusion of information provided by several sources. The objective of this process is to build an up-to-date and correct picture of the current situation from all the available information in order to make adequate decisions. In the framework of intelligence, as opposed to the air defense domain for example, there is no real automatic process and the fusion of information is performed by human operators. STANAG 2022 proposes a methodology for the evaluation of information in this intelligence framework and for human processing. However, given the increase in the amount of information that the human operator must process, some automatic processing is being considered. Therefore, there is a need for a formal methodology for the evaluation of information. In this paper, we present some considerations about the correspondence between the STANAG 2022 information evaluation methodology and a formal and mathematical evaluation approach that can be carried out by an automatic process.
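For readers unfamiliar with the STANAG 2022 scheme, a report is rated with a letter for source reliability (A–F) and a digit for information credibility (1–6). The sketch below shows one possible, purely illustrative numeric reading of such a rating; the numeric values and the product combination are assumptions made for illustration and are not part of the standard or of the formal approach proposed in the paper.

```python
# Hedged sketch: one possible numeric reading of the STANAG 2022 alphanumeric
# rating (source reliability A-F, information credibility 1-6). The numeric
# values and the product combination below are illustrative assumptions only.
RELIABILITY = {"A": 1.0, "B": 0.8, "C": 0.6, "D": 0.4, "E": 0.2, "F": None}  # F: cannot be judged
CREDIBILITY = {1: 1.0, 2: 0.8, 3: 0.6, 4: 0.4, 5: 0.2, 6: None}             # 6: cannot be judged

def evaluate(rating):
    """Map a rating such as 'B2' to a single numeric weight, or None if unjudgeable."""
    r, c = RELIABILITY[rating[0]], CREDIBILITY[int(rating[1])]
    if r is None or c is None:
        return None
    return r * c   # one simple (assumed) way to combine the two dimensions

print(evaluate("B2"))  # 0.64
print(evaluate("F3"))  # None: source reliability cannot be judged
```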
Information fusion from dissimilar sensors is best performed through extraction of attributes that can be measured by each of these sensors. In this way both imaging sensors (Synthetic Aperture Radar (SAR) and Forward Looking Infra-Red (FLIR)) and non-imaging sensors (Identification Friend or Foe (IFF), Electronic Support Measures (ESM), radar) can be treated on an equal footing. In order to properly identify the target platform through repeated fusion of identity declarations, the attributes measured must be correlated with known platforms through a comprehensive a priori platform database. This comprehensive database is carefully analyzed for attributes that can be provided by sensors and additional knowledge that can be interpreted at all levels of fusion. The identity (ID) of the target platform can be provided in a hierarchical “tree” form where leaves are unique IDs but branch nodes correspond to a taxonomy obeying certain standards. In some cases, precise attribute measurement is either impractical or of moot value, so fuzzification is performed through appropriate membership functions. The actual identification of tracked ships is performed by an algorithm utilizing the Dempster-Shafer theory of evidence. The algorithm can mathematically handle conflict, which may appear as the result of countermeasures and/or poor associations, and ignorance, which may be present in cases where sensors provide ambiguous or hard-to-interpret results. Since each imaging sensor has its own measurement potential, customized classifier solutions must be designed for optimal performance. A series of FLIR classifiers is presented and fused through a neural net fuser, while a hierarchical SAR classifier is shown to perform well for combatant ships, which are most likely to be imaged by the SAR. The complete fusion solution is demonstrated in a series of realistic scenarios involving both friendly and enemy ships.
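The core of such an identity-fusion algorithm is Dempster's rule of combination. A minimal sketch is given below for a small, assumed frame of discernment; the platform classes, sensor declarations and mass values are illustrative only, and conflict is handled by the classical renormalization.

```python
# Minimal sketch of Dempster's rule of combination over a small frame of
# discernment; platform names and masses are illustrative. Masses are assigned
# to subsets (frozensets), and the conflict mass is renormalised away.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments (dict: frozenset -> mass)."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass committed to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

FRIGATE, DESTROYER, MERCHANT = "frigate", "destroyer", "merchant"
theta = frozenset({FRIGATE, DESTROYER, MERCHANT})       # frame of discernment

m_esm  = {frozenset({FRIGATE, DESTROYER}): 0.7, theta: 0.3}   # ESM: "combatant"
m_flir = {frozenset({FRIGATE}): 0.6, theta: 0.4}              # FLIR classifier

print(dempster_combine(m_esm, m_flir))
```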
A discussion of a fusion problem in multi-agent systems for time-critical decision making is presented. The focus is on the problem of distributed learning for the classification of observations representing states of an uncertain environment into several hypotheses. Special attention is devoted to reinforcement learning in a homogeneous non-communicating multi-agent system for time-critical decision making. A system in which an agent network processes observational data and outputs beliefs to a fusion center module is considered. Belief theory serves as the analytic framework for computing these beliefs and composing them over time and over the set of agents. The agents are modeled using evidential neural networks, whose weights reflect the state of learning of the agents. Training of the network is guided by reinforcements received from the environment as decisions are made. Two different sequential decision making mechanisms are discussed: the first one is based on a “pignistic ratio test” and the second one is based on “the value of information criterion,” providing for learning utilities. Results are shown for the test case of recognition of naval vessels from FLIR image data.
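To make the decision mechanisms more concrete, the sketch below implements the standard pignistic transformation, which shares the mass of each focal set equally among its elements, and uses a simple ratio between the two best hypotheses as a stand-in for a "pignistic ratio test"; the masses, hypothesis names and threshold are assumptions made for illustration, not the chapter's test.

```python
# Hedged sketch of the pignistic transformation used for decision making from
# belief functions, plus a simple ratio test between the two best hypotheses.
def pignistic(m):
    """m: dict mapping frozenset of hypotheses -> mass; returns BetP per hypothesis."""
    empty_mass = m.get(frozenset(), 0.0)
    betp = {}
    for focal, mass in m.items():
        if not focal:
            continue
        share = mass / (len(focal) * (1.0 - empty_mass))
        for hypothesis in focal:
            betp[hypothesis] = betp.get(hypothesis, 0.0) + share
    return betp

m = {frozenset({"frigate"}): 0.5,
     frozenset({"frigate", "destroyer"}): 0.3,
     frozenset({"frigate", "destroyer", "merchant"}): 0.2}

betp = pignistic(m)
ranked = sorted(betp.items(), key=lambda kv: -kv[1])
ratio = ranked[0][1] / ranked[1][1]
print(betp, "decide now" if ratio > 2.0 else "keep observing")  # threshold assumed
```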
Active robotic sensing is a large field aimed at providing robotic systems with tools and methods for decision making under uncertainty, e.g. in a changing environment and with a lack of sufficient information. Active sensing (AS) incorporates the following aspects: (i) where to position sensors, and (ii) how to make decisions on subsequent actions in order to extract maximum information from the sensor data and minimize costs such as travel time and energy. We concentrate on the second aspect: “where should the robot move at the next time step?” and present AS in a probabilistic decision-theoretic framework. The AS problem is formulated as a constrained optimization with a multi-objective criterion combining an information gain and a cost term with respect to generated actions. Solutions for AS of autonomous mobile robots are given, illustrating the framework.
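A minimal sketch of this decision-theoretic framing is given below: each candidate action is scored by expected information gain (reduction of the entropy of a discrete location belief) minus a weighted movement cost, and the best-scoring action is selected. The set of locations, measurement likelihoods, costs and trade-off weight are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of "where should the robot move next?" as a multi-objective
# choice: expected information gain minus a weighted action cost.
import math

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

def expected_posterior_entropy(prior, likelihoods):
    """Average posterior entropy over the possible measurements of an action."""
    h = 0.0
    for lik in likelihoods:                       # one likelihood row per measurement
        joint = [l * pr for l, pr in zip(lik, prior)]
        pz = sum(joint)
        if pz > 0.0:
            h += pz * entropy([j / pz for j in joint])
    return h

prior = [0.5, 0.3, 0.2]                           # belief over 3 target locations
actions = {
    "stay":       {"cost": 0.0, "lik": [[0.4, 0.3, 0.3], [0.6, 0.7, 0.7]]},
    "move_left":  {"cost": 1.0, "lik": [[0.9, 0.1, 0.1], [0.1, 0.9, 0.9]]},
    "move_right": {"cost": 2.0, "lik": [[0.1, 0.1, 0.9], [0.9, 0.9, 0.1]]},
}
weight = 0.05                                     # trade-off between information and cost
best = max(actions, key=lambda a: entropy(prior)
           - expected_posterior_entropy(prior, actions[a]["lik"])
           - weight * actions[a]["cost"])
print("next action:", best)
```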
This paper introduces a novel Genetic Algorithm (GA) for time efficient calculation of a solution to a resource management (RM) problem in the context of naval warfare. The novelty resides in the introduction of a new operator to correct the behavior observed in Steady State Genetic Algorithms (SSGA). The SSGA model differs from the traditional model in that it simulates the dynamics of a population reproducing in a semi-random way. It has been observed that genetic diversity is lost within a few generations when an SSGA is implemented using a small population [5]. The main purpose of the new operator is the diversification of a population. Its performance is evaluated according to a measure of a population's diversity (entropy). The RM problem is also examined in detail; it is formulated as a non-linear optimization problem. The GA has been implemented using a proprietary data-driven multi-agent system, developed by Lockheed Martin Canada. The advantage of this novel GA over previous methods (TABU search) has been empirically confirmed by extensive simulations.
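The sketch below illustrates the underlying idea in a hedged way: population diversity is measured as the mean per-gene entropy of a bit-string population, and random individuals are re-injected when diversity falls below a threshold. This is a generic diversification heuristic shown for illustration, not the specific operator introduced in the paper; the threshold and replacement fraction are arbitrary choices.

```python
# Hedged sketch of the diversity issue in a small-population steady-state GA:
# diversity is the mean per-gene Shannon entropy of a bit-string population,
# and random individuals are re-injected when it drops below a threshold.
import math, random

def population_entropy(pop):
    """Mean entropy (bits) of each gene position across the population."""
    n, length = len(pop), len(pop[0])
    total = 0.0
    for j in range(length):
        p1 = sum(ind[j] for ind in pop) / n
        for p in (p1, 1.0 - p1):
            if p > 0.0:
                total -= p * math.log2(p)
    return total / length

def diversify(pop, threshold=0.5, fraction=0.25, rng=random):
    """Replace a fraction of the population with random individuals whenever
    the entropy-based diversity measure falls below the threshold."""
    if population_entropy(pop) < threshold:
        length = len(pop[0])
        for i in rng.sample(range(len(pop)), max(1, int(fraction * len(pop)))):
            pop[i] = [rng.randint(0, 1) for _ in range(length)]
    return pop

random.seed(0)
population = [[1, 0, 1, 1, 0, 1, 0, 1]] * 20        # a fully converged population
print("diversity before:", round(population_entropy(population), 3))
print("diversity after :", round(population_entropy(diversify(population)), 3))
```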
We present some data fusion procedures based on the use of partial differential equations, optimization problems and ideas drawn from the calculus of variations. We present two real situations in which these fusion procedures have been successfully applied: the fusion of remotely sensed SAR and optical images, and the fusion of multifrequency electromagnetic scattering data obtained in a laboratory experiment. Some examples of fusion results obtained with real data are shown at the end of the paper.
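As a hedged, one-dimensional illustration of variational fusion (not the procedures of the paper), the sketch below fuses two noisy observations of the same signal by gradient descent on a quadratic energy with two data-fidelity terms and a smoothness penalty; the signals, noise levels and weights are made up.

```python
# Hedged 1-D sketch of variational data fusion: minimise
# E(u) = sum (u - f1)^2 + lam*(u - f2)^2 + mu*|grad u|^2 by gradient descent.
import numpy as np

def fuse_variational(f1, f2, lam=1.0, mu=4.0, steps=2000, lr=0.05):
    u = 0.5 * (f1 + f2)                          # start from the plain average
    for _ in range(steps):
        grad = 2.0 * (u - f1) + 2.0 * lam * (u - f2)
        lap = np.zeros_like(u)                   # discrete Laplacian from the smoothness term
        lap[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]
        grad -= 2.0 * mu * lap
        u -= lr * grad
    return u

x = np.linspace(0.0, 1.0, 50)
truth = np.sin(2.0 * np.pi * x)
rng = np.random.default_rng(0)
f1 = truth + 0.3 * rng.standard_normal(x.size)   # e.g. a noisier SAR-like channel
f2 = truth + 0.1 * rng.standard_normal(x.size)   # e.g. a cleaner optical channel
u = fuse_variational(f1, f2)
print("rmse average:", np.sqrt(np.mean((0.5 * (f1 + f2) - truth) ** 2)))
print("rmse fused  :", np.sqrt(np.mean((u - truth) ** 2)))
```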
Procedures for identifying the probability distributions of K (≥ 1) random objects, each having one distribution from a known set of M distributions, are studied. K sequences of discrete independent random variables represent the results of N observations of each of these objects. The exponential decrease of test error probabilities is considered. The reliability matrices of logarithmically asymptotically optimal tests are investigated for several models. These models are determined by conditions of dependence or independence of the objects and by the formulation of the identification problem. The optimal subsets of reliabilities that may be specified beforehand, and the conditions under which all reliabilities are positive, are investigated.
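For orientation, "reliability" here is the error exponent of the information-theoretic hypothesis-testing literature; with notation assumed for illustration (not necessarily the chapter's), the reliability of accepting distribution l when distribution m is the true one is

\[
E_{l|m} \;=\; \lim_{N \to \infty} \left( -\frac{1}{N} \log \alpha_{l|m}(N) \right),
\]

where \( \alpha_{l|m}(N) \) is the corresponding error probability after N observations, and the reliability matrix collects the exponents \( E_{l|m} \) over all pairs of hypotheses.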
Command and control can be characterized as a dynamic human decision making process. A technological perspective of command and control has led system designers to propose solutions such as decision support and information fusion to overcome many of the domain problems. This, together with a lack of knowledge in cognitive engineering, has in the past jeopardized the design of helpful computerized aids aimed at complementing and supporting human cognitive tasks. Moreover, this lack of knowledge has often created new problems of trust in the designed tools, as well as human-in-the-loop concerns. Solving the command and control problem requires balancing the human factors perspective with that of the system designer and coordinating the efforts in designing a cognitively fitted system to support decision-makers. This paper discusses critical issues in the design of computer aids by which the decision-maker can better understand the situation in his area of operations, select a course of action, issue intent and orders, monitor the execution of operations and evaluate the results. These aids will help decision-makers cope with uncertainty and disorder in warfare and exploit people or technology at critical times and places so as to ensure success in operations.
The modality integration issue is addressed with the example of a system that aims at enabling users to combine speech and 2D gestures when interacting with life-like characters in an educational game context. The use of combined speech input, 2D gesture and environment entities for user-system interaction is investigated and presented in a preliminary and limited fashion.
Recently, the development of complex sensor networks has received considerable attention from the scientific community and industry, and many such systems have been implemented in different applications. Sensor networks consist of homogeneous or heterogeneous sensors spread over a global surveillance volume, acting jointly to optimally solve required tasks. Multiple target tracking is one of the most essential requirements for these systems; it is used to interpret an environment that includes both true targets and false alarms simultaneously. In spite of the great interest centered on fully automatic sensor data processing, the man-machine interface is nevertheless vital in such complex scenarios. This paper concerns spatio-temporal sensor data visualization and analysis for such complex systems, with a focus on components and data flow in sensor systems. Simplified examples illustrate the use of a developed graphical user interface and programming package for radar data processing.
Methods are presented for the cost-effective development and integration of multi-sensor data fusion technology. The key new insight is in formulating the system engineering process as a resource management problem. Effectively, system engineering is formulated as a problem of planning under uncertainty, where the probability of outcome of various actions, the utility of such outcomes and the cost of actions are posed as time-varying functions. The fusion-specific problem is that of decomposing system operational and support requirements into requirements for a network of processing nodes, involving functions for data alignment, data association and state estimation.
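A minimal decision-theoretic sketch of this framing is given below: each candidate engineering action has time-varying outcome probabilities, utilities and costs, and the action with the highest expected net value at time t is chosen. The actions, numbers and functional forms are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of system engineering posed as planning under uncertainty:
# choose the action whose expected outcome utility minus cost is highest at
# time t. Probabilities, utilities and costs are illustrative and time-varying.
def expected_net_value(action, t):
    outcomes = action["outcomes"](t)              # list of (probability, utility) pairs
    return sum(p * u for p, u in outcomes) - action["cost"](t)

actions = {
    "reuse_existing_tracker": {
        "outcomes": lambda t: [(0.9, 6.0), (0.1, 1.0)],
        "cost": lambda t: 2.0,
    },
    "develop_new_association_node": {
        "outcomes": lambda t: [(0.6, 10.0), (0.4, 2.0)],
        "cost": lambda t: 5.0 + 0.1 * t,          # cost grows as the schedule slips
    },
}

t = 12
best = max(actions, key=lambda name: expected_net_value(actions[name], t))
print(best, {n: round(expected_net_value(a, t), 2) for n, a in actions.items()})
```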
Distributed Sensor Networks have evolved from the early networks of sensors coupled with processing elements to wireless networks of resource-constrained embedded devices. Such networks usher in new paradigms for computation, control and communication. Data Fusion is an important application for Distributed Sensor Networks as it facilitates the synthesis of new information by integrating data from multiple sensors in a deterministic, time-critical and reliable manner. In general, sensors are used either in complementary, competitive, or collaborative modes. The mode of the sensors forces a consideration of architectural issues. In this paper, we explore the landscape of architectures for Distributed Sensor Networks and identify the critical elements that are essential for Data Fusion.
Current trends require the use of a global information environment, including end-users and loosely coupled knowledge sources (experts, knowledge bases, repositories, etc.), for decision making. This leads to an expansion of e-applications dealing with knowledge storage on the Internet and based on the intensive use of WWW technologies and standards such as XML, RDF, DAML, etc. The vast diversity of knowledge management tools has made the problem of knowledge fusion (KF) from distributed sources crucial. The above necessitates the development of a KF approach to complex operations management (global understanding of ongoing processes, global knowledge exchange, etc.). The presentation discusses a Knowledge Source Network (KSNet) configuration approach to KF and its potential e-applications for a scalable information environment (infosphere). This approach is based on utilizing such technologies as ontology management, intelligent agents, constraint satisfaction, etc.
This paper introduces a present-day understanding of the data and information fusion problem and describes some aspects of the methodology, technology and software toolkit developed by the authors for the design, implementation and deployment of a class of multi-agent information fusion-related applications. The distinctive feature of the proposed technology supported by the software toolkit is that it is distributed and agent-mediated, i.e. it assumes a distributed mode of designer activity mediated by agents that perform most of the routine engineering work and support the coordination of collaborative designer activities. The above methodology, technology and software toolkit were implemented and used in practice for prototyping several applications within the scope of data and information fusion.
Information and Data Fusion is a discipline that provides methods and techniques to build Observe-Orient-Decide-Act (OODA) capabilities for various applications. There are many ways in which these methods and techniques can be chosen to provide capabilities in each phase of the OODA decision making cycle, and there are different fusion architectures, i.e., ways these methods and techniques can be applied, grouped and integrated. How one chooses the most appropriate set of methods, techniques and fusion architecture for an application depends on a number of factors. Additional factors have to be considered when decision making is performed through the collaboration of a number of fusion centres on a network, defined in the context of this lecture as Distributed Data Fusion. This lecture describes the choices for fusion architectures and the factors leading to the selection of a fusion architecture, and proposes a model to help make these choices in the case of distributed data fusion.
This paper discusses a testbed for data fusion in the context of maritime surveillance. The testbed fuses non-real-time and near-real-time data from three different sources: High Frequency Surface Wave Radar, surveillance aircraft, and ELINT. The testbed correlator uses a Fuzzy Logic based clustering technique to associate ELINT data. For all data, the fusion process is performed at contact level and track level, and the resulting tracks are modeled with a genetic algorithm. The testbed interface allows the user to perform, among other things, data file browsing, data contact editing and data visualization on an interactive map. Current work on the implementation of level 2 fusion for situation awareness is also discussed. This testbed has been developed to test new techniques in the field of data fusion for maritime surveillance.
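As a hedged illustration of fuzzy clustering for contact association (fuzzy c-means, a representative algorithm rather than necessarily the testbed's own technique), the sketch below clusters toy two-dimensional ELINT-like parameter measurements; the feature choice and numbers are made up.

```python
# Hedged sketch of fuzzy c-means clustering applied to toy 2-D ELINT-like
# parameter measurements (frequency, pulse repetition interval); values made up.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)             # fuzzy memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = 1.0 / (d ** (2.0 / (m - 1.0)))        # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two emitter "clusters" in (frequency GHz, PRI microseconds) space.
X = np.array([[9.1, 800.0], [9.2, 805.0], [9.0, 795.0],
              [3.1, 1200.0], [3.0, 1210.0], [3.2, 1195.0]])
centers, U = fuzzy_c_means(X)
print(np.round(centers, 1))   # cluster centres
print(np.round(U, 2))         # membership of each contact in each cluster
```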
In recent years, Lockheed Martin Canada has developed a Testbed to bring together and analyze fusion architectures, algorithms and information sharing strategies. This Testbed is used to demonstrate practical implementations of distributed data fusion between multiple collaborating platforms. In a decentralized data fusion center, many algorithms process positional information coming from the network to generate a Global Tactical Picture. These algorithms tend to remove cross-correlation from the received data, or to prevent it from being introduced in the first place. This paper compares four different track fusion algorithms applied in a simulation environment: two implementations of the tracklet fusion approach, an algorithm based on track quality, and an algorithm based on the source of the information.
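For context, one standard way to fuse track estimates whose cross-correlation is unknown is covariance intersection; the sketch below is a hedged illustration of that general technique, not one of the four algorithms compared in the paper, and the state vectors and covariances are made up.

```python
# Hedged sketch of covariance intersection: fuse two track estimates with
# unknown cross-correlation by a convex combination in information space.
# omega is chosen by a simple grid search minimising the fused covariance trace.
import numpy as np

def covariance_intersection(x1, P1, x2, P2, steps=50):
    best = None
    for omega in np.linspace(0.01, 0.99, steps):
        info = omega * np.linalg.inv(P1) + (1.0 - omega) * np.linalg.inv(P2)
        P = np.linalg.inv(info)
        x = P @ (omega * np.linalg.inv(P1) @ x1 + (1.0 - omega) * np.linalg.inv(P2) @ x2)
        if best is None or np.trace(P) < best[2]:
            best = (x, P, np.trace(P))
    return best[0], best[1]

# Two local track estimates of the same target's (x, y) position; numbers illustrative.
x1, P1 = np.array([100.0, 200.0]), np.diag([25.0, 4.0])
x2, P2 = np.array([103.0, 198.0]), np.diag([4.0, 25.0])
x, P = covariance_intersection(x1, P1, x2, P2)
print("fused state          :", np.round(x, 1))
print("fused covariance diag:", np.round(np.diag(P), 1))
```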