FUSION 2015 SPECIAL SESSIONS
- SS1: Sequential Monte Carlo Methods for Complex Systems
- SS2: Sensor, Resources and Process Management for Information Fusion Systems
- SS3: Extended Object and Group Tracking
- SS4: Directional Estimation
- SS5: Averaging Measures: Wasserstein Barycenters, MMOSPA, and more
- SS6: Homotopy Methods for Progressive Bayesian Estimation
- SS7: Probabilistic RGBD Data Fusion
- SS8: Advances in Distributed Kalman Filtering
- SS9: Data Fusion Methods for Indoor Localization of People and Objects
- SS10: Dynamic Data Driven Application Systems for Sensor Data Problems
- SS11: Intelligent Information Fusion
- SS12: Applications of Data Fusion and Predictive Analytics to Finance, Business, and Marketing
- SS13: Change, Anomaly, and Trend (CAT) Detection in Challenging Environments
- SS14: Context-based Information Fusion
- SS15: Trust in Fused Information
- SS16: Multistatic Tracking
- SS17: Real-World Problems with Network Abstractions
- SS18: Evaluation of Technologies for Uncertainty Reasoning
- SS19: Space Object Detection, Tracking, Identification, and Classification
- SS20: Large Scale Value-Centered Data Fusion
- SS21: Information Fusion in Multi-Biometric Systems
- SS22: Multi-Level Fusion: Bridging the Gap Between High and Low-level Fusion
SS1: Sequential Monte Carlo Methods for Complex Systems
Description: The aim of this special session is to address challenging problems, such as estimation for high-dimensional systems and systems with complex dynamics (inter-relationships), with Sequential Monte Carlo (SMC) methods. The session will bring together experts from different areas and is aimed at presenting novel techniques, algorithms, and approaches, especially those based on sequential Monte Carlo methods. Both theoretically oriented and application-related works are welcome.
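As a point of reference, the basic bootstrap (SIR) particle filter that most SMC contributions build on can be sketched as follows; the scalar model, noise levels, and parameter names are illustrative assumptions, not tied to any particular paper:

```python
import numpy as np

def bootstrap_pf(measurements, n_particles=500, proc_std=1.0, meas_std=0.5, seed=0):
    """Minimal bootstrap (SIR) particle filter for the illustrative scalar model
    x_k = 0.5*x_{k-1} + w_k,  z_k = x_k**2/20 + v_k."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for z in measurements:
        # propagate each particle through the (assumed) dynamics
        particles = 0.5 * particles + rng.normal(0.0, proc_std, n_particles)
        # weight by the measurement likelihood
        w = np.exp(-0.5 * ((z - particles**2 / 20.0) / meas_std) ** 2)
        w /= w.sum()
        estimates.append(float(np.dot(w, particles)))
        # multinomial resampling to combat weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return estimates
```

High-dimensional or strongly coupled systems break this simple scheme (weight degeneracy grows with dimension), which is exactly the regime the session targets.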
Organizers: Lyudmila Mihaylova, University of Sheffield; Hans Driessen, Thales, the Netherlands; Fredrik Gustafsson, Linköping University; and Martin Ulmke, Fraunhofer FKIE.
SS2: Sensor, Resources and Process Management for Information Fusion Systems
Description: Continually increasing performance requirements create the need to gain and exploit information optimally, giving rise to a broad field of optimization problems under uncertainty. Advances in communication, information, and sensor technologies are driving the development of complex, adaptive, and reconfigurable sensor systems. Such a sensor system can have a large scope for online reconfiguration, which typically exceeds the management capability of a human operator. In addition, the sensor system can face a variety of fundamental resource limitations, such as a limited power supply, a finite total time budget, a narrow field of view, limited on-board processing capability, or constraints on the communication channels between the sensor nodes. Consequently, effective sensor scheduling and resource management is a key performance factor for the emerging generation of adaptive and reconfigurable sensor systems. In the case of stationary sensors, it is usually desirable to schedule measurements so as to maximize the benefit with respect to the objectives of the sensor system while avoiding redundant measurements. This benefit can be quantified by an appropriate metric, for example a task-specific metric, information gain, or utility. For mobile sensors it is also necessary to consider the sensor platform navigation (including its uncertainties), as the sensor-scenario geometry can significantly affect performance, e.g., for coordinated exploration in disaster areas. Increasing computational power has made it possible to consider tasks of ever greater complexity. In many controlled systems, the input affects not only the state but also the future acquisition of information. This so-called dual effect makes it necessary to treat sensing and information fusion as part of the control task.
Considering the uncertainty of the controlled variable, whether explicitly or implicitly, allows this dual effect to be exploited and the control performance to be improved.
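In the simplest setting, the scheduling problem above can be illustrated by a greedy sketch for a scalar Gaussian state, where the benefit metric is the reduction of posterior variance; the single-state model and the one-shot-per-sensor constraint are illustrative assumptions:

```python
def greedy_schedule(prior_var, sensor_vars, budget):
    """Greedy measurement scheduling for a scalar Gaussian state: at each step,
    pick the sensor whose (Kalman) update most reduces the posterior variance,
    i.e., maximizes the information gain. Each sensor may be used at most once."""
    var = prior_var
    available = dict(enumerate(sensor_vars))  # sensor index -> measurement variance
    schedule = []
    for _ in range(budget):
        # posterior variance after using sensor i: 1 / (1/var + 1/r_i)
        best_i = min(available, key=lambda i: 1.0 / (1.0 / var + 1.0 / available[i]))
        var = 1.0 / (1.0 / var + 1.0 / available[best_i])
        schedule.append(best_i)
        del available[best_i]
    return schedule, var
```

For example, with prior variance 4.0 and sensors with noise variances [1.0, 0.5, 2.0] and a budget of two measurements, the most accurate sensor is chosen first, then the next best.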
Organizers: Christof Chlebek; Alexander Charlish; Leon Kester; Igor Gilitschenski; and Uwe Hanebeck, Karlsruhe Institute of Technology.
SS3: Extended Object and Group Tracking
Description: Standard target tracking algorithms usually assume that the target is a single point without an extent. However, there are several scenarios in which this assumption does not hold. For example, when the resolution of modern sensor devices is higher than the spatial extent of the target, a varying number of measurements from different spatially distributed reflection centers on the target are received. Furthermore, a collectively moving group of point targets can be treated as a single extended target because of the high interdependency of the group members. This Special Session addresses fundamental techniques, recent developments, and future research directions in the field of extended object and group tracking.
Organizers: Marcus Baum; Uwe Hanebeck, Karlsruhe Institute of Technology; Wolfgang Koch; and Peter Willett.
SS4: Directional Estimation
Description: Many estimation problems of practical relevance include the problem of estimating directional quantities, for example angular values or orientations. However, conventional filters such as the Kalman filter assume Gaussian distributions defined on R^n. This assumption neglects the inherent periodicity of directional quantities. Consequently, more sophisticated approaches are required to accurately describe the circular setting. This Special Session addresses fundamental techniques, recent developments, and future research directions in the field of estimation involving directional and periodic data. It is our goal to bridge the gap between theoreticians and practitioners; thus, we welcome both applied and theoretical contributions on this topic.
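The pitfall that motivates this session is easy to demonstrate: averaging angles as if they lived on the real line fails near the 0°/360° wrap-around, whereas the circular mean (via the unit-circle embedding) behaves correctly. A minimal sketch:

```python
import math

def circular_mean(angles):
    """Mean direction of angles (in radians) via the embedding on the unit circle."""
    s = sum(math.sin(a) for a in angles)
    c = sum(math.cos(a) for a in angles)
    return math.atan2(s, c)

# Two headings just either side of north: 350° and 10°.
a = [math.radians(350), math.radians(10)]
arithmetic = math.degrees(sum(a) / 2)        # 180° -- pointing the wrong way
circular = math.degrees(circular_mean(a))    # approximately 0°, the sensible answer
```

Wrapped normal or von Mises distributions generalize this idea from point averaging to full filtering on the circle.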
Organizers: Gerhard Kurz; Igor Gilitschenski; and Uwe Hanebeck, Karlsruhe Institute of Technology.
SS5: Averaging Measures: Wasserstein Barycenters, MMOSPA, and more
Description: Summarizing the information encoded in (one or more) probability measures is a fundamental problem in many areas such as signal processing, machine learning, computer vision, and data fusion. In this context, the concept of an “average” measure has recently gained significant interest: Wasserstein barycenters are used, for example, for texture mixing and for fusing (empirical) probability densities. In multi-target tracking with missing target identities, the mean square error (MSE) cannot be used to calculate expected target states. Hence, instead of the MSE, the Mean Optimal Sub-Pattern Assignment (MOSPA) distance is employed, which is closely related to the Wasserstein distance. This leads to minimum MOSPA (MMOSPA) estimates instead of MMSE estimates. This Special Session addresses all recent research results that involve the calculation of an “average” of (one or more) probability measures, in all its variants. This includes both new theoretical results and applications.
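As a rough illustration of the MMOSPA idea, the following sketch approximates the MMOSPA estimate for two one-dimensional targets from posterior samples by alternating between per-sample optimal label permutations and averaging; the fixed-point scheme and the initialization from the first sample are simplifying assumptions for exposition:

```python
import itertools
import numpy as np

def mmospa_two_targets(samples, iters=50):
    """Approximate MMOSPA estimate for two 1-D targets from equally weighted
    samples of the joint posterior (array of shape (N, 2))."""
    est = np.asarray(samples[0], dtype=float)  # symmetry-breaking initialization
    for _ in range(iters):
        aligned = []
        for s in samples:
            # pick the permutation of this sample's labels closest to the estimate
            best = min(itertools.permutations(s),
                       key=lambda p: sum((pi - ei) ** 2 for pi, ei in zip(p, est)))
            aligned.append(best)
        new_est = np.mean(aligned, axis=0)
        if np.allclose(new_est, est):
            break
        est = new_est
    return est
```

On a label-symmetric posterior with modes at (-1, 1) and (1, -1), the MMSE estimate collapses to (0, 0), while this scheme recovers a pair of distinct target positions.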
Organizers: Marcus Baum, Karlsruhe Institute of Technology; Peter Willett; and Uwe Hanebeck, Karlsruhe Institute of Technology.
SS6: Homotopy Methods for Progressive Bayesian Estimation
Description: This session is concerned with homotopy methods for the efficient solution of Bayesian state estimation problems occurring in information fusion and filtering. For state estimation in the presence of stochastic uncertainties, the best current estimate is represented by a probability density function. For that purpose, different representations are used, including continuous densities such as Gaussian mixtures and discrete densities on continuous domains such as particle sets. Given prior knowledge in the form of such a density, the goal is to include new information by means of Bayes’ theorem. Typically, the resulting posterior density is of higher complexity and difficult to compute. In the case of particle sets, additional problems such as particle degeneracy occur. Hence, an appropriate approximate posterior has to be found. For recursive applications, this approximate posterior should be of the same form as the given prior density (approximate closedness). To cope with this challenging approximation problem, a well-established technique is to include the new information gradually, rather than in one shot, which is achieved by a homotopy. For this session, manuscripts are invited that cover any aspect of homotopy methods for state estimation. This includes both theoretically oriented work and applications of known methods.
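For a particle representation, the gradual inclusion of a measurement can be sketched by tempering the likelihood, i.e., applying exp(Δ · log-likelihood) over several increments with resampling in between. The uniform step schedule and the omission of a regularization/move step after each stage are simplifying assumptions:

```python
import numpy as np

def progressive_update(particles, loglik, n_steps=10, seed=0):
    """Include a measurement gradually: instead of weighting once by exp(loglik),
    apply exp(loglik * delta) over n_steps increments, resampling in between.
    `loglik(x)` returns the log-likelihood of the new measurement at state x.
    (In practice a regularization/move step would follow each stage to
    restore particle diversity.)"""
    rng = np.random.default_rng(seed)
    delta = 1.0 / n_steps
    for _ in range(n_steps):
        logw = delta * loglik(particles)
        w = np.exp(logw - logw.max())   # subtract max for numerical stability
        w /= w.sum()
        idx = rng.choice(len(particles), size=len(particles), p=w)
        particles = particles[idx]
    return particles
```

For a Gaussian prior N(0, 2²) and a Gaussian likelihood centered at z = 3 with standard deviation 0.5, the tempered particle set concentrates near the analytic posterior mean of about 2.82.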
Organizers: Uwe Hanebeck, Karlsruhe Institute of Technology; and Fred Daum.
SS7: Probabilistic RGBD Data Fusion
Description: Since the launch of the Kinect in 2010, setting up an RGBD sensor network has come within easy reach. In contrast to a single sensor, the combination of multiple cameras brings several advantages, including simultaneous coverage of a large environment, increased resolution, redundancy, and robustness against occlusion. Together with these great benefits, however, a variety of challenges arise: synchronization, calibration, registration, multi-sensor fusion, large amounts of data, interference issues, and, last but not least, sensor-specific stochastic and set-valued uncertainties. This Special Session addresses fundamental techniques, recent developments, and future research directions in the field of probabilistic RGBD data fusion.
Organizers: Florian Faion; Antonio Zea; and Uwe Hanebeck, Karlsruhe Institute of Technology.
SS8: Advances in Distributed Kalman Filtering
Description: The rapid advances in sensor and communication technologies are accompanied by an increasing demand for distributed state estimation methods. Centralized implementations of Kalman filter algorithms are often too costly in terms of communication bandwidth or simply inapplicable, for instance when mobile ad-hoc networks of autonomously operating state estimation systems are considered. Compared to centralized approaches, distributed or decentralized Kalman filtering is considerably more involved. In particular, the treatment of dependent information shared by different state estimation systems is a central issue. Distributed state estimation is, in general, a balancing act between estimation quality and flexible network design. With the Distributed Kalman Filter, it has been demonstrated that an optimal (MSE-minimal) estimate can be computed in a distributed fashion, but this algorithm is not robust to packet delays and drops, node failures, or changing network topologies. In practice, these problems deserve careful attention and have to be addressed by future research.
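One well-known way to handle unknown dependence between estimates shared in a network is Covariance Intersection. A minimal sketch follows, with the weight chosen by a simple grid search over the trace criterion (an assumption made for brevity; closed-form and more efficient optimizations exist):

```python
import numpy as np

def covariance_intersection(xa, Pa, xb, Pb, n_grid=101):
    """Fuse two estimates with unknown cross-correlation via Covariance
    Intersection: P^-1 = w*Pa^-1 + (1-w)*Pb^-1, with w chosen here to
    minimize the trace of the fused covariance by grid search."""
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        info = w * np.linalg.inv(Pa) + (1.0 - w) * np.linalg.inv(Pb)
        P = np.linalg.inv(info)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * np.linalg.inv(Pa) @ xa
                     + (1.0 - w) * np.linalg.inv(Pb) @ xb)
            best = (np.trace(P), x, P)
    return best[1], best[2]
```

When one estimate strictly dominates the other (e.g., variances 1 vs. 4 for the same state), CI correctly discards the worse one instead of naively averaging, which would be inconsistent under unknown correlation.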
Organizers: Benjamin Noack; Felix Govaers; Alexander Charlish; and Uwe Hanebeck, Karlsruhe Institute of Technology.
SS9: Data Fusion Methods for Indoor Localization of People and Objects
Description: Indoor positioning has gained great importance now that technology allows for affordable real-time sensing and processing systems. The pervasiveness of wireless sensor networks (e.g., in the form of WLAN) and mobile sensors (such as smartphones) has also inspired researchers and developers to exploit the existing infrastructure. Applications include pedestrian navigation in public buildings and shops, location-based services, safety for the elderly and impaired, museum guides, surveillance tasks, and the tracking of products in manufacturing, warehousing, and beyond. Unlike outdoor environments, which are covered by GNSS to a satisfactory extent, indoor navigation faces additional challenges, depending on the underlying measurement system, such as occlusions, reflections, and attenuation. There is a great variety of sensors and measuring principles; in practice, however, every single measuring technique suffers from deficits. While RF and (ultra-)sound are subject to multipath propagation, optical systems are intolerant of NLOS conditions. Some systems require setting up beacons, while others are self-calibrating and easy to install. Data fusion can overcome these limitations by combining complementary and redundant sensing techniques and using elaborate algorithmic methods, such as stochastic filtering. This Special Session addresses fundamental techniques, recent developments, and future research directions to help clear the way toward robust, accurate indoor localization.
Organizers: Antonio Zea; Florian Faion; and Uwe Hanebeck, Karlsruhe Institute of Technology.
SS10: Dynamic Data Driven Application Systems for Sensor Data Problems
Description: The Dynamic Data-Driven Application Systems (DDDAS) paradigm shapes a symbiotic feedback ecosystem consisting of models of physical and engineered systems and application instrumentation. More precisely, DDDAS establishes new avenues for accurate analysis, robust prediction, and control in application systems using multi-modal fusion of sensory data. Ubiquitous Big Data problems place DDDAS as a unifying framework among applications, mathematical and statistical modeling, and information systems. Such challenges make the DDDAS paradigm, which integrates modeling, measurements, and software, more relevant than ever. The DDDAS session invites papers that demonstrate advances in the DDDAS paradigm combining real-world applications, contemporary mathematical approaches, real-time large-scale measurements, and software solutions. Key applications requiring DDDAS high-end computing solutions include distributed wireless platforms, distributed processing, collection and processing of sensor data for situation awareness, and critical infrastructure systems.
Organizers: Erik Blasch, AFRL/RI; Frederica Darema, AFOSR; Vasileios Maroulas, University of Tennessee; and Ioannis D. Schizas, University of Texas.
SS11: Intelligent Information Fusion
Description: Research on intelligent systems for information fusion has matured in recent years, and many effective applications of this technology are now deployed. The problem of information fusion has attracted significant attention in the artificial intelligence and machine learning communities, which seek to innovate in the techniques used for combining data and to provide new models for estimation and prediction. Rapid advances in sensor technologies that provide context information have led to new information fusion applications in different environments, such as remote sensing, surveillance, and home care. With the continuing expansion of the domain of interest and the increasing complexity of the collected information, intelligent techniques for fusion processing have become a crucial component of information fusion applications. In this sense, intelligent systems can improve high-level information fusion aimed at supporting decision making and/or intelligent information management.
This special session focuses on information fusion methods that use intelligent systems, reporting on the latest scientific and technical advances in the application of intelligent systems to information fusion, discussing and debating the major issues, and showcasing the latest systems and the most important outcomes. These methods give information fusion systems advanced capabilities that allow a dynamic response to potentially changing situations in the environment. Such systems are well suited to developing dynamic and distributed applications, as they can adapt themselves to the users and to environmental characteristics. Both theoretical and practical approaches in the area are welcome.
The session will provide a forum for discussing how intelligent systems and information fusion techniques, methods, and tools help system designers map available technologies to application needs. Other stakeholders should come away with a better understanding of the potential and challenges of intelligent systems in the information fusion approach.
In particular, the topics of interest of this special session include but are not limited to:
- Intelligent systems, Hybrid intelligent systems
- Adaptive fusion system architectures
- Reasoning techniques
- Intelligent/adaptive signal processing
- Artificial intelligence and machine learning related information fusion
Organizers: Juan Manuel Corchado, University of Salamanca; Javier Bajo, Technical University of Madrid; and Tiancheng Li, The BISITE research group.
SS12: Applications of Data Fusion and Predictive Analytics to Finance, Business, and Marketing
Description: This session applies data fusion and predictive analytics to finance, business, and marketing. Finance and business are critical application areas for information fusion and data analytics, and many of the techniques discussed in the information fusion community are directly applicable to this emerging and important area. The goal of the session is to open a forum for data scientists and engineers to share their latest experience and insight in applying predictive modeling and data analytics techniques to applications in finance and business.
Topics include but are not limited to:
- Predictive marketing
- Predictive model and decision analytics, market forecasting
- Fundamental trading strategies with data fusion techniques
- Bayesian networks, neural networks, rule-based, ontologies
- Technical trading strategies and performance evaluation
- Kalman filtering, multiple switching models, machine learning, dynamic asset allocation
- Economic data analytics and business forecasting
- Financial and risk analysis, data fusion for business intelligence and decision making
Organizers: KC Chang and Zhi Tian, George Mason University.
SS13: Change, Anomaly, and Trend (CAT) Detection in Challenging Environments
Description: This special session is a forum to present and discuss the state of the art in change, anomaly, and trend (CAT) detection, with emphasis on overcoming challenging circumstances that often arise in practice. Situations of interest include, but are not limited to:
- detection under non-stationary, highly uncertain, or highly dependent data
- change detection with multiple sensors
- detection in high dimensions
- distribution free methods
- detection with small sample sizes
- detection with multi-modalities/heterogeneous data
- multiple change point detection
- self-starting control charts.
Despite the long and successful history of CAT detection in industry, the military, finance, etc., there is a rapidly increasing number of applications in which the above challenges play significant roles. For example, the data collected by a sensor network can easily exhibit complicated statistical dependencies if the (possibly heterogeneous) sensors are affected by the same physical noise phenomenon. The increasing prevalence of these challenges is caused in large part by the ever-increasing instrumentation and connectivity of our world: sensing more phenomena with more sensors more often places these challenges front and center. CAT detection represents a suite of fundamental analytical tools that lie at the heart of many algorithms, and it thus arguably remains a critical component of software technology. Simply put, the ability to accurately and reliably detect statistical changes in a signal or a data stream is often a necessary first step toward higher-level processing tasks.
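As a concrete baseline for the challenges listed above, the classical one-sided CUSUM detector for an upward mean shift can be sketched in a few lines; the drift and threshold values here are illustrative assumptions:

```python
def cusum(samples, target_mean=0.0, drift=0.5, threshold=5.0):
    """One-sided CUSUM for an upward mean shift: accumulate the evidence
    g_k = max(0, g_{k-1} + (x_k - target_mean - drift)) and alarm when
    g_k exceeds the threshold h. Returns the alarm index, or None."""
    g = 0.0
    for k, x in enumerate(samples):
        g = max(0.0, g + (x - target_mean - drift))
        if g > threshold:
            return k  # index at which the alarm is raised
    return None
```

For a stream of 30 in-control samples at mean 0 followed by samples at mean 2, the statistic accumulates 1.5 per post-change sample and alarms a few samples after the change. The session's challenges (dependence, high dimension, multiple sensors) arise precisely where such textbook assumptions break down.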
Organizers: Michael Lexa; Marco Guerriero; Satish Iyengar; Dayu Huang; and Jayakrishnan Unnikrishnan, GE Global Research.
SS14: Context-based Information Fusion
Description: The goal of this session is to discuss approaches to context-based information fusion. It will cover the design and development of information fusion solutions that integrate sensory data with contextual knowledge. The context may be spread across different levels, have a static or dynamic structure, and be represented in different ways, such as maps, knowledge bases, or ontologies. It can be a powerful tool for improving adaptability and system performance. The session therefore covers both representation and exploitation mechanisms, so that this knowledge can be efficiently integrated into the fusion process and enable adaptation mechanisms under different paradigms (intelligent systems, knowledge management, integration in fusion algorithms, etc.). The applicability of advanced approaches can be illustrated with real-world information fusion applications that require a contextualized approach.
Topics include but are not limited to:
- Representation and exploitation of contextual information at all levels in real world systems
- Managing of heterogeneous contextual sources (hard and soft data and knowledge)
- Injection of a priori knowledge to improve the performance of fusion systems
- Augmentation of tracking, classification, recognition, situation analysis, etc. algorithms with contextual information
- Adaptation techniques that let the system respond not only to the changing target state but also to the surrounding environment
- Strategies and algorithms for context discovery (off-line and on-line)
- Application examples including: context-aided surveillance systems (security/defence), traffic control, autonomous navigation, cyber security, ambient intelligence, ambient assistance, etc.
Organizers: Jesus Garcia, University Carlos III de Madrid; Lauro Snidaro, University of Udine; José M. Molina, University Carlos III de Madrid; and Ingrid Visentini, Danieli Automation S.p.A.
SS15: Trust in Fused Information
Description: The volumes of information streamed and collected from disparate sensor, data, and information sources have increased dramatically in recent years. Fusing such multimodal information improves the decision maker’s ability to make informed decisions in rapidly changing, complex environments, in a way that would not be possible if each piece of information were taken individually. While such approaches have drawn considerable attention from the community, the trustworthiness of the fused information remains a critical issue yet to be addressed.
The aims of the special session are as follows: (a) discuss how different strands of knowledge can be processed, analysed, and combined to model trust in information; (b) investigate how such models can be exploited to improve the trustworthiness of the fused information; (c) highlight different aspects of information fusion and trust research from decision makers’ perspectives (be they central or edge users); and (d) discuss topics in trust and information fusion in the wider context.
Organizers: Geeth de Mel, IBM T. J Watson Research Center; Murat Sensoy, Özyeğin University; Lance Kaplan, U.S. Army Research Laboratory; and Tien Pham, U.S. Army Research Laboratory.
SS16: Multistatic Tracking
Description: This special session focuses on multistatic sonar and radar information fusion and target tracking algorithms. Recent years have seen increasing interest in fusion and tracking algorithms for multistatic systems. Challenges include the effective treatment of bistatic sensor nodes, non-linear measurements, and false-alarm overloading. Recent progress in multistatic tracking has been facilitated by the Multistatic Tracking Working Group (MSTWG), an International Society of Information Fusion (ISIF) working group whose purpose is to evaluate a wide variety of multistatic tracking algorithms available among group members on common data sets with common metrics. The reporting of these results and other related multistatic topics has been of great value to the MSTWG and ISIF, in the form of numerous papers and participation in similar special sessions at previous FUSION conferences since 2006. A special session on multistatic sonar/radar tracking at FUSION’15 will enable current MSTWG outputs, as well as contributions from outside the group, to be documented.
Topics of Interest:
- Use of measurements made in local sensor coordinates to track targets in a common coordinate system.
- Use of multiple sensors to compensate for a low probability of detection at the individual sensor level (e.g. pre-detection fusion).
- Use of multiple sensors to compensate for a high false alarm rate at the individual sensor level.
- Multisensor tracking using available sources (e.g. passive radar, passive sonar or active-passive fusion).
- Metrics to evaluate multistatic tracking.
Organizers: Garfield Mellema, Defence Research and Development Canada – Atlantic; and Jason Aughenbaugh, the University of Texas at Austin.
SS17: Real-World Problems with Network Abstractions
Description: Data analysis scenarios typically involve relationships among entities. It is natural to create network abstractions of such scenarios and apply network algorithms to them. In some cases, analysis of the network abstraction produces valuable insight, revealing the relative importance of entities, their roles in the network, or their organization into communities. In other cases, however, the network abstraction is harder to leverage. This Special Session addresses those real-world aspects of data that make network analysis difficult. Examples include entity resolution among the nodes, heterogeneity of link types, loss of critical features in the network abstraction, missing nodes, missing links, collection biases, and problems where the network is itself only a means to discover a location, a threat status, or an activity. Whereas much of network science is concerned with problems where the network structure is explicit (e.g., a physical network) or canonical (e.g., Karate Club and other reference cases), the emphasis here is on problems in which the network abstraction of the data is difficult and tells only part of the story.
While Network Science is a broad field encompassing many communities, the mainstream of network science neither reflects the priorities of the Fusion community nor serves the needs of its customers. The goal of this session is to push the boundaries of network science to address problems related to the formation of networks from complex data.
Organizer: James Ferry, Metron, Inc.
SS18: Evaluation of Technologies for Uncertainty Reasoning
Description: The ETUR session reports the latest results of ISIF’s ETURWG, which aims to bring together advances and developments in the evaluation of uncertainty representation. The ETURWG special sessions started at Fusion 2010 and have been held ever since, with attendance consistently averaging between 40 and 48. While most attendees are ETURWG participants, new researchers and practitioners interested in uncertainty evaluation have attended the sessions, and some have stayed with the ETURWG.
The session will focus on three topics: (1) summarizing the state of the art in uncertainty analysis, representation, and evaluation; (2) discussing metrics for uncertainty representation; and (3) surveying uncertainty at all levels of fusion. The impact for the ISIF community would be an organized session presenting a series of methods for uncertainty representation coordinated with their evaluation. The techniques discussed, and the questions and answers, would be important for researchers in the ISIF community; the bigger impact, however, would be for the customers of information fusion systems, who must determine how to measure, evaluate, and approve systems that assess the situation beyond Level 1 fusion. The customers of information fusion products would gain guidelines for drafting requirements documentation, for assessing the gain of fusion systems over current techniques, and for identifying issues that are important in information fusion system designs. One of the main goals of information fusion is uncertainty reduction, which depends on the representation chosen. Uncertainty representation differs across the various levels of information fusion (as defined by the JDL/DFIG models). Given the advances in information fusion systems, there is a need to determine how to represent and evaluate situational (Level 2 Fusion), impact (Level 3 Fusion), and process refinement (Level 5 Fusion), which is not yet well standardized for the information fusion community.
Organizers: Paulo Costa, George Mason University; Kathryn Laskey, George Mason University; Anne-Laure Jousselme, DRDC-Valcatier; and Erik Blasch, AFRL.
SS19: Space Object Detection, Tracking, Identification, and Classification
Description: The operation of Earth-orbiting spacecraft has become increasingly difficult due to the proliferation of orbital debris and increased commercialization. This has been made evident by several collisions involving operational spacecraft over the past five years. Maintaining the sustainability of key orbit regimes, e.g., low-Earth, sun-synchronous, and geosynchronous orbits, requires improved tracking and prediction of up to hundreds of thousands of objects given measurements that are sparse in both space and time. Target identification and classification allow for better prediction and situational awareness. Moreover, the proper characterization of measurement assignments, as well as the determination of measurement associations for maneuvering targets, plays a pivotal role in successful space situational awareness. Solutions to the problem will be interdisciplinary and require expertise in astrodynamics, computational sciences, information fusion, applied mathematics, and many other fields. The primary goal of this session is to promote interaction between the astrodynamics and space situational awareness community and those conducting research in information fusion and multitarget tracking. A secondary goal is to gather individuals performing research on the associated topics to present, discuss, and disseminate ideas related to solving the detection, tracking, identification, and classification problems in the context of space situational awareness.
Organizers: Kyle DeMars, Missouri University of Science and Technology; and Brandon Jones, University of Colorado at Boulder.
SS20: Large Scale Value-Centered Data Fusion
Description: This workshop on Large Scale Value-Centered Data Fusion will be held during the 18th International Conference on Information Fusion in Washington, DC. It has been noted that achieving information superiority is one of the keys to securing and defending contested environments in military and civilian applications of national defense and security. As a consequence, data-centric approaches to information collection, processing, and exploitation are being developed that aggregate large amounts of data from a combination of hard information sources, such as physical sensors, and soft information sources, such as human intelligence sources. These data are highly heterogeneous, are collected at different space and time scales, and have different levels of reliability. For example, hard information can be obtained from stationary and mobile sensor platforms, including unattended ground sensors, organic air vehicles, and small robots. These platforms use multiple sensing modalities, such as acoustic, seismic, FLIR, LIDAR, SAR, or hyperspectral, to collect diverse information in complex environments. Soft information, on the other hand, can be obtained from public newsfeeds and microblogs or from private data reported by intelligence sources. A critical challenge is to manage the distributed collection, processing, and fusion of this highly diverse set of data with the primary goal of generating useful, mission-critical information. Consequently, to meet mission needs, next-generation distributed information collection systems must adopt an integrated approach combined with cross-layer design, optimization, and adaptation, allowing users to actively manage information resources, quantify their informational value, and rapidly and efficiently extract that information over communication networks (which are themselves subject to disruption).
Such systems must aggregate huge amounts of information across multiple platforms and diverse modalities. They must incorporate available prior information (e.g. known phenomenology and sensor physics, physical/social context, and observed behaviors). They must be able to adapt to increasingly intelligent and agile adversaries under a wide range of dynamically changing operating conditions. In total, this represents an extremely broad research area which is beyond the scope of a single workshop. This workshop will focus on a few key aspects of the larger problem, namely
- Quantifying the information gain of a large set of information sources acting in concert.
- Extracting information (versus raw data) from these sources.
- Computing and communicating the value of information in distributed settings.
These issues comprise an inter-related set of research challenges. Quantifying the information gain of a single sensor involves understanding the physical sensor or collection model in the context of a given inference problem. Often the known physical model is not sufficient for the task, and empirical methods are needed. Extracting information, as opposed to raw data, requires understanding which aspects and/or representations of the sensor data are relevant to a particular inference problem. Resource constraints often preclude the transmission of raw data, so lossy compression methods that preserve information relevant to a particular inference problem are needed. These issues become more complex for multiple sensing modalities, where common and complementary information content is not well understood. Finally, within a distributed setting, the computation and communication of such quantities pose many challenges.
Additionally, this workshop is an opportunity to foster discussions among academia, the DoD, and other national laboratories, and to help shape future research towards national security needs. As such, we will invite leading academic, industry, and government experts representing ideas from several of these subareas.
Organizers: Alfred Hero, University of Michigan; and John W. Fisher III, Massachusetts Institute of Technology.
SS21: Information Fusion in Multi-Biometric Systems
Description: This session will focus on the latest innovations and best practices in the emerging field of multi-biometric fusion. Biometrics seeks to reach an identity recognition decision based on the physical or behavioral characteristics of individuals. Multi-biometrics aims to outperform conventional biometric solutions by increasing accuracy and robustness to intra-person variations and noisy data. It also reduces the effect of the non-universality of biometric modalities and the vulnerability to spoof attacks. Fusion is performed to build a unified biometric decision based on the information collected from different biometric sources. This unified result must be constructed in a way that guarantees the best possible performance while taking into account the efficiency of the solution. The topic of this special session, Information Fusion in Multi-Biometrics, requires the development of innovative and diverse solutions. Those solutions must take into account the nature of the biometric information sources as well as the level of fusion suitable for the application at hand. The fused information may include more general, non-biometric information such as the estimated age of the individual or the background environment. This special session will be supported by the European Association for Biometrics (EAB). The EAB will provide technical support by recruiting experts for reviews and will help with the dissemination and exploitation of the event.
Topics of interest include Multi-Biometrics, Biometric fusion, Data-level fusion, Feature-level fusion, Score-level fusion, Fuzzy Fusion, Classification-based fusion, Combination-based fusion, Multi-Biometric Identification, Multi-Biometric forensics fusion, Soft biometrics fusion.
Organizers: Naser Damer, IGD Fraunhofer; and Raghavendra Ramachandra, Gjøvik University College.
SS22: Multi-Level Fusion: Bridging the Gap Between High and Low-level Fusion
Description: The exploitation of all relevant information originating from a growing mass of heterogeneous sources, both device-based (sensors, video, etc.) and human-generated (text, voice, etc.), is a key factor in producing a timely, comprehensive, and accurate description of a situation or phenomenon. There is a growing need to identify relevant information within the mass available and to exploit it through automatic fusion for timely, comprehensive, and accurate situation awareness. Even when exploiting multiple sources, most fusion systems are developed for combining just one type of data (e.g. positional data) in order to achieve a certain goal (e.g. accurate target tracking), without considering other relevant information that could be of different origin and type, and with possibly very different representation (e.g. a priori knowledge, contextual knowledge, mission orders, risk maps, availability and coverage of sensing resources, etc.), but that is still very significant for augmenting the knowledge about observed entities. This latter type of information is typically associated with different fusion levels and rarely ends up being systematically exploited in an automatic fashion. The result is often stove-piped systems dedicated to a single fusion task with limited robustness. This is caused by the lack of an integrative approach for processing sensor data (low-level fusion) and semantically rich information (high-level fusion) in a holistic manner, effectively implementing a multi-level processing architecture and fusion process. This special session will bring together researchers working on fusion techniques and algorithms often considered to belong to different and disjoint fusion levels, fostering discussion on the commonalities and differences in their research methodologies and proposing viable multi-level fusion solutions to challenging problems and relevant applications.
Topics of interest include (but are not limited to) the following:
- Theory and Representation: Probability theory, Dempster-Shafer theory, logic-based fusion, classifiers, statistical relational learning, data mining, ontologies.
- Algorithms: Target tracking and localization; situation assessment; pattern analysis; behavior modeling; predictive and impact assessment; process management; sensor/resource management; hard and soft data fusion.
- Architectures: Frameworks to interconnect multiple fusion processes, uncertainty representation and interchange, multi-level reasoning.
- Applications: Defence and intelligence, security, medicine, emergency response, safety monitoring, economy, robotics.
Organizers: Lauro Snidaro, University of Udine, Italy; Jesus Garcia, University Carlos III de Madrid, Spain; and Wolfgang Koch, Fraunhofer FKIE, Germany.