* [[Tutorials#tutorial12| T12 Implementations of Random-Finite-Set-Based Multi-Target Filters]]
* [[Tutorials#tutorial13| T13 Tracking and Sensor Data Fusion – Methodological Framework and Selected Applications]]
* <span style="background:#cdcdcd">[[Tutorials#tutorial14| <s>T14 Multistatic Exploration – Introduction to Modern Passive Radar and Multistatic Tracking & Data Fusion</s>]]</span> - Withdrawn
* [[Tutorials#tutorial15| T15 Big Data Fusion and Analytics]]
* [[Tutorials#tutorial16| T16 Object Tracking, Sensor Fusion and Situational Awareness for Assisted- and Self-Driving Vehicles: Problems, Solutions and Directions]]
* [[Tutorials#tutorial20| T20 Extended Object Tracking: Theory and Applications]]
* [[Tutorials#tutorial21| T21 Probabilistic Situation Assessment for Abnormal Interaction Detection]]
* <span style="background:#cdcdcd">[[Tutorials#tutorial22| <s>T22 Multitarget Tracking and Sensor Calibration in Centralized and Distributed Networks</s>]]</span> - Withdrawn
* [[Tutorials#tutorial23| T23 Information Fusion in Resource-Limited Camera Networks]]
* [[Tutorials#tutorial24| T24 Introduction to Bayesian Filtering and Smoothing]]
* [[Tutorials#tutorial25| T25 Sensor Fusion for Intelligent Vehicles]]
* <span style="background:#cdcdcd">[[Tutorials#tutorial26| <s>T26 Multisensor Data Fusion in Wireless Sensor and Actuator Networks</s>]]</span> - Withdrawn
</div>
|style="text-align:center;background-color:#e7deef;" | '''Morning'''<br />
'''08:30–11:30'''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial7| T7]]'''<br />
Multitarget Tracking and Multisensor Information Fusion<br />
''Bar-Shalom''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial4| T4]]'''<br />
… Track-to-Track Fusion and the Distributed Kalman Filter<br />
''Govaers''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial3| T3]]'''<br />
Multisensor-Multitarget Tracker/Fusion Engine …<br />
''Kiruba''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial9| T9]]'''<br />
Quantum Physics Methods For Nonlinear Filtering<br />
''Balaji, Daum''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial2| T2]]'''<br />
Bayesian Networks and Trust Fusion with Subjective Logic<br />
''Jøsang''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial15| T15]]'''<br />
Big Data Fusion and Analytics<br />
''Das''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial8| T8]]'''<br />
Overview of High-Level Information Fusion Theory …<br />
''Blasch''
|style="text-align:center;background-color:#cdcdcd;" |
<s>'''[[Tutorials#tutorial19| T19]]'''<br />
Integration of Information to Identify Objects in Big Data<br />
''Shieh''</s><br />
<small>Withdrawn by presenter</small>
|style="text-align:center;background-color:#cdcdcd;" | <s>'''[[Tutorials#tutorial14| T14]]'''<br />
Multistatic Exploration ... Modern Passive Radar …<br />
''Koch''</s><br />
<small>Withdrawn</small>
|-
|style="width:7em;text-align:center;background-color:#e7deef;"| '''11:30–12:30'''
|style="text-align:center;background-color:#e7deef;" | '''Mid Day'''
'''12:30–15:30'''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial16| T16]]'''<br />
Object tracking, sensor fusion ... self-driving vehicles …<br />
''Kiruba''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial12| T12]]'''<br />
... random-finite-set-based multi-target filters<br />
''Vo, Vo''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial6| T6]]'''<br />
Information Quality in Information Fusion …<br />
''Rogova''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial18| T18A]]'''<br />
Maneuvering Target Tracking ... Filtering Methods<br />
''Li, Jilkov''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial10| T10]]'''<br />
Basic concepts in multi-object estimation<br />
''Clark, Delande, Houssineau''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial13| T13]]'''<br />
Tracking and Sensor Data Fusion ... Framework …<br />
''Koch''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial17| T17]]'''<br />
Emerging Quantum Technologies for Fusion<br />
''Balaji''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial23| T23]]'''<br />
Information fusion in resource-limited camera networks<br />
''Cavallaro, SanMiguel''
|style="text-align:center;background-color:#cdcdcd;" | <s>'''[[Tutorials#tutorial11| T11]]'''<br />
System-of-Systems … Issues for Information Fusion<br />
''Steinberg''</s><br />
<small>Withdrawn by presenter</small>
|-
|style="width:7em;text-align:center;background-color:#e7deef;"| '''15:30–16:00'''
|style="text-align:center;background-color:#e7deef;" | '''Afternoon'''<br />
'''16:00–19:00'''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial5| T5]]'''<br />
... Finite-Set Statistics for Information Fusion<br />
''Mahler''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial25| T25]]'''<br />
Sensor Fusion for Intelligent Vehicles<br />
''Duraisamy, Yuan, Schwarz, Fritzsche''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial1| T1]]'''<br />
Bayesian Multiple Target Tracking<br />
''Stone, Streit''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial18| T18B]]'''<br />
Maneuvering Target Tracking ... Filtering Methods<br />
''Li, Jilkov''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial20| T20]]'''<br />
Extended Object Tracking: Theory and Applications<br />
''Granström, Reuter, Baum''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial24| T24]]'''<br />
Introduction to Bayesian Filtering and Smoothing<br />
''Särkkä''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial21| T21]]'''<br />
Probabilistic situation assessment for abnormal …<br />
''Regazzoni, Marcenaro''
|style="text-align:center;background-color:#cdcdcd;" | <s>'''[[Tutorials#tutorial26| T26]]'''<br />
Multisensor Data Fusion in Wireless ... Networks<br />
''Miceli de Farias''</s><br />
<small>Withdrawn</small>
|style="text-align:center;background-color:#cdcdcd;" | <s>'''[[Tutorials#tutorial22| T22]]'''<br />
Multitarget tracking and sensor calibration in ... networks<br />
''Uney, Julier, Clark''</s><br />
<small>Withdrawn</small>
|-
|style="width:7em;text-align:center;background-color:#e7deef;"| '''19:00–22:00'''
'''Length:''' 3 hours<br />
'''Brief description:''' This tutorial is based on the book ''Bayesian Multiple Target Tracking'', 2nd edition. Its purpose is to present the basic results in multiple-target tracking from a Bayesian point of view. People who register will receive a complimentary copy of the book when they attend the tutorial.<br />
[[T1| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate1.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' This tutorial gives attendees first-hand insight into the theory and application of subjective logic from the author and researcher who proposed this framework and began developing it in 1997. It introduces subjective logic and shows how it applies to Bayesian network modelling and information fusion. ...<br />
[[T2| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate2.pdf Certificate (PDF)]
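The opinion fusion described above can be sketched with the cumulative (consensus) operator for two binomial opinions; this is a generic illustration with made-up numbers, not the tutorial's own material:

```python
def cumulative_fusion(op_a, op_b):
    """Cumulative (consensus) fusion of two binomial opinions.

    An opinion is (belief, disbelief, uncertainty), with the three
    components summing to 1. Sources with low uncertainty dominate.
    """
    b1, d1, u1 = op_a
    b2, d2, u2 = op_b
    k = u1 + u2 - u1 * u2          # normalizer (assumes not both u == 0)
    b = (b1 * u2 + b2 * u1) / k
    u = (u1 * u2) / k
    return (b, 1.0 - b - u, u)     # disbelief follows from additivity

# Two sources' opinions about the same proposition (made-up values)
fused = cumulative_fusion((0.8, 0.1, 0.1), (0.6, 0.2, 0.2))
```

Fusing these two partly uncertain but agreeing sources yields belief 11/14 ≈ 0.79 with uncertainty 1/14 ≈ 0.07: agreement reduces the fused uncertainty below either input's.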
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' While numerous tracking and fusion algorithms are available in the literature, their implementation and application to real-world problems remain challenging. Since new algorithms continue to emerge, rapidly prototyping them, developing them for production, and evaluating them efficiently on real-world (or realistic) problems are also essential. In addition to reviewing state-of-the-art tracking algorithms, this tutorial will focus on a number of realistic multisensor-multitarget tracking problems, simulation of large-scale tracking scenarios, rapid prototyping, development of high-performance real-time tracking/fusion software, and performance evaluation on realistic scenarios. ...<br />
[[T3| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate3.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' The increasing trend towards connected sensors ("internet of things" and "ubiquitous computing") drives a demand for powerful distributed estimation methodologies. In tracking applications, the "Distributed Kalman Filter" (DKF) provides an optimal solution under certain conditions. The optimal solution in terms of estimation accuracy is also achieved by a centralized fusion algorithm that receives either all associated measurements or so-called tracklets.... Two more recent methodologies are based on "accumulated state densities" (ASD), which augment the states from multiple time instants. In practical applications, tracklet fusion based on the equivalent measurement often achieves reliable results even if full communication is not available. The limitations and robustness of tracklet fusion will be discussed. ...<br />
[[T4| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate4.pdf Certificate (PDF)]
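As a taste of the track-to-track fusion problem described above: when the cross-covariance between two tracks is unknown, a standard conservative fallback is covariance intersection. The scalar sketch below uses made-up numbers and is a generic illustration, not the presenters' method:

```python
def covariance_intersection(x1, P1, x2, P2, omega=0.5):
    """Conservatively fuse two scalar track estimates (mean, variance).

    Covariance intersection avoids double-counting shared information
    when the cross-covariance between the tracks is unknown.
    omega in [0, 1] weights the two information contributions.
    """
    info = omega / P1 + (1.0 - omega) / P2      # fused information
    P = 1.0 / info
    x = P * (omega * x1 / P1 + (1.0 - omega) * x2 / P2)
    return x, P

# Two tracks of the same target from different sensors (made-up values)
x, P = covariance_intersection(1.0, 1.0, 1.4, 4.0)
```

Note that the fused variance (1.6 here) is deliberately not smaller than the better input's variance; that pessimism is the price of guaranteed consistency without knowing the correlation.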
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' Finite-set statistics is a theoretically unified mathematical machinery for solving information fusion problems, based on random set theory. First systematically described in ''Statistical Multisource-Multitarget Information Fusion'' (Artech, 2007), it has attracted the interest of dozens of research groups in at least 19 nations, resulting in well over a thousand publications. ''Advances in Statistical Multisource-Multitarget Information Fusion'' (Artech, 2014) systematically described the most intriguing aspects of this research, including algorithms that outperform conventional approaches. Previous tutorials have focused on applications of random set information fusion. This is the first systematic tutorial treatment of finite-set statistics itself. ...<br />
[[T5| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate5.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' ... The tutorial will discuss major challenges and some possible approaches addressing the problem of representing and incorporating information quality into fusion processes. In particular, it will present an ontology of quality of information and identify potential methods of representing and assessing the values of quality attributes and their combination. It will also examine the relation between information quality and context, and suggest possible approaches to quality control that compensate for insufficient information and model quality.<br />
[[T6| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate6.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' To provide to the participants the latest state-of-the-art techniques to estimate the states of multiple targets with multisensor information fusion. Tools for algorithm selection, design and evaluation will be presented. These form the basis of automated decision systems for advanced surveillance and targeting. The various information processing configurations for fusion are described, including the recently solved track-to-track fusion from heterogeneous sensors.<br />
[[T7| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate7.pdf Certificate (PDF)]
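As a minimal taste of the estimation machinery such systems build on, here is a scalar Kalman measurement update that fuses two sensors' measurements sequentially; this is a generic textbook sketch with made-up numbers, not the tutorial's material:

```python
def kalman_update(x, P, z, R, H=1.0):
    """One measurement update of a scalar Kalman filter.

    x, P: prior state estimate and its variance
    z, R: measurement and its noise variance
    H:    measurement model (scalar here)
    """
    S = H * P * H + R              # innovation variance
    K = P * H / S                  # Kalman gain
    x_new = x + K * (z - H * x)    # corrected estimate
    P_new = (1.0 - K * H) * P      # reduced uncertainty
    return x_new, P_new

# Fuse two sensors' measurements of the same state, one after the other
x, P = 0.0, 10.0                               # vague prior (assumed)
x, P = kalman_update(x, P, z=1.0, R=1.0)       # sensor 1
x, P = kalman_update(x, P, z=1.2, R=2.0)       # sensor 2
```

Each update shrinks the posterior variance, and the more accurate sensor pulls the estimate harder; multisensor fusion stacks such updates (or their track-level equivalents).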
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' Over the past decade, the ISIF community has put together special sessions, panel discussions, and concept papers to capture the methodologies, directions, needs, and grand challenges of high-level information fusion (HLIF) in practical system designs. This tutorial brings together the contemporary concepts, models, and definitions to give the attendee a summary of the state of the art in HLIF. Analogies from low-level information fusion (LLIF) of object tracking and identification are extended to the HLIF concepts of situation/impact assessment and process/user refinement. HLIF theories (operational, functional, formal, cognitive) are mapped to representations (semantics, ontologies, axiomatics, and agents) with contemporary issues of modeling, testbeds, evaluation, and human-machine interfaces. Discussions with examples of search and rescue, cyber analysis, and battlefield awareness are presented. ...<br />
[[T8| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate8.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' Relationships between nonlinear filtering and quantum physics have been studied in the past. In this tutorial, more modern connections between the two fields are drawn, particularly based on methods from Feynman path integrals, quantum field theory and the renormalization group.<br />
[[T9| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate9.pdf Certificate (PDF)]
</div>
|-
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
'''Presenter:''' [mailto:D.E.Clark@hw.ac.uk Daniel Clark], Emmanuel D. Delande, and Isabel Schlangen<br />
'''Length:''' 3 hours<br />
'''Brief description:''' ... This tutorial will highlight some basic mathematical concepts in multiobject estimation to enable researchers to better understand and contribute to innovations in this field. The goal of the presenters is to inspire participants to develop a broader mathematical perspective and explore the literature in spatial statistics and point processes to aid their research in sensor fusion. The presenters will highlight where new concepts in multiobject estimation for sensor fusion, such as regional variance for estimating population uncertainty, can be facilitated when considering a measure-theoretic point process perspective.<br />
[[T10| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate10.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' The Finite Set Statistics framework for multi-sensor multi-target tracking has attracted considerable interest in recent years. It provides a unified perspective of multi-target tracking in a very intuitive manner by drawing direct parallels with the simpler problem of single-target tracking. This framework has led to the development of multi-target filters such as the Probability Hypothesis Density (PHD), Cardinalized PHD (CPHD), Multi-Bernoulli filters and, recently, the Generalized Labeled Multi-Bernoulli filter. In this tutorial, we show how these filters are implemented and illustrate via Matlab how these filters work. ...<br />
[[T12| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate12.pdf Certificate (PDF)]
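To give a flavor of how such filters are implemented, here is a minimal sketch of one prediction step of a Gaussian-mixture PHD filter for a 1-D linear model. The model parameters and birth intensity are made up for illustration; this is not the presenters' code:

```python
# Each mixture component is (weight, mean, variance); the sum of the
# weights is the expected number of targets under the PHD intensity.

F, Q = 1.0, 0.1           # state transition and process noise (assumed)
p_survival = 0.9          # probability a target survives the time step

def phd_predict(components, birth):
    """Propagate a GM-PHD intensity: scale weights by the survival
    probability, push means/variances through the motion model,
    then append the birth components."""
    predicted = [(p_survival * w, F * m, F * P * F + Q)
                 for (w, m, P) in components]
    return predicted + birth

prior = [(1.0, 0.0, 1.0), (0.5, 5.0, 2.0)]   # two hypothesized targets
births = [(0.1, 10.0, 4.0)]                   # birth intensity (assumed)
pred = phd_predict(prior, births)

expected_targets = sum(w for (w, _, _) in pred)   # 0.9 + 0.45 + 0.1
```

The matching update step would rescale these weights against each measurement's likelihood; working Matlab implementations of the full recursions are what the tutorial itself walks through.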
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' The tutorial covers the material of the presenter's recently published book of the same title (Springer 2014, Mathematical Engineering Series, ISBN 978-3-642-39270-2) and thus provides a guided introduction to deeper reading. The starting point is the well-known JDL model of sensor data and information fusion, which provides general orientation within the world of fusion methodologies and its various applications, covering a dynamically evolving field of ever-increasing relevance. Using the JDL model as a guiding principle, the tutorial introduces advanced fusion technologies based on practical examples taken from real-world applications.<br />
[[T13| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate13.pdf Certificate (PDF)]
</div>
|-
|}
{| id="mp-left" style="width:100%; vertical-align:top; background:#ffffe8;"
| style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#fff7bd; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f2ea7e; text-align:left; color:#000; padding:0.2em 0.4em;">
<span style="background:#cdcdcd"><s>T14 Multistatic Exploration – Introduction to Modern Passive Radar and Multistatic Tracking & Data Fusion</s></span> - Withdrawn</h2>
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
'''Length:''' 3 hours<br />
'''Brief description:''' Big data has tremendous potential to transform businesses but poses significant challenges in searching, processing, and extracting actionable intelligence. In this tutorial, I will present some techniques for fusion and analytics to process big centralized warehouse data, inherently distributed data, and data residing on the cloud. The fusion and analytics techniques to be discussed will handle both structured transactional and sensor data as well as unstructured textual data such as human intelligence, emails, blogs, surveys, etc. As a background, this tutorial is intended to provide an account of both the cutting-edge and the most commonly used approaches to high-level data fusion and predictive and text analytics. The demos to be presented are in the areas of distributed search and situation assessment, information extraction and classification, and sentiment analyses. ...<br />
[[T15| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate15.pdf Certificate (PDF)]
</div>
|- | |- | ||
'''Length:''' 3 hours<br />
'''Brief description:''' ... In this tutorial, we aim to discuss a number of problems related to assisted- and self-driving vehicles, potential solutions, and directions for research & development. The issues discussed in this tutorial will span multitarget tracking, multisensor fusion and situational awareness within the context of smart cars. We will also present some of the algorithms that are available in the open literature as well as those we have developed recently. In addition, we will discuss related computational issues and sensor technologies. Finally, we will present some results on real data.<br />
[[T16| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate16.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' ... Although the fundamentals of quantum physics have been well known since the 1920s, in the last few decades several novel consequences of the laws of quantum physics (particularly in the areas of atomic, molecular and optical physics and quantum computer science and information theory) have been discovered. ... In particular, in the area of sensing, quantum physics sets bounds on the sensitivity of sensing... that are orders of magnitude below the sensitivity of current sensors. In the area of computing, it has been observed that a quantum computer allows some computations to be carried out that are unfeasible using current or future classical computing technology. In the area of communication, quantum physics enables provably secure communication at much higher data rates than those allowed by the classical Shannon limit. Many of these advances could have major near-term and long-term consequences in the areas of sensing, secure communication, big data analysis, and machine learning, and hence sensor and information fusion.<br />
[[T17| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate17.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3+3 hours<br />
'''Brief description:''' The principal challenges in tracking a maneuverable target are nonlinearity in both the target motion and measurement models, as well as uncertainty in the pattern of target motion. This tutorial presents the theoretical and algorithmic means available to meet these challenges. The overview part elucidates a well-organized panorama of maneuvering target tracking. The other part presents in-depth coverage of recent advances in nonlinear filtering for maneuvering target tracking, including some of the instructors’ results and insights as well as better-known methods. The tutorial highlights the underlying ideas, the pros and cons of the approaches and techniques, and the inter-relationships among them. It is an outgrowth of the instructors’ ongoing comprehensive survey and several short courses on the same subject, as well as a graduate course on target tracking taught at the Electrical Engineering Department of the University of New Orleans.<br />
[[T18| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate18.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' Autonomous driver safety functions are standard in many modern cars, and semi-automated systems (e.g., traffic jam assist) are becoming more and more common. Construction of a driverless vehicle requires solutions to many different problems, among them multiple object tracking. This tutorial will introduce the audience to extended object tracking, i.e., object tracking using modern high resolution sensors that give multiple detections per object. State of the art theory will be introduced, and relevant real world applications will be shown where different object types—e.g., pedestrians, bicyclists, cars—are tracked using different sensors such as lidar, radar, and camera.<br />
[[T20| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate20.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' The tutorial aims at providing an overview of new insights into extending Dynamic Bayesian Network techniques for representing, modeling and automatically interpreting and managing complex interaction situations occurring in cognitive environments, starting from observations provided by multidimensional signals collected through a distributed network of embedded systems. A uniform representation is discussed that can also be used to support decisions concerning interactions between operators and the status of the observed environment. Solutions based on an extension of traditional Bayesian filters for object assessment form the background from which the techniques in this tutorial are developed. ...<br />
[[T21| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate21.pdf Certificate (PDF)]
</div>
|-
{| id="mp-left" style="width:100%; vertical-align:top; background:#e7f7e7;"
| style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#ceecf2; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #a3babf; text-align:left; color:#000; padding:0.2em 0.4em;"><span style="background:#cdcdcd"><s>T22 Multitarget Tracking and Sensor Calibration in Centralized and Distributed Networks</s></span> - Withdrawn</h2>
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
'''Length:''' 3 hours<br />
'''Brief description:''' … This tutorial will introduce key features of modern visual sensor networks while exploring the issues commonly found in such networks, which have recently become central in several applications. For smart-camera networks to enable these emerging applications they need to adapt to unforeseen conditions and varying tasks under constrained resources. The tutorial will offer theoretical explanations followed by examples using the WiseMNet++ simulator.<br />
[[T23| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate23.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' ... The tutorial introduces the current state of the art in non-linear (single-target) optimal filtering and smoothing methods in a unified Bayesian framework. The attendees learn what non-linear Kalman filters and particle filters are, how they are related, and their relative advantages and disadvantages. They also discover how Bayesian parameter estimation methods can be combined with the filtering and smoothing algorithms. ... Example applications come from navigation, remote surveillance, and time series analysis.<br />
[[T24| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate24.pdf Certificate (PDF)]
</div>
|-
'''Length:''' 3 hours<br />
'''Brief description:''' This tutorial is focussed on the stringent requirements, foundations, development and testing of sensor fusion algorithms meant for advanced driver assistance functions and driverless applications in automotive vehicle systems. ... An interesting part of the tutorial covers the different challenging and important practical aspects, such as fusion with incomplete information and data association, related to fusion and target tracking in an automotive setting. Fusion and management of the different extended target representations of heterogeneous nature, obtained from sensors with different resolutions, is presented with examples. More than one kind of intelligent vehicular sensor fusion framework, dealing with tracked objects (track-level fusion) and with raw sensor measurements (measurement-level fusion), will be presented in this tutorial, with results obtained using several real-world data sets that contain various static and dynamic targets.<br />
[[T25| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate25.pdf Certificate (PDF)]
</div>
|-
| class="MainPageBG" style="width:100%; border:1px solid #f5946e; background:#fae6de; vertical-align:top; color:#000;" |
{| id="mp-left" style="width:100%; vertical-align:top; background:#fae6de;"
| style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#f9d6c9; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f5946e; text-align:left; color:#000; padding:0.2em 0.4em;"><span style="background:#cdcdcd"><s>T26 Multisensor Data Fusion in Wireless Sensor and Actuator Networks</s></span> - Withdrawn</h2>
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
| style="border:1px solid transparent;" |<br />
|-
{{Organisation}}
__NOTOC____NOEDITSECTION__
Latest revision as of 12:47, 21 July 2016