<!-- CALL FOR PAPERS -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<!-- CFP Download -->
| class="MainPageBG" style="width:100%; border:1px solid #bdd6c6; background:#e7f7e7; vertical-align:top; color:#000;" |
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<!-- General Information -->
| class="MainPageBG" style="width:100%; border:1px solid #d6bdde; background:#f7eff7; vertical-align:top; color:#000;" |
Tutorials will be held on Tuesday, July 5, 2016, from 08:30 to 19:00.

Tutorial registration does not include lunch, but coffee/tea breaks between tutorials are included.

|-
|}
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<!-- Information for tutorial attendees: -->
| class="MainPageBG" style="width:100%; border:1px solid #f5946e; background:#fae6de; vertical-align:top; color:#000;" |
{| id="mp-left" style="width:100%; vertical-align:top; background:#fae6de;"
| style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#f9d6c9; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f5946e; text-align:left; color:#000; padding:0.2em 0.4em;">Information for Tutorial Attendees</h2>
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<!-- Information for tutorial presenters: -->
| class="MainPageBG" style="width:100%; border:1px solid #a3babf; background:#f5fdff; vertical-align:top; color:#000;" |
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
* The conference does not offer printing services and is not responsible for study materials distributed to attendees; these are solely the presenter's responsibility.
* The tutorial cost for attendees is €165–€250 per 3-hour slot, depending on the time of registration and the number of tutorials taken; see the [[Fees and Registration| Fees and Registration page]].
* There will be an honorarium for tutorial presenters: the proceeds from each tutorial, less the room cost of €250, will be shared 50-50 between Fusion 2016 and the presenter. When there is more than one presenter, the honorarium is paid to the first person named in the program.
* The number of attendees will be counted based on registrations.
* Any tutorial with fewer than 5 people registered by the early-bird deadline of June 1<sup>st</sup> may be cancelled.
* Please be present in your assigned room early. The time and room for your tutorial are shown in the [[Tutorials#timetable| timetable]] below.
* There will be a technician and a student volunteer to help you set up the system and provide access control.
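The honorarium rule above is simple arithmetic, which can be sketched as follows. This is only an illustration of the stated 50-50 split; the function name, and the floor at zero for a tutorial whose proceeds do not cover the room cost, are assumptions, not stated conference policy.

```python
def presenter_honorarium(attendees: int, fee_eur: float, room_cost_eur: float = 250.0) -> float:
    """Presenter's share: (proceeds - room cost), split 50-50 with Fusion 2016.

    The clamp at zero for loss-making tutorials is an assumption,
    not stated conference policy.
    """
    proceeds = attendees * fee_eur
    return max(proceeds - room_cost_eur, 0.0) / 2.0

# e.g. 10 attendees at the lowest €165 rate: (1650 - 250) / 2 = 700.0
```

Under this reading, ten attendees at the lowest €165 rate would yield a €700 honorarium.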
<!-- List of Tutorials of FUSION 2016 -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<!-- List of Special Sessions of FUSION 2016 -->
| class="MainPageBG" style="width:100%; border:1px solid #f2ea7e; background:#ffffe8; vertical-align:top; color:#000;" |
* [[Tutorials#tutorial9| T9 Quantum Physics Methods for Nonlinear Filtering]]
* [[Tutorials#tutorial10| T10 Basic Concepts in Multiobject Estimation]]
* <span style="background:#cdcdcd"><s>[[Tutorials#tutorial11| T11 System-of-Systems Opportunities and Issues for Information Fusion]]</s></span> - Withdrawn by presenter
* [[Tutorials#tutorial12| T12 Implementations of Random-Finite-Set-Based Multi-Target Filters]]
* [[Tutorials#tutorial13| T13 Tracking and Sensor Data Fusion – Methodological Framework and Selected Applications]]
* <span style="background:#cdcdcd"><s>[[Tutorials#tutorial14| T14 Multistatic Exploration – Introduction to Modern Passive Radar and Multistatic Tracking & Data Fusion]]</s></span> - Withdrawn
* [[Tutorials#tutorial15| T15 Big Data Fusion and Analytics]]
* [[Tutorials#tutorial16| T16 Object Tracking, Sensor Fusion and Situational Awareness for Assisted- and Self-Driving Vehicles: Problems, Solutions and Directions]]
* [[Tutorials#tutorial17| T17 Emerging Quantum Technologies for Fusion]]
* [[Tutorials#tutorial18| T18 Maneuvering Target Tracking: Overview and Nonlinear Filtering Methods]]
* <span style="background:#cdcdcd"><s>[[Tutorials#tutorial19| T19 Integration of Information to Identify Objects in Big Data]]</s></span> - Withdrawn by presenter
* [[Tutorials#tutorial20| T20 Extended Object Tracking: Theory and Applications]]
* [[Tutorials#tutorial21| T21 Probabilistic Situation Assessment for Abnormal Interaction Detection]]
* <span style="background:#cdcdcd"><s>[[Tutorials#tutorial22| T22 Multitarget Tracking and Sensor Calibration in Centralized and Distributed Networks]]</s></span> - Withdrawn
* [[Tutorials#tutorial23| T23 Information Fusion in Resource-Limited Camera Networks]]
* [[Tutorials#tutorial24| T24 Introduction to Bayesian Filtering and Smoothing]]
* [[Tutorials#tutorial25| T25 Sensor Fusion for Intelligent Vehicles]]
* <span style="background:#cdcdcd"><s>[[Tutorials#tutorial26| T26 Multisensor Data Fusion in Wireless Sensor and Actuator Networks]]</s></span> - Withdrawn
</div>
<!-- FUSION 2016 Tutorial Time Table -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<!-- FUSION 2016 Tutorial Time Table -->
<div id="timetable"></div>
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
{| class="wikitable" style="text-align:center;background-color:#e7deef;"
! style="width:8em;text-align:center;background-color:#e7deef;"|'''Start & end times'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room B'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room C'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room D'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room E'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room F'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room G'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room H'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room I'''
! style="width:15em;text-align:center;background-color:#e7deef;"|'''Room K'''
|-
|style="text-align:center;background-color:#e7deef;" | '''Morning'''<br />
'''08:30–11:30'''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial7| T7]]'''<br />
Multitarget Tracking and Multisensor Information Fusion<br />
''Bar-Shalom''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial4| T4]]'''<br />
… Track-to-Track Fusion and the Distributed Kalman Filter<br />
''Govaers''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial3| T3]]'''<br />
Multisensor-Multitarget Tracker/Fusion Engine …<br />
''Kiruba''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial9| T9]]'''<br />
Quantum Physics Methods for Nonlinear Filtering<br />
''Balaji, Daum''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial2| T2]]'''<br />
Bayesian Networks and Trust Fusion with Subjective Logic<br />
''Jøsang''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial15| T15]]'''<br />
Big Data Fusion and Analytics<br />
''Das''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial8| T8]]'''<br />
Overview of High-Level Information Fusion Theory …<br />
''Blasch''
|style="text-align:center;background-color:#cdcdcd;" |
<s>'''[[Tutorials#tutorial19| T19]]'''<br />
Integration of Information to Identify Objects in Big Data<br />
''Shieh''</s><br />
<small>Withdrawn by presenter</small>
|style="text-align:center;background-color:#cdcdcd;" | <s>'''[[Tutorials#tutorial14| T14]]'''<br />
Multistatic Exploration ... Modern Passive Radar …<br />
''Koch''</s><br />
<small>Withdrawn</small>
|-
|style="width:7em;text-align:center;background-color:#e7deef;"| '''11:30–12:30'''
|colspan="9" style="width:15em;text-align:center;background-color:#f7eff7;"|
'''Lunch break'''<br />
|-
|style="text-align:center;background-color:#e7deef;" | '''Mid Day'''<br />
'''12:30–15:30'''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial16| T16]]'''<br />
Object Tracking, Sensor Fusion ... Self-Driving Vehicles …<br />
''Kiruba''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial12| T12]]'''<br />
... Random-Finite-Set-Based Multi-Target Filters<br />
''Vo, Vo''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial6| T6]]'''<br />
Information Quality in Information Fusion …<br />
''Rogova''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial18| T18A]]'''<br />
Maneuvering Target Tracking ... Filtering Methods<br />
''Li, Jilkov''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial10| T10]]'''<br />
Basic Concepts in Multi-Object Estimation<br />
''Clark, Delande, Houssineau''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial13| T13]]'''<br />
Tracking and Sensor Data Fusion ... Framework …<br />
''Koch''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial17| T17]]'''<br />
Emerging Quantum Technologies for Fusion<br />
''Balaji''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial23| T23]]'''<br />
Information Fusion in Resource-Limited Camera Networks<br />
''Cavallaro, SanMiguel''
|style="text-align:center;background-color:#cdcdcd;" | <s>'''[[Tutorials#tutorial11| T11]]'''<br />
System-of-Systems … Issues for Information Fusion<br />
''Steinberg''</s><br />
<small>Withdrawn by presenter</small>
|-
|style="width:7em;text-align:center;background-color:#e7deef;"| '''15:30–16:00'''
|colspan="9" style="width:15em;text-align:center;background-color:#f7eff7;"|
'''Coffee/tea break'''<br />
|-
|style="text-align:center;background-color:#e7deef;" | '''Afternoon'''<br />
'''16:00–19:00'''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial5| T5]]'''<br />
... Finite-Set Statistics for Information Fusion<br />
''Mahler''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial25| T25]]'''<br />
Sensor Fusion for Intelligent Vehicles<br />
''Duraisamy, Yuan, Schwarz, Fritzsche''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial1| T1]]'''<br />
Bayesian Multiple Target Tracking<br />
''Stone, Streit''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial18| T18B]]'''<br />
Maneuvering Target Tracking ... Filtering Methods<br />
''Li, Jilkov''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial20| T20]]'''<br />
Extended Object Tracking: Theory and Applications<br />
''Granström, Reuter, Baum''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial24| T24]]'''<br />
Introduction to Bayesian Filtering and Smoothing<br />
''Särkkä''
|style="text-align:center;background-color:#f7eff7;" | '''[[Tutorials#tutorial21| T21]]'''<br />
Probabilistic Situation Assessment for Abnormal …<br />
''Regazzoni, Marcenaro''
|style="text-align:center;background-color:#cdcdcd;" | <s>'''[[Tutorials#tutorial26| T26]]'''<br />
Multisensor Data Fusion in Wireless ... Networks<br />
''Miceli de Farias''</s><br />
<small>Withdrawn</small>
|style="text-align:center;background-color:#cdcdcd;" | <s>'''[[Tutorials#tutorial22| T22]]'''<br />
Multitarget Tracking and Sensor Calibration in ... Networks<br />
''Uney, Julier, Clark''</s><br />
<small>Withdrawn</small>
|-
|style="width:7em;text-align:center;background-color:#e7deef;"| '''19:00–22:00'''
|colspan="9" style="width:15em;text-align:center;background-color:#f7eff7;"|
'''Welcome Reception'''<br />
|}

Additionally, you can check out the full [[schedule]].
|-
|}
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial1">
<!-- T1 Bayesian Multiple Target Tracking -->
| class="MainPageBG" style="width:100%; border:1px solid #f5946e; background:#fae6de; vertical-align:top; color:#000;" |
{| id="mp-left" style="width:100%; vertical-align:top; background:#fae6de;"
| style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#f9d6c9; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f5946e; text-align:left; color:#000; padding:0.2em 0.4em;">T1 Bayesian Multiple Target Tracking</h2>
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
'''Length:''' 3 hours<br />
'''Brief description:''' This tutorial is based on the book ''Bayesian Multiple Target Tracking'', 2nd ed. Its purpose is to present the basic results in multiple-target tracking from a Bayesian point of view. Registered attendees will receive a complimentary copy of the book at the tutorial.<br />
[[T1| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate1.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial2"></div>
<!-- T2 Bayesian Networks and Trust Fusion with Subjective Logic -->
'''Length:''' 3 hours<br />
'''Brief description:''' This tutorial gives attendees a first-hand insight into the theory and application of subjective logic by the author and researcher who proposed and started developing this framework in 1997. The tutorial gives an introduction to subjective logic, and how it applies to Bayesian network modelling and information fusion. ...<br />
[[T2| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate2.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial3"></div>
<!-- T3 Multisensor-Multitarget Tracker/Fusion Engine Development and Performance Evaluation for Realistic Scenarios -->
'''Length:''' 3 hours<br />
'''Brief description:''' While numerous tracking and fusion algorithms are available in the literature, their implementation and application to real-world problems remain challenging. Since new algorithms continue to emerge, rapidly prototyping them, developing them for production, and evaluating them efficiently on real-world (or realistic) problems are also essential. In addition to reviewing state-of-the-art tracking algorithms, this tutorial will focus on a number of realistic multisensor-multitarget tracking problems, simulation of large-scale tracking scenarios, rapid prototyping, development of high-performance real-time tracking/fusion software, and performance evaluation on realistic scenarios. ...<br />
[[T3| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate3.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial4"></div>
<!-- T4 An Introduction to Track-to-Track Fusion and the Distributed Kalman Filter -->
'''Length:''' 3 hours<br />
'''Brief description:''' The increasing trend towards connected sensors ("internet of things" and "ubiquitous computing") drives a demand for powerful distributed estimation methodologies. In tracking applications, the "Distributed Kalman Filter" (DKF) provides an optimal solution under certain conditions. The optimal solution in terms of estimation accuracy is also achieved by a centralized fusion algorithm which receives either all associated measurements or so-called tracklets. ... Two more recent methodologies are based on the "accumulated state densities" (ASD), which augment the states from multiple time instants. In practical applications, tracklet fusion based on the equivalent measurement often achieves reliable results even if full communication is not available. The limitations and robustness of tracklet fusion will be discussed. ...<br />
[[T4| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate4.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial5"></div>
<!-- T5 An Introduction to Finite-Set Statistics for Information Fusion -->
'''Length:''' 3 hours<br />
'''Brief description:''' Finite-set statistics is a theoretically unified mathematical machinery for solving information fusion problems, based on random set theory. First systematically described in Statistical Multisource-Multitarget Information Fusion (Artech, 2007), it has attracted the interest of dozens of research groups in at least 19 nations, resulting in well over a thousand publications. Advances in Statistical Multisource-Multitarget Information Fusion (Artech, 2014) systematically described the most intriguing aspects of this research, including algorithms that outperform conventional approaches. Previous tutorials have focused on applications of random set information fusion; this is the first systematic tutorial treatment of finite-set statistics itself. ...<br />
[[T5| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate5.pdf Certificate (PDF)]
</div>
|-
Line 406: | Line 354: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial6"> | <div id="tutorial6"> | ||
<!-- T6 Information Quality in Information Fusion and Decision Making --> | <!-- T6 Information Quality in Information Fusion and Decision Making --> | ||
− | | class="MainPageBG" style="width:100%; border:1px solid # | + | | class="MainPageBG" style="width:100%; border:1px solid #f5946e; background:#fae6de; vertical-align:top; color:#000;" | |
− | {| id="mp-left" style="width:100%; vertical-align:top; background:# | + | {| id="mp-left" style="width:100%; vertical-align:top; background:#fae6de;" |
− | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:# | + | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#f9d6c9; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f5946e; text-align:left; color:#000; padding:0.2em 0.4em;">T6 Information Quality in Information Fusion and Decision Making</h2> |
|- | |- | ||
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | | style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | ||
Line 417: | Line 365: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' ... The tutorial will discuss major challenges and some possible approaches addressing the problem of representing and incorporating information quality into fusion processes. In particular, it will present an ontology of quality of information and identify potential methods of representing and assessing the values of quality attributes and their combination. It will also examine the relation between information quality and context, and suggest possible approaches to quality control that compensate for insufficient information and model quality.<br /> | '''Brief description:''' ... The tutorial will discuss major challenges and some possible approaches addressing the problem of representing and incorporating information quality into fusion processes. In particular, it will present an ontology of quality of information and identify potential methods of representing and assessing the values of quality attributes and their combination. It will also examine the relation between information quality and context, and suggest possible approaches to quality control that compensate for insufficient information and model quality.<br /> | ||
− | [[T6| More Details]] | + | [[T6| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate6.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 425: | Line 373: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial7"></div> | <div id="tutorial7"></div> | ||
<!-- T7 Multitarget Tracking and Multisensor Information Fusion --> | <!-- T7 Multitarget Tracking and Multisensor Information Fusion --> | ||
Line 437: | Line 385: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' To provide participants with the latest state-of-the-art techniques to estimate the states of multiple targets with multisensor information fusion. Tools for algorithm selection, design and evaluation will be presented. These form the basis of automated decision systems for advanced surveillance and targeting. The various information processing configurations for fusion are described, including the recently solved track-to-track fusion from heterogeneous sensors.<br /> | '''Brief description:''' To provide participants with the latest state-of-the-art techniques to estimate the states of multiple targets with multisensor information fusion. Tools for algorithm selection, design and evaluation will be presented. These form the basis of automated decision systems for advanced surveillance and targeting. The various information processing configurations for fusion are described, including the recently solved track-to-track fusion from heterogeneous sensors.<br /> | ||
− | [[T7| More Details]] | + | [[T7| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate7.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 445: | Line 393: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial8"></div> | <div id="tutorial8"></div> | ||
<!-- T8 Overview of High-Level Information Fusion Theory, Models, and Representations --> | <!-- T8 Overview of High-Level Information Fusion Theory, Models, and Representations --> | ||
Line 456: | Line 404: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' Over the past decade, the ISIF community has put together special sessions, panel discussions, and concept papers to capture the methodologies, directions, needs, and grand challenges of high-level information fusion (HLIF) in practical system designs. This tutorial brings together the contemporary concepts, models, and definitions to give the attendee a summary of the state-of-the-art in HLIF. Analogies from low-level information fusion (LLIF) of object tracking and identification are extended to the HLIF concepts of situation/impact assessment and process/user refinement. HLIF theories (operational, functional, formal, cognitive) are mapped to representations (semantics, ontologies, axiomatics, and agents) with contemporary issues of modeling, testbeds, evaluation, and human-machine interfaces. Discussions with examples of search and rescue, cyber analysis, and battlefield awareness are presented. ...<br /> | '''Brief description:''' Over the past decade, the ISIF community has put together special sessions, panel discussions, and concept papers to capture the methodologies, directions, needs, and grand challenges of high-level information fusion (HLIF) in practical system designs. This tutorial brings together the contemporary concepts, models, and definitions to give the attendee a summary of the state-of-the-art in HLIF. Analogies from low-level information fusion (LLIF) of object tracking and identification are extended to the HLIF concepts of situation/impact assessment and process/user refinement. HLIF theories (operational, functional, formal, cognitive) are mapped to representations (semantics, ontologies, axiomatics, and agents) with contemporary issues of modeling, testbeds, evaluation, and human-machine interfaces. Discussions with examples of search and rescue, cyber analysis, and battlefield awareness are presented. ...<br /> | ||
− | [[T8| More Details]] | + | [[T8| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate8.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 464: | Line 412: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial9"></div> | <div id="tutorial9"></div> | ||
<!-- T9 Quantum Physics Methods for Nonlinear Filtering --> | <!-- T9 Quantum Physics Methods for Nonlinear Filtering --> | ||
Line 475: | Line 423: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' Relationships between nonlinear filtering and quantum physics have been studied in the past. In this tutorial, more modern connections between the two fields are drawn, based in particular on methods from Feynman path integrals, quantum field theory and the renormalization group.<br /> | '''Brief description:''' Relationships between nonlinear filtering and quantum physics have been studied in the past. In this tutorial, more modern connections between the two fields are drawn, based in particular on methods from Feynman path integrals, quantum field theory and the renormalization group.<br /> | ||
− | [[T9| More Details]] | + | [[T9| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate9.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 483: | Line 431: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial10"></div> | <div id="tutorial10"></div> | ||
<!-- T10 Basic Concepts in Multiobject Estimation --> | <!-- T10 Basic Concepts in Multiobject Estimation --> | ||
Line 491: | Line 439: | ||
|- | |- | ||
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | | style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | ||
− | '''Presenter:''' [mailto:D.E.Clark@hw.ac.uk Daniel Clark], Emmanuel D. Delande, and | + | '''Presenter:''' [mailto:D.E.Clark@hw.ac.uk Daniel Clark], Emmanuel D. Delande, and Isabel Schlangen<br /> |
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' ... This tutorial will highlight some basic mathematical concepts in multiobject estimation to enable researchers to better understand and contribute to innovations in this field. The goal of the presenters is to inspire participants to develop a broader mathematical perspective and explore the literature in spatial statistics and point processes to aid their research in sensor fusion. The presenters will highlight how new concepts in multiobject estimation for sensor fusion, such as regional variance for estimating population uncertainty, can be facilitated by a measure-theoretic point process perspective.<br /> | '''Brief description:''' ... This tutorial will highlight some basic mathematical concepts in multiobject estimation to enable researchers to better understand and contribute to innovations in this field. The goal of the presenters is to inspire participants to develop a broader mathematical perspective and explore the literature in spatial statistics and point processes to aid their research in sensor fusion. The presenters will highlight how new concepts in multiobject estimation for sensor fusion, such as regional variance for estimating population uncertainty, can be facilitated by a measure-theoretic point process perspective.<br /> | ||
− | [[T10| More Details]] | + | [[T10| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate10.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 502: | Line 450: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial11"> | <div id="tutorial11"> | ||
− | <!-- T11 System-of-Systems | + | <!-- T11 System-of-Systems Opportunities and Issues for Information Fusion --> |
− | | class="MainPageBG" style="width:100%; border:1px solid # | + | | class="MainPageBG" style="width:100%; border:1px solid #f5946e; background:#fae6de; vertical-align:top; color:#000;" | |
− | {| id="mp-left" style="width:100%; vertical-align:top; background:# | + | {| id="mp-left" style="width:100%; vertical-align:top; background:#fae6de;" |
− | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:# | + | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#f9d6c9; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f5946e; text-align:left; color:#000; padding:0.2em 0.4em;"><span style="background:#cdcdcd"><s>T11 System-of-Systems Opportunities and Issues for Information Fusion</s></span> - Withdrawn by presenter</h2> |
|- | |- | ||
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | | style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | ||
Line 521: | Line 469: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial12"></div> | <div id="tutorial12"></div> | ||
<!-- T12 Implementations of Random-Finite-Set-Based Multi-Target Filters --> | <!-- T12 Implementations of Random-Finite-Set-Based Multi-Target Filters --> | ||
Line 533: | Line 481: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' The Finite Set Statistics framework for multi-sensor multi-target tracking has attracted considerable interest in recent years. It provides a unified perspective of multi-target tracking in a very intuitive manner by drawing direct parallels with the simpler problem of single-target tracking. This framework has led to the development of multi-target filters such as the Probability Hypothesis Density (PHD), Cardinalized PHD (CPHD), Multi-Bernoulli filters and, recently, the Generalized Labeled Multi-Bernoulli filter. In this tutorial, we show how these filters are implemented and illustrate via Matlab how these filters work. ...<br /> | '''Brief description:''' The Finite Set Statistics framework for multi-sensor multi-target tracking has attracted considerable interest in recent years. It provides a unified perspective of multi-target tracking in a very intuitive manner by drawing direct parallels with the simpler problem of single-target tracking. This framework has led to the development of multi-target filters such as the Probability Hypothesis Density (PHD), Cardinalized PHD (CPHD), Multi-Bernoulli filters and, recently, the Generalized Labeled Multi-Bernoulli filter. In this tutorial, we show how these filters are implemented and illustrate via Matlab how these filters work. ...<br /> | ||
− | [[T12| More Details]] | + | [[T12| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate12.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 541: | Line 489: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial13"></div> | <div id="tutorial13"></div> | ||
<!-- T13 Tracking and Sensor Data Fusion – Methodological Framework and Selected Applications --> | <!-- T13 Tracking and Sensor Data Fusion – Methodological Framework and Selected Applications --> | ||
Line 552: | Line 500: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' The tutorial covers the material of the presenter's recently published book of the same title (Springer 2014, Mathematical Engineering Series, ISBN 978-3-642-39270-2) and thus provides a guided introduction to deeper reading. The starting point is the well-known JDL model of sensor data and information fusion, which provides general orientation within the world of fusion methodologies and its various applications, covering a dynamically evolving field of ever-increasing relevance. Using the JDL model as a guiding principle, the tutorial introduces advanced fusion technologies based on practical examples taken from real-world applications.<br /> | '''Brief description:''' The tutorial covers the material of the presenter's recently published book of the same title (Springer 2014, Mathematical Engineering Series, ISBN 978-3-642-39270-2) and thus provides a guided introduction to deeper reading. The starting point is the well-known JDL model of sensor data and information fusion, which provides general orientation within the world of fusion methodologies and its various applications, covering a dynamically evolving field of ever-increasing relevance. Using the JDL model as a guiding principle, the tutorial introduces advanced fusion technologies based on practical examples taken from real-world applications.<br /> | ||
− | [[T13| More Details]] | + | [[T13| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate13.pdf Certificate (PDF)]
− | </div> | + | </div> |
|- | |- | ||
|} | |} | ||
Line 560: | Line 508: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial14"></div> | <div id="tutorial14"></div> | ||
<!-- T14 Multistatic Exploration – Introduction to Modern Passive Radar and Multistatic Tracking & Data Fusion --> | <!-- T14 Multistatic Exploration – Introduction to Modern Passive Radar and Multistatic Tracking & Data Fusion --> | ||
| class="MainPageBG" style="width:100%; border:1px solid #f2ea7e; background:#ffffe8; vertical-align:top; color:#000;" | | | class="MainPageBG" style="width:100%; border:1px solid #f2ea7e; background:#ffffe8; vertical-align:top; color:#000;" | | ||
{| id="mp-left" style="width:100%; vertical-align:top; background:#ffffe8;" | {| id="mp-left" style="width:100%; vertical-align:top; background:#ffffe8;" | ||
− | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#fff7bd; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f2ea7e; text-align:left; color:#000; padding:0.2em 0.4em;">T14 Multistatic Exploration – Introduction to Modern Passive Radar and Multistatic Tracking & Data Fusion</h2> | + | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#fff7bd; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f2ea7e; text-align:left; color:#000; padding:0.2em 0.4em;"> |
+ | <span style="background:#cdcdcd"><s>T14 Multistatic Exploration – Introduction to Modern Passive Radar and Multistatic Tracking & Data Fusion</s></span> - Withdrawn</h2> | ||
|- | |- | ||
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | | style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | ||
Line 579: | Line 528: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial15"></div> | <div id="tutorial15"></div> | ||
<!-- T15 Big Data Fusion and Analytics --> | <!-- T15 Big Data Fusion and Analytics --> | ||
Line 590: | Line 539: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' Big data has tremendous potential to transform businesses but poses significant challenges in searching, processing, and extracting actionable intelligence. In this tutorial, I will present some techniques for fusion and analytics to process big centralized warehouse data, inherently distributed data, and data residing on the cloud. The fusion and analytics techniques to be discussed will handle both structured transactional and sensor data as well as unstructured textual data such as human intelligence, emails, blogs, surveys, etc. As background, this tutorial is intended to provide an account of both the cutting-edge and the most commonly used approaches to high-level data fusion and predictive and text analytics. The demos to be presented are in the areas of distributed search and situation assessment, information extraction and classification, and sentiment analysis. ...<br /> | '''Brief description:''' Big data has tremendous potential to transform businesses but poses significant challenges in searching, processing, and extracting actionable intelligence. In this tutorial, I will present some techniques for fusion and analytics to process big centralized warehouse data, inherently distributed data, and data residing on the cloud. The fusion and analytics techniques to be discussed will handle both structured transactional and sensor data as well as unstructured textual data such as human intelligence, emails, blogs, surveys, etc. As background, this tutorial is intended to provide an account of both the cutting-edge and the most commonly used approaches to high-level data fusion and predictive and text analytics. The demos to be presented are in the areas of distributed search and situation assessment, information extraction and classification, and sentiment analysis. ...<br /> | ||
− | [[T15| More Details]] | + | [[T15| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate15.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 598: | Line 547: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial16"> | <div id="tutorial16"> | ||
<!-- T16 Object Tracking, Sensor Fusion and Situational Awareness for Assisted- and Self-Driving Vehicles: Problems, Solutions and Directions --> | <!-- T16 Object Tracking, Sensor Fusion and Situational Awareness for Assisted- and Self-Driving Vehicles: Problems, Solutions and Directions --> | ||
− | | class="MainPageBG" style="width:100%; border:1px solid # | + | | class="MainPageBG" style="width:100%; border:1px solid #f5946e; background:#fae6de; vertical-align:top; color:#000;" | |
− | {| id="mp-left" style="width:100%; vertical-align:top; background:# | + | {| id="mp-left" style="width:100%; vertical-align:top; background:#fae6de;" |
− | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:# | + | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#f9d6c9; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f5946e; text-align:left; color:#000; padding:0.2em 0.4em;">T16 Object Tracking, Sensor Fusion and Situational Awareness for Assisted- and Self-Driving Vehicles: Problems, Solutions and Directions</h2> |
|- | |- | ||
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | | style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | ||
Line 609: | Line 558: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' ... In this tutorial, we aim to discuss a number of problems related to assisted- and self-driving vehicles, potential solutions and directions for research & development. The issues discussed in this tutorial will span multitarget tracking, multisensor fusion and situational awareness within the context of smart cars. We will also present some of the algorithms that are available in the open literature as well as those we have developed recently. In addition, we will also discuss related computational issues and sensor technologies. Finally, we will present some results on real data.<br /> | '''Brief description:''' ... In this tutorial, we aim to discuss a number of problems related to assisted- and self-driving vehicles, potential solutions and directions for research & development. The issues discussed in this tutorial will span multitarget tracking, multisensor fusion and situational awareness within the context of smart cars. We will also present some of the algorithms that are available in the open literature as well as those we have developed recently. In addition, we will also discuss related computational issues and sensor technologies. Finally, we will present some results on real data.<br /> | ||
− | [[T16| More Details]] | + | [[T16| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate16.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 617: | Line 566: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial17"></div> | <div id="tutorial17"></div> | ||
<!-- T17 Emerging Quantum Technologies for Fusion --> | <!-- T17 Emerging Quantum Technologies for Fusion --> | ||
Line 629: | Line 578: | ||
'''Length:''' 3 hours<br /> | '''Length:''' 3 hours<br /> | ||
'''Brief description:''' ... Although the fundamentals of quantum physics have been well-known since the 1920s, in the last few decades several novel consequences of the laws of quantum physics (particularly in the areas of atomic, molecular and optical physics and quantum computer science and information theory) have been discovered. ... In particular, in the area of sensing, quantum physics sets the bounds on the sensitivity of sensing... that is orders of magnitude below the sensitivity of current sensors. In the area of computing, it has been observed that a quantum computer allows some computations to be carried out that are unfeasible using current or future classical computing technology. In the area of communication, quantum physics enables provably secure communication at much higher data rates than those allowed by the classical Shannon limit. Many of these advances could have major near-term and long-term consequences in the areas of sensing, secure communication, big data analysis, and machine learning, and hence sensor and information fusion.<br /> | '''Brief description:''' ... Although the fundamentals of quantum physics have been well-known since the 1920s, in the last few decades several novel consequences of the laws of quantum physics (particularly in the areas of atomic, molecular and optical physics and quantum computer science and information theory) have been discovered. ... In particular, in the area of sensing, quantum physics sets the bounds on the sensitivity of sensing... that is orders of magnitude below the sensitivity of current sensors. In the area of computing, it has been observed that a quantum computer allows some computations to be carried out that are unfeasible using current or future classical computing technology. In the area of communication, quantum physics enables provably secure communication at much higher data rates than those allowed by the classical Shannon limit. Many of these advances could have major near-term and long-term consequences in the areas of sensing, secure communication, big data analysis, and machine learning, and hence sensor and information fusion.<br /> | ||
− | [[T17| More Details]] | + | [[T17| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate17.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 637: | Line 586: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial18"></div> | <div id="tutorial18"></div> | ||
<!-- T18 Maneuvering Target Tracking: Overview and Nonlinear Filtering Methods --> | <!-- T18 Maneuvering Target Tracking: Overview and Nonlinear Filtering Methods --> | ||
Line 648: | Line 597: | ||
'''Length:''' 3+3 hours<br /> | '''Length:''' 3+3 hours<br /> | ||
'''Brief description:''' The principal challenges for tracking a maneuverable target are nonlinearity in both the target motion and measurement models as well as the uncertainty in the pattern of target motion. This tutorial presents theoretical and algorithmic means available to meet these challenges. The overview part elucidates a well-organized panorama of maneuvering target tracking. The other part presents in-depth coverage of recent advances in nonlinear filtering for maneuvering target tracking, including some of the instructors’ results and insights as well as better-known methods. The tutorial highlights the underlying ideas and the pros and cons of approaches and techniques, as well as the inter-relationships among them. It is an outgrowth of the instructors’ ongoing comprehensive survey and several short courses on the same subject, as well as a graduate course on target tracking taught at the Electrical Engineering Department of the University of New Orleans.<br /> | '''Brief description:''' The principal challenges for tracking a maneuverable target are nonlinearity in both the target motion and measurement models as well as the uncertainty in the pattern of target motion. This tutorial presents theoretical and algorithmic means available to meet these challenges. The overview part elucidates a well-organized panorama of maneuvering target tracking. The other part presents in-depth coverage of recent advances in nonlinear filtering for maneuvering target tracking, including some of the instructors’ results and insights as well as better-known methods. The tutorial highlights the underlying ideas and the pros and cons of approaches and techniques, as well as the inter-relationships among them. It is an outgrowth of the instructors’ ongoing comprehensive survey and several short courses on the same subject, as well as a graduate course on target tracking taught at the Electrical Engineering Department of the University of New Orleans.<br /> | ||
− | [[T18| More Details]] | + | [[T18| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate18.pdf Certificate (PDF)] |
</div> | </div> | ||
|- | |- | ||
Line 656: | Line 605: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial19"></div> | <div id="tutorial19"></div> | ||
<!-- T19 Integration of Information to Identify Objects in Big Data --> | <!-- T19 Integration of Information to Identify Objects in Big Data --> | ||
| class="MainPageBG" style="width:100%; border:1px solid #f2ea7e; background:#ffffe8; vertical-align:top; color:#000;" | | | class="MainPageBG" style="width:100%; border:1px solid #f2ea7e; background:#ffffe8; vertical-align:top; color:#000;" | | ||
{| id="mp-left" style="width:100%; vertical-align:top; background:#ffffe8;" | {| id="mp-left" style="width:100%; vertical-align:top; background:#ffffe8;" | ||
− | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#fff7bd; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f2ea7e; text-align:left; color:#000; padding:0.2em 0.4em;">T19 Integration of Information to Identify Objects in Big Data</h2> | + | | style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#fff7bd; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f2ea7e; text-align:left; color:#000; padding:0.2em 0.4em;"><span style="background:#cdcdcd"><s>T19 Integration of Information to Identify Objects in Big Data</s></span> - Withdrawn by presenter</h2> |
|- | |- | ||
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | | style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px"> | ||
Line 675: | Line 624: | ||
<!-- FUSION 2016 Tutorials --> | <!-- FUSION 2016 Tutorials --> | ||
− | {| id="mp-upper" style="width: | + | {| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;" |
<div id="tutorial20"></div> | <div id="tutorial20"></div> | ||
<!-- T20 Extended Object Tracking: Theory and Applications --> | <!-- T20 Extended Object Tracking: Theory and Applications --> | ||
Line 683: | Line 632: | ||
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
'''Presenter:''' [mailto:karl.granstrom@chalmers.se Karl Granström], [mailto:stephan.reuter@uni-ulm.de Stephan Reuter], and [mailto:marcus.baum@cs.uni-goettingen.de Marcus Baum]<br />
'''Length:''' 3 hours<br />
'''Brief description:''' Autonomous driver safety functions are standard in many modern cars, and semi-automated systems (e.g., traffic jam assist) are becoming more and more common. Construction of a driverless vehicle requires solutions to many different problems, among them multiple object tracking. This tutorial will introduce the audience to extended object tracking, i.e., object tracking using modern high-resolution sensors that give multiple detections per object. State-of-the-art theory will be introduced, and relevant real-world applications will be shown in which different object types (e.g., pedestrians, bicyclists, cars) are tracked using different sensors such as lidar, radar, and camera.<br />
[[T20| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate20.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial21"></div>
<!-- T21 Probabilistic Situation Assessment for Abnormal Interaction Detection -->
| class="MainPageBG" style="width:100%; border:1px solid #f5946e; background:#fae6de; vertical-align:top; color:#000;" |
{| id="mp-left" style="width:100%; vertical-align:top; background:#fae6de;"
| style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#f9d6c9; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f5946e; text-align:left; color:#000; padding:0.2em 0.4em;">T21 Probabilistic Situation Assessment for Abnormal Interaction Detection</h2>
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
'''Length:''' 3 hours<br />
'''Brief description:''' The tutorial aims to provide an overview of new insights in extending Dynamic Bayesian Network techniques for representing, modeling, and automatically interpreting and managing complex interaction situations that occur in cognitive environments, starting from observations provided by multidimensional signals collected through a distributed network of embedded systems. A uniform representation is discussed that can also be used to support decisions concerning interactions between operators and the status of the observed environment. Solutions based on an extension of traditional Bayesian filters for object assessment form the background from which the techniques in this tutorial are developed. ...<br />
[[T21| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate21.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial22"></div>
<!-- T22 Multitarget Tracking and Sensor Calibration in Centralized and Distributed Networks -->
{| id="mp-left" style="width:100%; vertical-align:top; background:#e7f7e7;"
| style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#ceecf2; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #a3babf; text-align:left; color:#000; padding:0.2em 0.4em;"><span style="background:#cdcdcd"><s>T22 Multitarget Tracking and Sensor Calibration in Centralized and Distributed Networks</s></span> - Withdrawn</h2>
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial23"></div>
<!-- T23 Information Fusion in Resource-Limited Camera Networks -->
'''Length:''' 3 hours<br />
'''Brief description:''' … This tutorial will introduce key features of modern visual sensor networks while exploring the issues commonly found in such networks, which have recently become central in several applications. For smart-camera networks to enable these emerging applications, they need to adapt to unforeseen conditions and varying tasks under constrained resources. The tutorial will offer theoretical explanations followed by examples using the WiseMNet++ simulator.<br />
[[T23| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate23.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial24"></div>
<!-- T24 Introduction to Bayesian Filtering and Smoothing -->
'''Length:''' 3 hours<br />
'''Brief description:''' ... The tutorial introduces the current state of the art in non-linear (single-target) optimal filtering and smoothing methods in a unified Bayesian framework. The attendees learn what non-linear Kalman filters and particle filters are, how they are related, and their relative advantages and disadvantages. They also discover how Bayesian parameter estimation methods can be combined with the filtering and smoothing algorithms. ... Example applications are drawn from navigation, remote surveillance, and time series analysis.<br />
[[T24| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate24.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial25"></div>
<!-- T25 Sensor Fusion for Intelligent Vehicles -->
'''Length:''' 3 hours<br />
'''Brief description:''' This tutorial focuses on the stringent requirements, foundations, development and testing of sensor fusion algorithms for advanced driver assistance functions and driverless applications in automotive vehicle systems. ... The tutorial also covers challenging and important practical aspects of fusion and target tracking in the automotive setting, such as fusion with incomplete information and data association. Fusion and management of the different extended target representations of heterogeneous nature, obtained from sensors with different resolutions, is presented with examples. Several intelligent vehicular sensor fusion frameworks, dealing both with tracked objects (track-level fusion) and with raw sensor measurements (measurement-level fusion), are presented, with results obtained using several real-world data sets containing various static and dynamic targets.<br />
[[T25| More Details]] [http://fusion2016.org/download/certificates/TutorialCertificate25.pdf Certificate (PDF)]
</div>
|-
<!-- FUSION 2016 Tutorials -->
{| id="mp-upper" style="width: 100%; margin:4px 0 0 0; background:none; border-spacing: 0px;"
<div id="tutorial26"></div>
<!-- T26 Multisensor Data Fusion in Wireless Sensor and Actuator Networks -->
| class="MainPageBG" style="width:100%; border:1px solid #f5946e; background:#fae6de; vertical-align:top; color:#000;" |
{| id="mp-left" style="width:100%; vertical-align:top; background:#fae6de;"
| style="padding:2px;" | <h2 id="mp-tfa-h2" style="margin:3px; background:#f9d6c9; font-family:inherit; font-size:120%; font-weight:bold; border:1px solid #f5946e; text-align:left; color:#000; padding:0.2em 0.4em;"><span style="background:#cdcdcd"><s>T26 Multisensor Data Fusion in Wireless Sensor and Actuator Networks</s></span> - Withdrawn</h2>
|-
| style="color:#000;" | <div id="mp-tfa" style="padding:2px 5px">
| style="border:1px solid transparent;" |<br />
|-
{{Organisation}}
__NOTOC____NOEDITSECTION__
Latest revision as of 12:47, 21 July 2016