Tutorial 5 intro: Marine object detection using MARUS generated dataset
Simulators play a key role in the development of mobile robots. Simulating vehicle models and their environment without depending on actual hardware has proven beneficial for reducing cost and development time while improving safety during testing. One motivation for developing MARUS (https://github.com/MARUSimulator) was to have a simulator offering advanced capabilities for generating realistic environments, allowing for closer-to-reality validation and verification (V&V) of applications developed for maritime vehicles. The simulator offers synthetic dataset generation with perfect annotations for various sensors (cameras, lidar, sonar) and allows interaction with the environment for closed-loop simulation. In this tutorial, we will show you how to train and validate a deep neural network for object detection in a marine environment. You will learn how to generate an annotated dataset consisting of images from a single camera or multiple cameras in MARUS. The generated dataset will then be used to train a deep neural network to detect and classify chosen objects.
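To give a flavor of the preprocessing involved before training, the sketch below converts pixel-coordinate bounding-box annotations (xmin, ymin, xmax, ymax) into the normalized center-based label format used by YOLO-style detectors. The annotation layout and the "buoy" class shown here are illustrative assumptions, not the exact MARUS export format.

```python
def to_yolo(box, img_w, img_h):
    """Convert a (xmin, ymin, xmax, ymax) pixel box to normalized
    YOLO format: (x_center, y_center, width, height) in [0, 1]."""
    xmin, ymin, xmax, ymax = box
    x_c = (xmin + xmax) / 2.0 / img_w
    y_c = (ymin + ymax) / 2.0 / img_h
    w = (xmax - xmin) / img_w
    h = (ymax - ymin) / img_h
    return x_c, y_c, w, h

# Example: a hypothetical "buoy" (class 0) annotation in a 640x480 image.
annotations = [(0, (100, 200, 300, 400))]  # (class_id, pixel box)

# One label line per object, as written to a YOLO .txt label file.
for cls, box in annotations:
    x_c, y_c, w, h = to_yolo(box, 640, 480)
    print(f"{cls} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}")
```

The same conversion applies per camera when the dataset is recorded from multiple viewpoints, since each image carries its own annotation file.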
Tutorials consist of a 30-minute introductory presentation and a 60-minute hands-on part. The tutorial hands-on sessions and demos are divided into three groups, which rotate together every hour:
15:30 – 16:30 | Group 1: T5 hands-on (UNIZG FER) | Group 2: T6 hands-on (University of Haifa) | Group 3: Demo (MDM Team)
16:30 – 17:30 | Group 2: T6 hands-on (University of Haifa) | Group 3: Demo (MDM Team) | Group 1: T5 hands-on (UNIZG FER)
17:30 – 18:30 | Group 3: Demo (MDM Team) | Group 1: T5 hands-on (UNIZG FER) | Group 2: T6 hands-on (University of Haifa)