Tutorial 1 intro: Getting started with Reeds, the world’s largest dataset for perception algorithms
Reeds is a new dataset for research and development of robot perception algorithms. Its design goal is to provide the most demanding dataset for perception algorithm benchmarking, both in terms of the vehicle motions involved and the amount of high-quality data. The logging platform is an instrumented boat carrying six high-performance vision sensors, three high-fidelity lidars, a 360° radar, a 360° documentation camera system, as well as a three-antenna GNSS system and a fibre-optic gyro IMU used for ground-truth measurements. All sensors are calibrated into a single vehicle frame. The tutorial introduces the dataset and gives participants hands-on experience in replaying data into self-developed algorithms, with examples available in both Python and C++. The take-away is that participants can easily get started using the data in their own research and development, while gaining insight into the capabilities of the latest sensors. Read more about the dataset here (currently being updated): https://reeds.opendata.chalmers.se
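The replay workflow described above — stepping through logged sensor frames in timestamp order and feeding each one to a user-supplied perception callback — can be sketched as follows. This is a minimal illustration only: the directory layout, `.bin` file naming, and raw-bytes frame format are assumptions for the sketch, not the actual Reeds tooling, which provides its own Python and C++ replay interfaces.

```python
# Minimal replay sketch. Assumption: one file per logged frame, with
# filenames that sort in timestamp order. This is NOT the Reeds API.
from pathlib import Path
import tempfile

def replay(log_dir, handler):
    """Feed each recorded frame, in filename (timestamp) order,
    to a user-supplied perception callback."""
    for frame_path in sorted(Path(log_dir).glob("*.bin")):
        handler(frame_path.read_bytes())

# Usage sketch: a trivial handler that just collects the frames.
received = []
with tempfile.TemporaryDirectory() as d:
    # Write three dummy frames with zero-padded, sortable names.
    for i in range(3):
        (Path(d) / f"{i:06d}.bin").write_bytes(bytes([i]))
    replay(d, received.append)
print(len(received))  # 3
```

The point of the callback design is that your own algorithm only has to expose a single per-frame entry point; the replay loop, whatever its real implementation, drives it with logged data exactly as live sensors would.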
Each tutorial consists of a 30-minute introductory presentation and a 60-minute hands-on part. The hands-on sessions and demos are divided into three groups, which rotate together every hour:
15:30 – 16:30 | Group 1: T1 hands-on (REEDS) | Group 2: T2 hands-on (INESC-TEC) | Group 3: Demo (H2O)
16:30 – 17:30 | Group 2: T2 hands-on (INESC-TEC) | Group 3: Demo (H2O) | Group 1: T1 hands-on (REEDS)
17:30 – 18:30 | Group 3: Demo (KTH) | Group 1: T1 hands-on (REEDS) | Group 2: T2 hands-on (INESC-TEC)