Articles | Volume 13
https://doi.org/10.5194/ars-13-209-2015
03 Nov 2015

Multi-view point cloud fusion for LiDAR based cooperative environment detection

B. Jaehn, P. Lindner, and G. Wanielik

Viewed

Total article views: 1,624 (including HTML, PDF, and XML)
  • HTML: 758
  • PDF: 720
  • XML: 146
  • Total: 1,624
  • BibTeX: 114
  • EndNote: 119
Cumulative views and downloads (calculated since 03 Nov 2015)
Latest update: 11 Dec 2024
Short summary
In the future, autonomous robots will share environment information captured by range sensors such as LiDAR or ToF cameras. This paper shows that two-dimensional position and heading information, e.g. obtained by GPS tracking methods, is sufficient to initialize a 3D registration method using range images taken from the different perspectives of different platforms (e.g. car and infrastructure). The platforms are thus able to explore their surroundings cooperatively.
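The core idea of the summary can be sketched in a few lines: a 2D pose (x, y, heading) from GPS tracking is lifted to a 4x4 homogeneous transform that coarsely aligns one platform's point cloud into the other's frame, serving as the initial guess for a 3D registration method such as ICP. This is a minimal illustration, not the paper's implementation; the function names and the assumption that z, roll, and pitch are zero are ours.

```python
import numpy as np

def initial_guess_from_pose(x, y, heading):
    """Build a 4x4 homogeneous transform from a 2D pose.

    (x, y) is the planar position and `heading` the yaw angle in
    radians; z, roll, and pitch are assumed zero, reflecting the idea
    of seeding 3D registration from 2D position and heading only.
    """
    c, s = np.cos(heading), np.sin(heading)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]  # planar rotation about the z axis
    T[:2, 3] = [x, y]              # planar translation
    return T

def transform_points(T, points):
    """Apply a 4x4 transform to an (N, 3) point cloud."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (T @ homo.T).T[:, :3]

# Example: a cloud seen by a second platform that is 5 m away and
# rotated 90 degrees relative to the first platform's frame.
cloud = np.array([[1.0, 0.0, 0.5],
                  [0.0, 2.0, 0.5]])
T0 = initial_guess_from_pose(5.0, 0.0, np.pi / 2)
aligned = transform_points(T0, cloud)  # coarse alignment; refine with ICP
```

After this coarse alignment, a fine 3D registration step (e.g. ICP on the fused range data) would correct the remaining GPS and heading error.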