Change of Scenery: Unsupervised LiDAR Change Detection for Mobile Robots

Published on May 28, 2024
ABSTRACT

This paper presents a fully unsupervised deep change detection approach for mobile robots with 3D LiDAR. In unstructured environments, it is infeasible to define a closed set of semantic classes. Instead, semantic segmentation is reformulated as binary change detection. We develop a neural network, RangeNetCD, that uses an existing point-cloud map and a live LiDAR scan to detect scene changes with respect to the map. Using a novel loss function, existing point-cloud semantic segmentation networks can be trained to perform change detection without any labels or assumptions about local semantics. The mean intersection over union (mIoU) score is used for quantitative comparison. RangeNetCD outperforms the baseline by 3.8% to 7.7% depending on the amount of environmental structure. The neural network operates at 67.1 Hz and is integrated into a robot’s autonomy stack to allow safe navigation around obstacles that intersect the planned path. In addition, a novel method for the rapid automated acquisition of per-point ground-truth labels is described. Covering changed parts of the scene with retroreflective materials and applying a threshold filter to the intensity channel of the LiDAR allows for quantitative evaluation of the change detector.
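The automated labeling idea at the end of the abstract can be sketched concretely: retroreflective surfaces return near-saturated intensity, so thresholding the LiDAR intensity channel yields per-point binary ground truth, against which a detector's predictions can be scored with IoU for the "changed" class. The threshold value and array shapes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Assumed threshold: retroreflective tape returns near-saturated intensity.
# The actual cutoff depends on the sensor and its intensity calibration.
RETRO_INTENSITY_THRESHOLD = 0.9

def auto_label_changes(intensity: np.ndarray,
                       threshold: float = RETRO_INTENSITY_THRESHOLD) -> np.ndarray:
    """Per-point ground truth: True where the return is retroreflective,
    i.e. the point belongs to a changed part of the scene."""
    return intensity >= threshold

def binary_iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """Intersection over union for the 'changed' class."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return float(inter) / float(union) if union > 0 else 1.0

# Toy scan of six points, two of them covered in retroreflective material.
intensity = np.array([0.1, 0.95, 0.3, 0.97, 0.2, 0.4])
gt = auto_label_changes(intensity)  # [F, T, F, T, F, F]
pred = np.array([False, True, False, False, False, True])
print(binary_iou(pred, gt))  # 1/3: one true positive, one miss, one false alarm
```

Averaging this IoU with the corresponding IoU for the "unchanged" class gives the mIoU metric used in the paper's quantitative comparison.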

Month: May

Year: 2024

Venue: 21st Conference on Robots and Vision

URL: https://crv.pubpub.org/pub/oiakz0q7

