Self-distillation and uncertainty boosting self-supervised monocular depth estimation

Zhou, Hang, Greenwood, David, Taylor, Sarah and Mackiewicz, Michal ORCID: https://orcid.org/0000-0002-8777-8880 (2022) Self-distillation and uncertainty boosting self-supervised monocular depth estimation. In: The 33rd British Machine Vision Conference Proceedings. UNSPECIFIED.

PDF (bmvc2022zhou) - Published Version
Available under License Other licence.


Abstract

For self-supervised monocular depth estimation (SDE), recent works have introduced additional learning objectives, for example semantic segmentation, into the training pipeline and have demonstrated improved performance. However, such multi-task learning frameworks require extra ground-truth labels, neutralising the biggest advantage of self-supervision. In this paper, we propose SUB-Depth to overcome these limitations. Our main contribution is an auxiliary self-distillation scheme incorporated into the standard SDE framework, which captures the benefits of multi-task learning without any labelling cost. Then, instead of using a simple weighted sum of the multiple objectives, we employ generative task-dependent uncertainty to weight each task in the proposed training framework. We present extensive evaluations on KITTI to demonstrate the improvements achieved by training a range of existing networks with the proposed framework, achieving state-of-the-art performance on this task.
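The abstract's task-dependent uncertainty weighting follows the general pattern of learned homoscedastic uncertainty for multi-task losses: each task loss is scaled by the inverse of a learned variance, with a regularisation term that prevents the variance from growing unboundedly. A minimal sketch of that pattern (not the paper's exact formulation; the function and parameter names here are illustrative):

```python
import math

def uncertainty_weighted_loss(task_losses, log_vars):
    """Combine per-task losses using task-dependent uncertainty.

    Each loss L_i is scaled by exp(-s_i), where s_i = log(sigma_i^2) is a
    learnable log-variance for task i; the additive s_i term penalises
    assigning very high uncertainty (i.e. ignoring) a task.
    """
    total = 0.0
    for loss, s in zip(task_losses, log_vars):
        total += math.exp(-s) * loss + s
    return total

# Illustrative values for three objectives (e.g. photometric,
# self-distillation, smoothness); with all s_i = 0 (sigma^2 = 1),
# the result reduces to a plain unweighted sum.
losses = [0.8, 0.5, 0.1]
log_vars = [0.0, 0.0, 0.0]
print(uncertainty_weighted_loss(losses, log_vars))
```

In a training pipeline, the `log_vars` would be learnable parameters optimised jointly with the network weights, so the relative weighting of the objectives adapts during training rather than being hand-tuned.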

Item Type: Book Section
Additional Information: © 2022. The copyright of this document resides with its authors. It may be distributed unchanged freely in print or electronic form.
Faculty \ School: Faculty of Science > School of Computing Sciences
UEA Research Groups: Faculty of Science > Research Groups > Colour and Imaging Lab
Related URLs:
Depositing User: LivePure Connector
Date Deposited: 10 May 2023 14:30
Last Modified: 17 Sep 2023 06:32
URI: https://ueaeprints.uea.ac.uk/id/eprint/92014
DOI:
