Polarimetric Imaging for Perception

Michael Baltaxe Tomer Pe'er Dan Levi

Abstract

Autonomous driving and advanced driver-assistance systems rely on a set of sensors and algorithms to perform the appropriate actions and provide alerts as a function of the driving scene. Typically, the sensors include color cameras, radar, lidar and ultrasonic sensors. Strikingly, however, although light polarization is a fundamental property of light, it is seldom harnessed for perception tasks. In this work we analyze the potential for improvement in perception tasks when using an RGB-polarimetric camera, as compared to an RGB camera. We examine monocular depth estimation and free space detection during the middle of the day, when polarization is independent of subject heading, and show that a quantifiable improvement can be achieved for both tasks using state-of-the-art deep neural networks, with a minimum of architectural changes. We also present a new dataset composed of RGB-polarimetric images, lidar scans, GNSS/IMU readings and free space segmentations that further supports developing perception algorithms that take advantage of light polarization.
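For readers who want a concrete starting point, the sketch below shows one common way to derive polarization channels (degree and angle of linear polarization) from the four polarizer-angle intensities of a division-of-focal-plane sensor, and how a pretrained RGB backbone's first convolution could be widened to accept the extra channels with minimal architectural change. The 0/45/90/135-degree layout, the helper names, and the zero-initialized extra weights are illustrative assumptions, not the exact pipeline used in the paper.

import numpy as np
import torch
import torch.nn as nn

def polarization_channels(i0, i45, i90, i135, eps=1e-6):
    """Compute degree (DoLP) and angle (AoLP) of linear polarization
    from intensities measured behind 0/45/90/135-degree polarizers."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity (Stokes S0)
    s1 = i0 - i90                        # Stokes S1
    s2 = i45 - i135                      # Stokes S2
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)
    aolp = 0.5 * np.arctan2(s2, s1)      # radians, in [-pi/2, pi/2]
    return dolp, aolp

def expand_first_conv(conv: nn.Conv2d, extra_in: int) -> nn.Conv2d:
    """Return a copy of `conv` that accepts `extra_in` additional input
    channels; new weights start at zero so the pretrained RGB behavior
    is preserved at initialization."""
    new = nn.Conv2d(conv.in_channels + extra_in, conv.out_channels,
                    kernel_size=conv.kernel_size, stride=conv.stride,
                    padding=conv.padding, bias=conv.bias is not None)
    with torch.no_grad():
        new.weight.zero_()
        new.weight[:, :conv.in_channels] = conv.weight
        if conv.bias is not None:
            new.bias.copy_(conv.bias)
    return new

In this scheme, the DoLP and AoLP maps would simply be stacked with the RGB image as additional input channels before being fed to an otherwise unchanged depth-estimation or free-space network.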


Dataset

The dataset can be downloaded here (file size 34 GB).

Paper

Paper available for download here.
Supplementary material available here.


Results

Main results for free space and depth estimation using polarimetric imaging:

[Figure: free space detection results]

[Figure: depth estimation results]


Citation

@inproceedings{Baltaxe_2023_BMVC,
author = {Michael Baltaxe and Tomer Pe'er and Dan Levi},
title = {Polarimetric Imaging for Perception},
booktitle = {34th British Machine Vision Conference 2023, {BMVC} 2023, Aberdeen, UK, November 20-24, 2023},
publisher = {BMVA},
year = {2023},
url = {https://papers.bmvc2023.org/0566.pdf}
}