Related to the Smart Asset re-Use in Creative Environments (SAUCE) project
Published in 2020 IEEE International Conference on Image Processing (ICIP), 2020
Light fields capture light rays arriving from a scene at different angles, effectively creating multiple perspective views of the same scene. One of the flagship applications of light fields is therefore estimating the captured scene geometry, which can notably be achieved by establishing correspondences between the perspective views, usually in the form of a disparity map. Such correspondence estimation has been a long-standing research topic in computer vision, with applications to stereo vision and optical flow. Research in this area has shown the importance of well-designed descriptors for fast and accurate matching. In this paper, we propose a binary descriptor that exploits the light field gradient over both the spatial and the angular dimensions to improve inter-view matching. We demonstrate in a disparity estimation application that it achieves accuracy comparable to existing descriptors while being faster to compute.
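To make the general idea concrete, the following is a minimal, hypothetical sketch of a BRIEF-style binary descriptor built from spatial and angular gradients of a 4D light field. It is not the paper's exact method: the sampling pattern, gradient channels, and bit count here are illustrative assumptions, but it shows why such descriptors are cheap to compute and match (sign comparisons and Hamming distances instead of floating-point costs).

```python
import numpy as np

def binary_gradient_descriptor(lf, u, v, y, x, n_bits=32, radius=3, seed=0):
    """Illustrative binary descriptor at pixel (y, x) of view (u, v).

    lf is a grayscale 4D light field indexed as lf[u, v, y, x].
    This is a sketch of the general technique (gradient sign tests),
    not the descriptor proposed in the paper.
    """
    rng = np.random.default_rng(seed)
    # Gradients along the angular (u, v) and spatial (y, x) axes.
    gu, gv, gy, gx = np.gradient(lf.astype(np.float64))
    grads = np.stack([gu, gv, gy, gx], axis=0)

    bits = np.zeros(n_bits, dtype=np.uint8)
    for i in range(n_bits):
        c = rng.integers(0, 4)  # pick one of the 4 gradient channels
        dy1, dx1, dy2, dx2 = rng.integers(-radius, radius + 1, size=4)
        a = grads[c, u, v, y + dy1, x + dx1]
        b = grads[c, u, v, y + dy2, x + dx2]
        bits[i] = a < b         # one sign comparison -> one bit
    return np.packbits(bits)    # compact byte representation

def hamming(d1, d2):
    # Matching cost between two binary descriptors: number of differing bits.
    return int(np.unpackbits(np.bitwise_xor(d1, d2)).sum())
```

In a disparity estimation setting, one would compute such descriptors in two views and match a pixel to the candidate along its epipolar shift with the smallest Hamming distance; the binary representation makes each comparison a few XOR and popcount operations.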