One-Shot Learning of Scene Locations via Feature Trajectory Transfer

Abstract

The appearance of (outdoor) scenes changes considerably with the strength of certain transient attributes, such as "rainy", "dark", or "sunny". Obviously, this also affects the representation of an image in feature space, e.g., as activations at a certain CNN layer, and consequently impacts scene recognition performance. In this work, we investigate the variability in these transient attributes as a rich source of information for studying how image representations change as a function of attribute strength. In particular, we leverage a recently introduced dataset with fine-grained annotations to estimate feature trajectories for a collection of transient attributes and then show how these trajectories can be transferred to new image representations. This enables us to synthesize new data along the transferred trajectories with respect to the dimensions of the space spanned by the transient attributes. Applicability of this concept is demonstrated on the problem of one-shot recognition of scene locations. We show that data synthesized via feature trajectory transfer considerably boosts recognition performance, (1) with respect to baselines and (2) in combination with state-of-the-art approaches in one-shot learning.
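
To make the idea concrete, here is a minimal sketch of feature trajectory transfer for one-shot augmentation. It assumes, purely for illustration, that a trajectory is modeled as a per-attribute linear direction in CNN feature space fitted by least-squares regression on the attribute-annotated source images; the function names, dimensionalities, and the linear model are hypothetical and not taken from the paper.

```python
# Hypothetical sketch: fit a linear feature trajectory for one transient
# attribute on annotated source data, then transfer it to a single example
# of a new scene location to synthesize additional training features.
import numpy as np

def fit_attribute_trajectory(features, strengths):
    """Fit a direction d such that features ~ mean + strength * d.

    features:  (n, D) CNN activations of attribute-annotated source images
    strengths: (n,)   transient-attribute strengths in [0, 1]
    Returns the per-unit-strength direction of feature change.
    """
    X = strengths - strengths.mean()
    F = features - features.mean(axis=0)
    # Least-squares slope of every feature dimension w.r.t. attribute strength.
    return (X @ F) / (X @ X)

def synthesize_along_trajectory(one_shot_feature, direction, base_strength, targets):
    """Transfer the trajectory to one example and synthesize features
    at the requested attribute strengths."""
    return np.stack([one_shot_feature + (t - base_strength) * direction
                     for t in targets])

# Toy usage with random data standing in for real CNN activations.
rng = np.random.default_rng(0)
src_feats = rng.normal(size=(200, 4096))      # annotated source features
src_strengths = rng.uniform(size=200)         # e.g. strength of "sunny"
d_sunny = fit_attribute_trajectory(src_feats, src_strengths)

one_shot = rng.normal(size=4096)              # single image of a new location
augmented = synthesize_along_trajectory(one_shot, d_sunny,
                                        base_strength=0.5,
                                        targets=[0.0, 0.25, 0.75, 1.0])
print(augmented.shape)                        # (4, 4096) synthesized features
```

The synthesized features can then be pooled with the single real example to train a classifier for the new scene location, which is the one-shot setting the abstract refers to.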

Publication
2016 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2016, Las Vegas, NV, USA, June 27-30, 2016
Marc Niethammer
Professor of Computer Science

My research interests include image registration, image segmentation, shape analysis, machine learning, and biomedical applications.
