FreeAugment: Data Augmentation Search Across All Degrees of Freedom

Technion - Israel Institute of Technology
ECCV 2024

Abstract

Data augmentation has become an integral part of deep learning, as it is known to improve the generalization capabilities of neural networks. Since the most effective set of image transformations differs between tasks and domains, automatic data augmentation search aims to alleviate the extreme burden of manually finding the optimal image transformations. However, current methods are not able to jointly optimize all degrees of freedom: (1) the number of transformations to apply, (2) their types, (3) their order, and (4) their magnitudes. Many existing methods risk picking the same transformation more than once, limit the search to two transformations only, or search for the number of transformations exhaustively or iteratively in a myopic manner. Our approach, FreeAugment, is the first to achieve global optimization of all four degrees of freedom simultaneously, using a fully differentiable method. It efficiently learns the number of transformations and a probability distribution over their permutations, inherently refraining from redundant repetition while sampling. Our experiments demonstrate that this joint learning of all degrees of freedom significantly improves performance, achieving state-of-the-art results on various natural image benchmarks and beyond across other domains.
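To make the four degrees of freedom concrete, here is a minimal sketch (not the paper's differentiable implementation) of sampling an augmentation policy that covers all four: the number of transformations, their types, their order, and their magnitudes. The transform names, the placeholder parameters, and the use of sampling without replacement to avoid repetition are illustrative assumptions; the actual method learns these distributions end-to-end.

```python
import numpy as np

# Illustrative sketch only -- the learned parameters below are placeholders,
# not outputs of the FreeAugment search.
rng = np.random.default_rng(0)

TRANSFORMS = ["rotate", "shear_x", "translate_y", "color", "contrast"]

depth_probs = np.array([0.2, 0.5, 0.3])         # P(number of ops = 1, 2, 3)
type_logits = rng.normal(size=len(TRANSFORMS))  # preference per transform type
magnitudes = rng.uniform(0.1, 0.9, size=len(TRANSFORMS))  # per-type magnitude


def sample_policy():
    # (1) number of transformations
    n = rng.choice([1, 2, 3], p=depth_probs)
    # (2) types and (3) order: draw an ordered sample WITHOUT replacement,
    # i.e. a partial permutation, so no transformation is picked twice
    probs = np.exp(type_logits) / np.exp(type_logits).sum()
    idx = rng.choice(len(TRANSFORMS), size=n, replace=False, p=probs)
    # (4) a magnitude attached to each sampled type
    return [(TRANSFORMS[i], float(magnitudes[i])) for i in idx]


policy = sample_policy()
```

Sampling without replacement is what guarantees that a policy never applies the same transformation twice, mirroring the "refraining from redundant repetition" property described in the abstract.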

BibTeX


@misc{bekor2024freeaugmentdataaugmentationsearch,
  title={FreeAugment: Data Augmentation Search Across All Degrees of Freedom},
  author={Tom Bekor and Niv Nayman and Lihi Zelnik-Manor},
  year={2024},
  eprint={2409.04820},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2409.04820}
}