Talk / Overview

For many real-world applications, obtaining stable and robust statistical performance is more important than simply achieving state-of-the-art predictive test accuracy, and thus robustness of neural networks is an increasingly important topic. In this talk, we present NoisyMix, an inexpensive yet effective training scheme that judiciously combines data augmentations with stability training and noise injections to improve both model robustness and in-domain accuracy. We demonstrate the benefits of NoisyMix on a range of benchmark datasets, including ImageNet-C, ImageNet-R, and ImageNet-P. In particular, NoisyMix currently holds the top spot on the leaderboards of RobustBench, a standardised benchmark for model robustness. Moreover, we provide theory to understand the implicit regularisation effects and robustness of NoisyMix.
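To make the three ingredients concrete, here is a minimal, illustrative sketch of a NoisyMix-style training step in NumPy: a mixup-style convex combination of inputs and labels, additive noise injection, and a symmetric-KL stability term that penalises disagreement between predictions on clean and augmented views. Function names, the Beta mixing distribution, and the Gaussian noise model are assumptions for illustration, not the exact procedure from the paper or slides.

```python
import numpy as np

def noisymix_batch(x, y, num_classes, alpha=1.0, noise_std=0.1, seed=None):
    """One NoisyMix-style augmentation step (illustrative sketch, not the
    paper's exact recipe): mixup of inputs/labels plus Gaussian noise."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    lam = rng.beta(alpha, alpha)          # mixing coefficient ~ Beta(alpha, alpha)
    perm = rng.permutation(n)             # pair each example with a random partner
    x_mix = lam * x + (1 - lam) * x[perm]               # mixup of inputs
    x_mix = x_mix + rng.normal(0.0, noise_std, x.shape)  # noise injection
    y_onehot = np.eye(num_classes)[y]
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[perm]  # matching soft labels
    return x_mix, y_mix

def stability_loss(p_clean, p_aug, eps=1e-12):
    """Symmetric KL divergence between the model's predicted distributions on
    clean and augmented views: the stability-training term that rewards
    consistent outputs under perturbation."""
    kl = lambda p, q: np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)
    return float(np.mean(kl(p_clean, p_aug) + kl(p_aug, p_clean)))
```

In a real training loop, the total objective would combine the usual cross-entropy on the mixed soft labels with this stability term weighted by a hyperparameter; the noise injection acts as an implicit regulariser, which is the aspect the talk's theory addresses.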

Talk / Speakers

Soon Hoe Lim

WINQ Fellow and incoming Assistant Professor, Nordita

Talk / Slides

Download the slides for this talk (PDF, 1706.63 MB).

Talk / Highlights

Boosting Model Robustness by Leveraging Data Augmentations, Stability Training, and Noise Injections

With Soon Hoe Lim. Published April 27, 2022.
