MutualNet: Adaptive ConvNet via Mutual Learning From Different Model Configurations

Taojiannan Yang, Sijie Zhu, Matias Mendieta, Pu Wang, Ravikumar Balakrishnan, Minwoo Lee, Tao Han, Mubarak Shah, Chen Chen

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

Most existing deep neural networks are static, which means they can only perform inference at a fixed complexity. However, the resource budget can vary substantially across different devices. Even on a single device, the affordable budget can change with different scenarios, and repeatedly training networks for each required budget would be prohibitively expensive. Therefore, in this work, we propose a general method called MutualNet to train a single network that can run at a diverse set of resource constraints. Our method trains a cohort of model configurations with various network widths and input resolutions. This mutual learning scheme not only allows the model to run at different width-resolution configurations but also transfers the unique knowledge among these configurations, helping the model to learn stronger representations overall. MutualNet is a general training methodology that can be applied to various network structures (e.g., 2D networks: MobileNets, ResNet; 3D networks: SlowFast, X3D) and various tasks (e.g., image classification, object detection, segmentation, and action recognition), and is demonstrated to achieve consistent improvements on a variety of datasets. Since we only train the model once, it also greatly reduces the training cost compared to independently training several models. Surprisingly, MutualNet can also be used to significantly boost the performance of a single network, if dynamic resource constraints are not a concern. In summary, MutualNet is a unified method for both static and adaptive, 2D and 3D networks. Code and pre-trained models are available at https://github.com/taoyang1122/MutualNet.
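The mutual learning scheme summarized in the abstract can be illustrated with a short, hedged sketch. The PyTorch-style code below is not the authors' implementation (see the linked repository for that); the `model(x, width_mult=w)` interface, the `mutual_learning_step` function, and the specific width and resolution ranges are hypothetical placeholders. It is only meant to mirror the stated idea: jointly train several width-resolution configurations of one network, supervising the full configuration with ground truth and distilling the sampled sub-configurations from its soft predictions.

import random
import torch.nn.functional as F

def mutual_learning_step(model, optimizer, images, labels,
                         min_width=0.25, max_width=1.0,
                         resolutions=(224, 192, 160, 128),
                         n_random_widths=2):
    """One illustrative training iteration (hypothetical interface):
    the full-width network learns from the hard labels, and randomly
    sampled width-resolution sub-configurations are distilled from
    its soft predictions so knowledge is shared among configurations."""
    optimizer.zero_grad()

    # Full width at full resolution: supervised by ground-truth labels.
    full_logits = model(images, width_mult=max_width)  # hypothetical kwarg
    loss = F.cross_entropy(full_logits, labels)
    loss.backward()
    soft_target = full_logits.detach().softmax(dim=1)

    # Smallest width plus a few random widths, each paired with a randomly
    # chosen input resolution, trained to match the full network's soft output.
    widths = [min_width] + [random.uniform(min_width, max_width)
                            for _ in range(n_random_widths)]
    for w in widths:
        r = random.choice(resolutions)
        sub_images = F.interpolate(images, size=(r, r),
                                   mode='bilinear', align_corners=False)
        sub_logits = model(sub_images, width_mult=w)
        kd_loss = F.kl_div(F.log_softmax(sub_logits, dim=1), soft_target,
                           reduction='batchmean')
        kd_loss.backward()  # gradients from every configuration accumulate

    # A single update applies the accumulated gradients to the shared weights.
    optimizer.step()
    return loss.item()

Under these assumptions, inference simply executes the shared weights at whichever width-resolution configuration fits the current resource budget.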

Original language: English (US)
Pages (from-to): 811-827
Number of pages: 17
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 45
Issue number: 1
DOIs
State: Published - Jan 1 2023

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics

Keywords

  • Dynamic neural networks
  • adaptive inference
  • deep learning
  • efficient neural networks
