Publication Information


Title
Japanese: 
English: Smooth Transfer Learning for Source-to-Target Generalization
Author
Japanese: 髙山 啓太, 佐藤 育郎, 鈴木 哲平, 川上 玲, 宇都 有昭, 篠田 浩一.  
English: Keita Takayama, Ikuro Sato, Teppei Suzuki, Rei Kawakami, Kuniaki Uto, Koichi Shinoda.  
Language English 
Journal/Book name
Japanese: 
English: Proc. NeurIPS 2021 Workshop on Distribution Shifts: Connecting Methods and Applications
Volume, Number, Page        
Published date Dec. 2021 
Publisher
Japanese: 
English: 
Conference name
Japanese: 
English: NeurIPS 2021 Workshop on Distribution Shifts: Connecting Methods and Applications
Conference site
Japanese: 
English: 
Official URL https://openreview.net/forum?id=2FE0NwK3Jbn
 
Abstract Transfer learning for deep models has shown great success on various recognition tasks. Typically, a backbone network is pre-trained on a source dataset and then fine-tuned on a target dataset. We posit that when both datasets are at hand, learning them simultaneously, at least for some period of the training iterations, yields higher test performance than step-wise optimization. We propose Smooth Transfer Learning, which uses a learnable scheduler function for the loss coefficients so that the contributions of the two datasets can be changed smoothly over the course of training for optimal target performance. The scheduler function is designed to express either pre-training-then-fine-tuning or multi-task learning with fixed weights as special cases. Our method consistently outperforms both special cases in object classification on CIFAR-10 and CIFAR-100, and in digit classification on SVHN and MNIST.
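The idea of smoothly shifting loss contributions from the source to the target dataset can be sketched as below. This is a minimal illustration, not the paper's implementation: the sigmoid parameterization of the scheduler, and the names `scheduler`, `t0`, and `tau`, are assumptions chosen so that a sharp transition approximates pre-training-then-fine-tuning, while a flat schedule approximates multi-task learning with fixed weights.

```python
import math

def scheduler(t, t0, tau):
    # Hypothetical sigmoid-shaped scheduler (an assumption, not the
    # paper's learnable function): w rises smoothly from ~0 to ~1 as
    # training step t passes t0; tau controls how sharp the transition is.
    return 1.0 / (1.0 + math.exp(-(t - t0) / tau))

def combined_loss(loss_source, loss_target, t, t0, tau):
    # Blend the two per-dataset losses with time-dependent coefficients.
    # Small w emphasizes the source dataset (pre-training-like); large w
    # emphasizes the target dataset (fine-tuning-like).
    w = scheduler(t, t0, tau)
    return (1.0 - w) * loss_source + w * loss_target
```

With a very small `tau` the blend behaves like a hard switch from source-only to target-only training, and with a midpoint `t0` far outside the training horizon it stays near a fixed weight, which is how both baselines arise as special cases.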
