Paper and Book Information


Title
Japanese:
English: Smooth Transfer Learning for Source-to-Target Generalization
Authors
Japanese: 髙山 啓太, 佐藤 育郎, 鈴木 哲平, 川上 玲, 宇都 有昭, 篠田 浩一
English: Keita Takayama, Ikuro Sato, Teppei Suzuki, Rei Kawakami, Kuniaki Uto, Koichi Shinoda
Language: English
Journal / Book title
Japanese:
English: Proc. NeurIPS 2021 Workshop on Distribution Shifts: Connecting Methods and Applications
Volume, Number, Pages:
Publication date: December 2021
Publisher
Japanese:
English:
Conference name
Japanese:
English: NeurIPS 2021 Workshop on Distribution Shifts: Connecting Methods and Applications
Venue
Japanese:
English:
Official link: https://openreview.net/forum?id=2FE0NwK3Jbn
 
Abstract: Transfer learning for deep models has shown great success on various recognition tasks. Typically, a backbone network is pre-trained on a source dataset and then fine-tuned on a target dataset. We consider that when both datasets are at hand, learning from them simultaneously, at least for some period of the training iterations, should yield higher test performance than this step-wise optimization. We propose Smooth Transfer Learning, which uses a learnable scheduler function for the loss coefficients so that the degree of contribution from each of the two datasets can be changed smoothly over training time for optimal target performance. The scheduler function is designed so that it can express either pre-training-then-fine-tuning or multi-task learning with fixed weights as a special case. Our method consistently outperforms these special cases in object classification with CIFAR-10 and CIFAR-100, and in digit classification with SVHN and MNIST.
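The abstract does not give the exact form of the scheduler, but the idea can be illustrated with a minimal PyTorch-style sketch. Everything below (the SmoothScheduler class, the mixed_loss helper, and the learnable shift/slope parameterization of a sigmoid over normalized training time) is an illustrative assumption, not the paper's actual implementation: with a very steep slope the schedule approaches pre-training-then-fine-tuning, and with a flat slope it reduces to multi-task learning with fixed weights.

import torch
import torch.nn as nn

class SmoothScheduler(nn.Module):
    # Learnable scheduler for the source/target loss coefficients
    # (illustrative assumption, not the paper's definition).
    # A sigmoid over normalized training time t in [0, 1]; shift and slope
    # are learnable, so the mixing can range from an abrupt switch
    # (pre-training-then-fine-tuning) to a constant mix (fixed-weight
    # multi-task learning).
    def __init__(self, init_shift=0.5, init_slope=10.0):
        super().__init__()
        self.shift = nn.Parameter(torch.tensor(float(init_shift)))
        self.slope = nn.Parameter(torch.tensor(float(init_slope)))

    def forward(self, t):
        # Coefficient on the source loss; the target coefficient is its complement.
        w_source = torch.sigmoid(-self.slope * (t - self.shift))
        return w_source, 1.0 - w_source

def mixed_loss(model, scheduler, source_batch, target_batch, t,
               criterion=nn.CrossEntropyLoss()):
    # Combine source and target losses with the scheduled coefficients,
    # where t is the current training progress in [0, 1].
    xs, ys = source_batch
    xt, yt = target_batch
    w_s, w_t = scheduler(torch.tensor(float(t)))
    return w_s * criterion(model(xs), ys) + w_t * criterion(model(xt), yt)

How the scheduler parameters are optimized (jointly with the network, or against a target-performance objective) is not specified in the abstract; the sketch simply exposes them as learnable parameters.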
