
Publication Information


Title
Japanese: MSR-DARTS: Minimum Stable Rank of Differentiable Architecture Search
English: MSR-DARTS: Minimum Stable Rank of Differentiable Architecture Search
Authors
Japanese: 町田 兼梧, 宇都 有昭, 篠田 浩一, 鈴木 大慈
English: Kengo Machida, Kuniaki Uto, Koichi Shinoda, Taiji Suzuki
Language: English
Journal/Book Title
Japanese:
English: Proc. IJCNN 2022
Volume, Number, Pages:
Publication Date: July 2022
Publisher
Japanese:
English: IEEE
Conference Name
Japanese:
English: International Joint Conference on Neural Networks (IJCNN) 2022
Venue
Japanese:
English: Padova
Official Link: https://wcci2022.org/
 
Abstract: In neural architecture search (NAS), differentiable architecture search (DARTS) has recently attracted much attention due to its high efficiency. However, this method tends to select the model whose weights converge faster than those of the others, and such fast convergence often leads to overfitting; accordingly, the resulting model does not always generalize well. To overcome this problem, we propose a method called minimum stable rank DARTS (MSR-DARTS), which finds the model with the best generalization error by replacing architecture optimization with a selection process using a minimum stable rank criterion. Specifically, each convolution operator is represented by a matrix, and MSR-DARTS selects the one with the smallest stable rank. We evaluated MSR-DARTS on the CIFAR-10 and ImageNet datasets. It achieves an error rate of 2.54% with 4.0M parameters within 0.3 GPU-days on CIFAR-10, and a top-1 error rate of 23.9% on ImageNet.
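The selection criterion described in the abstract can be sketched in a few lines. This is a minimal illustration only: the stable rank of a matrix A is ‖A‖_F² / ‖A‖_2² (squared Frobenius norm over squared spectral norm), and the candidate names and matrix shapes below are hypothetical, not taken from the paper.

```python
import numpy as np

def stable_rank(A):
    # stable rank = squared Frobenius norm / squared spectral norm
    fro_sq = np.sum(A ** 2)
    spec = np.linalg.norm(A, 2)  # largest singular value
    return fro_sq / spec ** 2

# Hypothetical candidate operators, each flattened to a weight matrix.
rng = np.random.default_rng(0)
candidates = {
    "op_a": rng.standard_normal((64, 9 * 64)),
    "op_b": rng.standard_normal((64, 25 * 64)),
    "op_c": rng.standard_normal((64, 64)),
}

# Select the candidate with the smallest stable rank, as in the
# minimum stable rank criterion the abstract describes.
best = min(candidates, key=lambda name: stable_rank(candidates[name]))
```

For intuition, the identity matrix I_n has stable rank n, while any rank-1 matrix has stable rank exactly 1, so a smaller stable rank indicates energy concentrated in few singular directions.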

©2007 Institute of Science Tokyo All rights reserved.