
Publication Information


Title
Japanese:MSR-DARTS: Minimum Stable Rank of Differentiable Architecture Search 
English:MSR-DARTS: Minimum Stable Rank of Differentiable Architecture Search 
Author
Japanese: 町田 兼梧, 宇都 有昭, 篠田 浩一, 鈴木 大慈.  
English: Kengo Machida, Kuniaki Uto, Koichi Shinoda, Taiji Suzuki.  
Language English 
Journal/Book name
Japanese: 
English:Proc. IJCNN2022 
Volume, Number, Page        
Published date July 2022 
Publisher
Japanese: 
English:IEEE 
Conference name
Japanese: 
English:International Joint Conference on Neural Networks (IJCNN) 2022 
Conference site
Japanese: 
English:Padova 
Official URL https://wcci2022.org/
 
Abstract In neural architecture search (NAS), differentiable architecture search (DARTS) has recently attracted much attention due to its high efficiency. However, this method tends to find a model whose weights converge faster than those of the others, and a model with the fastest convergence often overfits. Consequently, the resulting model is not always well generalized. To overcome this problem, we propose minimum stable rank DARTS (MSR-DARTS), a method that finds a model with the best generalization error by replacing architecture optimization with a selection process based on the minimum stable rank criterion. Specifically, each convolution operator is represented by a matrix, and MSR-DARTS selects the one with the smallest stable rank. We evaluated MSR-DARTS on the CIFAR-10 and ImageNet datasets. It achieves an error rate of 2.54% with 4.0M parameters within 0.3 GPU-days on CIFAR-10, and a top-1 error rate of 23.9% on ImageNet.
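
The selection criterion described in the abstract can be illustrated with a short sketch. The stable rank of a matrix is the standard quantity ||W||_F² / ||W||₂² (squared Frobenius norm over squared spectral norm); the function and candidate-selection helper below are illustrative names, not the authors' released code, and flattening a 4-D convolution kernel to a 2-D matrix is one common convention assumed here:

```python
import numpy as np

def stable_rank(w: np.ndarray) -> float:
    """Stable rank ||W||_F^2 / ||W||_2^2, computed from singular values."""
    flat = w.reshape(w.shape[0], -1)  # flatten e.g. a conv kernel to 2-D
    s = np.linalg.svd(flat, compute_uv=False)
    return float((s ** 2).sum() / s[0] ** 2)

def select_min_stable_rank(candidates) -> int:
    """Index of the candidate operator with the smallest stable rank."""
    return min(range(len(candidates)), key=lambda i: stable_rank(candidates[i]))
```

For intuition: an n-by-n identity matrix has stable rank n, while any rank-one matrix has stable rank 1, so the selector would prefer the rank-one operator.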

©2007 Tokyo Institute of Technology All rights reserved.