AdaBoost is a machine learning technique that combines many weak classifiers into a single strong classifier to improve classification performance. Gentle AdaBoost is a variant of AdaBoost that introduces Newton steps into the boosting process. It has been proved that, considering both training error and generalization error, the overall performance of Gentle AdaBoost is better than that of other AdaBoost variants on low-noise data. However, it suffers from overfitting
when the training data contain high levels of noise. To address this problem, we propose a new approach that limits the weight distortion according to a stretched distribution of the full set of sample weights. Experimental results show that our
algorithm achieves lower generalization error on both standard and noise-injected datasets. Moreover, our method does not increase computation time compared with Gentle AdaBoost.
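To make the setting concrete, the Python sketch below shows a standard Gentle AdaBoost training loop (weighted regression stumps as Newton steps, exponential weight updates) with a weight-limiting step inserted after each update. The quantile-based cap, along with the names gentle_adaboost, predict, and max_weight_quantile, is purely illustrative: it stands in for the stretched-distribution rule, whose exact form is not given in this summary.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gentle_adaboost(X, y, n_rounds=50, max_weight_quantile=0.95):
    # y must take values in {-1, +1}.
    n = X.shape[0]
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    learners = []
    for _ in range(n_rounds):
        # Newton step: weighted least-squares fit of a regression stump,
        # as in the standard Gentle AdaBoost formulation
        stump = DecisionTreeRegressor(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        f = stump.predict(X)
        # Standard exponential weight update
        w = w * np.exp(-y * f)
        # Hypothetical distortion limit: cap each weight at a high
        # quantile of the current weight distribution so that a few
        # noisy points cannot dominate later rounds. This is only an
        # illustration, not the paper's stretched-distribution rule.
        cap = np.quantile(w, max_weight_quantile)
        w = np.minimum(w, cap)
        w /= w.sum()                 # renormalize to a distribution
        learners.append(stump)
    return learners

def predict(learners, X):
    # Classify by the sign of the additive model F(x) = sum_m f_m(x)
    F = np.sum([h.predict(X) for h in learners], axis=0)
    return np.sign(F)

The key design point is that the cap depends on the weight distribution of all samples rather than on a fixed constant, so the limit adapts as boosting concentrates weight on hard (and possibly mislabeled) examples.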