
Gradient boosting

Subsampling effects


Suppose we decided to run gradient boosting for 700 iterations, setting the learning rate to 0.5:

Figure: test set deviance (Y axis) versus boosting rounds (X axis) over 700 rounds with a learning rate of 0.5 (black curve).
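For concreteness, a run like this could be set up roughly as in the sketch below, using scikit-learn's GradientBoostingClassifier. The dataset (make_hastie_10_2), the train/test split, and the variable names are assumptions for illustration only, not the exact setup behind the plot.

```python
# Rough sketch of the run above: 700 boosting rounds, learning rate 0.5,
# tracking the binomial deviance on the test set after every round.
# The dataset and split are placeholders, not the ones behind the plot.
import numpy as np
from sklearn.datasets import make_hastie_10_2
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

X, y = make_hastie_10_2(n_samples=4000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier(n_estimators=700, learning_rate=0.5,
                                 random_state=0)
clf.fit(X_train, y_train)

# staged_predict_proba yields class probabilities after each boosting
# round, so 2 * log_loss gives the per-round test-set deviance that the
# black curve plots against the round number.
test_deviance = np.zeros(clf.n_estimators)
for i, y_proba in enumerate(clf.staged_predict_proba(X_test)):
    test_deviance[i] = 2 * log_loss(y_test, y_proba)
```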

Next, we applied subsampling twice: first with the same learning rate (0.5), and then with a learning rate of 0.3, producing the black and magenta curves shown below:

Figure: test set deviance (Y axis) versus boosting rounds (X axis) over 700 rounds with subsampling, at learning rates of 0.5 (black curve) and 0.3 (magenta curve).
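Continuing the sketch above, the two subsampled runs could look roughly like this. Setting subsample below 1.0 turns the procedure into stochastic gradient boosting, where each tree is fit on a random fraction of the training rows; the fraction of 0.5 used here is an assumed value, since the exercise does not state it.

```python
# Two subsampled runs, reusing X_train, X_test, y_train, y_test from the
# previous sketch. subsample=0.5 is an assumed fraction for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import log_loss

def staged_test_deviance(model, X_eval, y_eval):
    """Test-set binomial deviance after each boosting round."""
    deviance = np.zeros(model.n_estimators)
    for i, y_proba in enumerate(model.staged_predict_proba(X_eval)):
        deviance[i] = 2 * log_loss(y_eval, y_proba)
    return deviance

curves = {}
for label, lr in [("black (subsample, lr=0.5)", 0.5),
                  ("magenta (subsample, lr=0.3)", 0.3)]:
    model = GradientBoostingClassifier(n_estimators=700, learning_rate=lr,
                                       subsample=0.5, random_state=0)
    model.fit(X_train, y_train)
    curves[label] = staged_test_deviance(model, X_test, y_test)
```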

What can be said about the performance, and about the effects of the learning rate and of subsampling?

Select one or more options from the list
___
