Loss of GradientBoostingRegressor

Hi,

As the scoring of the cross-validation is neg_mean_absolute_error, should we, to be consistent, set the loss of GradientBoostingRegressor to the absolute error?

Here it leads to slightly better results: the mean error drops from 46 k$ with the squared error to 44 k$ with the absolute error. I then have two questions:
is this a general trend?
is it the same to have loss='absolute_error' and loss='quantile' with alpha=0.5, and if so, is one to be preferred?
thanks

This is not necessary. The score used to validate the model can be different from the loss optimized by the model. Using an absolute loss in the regressor means that the estimator will be more robust to outliers and will estimate the conditional median instead of the conditional mean. The scoring of the cross-validation can be any metric, like the MAE or, for instance, a business-oriented metric adapted to the problem at hand.

As stated above, minimizing the absolute loss yields a better estimator of the median. So we can expect the MAE to decrease whenever the mean and the median of the target differ, e.g. with a skewed target distribution.
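A tiny numpy sketch of this point (the skewed sample is made up for the example): on a right-skewed distribution the mean and the median differ, and the constant median predictor achieves a lower MAE than the constant mean predictor.

```python
import numpy as np

rng = np.random.default_rng(0)
# Log-normal sample: heavily right-skewed, so mean > median.
y = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)

mae_of_mean = np.abs(y - y.mean()).mean()        # MAE of the "mean" predictor
mae_of_median = np.abs(y - np.median(y)).mean()  # MAE of the "median" predictor
print(mae_of_mean, mae_of_median)
```

This is the property behind the observation in the question: the median minimizes the expected absolute error, so a model trained with the absolute loss tends to score better under MAE when the target is skewed.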

Indeed, the absolute loss is equivalent to the quantile loss (also called the pinball loss) with alpha=0.5: at that quantile the pinball loss is half the absolute loss, so both lead to the same fitted model and either can be used.