Hi,
Since the cross-validation scoring is neg_mean_absolute_error, should we, to be consistent, set the loss of GradientBoostingRegressor to 'absolute_error'?
Here it leads to slightly better results: the mean error drops from 46 k$ with squared error to 44 k$ with absolute error. I then have two questions:
Is this a general trend?
Is it the same to have loss='absolute_error' and loss='quantile' with alpha=0.5, and if so, is one to be preferred?
thanks