Computational cost

Can you explain how we can see that the computational cost is reduced, compared with searching for the optimal hyperparameters? Here is what I get:

Decision tree regressor with default parameters:

R2 score obtained by cross-validation: 0.354 +/- 0.087
CPU times: user 4.04 ms, sys: 36.7 ms, total: 40.8 ms
Wall time: 1.03 s
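For reference, the first timing presumably comes from cross-validating a plain `DecisionTreeRegressor`, something like the sketch below. The dataset, CV settings, and random seeds here are assumptions (the original notebook likely used a housing dataset); in a notebook the timings would come from the `%%time` cell magic.

```python
# Sketch (assumptions: synthetic data stands in for the original dataset).
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_validate
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=2000, n_features=8, noise=20.0, random_state=0)

# Default hyperparameters: the tree is grown fully, no tuning involved.
tree = DecisionTreeRegressor(random_state=0)
cv_results = cross_validate(tree, X, y, cv=5, scoring="r2")
scores = cv_results["test_score"]
print(f"R2 score obtained by cross-validation: "
      f"{scores.mean():.3f} +/- {scores.std():.3f}")
```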

Grid search to tune the hyperparameters:

R2 score obtained by cross-validation: 0.523 +/- 0.107
CPU times: user 9.27 ms, sys: 4.67 ms, total: 13.9 ms
Wall time: 3.62 s
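The second, slower timing is consistent with nesting a grid search inside the cross-validation, roughly as sketched below. The parameter grid and CV settings are my assumptions, not the original notebook's; the point is only that every outer fold now refits the model once per grid candidate, which is why the wall time grows.

```python
# Sketch (assumptions: synthetic data and a hypothetical max_depth grid).
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, cross_validate
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=2000, n_features=8, noise=20.0, random_state=0)

# Inner grid search: each outer fold tunes max_depth on its own training set.
param_grid = {"max_depth": [3, 5, 8, None]}
search = GridSearchCV(DecisionTreeRegressor(random_state=0), param_grid, cv=3)

cv_results = cross_validate(search, X, y, cv=5, scoring="r2")
scores = cv_results["test_score"]
print(f"R2 score obtained by cross-validation: "
      f"{scores.mean():.3f} +/- {scores.std():.3f}")
```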

Bagging with 50 decision trees:

R2 score obtained by cross-validation: 0.642 +/- 0.083
CPU times: user 15 ms, sys: 4.31 ms, total: 19.3 ms
Wall time: 3.82 s
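The third timing would correspond to cross-validating a `BaggingRegressor` built from 50 untuned decision trees, as in the sketch below (again with my assumed dataset and seeds). The design point being compared: bagging 50 default trees costs about the same wall time as the grid search here, but reaches a better R2 without any hyperparameter tuning.

```python
# Sketch (assumptions: synthetic data stands in for the original dataset).
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import cross_validate
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=2000, n_features=8, noise=20.0, random_state=0)

# 50 fully-grown trees, each fit on a bootstrap sample; no tuning needed.
bagging = BaggingRegressor(DecisionTreeRegressor(), n_estimators=50,
                           random_state=0)

cv_results = cross_validate(bagging, X, y, cv=5, scoring="r2")
scores = cv_results["test_score"]
print(f"R2 score obtained by cross-validation: "
      f"{scores.mean():.3f} +/- {scores.std():.3f}")
```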

What is the “tree in depth” notebook that you mentioned?

Thanks for your time

I already stated the problem in this post: Potentially misleading comment on computational cost of BaggingRegressor.
In short, you can see the gain in computational cost locally on your own computer, but something is wrong on the server.
As for “tree in depth”, I think it is this notebook: Importance of decision tree hyperparameters on generalization

Thank you, Echidne!