Hi, I am wondering: if you add more samples to the training set, wouldn’t it be more probable for the model to overfit?
I got this question wrong too. But I think it’s because a larger training set is less likely to be affected by noise/peculiarities, since those get smoothed out, so the model is less likely to overfit. Training error actually increases, because the model can no longer fit every individual point (and its noise) exactly.
Do correct me if I’m wrong!
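Here’s a quick sketch of that intuition (just NumPy and a made-up noisy-sine toy dataset, nothing from the course): fit a fixed-capacity model on more and more training samples and watch the train/test error. Training MSE should rise toward the noise level while test MSE comes down, i.e. less overfitting.

```python
# Toy illustration: more training data -> higher train error, lower test error.
import numpy as np

rng = np.random.default_rng(0)

def noisy_sine(n):
    # True signal sin(2*pi*x) plus Gaussian noise (std 0.3, so noise MSE ~ 0.09)
    x = rng.uniform(0, 1, n)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)
    return x, y

x_test, y_test = noisy_sine(1000)  # large held-out test set

for n_train in [15, 100, 1000]:
    x_train, y_train = noisy_sine(n_train)
    coeffs = np.polyfit(x_train, y_train, deg=9)  # flexible fixed-capacity model
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"n_train={n_train:5d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
```

With only 15 samples the degree-9 polynomial chases the noise (train MSE well below 0.09, test MSE much higher); with 1000 samples both errors settle near the noise level.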
Yes, your intuition is correct.