Weights and Errors

As we can see in the last plot, as the weight increases the error decreases. From my understanding, because this classifier got its first predictions right it had fewer errors, while the black data points are far from the decision boundary, so they lead to a higher average weight. Is my understanding right?

Compared to the other two predictions, we get a lower average weight because the points are near the decision boundary, and since there are more wrong predictions, the error increases accordingly.

In AdaBoost, the fewer mistakes a weak classifier makes, the more confidence one can have in that classifier, and thus the higher its weight.
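This inverse relationship between error and weight can be sketched with the standard AdaBoost formula, alpha = 0.5 * ln((1 - error) / error). The function name and the clamping epsilon below are my own choices for illustration:

```python
import math

def classifier_weight(error, eps=1e-10):
    # AdaBoost weight of a weak classifier:
    #   alpha = 0.5 * ln((1 - error) / error)
    # Lower error -> larger alpha -> more say in the final ensemble.
    # Clamp the error away from 0 and 1 to avoid log(0) / division by zero.
    error = min(max(error, eps), 1 - eps)
    return 0.5 * math.log((1 - error) / error)

for err in (0.1, 0.3, 0.5):
    print(f"error={err:.1f} -> alpha={classifier_weight(err):.3f}")
```

Note that a classifier with 50% error (no better than random guessing) gets weight zero, while weights grow as the error approaches zero.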