I was watching Andrew Ng's lecture on machine learning and I came across the 'geometric margin' in the SVM lecture. I am confused about how he obtained the equation for the point $B$.

Notice that the hyperplane is the slanted line where $w^Tx + b = 0$.
The main question: How did he obtain $$B = x^{(i)} - \gamma^{(i)} \frac{w}{||w||}$$
I have several questions to ask:

Is the line segment $AB$ perpendicular to the decision boundary (the hyperplane where $w^Tx + b = 0$)?
The most confusing part for me is: why does he *subtract* $\gamma^{(i)} \frac{w}{||w||}$ from $x^{(i)}$? What does it mean geometrically?
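For concreteness, here is a small numerical check of my current understanding (the specific values of $w$, $b$, and $x^{(i)}$ are made up, not from the lecture; I am assuming $\gamma^{(i)}$ is the signed distance from $x^{(i)}$ to the hyperplane):

```python
import numpy as np

# Made-up example values, not from the lecture
w = np.array([2.0, 1.0])
b = -3.0
x_i = np.array([4.0, 5.0])

# Signed distance from x_i to the hyperplane w^T x + b = 0
gamma_i = (w @ x_i + b) / np.linalg.norm(w)

# The point B: step from x_i back along the unit normal w/||w||
B = x_i - gamma_i * w / np.linalg.norm(w)

# If I understand the formula correctly, B should lie on the hyperplane
print(w @ B + b)  # prints 0.0
```

This seems to confirm that $B$ lands exactly on the hyperplane, but I would still like to understand *why* the subtraction achieves that.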
Thanks to anyone who can explain the ideas behind this.