
I was watching Andrew Ng's lecture on machine learning and I came across the 'geometric margin' in the SVM lecture. I am confused about how he obtained the equation for the point $B$.

[Figure from the lecture: a point $A = x^{(i)}$ and its projection $B$ onto the hyperplane, with the geometric margin $\gamma^{(i)}$ between them.]

Notice that the hyperplane is the slanted line where $w^Tx + b = 0$.

The main question: How did he obtain $$B = x^{(i)} - \gamma^{(i)} \frac{w}{||w||}$$

I have several questions to ask:

  1. Is the line segment $AB$ perpendicular to the decision boundary (the hyperplane where $w^Tx + b = 0$)?

  2. The most confusing part for me is: why does he *subtract* from $x^{(i)}$? What does it mean geometrically?

Thanks to anyone who can explain the ideas behind this.

mynameisJEFF

1 Answer


Geometrically, $B$ is the projection of the point $A$ onto the hyperplane, so

  1. $AB$ is perpendicular to the hyperplane.

  2. $\gamma^{(i)}$ is the shortest Euclidean distance from the point $A$ to the hyperplane.

The unit vector $\frac{w}{\|w\|}$ is normal to the hyperplane and points toward the side where $A$ lies, so starting at $A$ and moving a distance $\gamma^{(i)}$ *against* that normal lands exactly on the hyperplane; that is the meaning of the subtraction in $B = x^{(i)} - \gamma^{(i)}\frac{w}{\|w\|}$. (Relatedly, $-b/\|w\|$ is the signed distance from the origin to the hyperplane.) If $x_{A}$, resp. $x_{B}$, is the vector from the origin to the point $A$, resp. $B$, then $$ (x_{A}-x_{B})^{T}\frac{w}{\|w\|} = \gamma^{(i)} $$
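You can sanity-check the projection formula numerically. Below is a minimal sketch with a made-up 2-D example (the numbers are my own, not from the lecture): compute the signed distance $\gamma$ from a point to the hyperplane, step back along the unit normal, and verify the result lies on the hyperplane.

```python
import numpy as np

# Hypothetical example values: hyperplane w^T x + b = 0 in 2-D.
w = np.array([3.0, 4.0])    # normal vector, ||w|| = 5
b = -10.0
x_i = np.array([6.0, 7.0])  # point A, on the positive side of the hyperplane

w_unit = w / np.linalg.norm(w)

# Signed distance (geometric margin) from A to the hyperplane
gamma = (w @ x_i + b) / np.linalg.norm(w)   # (18 + 28 - 10) / 5 = 7.2

# Project A onto the hyperplane: B = x^(i) - gamma * w/||w||
B = x_i - gamma * w_unit

# B must satisfy the hyperplane equation w^T B + b = 0
print(gamma)             # 7.2
print(w @ B + b)         # 0.0 (up to floating point)
```

The same check works in any dimension, since the derivation never uses anything 2-D-specific.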

In this tutorial you will find a detailed derivation of these equations and a detailed formulation of the SVM optimization problem. Well worth reading.

jpmuc
  • thanks, actually i just managed to depict it geometrically to understand what exactly is happening. Thank you so much anyway – mynameisJEFF Mar 25 '14 at 09:40
  • Can you explain how you derived this? I am having the same confusion. – B_Miner Nov 13 '14 at 13:51
  • For others looking for this explanation: http://math.stackexchange.com/questions/1020345/how-to-derive-point-on-plane-from-normal-vector-geometric-margins – B_Miner Nov 14 '14 at 13:43