You have two classes of points. Instead of managing them in two separate sets, you simply assign each point in the first class the value $-1$ and each point in the second class the value $+1$, so that you have point-value pairs $(x_i,y_i)$. To classify future points in a consistent way you now want to construct a function $f(x)$ that satisfies not exactly $f(x_i)=y_i$ as in interpolation, but the weaker, still sufficient condition $f(x_i)\le -1$ for points in the first class and $f(x_i)\ge +1$ for points in the second class. These two kinds of inequalities can be compressed into a single family of inequalities by multiplying with the sign $y_i$,
$$y_if(x_i)\ge 1$$
for all $i=1,...,N$, where $N$ is the number of training points.
"for all" has the symbolic sign $\forall$, an inverted letter "A". The inverted letter "E", $\exists$, is the symbol for "exists".
Now to find such a function, you select a parametrized class of functions $f(w,x)+b$ with some parameter vector $(w,b)$ and strive for a compromise between having a simple form of $f$ and small function values on the training set; ideally the margin constraints become tight, $f(w,x_i)+b=y_i$, on as many points as possible, and the training points where this equality holds are the support vectors. Simplicity includes that the parameters in $w$ are small numbers.
So we come to the linear SVM, where $f(w,x)=w^Tx$ and minimal parameters means minimizing $\|w\|_2^2=w^Tw$.
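Minimizing $\tfrac12\|w\|_2^2$ subject to these margin constraints is a small quadratic program. Here is a sketch using SciPy's general-purpose SLSQP solver rather than a dedicated SVM routine (toy data as above):

```python
import numpy as np
from scipy.optimize import minimize

X = np.array([[0.0, 0.0], [0.5, 0.2], [2.0, 2.0], [2.5, 1.8]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
n, d = X.shape

# Pack the unknowns as z = (w, b); the objective only involves w.
objective = lambda z: 0.5 * z[:d] @ z[:d]

# One inequality constraint per training point: y_i (w^T x_i + b) - 1 >= 0.
constraints = [{"type": "ineq",
                "fun": lambda z, i=i: y[i] * (X[i] @ z[:d] + z[d]) - 1.0}
               for i in range(n)]

res = minimize(objective, x0=np.zeros(d + 1), constraints=constraints)
w, b = res.x[:d], res.x[d]
print(w, b, y * (X @ w + b))  # the minimal-norm w; all margins come out >= 1
```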
In optimization, this task is encoded via a Lagrange function
$$L(w,b,\alpha)=\tfrac12\|w\|_2^2-\sum_{i=1}^N\alpha_i\,\bigl(y_i(w^Tx_i+b)-1\bigr)$$
with the restriction $\alpha_i\ge 0$.
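In code, the Lagrange function is a direct transcription (a sketch; the restriction $\alpha_i\ge 0$ is not enforced here, it only delimits where $L$ is to be considered):

```python
import numpy as np

def lagrangian(w, b, alpha, X, y):
    # Constraint expressions y_i (w^T x_i + b) - 1, one per training point.
    margins = y * (X @ w + b) - 1.0
    # L = (1/2)||w||^2 - sum_i alpha_i * (y_i (w^T x_i + b) - 1).
    return 0.5 * (w @ w) - alpha @ margins
```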
Standard optimization techniques solve this problem via its Karush–Kuhn–Tucker (KKT) system:
\begin{align}
0=\frac{\partial L}{\partial w}&=w-\sum_{i=1}^N\alpha_iy_ix_i\\
0=\frac{\partial L}{\partial b}&=-\sum_{i=1}^N\alpha_i y_i\\
\alpha_i&\ge 0\\
y_i(w^Tx_i+b)-1&\ge 0\\
\alpha_i\,\bigl(y_i(w^Tx_i+b)-1\bigr)&=0
\end{align}
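Numerically, the five conditions translate into a residual whose largest entry measures how far a candidate $(w,b,\alpha)$ is from satisfying the KKT system (a sketch; the function name is made up):

```python
import numpy as np

def kkt_violation(w, b, alpha, X, y):
    """Largest violation of the five KKT conditions; 0 means they all hold."""
    margins = y * (X @ w + b) - 1.0
    return max(
        np.abs(w - (alpha * y) @ X).max(),  # stationarity in w
        abs(alpha @ y),                     # stationarity in b
        max(0.0, -alpha.min()),             # alpha_i >= 0
        max(0.0, -margins.min()),           # y_i (w^T x_i + b) - 1 >= 0
        np.abs(alpha * margins).max(),      # complementary slackness
    )
```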
The last three conditions again hold for all $i$. They can be combined, using NCP (nonlinear complementarity problem) functions such as
$$N(u,v)=2uv-(u+v)_-^2$$
with $(u+v)_-=\min(0,u+v)$, into one condition per $i$:
$$N(\alpha_i,\, y_i(w^Tx_i+b)-1)=0.$$
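One can verify the defining property of an NCP function directly: $N(u,v)=0$ holds exactly when $u\ge 0$, $v\ge 0$, and $uv=0$. A minimal numerical sketch:

```python
import numpy as np

# The NCP function from above: N(u, v) = 2uv - (min(0, u+v))^2.
def ncp(u, v):
    return 2.0 * u * v - np.minimum(0.0, u + v) ** 2

print(ncp(0.0, 3.0), ncp(2.0, 0.0))   # 0.0 0.0   complementary pairs
print(ncp(1.0, 1.0), ncp(-1.0, 2.0))  # 2.0 -4.0  both positive / one negative
```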
The reformulated system is now smooth enough (the function $N$ is continuously differentiable) that Newton's method or quasi-Newton methods may be applied.
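To make this concrete, here is a sketch of a plain Newton iteration on the combined system in the unknowns $(w,b,\alpha)$, with a hand-derived Jacobian using the fact that $\frac{d}{dt}\min(0,t)^2=2\min(0,t)$. The toy data are the assumed ones from above, and there is no line search or other globalization, so convergence from an arbitrary starting point is not guaranteed:

```python
import numpy as np

X = np.array([[0.0, 0.0], [0.5, 0.2], [2.0, 2.0], [2.5, 1.8]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
n, d = X.shape

def residual(z):
    w, b, a = z[:d], z[d], z[d + 1:]
    m = y * (X @ w + b) - 1.0                    # m_i = y_i (w^T x_i + b) - 1
    return np.concatenate([
        w - (a * y) @ X,                         # dL/dw = 0
        [a @ y],                                 # dL/db = 0 (up to sign)
        2 * a * m - np.minimum(0, a + m) ** 2,   # N(alpha_i, m_i) = 0
    ])

def jacobian(z):
    w, b, a = z[:d], z[d], z[d + 1:]
    m = y * (X @ w + b) - 1.0
    s = np.minimum(0.0, a + m)                   # the kink term (u+v)_-
    dNdu, dNdv = 2 * m - 2 * s, 2 * a - 2 * s    # partial derivatives of N
    J = np.zeros((d + 1 + n, d + 1 + n))
    J[:d, :d] = np.eye(d)
    J[:d, d + 1:] = -(y[:, None] * X).T          # w-block w.r.t. alpha
    J[d, d + 1:] = y                             # b-block w.r.t. alpha
    J[d + 1:, :d] = (dNdv * y)[:, None] * X      # N-block w.r.t. w (chain rule)
    J[d + 1:, d] = dNdv * y                      # N-block w.r.t. b
    J[d + 1:, d + 1:] = np.diag(dNdu)            # N-block w.r.t. alpha
    return J

z = np.ones(d + 1 + n)                           # arbitrary starting point
for _ in range(100):
    F = residual(z)
    if np.linalg.norm(F) < 1e-10:
        break
    # Least-squares solve guards against an occasionally singular Jacobian.
    z -= np.linalg.lstsq(jacobian(z), F, rcond=None)[0]

w, b, alpha = z[:d], z[d], z[d + 1:]
print(w, b, alpha, np.linalg.norm(residual(z)))  # final residual shows success
```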