I have read the description of the ELM in this question, but a lot of it goes over my head.

What is a less mathematical, more intuitive understanding of ELM?

– Tom Hale
  • A quick question just to clarify from which angle you are trying to understand it: is it 'ok' for you to take basis functions other than the very simple projections $\phi_j(x_1, ..., x_d) = x_j$ in linear regression? In that case the author of the post you are referring to gives the answer: it actually is a linear regression with a fixed but random choice of basis functions...

    Ooops... I'm probably a little late :-(

    – Fabian Werner Feb 02 '18 at 11:00
  • @FabianWerner What do you mean by a basis function? I understand backprop and NN basics having done the first few weeks of Andrew Ng's Machine Learning course. I haven't yet seen the projection (what is one anyway) that you mention above. – Tom Hale Feb 26 '18 at 08:36
  • So let us talk about regression. Initially we write $f(x) = y = w_1x_1 + ...+ w_dx_d$, but it turns out that one can use much more complicated ingredients than just $x_1, ..., x_d$; i.e. we could write $f(x) = w_1 e^{-x_1} + ... + w_d e^{-x_d}$, i.e. any complicated functions. This is still linear regression in the sense that you do not need to change the algorithm: just use any gradient descent method; only the gradient looks a little different now. In general $f(x) = w_1\phi_1(x) + ... + w_d\phi_d(x)$. The $\phi_j$ are called 'basis functions', and ELM is just one fixed choice of basis functions – Fabian Werner Feb 26 '18 at 16:53
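The "linear regression on random basis functions" view in the comments above can be sketched in a few lines of numpy. This is a minimal illustration, not a reference ELM implementation: the choice of `tanh` activation, 50 hidden units, and Gaussian random weights are all illustrative assumptions; the only essential ELM ingredients are that the hidden-layer weights are drawn at random and frozen, and only the output weights are fit by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(x) plus a little noise
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)

# ELM step 1: a hidden layer with FIXED random weights.
# Each hidden unit is one random basis function phi_j(x) = tanh(x @ W_j + b_j).
n_hidden = 50                                # illustrative choice
W = rng.normal(size=(X.shape[1], n_hidden))  # never trained
b = rng.normal(size=n_hidden)                # never trained
H = np.tanh(X @ W + b)                       # H[i, j] = phi_j(x_i)

# ELM step 2: plain linear least squares on the hidden outputs.
# This is the only "learning" that happens.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
print(f"training MSE: {mse:.4f}")
```

The point of the sketch: no backprop ever touches `W` or `b`. Because they are frozen, the model is linear in `beta`, so fitting it is a single closed-form least-squares solve rather than an iterative optimisation.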

0 Answers