Questions tagged [partial-least-squares]

A class of linear methods for modeling the relationship between two groups of variables, X and Y. Includes PLS regression.

Partial Least Squares (PLS) is a class of linear methods for modeling the relationship between two groups of variables (X and Y). It includes regression methods, where X are independent variables and Y are dependent, as well as modeling methods that treat X and Y symmetrically. All PLS methods revolve around maximizing covariance between linear combinations of variables in X and Y, but several different variants exist in the literature.
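A minimal R sketch of this covariance maximisation (simulated data, an illustration added here rather than part of the tag wiki): the first pair of weight vectors can be read off the singular value decomposition of the cross-product matrix X'Y.

    ## Minimal sketch (simulated data): the first pair of PLS weight vectors
    ## maximises Cov(X a_x, Y a_y) under unit-norm weights and is given by the
    ## leading singular vectors of the cross-product matrix X'Y.
    set.seed(1)
    X <- scale(matrix(rnorm(100 * 5), 100, 5), scale = FALSE)  # centred X
    Y <- scale(matrix(rnorm(100 * 3), 100, 3), scale = FALSE)  # centred Y
    s   <- svd(crossprod(X, Y))    # SVD of X'Y
    a_x <- s$u[, 1]                # first X weight vector, ||a_x|| = 1
    a_y <- s$v[, 1]                # first Y weight vector, ||a_y|| = 1
    cov(X %*% a_x, Y %*% a_y)      # the maximised covariance (up to sign)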

266 questions
4 votes, 1 answer

One-dimensional PLS

We know one of the definitions of partial least squares (PLS) is: $$\max\limits_{\alpha_x,\alpha_y}\operatorname{Cov}(\alpha_x^T x,\alpha_y^T y) \quad \text{subject to } \|\alpha_x\| = \|\alpha_y\| = 1.$$ Here $x = (x_1,\cdots,x_n)$ is the dependent variable; $y = (y_1,\cdots,y_m)$ is…
user6703592
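A standard linear-algebra fact that may help with this formulation (not taken from the question or its answer): the maximising pair $(\alpha_x, \alpha_y)$ is the leading pair of singular vectors of the cross-covariance matrix, $$\Sigma_{xy} = \operatorname{Cov}(x, y) = \sum_k \sigma_k u_k v_k^T, \qquad (\alpha_x, \alpha_y) = (u_1, v_1),$$ and the maximal value of $\operatorname{Cov}(\alpha_x^T x, \alpha_y^T y)$ equals the largest singular value $\sigma_1$ of $\Sigma_{xy}$.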
3 votes, 1 answer

Partial Least Squares regression - coefficients vs loadings

In partial least squares regression, what is the difference between the regression coefficients and the loadings for each independent variable in each component? Specifically, I understand that in every component, each of the independent variables has a…
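A small illustration of the distinction (simulated data, pls package assumed available): the loadings describe how each X variable enters each latent component, while the regression coefficients collapse all components into a single coefficient per X variable for predicting y.

    ## Illustrative sketch (simulated data): loadings relate the X variables to
    ## each latent component; coefficients map the X variables directly to y.
    library(pls)
    set.seed(2)
    X <- matrix(rnorm(50 * 4), 50, 4, dimnames = list(NULL, paste0("x", 1:4)))
    y <- X %*% c(1, -0.5, 0, 0.25) + rnorm(50)
    fit <- plsr(y ~ X, ncomp = 2, scale = TRUE)
    fit$loadings          # one column of X loadings per component
    coef(fit, ncomp = 2)  # one regression coefficient per X variable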
3 votes, 1 answer

What is the origin of the PLS1 algorithm given on the PLS Wikipedia page?

The Wikipedia page for Partial Least Squares (PLS) gives an algorithm for the method that is uncited and for which I cannot find the source material. It appears to be much simpler than most, if not all, other implementations I have come…
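For readers who just want something runnable next to the article, here is a generic textbook-style PLS1 sketch in R (a simplified version written for this listing, not a transcription of the Wikipedia pseudocode): one component is extracted at a time, and X and y are deflated before extracting the next one.

    ## Textbook-style PLS1 sketch (single y), not a verbatim copy of the
    ## Wikipedia algorithm: extract one component at a time and deflate.
    pls1 <- function(X, y, A) {
      X <- scale(X, scale = FALSE)                 # centre X
      y <- as.vector(scale(y, scale = FALSE))      # centre y
      p <- ncol(X)
      W <- P <- matrix(0, p, A)
      q <- numeric(A)
      for (a in seq_len(A)) {
        w  <- crossprod(X, y); w <- w / sqrt(sum(w^2))  # weight vector
        tt <- as.vector(X %*% w)                        # score vector
        P[, a] <- crossprod(X, tt) / sum(tt^2)          # X loading
        q[a]   <- sum(y * tt) / sum(tt^2)               # y loading
        X <- X - tcrossprod(tt, P[, a])                 # deflate X
        y <- y - q[a] * tt                              # deflate y
        W[, a] <- w
      }
      B <- W %*% solve(crossprod(P, W), q)              # coefficients (centred data)
      list(coefficients = B, weights = W, loadings = P)
    }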
2 votes, 0 answers

Partial Least Squares NIPALS Algorithm when Y has more than one column (PLS2)

I wanted to understand exactly how Partial Least Squares Regression works and so got my hands on a paper called "A Simple Explanation of Partial Least Squares". After some thought, and after consulting other related papers and tutorials, I think I…
Joel H
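A generic NIPALS-style sketch for multivariate Y (again a simplified illustration, not the algorithm from the cited paper): the inner loop alternates between X-weights and Y-weights until the X score vector stabilises, after which both blocks are deflated.

    ## Generic NIPALS-style PLS2 sketch (extracts one component from centred X, Y);
    ## a simplified illustration, not taken from the paper in the question.
    nipals_component <- function(X, Y, tol = 1e-10, maxit = 500) {
      u <- Y[, 1]                                           # start from a Y column
      t_old <- 0
      for (i in seq_len(maxit)) {
        w  <- crossprod(X, u);  w  <- w  / sqrt(sum(w^2))   # X weights
        tx <- as.vector(X %*% w)                            # X scores
        cy <- crossprod(Y, tx); cy <- cy / sqrt(sum(cy^2))  # Y weights
        u  <- as.vector(Y %*% cy)                           # Y scores
        if (sum((tx - t_old)^2) < tol) break                # converged?
        t_old <- tx
      }
      p <- crossprod(X, tx) / sum(tx^2)                     # X loadings
      q <- crossprod(Y, tx) / sum(tx^2)                     # Y loadings
      list(Xdef = X - tcrossprod(tx, p),                    # deflated X for next component
           Ydef = Y - tcrossprod(tx, q),                    # deflated Y
           weights = w, scores = tx, loadings = p, Yweights = cy)
    }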
2 votes, 0 answers

If there is only one variable in Y, does the first PLS component go in the exact same direction?

In partial least squares (PLS), I have multiple variables in X and only one variable in Y. If I only choose one PLS component to use for the PLS model, can I assume that this PLS component is in the same direction as the single Y variable? As shown…
lanselibai
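A quick numerical check of the reasoning in this question (simulated data; assumes the pls package): with a single y, the first PLS weight vector is proportional to X'y, the vector of covariances of each X column with y, so the first component is the covariance-weighted direction in X space.

    ## Sketch (simulated data): with a single y, the first PLS weight vector is
    ## proportional to X'y.
    library(pls)
    set.seed(3)
    X <- scale(matrix(rnorm(60 * 5), 60, 5), scale = FALSE)
    y <- rnorm(60); y <- y - mean(y)
    fit <- plsr(y ~ X, ncomp = 1)
    w_manual <- crossprod(X, y); w_manual <- w_manual / sqrt(sum(w_manual^2))
    w_pls    <- fit$loading.weights[, 1]
    cbind(manual = as.vector(w_manual),
          pls    = w_pls / sqrt(sum(w_pls^2)))   # equal up to sign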
2 votes, 1 answer

Partial least squares for expression datasets

I'm quite new to the applications of partial least squares regression analysis, and was hoping I could get an overview of how this analysis can be applied to the datasets I have. I have two datasets: one contains microRNA (miRNA) expression values…
1 vote, 0 answers

meaning of projection subspace in a PLSDA plot

I have a dataset with a handful of predictors and one output variable which is categorical and can only be C or N. I am working in R, using the plsda function from the mixOmics package. When I plot the results of the PLSDA with the plotIndiv…
Dave
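For readers who want to reproduce this kind of plot, a minimal sketch with made-up data might look as follows (argument names should be checked against the installed mixOmics version): the axes of the sample plot are the scores on the first two latent components, i.e. the projection of the samples onto the PLS-DA subspace.

    ## Minimal PLS-DA sketch with made-up data (mixOmics assumed installed);
    ## the plot axes are the scores on the first two latent components.
    library(mixOmics)
    set.seed(4)
    X <- matrix(rnorm(40 * 8), 40, 8)
    class_lab <- factor(rep(c("C", "N"), each = 20))
    fit <- plsda(X, class_lab, ncomp = 2)
    plotIndiv(fit, comp = c(1, 2), legend = TRUE)  # samples in the 2-D projection subspace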
1 vote, 1 answer

plsr output: RMSEP increasing with number of components

I have a dataset with 20 rows and 480 columns. When I run the plsr command with validation="LOO", my output shows that the cross-validated RMSEP increases with the number of components and stabilizes after 6 components. My impression is that RMSEP should decrease.…
Rk57
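With 20 rows and 480 columns this behaviour is plausible: leave-one-out RMSEP often stops decreasing, and may rise, once additional components start fitting noise. A sketch of how to inspect the curve with the pls package (simulated data, not the asker's):

    ## Sketch (simulated data): plot leave-one-out RMSEP against the number of
    ## components; the curve typically falls, then flattens or rises again as
    ## extra components start modelling noise.
    library(pls)
    set.seed(5)
    X <- matrix(rnorm(20 * 100), 20, 100)   # wide data, as in the question
    y <- X[, 1] + rnorm(20)
    fit <- plsr(y ~ X, ncomp = 10, validation = "LOO")
    plot(RMSEP(fit))                        # cross-validated RMSEP per number of components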
1 vote, 1 answer

RMSEP values for choosing number of components in PLSr

I'm new to PLSR and have 1 response variable (iso.freq) and 6 explanatory variables (leaf traits). I ran the following code: df.IsoFreq <- plsr( iso.freq ~ lma + ldmc + tough + thick + carbon + nitrogen, scale = TRUE, ncomp = 6, validation = "LOO",…
Tellez
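One possible follow-up (assuming df.IsoFreq is the fitted mvr object from the code above, and that the installed pls version provides selectNcomp): apply the one-standard-error rule to the cross-validated RMSEP curve to pick the number of components.

    ## Possible follow-up for the fitted object df.IsoFreq from the question
    ## (assumes an mvr fit and that pls provides selectNcomp in this version).
    plot(RMSEP(df.IsoFreq))                                    # CV error vs. components
    selectNcomp(df.IsoFreq, method = "onesigma", plot = TRUE)  # one-standard-error rule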
0 votes, 0 answers

Linear independence of the loading matrix in Partial Least Squares

I'm currently studying partial least squares regression (PLS-R). Unlike in PCA, it is said that the score vectors are orthogonal while the loadings are not. Does this mean that the loading vectors (which are actually just regression parameters) can happen to be…
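A direct numerical check of this statement (simulated data, pls package): the cross-product of the score matrix is (numerically) diagonal, while the cross-product of the X loading matrix generally is not.

    ## Numerical check (simulated data): PLS score vectors are orthogonal,
    ## while the X loading vectors in general are not.
    library(pls)
    set.seed(6)
    X <- matrix(rnorm(80 * 6), 80, 6)
    y <- X %*% rnorm(6) + rnorm(80)
    fit <- plsr(y ~ X, ncomp = 3)
    round(crossprod(fit$scores), 6)    # (near-)diagonal: scores are orthogonal
    round(crossprod(fit$loadings), 6)  # off-diagonal entries are generally non-zero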
0 votes, 1 answer

How to interpret zero or near-zero coefficients with VIP > 1 in PLSR?

I am trying to interpret a PLSR model that I used to predict a response variable from full-range spectroscopy (500-2400 nm). I followed the method of Serbin et al. 2014 (https://doi.org/10.1890/13-2110.1) to build the PLSR model, i.e. to choose the…
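For context, the usual VIP definition (the standard formula, not anything specific to Serbin et al. 2014) combines the squared, normalised weights with the share of $Y$ variance explained by each component: $$\mathrm{VIP}_j = \sqrt{\frac{p\,\sum_{a=1}^{A}\bigl(w_{ja}/\lVert \mathbf w_a\rVert\bigr)^{2}\,\mathrm{SS}_a}{\sum_{a=1}^{A}\mathrm{SS}_a}}, \qquad \mathrm{SS}_a = q_a^{2}\,\mathbf t_a^{T}\mathbf t_a,$$ where $p$ is the number of predictors and $\mathbf w_a$, $\mathbf t_a$, $q_a$ are the weight vector, score vector and $Y$ loading of component $a$. Because the squared weights cannot cancel across components while coefficient contributions can, a predictor may have VIP > 1 and yet end up with a near-zero overall regression coefficient.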
0 votes, 1 answer

PLS-DA dependent variables

Is it possible to use more than one categorical dependent variable with partial least squares discriminant analysis? Thanks.
Patsy
0 votes, 0 answers

PLS regression predictions

We have the following sample containing two predictors ($x_1, x_2$) and one dependent variable ($y$). $x_1=[-1.01, 3.23, 5.49, 0.23, -2.87, 3.67]$ $x_2=[-0.99, 3.25, 5.55, 0.21, -2.91, 3.76]$ $y=[-1.89, 10.33, 19.09, 2.19, -8.09, 11.29]$ I…
muffin
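One way to fit this sample and generate predictions with the pls package (the data are copied from the question; the choice of a single component is only illustrative):

    ## Fit a PLS regression to the sample from the question (pls package) and
    ## predict the training points with a single component.
    library(pls)
    dat <- data.frame(
      x1 = c(-1.01, 3.23, 5.49, 0.23, -2.87, 3.67),
      x2 = c(-0.99, 3.25, 5.55, 0.21, -2.91, 3.76),
      y  = c(-1.89, 10.33, 19.09, 2.19, -8.09, 11.29)
    )
    fit <- plsr(y ~ x1 + x2, data = dat, ncomp = 1)
    predict(fit, newdata = dat, ncomp = 1)   # fitted values for the sample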
0 votes, 0 answers

When do all PLS components together explain only part of the variance of the original data?

According to this question and answer, the sum of variances of all partial least squares (PLS) components is normally less than 100%: Why do all the PLS components together explain only a part of the variance of the original data? Can somebody…
Iggy25
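A small numerical illustration of that linked answer (simulated data, pls package): explvar() reports the percentage of X variance captured by each component, and with fewer components than the rank of X the percentages need not add up to 100%.

    ## Sketch (simulated data): PLS components maximise covariance with y, not
    ## X variance, so a truncated model captures only part of var(X).
    library(pls)
    set.seed(7)
    X <- matrix(rnorm(100 * 10), 100, 10)
    y <- X %*% rnorm(10) + rnorm(100)
    fit <- plsr(y ~ X, ncomp = 4)
    explvar(fit)        # % of X variance captured by each component
    sum(explvar(fit))   # less than 100% with only 4 of 10 possible components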
0 votes, 0 answers

PLS regression analysis based on Likert scale

Can the dependent variable for PLS be based on a Likert scale? As I understand it, the dependent variable needs to be continuous. I wish to multiply a frequency (e.g. the integer value 3) by a rating from the Likert scale (for example 5) to get a Y value of 15.…