FAQ: Difference between a loading and a weighting
Latest revision as of 13:57, 8 January 2019
===Issue:===
What is the difference between a loading and a weighting?
===Possible Solutions:===
When performing Principal Components Analysis (PCA), you get loadings, <code>P</code>, which form an orthonormal basis that can be used to calculate scores, <code>T = X*P</code>, or to estimate the data, <code>X = T*P'</code>. These operations are invertible (repeating them gives the same result) because the loadings are the eigenvectors of <code>X'X</code>.
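This round trip can be sketched numerically. The following is a minimal illustration (assuming NumPy, with the MATLAB-style expressions above translated to Python, and random data standing in for a real data set): loadings are taken as the eigenvectors of <code>X'X</code>, scores are <code>X*P</code>, and with all components kept the reconstruction <code>T*P'</code> recovers <code>X</code> exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))   # hypothetical data matrix
X -= X.mean(axis=0)                # mean-center, as is usual before PCA

# Loadings P: eigenvectors of X'X, columns sorted by decreasing eigenvalue
evals, P = np.linalg.eigh(X.T @ X)
order = np.argsort(evals)[::-1]
P = P[:, order]

T = X @ P          # scores:        T = X*P
X_hat = T @ P.T    # reconstruction: X = T*P'

# With all components kept, P is orthonormal (P*P' = I), so X is recovered
print(np.allclose(X, X_hat))  # True
```

Keeping only the first few columns of <code>P</code> turns the reconstruction into a least-squares approximation of <code>X</code> rather than an exact recovery.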
When using Partial Least Squares (PLS), you get loadings, <code>P</code>, but also weights, <code>W</code>, because the decomposition is based on <code>X'Y</code>. The weights and loadings must be used together to calculate scores: <code>T = X*W*pinv(P'*W)</code>. From a phenomenological point of view, the weights represent features in <code>X</code> which are related to the original <code>Y</code> values. The loadings represent features in <code>X</code> which are related to the scores, <code>T</code>, which are the given factor's estimate of <code>Y</code>.
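The score formula can be checked with a small NIPALS-style fit. This is an illustrative sketch (assuming NumPy, a single-<code>y</code> NIPALS algorithm, and made-up random data, not any particular software's implementation): after fitting, the scores collected during deflation match <code>X*W*pinv(P'*W)</code> applied to the original, undeflated <code>X</code>.

```python
import numpy as np

rng = np.random.default_rng(1)
X0 = rng.standard_normal((30, 6)); X0 -= X0.mean(axis=0)
y = rng.standard_normal(30);       y -= y.mean()

n_lv = 3                        # number of latent variables
X, W, P, T = X0.copy(), [], [], []
for _ in range(n_lv):           # NIPALS for a single y (PLS1)
    w = X.T @ y                 # weight: direction in X correlated with y
    w /= np.linalg.norm(w)
    t = X @ w                   # score for this factor
    p = X.T @ t / (t @ t)       # loading: direction in X related to t
    X = X - np.outer(t, p)      # deflate X
    y = y - t * (y @ t) / (t @ t)
    W.append(w); P.append(p); T.append(t)

W, P, T = np.column_stack(W), np.column_stack(P), np.column_stack(T)

# Scores recovered in one shot from the original X: T = X*W*pinv(P'*W)
T_direct = X0 @ W @ np.linalg.pinv(P.T @ W)
print(np.allclose(T, T_direct))  # True
```

The <code>pinv(P'*W)</code> factor is what compensates for the deflation: <code>W</code> alone was computed against successively deflated copies of <code>X</code>, so it cannot be applied directly to new data without it.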
Note, by the way, that the weights are the ones used to calculate the regression vector (that which is used to make a prediction). Loadings are only used when calculating scores and, of course, Hotelling's T<sup>2</sup>.
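How the weights enter the regression vector can also be sketched. The following is an illustrative single-<code>y</code> NIPALS fit (again assuming NumPy and random stand-in data; <code>Q</code> holds the per-factor y-loadings): the regression vector <code>b = W*pinv(P'*W)*Q</code> maps the original <code>X</code> directly to the same fitted values the scores give.

```python
import numpy as np

rng = np.random.default_rng(3)
X0 = rng.standard_normal((25, 5)); X0 -= X0.mean(axis=0)
y0 = rng.standard_normal(25);      y0 -= y0.mean()

X, y, W, P, Q, T = X0.copy(), y0.copy(), [], [], [], []
for _ in range(2):                  # two latent variables, PLS1 NIPALS
    w = X.T @ y; w /= np.linalg.norm(w)
    t = X @ w
    p = X.T @ t / (t @ t)
    q = (y @ t) / (t @ t)           # y-loading for this factor
    X -= np.outer(t, p); y = y - q * t
    W.append(w); P.append(p); Q.append(q); T.append(t)

W, P, T = np.column_stack(W), np.column_stack(P), np.column_stack(T)
Q = np.array(Q)

# Regression vector: built from the weights (loadings appear only inside P'*W)
b = W @ np.linalg.pinv(P.T @ W) @ Q

# Predicting with b is the same as scoring and then applying the y-loadings
print(np.allclose(X0 @ b, T @ Q))  # True
```

This is the sense in which prediction runs through the weights: once <code>b</code> is formed, new samples are predicted without ever computing scores explicitly.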
Still having problems? Please contact our helpdesk at helpdesk@eigenvector.com