Faq why get negative scores when all modes are set to nonnegativity


Issue:

Nonnegativity (PARAFAC, PARAFAC2, Tucker): Why do I get negative scores when all modes are set to nonnegativity?

Possible Solutions:

This can happen when the data are genuinely difficult to model meaningfully under nonnegativity constraints.

Imagine, for example:

That the data are all-negative. A truly nonnegative solution would then likely require some loading vectors to be entirely zero. For mathematical reasons this cannot be allowed (the rank of the model would drop). When struggling, the algorithm therefore tries to make the model as nonnegative as possible without letting any whole loading vector become all-zero.

This can happen when you use too many components or have data that are otherwise 'strange'. Sometimes, starting the algorithm from more initial guesses can help (set <code>options.init=10</code>; see the sketch below). Most of the time, though, it is an indication that the constraint is not meaningful for your data with the given number of components.
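
As a rough illustration only, the lines below show where such a setting would go in a MATLAB call, assuming the usual PLS_Toolbox pattern of retrieving a default options structure from the function itself; the array name <code>X</code> and the choice of three components are made up for the example.

<pre>
% Sketch (MATLAB, PLS_Toolbox-style call; details beyond options.init are assumptions)
options = parafac('options');    % retrieve the default options structure
options.init = 10;               % try more initial guesses, as suggested above
model = parafac(X, 3, options);  % refit the model, here with 3 components
</pre>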

It can be very instructive to run the algorithm without any constraints and try to interpret what it is in the data that causes the negative components. This can often point to a solution to your problem.
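
For instance, a quick diagnostic along these lines can show which mode carries the negative entries (a sketch only; it assumes an unconstrained fit is the default and that the fitted model stores its loadings as a cell array <code>model.loads</code>, one entry per mode):

<pre>
% Sketch (MATLAB): fit without constraints and check where negativity appears
model = parafac(X, 3);                                % unconstrained fit of array X
negfrac = cellfun(@(L) mean(L(:) < 0), model.loads)   % fraction of negative entries per mode
% A mode with a large fraction of negative loadings is the place to look for the
% cause (e.g. a background, an offset, or one component too many).
</pre>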

'''Note that applying centering will give negative data (and loadings).'''
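
A short MATLAB illustration of that point, using made-up random data:

<pre>
X  = rand(20, 5) * 10;    % strictly nonnegative example data
Xc = X - mean(X, 1);      % mean-center each column
any(Xc(:) < 0)            % returns true: entries below the column mean become negative
</pre>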


Still having problems? Please contact our helpdesk at helpdesk@eigenvector.com