Regression of a data matrix on descriptors of both its rows and of its columns via latent variables: L-PLSR
Journal: Computational Statistics & Data Analysis, vol. 48, pp. 102–123, 2005
Publisher: Elsevier
International Standard Numbers
Print: 0167-9473
Electronic: 1872-7352
Publication type: Academic article
DOI: doi.org/10.1016/j.csda.2003.10...
A new approach is described for extracting and visualising structures in a data matrix Y in light of additional information both about the rows in Y, given in matrix X, and about the columns in Y, given in matrix Z. The three matrices Z–Y–X may be envisioned as an "L-shape": X(I×K) and Z(J×L) share no matrix dimension, but are connected via Y(I×J). A few linear combinations (components) are extracted from X and from Z, and their interactions are used for bi-linear modelling of Y, as well as for bi-linear modelling of X and Z themselves. The components are defined by singular value decomposition (SVD) of X′YZ. Two versions of the L-PLSR are described: one using a single SVD for all components, and one using component-wise SVDs after deflation. The method is applied to the analysis of consumer liking data Y for six products assessed by 125 persons, in light of 10 other product descriptors X and 15 other person descriptors Z. Its performance is also checked on artificial data.
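The core step named in the abstract, defining components by an SVD of X′YZ, can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the column-centering, the random example data, and the choice of two components are assumptions made here for demonstration, and the deflation variant described in the abstract is not shown.

```python
import numpy as np

# Dimensions from the abstract's consumer example: Y (I x J) links
# row descriptors X (I x K) and column descriptors Z (J x L).
rng = np.random.default_rng(0)
I, J, K, L = 125, 6, 10, 15          # persons, products, product/person descriptors
X = rng.standard_normal((I, K))
Y = rng.standard_normal((I, J))
Z = rng.standard_normal((J, L))

# Column-wise centering (a common PLS preprocessing assumption,
# not spelled out in the abstract).
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
Zc = Z - Z.mean(axis=0)

# Core step from the abstract: components are defined by the SVD of X'YZ.
M = Xc.T @ Yc @ Zc                   # K x L cross-product matrix
U, s, Vt = np.linalg.svd(M, full_matrices=False)

A = 2                                # number of latent components (user choice)
w_x = U[:, :A]                       # X-side loading weights (K x A)
w_z = Vt[:A, :].T                    # Z-side loading weights (L x A)

# Scores: linear combinations of the rows of X and of Z.
t = Xc @ w_x                         # I x A row-object scores
u = Zc @ w_z                         # J x A column-object scores
```

The single-SVD version above extracts all A components at once; the deflation version mentioned in the abstract would instead recompute the SVD of the cross-product after subtracting each component's contribution.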