TY - JOUR
T1 - Symmetric low-rank preserving projections for subspace learning
AU - Chen, Jie
AU - Mao, Hua
AU - Zhang, Haixian
AU - Yi, Zhang
PY - 2018/11/13
AB - Graph construction plays an important role in graph-oriented subspace learning. However, most existing approaches cannot simultaneously consider the global and local structures of high-dimensional data. To address this deficiency, we propose a symmetric low-rank preserving projection (SLPP) framework that incorporates a symmetric constraint and a local regularization into low-rank representation learning for subspace learning. Under this framework, SLPP-M adopts manifold regularization as its local regularization, while SLPP-S uses sparsity regularization. Besides characterizing the global structure of high-dimensional data by a symmetric low-rank representation, SLPP-M and SLPP-S effectively exploit the local manifold and geometric structure through manifold and sparsity regularization, respectively. The similarity matrix is learned by solving a nuclear-norm minimization problem. Combined with graph embedding techniques, the learned transformation matrix preserves the low-dimensional structural features of high-dimensional data. To facilitate classification by exploiting the available labels of training samples, we also develop supervised versions of SLPP-M and SLPP-S under the SLPP framework, named S-SLPP-M and S-SLPP-S, respectively. Experimental results in face, handwriting, and object recognition applications demonstrate the efficiency of the proposed algorithms for subspace learning.
KW - Low-rank representation
KW - Manifold regularization
KW - Sparsity regularization
KW - Subspace learning
KW - Dimensionality reduction
DO - 10.1016/j.neucom.2018.07.031
M3 - Article
SN - 0925-2312
VL - 315
SP - 381
EP - 393
JO - Neurocomputing
ER -