TY - JOUR
T1 - Efficient Sparse Representation for Learning With High-Dimensional Data
AU - Chen, Jie
AU - Yang, Shengxiang
AU - Wang, Zhu
AU - Mao, Hua
N1 - Funding information: This work was supported in part by the National Natural Science Foundation of China (NSFC) under Grant 61303015 and Grant 61673331, in part by the Sichuan Science and Technology Program under Grant 2021YJ0078, in part by the National Key Research and Development Program of China (Studies on Key Technologies and Equipment Supporting a High Quality and Highly Efficient Court Trial) under Grant 2018YFC0830300, and in part by the AI in Law Advanced Deployed Discipline of Sichuan University.
PY - 2023/8/1
AB - Owing to their ability to effectively learn intrinsic structures from high-dimensional data, techniques based on sparse representation have had an impressive impact in several fields, such as image processing, computer vision, and pattern recognition. Learning sparse representations is often computationally expensive because of the iterative computations needed to solve convex optimization problems, for which the number of iterations is unknown before convergence. Moreover, most sparse representation algorithms focus only on the final sparse representation results and ignore the changes in the sparsity ratio during the iterative computations. In this paper, two algorithms are proposed to learn sparse representations based on locality-constrained linear representation learning with probabilistic simplex constraints. Specifically, the first algorithm, called approximated local linear representation (ALLR), obtains a closed-form solution from individual locality-constrained sparse representations. The second algorithm, called approximated local linear representation with symmetric constraints (ALLRSC), further obtains all symmetric sparse representation results within a limited number of computations; notably, the sparsity and convergence of the sparse representations are guaranteed by theoretical analysis. The steady decline of the sparsity ratio during the iterative computations is a critical factor in practical applications. Experimental results on public datasets demonstrate that the proposed algorithms outperform several state-of-the-art algorithms for learning with high-dimensional data.
KW - Linear representation
KW - low-dimensional structures
KW - probabilistic simplex
KW - sparse representation
UR - http://www.scopus.com/inward/record.url?scp=85118532237&partnerID=8YFLogxK
DO - 10.1109/TNNLS.2021.3119278
M3 - Article
AN - SCOPUS:85118532237
SN - 2162-237X
VL - 34
SP - 4208
EP - 4222
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 8
ER -