A new extracting algorithm of k nearest neighbors searching for point clouds

Zisheng Li, Guofu Ding, Rong Li, Sheng-feng Qin

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)

Abstract

k nearest neighbors (kNN) searching is widely used to find the k nearest neighbors of each point in a point cloud model, for tasks such as noise removal and surface curvature computation. As the number of points and their density in a point cloud model increase significantly, the efficiency of the kNN search becomes critical for many applications, so a better kNN approach is needed. To improve search efficiency, this paper develops a new strategy and a corresponding algorithm that reduce the number of target points in a given data set by extracting nearest neighbors before the search begins. The nearest neighbors within a point's reverse nearest neighborhood are used to extract nearest points of a query point, avoiding repetitive Euclidean distance calculations in the extraction process and thereby saving time and memory. For any point in the model, its initial nearest neighbors can be extracted from its reverse neighborhood using an inner product of two related vectors, rather than direct Euclidean distance calculations and comparisons. These initial neighbors may form the full set or a partial set of all its nearest neighbors; if partial, the remainder can be obtained with other fast searching algorithms, which can be integrated with the proposed approach. Experimental results show that integrating the extraction algorithm proposed in this paper with other leading algorithms yields better performance than those algorithms achieve alone.
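The paper's exact extraction algorithm and its inner-product test are not reproduced in the abstract, but the core idea of seeding a point's neighbor set from its reverse nearest neighbors can be illustrated with a minimal sketch. Everything below (the function names, the brute-force baseline, the seeding step) is an illustrative assumption, not the authors' implementation: if point q already appears in point p's kNN list, then p is a reverse nearest neighbor of q and a cheap initial candidate for q's own neighbor set, obtained without any new distance computation.

```python
import numpy as np

def knn_brute_force(points, k):
    """Baseline: k nearest neighbors of every point via full pairwise
    squared Euclidean distances (illustrative only, not the paper's method)."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # a point is not its own neighbor
    return np.argsort(d2, axis=1)[:, :k]  # indices of the k closest points

def seed_from_reverse_neighbors(knn):
    """For each point q, collect every point p whose kNN list already
    contains q (q's reverse nearest neighbors).  These points are initial
    neighbor candidates for q found with no extra distance calculations;
    any missing neighbors would then be filled in by a fast kNN search."""
    n = knn.shape[0]
    seeds = [[] for _ in range(n)]
    for p in range(n):
        for q in knn[p]:
            seeds[q].append(p)
    return seeds

# Tiny collinear example so the neighbor sets are easy to verify by hand.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0], [10.0, 0.0]])
knn = knn_brute_force(pts, 2)
seeds = seed_from_reverse_neighbors(knn)
```

Note that the seed sets are asymmetric: an isolated point (here the one at x = 10) may seed others while receiving no seeds itself, which is why a complementary fast search is still needed to complete partial neighbor sets.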
Original language: English
Pages (from-to): 162-170
Journal: Pattern Recognition Letters
Volume: 49
Publication status: Published - 1 Nov 2014

