Simultaneous Positioning and Orientating (SPAO) for Visible Light Communications: Algorithm Design and Performance Analysis

Bingpeng Zhou, Vincent Lau, Yue Cao, Qingchun Chen

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)
7 Downloads (Pure)

Abstract

Visible light communication (VLC)-based simultaneous positioning and orientating (SPAO), using received signal strength (RSS) measurements, is studied in this paper. RSS-based SPAO for VLCs is highly challenging because the nonlinear RSS model makes it an essentially non-convex optimization problem. To address this non-convexity challenge, a novel particle-assisted stochastic search (PASS) algorithm is proposed. The proposed PASS-based SPAO scheme requires neither knowledge of the receiver's height, nor perfect alignment of the transceiver orientations, nor inertial measurements. This is a substantial technical improvement over existing VLC localization solutions. The algorithmic convergence is established to justify the proposed PASS algorithm. In addition, a closed-form Cramer-Rao lower bound (CRLB) on localization error is derived and analyzed to gain insights into how the VLC-based SPAO performance is related to system configurations. It is shown that the receiver's position and orientation accuracy scales linearly with the signal-to-noise ratio and the direction information. In addition, the position accuracy decays with the sixth power of the transceiver distance, while the orientation accuracy decays with the fourth power of the transceiver distance. Finally, simulation results verify the performance gain of the proposed PASS algorithm for VLC-based SPAO.
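The non-convexity mentioned in the abstract stems from the RSS measurement model. As a hedged illustration (the paper's exact model is not reproduced here), the following sketch uses the standard Lambertian line-of-sight VLC channel, in which the received power is nonlinear in both the receiver's position and its orientation; all numeric parameters (transmit power, Lambertian order, detector area, geometry) are hypothetical values for demonstration only.

```python
import numpy as np

# Hypothetical parameters, for illustration only (not taken from the paper).
P_T = 1.0      # LED transmit power (W)
M_L = 1.0      # Lambertian order of the LED
AREA = 1e-4    # photodetector area (m^2)

def rss(p_tx, n_tx, p_rx, n_rx):
    """Standard Lambertian LOS model:
    P_r = P_t * (m+1) * A / (2*pi*d^2) * cos^m(irradiance) * cos(incidence).
    P_r is nonlinear in the receiver position p_rx and orientation n_rx,
    which is why RSS-based SPAO is a non-convex estimation problem."""
    v = p_rx - p_tx                   # LED-to-receiver vector
    d = np.linalg.norm(v)
    cos_irr = np.dot(n_tx, v) / d     # cosine of the irradiance angle
    cos_inc = np.dot(n_rx, -v) / d    # cosine of the incidence angle
    if cos_irr <= 0 or cos_inc <= 0:  # outside the LED/PD field of view
        return 0.0
    return P_T * (M_L + 1) * AREA / (2 * np.pi * d**2) * cos_irr**M_L * cos_inc

# Example geometry: LED on the ceiling pointing down, receiver facing up.
p_led = np.array([0.0, 0.0, 3.0]); n_led = np.array([0.0, 0.0, -1.0])
p_ue  = np.array([1.0, 0.0, 0.0]); n_ue  = np.array([0.0, 0.0, 1.0])
print(rss(p_led, n_led, p_ue, n_ue))   # received power in watts
```

Because the distance `d` enters the model as `d**-2` together with two angle cosines that also depend on `d`, small position changes perturb the RSS strongly at short range and only weakly at long range, which is consistent with the abstract's observation that accuracy degrades rapidly with transceiver distance.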
Original language: English
Pages (from-to): 11790-11804
Journal: IEEE Transactions on Vehicular Technology
Volume: 67
Issue number: 12
Early online date: 9 Oct 2018
Publication status: Published - Dec 2018

