
Sparsity based Efficient Cross-Correlation Techniques in Sensor Networks

Published 26 Jan 2015 in cs.OH | (1501.06473v3)

Abstract: Cross-correlation is a popular signal processing technique used in numerous location tracking systems to obtain reliable range information. However, its efficient design and practical implementation have not yet been achieved on the mote platforms typical of wireless sensor networks, due to resource constraints. In this paper, we propose SparseS-XCorr: cross-correlation via structured sparse representation, a new computing framework for ranging based on L1-minimization and structured sparsity. The key idea is to compress the ranging signal samples on the mote via efficient random projections and transfer them to a central device, where a convex optimization process estimates the range by exploiting the sparse signal structure in the proposed correlation dictionary. Through theoretical validation, extensive empirical studies, and experiments on an end-to-end acoustic ranging system implemented on resource-limited off-the-shelf sensor nodes, we show that the proposed framework achieves up to two orders of magnitude better performance than alternative approaches such as working in the DCT domain or downsampling. Compared to standard cross-correlation, it obtains range estimates with a bias of 2-6 cm using 30% compressed measurements, and approximately 100 cm using 5%. Its structured sparsity model improves ranging accuracy by 40% under challenging recovery conditions (such as a high compression factor and low signal-to-noise ratio) by overcoming limitations due to dictionary coherence.
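The pipeline the abstract describes — random projection of the received signal on the mote, then sparse recovery against a dictionary of shifted copies of the reference signal at the central device — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the signal, dimensions, and projection matrix are invented for the example, and the paper's L1-minimization is replaced here by a single greedy matching-pursuit step, which suffices when the coefficient vector is effectively 1-sparse (one dominant delay).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 256          # samples in the ranging signal (illustrative size)
M = 64           # compressed measurements sent to the central device (25%)
true_delay = 37  # sample delay we hope to recover

# Known reference (probe) signal; a pseudo-random sequence stands in
# for the acoustic ranging chirp.
ref = rng.standard_normal(N)

# Received signal at the mote: circularly delayed reference plus mild noise.
received = np.roll(ref, true_delay) + 0.05 * rng.standard_normal(N)

# Correlation dictionary: column k is the reference shifted by k samples,
# so the received signal is (approximately) 1-sparse in this dictionary.
D = np.column_stack([np.roll(ref, k) for k in range(N)])

# Random projection on the mote: only y (M values), not the N raw
# samples, is transferred to the central device.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ received

# Sparse recovery at the central device. The paper solves an
# L1-minimization; for this 1-sparse toy case, picking the dictionary
# atom whose projection best correlates with y (one matching-pursuit
# step) is a simple stand-in.
A = Phi @ D
est_delay = int(np.argmax(np.abs(A.T @ y)))

print(est_delay)
```

With the delay expressed in samples, the range estimate follows by multiplying by the speed of sound over the sampling rate. A full solver (e.g. basis pursuit via a convex optimization package) would replace the greedy step when multiple paths or heavier noise make the coefficient vector less sparse.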

Citations (4)
