Analyzing failures of the L1 minimization method in compressive sampling

2012-04-18

Type

Presentation

Abstract

This article analyzes ways in which the L1 minimization method fails in a theory of signal processing called “compressive sampling” or “compressive sensing”, which aims to reconstruct an original signal from very few samples. The samples are taken by multiplying a sampling matrix with a sparse signal, and the original signal can then be recovered by L1 minimization. However, it is sometimes impossible to recover the original signal by finding the solution that minimizes the L1 norm, so it is necessary to generate data sets to determine when and why L1 minimization fails. We show that probabilistic sampling matrices generated from Gaussian and uniform distributions fail for very different reasons than those generated from Bernoulli distributions.
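As a concrete illustration of the recovery step described above (a minimal sketch, not code from the article itself), L1 minimization over the constraint Ax = b can be recast as a linear program by splitting x into nonnegative parts, x = u − v. The matrix sizes, sparsity level, and random seed below are assumptions chosen only for demonstration:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, m, k = 50, 20, 3          # signal length, number of samples, sparsity (assumed values)
A = rng.normal(size=(m, n))  # Gaussian sampling matrix

# A k-sparse original signal with random support and amplitudes
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.normal(size=k)

b = A @ x_true               # the few samples

# Basis pursuit: minimize ||x||_1 subject to A x = b,
# rewritten as a linear program with x = u - v and u, v >= 0:
#   minimize sum(u) + sum(v)  subject to  A u - A v = b
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x_rec = res.x[:n] - res.x[n:]

# With this many Gaussian samples relative to the sparsity,
# exact recovery is expected with high probability.
recovered = np.allclose(x_rec, x_true, atol=1e-4)
```

In the failure regimes the abstract refers to, the minimizer of the L1 norm is not the original sparse signal, and a check like `recovered` above would come back false even though the linear program itself solves successfully.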

Description

Faculty adviser: Willard Miller

Funding information

This research was supported by the Undergraduate Research Opportunities Program (UROP).

Suggested citation

Haiyun, Zhao. (2012). Analyzing failures of the L1 minimization method in compressive sampling. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/146313.
