Title: Analyzing failures of the L1 minimization method in compressive sampling
Author: Haiyun, Zhao
Faculty adviser: Willard Miller
Date issued: 2012-04-18
Date accessioned/available: 2013-03-11
URI: https://hdl.handle.net/11299/146313
Language: en-US
Sponsoring college: College of Liberal Arts
Type: Presentation

Abstract: This article analyzes ways in which the L1 minimization method fails in a new theory of signal processing called "compressive sampling" or "compressive sensing". This theory aims to reconstruct an original signal from very few samples. The samples are taken by multiplying a sample matrix with a sparse signal, and the signal can then be recovered by L1 minimization. However, it is sometimes impossible to recover the original signal by finding the solution that minimizes the L1 norm. It is therefore necessary to generate data sets to determine when and why L1 minimization fails. We show that probabilistic sample matrices generated by Gaussian and uniform distributions fail for very different reasons than those generated by Bernoulli distributions.
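The record itself contains no code. As a minimal illustrative sketch of the recovery problem the abstract describes (not the author's own experiments), the following Python example builds a Gaussian sample matrix, takes samples y = Ax of a sparse signal, and recovers the signal by L1 minimization recast as a linear program. The sizes n, m, and k are arbitrary choices for illustration.

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)

    # Illustrative sizes (not from the article): signal length,
    # number of samples, and sparsity of the signal.
    n, m, k = 100, 40, 5

    # Gaussian sample matrix, one of the distributions the abstract compares.
    A = rng.standard_normal((m, n))

    # A k-sparse signal: k random positions with Gaussian nonzero entries.
    x_true = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x_true[support] = rng.standard_normal(k)

    # Samples are the product of the sample matrix and the sparse signal.
    y = A @ x_true

    # L1 minimization (basis pursuit) as a linear program: write
    # x = u - v with u, v >= 0, so that ||x||_1 = 1^T (u + v), and solve
    #   min 1^T (u + v)  subject to  A(u - v) = y,  u, v >= 0.
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    x_hat = res.x[:n] - res.x[n:]

    # Recovery succeeds when this error is numerically zero; the article
    # studies cases where it is not.
    print("recovery error:", np.linalg.norm(x_hat - x_true))

Replacing the Gaussian line with A = rng.choice([-1.0, 1.0], size=(m, n)) gives a Bernoulli (plus/minus 1) sample matrix of the kind the abstract contrasts, and rng.uniform(-1.0, 1.0, size=(m, n)) a uniform one.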