Browsing by Subject "PCA"
Now showing 1 - 3 of 3
Item: Novel Nonlinear and RR-based Methods for Inappropriate ICD Therapy Reduction (2021-05). Newell, Samuel

Implantable cardioverter-defibrillators (ICDs) are commonly implanted devices used to deliver arrhythmia-terminating therapy to the heart in the presence of life-threatening arrhythmias such as ventricular fibrillation and ventricular tachycardia. However, not all therapies, whether shocks or anti-tachycardia pacing (ATP), are appropriate. In many instances, therapy is delivered when the heart is in an abnormal but non-life-threatening arrhythmia such as atrial fibrillation. While major medical device companies have devised numerous strategies to eliminate these cases of inappropriate therapy, there remains room for improvement. Thus, the goal of this thesis is to describe the development of novel strategies to discriminate between appropriate and inappropriate ICD therapy. The strategies feed nonlinear measures and RR-interval measures into principal component analysis or linear combination scores to perform the discrimination. The final method, based on linear combination scores, showed near-100% discrimination between appropriate and inappropriate therapy events retrospectively and 100% discrimination in a pseudo-real-time study. Thus, this strategy shows immense promise for use in a future large clinical study to eliminate inappropriate therapy.

Item: Online Convex Optimization in Changing Environments and its Application to Resource Allocation (2019-12). Yuan, Jianjun

In the era of big data, we create and collect data from many different kinds of sources: the Internet, sensors, the consumer market, and so on. Much of this data arrives sequentially and needs to be processed and understood quickly. One classic way of analyzing data is batch processing, in which the data is stored and analyzed in an offline fashion.
However, when the volume of data is too large, batch processing becomes far more difficult and time-consuming than sequential processing. Moreover, sequential data usually changes dynamically and needs to be understood on the fly in order to capture those changes. Online Convex Optimization (OCO) is a popular framework that matches this sequential data processing requirement. Applications of OCO include online routing, online auctions, online classification and regression, and online resource allocation. Owing to OCO's general applicability to sequential data and its rigorous theoretical guarantees, many researchers have developed algorithms within this framework to fulfill different needs. In this thesis, we present our contributions to OCO by designing algorithms that adapt to changing environments. In the first part of the thesis, we propose algorithms with better adaptivity by examining the notion of dynamic regret, which compares the algorithm's cumulative loss against that incurred by a comparison sequence. Dynamic regret extends a common performance measure known as static regret. Since it may not be known in advance whether the environment is dynamic, it is desirable to enjoy the benefits of both regret notions through a trade-off between them. To achieve this, we discuss recursive least-squares algorithms and show how forgetting factors can be used to develop new OCO algorithms with such a regret trade-off. More specifically, we rigorously characterize the effect of forgetting factors for a class of online Newton algorithms. For exp-concave or strongly convex objectives, an improved dynamic regret of \max\{O(\log T), O(\sqrt{TV})\} is achieved, where V is a bound on the path length of the comparison sequence. In particular, we show how classic recursive least-squares with a forgetting factor achieves this dynamic regret bound. By varying V, we obtain the regret trade-off.
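The classic forgetting-factor recursive least-squares update referred to above can be sketched as follows. This is a minimal NumPy illustration of the textbook algorithm, not the thesis's exact online Newton variant; the function name and default values are chosen here for illustration.

```python
import numpy as np

def rls_forgetting(X, y, lam=0.95, delta=10.0):
    """Recursive least-squares with forgetting factor lam in (0, 1].

    Smaller lam discounts older data faster, which is what lets the
    estimate track a drifting (dynamic) environment; lam = 1 recovers
    ordinary RLS suited to a stationary one.
    """
    n, d = X.shape
    theta = np.zeros(d)            # current parameter estimate
    P = delta * np.eye(d)          # inverse of discounted covariance
    preds = np.empty(n)
    for t in range(n):
        x = X[t]
        preds[t] = x @ theta                # predict before seeing y[t]
        Px = P @ x
        k = Px / (lam + x @ Px)             # gain vector
        theta = theta + k * (y[t] - x @ theta)
        P = (P - np.outer(k, Px)) / lam     # discounted covariance update
    return theta, preds
```

On stationary noiseless data the estimate converges to the true weights; with lam < 1 the same update keeps tracking if those weights drift over time.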
To obtain a more computationally efficient algorithm, we also propose a novel gradient descent step-size rule for strongly convex functions that recovers the dynamic regret bounds described above. For smooth problems, we obtain static regret of O(T^{1-\beta}) and dynamic regret of O(T^{\beta} V^*), where \beta \in (0,1) and V^* is the path length of the sequence of minimizers. By varying \beta, we obtain the regret trade-off. The second part of the thesis describes how to design efficient algorithms that adapt to changing environments. Previous literature runs a pool of algorithms in parallel to gain better adaptivity, which increases both the running time and the online implementation complexity. Instead, we propose a new algorithm requiring only one update per time step, with the same adaptive regret guarantee as the current state-of-the-art. We then apply the algorithm to online Principal Component Analysis (online PCA) and variance minimization under changing environments, since the previous literature on online PCA has focused on guarantees in stationary environments. We demonstrate both theoretically and experimentally that the proposed algorithms adapt to changing environments. The third part of the thesis starts from the observation that the projection operator used in constrained OCO algorithms prevents truly online implementation because of its high computational cost. To accelerate OCO updates, previous literature proposes approximating the true desired projection with a simpler closed-form one, at the cost of constraint violation (g(\theta_t) > 0) at some time steps. Nevertheless, such methods guarantee sub-linearity for both the static regret and the long-term constraint \sum_{t=1}^{T} g(\theta_t), achieving constraint satisfaction on average.
However, a sub-linear long-term constraint does not enforce small constraint violation at every time step, because a strictly feasible solution can cancel out the effects of violated constraints. To resolve this, we propose algorithms that keep the cumulative constraint \sum_{t=1}^{T} (\max\{g(\theta_t), 0\})^2 bounded sub-linearly. This form heavily penalizes large constraint violations, and cancellation effects cannot occur. Furthermore, useful bounds on the single-step constraint violation are derived. For convex objectives, our result generalizes existing bounds, and for strongly convex objectives we give improved regret bounds. In numerical experiments, we show that our algorithm closely follows the constraint boundary, leading to low cumulative violation. Finally, we extend the proposed algorithms' idea to more general time-dependent online resource allocation problems, with performance guaranteed by a variant of dynamic regret.

Item: A Study of Dimensionality Reduction Techniques and its Analysis on Climate Data (2015-10). Kumar, Arjun

Dimensionality reduction is a significant problem across a wide variety of domains such as pattern recognition, data compression, image segmentation, and clustering. Different methods exploit different features in the data to reduce dimensionality. Principal Component Analysis is one such method; it exploits the variance in the data to embed the data in a lower-dimensional space called the principal component space. These are linear techniques, which can be expressed in the form B = TX, where T is the transformation matrix that maps the data matrix X to the reduced-dimensionality representation B. Other linear techniques explored are Factor Analysis and Dictionary Learning. In many problems the observations are high-dimensional, but we may have reason to believe that they lie near a lower-dimensional manifold.
In other words, we may believe that high-dimensional data are multiple, indirect measurements of an underlying source, which typically cannot be measured directly. Learning a suitable low-dimensional manifold from high-dimensional data is essentially the same as learning this underlying source. Techniques such as ISOMAP, Locally Linear Embedding, and Laplacian Eigenmaps (LEMs), among many others, try to embed high-dimensional observations from the nonlinear space onto a low-dimensional manifold. We explore these methods, making comparative studies, along with their applications in the domain of climate science.
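The linear form B = TX from the abstract above can be illustrated with PCA, where the rows of T are the top principal directions of the sample covariance. This is a minimal NumPy sketch under the abstract's own convention (samples as columns of X); the function name and interface are chosen here for illustration, not taken from the thesis.

```python
import numpy as np

def pca_transform(X, k):
    """PCA as a linear map B = T X.

    X holds d-dimensional samples as columns. T (k x d) stacks the
    top-k eigenvectors of the sample covariance as rows, so B = T Xc
    is the k-dimensional representation of the centered data.
    """
    Xc = X - X.mean(axis=1, keepdims=True)     # center each variable
    C = Xc @ Xc.T / (X.shape[1] - 1)           # d x d sample covariance
    eigvals, eigvecs = np.linalg.eigh(C)       # eigenvalues ascending
    T = eigvecs[:, ::-1][:, :k].T              # top-k directions as rows
    return T, T @ Xc
```

Because the eigenvectors of a symmetric matrix are orthonormal, T T^T is the k x k identity, and B captures the directions of maximal variance in the data.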