Gordon Moore’s famous prediction, known as Moore’s Law, projected that the most cost-effective number of transistors per integrated circuit (IC) doubles every year. This observation has held almost exactly for many decades, up to today. Thanks to this trend, computing with ICs has become more powerful and cheaper year by year, to the point that almost all of today’s electronic devices contain a computer. Robert H. Dennard formulated another trend, known as Dennard Scaling: ICs also become exponentially faster and more energy-efficient year by year. More specifically, Dennard predicted that while we can place exponentially more, smaller, and faster transistors on ICs every year, power density stays nearly the same. This trend was crucial, since the power budget of ICs is severely bounded by cooling limits and cannot scale well. Unfortunately, Dennard Scaling no longer holds, which leads to severe growth in the total power consumption of newer ICs, bringing systems to a point where a large fraction of an IC’s transistors becomes unusable due to overheating, resulting in the dark silicon problem. The total power consumption of a computing system equals the product of energy per operation, a proxy for energy-efficiency, and operations per second, the computing system’s performance, i.e., how fast we can compute. Hence, under a limited power budget, the only way to increase performance is to decrease energy per operation, i.e., to improve energy-efficiency.
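The power identity above can be made explicit; a minimal sketch in LaTeX notation, with symbol names chosen here for illustration (the abstract itself does not fix any notation):

```latex
% Total power decomposes into an efficiency term and a performance term:
P \;=\; \underbrace{\frac{E}{\text{op}}}_{\text{energy per operation}}
        \times
        \underbrace{\frac{\text{ops}}{\text{s}}}_{\text{performance}}
% Under a fixed power budget P_{\max}, performance is therefore bounded:
\frac{\text{ops}}{\text{s}} \;\le\; \frac{P_{\max}}{E/\text{op}}
% so with P_{\max} capped by cooling limits, the only lever left for
% higher performance is reducing E/op, i.e., improving energy-efficiency.
```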
These observations motivate the diverse set of approaches that form this thesis, all trying to improve performance even after the end of Dennard Scaling by Addressing Computing’s Energy Problem via Minimization of Energy Waste in Computing Systems. The thesis covers: reliability concerns of reducing energy loss in the power conversion of on-chip regulators; a novel covert channel induced by the aggressive power management that modern computing systems employ to optimize energy-efficiency; a processing-near-memory accelerator for a bioinformatics application, read mapping, to reduce energy waste in data transfer; a modular architecture-level model of parametric variation for Thin-Channel switches (which incur lower energy loss than conventional switches); spatio-temporal omission of synchronization points in noise-tolerant applications to improve the energy-efficiency of execution; and a rethinking of the memory system of stochastic computing systems to reduce energy loss in data conversion.
University of Minnesota Ph.D. dissertation. March 2019. Major: Electrical/Computer Engineering. Advisor: Ulya Karpuzcu. 1 computer file (PDF); xi, 175 pages.
Khatamifard, Sayyed Karen.
Addressing Computing’s Energy Problem via Minimization of Energy Waste in Computing Systems.
Retrieved from the University of Minnesota Digital Conservancy,
Content distributed via the University of Minnesota's Digital Conservancy may be subject to additional license and use restrictions applied by the depositor.