Author: Yuan, Amber
Date issued: 2021-12
Date available: 2022-02-15
Handle: https://hdl.handle.net/11299/226381
Description: University of Minnesota Ph.D. dissertation. December 2021. Major: Mathematics. Advisor: Jeff Calder. 1 computer file (PDF); ii, 65 pages.
Abstract: Semi-supervised and unsupervised machine learning methods often rely on graphs to model data, prompting research on how theoretical properties of operators on graphs are leveraged in learning problems. In the first part of the thesis, we propose a framework for rigorously studying continuum limits of learning algorithms on directed graphs. We use the new framework to study PageRank and show how it can be interpreted as a numerical scheme on a directed graph involving a type of normalized graph Laplacian. We show that the corresponding continuum limit problem, taken as the number of webpages grows to infinity, is a second-order elliptic equation that contains reaction, diffusion, and advection terms. In the second part of the thesis, we work in the undirected graph setting and study the short-term behavior of a graph-based random walk defined via the heat kernel. We show how to estimate the random walk by a Gaussian and propose a method for homogenizing the graph Laplacian to obtain better length-scale restrictions for the parameters in the graph model.
Language: en
Subjects: Analysis of partial differential equations; Graphs; Partial differential equations
Title: Two Problems Involving Random Walks on Graphs: Random surfers, PageRank, and short-time asymptotics for the heat kernel
Type: Thesis or Dissertation
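The PageRank studied in the abstract admits a standard random-surfer formulation; a minimal power-iteration sketch on a small hypothetical directed graph (an illustration only, not the thesis's continuum-limit construction) is:

```python
import numpy as np

def pagerank(adj, alpha=0.85, tol=1e-10, max_iter=1000):
    """Power iteration for PageRank on a directed graph.

    adj[i, j] = 1 if page i links to page j. alpha is the
    teleportation (damping) parameter; dangling pages are
    treated as linking uniformly to every page.
    """
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    # Row-stochastic transition matrix of the random surfer.
    P = np.where(out_deg[:, None] > 0,
                 adj / np.maximum(out_deg, 1)[:, None],
                 1.0 / n)
    x = np.full(n, 1.0 / n)  # start from the uniform distribution
    for _ in range(max_iter):
        x_new = alpha * (x @ P) + (1 - alpha) / n
        if np.abs(x_new - x).sum() < tol:
            return x_new
        x = x_new
    return x

# Tiny 3-page web: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1.
adj = np.array([[0, 1, 0],
                [0, 0, 1],
                [1, 1, 0]], dtype=float)
r = pagerank(adj)
print(r, r.sum())  # the ranks form a probability vector summing to 1
```

The fixed point of this iteration is the stationary distribution of the teleporting random walk; the thesis views the same object through a normalized graph Laplacian and passes to a continuum PDE limit.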