Title: On the Runtime Dynamics of the Univariate Marginal Distribution Algorithm on Jump Functions
Author: Hasenohrl, Vaclav
Type: Thesis or Dissertation
Date issued: 2018-05
Date accessioned/available: 2018-08-14
URI: https://hdl.handle.net/11299/198971
Description: University of Minnesota M.S. thesis. May 2018. Major: Computer Science. Advisor: Andrew Sutton. 1 computer file (PDF); vi, 79 pages.
Language: en
Keywords: cGA; Estimation of distribution algorithms; Evolutionary algorithms; UMDA

Abstract: Solving jump functions with traditional evolutionary algorithms (EAs) is a challenging task. Mutation-only EAs have a hard time flipping the right number of bits to generate the optimum. To optimize a jump function, an algorithm must be able to execute an initial hill-climbing phase, after which a point across a large gap must be generated. We study a family of EAs called estimation of distribution algorithms (EDAs), which work differently than standard EAs. In EDAs, we do not store the actual bitstrings, but rather a probability distribution that is initially uniform and should evolve into a model that always generates the global optimum. We study an EDA called the Univariate Marginal Distribution Algorithm (UMDA) and analyze it on jump functions with gap k. We present experimental results on runtimes and on the probability of solving the jump function for different values of k. We take an innovative approach and modify the UMDA by turning off selection. For this new algorithm we present a formal analysis in which, if certain conditions are met, we prove an upper bound on the time to generate the optimal all-ones bitstring. Lastly, we compare our results with those of a different EDA, the compact Genetic Algorithm (cGA), on the jump function, and discuss the pros and cons of both algorithms under different scenarios.
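
To make the abstract's description concrete, here is a minimal sketch of the classic Jump_k fitness function (the Droste/Jansen/Wegener variant; the thesis may use a slightly different normalization). Fitness climbs with the number of ones up to n - k, then drops into a gap of width k that only the all-ones string escapes, which is why a hill climber gets stuck at the gap's edge.

```python
def jump(x, k):
    """Classic Jump_k fitness on a bitstring x (list of 0/1) with gap k.

    Assumed standard definition, not necessarily the thesis's exact one:
    fitness is k + |x|_1 on the slope and at the optimum, and n - |x|_1
    inside the gap, so the all-ones string is the unique global optimum.
    """
    n = len(x)
    ones = sum(x)
    if ones <= n - k or ones == n:
        return k + ones        # hill-climbing slope, plus the optimum
    return n - ones            # inside the gap: fitness falls toward n - k
```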
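The abstract also outlines how UMDA maintains a frequency vector instead of a population of bitstrings. The sketch below, under assumed-standard conventions (offspring size lam, parent size mu, and 1/n margins on the frequencies; none of these specifics are stated in the abstract), shows the usual update loop: sample, select the best, re-estimate the marginals.

```python
import random

def umda(f, n, lam, mu, max_gens=10_000):
    """Minimal UMDA sketch; parameter names and margins are assumptions,
    not the thesis's exact setup."""
    p = [0.5] * n  # frequency vector, initially the uniform distribution
    for _ in range(max_gens):
        # Sample lam bitstrings from the product distribution given by p.
        pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]
        pop.sort(key=f, reverse=True)
        best = pop[:mu]                       # selection: keep the mu best
        for i in range(n):
            freq = sum(x[i] for x in best) / mu
            p[i] = min(max(freq, 1 / n), 1 - 1 / n)  # clamp to the margins
        if best[0] == [1] * n:
            return p                          # sampled the all-ones optimum
    return p

# Hypothetical usage on Jump_k:
# p = umda(lambda x: jump(x, 8), n=100, lam=200, mu=50)
```

Note that the selection-free variant the abstract mentions corresponds to setting mu = lam here, so the frequency update averages over the entire offspring population rather than a selected subset.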