Browsing by Subject "Master of Science in Mathematical Sciences"
Now showing 1 - 2 of 2
Item
(1+1) Evolutionary Algorithm on Random Planted Vertex Cover Problems (2024-03) Kearney, Jack
Evolutionary Algorithms are powerful optimization tools that harness randomness and inspiration from biology. A common combinatorial optimization problem is the recovery of a minimum vertex cover on some graph 𝐺 = (𝑉, 𝐸). In this work, an evolutionary algorithm is employed on specific instances of the minimum vertex cover problem containing a random planted solution. This situation is common in data networks and translates to a core set of nodes and a larger fringe set connected to the core. This study introduces a parameterized analysis of a standard (1+1) Evolutionary Algorithm applied to the random planted distribution of vertex cover problems. When the planted cover is at most logarithmic, restarting the (1+1) EA every 𝑂(𝑛 log 𝑛) steps will, within polynomial time, yield a cover at least as small as the planted cover for sufficiently dense random graphs (𝑝 > 0.71). For superlogarithmic planted covers, the (1+1) EA is proven to find a solution within fixed-parameter tractable time in expectation. To complement these theoretical investigations, a series of computational experiments was conducted, highlighting the intricate interplay between planted cover size, graph density, and runtime. A critical range of edge probability was also investigated.

Item
Constructing Confidence Intervals for L-statistics Using Jackknife Empirical Likelihood (2020-06-16) Wang, Fuli
The linear function of order statistics, commonly known as L-statistics, has been widely used in non-parametric statistics, for example in location estimation and the construction of tolerance limits. L-statistics form a family of statistics: the trimmed mean, Gini's mean difference, and discard-deviation are all important L-statistics that have been well investigated in the literature.
In order to make inferences about L-statistics, we apply the jackknife method to L-statistics and generate jackknife pseudo-samples. Jackknifing the data has two significant advantages. First, observations from the jackknife samples behave as if they were independent and identically distributed (iid) random variables. Second, the central limit theorem holds for jackknife samples under mild conditions (see, e.g., Cheng [1]), so the normal approximation method can be applied to the new sample to estimate the true values of L-statistics. In addition to normal approximation, we also apply the jackknife empirical likelihood method to construct confidence intervals for L-statistics. Both our simulation and real-data application results indicate that the jackknife empirical likelihood-based confidence intervals perform better than the normal approximation-based confidence intervals in terms of coverage probability and interval length.
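As a concrete illustration of the jackknife step described above, the sketch below computes pseudo-values V_i = n·T(x) − (n−1)·T(x with x_i deleted) for a 10% trimmed mean (one of the L-statistics mentioned) and forms a normal-approximation confidence interval from them. This is a minimal sketch, not the thesis code: the data set, the trimming proportion, and all function names are invented for illustration, and the jackknife empirical likelihood method itself is not implemented here.

```python
import math
import statistics

def trimmed_mean(xs, prop=0.1):
    """10% trimmed mean, a classic L-statistic (illustrative choice)."""
    xs = sorted(xs)
    k = int(len(xs) * prop)
    kept = xs[k:len(xs) - k] if k > 0 else xs
    return sum(kept) / len(kept)

def jackknife_pseudo_values(xs, stat):
    """Pseudo-values: V_i = n*T(x) - (n-1)*T(x with x_i deleted)."""
    n = len(xs)
    t_full = stat(xs)
    return [n * t_full - (n - 1) * stat(xs[:i] + xs[i + 1:]) for i in range(n)]

def jackknife_normal_ci(xs, stat, z=1.96):
    """Normal-approximation 95% CI, treating pseudo-values as iid."""
    vs = jackknife_pseudo_values(xs, stat)
    n = len(vs)
    m = statistics.mean(vs)
    se = statistics.stdev(vs) / math.sqrt(n)
    return (m - z * se, m + z * se)

# Hypothetical data, for illustration only.
data = [4.1, 3.9, 4.4, 4.0, 3.8, 4.2, 4.3, 3.7, 4.0, 4.1]
low, high = jackknife_normal_ci(data, trimmed_mean)
```

Because the pseudo-values behave approximately like an iid sample, the interval is just a t/normal-style interval on their mean; the jackknife empirical likelihood approach replaces this last step with an empirical likelihood ratio on the same pseudo-values.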
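Similarly, the standard (1+1) EA analyzed in the first abstract can be sketched in a few lines: keep one bitstring over the vertices, flip each bit independently with probability 1/𝑛, and accept the offspring when it is no worse than the parent. The sketch below uses a common penalty-style fitness (uncovered edges weighted so that any infeasible cover is worse than any feasible one) on a toy path graph; the fitness function, parameters, and graph are illustrative assumptions, not the experimental setup from the thesis.

```python
import random

def uncovered_edges(cover, edges):
    """Count edges with neither endpoint in the cover."""
    return sum(1 for u, v in edges if not cover[u] and not cover[v])

def fitness(cover, edges, n):
    """Penalty fitness (to minimize): weighting uncovered edges by (n + 1)
    makes every infeasible cover worse than every feasible one."""
    return uncovered_edges(cover, edges) * (n + 1) + sum(cover)

def one_plus_one_ea(n, edges, steps, seed=0):
    """Standard (1+1) EA: flip each bit with probability 1/n,
    accept the offspring if it is no worse (ties accepted)."""
    rng = random.Random(seed)
    parent = [1] * n                  # start from the trivial full cover
    best = fitness(parent, edges, n)
    for _ in range(steps):
        child = [b ^ (rng.random() < 1 / n) for b in parent]
        f = fitness(child, edges, n)
        if f <= best:
            parent, best = child, f
    return parent

# Toy instance: a path on 4 vertices; {1, 2} is a minimum vertex cover.
edges = [(0, 1), (1, 2), (2, 3)]
cover = one_plus_one_ea(4, edges, steps=2000)
```

With this penalty weighting the EA first drives the cover to feasibility and then shrinks it; the restart strategy from the abstract would simply rerun `one_plus_one_ea` with a fresh seed every 𝑂(𝑛 log 𝑛) steps.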