# Browsing by Subject "Heterogeneity"

Now showing 1 - 9 of 9


## Essays in Inequality and Heterogeneity (2019-07), Ocampo, Sergio

Recent trends in both developed and developing economies show increasing inequality in income and wealth. Technological change is reshaping the nature of work for many, as automation, offshoring, and other practices are adopted by firms around the globe. These changes to the types of jobs workers hold are linked to changes in wages and labor earnings; in particular, the adoption of new (worker-replacing) technologies has been linked to decreases in wages and increases in income inequality. Simultaneously, the trend toward higher inequality has sparked questions about the desirability (optimality) of inequality and whether governments should use the tools at their disposal to curb these trends. My dissertation contributes to the discussion of these topics in two distinct ways. The first two chapters deal with the effects of technological change on the nature of occupations and its consequences for wage inequality, while the third chapter deals with the implications of fiscal policy (particularly capital income and wealth taxation) in the face of wealth inequality caused by differences in rates of return across individuals. The first part of my dissertation develops a new theory of how the specific tasks carried out by workers are determined, providing a flexible framework in which to study the implications for workers of automation, offshoring, and skill-biased technological change, among others. I use this framework along with U.S. occupational data to study the recent adoption of automation and its effects on the wage structure. The final chapter shows how the determinants of inequality matter for determining the optimal policy in the face of inequality. In the presence of rate-of-return heterogeneity, wealth taxes dominate capital income taxes. Relative to capital income taxes, wealth taxes benefit the individuals who are more productive, increasing allocative efficiency in the economy and in turn leading to potentially large welfare gains despite increases in inequality.

## Essays in Inequality and Public Economics (2022-08), Malkov, Egor

This dissertation consists of three chapters that contribute to the quantitative and theoretical understanding of inequality and associated public policies. The first essay studies how different income taxation should be across singles and couples. I answer this question using a general equilibrium overlapping generations model that incorporates single and married households, intensive and extensive margins of labor supply, human capital accumulation, and uninsurable idiosyncratic labor productivity risk. The degree of tax progressivity is allowed to vary with marital status. I parameterize the model to match the U.S. economy and find that couples should be taxed less progressively than singles. Relative to the actual U.S. tax system, the optimal reform reduces progressivity for couples and increases it for singles. The key determinants of optimal policy for couples relative to singles include the detrimental effects of joint taxation and progressivity on the labor supply and human capital accumulation of married secondary earners, the degree of assortative mating, and within-household insurance through responses of spousal labor supply. I conclude that explicitly modeling couples and accounting for the extensive margin of labor supply and human capital accumulation is qualitatively and quantitatively important for optimal policy design. In the second essay, I develop a framework for assessing the welfare effects of labor income tax changes on married couples.
I build a static model of couples' labor supply that features both intensive and extensive margins and derive a tractable expression that delivers a transparent understanding of how labor supply responses, policy parameters, and the income distribution affect reform-induced welfare gains. Using this formula, I conduct a comparative welfare analysis of four tax reforms implemented in the United States over the last four decades, namely the Tax Reform Act of 1986, the Omnibus Budget Reconciliation Act of 1993, the Economic Growth and Tax Relief Reconciliation Act of 2001, and the Tax Cuts and Jobs Act of 2017. I find that these reforms created welfare gains ranging from -0.16% to 0.62% of aggregate labor income. A sizable part of the gains is generated by the labor force participation responses of women. Despite three reforms resulting in aggregate welfare gains, I show that each reform created winners and losers. Furthermore, I uncover two patterns in the relationship between welfare gains and couples' labor income. In particular, the reforms of 1986 and 2017 display a monotonically increasing relationship, while the other two reforms demonstrate a U-shaped pattern. Finally, I characterize the bias in welfare gains resulting from the assumption of a linear tax function. I consider a reform that changes tax progressivity and show that the linearization bias is given by the ratio between the tax progressivity parameter and the inverse elasticity of taxable income. Quantitatively, this means that linearization overestimates the welfare effects of the U.S. tax reforms by 3.6-18.1%. The third essay studies policies aimed at mitigating COVID-19 transmission. Most economic papers that explore the effects of COVID-19 assume that recovered individuals have fully protected immunity. In 2020, there was no definite answer to whether people who recover from COVID-19 could be reinfected with the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2).
In the absence of a clear answer about the risk of reinfection, it is instructive to consider the possible scenarios. To study the epidemiological dynamics with the possibility of reinfection, I use a Susceptible-Exposed-Infectious-Resistant-Susceptible model with a time-varying transmission rate. I consider three different ways of modeling reinfection. The crucial feature of this study is that I explore both the difference between the reinfection and no-reinfection scenarios and how mitigation measures affect this difference. The principal results are the following. First, the dynamics of the reinfection and no-reinfection scenarios are indistinguishable before the infection peak. Second, the mitigation measures delay not only the infection peak but also the moment when the difference between the reinfection and no-reinfection scenarios becomes prominent. These results are robust to various modeling assumptions.

## Essays in macro and labor economics (2013-06), Wiczer, David Geoffrey

The first chapter studies the rate of long-term unemployment, which spiked during the Great Recession. To help explain this, I exploit the systematic and counter-cyclical differences in unemployment duration across occupations. This heterogeneity extends the tail of the unemployment duration distribution, which is necessary to account for the observed level of long-term unemployment and its increase since 2007. This chapter introduces a model in which unemployment duration and occupation are linked; it measures the effects of occupation-specific shocks and skills on unemployment duration. Here, a worker will be paid more for human capital in their old occupation, but a bad shock may make those jobs scarce. Still, that human capital partly "attaches" workers to their prior occupation, even when searching there implies a longer expected duration. Hence, unemployment duration rises and becomes more dispersed across occupations.
Redistributive shocks and business cycles, as in the Great Recession, exacerbate this effect. For quantitative discipline, the model matches data on the wage premium to occupational experience and the co-movement of occupations' productivity. The distribution of duration is then endogenous. For comparison's sake, if a standard model with homogeneous job seekers matches the job finding rate, then it also determines expected duration, and it understates it. That standard model implies just over half of the long-term unemployment in 1976-2007 and almost no rise in the recent recession. But with heterogeneity by occupation, this chapter nearly matches long-term unemployment in the period 1976-2007 and 70% of its rise during the Great Recession. The second chapter studies the link between wage growth and the match between a worker's occupation and skills. The notion here is that if human capital accumulation depends on match quality, poor matches can have long-lasting effects on lifetime earnings. I build a model that incorporates such a mechanism, in which human capital accumulation is affected by imperfect information about one's self. This informational friction leads to matches in which a worker accumulates human capital more slowly and has weaker earnings growth. To get direct evidence, the chapter pieces together two sets of data: the skills used by an occupation and the skills a worker is particularly good at. The occupational data describe occupations by the intensity with which they use many dimensions of workers' knowledge, skills, and abilities. Paired with this, we have data on tests taken by respondents in a panel that tracks occupations and earnings. The test designers created a mapping between their tests and the occupational descriptors, which allows us to create two measures. The first measure of match quality is simply the dot product between the dimensions of workers' skills and the utilization rate of these skills by occupations.
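The dot-product match-quality measure just described can be sketched in a few lines; the skill dimensions and numbers below are hypothetical, purely for illustration, not the dissertation's actual data:

```python
# Hypothetical skill dimensions (e.g. math, verbal, mechanical); illustrative values only.
worker_skills = [0.9, 0.4, 0.2]    # the worker's measured abilities
occ_utilization = [0.8, 0.5, 0.1]  # intensity with which the occupation uses each skill

# Match quality as a dot product: high when the worker is strong
# in exactly the skills the occupation uses intensively.
match_quality = sum(w * u for w, u in zip(worker_skills, occ_utilization))
print(round(match_quality, 2))  # 0.9*0.8 + 0.4*0.5 + 0.2*0.1 = 0.94
```

A worker whose strengths align with the occupation's most-used skills scores near the product of the vector magnitudes; a poorly matched worker scores near zero.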
The second measures mismatch relative to an optimal matching computed using the Gale-Shapley algorithm for stable pairs. In both, worse matches have significantly slower returns to occupational tenure. With the most conservative estimate, a one-standard-deviation change in mismatch affects the return to occupational tenure by 1% per year.

## Heterogeneous protein distribution during rapid and equilibrium freezing (2013-04), Twomey, Alan Michael

Interactions between proteins and ice were studied in situ using FTIR and confocal Raman microspectroscopy under equilibrium and non-equilibrium conditions over a range of temperatures. During quasi-equilibrium freezing of aqueous solutions of dimethyl sulfoxide (DMSO) and bovine serum albumin, preferential exclusion of albumin and/or DMSO was observed. It was hypothesized that the albumin may be adsorbed onto the ice interface or entrapped in the ice phase. To investigate protein-ice interactions during freezing under non-equilibrium conditions, confocal Raman microspectroscopy was used to map the distribution of albumin and the cryoprotective agent trehalose. Microheterogeneity was found in the composition of the freeze-concentrated liquid phase, indicating that albumin was preferentially distributed near or at the boundary of the ice phase. The observed microheterogeneity did not occur under all freezing protocols, which suggests that the technique developed here could be used to design freezing protocols that reduce harmful protein-ice interactions.

## Innovative Statistical Methods for Meta-analyses with Between-study Heterogeneity (2022-06), Xiao, Mengli

To assess the benefits and harms of medical interventions, meta-analysis plays an important role in combining results from multiple studies.
While the notion of combining independent results is motivated by similarities between studies, a pooled estimate may be insufficient in the presence of between-study heterogeneity in a meta-analysis. The sources of between-study heterogeneity come from studies being: 1) different and unrelated (possibly due to a mixture of non-replicable study findings); 2) different but similar (i.e., drawn from the same distribution); or 3) susceptible to modeling using covariates. In the first, studies do not replicate each other, and meta-analysis is not considered an option. In the second, a random-effects model may be used to reflect the similarity of studies, and in the third, a meta-regression analysis is suggested. To differentiate the first from the others, it is essential to develop a statistical framework establishing whether multiple studies give sufficiently similar results, i.e., replicate each other, before undertaking a meta-analysis. However, traditional meta-analysis approaches cannot effectively distinguish whether the between-study difference is from non-replicability or unknown study-specific characteristics. No rigorous statistical methods exist to characterize the non-replicability of multiple studies in a meta-analysis. In Chapter 2, we introduce a new measure, the externally standardized residuals from a leave-m-studies-out procedure, to quantify replicability. We also explore its asymptotic properties and use extensive simulations and three real-data studies to illustrate this measure's performance. We also provide the R package "repMeta" to implement the proposed approach. The remainder of this dissertation concerns scenarios when substantial heterogeneity still exists among replicable studies in a meta-analysis. Such heterogeneity may or may not decrease by incorporating available covariates in a meta-analysis, given that the sources of effect heterogeneity are commonly unknown and unmeasured. 
A proxy for those unknown and unmeasured factors may still be available in a meta-analysis, namely the baseline risk. Chapter 3 proposes using the bivariate generalized linear mixed-effects model (BGLMM) to 1) account for the potential correlation of the baseline risk with the treatment effect measure, and 2) obtain estimated effects conditional on the baseline risk. We demonstrate a strong negative correlation between study effects and the baseline risk, and the conditional effects vary notably with baseline risks. Chapter 4 reinforces the suggestion that a meta-analysis should model the heterogeneity in effect measures with respect to baseline risks and study conditions. It finds that two commonly used binary effect measures, the odds ratio (OR) and risk ratio (RR), have a similar dependence on the baseline risk in 20,198 meta-analyses from the Cochrane Database of Systematic Reviews, a leading source of healthcare evidence. This empirical evidence contradicts the false argument that the OR does not vary with study conditions. We illustrate that understanding effect heterogeneity is essential to patient-centered practice in an actual meta-analysis of interventions addressing chronic hepatitis B virus infection.

## Mechanical Heterogeneity and Mechanoadaptation in Cerebral Aneurysms (2022-12), Shih, Elizabeth

Cerebral aneurysms are abnormal dilations of blood vessels in the brain found in 2% of the population. While rupture is rare, it is often fatal or likely to cause neurological deficits. The prevalence of unexpected ruptures suggests that current predictive measurements for evaluating rupture risk are incomplete and require more investigation. To understand progression and stabilization versus rupture, we adopt a biomechanical approach to investigate how cellular mechanisms influence tissue-scale mechanics.
In my first aim, I mechanically characterize the local heterogeneity in acquired human cerebral aneurysm and arterial specimens using the Generalized Anisotropic Inverse Mechanics method. I find that both ruptured and unruptured aneurysms are considerably weaker and more heterogeneous than normal arteries, suggesting that maladaptive remodeling produces complex mechanical properties from initially ordered structures. Given these changes, stress concentrations at boundaries between stiff and weak regions, together with diverse cell microenvironments, are all likely to influence stabilization versus rupture. After identifying that aneurysms contain a wide range of microenvironment stiffnesses, I investigate how local extracellular stiffness influences the mechanically dominant and mechanosensitive vascular smooth muscle cells using cellular microbiaxial stretching. First, I examine the common assumptions used in inverse calculations of cell tractions and find that a crucial filtering term must be scaled according to the cell substrate's mechanical properties to ensure accurate calculations. When this term is adjusted across different microenvironment/substrate groups, I find that healthy smooth muscle cells are remarkably robust across a wide range of substrate moduli. Lastly, I develop a continuum model to capture the physical forces exerted on single cells during aneurysm progression, in which cell density begins to decrease and cells are only able to remodel their immediate surroundings. The model introduces a strain factor for vascular smooth muscle cells, which combines the homogeneous rule-of-mixtures approach with an Eshelby-based strain factor to describe a single inclusion in an infinite matrix. This model will be incorporated into future growth and remodeling laws to describe aneurysm progression. Taken together, the results of this work elucidate the complex tissue and cell mechanics that govern aneurysm development, stabilization, and rupture.
This provides a basis for eventually identifying new metrics for risk evaluation and improving future predictive models for clinical translation, ultimately aiding aneurysm diagnosis and treatment planning.

## Queueing Analysis of Computer Systems (2022-08), Abdul Jaleel, Jazeem

Heterogeneity and performance interference are two characteristics of modern large-scale computer systems. The policies developed for simple homogeneous computer systems do not scale well when accounting for heterogeneity and performance interference. In this thesis, we develop novel "power-of-d" policies that better address such systems. In the first part of this thesis, we approach the challenges of load balancing in large-scale heterogeneous server systems. We sequentially develop effective load balancing models and introduce a framework to help characterize different load balancing policies based on their querying and assignment rules. We compare the performance of these novel optimizable policies with the conventional policies in the literature for different parameter settings. Our policy framework allows us to develop complex load balancing policies (i.e., allowing for probabilistic querying and/or job assignment that does not follow simple rules such as Join-the-Idle-Queue, Join-the-Shortest-Queue, Join-the-Shortest-Expected-Wait-Queue, and Join-the-Shortest-Expected-Delay-Queue), which is a novel addition to the literature. Furthermore, our work makes it possible to conduct a comprehensive numerical study of policies that consider each queried server's queue length and speed information for job assignment. Prior to our work, conducting such a study required simulations, which was computationally infeasible. Our performance comparisons for different mixes of querying and assignment rules allow us to identify the trade-off between policy simplicity and performance.
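The "power-of-d" querying idea referred to here can be illustrated with a minimal simulation sketch. This is not the thesis's model (it ignores heterogeneity and interference); it only shows the basic rule of querying d servers at random and joining the shortest queried queue, with all parameters chosen for illustration:

```python
import random

def power_of_d_assign(queue_lengths, d, rng):
    """Query d servers chosen uniformly at random and return the index
    of the queried server with the shortest queue (Join-the-Shortest-Queue
    among the d sampled)."""
    queried = rng.sample(range(len(queue_lengths)), d)
    return min(queried, key=lambda i: queue_lengths[i])

rng = random.Random(0)     # fixed seed for reproducibility
queues = [0] * 100         # 100 initially empty server queues

# Assign 1000 jobs with d=2, the classic "power of two choices".
for _ in range(1000):
    target = power_of_d_assign(queues, d=2, rng=rng)
    queues[target] += 1

# Sampling just two queues per arrival keeps the maximum load close
# to the average load of 10 jobs per server.
print(max(queues))
```

With d=1 this degenerates to purely random assignment; larger d trades more querying overhead for a more balanced load, which is the policy-design trade-off the thesis studies.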
In the last chapter of this thesis, we address a new area of interest: load balancing models for systems where servers undergo performance interference. We first develop simple, novel "power-of-d" policies for such systems that consider queried servers' idleness and/or interference-state information for job assignment. Using analytical proofs and numerical experiments, we analyze the performance of these policies and identify system parameter regions that favor different heuristics. The analysis further motivates us to develop a more general and complex optimizable policy that has better performance under all parameter settings.

## Statistical methods for meta-analysis (2017-05), Lin, Lifeng

Meta-analysis has become a widely used tool to combine findings from independent studies in various research areas. This thesis deals with several important statistical issues in systematic reviews and meta-analyses, such as assessing heterogeneity in the presence of outliers, quantifying publication bias, and simultaneously synthesizing multiple treatments and factors. The first part of this thesis focuses on univariate meta-analysis. We propose alternative measures to robustly describe between-study heterogeneity, which are shown to be less affected by outliers than traditional measures. Publication bias is another issue that can seriously affect the validity and generalizability of meta-analysis conclusions. We present the first work to empirically evaluate the performance of seven commonly used publication bias tests based on a large collection of actual meta-analyses in the Cochrane Library. Our findings may guide researchers in properly assessing publication bias and interpreting test results for future systematic reviews.
Moreover, instead of just testing for publication bias, we further consider quantifying it and propose an intuitive publication bias measure, called the skewness of standardized deviates, which effectively describes the asymmetry of the collected studies’ results. The measure’s theoretical properties are studied, and we show that it can also serve as a powerful test statistic. The second part of this thesis introduces novel ideas in multivariate meta-analysis. In medical sciences, a disease condition is typically associated with multiple risk and protective factors. Although many studies report results of multiple factors, nearly all meta-analyses separately synthesize the association between each factor and the disease condition of interest. We propose a new concept, multivariate meta-analysis of multiple factors, to synthesize all available factors simultaneously using a Bayesian hierarchical model. By borrowing information across factors, the multivariate method can improve statistical efficiency and reduce biases compared with separate analyses. In addition to synthesizing multiple factors, network meta-analysis has recently attracted much attention in evidence-based medicine because it simultaneously combines both direct and indirect evidence to compare multiple treatments and thus facilitates better decision making. First, we empirically compare two network meta-analysis models, contrast- and arm-based, with respect to their sensitivity to treatment exclusions. The arm-based method is shown to be more robust to such exclusions, mostly because it can use single-arm studies while the contrast-based method cannot. Then, focusing on the currently popular contrast-based method, we theoretically explore the key factors that make network meta-analysis outperform traditional pairwise meta-analyses. We prove that evidence cycles in the treatment network play critical roles in network meta-analysis. 
Specifically, network meta-analysis produces posterior distributions identical to those of separate pairwise meta-analyses for all treatment comparisons when a treatment network does not contain cycles. This equivalence is illustrated using simulations and a case study.

## Three essays on Public Economics and heterogeneity (2009-08), Schneider, Anderson Luis

This thesis presents a collection of essays on Public Economics and individual heterogeneity. The essays are motivated by two different subjects. The first concerns the relation between economic outcomes and majority voting in a democratic regime. More specifically, the outcome regarding redistributive labor income taxes is analyzed when heterogeneous individuals vote, once and for all, over an infinite sequence of taxes. The second concerns time consistency problems in Public Economics. Issues of optimal fiscal policy are considered in an environment where different individuals hold distinct information about (sequential) actions performed by the government. This friction prevents the standard punishment mechanism that enforces good policy outcomes or, alternatively, inhibits the occurrence of time consistency problems. The best equilibrium outcome is then analyzed in this new situation. Chapter 2 focuses on the first subject. Moreover, the chapter explores the relationship between changes in labor income inequality and movements in labor taxes over the last decades in the US. To do so, this relation is modeled through a political economy channel by developing a median voter result over sequences of taxes. We consider an infinite horizon economy in which agents are heterogeneous with respect to both initial wealth and labor skills. We study indirect preferences over redistributive fiscal policies, that is, sequences of affine taxes on labor and capital income, that can be supported as a competitive equilibrium. The paper assumes balanced growth preferences and full commitment.
The first result is the following: if initial capital holdings are an affine function of skills, then the best fiscal policy for the agent with the median labor skill is preferred to any other policy by at least half of the individuals in the economy. The second result provides a characterization of the tax sequence most preferred by the median agent: marginal taxes on labor depend directly on the absolute value of the distance between the median and the mean of the skill distribution. We extend the above results to an economy in which the distribution of skills evolves stochastically over time. A temporary increase in inequality could imply either higher or lower labor taxes, depending on the sign of the correlation between inequality and aggregate labor. The calibrated model does a good job of fitting both the increasing trend and the levels of labor taxes in recent decades, and also of matching some short-run co-movements. Chapter 3 generalizes the median voter theorem developed in Chapter 2 to a situation where there is no commitment or, alternatively, voting is sequential over time. More specifically, the same equilibrium definition as in Bernheim and Slavov (2008) is adopted. Chapter 4 deals with optimal fiscal policy when the government takes actions sequentially over time and cannot commit to a pre-specified plan of actions. These features potentially generate what is known in the literature as time consistency problems. Although these problems play an important role in public policy, game-theoretic models in macroeconomics seem to indicate the opposite. Due to the complexity of this kind of model, it is commonly assumed that information is complete and perfect. In turn, this assumption becomes the key element that allows agents to coordinate perfectly to punish the government if it does not do what private agents want. As a result, a wide range of feasible payoffs can be sustained in equilibrium, including the best payoff under commitment.
Since this approach is widely used for normative purposes, a natural question emerges: are the above results robust to small variations in information? This paper analyzes an investment taxation problem in an economy with incomplete information. Specifically, we study an environment with the following main characteristics: 1) the aggregate productivity (the fundamental) is stochastic; 2) only the government observes it; and 3) every agent privately receives a noisy signal about the fundamental. The first characteristic implies that the best policy (a tax on investment) with commitment is state contingent. The second and third characteristics make information incomplete. In particular, agents have different information sets, and therefore different beliefs, about the true state of the economy. As a result, independently of the accuracy of the signal, incomplete information reduces the set of equilibrium payoffs. First, we show that any policy that depends solely on the fundamental cannot be an equilibrium. Second, the best equilibrium policy is independent of the fundamental. Finally, for any discount factor strictly smaller than one and for any size of the noise, the best equilibrium is inefficient.