Logan, Perry William
Accessioned/available: 2011-02-17. Issued: 2010-12.
https://hdl.handle.net/11299/100430
University of Minnesota Ph.D. dissertation. December 2010. Major: Environmental Health. Advisor: Gurumurthy Ramachandran, Ph.D. 1 computer file (PDF); vii, 133 pages, appendices I-II. Ill. (some col.)

Accurate exposure assessments are critical for ensuring that potentially hazardous exposures are properly identified and controlled. The availability and accuracy of exposure assessments can determine whether resources are appropriately allocated to engineering and administrative controls, medical surveillance, personal protective equipment and other programs designed to protect workers. A desktop study was performed using videos, task information and sampling data to evaluate the accuracy and potential bias of participants' exposure judgments. Desktop exposure judgments were obtained from occupational hygienists for material handling jobs with small air sampling data sets (0-8 samples) and without the aid of computers. In addition, data interpretation tests were administered in which participants were asked to estimate the 95th percentile of an underlying lognormal exposure distribution from small data sets. Participants were then given exposure data interpretation or "rule-of-thumb" training, which included a simple set of rules for estimating 95th percentiles of a lognormal population from small data sets. Results of each data interpretation test, along with the qualitative and quantitative exposure judgments, were compared with a reference judgment obtained through a Bayesian probabilistic analysis of the sampling data to investigate overall judgment accuracy and bias. There were a total of 4,386 participant-task-chemical judgments across all data collections: 552 qualitative judgments made without sampling data and 3,834 quantitative judgments made with sampling data.
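The estimation task described above can be sketched as follows. This is a minimal illustration, not the study's protocol: it fits a lognormal distribution to a small sample by computing the mean and standard deviation of the log-transformed measurements and back-transforming, with z = 1.645 as the standard normal 95th-percentile score. The sample values and the exposure limit are invented for illustration.

```python
# Hypothetical sketch: estimating the 95th percentile of a lognormal
# exposure distribution from a small air-sampling data set, as in the
# data interpretation tests described above. Data and the OEL value
# are illustrative assumptions, not from the dissertation.
import math
import statistics

def lognormal_p95(samples):
    """Point estimate of the 95th percentile of a lognormal fit.

    Log-transform the measurements, take the sample mean and standard
    deviation on the log scale, and back-transform mu + 1.645*sigma
    (1.645 is the z-score of the 95th percentile).
    """
    logs = [math.log(x) for x in samples]
    mu = statistics.mean(logs)
    sigma = statistics.stdev(logs)  # sample SD on the log scale
    return math.exp(mu + 1.645 * sigma)

# Five hypothetical air samples in mg/m^3
data = [0.12, 0.25, 0.08, 0.31, 0.18]
x95 = lognormal_p95(data)
oel = 1.0  # hypothetical occupational exposure limit
print(f"estimated 95th percentile: {x95:.3f} mg/m^3")
print("acceptable" if x95 < oel else "needs further evaluation")
```

With so few samples this point estimate is highly uncertain, which is precisely why the study compares such simple estimates against a Bayesian reference analysis.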
The data interpretation tests and quantitative judgments were significantly better than random chance and substantially improved by the rule-of-thumb training, which also reduced the amount of bias in both the data interpretation tests and the quantitative judgments. The mean data interpretation test score increased from 47% to 64% correct after the rule-of-thumb training (p<0.001). Accuracy for quantitative desktop judgments increased from 43% to 63% correct after the rule-of-thumb training (p<0.001). The rule-of-thumb training did not significantly affect the accuracy of qualitative desktop judgments. The finding that even simple statistical rules of thumb significantly improve judgment accuracy suggests that hygienists should routinely use statistical tools when making exposure judgments from monitoring data. Logistic regression analysis indicated that "years of exposure assessment experience" (p<0.05), "highest EHS degree" (p<0.05) and a participant's "data interpretation test score" (p<0.05) directly affected qualitative exposure judgment accuracy. Logistic regression models of quantitative judgment accuracy showed positive correlations with "greater than 10 years of exposure assessment experience" (p<0.05), "highest EHS degree" (p<0.05), a participant's "data interpretation test score" (p<0.001), rule-of-thumb data interpretation training (p<0.001), and the number of sample data points available for a judgment (p<0.005). Analyzing judgments separately for participants with fewer or more than 10 years of experience indicated additional correlations with Certified Industrial Hygienist and Certified Safety Professional certifications, total number of task exposure assessments, and career number of air surveys. The correlation of qualitative and quantitative exposure judgment accuracy with "greater than 10 years of experience" supports similar research findings from other fields.
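The logistic regression approach above models the probability that a judgment is correct as a function of participant characteristics. The sketch below shows the idea with a single predictor and entirely synthetic data: none of the coefficients or values reflect the study's results, and the stdlib gradient-descent fit stands in for a statistics package.

```python
# Hypothetical sketch of a logistic-regression-style analysis of
# judgment accuracy. Data are synthetic; the single predictor x stands
# in for something like a participant's data interpretation test score.
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit w, b in P(correct) = 1/(1+exp(-(w*x+b))) by gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Synthetic data: x = test score in [0, 1]; y = 1 if judgment correct,
# generated so that higher scores make a correct judgment more likely.
random.seed(0)
xs = [random.random() for _ in range(200)]
ys = [1 if random.random() < 1 / (1 + math.exp(-(3 * x - 1.5))) else 0
      for x in xs]
w, b = fit_logistic(xs, ys)
print(f"fitted slope {w:.2f} (positive => higher scores predict accuracy)")
```

A positive fitted slope corresponds to the kind of positive correlation reported above between test scores and quantitative judgment accuracy.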
The results of this study indicate that several determinants of experience, education and training, in addition to the availability of sampling data, significantly affect the accuracy of exposure assessments for the set of exposure tasks and agents used in this study. The findings also suggest methods for enhancing exposure judgment accuracy through statistical tools and specific training. Simulations were designed to evaluate the performance of several quantitative exposure assessment strategies for different exposure distributions. Bayesian tools are becoming popular and were included in the simulations for this study, along with simple comparison, point estimate and upper confidence limit strategies using minimum sample sizes of fewer than 7 samples. The decision statistic selected for the simulations was the 95th percentile, with acceptable exposure distributions defined by exceedance fractions of 0.01%, 0.1% and 1%, and unacceptable distributions defined by exceedance fractions of 10%, 20%, 30% and 50%. Bayesian strategies incorporating professional judgment were also included to illustrate the impact of an incorrect prior judgment. For acceptable exposure distributions, simple comparison and professional-judgment-integrated Bayesian strategies showed the highest probability of detecting an acceptable exposure. Bayesian strategies without professional judgment, followed by upper confidence limit strategies, were least likely to incorrectly classify unacceptable exposure distributions as acceptable. Reviewing the different minimum sampling numbers for each strategy indicates that Bayesian integrated methods most often arrive at correct decisions with fewer samples than other strategies.
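The simulation design above can be illustrated with a small Monte Carlo sketch. It evaluates only one of the strategies mentioned (a 95th-percentile point-estimate rule) against known acceptable and unacceptable lognormal distributions; the distribution parameters, decision rule, and sample size are illustrative assumptions, not the study's exact protocol.

```python
# Hypothetical sketch of the kind of simulation described above: draw
# small samples from a known lognormal exposure distribution and count
# how often a 95th-percentile point-estimate strategy reaches the
# correct accept/reject decision. Parameters are illustrative only.
import math
import random
import statistics

def decide_acceptable(samples, oel, z95=1.645):
    """Point-estimate strategy: acceptable if the estimated 95th
    percentile of the lognormal fit falls below the OEL."""
    logs = [math.log(x) for x in samples]
    p95 = math.exp(statistics.mean(logs) + z95 * statistics.stdev(logs))
    return p95 < oel

def p_correct(gm, gsd, oel, truly_acceptable, n=5, trials=2000, seed=1):
    """Probability the strategy classifies the distribution correctly."""
    rng = random.Random(seed)
    mu, sigma = math.log(gm), math.log(gsd)
    hits = 0
    for _ in range(trials):
        sample = [rng.lognormvariate(mu, sigma) for _ in range(n)]
        if decide_acceptable(sample, oel) == truly_acceptable:
            hits += 1
    return hits / trials

# Acceptable case: true 95th percentile (~0.31) is well below OEL=1.0
print("acceptable case  :",
      p_correct(gm=0.1, gsd=2.0, oel=1.0, truly_acceptable=True))
# Unacceptable case: true 95th percentile (~3.0) exceeds OEL=1.0
print("unacceptable case:",
      p_correct(gm=0.5, gsd=3.0, oel=1.0, truly_acceptable=False))
```

Extending this loop over Bayesian, simple comparison, and upper confidence limit strategies and over a grid of exceedance fractions yields comparisons of the kind reported above.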
The results of this study can help design more effective and efficient exposure assessment and management strategies, which will hopefully provide a transparent mechanism to improve the accuracy and reduce the bias of exposure judgments.

Language: en-US
Keywords: Bayesian; Chemical Exposure; Exposure Assessment; Exposure Modeling; Heuristics and Biases; Professional Judgment; Environmental Health
Title: Understanding and strengthening exposure judgments using Bayesian integrated exposure assessment strategies
Type: Thesis or Dissertation