Browsing by Author "Sell, Andrew"
Item: Call me maybe? It's not crazy! Data collection offices are a good partner in data management (2015)
Authors: Sell, Andrew; Hofelich Mohr, Alicia
For data management professionals, attention is largely focused on the beginning and end of the research process: many researchers are worried about meeting federal requirements for data management plans (DMPs) and are looking for ways to share and archive their data. As a University office specializing in survey and experimental data collection, we have seen how the "middle" steps of data collection and analysis can both influence and be influenced by these upstream and downstream data management processes. In this Pecha Kucha, we will present relevant data management lessons we have learned from designing, developing, and hosting data collection tools. Challenges of anonymity and paying participants, quirks of the statistical files produced by data collection tools, and transparency in the research process are among the issues we will discuss. Because many of these challenges directly affect the later sharing and curation of the collected data, we emphasize that data collection offices can be important partners in data management efforts.

Item: Recruitment, Participation, and Sampling: Researchers’ Results in General Practice (2011-06-01)
Authors: Lindsay, Thomas; Sell, Andrew
While large-scale projects have the resources to ensure best practice, most social science researchers face compromises relating to cost, time, and the availability of respondents. We worked with researchers to experimentally test specific approaches to sampling and recruitment, and we discuss the results of these tests within the framework of theoretical best practices and expectations.

Item: Thinking Inside the Box: Data from an Online Alternative Uses Task with Visual Manipulation of the Survey Response Box (2016-09-29)
Authors: Hofelich Mohr, Alicia; Sell, Andrew; Lindsay, Thomas
Contact: hofelich@umn.edu
This study was designed to test whether responses to a divergent thinking task (the Alternative Uses Task, AUT; Guilford, 1967) could be influenced by visual design characteristics of the survey response box. We manipulated the type of response box (one large, essay-style box, unsegmented, versus several small, list-style boxes, segmented; see variable "Segmented") and the size or number of boxes shown (5, 10, or 15 lines or boxes; see variable "Lines"). Participants were recruited in the United States between February and early May 2014 through Amazon's Mechanical Turk (MTurk) and completed the task online. They were given two minutes to list as many uses as they could for either a brick or a paperclip (randomized across participants; see variable "Item"), and were then automatically advanced to questions about their personality (the 44-item Big Five Inventory; John & Srivastava, 1999) and demographic information (Age, Sex, Education). Judges scored the responses for elaboration, flexibility, and originality.
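The dataset record above names the experimental design variables ("Segmented", "Lines", "Item") and the judge-scored creativity outcomes. As a minimal sketch of how the design cells in such a file might be summarized, assuming a hypothetical per-participant CSV export named aut_data.csv with a numeric "Originality" column (the file name and column names are assumptions, not part of the record):

    # Minimal sketch: summarize a hypothetical export of the AUT dataset.
    # Assumptions not in the record: the file "aut_data.csv" has one row
    # per participant and a numeric judge-scored "Originality" column.
    # "Segmented", "Lines", and "Item" are the variables named in the
    # dataset description above.
    import pandas as pd

    df = pd.read_csv("aut_data.csv")

    # Mean, spread, and cell size for each Segmented x Lines design cell,
    # split by the randomized prompt ("Item": brick vs. paperclip).
    summary = (
        df.groupby(["Item", "Segmented", "Lines"])["Originality"]
          .agg(["mean", "std", "count"])
          .round(2)
    )
    print(summary)
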
Item: Thinking Inside the Box: Visual Design of the Response Box Affects Creative Divergent Thinking in an Online Survey (Social Science Computer Review, SAGE, 2015)
Authors: Hofelich Mohr, Alicia; Sell, Andrew; Lindsay, Thomas
While the visual design of a question has been shown to influence responses in survey research, it is less well understood how these effects extend to assessment-based questions that attempt to measure how, rather than just what, a respondent thinks. For example, in a divergent thinking task, the number and elaboration of responses, not just their originality, contribute to the assessment of creativity. Using the Alternative Uses Task in an online survey, we demonstrated that scores on fluency, elaboration, and originality, core constructs of participants’ assessed creative ability, were systematically influenced by the visual design of the response boxes. The extent to which participants were susceptible to these effects varied with individual differences in trait conscientiousness: several of the effects were seen in participants with high, but not low, conscientiousness. Overall, our results are consistent with previous survey methodology findings, extend them to the domain of creativity research, and call for increased awareness and transparency of visual design decisions across research fields.