Browsing by Subject "data"
Now showing 1 - 6 of 6
Item: The Data Management Village: Collaboration among Research Support Providers in the Large Academic Environment (Databrarianship: The Academic Data Librarian in Theory and Practice, edited by Kristi Thompson and Lynda Kellam; Association of College and Research Libraries (ACRL), 2016)
Authors: Hofelich Mohr, Alicia; Johnston, Lisa R; Lindsay, Thomas A
Data management encompasses the practices and people that acquire, control, protect, deliver, and enhance the value of data throughout the research lifecycle. Done well, data management requires that these practices and people be connected across the entire research lifecycle. However, much of this work takes place in researchers' own offices or labs, or with the help of specialized support offices on campus that interact with researchers directly only at single points in their projects. In academic libraries, a data management specialist may engage with researchers only at the beginning and end of a project, assisting with the creation of a data management plan (DMP) and with preservation of the data when the research is completed. This poses a challenge when trying to help researchers integrate best practices into their workflows throughout the planning, collection, and analysis stages. Most libraries focus on providing broad, public access to the content under their stewardship, and given this mission, libraries alone may not be able to offer all of the data services that researchers need (for example, dark archives for sensitive or private data). Therefore, given the diverse nature of research data and the distributed support researchers may seek throughout their projects, universities need a well-connected, distributed way to support data management; it is a service that "takes a village."

Item: Digitizing Difference: Fraudulence, Gender Non-Conformity, and Data (2019-03)
Author: Mackenzie, Lars
This dissertation explores how fraudulence shapes contemporary trans life. It examines the impacts of software design, law, and policy on trans and gender non-conforming people, arguing that social expectations about the stability of sex, gender, and identity systematically devalue the lives of trans and gender non-conforming people, with particularly harmful impacts in the financial and healthcare sectors. Further, it demonstrates that incongruent or gender non-conforming data wields significant and dangerous power in an era of data-driven decision-making, and it presents alternative approaches toward challenging these paradigms.

Item: Journalism in an Era of Big Data: Cases, Concepts, and Critiques (Digital Journalism, 2015)
Author: Lewis, Seth C.
"Journalism in an era of big data" is thus a way of seeing journalism as interpolated through the conceptual and methodological approaches of computation and quantification. It is about both the ideation and implementation of computational and mathematical mindsets and skill sets in newswork, as well as the necessary deconstruction and critique of such approaches. Taking such a wide-angle view of this phenomenon, including both practice and philosophy within this conversation, means attending to the social/cultural dynamics of computation and quantification, such as the grassroots groups that are seeking to bring pro-social "hacking" into journalism (Lewis and Usher 2013, 2014), as well as the material/technological characteristics of these developments. It means recognizing that algorithms and related computational tools and techniques "are neither entirely material, nor are they entirely human—they are hybrid, composed of both human intentionality and material obduracy" (Anderson 2013, 1016). As such, we need a set of perspectives that highlight the distinct and interrelated roles of social actors and technological actants at this emerging intersection of journalism (Lewis and Westlund 2014a).
To trace the broad outline of journalism in an era of big data, we need (1) empirical cases that describe and explain such developments, whether at the micro (local) or macro (institutional) level of analysis; (2) conceptual frameworks for organizing, interpreting, and ultimately theorizing about such developments; and (3) critical perspectives that call into question taken-for-granted norms and assumptions. This special issue takes up this three-part emphasis on cases, concepts, and critiques.

Item: Laying the Groundwork: Telling Our Story - Minnesota Community Land Trusts (2008)
Author: Howard, Teresa

Item: OFR22-01, Assessment of preservation needs and long-range plan for geologic collections and data in Minnesota; a report prepared in fulfillment of National Geological and Geophysical Data Preservation Program Award Number 07HQGR0126 (Minnesota Geological Survey, 2009)
Author: Thorleifson, L Harvey
According to the National Geological and Geophysical Data Preservation Program (NGGDPP) website, section 351 of the Energy Policy Act of 2005 directs the Secretary of the Department of the Interior (DOI), through the Director of the U.S. Geological Survey (USGS), to carry out a National Geological and Geophysical Data Preservation Program. The Implementation Plan for the NGGDPP, submitted to Congress in August 2006, outlines the vision and purpose of the program and makes recommendations for its implementation. One of the early action items in the implementation plan is for the USGS to begin interactions with State Geological Surveys and other DOI agencies that maintain geological and geophysical data and samples, to address their preservation and data rescue needs. As the first step in this process, the USGS requested that each state provide an assessment of current collection resources and data preservation needs, thereby providing a summary of the collections held, supported, or used by state geological surveys.

Item: Translational Cancer Research Data Quality – The Context Factor (2017-08)
Author: Orreggio, Giordi
Cronbach's alpha indicates that as the count of items in a set increases, so does the level of relationship between them. Translational cancer research (TCR) data is an example of increasing items within a set. As a national priority, TCR is well funded, contributing to a continued increase in the data organizations produce, the number of organizations producing data, and the amount of sharing in which each organization participates. However, rather than leveraging the relationships among data, a contextual approach, intrinsic measures such as accuracy and completeness remain the most frequently referenced in data quality (DQ) articles and conceptual frameworks. The purpose of this set of studies is to expand our knowledge of TCR DQ by examining context-sensitive DQ methods. The knowledge gained could be incorporated into future TCR DQ efforts, leading to more informative and actionable data and to quicker development of better clinical treatments.