Development of an analytic rubric to facilitate and standardize the review of NSF data management plans
Parham, Susan Wells
The last decade has seen a dramatic increase in calls for greater accessibility to research results and the datasets underlying them. In the United States, federal agencies with over $100 million in annual research and development expenditures are now compelled to create policies regarding public access to research outcomes.1 A sense of urgency has arisen, as researchers, administrators, and institutions must now determine how to comply with new funding agency requirements for data management planning and the sharing of data. As academic institutions develop or expand services to support researchers in meeting these planning and accessibility mandates, there is an increasing demand for mechanisms to better understand researcher needs and practices. The National Science Foundation (NSF) has required a data management plan (DMP) with each new proposal since January 2011. As documents produced by researchers themselves, DMPs provide a window into researchers’ data management knowledge, practices, and needs. They can be used to identify gaps and weaknesses in researchers’ understanding of data management concepts and practices, as well as existing barriers to applying best practices. Formal analysis of DMPs can thus provide a means to develop data services that are responsive to the needs of local data producers. The IMLS-funded “Data management plans as A Research Tool (DART) Project” has developed an analytic rubric to standardize the review of NSF DMPs. We seek to complement existing tools designed to assist in the creation of a data management plan, such as DMPTool and DMPonline, by developing a tool that enables consistent analysis of DMP content and quality ex post facto. In this poster, we describe the methodology for developing the analytic rubric and present results from an initial assessment of DMPs from five U.S. 
research universities: Oregon State University (lead), Georgia Institute of Technology, Pennsylvania State University, the University of Michigan, and the University of Oregon. The rubric was developed through a review of the NSF’s general guidelines, as well as additional requirements from individual NSF directorates.2 The rubric translates DMP guidelines into a set of discrete, defined tasks (e.g., “Describes what types of data will be captured, created, or collected”), describes levels of compliance for each task, and provides illustrative examples. We are now conducting a more comprehensive study of DMPs, applying the rubric to a minimum of 100 plans from each study partner. The resulting data set will be analyzed with a focus on common observations across study partners and will provide a broad perspective on the data management practices and needs of academic researchers. Once this analysis is complete, the rubric will be openly shared with the community in ways that facilitate its adoption and use by other institutions.