NABET 2019 Conference

A Data Provenance Framework: Replication XOR Reproduction
David Lanter

Last modified: 2019-09-27

Abstract


Many components of a formal framework for studying failures to replicate or reproduce results and findings in management information science (MIS) are in use and have been previously demonstrated. Integrated within future analytic software, the capabilities suggested in this presentation may enable management information scientists to detect when evidence submitted from new MIS experiments fails to replicate or reproduce the results of published experiments. They would also make it possible for researchers and students to revisit and analyze experiments that fail to replicate or reproduce prior results, and to assess and document root causes along with the characteristics and extent of such failures. In addition to pinpointing failure, the capabilities sketched in this presentation would strengthen MIS by identifying new experiments that succeed in replicating or reproducing published results.

The framework offers a blueprint for an interactive catalog of management information science that enables researchers and students to study and practice the validation of knowledge. The catalog could support browsing, searching, accessing, and drilling down through published datasets and metadata describing the methods used in experiments, documenting both researchers’ creation of new knowledge and their successful and unsuccessful attempts at confirming (i.e., replicating or reproducing) the validity of prior knowledge. Over time, a growing inventory of published experiments in the catalog would make it possible to visualize, analyze, and understand the breadth and depth of MIS, and would support identifying areas of MIS research in which replication or reproduction has been successful as well as areas at increasing risk of non-replicability and non-reproducibility.
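To make the catalog idea concrete, the sketch below models one possible shape for a catalog record and two of the queries the abstract describes: drilling down to the confirmation attempts targeting a prior experiment, and flagging research areas where recorded attempts have so far failed. Every name and field here (`ExperimentRecord`, `confirms`, `outcome`, and so on) is an illustrative assumption, not a schema from the presented framework.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ExperimentRecord:
    """One published MIS experiment in the catalog (hypothetical schema)."""
    exp_id: str
    area: str                                     # MIS research area
    datasets: List[str] = field(default_factory=list)  # published dataset identifiers
    methods: str = ""                             # metadata describing the methods used
    confirms: Optional[str] = None                # exp_id of the prior experiment targeted
    outcome: Optional[str] = None                 # "replicated", "reproduced", or "failed"

def confirmation_attempts(catalog: List[ExperimentRecord], prior_id: str) -> List[ExperimentRecord]:
    """Drill down: all recorded attempts to confirm a given prior experiment."""
    return [r for r in catalog if r.confirms == prior_id]

def at_risk_areas(catalog: List[ExperimentRecord]) -> List[str]:
    """Areas in which every recorded confirmation attempt has failed."""
    outcomes_by_area = {}
    for r in catalog:
        if r.confirms is not None:
            outcomes_by_area.setdefault(r.area, []).append(r.outcome)
    return sorted(area for area, outs in outcomes_by_area.items()
                  if all(o == "failed" for o in outs))

# Tiny illustrative catalog: two original experiments, two confirmation attempts.
catalog = [
    ExperimentRecord("E1", "technology adoption"),
    ExperimentRecord("E2", "technology adoption", confirms="E1", outcome="failed"),
    ExperimentRecord("E3", "decision support"),
    ExperimentRecord("E4", "decision support", confirms="E3", outcome="replicated"),
]
```

With this toy data, `confirmation_attempts(catalog, "E1")` returns the single failed attempt `E2`, and `at_risk_areas(catalog)` flags "technology adoption" as the area with no successful confirmation, mirroring the risk-identification role the abstract assigns to a growing catalog.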


Keywords


information systems, MIS, data provenance