Analyzing the Conceptual Integrity of Computing Applications Through Ontological Excavation and Analysis
In the world of commercial computing, consumers are inundated with baroque, bloated, and difficult-to-use computing applications: tools that use computational methods and technologies to perform tasks. Market forces demand that new versions of these applications implement more features, the user-accessible behaviors and services implemented by the application, than their predecessors or competitors. Ensuring that the features required for market competitiveness enhance a computing application without producing such bloat first requires that we understand how those features contribute to the overall design and conceptual integrity of the application. While conceptual integrity affects all aspects of the application, we are primarily interested in how an application's user-accessible features have been designed and implemented.

To this end, we have developed a research framework, methodologies, and artifacts for measuring the conceptual integrity of a computing artifact from its theory of the world, or its ontology. We use conceptual coherence, which we define as the degree to which an application's concepts are tightly related, as a first approximation of conceptual integrity. We claim the following: any computing application has a central or core set of concepts that are essential to that application's ontology and can be identified through analytical means; and concepts that are not essential to an application's ontology either exist to support core concepts or are peripheral to the ontology. Peripheral concepts reduce an application's conceptual coherence. We have developed the method of ontological excavation to identify the concepts in a computing application and model them as an ontology expressed as a semantic network. To identify core and peripheral concepts and to measure an ontology's conceptual coherence, we have developed methodologies for ontological analysis.
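The analytical idea above can be made concrete with a small sketch. The snippet below models an ontology as an undirected semantic network and uses degree centrality as one possible proxy for distinguishing core from peripheral concepts, with graph density as a toy coherence score. The concept names, the centrality threshold, and the density measure are all illustrative assumptions, not the dissertation's actual methodology.

```python
# Sketch: an ontology as a semantic network, with degree centrality as an
# assumed proxy for core vs. peripheral concepts and graph density as an
# assumed coherence score. All names and thresholds here are hypothetical.
from collections import defaultdict

def build_network(edges):
    """Build an undirected semantic network as an adjacency map."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    return adj

def classify(adj, core_threshold=3):
    """Treat concepts with degree >= threshold as core (an assumption)."""
    core = {c for c, nbrs in adj.items() if len(nbrs) >= core_threshold}
    return core, set(adj) - core

def coherence(adj):
    """Toy coherence score: network density (edges present / edges possible)."""
    n = len(adj)
    if n < 2:
        return 1.0
    e = sum(len(nbrs) for nbrs in adj.values()) / 2
    return e / (n * (n - 1) / 2)

# Hypothetical ontology fragment for a text editor
edges = [
    ("document", "buffer"), ("document", "file"), ("buffer", "cursor"),
    ("buffer", "selection"), ("cursor", "selection"), ("document", "cursor"),
    ("file", "printer"),  # "printer" attaches loosely: a peripheral candidate
]
adj = build_network(edges)
core, peripheral = classify(adj)
```

Under these assumptions, densely interrelated concepts such as "document" and "buffer" land in the core, while a loosely attached concept like "printer" is flagged as peripheral; a richer analysis would use a more principled centrality or clustering measure over the excavated semantic network.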
If usefulness depends on the conceptual integrity of an application's ontology, such that it ensures a high fitness to the problem domain, then we would expect users solving problems in that domain to invoke the concepts integral to the solution more often than those that are not. Thus, to validate our structural measures, we claim the following: probable use of the application will invoke core concepts more frequently than peripheral concepts in the ontology.
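The validation claim lends itself to a simple frequency comparison: given a trace of concept invocations from probable use, core concepts should on average be invoked more often than peripheral ones. The trace and concept sets below are invented purely for illustration.

```python
# Sketch of the validation idea: compare mean invocation frequency of core
# vs. peripheral concepts over a usage trace. The trace and concept sets
# are hypothetical, not data from the dissertation.
from collections import Counter

def mean_invocations(trace, concepts):
    """Average number of invocations per concept in the given set."""
    counts = Counter(trace)
    return sum(counts[c] for c in concepts) / len(concepts)

# Hypothetical trace: each entry is one concept invoked during a user task
trace = ["buffer", "cursor", "buffer", "document", "cursor",
         "selection", "buffer", "document", "printer"]
core = {"document", "buffer", "cursor"}
peripheral = {"selection", "printer"}

core_rate = mean_invocations(trace, core)
peripheral_rate = mean_invocations(trace, peripheral)
```

If the structural classification is sound, `core_rate` should exceed `peripheral_rate` across representative tasks; a real study would need many traces and a significance test rather than a single comparison.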
Showing items related by title, author, creator and subject.
Bogost, Ian; Crawford, Hugh; DiSalvo, Carl; Johnston, John; Stafford, Barbara; Thacker, Eugene (Georgia Institute of Technology, 2010-04-23) For decades, scholars in the liberal arts have relinquished wonder to the natural sciences, and then swooped in ostentatiously to blame their awe on false consciousness. Science and engineering gets things wrong too, ...
Bryan, Levi; Crawford, Hugh; DiSalvo, Carl; Johnston, John; Stafford, Barbara; Thacker, Eugene (Georgia Institute of Technology, 2010-04-23) Levi Bryant is a Professor of Philosophy at Collin College. In addition to his book on Deleuze's Philosophy, Difference and Givenness: Deleuze's Transcendental Empiricism and the Ontology of Immanence, Bryant is co-editor ...
Mocko, Gregory Michael (Georgia Institute of Technology, 2006-04-11) Problem: Engineering design decisions require the integration of information from multiple and disparate sources. However, this information is often independently created, limited to a single perspective, and not formally ...