
VERSION ANALYSIS FOR FAULT DETECTION IN OWL ONTOLOGIES

Copeland, Maria Fernanda

[Thesis]. Manchester, UK: The University of Manchester; 2016.


Abstract

Understanding changes in an ontology is becoming an active topic of interest to ontology engineers because of the increasing number of requirements to better support and maintain large collaborative ontologies. Ontology support and debugging mechanisms have mainly addressed errors derived from reasoning tasks such as checking concept satisfiability and ontology consistency. Although debugging tools and tools for understanding entailments have been introduced in the past decade, see [1, 2], these do not address the desirability and expectation of the entailments. Currently, logical faults in ontologies are treated in a vacuum, without taking into consideration the information available about the evolution of the ontology’s entailments as recorded in its versions, the expectation of entailments, and how the ontology and its logical consequences comply with historical changes. In this thesis we present a novel approach for detecting logical warnings that are directly linked to the desirability and expectation of entailments as recorded in the ontology’s versions. We first introduce methods for evaluating ontology evolution trends and editing dynamics, and for identifying versions that correspond to areas of major change in the ontology. This lifetime view of the ontology gives background information about its growth and change from an axiom-centric perspective, and about the presence of entailments throughout the studied versions. We then subject the asserted axioms from each version to a cross-functional and systematic analysis of changes, the effectiveness of those changes, and the consistency of those changes in future versions. From this detailed axiom change record and the associated entailment profiles, we derive entailment warnings that indicate or suggest domain modelling bugs in terms of content redundancy, regression, refactoring, and thrashing. We validate and confirm these methods by analysing a ten-year evolution period of the National Cancer Institute ontology, NCIt. We present a detailed entailment report for each of the problematic axioms that contain domain modelling bugs, and provide a clear summary of the versions in which these axioms introduce logical warnings. This detailed report of entailment history and the detection of domain modelling bugs is produced without in-depth domain knowledge and is derived purely from the publicly available versions of the ontology. It is through this distinctive use of ontology versions that we pioneer the detection of domain modelling bugs as logical warnings based on the evaluation of expected and wanted entailments.
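
In outline, the approach tracks whether each asserted axiom and each entailment is present or absent in every successive version, and derives warnings such as regression and thrashing from that presence history. The Python sketch below is a hypothetical illustration of this idea, not the thesis's actual method: it assumes each version has already been reduced to a set of entailment identifiers (in practice these would be computed from the OWL releases with a reasoner), and the warning names and thresholds are illustrative only.

# Hypothetical sketch: classify entailment histories across ontology versions.
# Assumes each version is a set of entailment identifiers (e.g. "A SubClassOf B"),
# computed elsewhere, e.g. by running a reasoner over each OWL release.
# Warning names and rules are illustrative, not the thesis's definitions.

from typing import Dict, List, Set


def presence_profile(versions: List[Set[str]], entailment: str) -> List[bool]:
    """Presence (True/False) of one entailment in each successive version."""
    return [entailment in version for version in versions]


def classify(profile: List[bool]) -> List[str]:
    """Derive warning labels from a presence profile."""
    warnings = []
    # Regression: the entailment was present, then lost in a later version.
    if any(earlier and not later
           for i, earlier in enumerate(profile)
           for later in profile[i + 1:]):
        warnings.append("possible regression")
    # Thrashing: the entailment flips between present and absent repeatedly.
    flips = sum(1 for a, b in zip(profile, profile[1:]) if a != b)
    if flips >= 3:
        warnings.append("possible thrashing")
    return warnings


def entailment_warnings(versions: List[Set[str]]) -> Dict[str, List[str]]:
    """Collect warnings for every entailment seen in any version."""
    all_entailments = set().union(*versions) if versions else set()
    report = {}
    for e in sorted(all_entailments):
        labels = classify(presence_profile(versions, e))
        if labels:
            report[e] = labels
    return report


if __name__ == "__main__":
    # Toy version history v1..v4; the entailment identifiers are made up.
    history = [
        {"Carcinoma SubClassOf Neoplasm", "Gene SubClassOf BiologicalEntity",
         "Melanoma SubClassOf Carcinoma"},
        {"Gene SubClassOf BiologicalEntity", "Melanoma SubClassOf Carcinoma"},
        {"Carcinoma SubClassOf Neoplasm", "Gene SubClassOf BiologicalEntity"},
        {"Gene SubClassOf BiologicalEntity"},
    ]
    for entailment, labels in entailment_warnings(history).items():
        print(entailment, "->", ", ".join(labels))

On this toy history, the entailment that flips in and out of the versions is flagged as both a possible regression and possible thrashing, the one that disappears once is flagged as a possible regression only, and the stable one raises no warning.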

Bibliographic metadata

Type of resource:
Content type:
Form of thesis:
Type of submission:
Degree type:
Master of Philosophy
Degree programme:
MPhil Computer Science
Publication date:
Location:
Manchester, UK
Total pages:
80
Thesis main supervisor(s):
Thesis co-supervisor(s):
Language:
en

Institutional metadata

University researcher(s):

Record metadata

Manchester eScholar ID:
uk-ac-man-scw:296111
Created by:
Copeland, Maria
Created:
27th January, 2016, 20:57:59
Last modified by:
Copeland, Maria
Last modified:
16th November, 2017, 14:23:48
