Rule-Based Measurement Of Data Quality In Nominal Data

Sufficiently high data quality is crucial for almost every application. Nonetheless, data quality issues are nearly omnipresent. The reasons for poor quality cannot simply be blamed on software issues or insufficiently implemented business processes. Based on our experience, the main reason is that data quality tends to converge down to a level that is inherent to the existing applications. As soon as applications and data are used for tasks other than the established ones they were originally designed for, problems arise. In this paper we extend and evaluate an approach to measure the accuracy dimension of data quality based on association rules. The rules are used to build a model that is intended to capture normality. This model is then employed to divide the database records into three subsets: “potentially incorrect”, “no decision”, and “probably correct”. We thoroughly evaluate the approach on data from our automotive domain. The results it achieves ...
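
The following is a minimal sketch of the three-way split described in the abstract, assuming association rules over nominal attributes have already been mined (e.g. with Apriori). The rule representation, the function name classify_record, the confidence threshold, and the toy automotive data are illustrative assumptions, not taken from the paper.

    # Sketch only: rule-based accuracy check on nominal data.
    # A rule says: if a record matches the antecedent, it should also
    # match the consequent; high-confidence violations are suspicious.
    from dataclasses import dataclass

    @dataclass
    class Rule:
        antecedent: dict   # attribute -> value that must match
        consequent: dict   # attribute -> value the rule predicts
        confidence: float  # confidence of the mined rule

    def classify_record(record: dict, rules: list, min_conf: float = 0.9) -> str:
        """Assign a record to one of the three subsets from the abstract."""
        applicable = [r for r in rules
                      if r.confidence >= min_conf
                      and all(record.get(a) == v for a, v in r.antecedent.items())]
        if not applicable:
            return "no decision"            # no high-confidence rule covers the record
        if any(any(record.get(a) != v for a, v in r.consequent.items())
               for r in applicable):
            return "potentially incorrect"  # record violates a rule that should hold
        return "probably correct"           # record satisfies all applicable rules

    # Toy usage on nominal automotive-style data (hypothetical values).
    rules = [Rule({"model": "A4"}, {"brand": "Audi"}, 0.98)]
    print(classify_record({"model": "A4", "brand": "Audi"}, rules))   # probably correct
    print(classify_record({"model": "A4", "brand": "BMW"}, rules))    # potentially incorrect
    print(classify_record({"model": "Golf", "brand": "VW"}, rules))   # no decision
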
Type: Conference
Year: 2007
Where: IQ
Authors: Jochen Hipp, Markus Müller, Johannes Hohendorff, Felix Naumann