Although data quality is well defined, its relationship to data quantity remains unclear. In particular, the big data approach promises that sheer volume can outweigh small samples of good quality. The aim of this study was to review this issue. Based on experiences with six registries within a German funding initiative, the definition of data quality provided by the International Organization for Standardization (ISO) was confronted with several aspects of data quantity. The results of a literature search combining both concepts were considered additionally. Data quantity was identified as an umbrella covering some inherent characteristics of data, such as case and data completeness. At the same time, quantity could be regarded as a non-inherent characteristic of data beyond the ISO standard, focusing on the breadth and depth of metadata, i.e. data elements along with their value sets. The FAIR Guiding Principles take only the latter into account. Surprisingly, the literature agreed in demanding an increase in data quality with growing volume, turning the big data approach inside out. Usage of data without context – as may be the case in data mining or machine learning – is covered neither by the concept of data quality nor by that of data quantity.
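The abstract does not formalize the two completeness measures it names. A minimal sketch of how they might be computed, assuming a registry represented as a list of records and a fixed set of expected data elements (both names and the element list are illustrative assumptions, not taken from the paper):

```python
# Hypothetical illustration of the two completeness metrics mentioned above:
# case completeness (recorded vs. expected cases) and data completeness
# (filled values across expected data elements per case).

EXPECTED_ELEMENTS = ["diagnosis", "age", "sex", "stage"]  # assumed element set

def case_completeness(recorded_cases, expected_case_count):
    """Fraction of expected cases actually present in the registry."""
    return len(recorded_cases) / expected_case_count

def data_completeness(recorded_cases):
    """Fraction of expected data elements that carry a value."""
    filled = sum(
        1
        for case in recorded_cases
        for element in EXPECTED_ELEMENTS
        if case.get(element) is not None
    )
    return filled / (len(recorded_cases) * len(EXPECTED_ELEMENTS))

# Two recorded cases out of four expected; some elements left empty.
registry = [
    {"diagnosis": "C50", "age": 61, "sex": "F", "stage": None},
    {"diagnosis": "C34", "age": None, "sex": "M", "stage": "II"},
]
print(case_completeness(registry, 4))  # → 0.5
print(data_completeness(registry))     # → 0.75
```

Under this reading, both metrics are inherent characteristics of the data itself, whereas the breadth and depth of the metadata (which elements and value sets exist at all) lies outside them, matching the abstract's distinction.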