Data Quality Control
Data Quality Control procedures are important for:
- Detecting missing mandatory information
- Detecting errors introduced during transfer or reformatting
- Detecting duplicates
- Detecting remaining outliers (spikes, out-of-scale data, vertical instabilities, etc.)
- Attaching a quality flag to each numerical value, so that suspect values are labelled rather than the observed data points being modified
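The checks above can be sketched in a few lines. The snippet below is a minimal, illustrative implementation (the function name, thresholds and example data are hypothetical, not part of any SeaDataNet specification): it applies a gross range test and a GTSPP-style spike test to a series of values, and attaches a flag to each value instead of altering the data.

```python
def qc_flags(values, valid_min, valid_max, spike_threshold):
    """Return one QC flag per value; the data themselves are left untouched.

    Flag convention used here: 1 = good, 3 = probably bad (spike),
    4 = bad (out of range), 9 = missing.
    """
    flags = []
    for i, v in enumerate(values):
        if v is None:
            flags.append(9)          # missing value
            continue
        if not (valid_min <= v <= valid_max):
            flags.append(4)          # out of scale -> bad
            continue
        # Spike test against the two neighbours (GTSPP-style formula):
        # |v - (prev + next)/2| - |prev - next|/2 > threshold
        if 0 < i < len(values) - 1 \
                and values[i - 1] is not None and values[i + 1] is not None:
            prev, nxt = values[i - 1], values[i + 1]
            if abs(v - (prev + nxt) / 2) - abs(prev - nxt) / 2 > spike_threshold:
                flags.append(3)      # probable spike -> probably bad
                continue
        flags.append(1)              # passed all tests -> good
    return flags

# Example: a temperature profile with one spike and one missing value
temps = [12.1, 12.0, 25.0, 11.9, None, 11.8]
print(qc_flags(temps, valid_min=-2.0, valid_max=35.0, spike_threshold=5.0))
# -> [1, 1, 3, 1, 9, 1]
```

Note that the original values are never changed: only the parallel list of flags records the outcome of each test, which is exactly the point of flag-based QC.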
A guideline of recommended QC procedures has been compiled by reviewing NODC schemes and other established schemes (e.g. WGMDM guidelines, World Ocean Database, GTSPP, Argo, WOCE, QARTOD, ESEAS, SIMORC). At present the guideline contains QC methods for CTD data (temperature and salinity), current meter data (including ADCP), wave data and sea level data. It has been compiled in discussion with IOC, ICES and JCOMM to ensure international acceptance and alignment. Important feedback came from the joint IODE/JCOMM Forum on Oceanographic Data Management and Exchange Standards (January 2008), attended by SeaDataNet and international experts to review ongoing work on standards and to seek harmonisation where possible. Activities are now underway to extend the guideline with QC methods for surface underway data, nutrients, geophysical data and biological data.
Furthermore, a harmonised scheme of QC flags for labelling individual data values in SeaDataNet has been defined and adopted. This QC flag scale is available in the SeaDataNet Common Vocabularies as list L20.
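As an illustration of how such a flag scale is typically used in software, the sketch below maps numeric codes to their meanings. The codes shown are the commonly cited values of the L20 scale, but this table is indicative only; the L20 list in the SeaDataNet Common Vocabularies is the authoritative source.

```python
# Indicative subset of the SeaDataNet L20 QC flag scale.
# Consult the L20 vocabulary list itself for the authoritative,
# complete set of codes and definitions.
L20_FLAGS = {
    "0": "no quality control",
    "1": "good value",
    "2": "probably good value",
    "3": "probably bad value",
    "4": "bad value",
    "8": "interpolated value",
    "9": "missing value",
}

def describe_flag(code):
    """Look up a flag code (int or str) and return its plain-text meaning."""
    return L20_FLAGS.get(str(code), "unknown flag code")

print(describe_flag(4))    # -> bad value
print(describe_flag("9"))  # -> missing value
```

Storing flags as codes from a controlled vocabulary, rather than as free text, is what makes the labelling interoperable across the data centres exchanging data through SeaDataNet.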
This activity is coordinated by BODC.