
Data quality is closely related to preserving the integrity of your research data and to guaranteeing the transparency of your research processes. Ensuring that your data is robust and reliable makes your research reproducible.
Importantly, each disciplinary area and each field of research has its own data quality control strategies and tools, so make sure to follow the standard for your discipline: for example, describe the software or code used for the analysis; describe the data transfer strategy and its checks, such as checksums; describe how the data will be cross-checked and validated; document all changes and your versioning strategy; and report questionable data. In any case, here are some useful tools that can help you assess the quality of your data:
- OpenRefine - a data manipulation tool for data cleaning and quality control (especially for tabular data)
- QAMyData - an open-source tool that automatically assesses and reports on elements of data quality
- Quality control guidance for quantitative and qualitative data from the UK Data Service
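As an illustration of the checksum-based transfer checks mentioned above, the sketch below computes a SHA-256 checksum for a file using Python's standard library; recording this value before transfer and recomputing it afterwards lets you confirm the file arrived unchanged. The file name and expected value in the usage comment are hypothetical.

```python
import hashlib

def sha256_checksum(path, chunk_size=65536):
    """Compute the SHA-256 checksum of a file, reading it in chunks
    so that large research data files do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage: compare against the checksum recorded before transfer
# expected = "ab3f..."  # value stored in your data documentation
# assert sha256_checksum("survey_data.csv") == expected
```

The same approach works with other algorithms supported by `hashlib` (e.g. MD5 is still common in legacy checksum manifests, though SHA-256 is preferable for new projects).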