1 Introduction
Since the emergence of the big data era, its challenges have classically been framed in terms of the four Vs: volume, velocity, variety, and veracity. The last, veracity, is the least studied of the four and is intrinsically linked to data quality issues such as consistency, accuracy, imprecision, and correctness. In the geographic context, geospatial data have grown considerably, driven by the increasing volume of data produced by many sources, such as geo-located sensors, location-based social media, and volunteered geographic information (VGI) [1]. This expanding volume of data in heterogeneous formats raises storage and analysis issues and poses additional challenges for verifying data quality [2]. To ensure better geographic data quality, it is important to acknowledge the imperfect nature of such data, to study it, and to integrate it into the analysis process [3].