As with any large data set, the Digital Chart of the World (DCW) contains errors introduced during map digitizing and subsequent processing. The problem is how to detect these errors, and where possible correct them, since functions like TOPOGRID will not give the desired results with incorrect input. Fortunately, the DCW hypsography layers label their data systematically, allowing for a systematic solution. ArcInfo's vector and raster tools are used to detect errors within the hypsography layers.

Errors within the contours are detected first. This is done by comparing the elevation of each arc with that of its neighbor and determining whether the difference is within a specified tolerance. ArcInfo's raster function EUCALLOCATION forms a polygon zone for each arc, and the border arcs between zones are used to compute the needed differences. Arcs not within the specified tolerance are flagged as errors.

A similar approach is then applied to the points. The contours are again used, but here a point's elevation must fall between the values of its two neighboring contours. The zone boundaries formed by EUCALLOCATION are expanded back to the original arcs' locations with ArcInfo's COSTALLOCATION function. Points with an elevation outside that range are flagged as errors.

Correction of point data can be automated only when the points were generated from another layer. The DCW supplemental point hypsography layer represents the locations and values of collapsed contours; because they are collapsed contours, their elevations can be inferred from the surrounding contours. Contour correction would be much more difficult, and less certain, because more data than just an arc's neighbor is needed. Potential data errors are therefore flagged, and where possible corrected, after processing.
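The two checks above can be sketched as follows. This is a minimal illustration, not ArcInfo code: it assumes the neighbor relationships (from the EUCALLOCATION zone borders) and each point's bounding-contour pair (recovered via COSTALLOCATION) have already been computed, and the data structures and function names here are hypothetical.

```python
def flag_contour_errors(neighbor_pairs, tolerance):
    """Flag arcs whose elevation differs from a neighboring arc by more
    than the allowed tolerance (e.g. one contour interval).
    neighbor_pairs: iterable of (arc_a, elev_a, arc_b, elev_b)."""
    flagged = set()
    for arc_a, elev_a, arc_b, elev_b in neighbor_pairs:
        if abs(elev_a - elev_b) > tolerance:
            flagged.add(arc_a)
            flagged.add(arc_b)
    return flagged

def flag_point_errors(points):
    """Flag points whose elevation falls outside the range spanned by
    the two contours that enclose them.
    points: iterable of (point_id, elev, lower_contour, upper_contour)."""
    flagged = []
    for point_id, elev, lower, upper in points:
        if not (min(lower, upper) <= elev <= max(lower, upper)):
            flagged.append(point_id)
    return flagged

# With a 20 m contour interval, arcs 2 and 3 differ by 60 m and are flagged.
pairs = [(1, 100, 2, 120), (2, 120, 3, 180)]
print(sorted(flag_contour_errors(pairs, tolerance=20)))  # [2, 3]

# Point 11 (95 m) lies between its 80 m and 100 m contours; point 12 does not.
print(flag_point_errors([(11, 95, 80, 100), (12, 150, 80, 100)]))  # [12]
```

The point check is the simpler of the two to automate, which matches the observation above that only point data generated from another layer (the collapsed contours) can be corrected automatically.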
Using the data to detect and correct itself avoids the problems that come with incorporating external ancillary data, such as registration and projection mismatches. Two major limitations remain in this solution. First, raster processing cannot represent vector data exactly. Second, the raster functions used are slow, taking several hours to produce results.
Organizations and agencies that use GISs are requiring more precise "metadata" describing the confidence one may place in stored spatial data. This is true not only for primary data sets but for derived data sets as well. The need for such metadata, and for the quality control (QC) it supports, will increase as GISs are used more often to decide issues that may produce litigation. The approach proposed here allows the user to interactively ascertain the degree of accuracy of the spatial data concerned. It is designed to provide a universal data frame that promotes truly "honest" GIS processing, while permitting a "fuzziness" in GIS data that both the polygon and cell paradigms deny. The Dot-Probability Paradigm (DPP) is a GIS data frame for the storage and manipulation of areal, network, and point spatial data; in addition, the DPP can provide the user with detailed information about the quality of the data contained in a given data set. The DPP project was sponsored by Esri and the Ohio Center for Mapping.