Development of Worldwide Multi-Scale Bathymetric Data

for World Vector Shoreline Plus (WVSPLUS)

 

Kimberley H. Berger

NIMA EPPE Lab L-64

3200 S. Second St.

St. Louis, MO 63118-3399

bergerk@nima.mil

(314)263-4728

 

 

Abstract

The National Imagery and Mapping Agency (NIMA) recently completed the revision and conversion of its World Vector Shoreline Plus (WVSPLUS) digital database product. The project included adding bathymetric data, generalizing original data into smaller scale libraries, implementing design changes, and converting the data to the Department of Defense's (DoD) Vector Product Format (VPF) according to the WVSPLUS product specification. One particularly challenging portion of the project proved to be the creation and attribute population of the bathymetric line and area features. This paper discusses the experimental nature of the team's approach to AML and procedure development and the resulting solution for bathymetric data creation. It also addresses project impacts arising from unique data characteristics and software limitations, as well as quality control issues.

 

 

1.0 Introduction

The National Imagery and Mapping Agency's (NIMA) Enhanced Product Prototyping Environment (EPPE) Laboratory in St. Louis, Missouri recently completed the revision and conversion of its World Vector Shoreline Plus (WVSPLUS) product. This paper contains a discussion of the Department of Defense's (DoD) Vector Product Format (VPF), the World Vector Shoreline (WVS) and WVSPLUS products, and the purpose for and methodology used in the revision of WVSPLUS. It will concentrate on the development of procedures used in the creation and attribute population of bathymetric line and area features. It will also address the impacts of, and resulting solutions for, unique data characteristics and software limitations, as well as quality control issues.

NIMA was formed in 1996 by the merger of several federal mapping and imagery organizations, one being the Defense Mapping Agency (DMA) of which the EPPE Lab was a part. Much of the former DMA's development and support for the VPF initiative and subsequent product prototypes came from the EPPE Lab in St. Louis. The EPPE Lab, now organizationally in NIMA's Systems and Technology Directorate Office of Research, Development, Test & Evaluation, is a multi-platform computer lab staffed with highly skilled technical personnel including physical scientists, photogrammetrists, cartographers, geodesists, computer scientists, and computer specialists. Two EPPE sites exist, one at NIMA St. Louis and one at NIMA Bethesda, each maintaining similar combinations of computer hardware and software. The traditional responsibilities of this group range from the creation of Mapping, Charting, Geodesy and Imagery Analysis (MCG&IA) data and software prototypes to normal and crisis operations support of MCG&IA applications. More recently, the lab supports MCG&IA activities that have emerged from evolving technologies such as Global Geospatial Information and Services (GGI&S) and Defense Modeling and Simulation. Because the main focus of the EPPE is to support MCG&IA product and software development, the lab is regularly upgraded to keep up with MCG&IA computer technology advances. Collaborative relationships have been established with other research and development laboratories in the DoD and intelligence communities and are encouraged in order to promote the interoperability of products and services.

1.1 Vector Product Format (VPF)

In 1989, in cooperation with Environmental Systems Research Institute (Esri) and military mapping components of the United Kingdom, Australia, and Canada, DMA developed VPF in an effort to standardize the vector-based products created and used by the DoD. A prototype database, Digital Chart of the World (DCW), was created as part of this developmental effort. VPF is a standard format, structure, and organization for large geographic databases that is based on a georelational data model. Although conversion software exists that allows VPF data to be converted to several commercial geographic information system (GIS) internal formats, VPF is not considered a transfer or interchange standard. VPF data is intended for direct use in GIS applications and analysis. It is designed to be used with any vector-based geographic data that can be represented using nodes, edges, and faces. VPF's characteristics include:

(1) Sheetless database support;

(2) Neutral format;

(3) Attribute support;

(4) Data dictionary;

(5) Text and metadata support;

(6) Index file support;

(7) Direct access;

(8) Flexible, general-purpose schema;

(9) Data quality;

(10) Feature definitions.

VPF uses tables and indices, organized in directories, to access the spatial and thematic properties of geographic data. Directory levels exist for databases, libraries, and coverages. The database level includes library subdirectories and metadata tables describing the database and defining the geographic extent of each library contained in the database. The library level includes coverage subdirectories and metadata tables which describe the library and define the coverages contained in the library. The coverage level contains feature data in the form of feature tables and primitive tables. If the data has been tiled, the primitive data will be found in a lower tile-level directory. Feature tables contain feature attributes and group together features with similar attribution. Primitive tables hold the coordinate and topologic data. Metadata tables at the coverage level define coded attribute values and the relationships between tables.

VPF provides a flexible framework from which multiple products may be defined. Tiling schemes, if desired, may be defined within product specifications and may vary between libraries in a particular database. VPF provides logically consistent topological relationships even if the data is tiled. Libraries containing tiled coverages are required to have two special untiled reference coverages: Tile Reference (TILEREF) and Library Reference (LIBREF). The TILEREF coverage defines the tiling scheme used in the thematic coverages of a library, whereas the LIBREF coverage contains a thinned version of the major features in the library. Thematic coverages and their features are further defined through product specifications. VPF product feature data may be fully layered into thematic coverages or completely integrated in a single coverage and may contain simple and/or complex features, depending upon the definition in the product specification. VPF also supports various levels of topology. Product specifications will determine these topology levels since they are dependent upon the types of features contained within each coverage. The inherently versatile characteristics of VPF have given DMA, and now NIMA, the ability to define a wide range of products including Vector Map (VMap Levels 0, 1 and 2), Urban Vector Map (UVMap), Vector Product Interim Terrain Data (VITD), Digital Nautical Chart (DNC), Digital Topographic Data (DTOP), and WVSPLUS.

1.2 World Vector Shoreline (WVS) and World Vector Shoreline Plus (WVSPLUS)

WVS is a digital 1:250,000 scale NIMA product in ASCII coded Standard Linear Format (SLF) that has been in existence since the late 1980s. It contains shorelines derived from Digital Landmass Blanking (DLMB) data and supplemented by Operational Navigation Charts (ONCs) and Tactical Pilotage Charts (TPCs). It also contains international boundaries and country names derived from paper products including ONCs, TPCs, and Joint Operation Graphics (JOGs).

WVSPLUS, on the other hand, is a digital, multi-scale NIMA product in VPF that contains shorelines, international boundaries, maritime boundaries, and bathymetric data. The first complete version of WVSPLUS was released in January of 1995. Before conversion to VPF, the original WVS data in SLF was supplemented by the Naval Oceanographic Office's (NOO) Digital Bathymetric Database (DBDB5), offshore territorial boundary information obtained from the DoD Maritime Claims Reference Manual, and new political boundaries inserted for the countries of the former Soviet Union. With these changes made in SLF, the data was translated to VPF using conversion software developed under contract with the Naval Command Control and Ocean Surveillance Center (NRaD). This initial conversion method introduced topologic errors requiring a major revision of the VPF data itself. The revision project began in August of 1995. In addition to topologic error correction, however, the agency had expanded prototype objectives for the revision project. Bathymetric area feature data was to be added to three of the smaller-scale libraries using DBDB5. Database design improvements were also introduced, including additional thematic indices and the implementation of an adaptive tiling scheme based on data density in one library.

The revised WVSPLUS is one database comprised of six libraries. These libraries, WVS250K, WVS001M, WVS003M, WVS012M, WVS040M, and WVS120M, cover the world at 1:250,000, 1:1,000,000, 1:3,000,000, 1:12,000,000, 1:40,000,000, and 1:120,000,000 scales respectively. Each library contains a TILEREF and a LIBREF coverage and from four to seven other coverages (see Table-1).

 

Table-1: WVSPLUS Coverage Names

WVS250K library coverages:

Coastlines/Countries/Oceans (COC)

Maritime Boundaries (MAB)

Maritime Boundaries Supplemental (MBS)

Names Placement (GAZETTE)

Tile Reference (TILEREF)

Library Reference (LIBREF)

Data Quality (DQY)

WVS001M - WVS120M library coverages:

Coastlines/Countries/Oceans (COC)

Bathymetric (BAT)**

Names Placement (GAZETTE)

Tile Reference (TILEREF)

Library Reference (LIBREF)

** Bathymetric coverage exists only in the WVS003M, WVS012M, and WVS040M libraries.

 

With the exception of the bathymetric (BAT) coverage, topologic errors found in the VPF coverages were corrected using the same multi-step process. This process included importing coverages tile-by-tile into ArcInfo's internal format, automatically identifying errors and performing edits, edgematching and appending individual ArcInfo coverages into worldwide ArcInfo coverages, and exporting the data back into VPF. Due, in part, to edgematching problems at the 180° line, the largest scale bathymetric coverage did not use the method described above, but was regenerated from DBDB5 instead. The approach taken for correcting topologic errors found in coverages existing in more than one library (i.e. existing at multiple scales) was to make the corrections once, at the largest scale, and then to perform iterative generalizations within ArcInfo to create the smaller scale coverages. This approach not only had the advantage of reducing the amount of importing required, but also used a generalization method with a known and trusted algorithm.

1.3 Project Resources

As the entity responsible for the revision of the WVSPLUS data, the EPPE Lab offered a wide range of resources. Although the main WVSPLUS team was limited to four people, the diverse knowledge and skills of many lab personnel played an important role in the project's success. The WVSPLUS team consisted of one project leader and three VPF specialists who were responsible for generating and inspecting the WVSPLUS prototype. The team also received valuable quality control support from personnel at the EPPE Lab's Bethesda site. The pertinent experience/background of the team's members included VPF Military Standard development, VPF product prototyping, VPF product generation and product validation, a firm understanding of the concepts of GIS, and GIS software experience.

The EPPE's integrated network of computing workstations and peripherals in St. Louis comprises IBM-PCs, Macintoshes, SUNs, Hewlett-Packards, and Silicon Graphics. Several SUN SPARCserverMP workstations were used for the WVSPLUS revision project, each with 120 megabytes of memory, two to four CPUs, and SUN-OS version 4.1.3. Productivity increased greatly when a SUN SPARCserver1000E, with 256 megabytes of memory, four SuperSPARC CPUs, and SUN SOLARIS version 2.4, was introduced later in the project.

Commercial-off-the-Shelf (COTS) software available in the EPPE Lab includes ArcInfo, ArcView, ERDAS IMAGINE, ADOBE PHOTOSHOP, SPYGLASS, and ORACLE. Because its VPFIMPORT and VPFEXPORT capabilities are proven tools in the creation of VPF products, ArcInfo was chosen as the main software environment for the WVSPLUS revision project. ArcInfo versions 7.0.2 through 7.0.4 were used over the course of the project. The cooperative relationship between NIMA's EPPE Lab and Esri has been mutually beneficial in the enhancement and revision of ArcInfo's VPFEXPORT capabilities.

Additional software employed in the WVSPLUS revision project included DMA MC&G Utility Software Environment (DMAMUSE) and ERDAS IMAGINE. DMAMUSE was first used in the project to read the DBDB5 bathymetric data into its internal format and output it in ArcInfo LATTICE version 5.0 format. DMAMUSE was later used in combination with ERDAS IMAGINE to convert DMA Compressed Arc Digitized Raster Graphic (CADRG) data to ArcInfo GRID format. Data converted from CADRG served as a raster backdrop for digitizing the new country boundaries in the area of the former Yugoslavia. Because ArcInfo version 7.0.4 had no capability to read or convert DBDB5 or CADRG, DMAMUSE served as a valuable tool in these conversion processes.

2.0 Creation of Bathymetric Data Within WVSPLUS

A particularly challenging portion of the project was the creation and attribute population of the bathymetric line and area features. The bathymetric data in the initial WVSPLUS prototype contained only bathymetric contours possessing inherent topologic problems. Part of the expanded objectives of the revision project was to create bathymetric area features, attributed with appropriate high and low values, between the contours. In an effort to avoid the task of importing, correcting and merging each tile of bathymetric data, a decision was made to take advantage of ArcInfo's capabilities and simply recreate the bathymetry from DBDB5. As will be revealed in this section, the creation of bathymetric data proved to be a more complicated task than had been anticipated.

2.1 Creation of Bathymetric Contours

DBDB5, the source of bathymetric data for WVSPLUS, is a NIMA gridded bathymetric database product developed by the Naval Oceanographic Office (NOO). Depth values are in uncorrected meters for each five minutes of latitude and longitude worldwide. Because ArcInfo cannot directly read or import DBDB5 data, conversion to an appropriate format was required. The Raster Importer module of DMAMUSE is capable of importing and exporting a variety of formats and provided the means for conversion to an ArcInfo lattice version 5.0 file. This file was then converted to a lattice recognized by ArcInfo using the LATTICE60 command.

Several alternative procedures for creating bathymetric contours from a lattice were investigated before arriving at one the team considered acceptable. In some cases, procedures were considered sound but later found to be flawed. This section discusses the various procedures considered along the way and describes the chosen method.

2.1.1 Generalization and Weed Tolerances Within ArcInfo

Before reviewing the various methods investigated for creating bathymetric contours, a discussion of the generalization algorithm used in ArcInfo version 7.0.4 is necessary. The four elements of generalization are (1) simplification - elimination of unwanted detail; (2) classification - grouping of data; (3) symbolization - graphic coding of classes; and (4) induction - application of inference. The element of generalization discussed and referred to in this section is simplification. Simplification is the determination of the important characteristics of the data, the elimination of unwanted detail, and the retention of the important characteristics (Robinson et al., p. 125).

ArcInfo utilizes a standard Douglas-Peucker generalizing algorithm with a user-defined weed tolerance. Weed tolerances used for each scale of the WVSPLUS project were based upon comparison of the generalization results with cartographic products of the same scale. The generalizing algorithm runs a straight trend-line between the start and end nodes of each arc. It then calculates the perpendicular distance from that trend-line to each vertex along the arc. Those vertices whose perpendicular distance falls below the weed tolerance are deleted. Two new trend-lines are created from the endpoints to the remaining vertex with the longest perpendicular distance from the original trend-line. Again, vertices are deleted if their distance from the trend-line is below the set tolerance. This iterative process continues until there are no vertices less than the set tolerance from the trend-line. The Douglas-Peucker generalizing algorithm is considered by many cartographers to be the most accurate simplification algorithm available (Jenks, p. 30). This algorithm is not only used in ArcInfo's GENERALIZE command, but is also incorporated, through weed tolerance options, into commands that create contours from data in other formats (e.g. LATTICECONTOUR and TINCONTOUR).
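
To make the simplification step concrete, the following Python sketch implements the recursive Douglas-Peucker logic described above. It is an illustration only, not the ArcInfo implementation; the weed_tolerance argument plays the same role as the user-defined weed tolerance in GENERALIZE, LATTICECONTOUR, and TINCONTOUR.

import math

def perpendicular_distance(pt, start, end):
    # Distance from pt to the trend-line through start and end.
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:                     # degenerate arc: start == end
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(vertices, weed_tolerance):
    # Simplify a list of (x, y) vertices, always keeping the start and end nodes.
    if len(vertices) < 3:
        return list(vertices)
    start, end = vertices[0], vertices[-1]
    # Find the vertex farthest (by perpendicular distance) from the trend-line.
    index, dist = max(
        ((i, perpendicular_distance(v, start, end))
         for i, v in enumerate(vertices[1:-1], start=1)),
        key=lambda t: t[1])
    if dist < weed_tolerance:
        return [start, end]                     # all interior vertices are weeded
    # Keep the farthest vertex and repeat on the two halves it defines.
    left = douglas_peucker(vertices[:index + 1], weed_tolerance)
    right = douglas_peucker(vertices[index:], weed_tolerance)
    return left[:-1] + right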

2.1.2 Trial 1: Filter and Latticecontour

The first attempt used a low pass filter on the worldwide lattice to remove any anomalies that might exist in the data and then used the LATTICECONTOUR command to create a bathymetric contour coverage. There were several problems in the resulting coverage: (1) the contours were not edgematched at the 180° longitude line, (2) bathymetric contours were created across areas where landmasses occurred in the Coastlines/Countries/Oceans (COC) coverage, and (3) great variation existed in the vertex density of arcs between the poles and the equator.

The solution to the 180° line problem seemed, at first, to be to simply snap the arcs together. The coverage, however, had its westernmost limit at -180° and its easternmost limit at 180°. After many attempts to view these extremes of the coverage side-by-side, it became evident that this was a software limitation and would not be possible. Even had it been possible, however, the edge interpolation effects had created such different contours on either side of the 180° line that edgematching would have been nearly impossible. The final solution to this problem was to use a lattice that had overlap at the 180° line (i.e. from -190° to +190°). The resulting coverage could then be clipped at the 180° line on both sides and be assured of edgematching correctly since there was no longer any edge interpolation effect taking place.

Unfortunately, the solutions for the second and third problems could not be automated as was done for the first. It should be noted that, although the irregularity in vertex density did exist in the output from this method, it was not discovered until later when a variation of this trial was investigated. The team concentrated its next efforts on finding a solution to the bathymetric contours crossing landmasses in the COC coverage.

2.1.3 Trial 2: Exploring TIN

To overcome the mismatch between bathymetric contours and land, the team explored the possibility of creating the contours from a TIN instead of a lattice. An initial TIN was created from a lattice over a small test area using the LATTICETIN command. The TINARC command then created a point and a line coverage for input to the next TIN. The CREATETIN command generated a new TIN using these point and line coverages as input and also used a "hardreplace" value of zero for triangles over corresponding landmasses in the COC coverage. Finally, a contour coverage was produced by employing TINCONTOUR. This command applies a subdivision degree and a weed tolerance in its algorithm, both of which affect the detail of output contour coverages. Unfortunately, although a hardreplace had been performed for triangles representing land, the mismatch was still not resolved. This was due to the weed_tolerance specified in the TINCONTOUR command. Although appropriate for the desired scale (1:3 million), it would not allow TINCONTOUR to create bathymetric contours around landmasses if doing so violated the weed_tolerance. The team abandoned the TIN alternative after discussing the issue with a TIN expert from Esri. He confirmed that there was no automated method to create the bathymetric contours that both avoided the land and were generalized appropriately. He also expressed concern about data degradation due to multiple format conversions and suggested that a LATTICECONTOUR approach would be the team's best option.

2.1.4 Trial 3: Resample, Replace and Latticecontour

Still concentrating on finding a solution to bathymetric contours crossing landmasses in the COC coverage, the team used a variation of Trial 1. In this case, no filter was used on the initial lattice. Two measures were taken in an attempt to "force" the creation of bathymetric contours around, instead of over, land: (1) resampling, or densifying, the lattice (LATTICERESAMPLE) and (2) replacing the lattice values with zero over corresponding landmasses (LATTICEREPLACE).

The resampling measure was performed in the hope that a denser lattice would produce smoother contours and aid in creating the bathymetric contours around the land. Incorporating this step in the procedure, however, had just the opposite effect resulting in very short, fragmented and blocky contours. LATTICECONTOUR creates individual arcs before applying a default or user-specified weed tolerance. Due to using a resampled lattice as input, these interim arcs were created with extremely dense vertices. Individual arcs were short because of the density of the vertices and ArcInfo's limitation on the maximum number of vertices allowable per arc. The arcs then became fragmented in creating the LATTICECONTOUR output when the weed tolerance was applied to the interim arcs. The blocky nature of the resulting contours may also have been caused by the fact that a low-pass filter had not been processed against the initial lattice as was done in Trial 1. It was the combination, therefore, of ArcInfo's vertex limitation for arcs, using a resampled lattice, and the weed tolerance used in the LATTICECONTOUR command that contributed to the undesirable characteristics of the contours.

The sole purpose for incorporating the LATTICEREPLACE command in this procedure was to "force" the creation of bathymetric contours around land by first replacing land lattice values. Unfortunately, the weed tolerance specified in LATTICECONTOUR worked to undo progress that was made by performing the LATTICEREPLACE. There was no method within ArcInfo to apply a weed tolerance appropriate for 1:3 million scale features which did not cause bathymetric contours to cross land. It was determined, instead, that a partially interactive solution to this problem would be necessary.

Occurrences of bathymetric contours in the BAT coverage crossing landmasses in the COC coverage were automatically identified within ARCPLOT using the following method. All bathymetric contours in the BAT coverage and all landmasses in the COC coverage were selected. The RESELECT command was then issued using the "overlap" option, thus identifying all bathymetric contours overlapping land. In order to create a file of these occurrences, output was redirected to a file using the LISTOUTPUT and LIST commands. This file was then printed out to be used as a guide in interactively editing the data within ARCEDIT. This process could then be repeated until no bathymetric contours crossed land.
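
Conceptually, this overlap check is a spatial filter over two feature classes. The following Python sketch, using the shapely package purely for illustration (the actual work was done with ARCPLOT's RESELECT as described above, and the coordinates below are invented), captures the idea of selecting contours that intersect land and writing their ids to a file used as an editing guide.

from shapely.geometry import LineString, Polygon

land = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])          # a landmass face
contours = {
    101: LineString([(-5, 5), (15, 5)]),    # crosses the landmass
    102: LineString([(-5, -5), (15, -5)]),  # stays offshore
}

# RESELECT "overlap"-style filter: keep contours that intersect land, then
# write their ids to a file used as a guide for interactive ARCEDIT edits.
offenders = [cid for cid, line in contours.items() if line.intersects(land)]
with open("contours_on_land.txt", "w") as f:
    f.write("\n".join(str(cid) for cid in offenders))
print(offenders)   # -> [101]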

2.1.5 Trial 4: Final Solution for Creating Bathymetric Contours

With solutions in hand for both the 180� line problem and the bathymetric contour/land mismatch, it was at this point that the team discovered the magnitude of the irregularity in vertex density between the equator and the poles. Because longitude lines converge at the poles, ground distance corresponding to a decimal degree decreases upon moving from the equator towards the poles. The input lattice was in unprojected decimal degrees. The weed tolerance enforced in LATTICECONTOUR, therefore, corresponded to a much smaller ground distance at the poles than at the equator, resulting in arcs with extremely high vertex density at the poles. An equal area projection would be necessary for consistent vertex density results from the equator to the poles.
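
A rough calculation illustrates the problem. Assuming a spherical Earth of radius 6371 km and a hypothetical 0.01-degree weed tolerance (both values chosen only for this example), the ground distance spanned by a tolerance expressed in decimal degrees of longitude shrinks rapidly toward the poles:

import math

EARTH_RADIUS_KM = 6371.0   # spherical approximation, for illustration only

def km_per_degree_longitude(latitude_deg):
    # Ground distance spanned by one degree of longitude at a given latitude.
    return math.radians(1.0) * EARTH_RADIUS_KM * math.cos(math.radians(latitude_deg))

# The same 0.01-degree weed tolerance covers very different ground distances:
for lat in (0, 45, 70, 85):
    print(f"lat {lat:2d}: 0.01 deg of longitude ~ "
          f"{0.01 * km_per_degree_longitude(lat):.2f} km")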

In order to effectively create and generalize the 1:3 million scale bathymetric contours, the data was required to be projected in an equal area, or near equal area, projection that retained the shape of features. A transverse Mercator (TM) projection was chosen due, in part, to its conformal, or shape-retaining, characteristic. Since scale exaggeration increases moving away from the central meridian, the TM projection is useful for only a certain zone along either side. In order to maintain an equal area characteristic, ArcInfo's ARCEDIT module suggests the use of bands no greater than 40° wide. For the purposes of creating and generalizing the bathymetric contours, the worldwide lattice was subdivided into eighteen overlapping 25° longitude bands, each of which would then be projected to TM. The smaller TM zones not only were more accurate in terms of area, but also provided more manageable units to process. Two additional small longitude bands were also created from the eastern and westernmost extremes of the worldwide lattice. These would not be projected to TM because it is possible to lose features when projecting to TM at the 180° line.

Upon attempting to project the individual lattice bands into TM, the team discovered a bug within the ArcInfo software. The PROJECT command was not capable of projecting the lattice from unprojected decimal degrees to TM. This deficiency was given an incident number and was eventually submitted as a bug report. In order for the weed tolerance in LATTICECONTOUR to work consistently from the equator to the poles, the input lattice was required to be in TM. Therefore, a practical alternative was still necessary to project the lattice into TM. Such an alternative was provided by technical experts at Esri. ArcInfo's GRIDPOINT command was used to create a point coverage from each of the eighteen interior lattices. Each point coverage was then projected to TM. TM lattices were then created from the newly projected point coverages using the POINTGRID command.
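
In modern terms, the work-around amounts to projecting each lattice posting as a point and then re-gridding the projected points. The following hypothetical Python sketch, using the pyproj package (not part of the original ArcInfo procedure; the TM parameters, including the 15° east central meridian and the sample postings, are invented for illustration), conveys the idea:

from pyproj import Transformer

# Hypothetical TM definition for one 25-degree-wide band centered on 15 E.
tm_band = "+proj=tmerc +lat_0=0 +lon_0=15 +k=0.9996 +x_0=500000 +y_0=0 +datum=WGS84"
to_tm = Transformer.from_crs("EPSG:4326", tm_band, always_xy=True)

# Each 5-minute posting becomes a point (lon, lat, depth) that is projected
# individually; the projected points would then be re-gridded to form the
# TM lattice (the role POINTGRID played in the actual procedure).
postings_dd = [(14.0, 52.0, -3200.0), (16.5, 51.5, -3400.0)]
postings_tm = [(*to_tm.transform(lon, lat), depth) for lon, lat, depth in postings_dd]
print(postings_tm)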

With all lattices now in the appropriate projection, the LATTICECONTOUR command was used to create individual bathymetric contour coverages at a specified contour interval of 200 meters. The WVSPLUS product specification, however, calls for only the following bathymetric contours: 200, 400, 600, 1000, 2000, 4000, 6000, and 8000 meters. An AML was written to delete all contours not indicated by the product specification and change the remaining values from negative to positive. Although negative values were logical relative to mean sea level, the product specifications called for bathymetric contours with positive values. This meant that bathymetric contours with higher values actually had lower elevation, or more depth. Likewise, contours with low values had higher elevation, or less depth (see Figure 1). The individual coverages were then clipped to eliminate the overlap that existed between the twenty longitude bands.
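
A minimal sketch of that filtering rule, written in Python for illustration only (the actual step was performed by an AML), shows how the 200-meter-interval output is reduced to the depths called for by the specification and converted to positive values:

# The contour values retained by the WVSPLUS product specification (meters).
SPEC_DEPTHS = {200, 400, 600, 1000, 2000, 4000, 6000, 8000}

def filter_and_sign_contours(contour_values):
    # Drop contours not called for by the specification and convert negative
    # depths (relative to mean sea level) to the positive values required.
    kept = []
    for value in contour_values:
        depth = abs(value)           # e.g. -4000 (below MSL) becomes 4000
        if depth in SPEC_DEPTHS:
            kept.append(depth)
    return kept

# Example with hypothetical LATTICECONTOUR output at a 200 m interval:
print(filter_and_sign_contours([-200, -800, -1000, -3800, -4000]))
# -> [200, 1000, 4000]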

It was at this point that the team determined there to be a problem with the weeding capability of the LATTICECONTOUR command. As an experiment, one of the new contour coverages was generalized using the same weed tolerance as was used in the LATTICECONTOUR command. If all was working properly, no vertices should have been deleted. This, however, was not the case, as many vertices had been deleted. This, too, was given an incident number and submitted as a possible bug. In order to work around this problem, the team incorporated the extra step of generalizing the bathymetric contour coverages after they had been clipped.

 

 

 

 

Figure 1: Bathymetric Profile

 

The two major steps in generalizing the bathymetric contours in the longitudinal bands were: (1) generalizing arcs and (2) correcting intersection errors created by the generalization of arcs. Although these errors could be detected and fixed in any projection, for the purposes of the revision project, the generalized coverage was projected back to decimal degrees first. Intersection errors in ArcInfo occur when two arcs cross with no node at their intersection, when one arc loops back upon itself, or when two arcs are directly on top of each other. These errors can be detected and listed using the ArcInfo INTERSECTERR command and then corrected in the ARCEDIT module.

For each longitude band, bathymetric contours crossing landmasses in the COC coverage were identified using ARCPLOT's RESELECT command as described in the previous section. Bathymetric contours were modified within ARCEDIT to eliminate their conflict with land. The RESELECT command within ARCPLOT was used again to assure that none had been missed. It should be noted that, although this procedure did identify all occurrences of overlap, it also tended to identify some bathymetric contours which had no overlap with land in the COC coverage.

One final type of edit was performed before joining the individual coverages together. While moving bathymetric contours off of landmasses in ARCEDIT, the COC coverage was displayed as a background coverage. The team noticed many instances where the first bathymetric contour around an island was not the 200 meter contour. Because the DBDB5 contained values only at five minute postings, islands existing in between were sometimes missing the necessary surrounding contours. These instances were identified visually by displaying the land in one color and displaying only the 200 meter contours in another color. New ARCEDIT windows were created where surrounding contours were missing. In these cases, contours were then added and attributed appropriately to make the data cartographically correct.

Once generalized and edited, the twenty longitude bands were systematically edgematched and appended to create one combined coverage. Edgematching was accomplished by establishing an editcover and snapcover in ARCEDIT and employing commands such as SNAPFEATURE, SNAPPING, and SNAP. The ArcInfo APPEND command together with DISSOLVE acted to physically join and merge the twenty separate coverages in BAT into one world-wide coverage.
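
The edgematching step can be pictured as snapping arc endpoints of one band onto nearby endpoints of the neighboring band. The following Python sketch is a toy analogue of that behavior, not the ARCEDIT SNAPFEATURE/SNAPPING/SNAP commands themselves; the coordinates and tolerance are invented.

import math

def snap_endpoints(edit_arcs, snap_arcs, tolerance):
    # Move the endpoints of arcs in the edit coverage onto the nearest
    # endpoint of the snap coverage when within the snap tolerance.
    snap_points = [arc[0] for arc in snap_arcs] + [arc[-1] for arc in snap_arcs]
    snapped = []
    for arc in edit_arcs:
        new_arc = list(arc)
        for idx in (0, -1):
            x, y = new_arc[idx]
            nearest = min(snap_points, key=lambda p: math.hypot(p[0] - x, p[1] - y))
            if math.hypot(nearest[0] - x, nearest[1] - y) <= tolerance:
                new_arc[idx] = nearest
        snapped.append(new_arc)
    return snapped

band_a = [[(179.98, 10.0), (179.5, 10.4)]]
band_b = [[(180.00, 10.0), (180.6, 10.2)]]
print(snap_endpoints(band_a, band_b, tolerance=0.05))
# -> [[(180.0, 10.0), (179.5, 10.4)]]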

2.1.6 Creation of 1:12 Million and 1:40 Million Scale Bathymetric Contours

Bathymetric contours for the 1:12 million and 1:40 million scale coverages were derived from the clipped, clean, and edited 1:3 million scale longitude band coverages. Each interior band was reprojected to TM and generalized using a weed tolerance appropriate for the scale. The small eastern and westernmost bands were generalized using the same weed tolerance converted to decimal degrees at 45 degrees latitude. After projecting back to geographics, all intersection errors and bathymetric contours crossing land in the COC coverage were identified and corrected using the same methods as were used for the 1:3 million scale data. Individual longitude bands were then edgematched and appended to create two additional worldwide coverages.
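
That conversion of a ground-distance tolerance to decimal degrees can be approximated as follows (a rough sketch assuming roughly 111,320 meters per degree of longitude at the equator; the 1 km tolerance value is hypothetical):

import math

def meters_to_degrees_at_lat(meters, latitude_deg=45.0):
    # Approximate decimal-degree equivalent of a ground-distance tolerance
    # measured along a parallel at the given latitude.
    return meters / (111320.0 * math.cos(math.radians(latitude_deg)))

print(meters_to_degrees_at_lat(1000.0))   # hypothetical 1 km tolerance -> ~0.0127 deg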

2.2 Creation of Bathymetric Areas

With the creation of bathymetric contours complete, the team turned to the task of creating and attributing bathymetric area features between the contours. Curve Value Low (CVL) and Curve Value High (CVH) attributes were required for each bathymetric area feature to reflect the Depth Curve (CRV) attribute values of the bounding contours. The methods explored for accomplishing this task are described in this section.

2.2.1 Trial 1: The Latticespot Method

A procedure listed in the Fifteenth Annual Esri User Conference Workshop Proceedings presented a method to derive and attribute area features from a lattice database. Because the source data for the bathymetric contours was a lattice, the team was eager to test this method. After using LATTICECONTOUR to generate a contour coverage, polygon topology was built, creating area features between the bathymetric contours. Polygon label points were added to each polygon using CREATELABEL and then were PUT into a separate point coverage. The LATTICESPOT command was then used to interpolate depth values from the original lattice for every point in the point coverage. The newly attributed label points were then put back into the polygon coverage where they would be used to help determine the depth range of each area feature. Area feature CVH and CVL attributes were added as items and then attributed according to the product specified range of depth values in which the interpolated depth values fell.
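
The range-binning step of this method can be sketched in Python as follows. This is an illustration only, under the assumption that the depth ranges follow the specified contour values, with 10500 meters as the ceiling of the deepest range (the same value used later in batpoly.aml):

import bisect

# Depth-range boundaries from the product specification (meters, positive down).
RANGE_BOUNDS = [0, 200, 400, 600, 1000, 2000, 4000, 6000, 8000, 10500]

def depth_range(interpolated_depth):
    # Return the (CVL, CVH) pair whose range contains the interpolated depth.
    i = bisect.bisect_right(RANGE_BOUNDS, interpolated_depth)
    i = min(max(i, 1), len(RANGE_BOUNDS) - 1)
    return RANGE_BOUNDS[i - 1], RANGE_BOUNDS[i]

print(depth_range(4731.0))   # -> (4000, 6000)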

Although this procedure had been used successfully by other ArcInfo users, it proved not to be robust enough for the purposes of the WVSPLUS revision project. CVH and CVL values were attributed incorrectly for many areas due to: (1) the interpolation procedure itself and (2) the fact that additional bathymetric contours had been added to the originally generated contour coverage where necessary around small islands. These characteristics caused many interpolated depth values to fall outside of the ranges dictated by the associated depth contours.

2.2.2 Trial 2: Identification and Re-attribution of Incorrect Area Features

The teamís first idea about how to proceed was, for each depth range, to identify and correct only the incorrectly attributed areas. Unfortunately, automatically finding the areas in error proved to be impossible. An automated way was, however, found to identify arcs whose CRV value contradicted the CVL and CVH of the associated area feature. These arcs were identified in ARCEDIT by: (1) selecting areas in a particular depth range, (2) using SELECTPUT ARCS to automatically select the arcs making up each area within that depth range, (3) changing the edit feature to arcs, and (4) reselecting arcs whose curve value was not equal to either the CVL or CVH of the range in question. At this point, if any were selected, either the arc itself was misattributed or the depth range of the area was incorrect. By finding these arcs, the team had hoped there would be an automatic method to determine the problematic areas. As soon as the edit feature was changed back to polys, however, all polygons within the range of interest were selected again. Although this procedure could not solve the problem, it would be used later as an effective quality control mechanism.

2.2.3 Trial 3: Attribution of All Area Features by AML

It quickly became evident that a completely new procedure for attributing all area features was necessary. Because the team intended to process each area individually, an automatic method was essential. ArcInfo's Arc Macro Language (AML) provided the team with a programming language to automate this intensive task. This section describes the AML, batpoly.aml, that was used to process each area feature and discusses the limitations of the final code. It also details some of the complications faced and solutions found along the way. See Appendix A for the complete AML code for batpoly.aml.

Before running the AML, the shoreline was added from the COC coverage as a bathymetric contour with a CRV of zero. This preprocessing step helped to correctly attribute those areas that were in the 0-200 meter and 200-400 meter ranges. For each individual area feature, batpoly.aml used ARCEDIT's SELECTPUT ARCS to examine the arcs making up the polygon. LISTUNIQUE was used to determine the number of unique CRV values for the arcs that made up each polygon. It was at this point that the team realized the LISTUNIQUE command within ARCEDIT does not work on only the selected set of features, as would be necessary. Instead, it works on all features of the current edit feature type. The LISTUNIQUE command in ARCPLOT, however, will work only on the selected set of features. ARCPLOT's LISTUNIQUE, therefore, was used to determine if arcs with one or two unique CRV values comprised each polygon. From within ARCEDIT, selected arcs were passed to ARCPLOT using the SELECTPUT ARCPLOT command. Once the unique values had been determined, processing returned to ARCEDIT by quitting ARCPLOT. The AML ended, returning a message that an error had been found in the data, if more than two unique values existed. Polygons consisting of arcs with two unique values were referred to as "donuts", whereas those consisting of arcs with only one unique value were referred to as "donut holes".

Donut polygons were the easier of the two types of area features to process. Using the AML MAX function, the maximum of the two CRV values was determined. After changing the edit feature back to polys, the CVL and CVH were calculated based on the maximum CRV. For example, if the maximum value returned was 6000 meters, the CVL and CVH values were calculated as 4000 and 6000 meters respectively.
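
The donut rule can be summarized in a few lines of Python (a sketch of the logic, not the AML itself):

SPEC_DEPTHS = [0, 200, 400, 600, 1000, 2000, 4000, 6000, 8000]

def donut_range(crv_a, crv_b):
    # For a "donut" bounded by two distinct contour values, the deeper
    # (maximum) contour becomes the CVH and the next shallower specified
    # depth becomes the CVL.
    cvh = max(crv_a, crv_b)
    cvl = SPEC_DEPTHS[SPEC_DEPTHS.index(cvh) - 1]
    return cvl, cvh

print(donut_range(4000, 6000))   # -> (4000, 6000)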

Processing donut holes proved to be more challenging. If the donut hole's unique value, or "only_crv", was zero, the area feature was a landmass and required no further processing. Otherwise, the CRV values of arcs comprising the polygon(s) adjacent to the donut hole required analysis to determine the donut hole's CVL and CVH. A donut hole with a 400 meter only_crv, for instance, could either have a 200-400 or 400-600 meter depth range depending on the values of the adjacent polygon's arcs. If the adjacent polygon's arcs are 200 and 400 meters, the range of the donut hole is 400-600 meters. If, on the other hand, the adjacent polygon's arcs are 400 and 600 meters, the range of the donut hole is 200-400 meters.

Because there is no method of selecting adjacent polygons within ARCEDIT, ARCPLOT's ASELECT command was utilized with the POLY and ADJACENT options. Selected features were passed from ARCEDIT to ARCPLOT once again using SELECTPUT ARCPLOT. Once the adjacent polygon had been added to the selected set, processing returned to ARCEDIT by quitting ARCPLOT. The SELECTGET command within ARCEDIT then retrieved the new selected set from ARCPLOT. The AML again used ARCEDIT's SELECTPUT ARCS to examine the arcs making up the selected polygons. Those arcs with a CRV equal to the only_crv value were unselected, leaving only one unique CRV value known as the adjacent_crv. It was this CRV value, ascertained using ARCPLOT's LISTUNIQUE command, that finally determined the CVL and CVH of the donut hole. CVL and CVH values are calculated based on whether the only_crv value is greater or less than the adjacent_crv value.
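
A sketch of the donut-hole rule in Python (illustrative only; the 10500-meter ceiling for the deepest range comes from batpoly.aml):

SPEC_DEPTHS = [0, 200, 400, 600, 1000, 2000, 4000, 6000, 8000]

def donut_hole_range(only_crv, adjacent_crv):
    # A "donut hole" bounded by a single contour value is a depression if it
    # is deeper than the adjacent polygon's remaining contour value, and a
    # rise if it is shallower.
    i = SPEC_DEPTHS.index(only_crv)
    if only_crv > adjacent_crv:                      # depression
        cvl = only_crv
        cvh = SPEC_DEPTHS[i + 1] if i + 1 < len(SPEC_DEPTHS) else 10500
    else:                                            # rise
        cvl = SPEC_DEPTHS[i - 1]
        cvh = only_crv
    return cvl, cvh

print(donut_hole_range(400, 200))   # -> (400, 600): a depression
print(donut_hole_range(400, 600))   # -> (200, 400): a rise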

The preceding discussion describes "normal" donut holes, where the number of unique adjacent_crv values equals one. Two types of special case donut holes existed in the data which required special treatment (see Figures 2 and 3): (1) donut holes touching donuts at one node, and (2) donut holes inside of donut holes. In creating the bathymetric contours using LATTICECONTOUR, contours were created such that some donut holes and donuts touched at one node, a cartographically unacceptable result. This situation was

 

 

Figure 2: Special Case - "Donut-Hole" touching "Donut" at One Node

 

discovered when a LISTUNIQUE of the adjacent curve values returned more than one value. In order to keep the AML running, these types of donut holes were simply marked with CVLs of -444 for later editing. After the AML was finished processing, each polygon with a CVL of -444 was edited so that it no longer touched arcs of the adjacent polygon. After rebuilding the polygon topology, a file was created listing the cover-ids of the newly edited polygons. This file was then accessed by correct.aml, a subset of batpoly.aml, to calculate the polygons' CVL and CVH values. A more efficient AML

 

 

Figure 3: Special Case - "Donut-Hole" within a "Donut-Hole"

 

was attempted using cursors, instead of a file, to step through these polygons. The team quickly realized, however, that using cursors was not an option since it is not possible to change the edit feature without getting out of cursors. See Appendix B for the complete AML code for correct.aml.

The second kind of special case polygons, donut holes inside of donut holes, was discovered when a LISTUNIQUE of the adjacent curve values returned nothing. In other words, when arcs whose CRV values equal the only_crv value were unselected, no arcs remained. This situation was possible because donut holes had been defined by the AML simply as polygons comprised of arcs with only one unique CRV value. In order to correctly attribute this type of polygon, it would be necessary to examine the polygon adjacent to the donut hole's adjacent polygon, and possibly even further out. Without determining the appropriate adjacent CRVs, it was not possible to resolve if the donut hole was a rise within a depression or a depression within a rise. Because this situation occurred relatively infrequently (less than 200 occurrences in approximately 20,000 polygons in the 1:3 million scale coverage) and was difficult to program, the team decided to treat them all as rises within depressions and flag them to be verified when the AML was finished. In verifying the attribution of these donut holes, less than five percent required attribution changes.

With the batpoly.aml finished and the postprocessing complete for both of the special types of polygons, the team used the procedure described in the previous section to validate the correct attribution of the bathymetric areas (see qc.aml in Appendix C). The final task remaining before exporting the bathymetric coverage to VPF was to remove the shoreline and rebuild polygon topology. The procedures and AMLs described in this section were applied to all three scales of bathymetric data.

3.0 Conclusions

The sheer magnitude and complexity of the WVSPLUS revision project dictated the requirement for team members with a broad range of cartographic, VPF, and ArcInfo knowledge and skills. A firm grasp on all of these elements was a key factor in the success of the project. This was especially true concerning the experimental nature of the team's approach to AML and procedure development in creating bathymetric data. Cartographic knowledge was essential in understanding the effects of generalizing data in different projections. It was also important in creating cartographically acceptable bathymetric contours that did not cross landmasses in the COC coverage and were logical relative to each other and to neighboring land. The team's fundamental ArcInfo proficiency proved to be an indispensable asset. Although project requirements pushed the envelope of ArcInfo's capabilities, the team devised and executed successful solutions to software limitations. Also imperative was the VPF background required to convert the ArcInfo data to a valid VPF database conforming to the WVSPLUS product specifications. This paper has concentrated on the methods used to create bathymetric features. Further details of the WVSPLUS revision project are discussed in "The Defense Mapping Agency (DMA) World Vector Shoreline Plus (WVSPLUS) Revision and Conversion Project," a paper published in the Proceedings of the 1996 Esri User Conference.

 

 

Acknowledgments

A great deal of thanks is due to the technical support team at Esri for their valued support over the course of the project. Particular thanks are owed to Harlan Heimgartner, an Esri course instructor and Training Services Manager, whose suggestions were a great contribution to the success of the batpoly.aml.

Thanks are also due to Judy Packman and Dave Berg in the EPPE Lab for their suggestions and editing ideas for this paper.

 

 

References

Berger, Kimberley, Monica Mroz, Richard Becherer, Judith Packman.
1996. The Defense Mapping Agency (DMA) World Vector Shoreline Plus (WVSPLUS) Revision and Conversion Project. Proceedings of the 1996 Esri User Conference. USA: Environmental Systems Research Institute, Inc.
Environmental Systems Research Institute.
1995. Workshop Proceedings of the 1995 Esri User Conference. Volume 2. USA: Environmental Systems Research Institute, Inc.
Environmental Systems Research Institute.
1996. ArcDoc Version 7.0.
Environmental Systems Research Institute.
November 1993. "ArcInfo-to-VPF Conversion: A Step-by-Step Guide".
Jenks, George F.
1989. Geographic Logic in Line Generalization. Cartographica Volume 26 Number I (Spring): p. 27-42.
Robinson, Arthur H., Randall D. Sale, Joel L. Morrison, Phillip C. Muehrcke.
1984. Elements of Cartography, 5th Edition. New York: John Wiley & Sons.
U.S. Defense Mapping Agency.
No Date. Digitizing the Future. 4th Edition.
U.S. Department of Defense.
12 October 1993. MIL-STD-2407 Military Standard Vector Product Format.
U.S. Department of Defense.
30 September 1995. MIL-W-89012A Military Specification World Vector Shoreline Plus.

Appendix A - BATPOLY.AML - Bathymetric Area Attribution

/*============================================================
/*  Program:    BATPOLY.AML
/*  Usage:      &r batpoly
/*  Description/Purpose:
/*    This AML populates the CVL and CVH values for the bathymetric area 
/*    features.  It works on a coverage called batpoly12m that has used the 
/*    shoreline as a bathymetric contour with CRV=0.  Once the area feature
/*    CVL and CVH values are computed, the CRV=0 contours must be removed 
/*    (before export to VPF).
/*  Arguments:  none
/*  Assumptions:  
/*    A polygon coverage exists containing bathymetric contours, the shoreline, 
/*    and bathymetric areas.  CVL and CVH items have already been added to the 
/*    .pat but need to be populated.
/* ------------------------------------------------------------------------	
/*   Input:            Bathymetric coverage with unattributed CVL and CVH
/*                       values.
/*   Output:          Bathymetric coverage whose area features are 
/*                       accurately attributed with CVL and CVH values 
/*                       according to the depth values of the bathymetric
/*                       contours that comprise the area.
/*   Calls:                  NONE
/*   Globals created:     NONE
/*   Globals required:    NONE
/*   System Dependencies: NONE
/**History:               Kim Berger and Dave Berg 11/96
/* ------------------------------------------------------------------------
/* Variable Dictionary
/*	adjacent_crv - value of the bathymetric contour next to the donut hole
/*	counter - a polygon counter variable
/*	cov - the name of the edit coverage 
/*	fileclose - a variable used to close a file storing unique depth
/*                       curve values for an individual polygon feature
/*	fileunit1 - a variable used to read a file storing unique depth curve
/*                       values for an individual polygon feature
/*	max_crv - the maximum of two depth curve values
/*	num_polys - the number of polygons contained in cov
/*	only_crv - the depth curve value for donut holes
/*	record - variable equal to the value of the counter variable
/*	rec1 - one of two possible depth curve values comprising an individual
/*                 polygon feature
/*	rec2 - the second of two possible depth curve values comprising an
/*                 individual polygon feature
/*	uacv - the number of "unique adjacent curve values" comprising the 
/*                  polygon adjacent to donut hole types of polygon features
/*	unique - the number of unique depth curve values of arcs comprising 
/*                     an individual polygon feature
/* -------------------------------------------------------------------------
/* Items Used From the batpoly12m.pat
/*	$id - the ArcInfo -id item
/*	$recno - the ArcInfo # item
/*	facc - the DIGEST feature attribute coding catalogue code
/*	cvl - Curve Value Low attribute for bathymetric areas
/*	cvh - Curve Value High attribute for bathymetric areas
/*	flag - attribute to flag polygons considered donut holes inside of
/*                 donut holes
/* -------------------------------------------------------------------------
/* Items Used From the batpoly12m.aat
/*	crv - the Depth Contour Value attribute for bathymetric lines
/* -------------------------------------------------------------------------
 
&w batpoly.wch  /* creating a watch file of the processing
&type The start time is: [date -full]
arcedit

/* setting variables
&setvar cov = batpoly12m  
&describe %cov%
&setvar num_polys = %DSC$POLYGONS%  /* the number of polygons in %cov%

/* Initialize polygon counter variable to skip the universe polygon
&setvar counter = 2

editcoverage %cov%
keepselect on  /* assures that the selected set is saved even if the 
                    /* edit feature is changed

/* assuring that the $id values are sequential, starting from 1, with no gaps
editfeature poly
select all
calc $id = $recno
save

&do &while %counter% <= %num_polys%
  select $id = %counter%
  &type ***** THE COUNTER IS NOW AT %counter% OF %DSC$POLYGONS% *****
  selectput arcs 
  editfeature arcs
  unselect facc = 'ZD003'  /*Unselecting the neatline feature

  selectput arcplot  /*sends the selected set to arcplot 
  apmode edit
  arcplot   /*using the listunique command in arcplot instead of arcedit
            /*because the one in arcplot works only on the selected set
  /* Finding the number of unique values and sending them to a file named file1
  &setvar unique = [listunique %cov% -line crv file1] 
  quit     /*quit arcplot and return to arcedit
  /* Opening, reading, and closing file1 which holds the unique values
  &setvar fileunit1 = [open file1 openstatus -read]
  &setvar rec1 = [read %fileunit1% readstatus]
  &setvar rec2 = [read %fileunit1% readstatus]
  &setvar fileclose = [close %fileunit1%]
  &sys rm file1

/* When the polygon is land, there is no need to process it ******
  &if %unique% = 1 and %rec1% = 0 &then
    &do
      editfeature poly
      &setvar counter = %counter% + 1
    &end

/* When an error exists in the attribution of the arcs comprising
/* the polygon feature (%unique% > 2)
  &else &do
    &if %unique% > 2 &then
    &do
      &type 'DATA ERROR: polygon #' %counter% 'has more than two CRV values'
      &return 
    &end  /*end of &do for %unique% > 2


/* When the polygon is a donut ************************************
  &else &do
    &if %unique% = 2 &then
    &do
      &setvar max_crv = [max %rec1% %rec2%]
      ef poly
      &select %max_crv%
        &when 8000
          &do
            calc cvl = 6000
            calc cvh = 8000
            &setvar counter = %counter% + 1
          &end   /* end of &do loop for &when 8000
        &when 6000 
          &do 
            calc cvl = 4000 
            calc cvh = 6000 
            &setvar counter = %counter% + 1 
          &end   /* end of &do loop for &when 6000
        &when 4000 
          &do 
            calc cvl = 2000 
            calc cvh = 4000 
            &setvar counter = %counter% + 1 
          &end   /* end of &do loop for &when 4000
        &when 2000 
          &do 
            calc cvl = 1000 
            calc cvh = 2000 
            &setvar counter = %counter% + 1 
          &end   /* end of &do loop for &when 2000
        &when 1000 
          &do 
            calc cvl = 600 
            calc cvh = 1000 
            &setvar counter = %counter% + 1 
          &end   /* end of &do loop for &when 1000
        &when 600 
          &do 
            calc cvl = 400 
            calc cvh = 600 
            &setvar counter = %counter% + 1 
          &end   /* end of &do loop for &when 600
        &when 400 
          &do 
            calc cvl = 200 
            calc cvh = 400 
            &setvar counter = %counter% + 1 
          &end   /* end of &do loop for &when 400
        &when 200
          &do
            calc cvl = 0
            calc cvh = 200
            &setvar counter = %counter% + 1
          &end   /* end of &do loop for &when 200
      &end    /*end of &select  
    &end   /*end of &do for %unique% = 2

/* When the polygon is a donut hole *************************************
  &else &do   /*If listunique = 1
    &setvar only_crv = [truncate %rec1%] /*because crv values are type F
    editfeature poly
    selectput arcplot
    apmode edit
    arcplot 
    aselect %cov% poly adjacent  /*adding adjacent polygon to selected set
    quit    /*quitting out of arcplot going back to arcedit

    editfeature poly
    selectget /********gets the selected set from AP*************/

    selectput arcs
    editfeature arcs
    unsel facc = 'ZD003'  /*unselecting the neatline feature
    unsel crv = %only_crv%
    selectput arcplot
    apmode edit
    arcplot

    /* uacv = the number of unique adjacent crv values not equal to %only_crv%
    &setvar uacv = [listunique %cov% -line crv file2]

    /*If uacv > 1, there is a data problem such as a "donut-hole" touching a
    /*"donut" at one node.  Setting only_crv = -444 will force the aml to go
    /*to the &select &when -444 section where the poly will be marked for
    /* further review by calc cvl = -444.

    &if %uacv% > 1 &then
      &setvar only_crv = -444   
    &if %uacv% = 1 &then  /*"normal" donut-hole situation
      &do
        &setvar adjacent_crv = [listunique %cov% -line crv]
        &setvar adjacent_crv = [truncate %adjacent_crv%]  /*because crv values
                                                          /*are type F
      &end
    &if %uacv% < 0 &then  /*too many uacv values to put in a file
      &do
        &setvar file_unit = [open prob_polys openstatus -write]
        &setvar record = %counter% 
        &if [write %file_unit% %record%,%uacv%] = 0 &then
          &type prob_polys file written successfully
        &if [close %file_unit%] = 0 &then
          &type prob_polys file closed successfully
      &end    
    &if %uacv% = 0 &then   /*situation of "donut-hole in a donut-hole"
        &setvar adjacent_crv = 99999  /*Setting variable to very high number
                                                 /*to account for a rise within a 
                                                 /*depression.  Will require 
                                                 /*postprocessing to verify.
    &sys rm file2
    quit    /*quitting out of arcplot going back to arcedit

    editfeature poly
    select $id = %counter%

    &select %only_crv%
      &when 8000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 8000 
              calc cvh = 10500 
              &setvar counter = %counter% + 1 
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then
                calc flag = 1 /*flagging donut-hole in donut-hole
              calc cvl = 6000 
              calc cvh = 8000 
              &setvar counter = %counter% + 1 
            &end    /*end of &do
        &end    /*end of &do for &when 8000

      &when 6000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 6000 
              calc cvh = 8000 
              &setvar counter = %counter% + 1 
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1 /*flagging donut-hole in donut-hole
              calc cvl = 4000 
              calc cvh = 6000 
              &setvar counter = %counter% + 1 
            &end    /*end of &do
        &end    /*end of &do for &when 6000

      &when 4000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 4000 
              calc cvh = 6000 
              &setvar counter = %counter% + 1 
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1 /*flagging donut-hole in donut-hole
              calc cvl = 2000 
              calc cvh = 4000 
              &setvar counter = %counter% + 1 
            &end    /*end of &do
        &end    /*end of &do for &when 4000

      &when 2000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 2000 
              calc cvh = 4000
            &setvar counter = %counter% + 1 
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1 /*flagging donut-hole in donut-hole
              calc cvl = 1000
              calc cvh = 2000
              &setvar counter = %counter% + 1 
            &end    /*end of &do
        &end    /*end of &do for &when 2000

      &when 1000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 1000 
              calc cvh = 2000 
              &setvar counter = %counter% + 1 
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1 /*flagging donut-hole in donut-hole
              calc cvl = 600 
              calc cvh = 1000 
              &setvar counter = %counter% + 1 
            &end    /*end of &do
        &end    /*end of &do for &when 1000

      &when 600
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 600 
              calc cvh = 1000 
              &setvar counter = %counter% + 1 
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1 /*flagging donut-hole in donut-hole
              calc cvl = 400 
              calc cvh = 600 
              &setvar counter = %counter% + 1 
            &end    /*end of &do
        &end    /*end of &do for &when 600

      &when 400
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 400 
              calc cvh = 600 
              &setvar counter = %counter% + 1 
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1 /*flagging donut-hole in donut-hole
              calc cvl = 200 
              calc cvh = 400 
              &setvar counter = %counter% + 1 
            &end    /*end of &do
        &end    /*end of &do for &when 400

      &when 200
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 200
              calc cvh = 400
              &setvar counter = %counter% + 1 
            &end
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1 /*flagging donut-hole in donut-hole
              calc cvl = 0 
              calc cvh = 200 
              &setvar counter = %counter% + 1 
            &end    /*end of &do
        &end    /*end of &do for &when 200
      &when -444
        &do
          calc cvl = -444 /*flagging problem donut-holes for further review
          &setvar counter = %counter% + 1
        &end /*end of -444 loop
      &otherwise
        &do
          &setvar counter = %counter% + 1
        &end /*end of otherwise loop 
    &end     /*end of &select
    
   &end   /*end of &else &do loop (If %unique% = 1) Donut hole polygons 
  &end    /*end of &else &do loop (If %unique% = 2) Donut polygons
 &end     /*end of &else &do loop (If %unique% > 2) Data errors

&if [mod %counter% 500] = 0 &then  /*intermittent saves
  save

&end /*end of &do &while loop
save

&type The end time is: [date -full]
&w &off
&return
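
The nine &when blocks above all apply the same band rule: a donut-hole polygon that is deeper than its surrounding polygon takes its own contour value as CVL and the next deeper band limit as CVH, while a rise within a depression takes its own contour value as CVH and the next shallower band limit as CVL.  The fragment below is an illustrative sketch only, not part of the delivered AML; it assumes arcedit is running with the polygon already selected and %only_crv% and %adjacent_crv% already set, and it omits the counter, flag, and -444 handling shown above.

/* Sketch only -- the band arithmetic written once against the project's depth bands
&setvar bands = 0 200 400 600 1000 2000 4000 6000 8000 10500
&do i = 2 &to 9
  &if %only_crv% = [extract %i% %bands%] &then
    &do
      &if %only_crv% > %adjacent_crv% &then
        &do   /*hole is deeper than its surroundings
          calc cvl = [extract %i% %bands%]
          calc cvh = [extract [calc %i% + 1] %bands%]
        &end
      &else
        &do   /*rise within a depression
          calc cvl = [extract [calc %i% - 1] %bands%]
          calc cvh = [extract %i% %bands%]
        &end
    &end
&end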

Appendix B - CORRECT.AML - Postprocessing Special Case "Donut Holes"

/* ========================================================================
/*  Program:    CORRECT12M.AML
/*  Usage:      &r correct12m
/*  Description/Purpose:
/*    This AML populates the CVL and CVH values for the bathymetric area
/*    features.  It works on a coverage called batpoly12m that has used the
/*    shoreline as a bathymetric contour with CRV=0.  Once the area feature
/*    CVL and CVH values are computed, the CRV=0 contours must be removed
/*    (before export to VPF).
/*    This AML is a subset of batpoly.aml which worked on all polygons in
/*    the coverage.  This AML processes only the polygons whose CVL
/*    had been previously populated with -444 in the batpoly.aml.  It opens
/*    and reads from a file containing the -id of only these polygons.
/*    These polygons had been flagged by batpoly.aml because they were
/*    "donut-hole" polygons that were touching "donut" polygons at one node.
/*    These polygons have been edited so that they no longer intersect with
/*    the "donut" polygons.
/*  Arguments:  none
/*  Assumptions:
/*    A polygon coverage exists containing bathymetric contours, the shoreline,
/*    and bathymetric areas.  CVL and CVH items have already been added to the
/*    .pat and have been populated by batpoly.aml.  The -444 flagged
/*    polygons have been edited so that they no longer intersect other polygons.
/* ------------------------------------------------------------------------
/*   Input:             Bathymetric coverage with attributed CVL and CVH
/*                        values and a file of the -ids of those 
/*                        polygons with a CVL=-444.
/*   Output:           Bathymetric coverage whose area features are
/*                        accurately attributed with CVL and CVH values
/*                        according to the depth values of the bathymetric
/*                        contours that comprise the area.
/*   Calls:                  NONE
/*   Globals created:     NONE
/*   Globals required:    NONE
/*   System Dependencies: NONE
/**History:               Kim Berger and Dave Berg 11/96
/* ------------------------------------------------------------------------
/* Variable Dictionary
/*      adjacent_crv - value of the bathymetric contour next to the donut hole
/*      counter - a variable used to count the number of polygons that have 
/*                  been processed so that intermittent saves can be performed
/*      cov - the name of the edit coverage
/*      cov-id - the -id of polygons flagged with CVL=-444 and stored
/*                 in a file which is accessed by this AML
/*      fileclose - a variable used to close a file storing unique depth
/*                    curve values for an individual polygon feature
/*      fileunit3 - a variable used to read a file storing the -ids of
/*                    the polygons that had been flagged by batpoly.aml with
/*                    CVL=-444
/*      only_crv - the depth curve value for donut holes
/*      record - variable equal to the value of the counter variable
/*      uacv - the number of "unique adjacent curve values" comprising the
/*               polygon adjacent to donut hole types of polygon features
/* -------------------------------------------------------------------------
/* Items Used From the batpoly12m.pat
/*      $id - the ArcInfo -id item
/*      facc - the DIGEST feature attribute coding catalogue code
/*      cvl - Curve Value Low attribute for bathymetric areas
/*      cvh - Curve Value High attribute for bathymetric areas
/*      flag - attribute to flag polygons considered donut holes inside of
/*              donut holes
/* -------------------------------------------------------------------------
/* Items Used From the batpoly12m.aat
/*      crv - the Depth Contour Value attribute for bathymetric lines
/* -------------------------------------------------------------------------


&w correct12m.wch  /*creating a watch file of the processing
&type The start time is: [date -full]
arcedit

/* setting variables
&setvar cov = correct12m

ec %cov%
keepselect on  /*keeps the selected set even if the 
                   /*edit feature is changed

ef poly

/* Opening and reading from the file containing the -ids of the
/* polygons that had been flagged with CVL=-444 by the batpoly.aml
&setvar fileunit3 = [open poly_list.txt openstatus -read]
&setvar cov-id = [read %fileunit3% readstatus]
&setvar counter = 1

&do &while %cov-id% <> -8888 /*while not at end of file
  select $id = %cov-id%
  &type ***** THE COVER-ID IS NOW AT %cov-id%  *****
  selectput arcs 
  ef arcs
  unselect facc = 'ZD003'  /*Unselecting the neatline feature

  selectput arcplot  /*sends the selected set to arcplot
  apmode edit
  arcplot  /*using the listunique command in arcplot instead of arcedit
           /*because the one in arcplot works only on the selected set
  &setvar only_crv = [listunique %cov% -line crv]
  &listvar
  &setvar only_crv = [truncate %only_crv%]  /*because crv values are 
                                            /*type F
  quit     /*quitting arcplot & going back to arcedit

/* When the polygon is a donut hole *************************************
    ef poly
    selectput arcplot
    apmode edit
    arcplot 
    aselect %cov% poly adjacent
    quit    /*quitting out of arcplot going back to arcedit
    ef poly
    selectget  /*gets the selected set back from arcplot
    selectput arcs
    ef arcs
    unsel facc = 'ZD003'  /*unselecting the neatline feature
    unsel crv = %only_crv%
    selectput arcplot
    apmode edit
    arcplot

    /* uacv = the number of unique adjacent crv values not equal to %only_crv%
    &setvar uacv = [listunique %cov% -line crv file2]

    /*If uacv>1, there is still a data problem such as a "donut-hole" touching
    /*a "donut" at one node.  Setting only_crv = -444 will force the aml to go
    /*to the &select &when -444 section where the poly will be marked for
    /* further review by calc cvl = -444.
    &if %uacv% > 1 &then
      &setvar only_crv = -444   
    &if %uacv% = 1 &then  /*"normal" donut-hole situation
      &do
        &setvar adjacent_crv = [listunique %cov% -line crv]
        &setvar adjacent_crv = [truncate %adjacent_crv%]  /*because crv values
                                                          /*are type F
      &end
    &if %uacv% < 0 &then  /*too many uacv values to put in a file
      &do
        &setvar file_unit = [open prob_polys openstatus -write]
        &setvar record = %cov-id% 
        &if [write %file_unit% %record%,%uacv%] = 0 &then
          &type prob_polys file written successfully
        &if [close %file_unit%] = 0 &then
          &type prob_polys file closed successfully
      &end    
    &if %uacv% = 0 &then   /*situation of "donut-hole in a donut-hole"
        &setvar adjacent_crv = 99999  /*Setting variable to very high number
                                                 /*to account for a rise within a 
                                                 /*depression.  Will require 
                                                 /*postprocessing to verify.
    &sys rm file2
    quit    /*quitting out of arcplot going back to arcedit

    ef poly
    sel $id = %cov-id%

    &select %only_crv%
      &when 8000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 8000 
              calc cvh = 10500 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then
                calc flag = 1  /*flagging donut-hole in donut-hole
              calc cvl = 6000 
              calc cvh = 8000 
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do
        &end    /*end of &do for &when 8000

      &when 6000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 6000 
              calc cvh = 8000 
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1  /*flagging donut-hole in donut-hole
              calc cvl = 4000 
              calc cvh = 6000 
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do
        &end    /*end of &do for &when 6000

      &when 4000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 4000 
              calc cvh = 6000 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1  /*flagging donut-hole in donut-hole
              calc cvl = 2000 
              calc cvh = 4000 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do
        &end    /*end of &do for &when 4000

      &when 2000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 2000 
              calc cvh = 4000
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1  /*flagging donut-hole in donut-hole
              calc cvl = 1000
              calc cvh = 2000
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do
        &end    /*end of &do for &when 2000

      &when 1000
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 1000 
              calc cvh = 2000 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1  /*flagging donut-hole in donut-hole
              calc cvl = 600 
              calc cvh = 1000 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do
        &end    /*end of &do for &when 1000

      &when 600
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 600 
              calc cvh = 1000 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1  /*flagging donut-hole in donut-hole
              calc cvl = 400 
              calc cvh = 600 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do
        &end    /*end of &do for &when 600

      &when 400
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 400 
              calc cvh = 600 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do 
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1  /*flagging donut-hole in donut-hole
              calc cvl = 200 
              calc cvh = 400 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do
        &end    /*end of &do for &when 400

      &when 200
        &do
          &if %only_crv% > %adjacent_crv% &then
            &do
              calc cvl = 200
              calc cvh = 400
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end
          &else /*&if %only_crv% < %adjacent_crv% &then
            &do 
              &if %adjacent_crv% = 99999 &then 
                calc flag = 1  /*flagging donut-hole in donut-hole
              calc cvl = 0 
              calc cvh = 200 
              list cvl cvh
              &setvar cov-id = [read %fileunit3% readstatus]
            &end    /*end of &do
        &end    /*end of &do for &when 200
      &when -444
        &do
          calc cvl = -444  /*flagging problem donut-holes for further review
          list cvl cvh
          &setvar cov-id = [read %fileunit3% readstatus]
        &end /*end of -444 loop
      &otherwise
        &do
          &setvar cov-id = [read %fileunit3% readstatus]
        &end /*end of otherwise loop 
    &end     /*end of &select
&if [mod %counter% 200] = 0 &then  /*intermittent saves
  save
&set counter = %counter% + 1    
&end /*end of &do &while loop

&setvar fileclose = [close %fileunit3%]

save

&type The end time is: [date -full]
&w &off
&return
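
CORRECT12M.AML above is driven by a plain text file, poly_list.txt, holding the -ids of the polygons that batpoly.aml flagged with CVL=-444, one -id per line and terminated by a -8888 sentinel.  Stripped of the attribution work, its read loop reduces to the skeleton below; this is an illustrative sketch only, and the check on the open status is added here for clarity rather than being part of the delivered AML.

/* Sketch only -- file-handling skeleton of the CORRECT12M.AML driver loop
&setvar fileunit = [open poly_list.txt openstatus -read]
&if %openstatus% <> 0 &then
  &return &error Could not open poly_list.txt
&setvar cov-id = [read %fileunit% readstatus]
&do &while %cov-id% <> -8888
  sel $id = %cov-id%   /*reselect the flagged polygon
  /* ...repopulate cvl and cvh as in the &select blocks above...
  &setvar cov-id = [read %fileunit% readstatus]   /*next -id from the file
&end
&setvar fileclose = [close %fileunit%]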

 

Appendix C - QC.AML - Quality Control of Bathymetric Area Attribution

/* ========================================================================
/*  Program:    QC.AML
/*  Usage:      &r qc
/*  Description/Purpose:
/*    This AML selects polygons by CVL/CVH ranges and checks them against
/*    the attributes of the bounding bathymetric contours for any errors.
/*
/*  Arguments:  none
/*  Assumptions:
/*    A polygon coverage exists containing bathymetric contours, the shoreline,
/*    and bathymetric areas.  CVL and CVH items have already been added and
/*    populated using batpoly.aml and correct.aml.
/* ------------------------------------------------------------------------
/*   Input:            Bathymetric coverage with attributed CVL and CVH
/*                       values.
/*   Output:          A file that lists the -id, facc and crv
/*                       of arcs whose crv values disagree with the CVL and
/*                       CVH values of associated polygons.
/*   Calls:                  NONE
/*   Globals created:     NONE
/*   Globals required:    NONE
/*   System Dependencies: NONE
/**History:               Kim Berger and Dave Berg 11/96
/* ------------------------------------------------------------------------
/* Item Used From the batpoly12m.pat
/*      cvl - Curve Value Low attribute for bathymetric areas
/* -------------------------------------------------------------------------
/* Items Used From the batpoly12m.aat
/*      $id - the ArcInfo -id item 
/*      facc - the DIGEST feature attribute coding catalogue code 
/*      crv - the Depth Contour Value attribute for bathymetric lines
/* -------------------------------------------------------------------------

arcedit
&w qc.wch  /*creating a watch file of the processing
ec bat_dp_dis
&severity &error &ignore

ef poly  
sel cvh = 200
selectput arcs
ef arcs  
resel crv <> 0 and crv <> 200 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=0 & cvh=200 ***********
list $id facc crv

ef poly
sel cvl = 200
selectput arcs
ef arcs
resel crv <> 200 and crv <> 400 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=200 & cvh=400 ***********
list $id facc crv

ef poly
sel cvl = 400
selectput arcs
ef arcs
resel crv <> 400 and crv <> 600 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=400 & cvh=600 ***********
list $id facc crv

ef poly
sel cvl = 600
selectput arcs
ef arcs
resel crv <> 600 and crv <> 1000 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=600 & cvh=1000 ***********
list $id facc crv

ef poly
sel cvl = 1000
selectput arcs
ef arcs
resel crv <> 1000 and crv <> 2000 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=1000 & cvh=2000 ***********
list $id facc crv

ef poly
sel cvl = 2000
selectput arcs
ef arcs
resel crv <> 2000 and crv <> 4000 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=2000 & cvh=4000 ***********
list $id facc crv

ef poly
sel cvl = 4000
selectput arcs
ef arcs
resel crv <> 4000 and crv <> 6000 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=4000 & cvh=6000 ***********
list $id facc crv

ef poly
sel cvl = 6000
selectput arcs
ef arcs
resel crv <> 6000 and crv <> 8000 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=6000 & cvh=8000 ***********
list $id facc crv

ef poly
sel cvl = 8000
selectput arcs
ef arcs
resel crv <> 8000 and facc <> 'ZD003'
&type *********  ERRONEOUS ARCS for polys with cvl=8000 & cvh=10500 ***********
list $id facc crv

&w &off
quit

&return
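
Each block in QC.AML applies the same test to one depth band: select the band's polygons, move to their bounding arcs, and reselect any arc whose crv value is neither the band's lower nor its upper limit (the neatline, FACC code ZD003, is excluded).  Since only the band limits change from block to block, the checks could also be generated from a single list of limits, as in the sketch below.  This is an illustration only, not part of the delivered QC.AML, and it assumes the same arcedit session and edit coverage as above; it selects the shallowest band by cvl = 0 rather than by cvh = 200, and the extra crv <> 10500 test in the deepest band is harmless because no 10500 contour exists in the data.

/* Sketch only -- a data-driven form of the nine checks above
&setvar bands = 0 200 400 600 1000 2000 4000 6000 8000 10500
&do i = 1 &to 9
  &setvar low = [extract %i% %bands%]
  &setvar high = [extract [calc %i% + 1] %bands%]
  ef poly
  sel cvl = %low%
  selectput arcs
  ef arcs
  resel crv <> %low% and crv <> %high% and facc <> 'ZD003'
  &type *********  ERRONEOUS ARCS for polys with cvl=%low% & cvh=%high% ***********
  list $id facc crv
&end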