City of Seattle Implements the Geodatabase
Seattle Public Utilities,
IT Applications Services
The City of Seattle's GIS is built on Esri's ArcInfo coverage model. Triggered by Esri's recent product releases, Seattle has embarked on a project to take advantage of new capabilities and efficiencies not possible under the older model. This
four-year GIS Technology Refresh project will migrate all City GIS layers into the geodatabase format and provide maintenance editors for each. This paper is a mid-project report aimed at sharing the City's experiences in all aspects of migrating from the older ArcInfo coverage/library model into a geodatabase model utilizing SDE and Oracle. It addresses motivations, strategies, techniques, troubleshooting, problem resolution, and system design issues.
The City of Seattle has established a GIS Technology Refresh project to migrate the existing GIS environment to the new ArcGIS technology.
The primary focus of the project is conversion of the tile-based coverage libraries to an ArcSDE geodatabase architecture.
The purpose of this discussion is to share our experience and to show some of the steps to remember during geodatabase creation and design.
The City of Seattle employs roughly 50 technically skilled GIS professionals, and the current GIS supports over 500 users.
The City has had a mature and complex GIS since the mid-1980s, with very detailed and accurate data layers.
The current GIS provides a variety of popular user environments developed with AML and Avenue custom applications using workstation ArcInfo and ArcView.
City data are maintained in standard coverage, librarian, image, and shapefile formats.
Business data is currently supported by Oracle DBMS.
The City network comprises a Wide Area Network (WAN) with a 1-GB fiber backbone and a Local Area Network (LAN) with 100BaseT connections to the desktop. Some remote locations are still on 10BaseT, but we generally have very high bandwidth.
During this portion of my presentation, I will cover these four topics:
Why do this?
Compelling reasons to move are:
Better performance for more users. Speed is always an issue.
Superior capabilities: the average user can do more powerful work.
Lower cost for data maintenance, due to increased efficiency.
Significant improvement in business data integration: Maximo, Hansen, etc.
Move application development tools from UNIX to more mainstream Windows environment.
Improved application integration with existing systems.
Broaden user access to GIS tools.
Better client/server interface
Utilize ArcIMS technology
Reduce cost per user
Simple, familiar interface
Less training needed for end users
All our data is in a standard coverage library format, maintained with AMLs.
Reduce data replication (e.g., seamless coverages and shapefiles).
ArcSDE, ArcGIS, the geodatabase, Windows development, UML, etc. – these are not exactly core to our skill set. However, we are now developing these skills quickly.
Thorough knowledge of the data.
Very experienced GIS staff, many staff with 10+ years of expertise.
Largely the same staff developed, some 12 years ago, all of the data layers to be converted.
Motivated staff, eager to move on and learn new things.
Learn the technology
SDE training for an administrator
ArcGIS: ArcMap, ArcCatalog
Geodatabase design training was crucial
ArcObjects: "Introduction to Programming ArcObjects with VBA"
Use ArcFM UML model
We began with the published Esri ArcFM water distribution data model
We modified the UML model to fit Seattle Public Utilities' requirements
For system architecture design, we contracted with Esri's Dave Peters, drawing on his experience in designing the underlying infrastructure of the new environment (servers, network, client hardware, etc.)
Start with the existing coverages and let the Esri model guide the initial design effort.
What are the existing coverages?
We are not creating new data; all our data has served us well for 10 years.
Known issues: granularity of the network, to be set as needed programmatically
Map details (insets) for 400 foot scale maps
"Use cases" – real-life scenarios that reveal how the data is being used.
Data maintenance history
Join to business data or replicate
It is more efficient to start simple and add complexity as needed
From the ArcFM UML model, we dropped all domains and all relationships, including composite ones.
We chose not to employ the connectivity rules for now; we will programmatically re-establish them later.
No custom behavior in classes yet
No relationship classes
Most of our attribute data is already stored in independent business systems
If we join to business data at presentation time, fewer attributes remain in the model.
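The join-at-presentation idea can be sketched with a small, self-contained example. SQLite stands in for the Oracle business DBMS here, and all table, view, and column names are illustrative, not the City's actual schema: the geodatabase side keeps only geometry and a key, while attributes stay in the business system and are joined through a view at display time.

```python
import sqlite3

# SQLite stands in for the Oracle business DBMS; names are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
    -- Minimal "GIS" side: geometry plus a foreign key only.
    CREATE TABLE water_main (main_id INTEGER PRIMARY KEY, geom TEXT);
    -- Business side: attributes stay in the source system (Maximo, etc.).
    CREATE TABLE maximo_asset (main_id INTEGER, diameter_in INTEGER, material TEXT);

    INSERT INTO water_main VALUES (1, 'LINESTRING placeholder');
    INSERT INTO maximo_asset VALUES (1, 8, 'ductile iron');

    -- Join at presentation time instead of replicating attributes
    -- into the geodatabase.
    CREATE VIEW water_main_display AS
        SELECT w.main_id, w.geom, a.diameter_in, a.material
        FROM water_main w JOIN maximo_asset a USING (main_id);
""")

for row in con.execute(
        "SELECT main_id, diameter_in, material FROM water_main_display"):
    print(row)   # (1, 8, 'ductile iron')
```

Because the view is evaluated at query time, there is no copy to keep current, which is the "no currency issue" benefit described below.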
Data is simpler
We are no longer fighting to make the technology work (repository, data loads, etc.)
Stable processes, things break less
Better data design, because there is no data replication: no processes transforming or copying data
Potential for access to broader range of data that would have been replicated otherwise
No currency issues; we are always looking at the live, most recent data
Still better performance
Numerous proofs of concept and live demos were presented to the Design Review Team
Display performance was good enough
Query and symbolizing time is still an issue
Experienced Oracle DBAs' support and commitment to tuning the business systems (indexes, views)
Water model is complete
Beta version will be released soon
It will remain beta for several years to come while other data is migrated; editors are to be built later
Transportation model is underway
Expected to go live around 2004-2005
No funds for migrating user apps until the next budget cycle
Test the UML model
Avoid database reserved keywords
Specify Configuration Keyword
View properties of geonetwork
Load data into schema
Real life loading strategies
Annotation receives some extra attention
Testing will help eliminate most of the errors, though it will not prevent errors such as a wrong "Data Type" for the target database.
When working with the UML model, avoid database reserved keywords very early in the design stage (e.g., # and Date in Oracle, or "Action" in SQL); otherwise, errors will be reported later in the process of generating the schema.
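A pre-flight check over the proposed field names can catch these collisions before schema generation. This is a minimal sketch: the reserved-word set below is a small illustrative subset, not the full Oracle or SQL Server lists, which should be consulted for real use.

```python
# Illustrative subset of DBMS reserved words; consult the full Oracle
# and SQL Server reserved-word lists before relying on this in practice.
RESERVED = {"DATE", "ACTION", "LEVEL", "ORDER", "SIZE", "COMMENT"}

def bad_field_names(field_names):
    """Return names that collide with reserved words or contain
    characters (like '#') that are unsafe in database identifiers."""
    flagged = []
    for name in field_names:
        if name.upper() in RESERVED or "#" in name:
            flagged.append(name)
    return flagged

# Hypothetical field names from a UML class under review.
print(bad_field_names(["MAIN_ID", "DATE", "PIPE#", "MATERIAL"]))
# ['DATE', 'PIPE#']
```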
The datasets and feature classes must be given a spatial reference whose extent covers all of the features to be loaded into the geodatabase.
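One way to arrive at such an extent is to union the bounding boxes of all the source layers before assigning the spatial reference. The sketch below assumes per-layer extents are already known as (xmin, ymin, xmax, ymax) tuples; the coordinate values are hypothetical.

```python
def union_extent(extents):
    """Combine per-layer (xmin, ymin, xmax, ymax) extents into one
    bounding box big enough to hold every feature to be loaded."""
    xmins, ymins, xmaxs, ymaxs = zip(*extents)
    return (min(xmins), min(ymins), max(xmaxs), max(ymaxs))

# Hypothetical State Plane extents for three source coverages.
layers = [
    (1250000, 180000, 1290000, 260000),   # water mains
    (1240000, 200000, 1300000, 250000),   # hydrants
    (1260000, 190000, 1285000, 270000),   # valves
]
print(union_extent(layers))   # (1240000, 180000, 1300000, 270000)
```

In practice it is wise to pad the result, since a dataset extent that is too tight will reject features added near the edges later.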
The DBA decides what parameters to use when storing the data; this can be assigned as early as the UML stage (through Properties > Tagged Values > ConfigKeyword) or later in the schema creation process.
This will be crucial when recreating the geometric network later: the name has to be the same, and the feature classes (complex edge, simple junction, etc.) have to be the same as well. If a name is different, the regeneration of the schema will error out.
Do not use a long name for the geometric network; while recreating it later, an "Underlying DBMS error" will be produced.
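The "Underlying DBMS error" is consistent with Oracle's 30-character identifier limit: since SDE derives additional object names from the network name, a long name can push derived identifiers past the limit. A simple length check, with an assumed headroom for derived suffixes (the 10-character margin below is an assumption, not a documented SDE rule), can flag risky names early.

```python
# Oracle identifiers are limited to 30 characters (pre-12.2).
ORACLE_IDENT_MAX = 30
# Headroom for suffixes SDE appends to derived object names; the
# margin of 10 characters is an assumption, not a documented rule.
SUFFIX_HEADROOM = 10

def network_name_ok(name):
    """Flag geometric network names likely to exceed the identifier
    limit once SDE derives longer object names from them."""
    return len(name) + SUFFIX_HEADROOM <= ORACLE_IDENT_MAX

print(network_name_ok("WATER_NET"))                             # True
print(network_name_ok("WATER_DISTRIBUTION_GEOMETRIC_NETWORK"))  # False
```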
Use the "Simple Data Loader" in ArcCatalog; it is a very fast method and will load simple features such as points, lines, and polygons. The geometric network has to be deleted first, and there can be no composite relationships between classes.
Use ArcToolbox to convert coverage format data directly to the geodatabase.
Use the "Object Loader" in ArcMap within an edit session; in an SDE geodatabase, the data has to be versioned. This method loads data very slowly.
Real life data loading strategies
Use combination of loading methods
Delete Geometric Networks from a Geodatabase Schema
Load all non-custom features first
Build the network and reapply the model, which will create the custom object classes
Use object loader to populate custom feature classes
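The combined strategy above can be expressed as an ordered plan: delete the networks, bulk-load the simple classes with the fast loader, rebuild the network and reapply the model, and only then run the slow Object Loader on the custom classes. The class names and "custom" flags below are hypothetical examples, not the City's actual schema.

```python
# Sketch of the combined loading strategy as an ordered plan.
# Class names and the "custom" flags are hypothetical examples.
classes = [
    {"name": "wPressurizedMain", "custom": True},
    {"name": "wHydrant",         "custom": False},
    {"name": "wValve",           "custom": False},
    {"name": "wSystemValve",     "custom": True},
]

def loading_plan(classes):
    plan = ["delete geometric networks from the schema"]
    # Fast path first: Simple Data Loader for non-custom classes.
    for c in classes:
        if not c["custom"]:
            plan.append(f"simple-load {c['name']}")
    plan.append("build geometric network and reapply the UML model")
    # Slow path last: Object Loader in a versioned edit session.
    for c in classes:
        if c["custom"]:
            plan.append(f"object-load {c['name']}")
    return plan

for step in loading_plan(classes):
    print(step)
```

Ordering the work this way keeps the slow, versioned Object Loader confined to the few classes that actually need it.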
Annotation cannot be created with CASE tools such as UML
It has to be created in ArcCatalog and converted within ArcMap
There are feature-linked annotation classes, which have a composite relationship between feature and annotation, and non-feature-linked annotation classes
Task: existing coverage annotation in library format has to be converted to non-feature-linked annotation in the geodatabase. The City has invested thousands of dollars in the existing annotation layers.
Process: create a non-feature-linked annotation class in ArcCatalog
View the existing coverage-format annotation in ArcMap
Set font, color, and size, and choose the right arrowhead marker symbol, still in ArcMap
Convert the annotation within ArcMap using the command tool available under Customize > Commands > Label > Convert Coverage Annotation…; drag and drop that tool onto the ArcMap toolbar.
The result: the size of the converted annotation did not change (we had a little problem)
Fix: go back to the coverage annotation format and change the pseudo item value to $SIZE = 0
Then the size can be controlled while loading into the geodatabase.
There are already some implementations of geodatabases out there, best used for reference.
Get some help, consider hiring an experienced consultant at some level of participation in order to draw on practical experience.
There were some glitches along the way, and workarounds had to be employed frequently, so go slowly with the implementation.
City of Seattle
Seattle Public Utilities
710 Second Ave
Seattle, WA 98104
(206) 233 5173