City of Seattle Implements the Geodatabase

Janusz Krawczynski

Seattle Public Utilities,

IT Applications Services

Seattle, WA

Abstract:

The City of Seattle's GIS is built on Esri's ArcInfo coverage model. Triggered by Esri's recent product releases, Seattle has embarked on a project to take advantage of new capabilities and efficiencies not possible under the older model. This four-year GIS Technology Refresh project will migrate all City GIS layers into the geodatabase format and provide maintenance editors for each. This paper is a mid-project report aimed at sharing the City's experiences in all aspects of migrating from the older ArcInfo coverage/library model into a geodatabase model utilizing SDE and Oracle. This paper addresses motivations, strategies, techniques, troubleshooting, problem resolution, and system design issues.

  1. City of Seattle Implements the Geodatabase
  2. Janusz Krawczynski

    Seattle Public Utilities,

    IT Applications Services

    Seattle, WA

  3. Introduction
  4. The City of Seattle has established a GIS Technology Refresh project to migrate the existing GIS environment to the new ArcGIS technology.

    The primary focus of the project is conversion of the tile-based coverage libraries to an ArcSDE geodatabase architecture.

    The purpose of this discussion is to share our experience and to show some of the steps to remember during geodatabase creation and design.

    The City of Seattle employs roughly 50 technically skilled GIS professionals, and the current GIS supports over 500 users.

    The City has had a mature and complex GIS since the mid-1980s, with very detailed and accurate data layers.

    The current GIS provides a variety of popular user environments developed with AML and Avenue custom applications using workstation ArcInfo and ArcView.

    City data are maintained in standard coverage, librarian, image, and shapefile formats.

    Business data is currently stored in an Oracle DBMS.

    The City network comprises a 1-GB fiber Wide Area Network (WAN) backbone and 100BaseT Local Area Network (LAN) connections to the desktop. Some remote locations are still on 10BaseT, but we generally have very high bandwidth.

  5. Experience to share
  6. During this portion of my presentation, I will cover these four topics:

    Why do this?

    Challenges

    Strengths

    Approach

  7. Why do this?
  8. Compelling reasons to move are:

    Better performance for more users. Speed is always an issue.

    Superior capabilities: more powerful analysis is available to the average user.

    Lower data-maintenance costs due to increased efficiency.

    Significant improvement to business data integration (Maximo, Hansen, etc.).

    Move application development tools from UNIX to the more mainstream Windows environment.

  9. Why do this? (cont’d)
  10. Improved application integration with existing systems.

    Broaden user access to GIS tools.

    Better client/server interface

    More programmable

    Utilize ArcIMS technology

    Reduce cost per user

    Simple, familiar interface

    Less training needed for end users

  11. Why do this? (cont’d)
  12. All our data is in a standard coverage library format, maintained with AMLs.

    Reduce data replication (e.g., seamless coverages and shapefiles).

  13. Challenges we faced
  14. ArcSDE, ArcGIS, the geodatabase, Windows development, UML, etc. – these are not exactly core to our skill set. However, we are now developing these skills quickly.

  15. Our Strengths
  16. Thorough knowledge of the data.

    Very experienced GIS staff, many staff with 10+ years of expertise.

    Largely the same staff developed all the data layers to be converted, some 12 years ago.

    Motivated staff, eager to move on and learn new things.

  17. What was the approach?
  18. Learn the technology

    SDE training for an administrator

    ArcGIS: ArcMap, ArcCatalog

    Geodatabase design training was crucial

    ArcObjects training: "Introduction to Programming ArcObjects with VBA"

    Use the ArcFM UML model

    We began with the published Esri ArcFM water distribution data model

    We modified the UML model to fit Seattle Public Utilities requirements

  19. What was the approach? (cont’d)
  20. System architecture design was contracted to Esri’s Dave Peters, to draw on his experience in designing the underlying infrastructure of the new environment (servers, network, client hardware, etc.)

  21. Use Esri Model to Inform
  22. Start with the existing coverages and let the Esri model guide the initial design effort.

    What are the existing coverages?

    We are not creating new data; all our data has served us well for 10 years.

    Known issues: granularity of a network, to be set as needed programmatically

    Map details (insets) for 400-foot-scale maps

    "Use cases": real-life scenarios that will reveal how the data is being used

    Data maintenance history

    Join to business data, or replicate it?

  23. Why not start from Esri Model?
  24. Too complicated

    It is more efficient to start simple and add complexity as needed

    From the ArcFM UML model, we dropped all domains and all relationships, including composite ones.

    We chose not to employ connectivity rules for now; we will re-establish them programmatically later.

    No custom behavior in classes yet

    No relationship classes

    Most of our attribute data is already stored in independent business systems

    If we join to business data at presentation time, fewer attributes remain in the model.

  25. Complex ArcFM UML model
  26. Modified City UML model
  27. By doing so
  28. Data is simpler

    We are no longer fighting to make the technology work (repository, data loads, etc.)

    Stable processes; things break less often.

  29. Join at presentation time to business system
  30. Better data design, because there is no data replication and no processes transforming or copying data

    Potential access to a broader range of data than would otherwise have been replicated

    No currency issue: we are always looking at the live, most recent data

    Still better performance
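
The presentation-time join can be illustrated with a toy example. Here SQLite stands in for Oracle, and every table name, key, and value is invented for illustration, not taken from the City's actual systems:

```python
import sqlite3

# Minimal stand-in for joining GIS features to a business system at
# presentation time instead of replicating attributes into the GIS.
# Tables, columns, and values are all made up; the real business data
# lives in Oracle (e.g., Maximo or Hansen).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE gis_mains (facility_id TEXT, length_ft REAL)")
con.execute("CREATE TABLE biz_mains (facility_id TEXT, install_year INT)")
con.executemany("INSERT INTO gis_mains VALUES (?, ?)",
                [("WM-001", 120.0), ("WM-002", 85.5)])
con.executemany("INSERT INTO biz_mains VALUES (?, ?)",
                [("WM-001", 1931), ("WM-002", 1978)])

# The join happens at query (display) time, so the GIS always sees the
# live business attributes: no copy, and no currency problem.
rows = con.execute("""
    SELECT g.facility_id, g.length_ft, b.install_year
    FROM gis_mains g JOIN biz_mains b USING (facility_id)
    ORDER BY g.facility_id
""").fetchall()
print(rows)
```

Because no attribute values are copied into the spatial tables, an update in the business system is visible in the map the next time it redraws.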

  31. Will performance kill the usefulness?
  32. Numerous proofs of concept and live demos were presented to the Design Review Team

    Display performance was good enough

    Query and symbolizing time is still an issue

    Experienced Oracle DBAs provide support and a commitment to tune the business systems (indexes, views)

  33. Current status
  34. Water model is complete

    Beta version will be released soon

    It will remain beta for several years to come while other data is migrated; editors are to be built later

    Transportation model is underway

    Completion and go-live expected about 2004-2005

    No funds for migrating user apps until the next budget cycle

  35. Now I would like to cover some important steps in the Geodatabase Creation Process
  36. Test the UML model

    Avoid database reserved keywords

    Set geo-reference

    Specify Configuration Keyword

    View properties of geonetwork

    Load data into schema

    Real-life loading strategies

    Annotation receives some extra attention

  37. Test the UML model
  38. Use the "Semantics_Checker"

    It will help eliminate most of the errors, though it will not prevent errors such as a wrong "Data Type" for the target database.

  39. Avoid database reserved keywords in field names
    When working with the UML model, avoid database reserved keywords in field names from the very early design stage (e.g., "#" or "Date" in Oracle, or "Action" in SQL), because errors will be reported later in the process of generating the schema.
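
This kind of collision can be caught with a pre-flight check before generating the schema. The sketch below is not an Esri tool, and its reserved-word set is a deliberately tiny sample, not the full Oracle list:

```python
# Illustrative pre-flight check (not an Esri tool): scan proposed field
# names for database reserved words and illegal characters before
# generating the geodatabase schema from the UML model.
# ORACLE_RESERVED is a small sample, not the complete Oracle word list.
ORACLE_RESERVED = {"DATE", "NUMBER", "LEVEL", "SIZE", "ACCESS", "MODE"}

def find_bad_field_names(field_names):
    """Return the field names that collide with a reserved word
    or contain an illegal character such as '#'."""
    bad = []
    for name in field_names:
        if name.upper() in ORACLE_RESERVED or "#" in name:
            bad.append(name)
    return bad

print(find_bad_field_names(["PIPE_ID", "Date", "FLOW#", "DIAMETER"]))
```

Running the check once over the whole UML model is much cheaper than discovering the errors one at a time during schema generation.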

  41. Set the geo-reference while generating the Schema
  42. The datasets and feature classes have to be spatially referenced to an area that covers the extent of all the features to be loaded into the geodatabase.
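
One simple way to choose that spatial domain is to take the union of the source layers' bounding boxes. The coordinates below are invented placeholders, not the City's actual extents:

```python
# Sketch: derive a spatial domain that covers every layer to be loaded
# by taking the union of the layers' bounding boxes. The extents are
# hard-coded here; in practice they would be read from the source
# coverages before the schema is generated.
def union_extent(extents):
    """extents: list of (xmin, ymin, xmax, ymax) tuples."""
    xmin = min(e[0] for e in extents)
    ymin = min(e[1] for e in extents)
    xmax = max(e[2] for e in extents)
    ymax = max(e[3] for e in extents)
    return (xmin, ymin, xmax, ymax)

layers = [
    (1260000, 183000, 1294000, 273000),  # e.g., water mains (made-up numbers)
    (1258000, 181000, 1296000, 271000),  # e.g., hydrants (made-up numbers)
]
print(union_extent(layers))  # → (1258000, 181000, 1296000, 273000)
```

A margin is often added around the union so that features digitized later still fall inside the domain.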

  43. Specify Configuration Keyword for Database storage parameters
  44. The DBA decides what parameters to use when storing the data; this can be assigned as early as the UML stage (through Properties > Tagged Values > ConfigKeyword) or later in the schema creation process.
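
For context, a configuration keyword maps to a block of storage parameters in ArcSDE's dbtune file. A hypothetical Oracle entry might look like the following (the keyword, tablespace names, and parameter values are invented for illustration, not the City's actual settings):

```
##WATER
A_STORAGE    "PCTFREE 0 INITRANS 4 TABLESPACE WATER_DATA"
B_STORAGE    "PCTFREE 0 INITRANS 4 TABLESPACE WATER_DATA"
END
```

Assigning the keyword in the UML tagged values means the DBA's storage choices travel with the model through every schema regeneration.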

  45. View and record the properties of GeoNetwork
  46. This will be crucial when recreating the geometric network later: the name has to be the same, and the feature classes (complex edge, simple junction, etc.) have to be the same as well. If the name is different, regeneration of the schema will error out.

    Do not use a long name for the geometric network; otherwise an "Underlying DBMS error" will be produced later while recreating it.

  47. Load Data into schema
  48. Use the "Simple Data Loader" in ArcCatalog; it is a very fast method and will load simple features such as points, lines, and polygons. The geometric network has to be deleted first, and there can be no composite relationships between the classes.

    Use ArcToolbox to convert coverage-format data directly to the geodatabase.

    Use the "Object Loader" in ArcMap within an edit session. For an SDE geodatabase, the data has to be versioned. This method loads data very slowly.

    Real-life data loading strategies:

    Use a combination of loading methods

    Delete geometric networks from the geodatabase schema

    Load all non-custom features first

    Build the network and reapply the model, which will create the custom object classes

    Version the data

    Use the Object Loader to populate the custom feature classes
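
The ordering constraints above can be captured in a short orchestration sketch. Every function here is a hypothetical placeholder for the manual ArcCatalog/ArcMap step (or ArcObjects automation) described in the list, not a real Esri API:

```python
# Hypothetical sketch of the loading order described above. Each step
# function is a stand-in for manual ArcCatalog/ArcMap work; it only
# records that it ran, so the required sequence is explicit.
steps_run = []

def delete_geometric_networks():
    steps_run.append("delete networks")

def load_simple_features():
    steps_run.append("load non-custom features")

def rebuild_network_and_reapply_model():
    steps_run.append("rebuild network / reapply model")

def version_data():
    steps_run.append("version data")

def object_load_custom_classes():
    steps_run.append("object-load custom classes")

# Order matters: the network must be gone before the fast Simple Data
# Loader runs, and the data must be versioned before the Object Loader
# can populate the custom feature classes.
delete_geometric_networks()
load_simple_features()
rebuild_network_and_reapply_model()
version_data()
object_load_custom_classes()
print(steps_run)
```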

  49. Annotation in the Geodatabase
  50. Cannot be created with CASE tools such as UML

    Has to be created in ArcCatalog and converted within ArcMap

    There are two kinds: a feature-linked annotation class, with a composite relationship between feature and annotation, and a non-feature-linked annotation class

  51. Annotation (cont’d)
  52. Task: existing coverage annotation in library format has to be converted to non-feature-linked annotation in the geodatabase. The City invested thousands of dollars in the existing annotation layers.

    Process: create a non-feature-linked annotation class in ArcCatalog

    View the existing coverage-format annotation in ArcMap

    Set the font, color, and size, and choose the right arrowhead marker symbol, still in ArcMap

  53. Annotation (cont’d)
  54. Convert the annotation within ArcMap using the command tool available under Customize > Commands > Label > Convert Coverage Annotation…; drag and drop that tool onto an ArcMap toolbar.

    The result: the size of the converted annotation did not change (a small problem)

    Fix: go back to the coverage annotation and change the pseudo item value to $SIZE = 0

    Then the size can be controlled while loading into the geodatabase.

  55. Annotation in ArcInfo Library
  56. Annotation in Sde/Oracle geodatabase
  57. Conclusions
  58. There are already some geodatabase implementations out there, best used for reference.

    Get some help; consider hiring an experienced consultant, at some level of participation, in order to draw on practical experience.

    There were some glitches along the way, and workarounds had to be employed frequently, so go slowly with the implementation.

  59. Questions?

 

Janusz Krawczynski

GIS Analyst

City of Seattle

Seattle Public Utilities

710 Second Ave

Seattle, WA 98104

(206) 233 5173

janusz.krawczynski@ci.seattle.wa.us