Perhaps the shrimp was old, or maybe the flame was too hot. Maybe it was the difference between a pinch and a dash of basil. Or perhaps you allowed the butter to separate. In any event, the dish did not taste right. You found Joanne the next day. She confessed that the recipe had taken time for her to perfect, but she promised that, given time, your shrimp tortellini would be just as mouthwatering as hers.
I know it sounds odd, but shrimp tortellini and the Global Positioning System (GPS) have much in common. The perception is that anyone can make shrimp tortellini given a recipe; how hard can it be to follow directions? The same perception holds that anyone can use GPS given proper directions. While these statements may be true in the long run, the simple fact is that without training and practice, you may have to throw out your meal, and your data.
There is a knack, a sense of cohesion and comprehension, that comes with proper training, practice, and long-term exposure to a task, a piece of equipment, or a software package. Learning the difference between the taste of basil and thyme, or knowing how to minimize the risk of multipath effects, comes from learning, practice, and exposure. This understanding helps you avoid the pitfalls that lurk in any task, which brings me to the focus of this paper. From the perspective of GIS management, and without going into great detail, I hope to make the reader aware of the subtleties of GPS technology, to suggest methods of avoiding these subtle pitfalls, and to champion the idea that GPS is not easy.
GPS receivers such as Trimble's Pro XL pick up these signals. Using the information embedded in the signal, the receiver selects the four best satellites to use in the trilateration calculation that derives the final solution. The simplified trilateration calculation for each measurement involves four unknowns: the x value, the y value, the z value, and time. To solve for four unknowns in mathematics, you need four equations; hence, you need four satellites.
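To make the four-equations, four-unknowns idea concrete, here is a minimal sketch of the underlying arithmetic. It is not the Pro XL's actual firmware algorithm; the Gauss-Newton iteration, the function name, and any satellite data you feed it are purely illustrative.

```python
# Illustrative sketch only: solve for four unknowns (x, y, z, receiver clock
# bias) from four or more pseudorange equations using a textbook Gauss-Newton
# linearization. Satellite positions and pseudoranges must be supplied.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_xyz, pseudoranges, iterations=10):
    """Estimate receiver position and clock bias from >= 4 pseudoranges."""
    sat_xyz = np.asarray(sat_xyz, dtype=float)
    pseudoranges = np.asarray(pseudoranges, dtype=float)
    x = np.zeros(4)                          # [x, y, z, clock bias (s)]
    for _ in range(iterations):
        diff = sat_xyz - x[:3]               # vectors from receiver to satellites
        dist = np.linalg.norm(diff, axis=1)
        predicted = dist + C * x[3]          # predicted pseudoranges
        # Jacobian: partials with respect to x, y, z and the clock bias
        J = np.hstack([-diff / dist[:, None], np.full((len(dist), 1), C)])
        dx, *_ = np.linalg.lstsq(J, pseudoranges - predicted, rcond=None)
        x += dx
    return x[:3], x[3]
```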
However, as mentioned earlier, the time variable is encrypted. The encryption has the effect of time distortion, which in turn distorts the GPS locations by as much as 400 meters (Figure 1).
FIGURE 1.
There are military GPS units available to Federal agencies which decode the time variable; however, the positional accuracy for these units is only around 15 meters. To get 1-3 meter or sub-meter accuracy, differential correction is needed. Differential correction requires two sources of data: the raw, uncorrected GPS data captured by the GPS rover receiver (the unit in the field), and the corrected GPS information from a base station.
A base station is a stationary GPS receiver whose antenna position has been surveyed to sub-centimeter accuracy. Since the base station's location is already known, any variance can be determined by simply subtracting the base station's raw measured location from its known location. This process is carried out on a second-by-second basis. The resulting corrections are then stored in flat files, broadcast via radio waves, or both.
Basically, the base station data provides a correction vector (i.e., a distance and an angle) that can be applied to the raw data collected by the GPS rover, either in real time or in post-processing. Each raw measurement is corrected using the correction vector applicable to the exact time the raw data was collected. After all the measurements have been corrected, they are averaged to produce a single solution: the location, or point (Figure 2).
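As a rough illustration of the process just described, the sketch below matches rover epochs to base-station epochs by timestamp, applies the per-second correction, and averages the corrected measurements into one point. The dictionary structures and the function name are my own invention, not Pathfinder's file formats.

```python
# Minimal sketch of post-processed differential correction: per-second
# corrections from the base station are matched to rover epochs by timestamp,
# applied, and the corrected epochs averaged into a single solution.
import numpy as np

def differential_correct(rover, base_raw, base_known):
    """rover, base_raw: {timestamp: (x, y)}; base_known: surveyed (x, y)."""
    corrected = []
    for t, pos in rover.items():
        if t not in base_raw:
            continue                                      # no matching base epoch
        correction = np.subtract(base_known, base_raw[t]) # base error at time t
        corrected.append(np.add(pos, correction))
    if not corrected:
        raise ValueError("no overlapping epochs with the base station")
    return np.mean(corrected, axis=0)                     # averaged single solution
```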
While this description is very basic, I encourage GPS users and users of GPS data to research the topic in greater detail. The more you understand the technology, the better equipped you are to understand and deal with data problems caused by the subtle pitfalls of GPS.
FIGURE 2.
The PDOP mask setting is a filter that shuts off data collection if the PDOP rises above the mask. To assure 95% confidence of meeting the 1 to 3 meter criterion, Trimble suggests a PDOP mask of no greater than 4.0. Practical experience, however, shows that such a setting can mean a lot of field time spent waiting for the satellites to reach an optimum configuration. A more realistic setting is a PDOP of 6.0. Note that the PDOP mask should be set no higher than 6.0; the error introduced by raising the PDOP does not grow gently, it climbs rapidly as the satellite geometry degrades. Be wary.
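A PDOP mask is nothing more than a filter on satellite geometry. The toy sketch below shows the idea; the epoch records and field names are made up, and the 6.0 value simply mirrors the mask discussed above.

```python
# Illustrative sketch of a PDOP mask: epochs whose PDOP exceeds the mask are
# simply not used. Epoch records and field names are invented for the example.
PDOP_MASK = 6.0

def apply_pdop_mask(epochs, mask=PDOP_MASK):
    """Keep only epochs collected under acceptable satellite geometry."""
    return [e for e in epochs if e["pdop"] <= mask]

epochs = [{"time": 1, "pdop": 3.8}, {"time": 2, "pdop": 7.5}, {"time": 3, "pdop": 5.9}]
print(apply_pdop_mask(epochs))   # the 7.5 epoch is dropped
```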
Unfortunately, the PDOP mask is an easy target for the impatient GPS'er; let us call him FRED. The enticement to 'bump up' the PDOP mask to 6.2, or 7.5, or even 8 is great when the mosquitoes are biting, the heat is intense, and a cool, air-conditioned car waits.
Another benefit of the 15° setting is that it increases your chances of using the same satellites the base station is using. Base stations are set to 10° in order to see as many satellites as possible (a base station is usually mounted on a very high tower, which effectively shifts its horizon line). Because the base station sees more satellites than you do, your chances of using the same satellites increase.
This easy option is a good candidate for a FRED attack. By lowering the elevation mask, FRED can receive signals from satellites he could not read before. These satellites may indeed lower the PDOP, which lets him collect data and slip past the PDOP filters. There is currently no filter to catch elevation mask changes. However, there are times when this setting should be changed. For instance, if you are at the extreme edge of the base station's coverage, the angle should be increased to limit the number of satellites you can use. The overlap lost to the curvature of the earth limits the duplicate use of the same satellites by both the base station and the rover, and in order to correct the positions, the base station and the rover must be using the same satellites. This principle applies to both real-time and post-processed differential correction.
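The sketch below illustrates why the rover's elevation mask matters to differential correction: only satellites seen by both the base station and the rover can be used. The PRNs and elevation angles are invented for the example.

```python
# Illustrative only: a higher rover elevation mask trims exactly the low
# satellites the base station may not share, leaving the common set that
# differential correction can actually use.
def visible(sat_elevations, mask_degrees):
    """Return the set of satellite PRNs above the elevation mask."""
    return {prn for prn, elev in sat_elevations.items() if elev >= mask_degrees}

rover_view = {5: 62.0, 12: 34.0, 19: 16.0, 24: 11.0}   # PRN -> elevation (deg)
base_view  = {5: 60.0, 12: 37.0, 19: 18.0, 30: 13.0}

common = visible(rover_view, 15.0) & visible(base_view, 10.0)
print(common)   # only these satellites are usable for differential correction
```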
Manual 2D mode should never be used for regular GPS needs. This selection uses only three satellites to calculate the location instead of the required four, and it requires the user to enter a constant altitude instead of allowing the satellites to determine it. Like time, altitude is one of the four variables used to calculate the location. This selection may work on the ocean or on a flat plain, but if there is any variance in elevation, the GPS unit will not compensate. Without an accurate altitude, the locational accuracy is suspect.
Unfortunately, FRED loves to use this option to cut corners. For example, there are times when you must wait to get the four needed satellites before continuing to collect data, but if FRED selects Manual 2D, he can collect points without waiting on the satellites. Another outcome of selecting Manual 2D is that FRED changes the PDOP equation: instead of four satellites, the equation now uses three. In most cases the PDOP will fall below the required level and allow FRED to continue collecting data.
Auto 2D/3D mode should never be used for regular GPS needs either. This option uses Manual 3D mode until the PDOP rises above the PDOP switch setting; once it does, the unit reverts to Manual 2D, and all of the problems with Manual 2D apply.
Manual 3D mode is used most often. This mode uses four or more satellites to calculate the location. With all the other required settings correct, this mode will give 95% confidence that your position falls within 1-3 meters.
Overdetermined 3D mode is the best possible solution. This mode forces the GPS unit to collect data only when there are five valid satellites to use; any fewer and the unit will not collect data. Most, however, consider this overkill. Manual 3D will also use more than four satellites when they are available, and the likelihood of having four satellites available is much better than the likelihood of having five. Therefore, Manual 3D is the preferred mode.
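Since these mode and mask settings are exactly the ones FRED likes to fiddle with, a database manager might sanity-check them before accepting a day's data. The sketch below is one hedged way to do that; the dictionary keys are illustrative, as the TDC1 does not export its configuration in this form.

```python
# Hedged QA sketch: check a rover configuration against the settings argued
# for above (Manual 3D, PDOP mask <= 6.0, 15 degree elevation mask).
RULES = {
    "position_mode": "Manual 3D",
    "max_pdop_mask": 6.0,
    "min_elevation_mask": 15.0,
}

def check_config(cfg):
    """Return a list of human-readable problems with a configuration."""
    problems = []
    if cfg.get("position_mode") != RULES["position_mode"]:
        problems.append(f"position mode is {cfg.get('position_mode')!r}, expected 'Manual 3D'")
    if cfg.get("pdop_mask", 99.0) > RULES["max_pdop_mask"]:
        problems.append(f"PDOP mask {cfg.get('pdop_mask')} exceeds {RULES['max_pdop_mask']}")
    if cfg.get("elevation_mask", 0.0) < RULES["min_elevation_mask"]:
        problems.append(f"elevation mask {cfg.get('elevation_mask')} below {RULES['min_elevation_mask']} degrees")
    return problems

print(check_config({"position_mode": "Manual 2D", "pdop_mask": 8.0, "elevation_mask": 10.0}))
```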
Some see the log as a waste of space; however, given its value to the database manager as a QA/QC control, this item is invaluable, and the space it consumes is inconsequential.
The problem with this option comes into play if you want only real-time differentially corrected data. Getting an RTCM lock can often be tedious and aggravating. If you do not know the tricks (and sometimes even if you do), there are times when you cannot get an RTCM lock and must collect raw data. FRED will often collect raw data without even trying any tricks.
Knowing that the data collected is raw data is not necessarily a problem, provided you have access to base station data. Without base station data, the information cannot be corrected and could be off by as much as 100 meters.
Just from experience, however, post-processing is not something you would want to do every day, or ever. It is time-consuming work, and I personally avoid it like the plague. My rule of thumb is that if I cannot get a real-time corrected signal, the very last option in my arsenal is to post-process.
In practice, however, the warning time can be a useful tool for avoiding post-processing. Given the ability to edit out points that fall more than a few standard deviations from the rest, this option can be a blessing. For example, if you were unable to get an RTCM lock at the feature site due to high-voltage power lines, but you could get a lock 40 feet away, you could use the technique of 'coming in hot'. This technique starts where the RTCM lock is established. The warning time is bumped up to whatever setting is necessary to get the required number of measurements. While receiving an RTCM signal, and with data collection paused, the GPS'er hurries to the feature site and resumes the collection process. Using this option requires special care, however: the data must be monitored for outliers, which may skew the location.
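The outlier monitoring mentioned above can be as simple as flagging measurements that fall more than a few standard deviations from the mean of the cluster. The sketch below assumes the positions have been exported as x/y pairs; the two-sigma threshold is an illustrative choice, not a Trimble recommendation.

```python
# Minimal outlier-flagging sketch: mark measurements farther than n_sigma
# standard deviations from the mean of the collected positions.
import numpy as np

def flag_outliers(positions, n_sigma=2.0):
    """positions: array-like of shape (n, 2); returns a boolean mask of outliers."""
    positions = np.asarray(positions, dtype=float)
    mean = positions.mean(axis=0)
    dist = np.linalg.norm(positions - mean, axis=1)
    return dist > n_sigma * dist.std()
```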
Of course, this option also introduces the possibility that FRED will abuse it. If you use a single correction vector to correct 100 seconds' worth of raw data, you might as well just take raw data. But FRED can slip this data into your database easily, because it is labeled as real-time corrected, and there are no filters to catch such errors.
One possible problem is that FRED can enter any number he desires; essentially, FRED could take one measurement for each feature. In order to reach the desired 95% confidence at the 1-3 meter level, the minimum Trimble suggests is 180 measurements. Fortunately, practical experience has shown that as few as 20 measurements will do.
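A corresponding filter is trivial to write, which is a good reason to have one. The attribute name below is my own; the 20-measurement floor reflects the practical experience cited above, and Trimble's suggested 180 is noted in the comment.

```python
# Minimal sketch of a measurement-count filter. Trimble suggests 180
# measurements for 95% confidence at 1-3 meters; 20 has proven workable in
# practice. The attribute name 'num_measurements' is illustrative.
MIN_MEASUREMENTS = 20

def enough_measurements(feature):
    """Reject features averaged from too few measurements."""
    return feature.get("num_measurements", 0) >= MIN_MEASUREMENTS
```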
One measurement can also be taken in extreme cases. That one measurement is better than offsetting, increasing the warning time, bumping up the PDOP, changing the input mode, or post-processing the data; at least you will have one good measurement. I am not implying that real-time correction is better than post-processed data. With the advent of carrier-phase processing, post-processing is much more accurate than real-time correction. However, the time involved in post-processing must be taken into consideration. For small projects, post-processing may be the best option, but for large projects with massive amounts of information, real-time correction is preferred.
However, this option is not in the regular stream of options and can easily be overlooked if a new GPS'er grabs a unit previously set for a constant offset. If the problem is caught in time, it is really no big deal; the data can be cleaned. But if the data is processed into a database without the knowledge that an offset was applied, then the data in the database is incorrect. Unfortunately, no one will ever know, and all the GPS data will be assumed accurate to 1-3 meters.
Fortunately, offset settings are stored in a separate area, away from the actual data, and the offset must be applied within the software before it becomes part of the final solution. In most cases, however, the default is to allow offsets to be applied, which is why the constant offset setting can be a problem.
If all these TDC1 options are correctly set, then the risk of locational error is minimized. You can say with 95% confidence that the solutions will be within 1-3 meters. But there is no fix for FRED. If FRED is on your team, you will have a hard time collecting good data. And you can take your 95% confidence and throw it out the window.
To make a correct offset, however, the GPS'er should take no fewer than two offsets for each feature, and the two offset readings should be taken at different angles to the feature. For each offset, the GPS'er must measure the distance, angle, and inclination from the solution to the feature. This can be time consuming. Moreover, the farther the GPS'er is from the feature, the greater the possible angular error. Because offsets are such a bother, they are an invitation for FRED's creativity to bloom.
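For readers who want to see the geometry, here is a hedged sketch of applying a single offset and averaging two of them. It assumes local planar coordinates and a north-referenced, clockwise bearing; that is a simplification for illustration, not Pathfinder's own computation.

```python
# Illustrative offset geometry: project the measured slope distance to the
# horizontal using the inclination, then move from the GPS solution along the
# recorded bearing. Coordinates, bearings, and distances below are made up.
import math

def apply_offset(solution_xy, distance, bearing_deg, inclination_deg):
    """solution_xy: (easting, northing); distance in the same units as the coordinates."""
    horizontal = distance * math.cos(math.radians(inclination_deg))
    bearing = math.radians(bearing_deg)
    de = horizontal * math.sin(bearing)   # easting component
    dn = horizontal * math.cos(bearing)   # northing component
    return (solution_xy[0] + de, solution_xy[1] + dn)

# Two offsets taken at different angles to the same feature, then averaged.
p1 = apply_offset((1000.0, 2000.0), 30.0, 45.0, 5.0)
p2 = apply_offset((1012.0, 1995.0), 25.0, 310.0, 2.0)
feature = ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)
```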
The first thing FRED will do is gather the GPS data away from the physical feature and not apply the offset at all. If FRED is within the 1-3 meter radius, there is no significant problem, but the problem for you is that YOU WILL NEVER KNOW!
FRED could, of course, do the offsets. But knowing FRED, he paced off the distance as ten paces, figured his pace at approximately 3 feet, and entered 30 ft. He did not even use his inclinometer; he looked at the sun and guessed the angle and the inclination. Despite FRED and his influence, offsets are a useful tool when used correctly. You will never reach the 1-3 meter accuracy of normal data collection, but sometimes you have to take what you can get.
Avoiding multipath effects is simple. If you are surrounded by buildings, water towers, trees, large vehicles, large people (no kidding), or electromagnetic fields (power lines, generators, big motors), the chance that multipath effects will degrade your data is great. Your alternatives are to take the position and note that multipath effects may be present, to raise the GPS antenna, or to take an offset away from the obstructions. I would suggest two things. First, use a data dictionary or GPS log book to note the environment in which you are taking the position. If you are downtown between buildings, under a water tower, by a cliff face, or next to high-voltage wires, these environmental variables should be noted. Such GPS metadata allows users of the GPS information to select and check for possible multipath errors.
My second suggestion is to purchase a large range pole. Most GPS antennas come with a fairly long cable (approximately 15 feet), and extensions are available. For example, rather than fight multipath effects while trying to GPS the boundary of a fernery under a heavy oak canopy, one GPS'er got a cable extension and a fifty-foot range pole. The job was completed in short order, and because the antenna was above the tree canopy, the multipath errors were eliminated.
Of course, there is always FRED, who will do as little as possible. Multipath effects mean nothing to him, because they do not slow him down one bit. There are no filters to catch these errors, nor are there any settings that could eliminate the problem. So the only efficient means of avoiding FRED's bad data is to be wary of data collected in geographical locations that are likely to cause multipath errors. Besides offsets and multipath effects, there are other collection procedures that can affect the accuracy of the data. These include the antenna location, position, and angle; the number of positions captured; the data processing; the data entry; the data dictionaries; and so on. However, my intent in this paper is to dispel the idea that GPS is easy, not to list every possible problem there may be with GPS.
Pathfinder Office has a configuration setup menu on its home menu page. This configuration defaults to the current display setup in the export setup options. The problem is that the home menu page is too easily accessible, so any user of the software may change the projection for his or her own needs; the current display setup then changes along with the home menu configuration setup.
With the export setup programmed to the needed projection, however, a bad user would have to enter the batch processor to change it. That change is not as easy as the first and is less likely to occur, which is why the export setup is preferred. I do not have to tell you what can happen when you combine location information in two different projections; needless to say, the data will be locationally impaired.
In Florida, one second of longitude is approximately 89 feet and one second of latitude is approximately 101 feet. If we round up to an even 100 feet, then one decimal place of a second is 10 feet, and two decimal places equate to 1 foot. Only a survey-grade GPS unit can deliver sub-meter accuracy; if you are using a Pro XL or XR, the most you can hope for is an accuracy of 1-3 meters, or one decimal place. Therefore, if you are publishing locational data, use only one decimal place, if any.
However, if you are generating a coverage, keep as many decimal places as you can. This decreases the rounding errors that may creep into location calculations during the coverage generation process and ensures that the integrity of the 1-3 meter GPS accuracy is maintained.
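The arithmetic is easy to verify. The loop below uses the rounded figure of roughly 100 feet per second of arc; the exact values vary with latitude.

```python
# Quick check of the decimal-place arithmetic above. Purely illustrative;
# ~89 ft (longitude) to ~101 ft (latitude) per arc-second in Florida is
# rounded here to an even 100 ft.
FEET_PER_ARC_SECOND = 100.0

for places in range(0, 4):
    resolution_ft = FEET_PER_ARC_SECOND / (10 ** places)
    print(f"{places} decimal place(s) of a second ~ {resolution_ft:g} ft")
# 0 -> 100 ft, 1 -> 10 ft, 2 -> 1 ft, 3 -> 0.1 ft
```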
In a database, these sources of information help determine why locations are corrupt and aid in database management. For instance, if a location is suspect, the database manager need only look at the two position counts to see whether measurements were filtered out of the averaged solution. A quick look at the PDOP and the standard deviation will tell whether there was a good cluster of measurements. The correction type will tell whether the data was post-processed, corrected in real time, or corrected in some other way. Finally, the manager can use the time and date to find the original raw data file and examine the original measurements.
I strongly encourage the use of these attributes. They are a great way to battle the FREDDIES of the GPS world, and they enhance the reliability of the data in your charge.
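As a hedged illustration, a review like the one just described can be scripted against the stored attributes. The attribute names and thresholds below are my own choices, not a fixed Pathfinder export schema.

```python
# Hedged QA sketch: flag a feature whose stored attributes look suspect.
# Attribute names and thresholds are illustrative.
def suspect(feature):
    """Return the reasons a feature's stored attributes look questionable."""
    reasons = []
    if feature["positions_used"] < feature["positions_collected"]:
        reasons.append("measurements were filtered out of the averaged solution")
    if feature["pdop"] > 6.0:
        reasons.append("PDOP above the 6.0 mask")
    if feature["std_dev_m"] > 3.0:
        reasons.append("loose cluster of measurements")
    if feature["correction_type"] == "uncorrected":
        reasons.append("raw, uncorrected data")
    return reasons

print(suspect({"positions_used": 12, "positions_collected": 40,
               "pdop": 7.1, "std_dev_m": 4.2, "correction_type": "uncorrected"}))
```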
For example, a training program I put together for the St. Johns River Water Management District (SJRWMD) in Florida consisted of a Trimble certification (one week), a departmental GPS certification (one week), a month of on-the-job training, a written test (a score of 85 to pass), a field practical, and an intern/mentor period. These gateways must be passed before an employee can enter data into the database, and data sent in by new GPS'ers is closely monitored. Still, with all these precautions, we had a few GPS'ers who just never got the knack of GPS'ing. I relate this experience because it is a good example of a rigorous training regimen that still had participants who passed through the gateways but were not good GPS'ers; there were even some FREDDIES in the lot. This should give you an idea of the difficulty and the need for proper training.
However, I encourage you to teach your crew more than they really need to know. If I can be so black and white, there are two types of GPS'ers: the foxes and the lemmings. Don't worry about the lemmings. They will only do what you tell them to do. Worry about the foxes. The foxes are crafty and will root out things you had never dreamed they would find. So if you show them how things work, they will be less likely to mess with them later.
Secondly, automated processing forces regimentation in GPS data collection in the field. Data dictionaries must be set up exactly, offset procedures must be addressed, data entry criteria must be established, and an overall data collection QA/QC dogma can be set for all the GPS'ers on the project. Again, this builds teamwork and dedication between the GPS'ers and the database manager. With automation, FRED will have a hard time slacking off: not only does the automation remove much of his tampering, but others in the program are likely to address FRED's lack of commitment.
Along with password protection, another form of security is to append the GPS'er's name to each feature he or she collects. This gives the GPS'er a sense of responsibility for, and ownership of, the data. With this type of security, two things are accomplished: the database's integrity is kept high, and the GPS'ers develop a greater dedication. Given this dedication, FRED hasn't a chance. For instance, with our system I regularly get complaints from one crew member about another crew member's GPS practices or the data he collected. I find this dedication indispensable for finding errors and weeding out the FREDDIES in the group.
This process need not be truly automated; simply pulling the data from shapefiles into ArcView can serve the same purpose. However, I stress that without this step, bad data will most certainly find its way into the database. As a case in point, let me again refer to the application used by SJRWMD, where many GPS data errors have been found through visual verification. In one instance, a GPS'er collected data all day. Unfortunately, he did not check his TDC1 configuration, for someone had changed the PDOP setting to 12, so he gathered information with too high a PDOP. The automated filters removed the bad data, and when he tried to verify the locations of the sites visually, he noticed that over half his data was missing. The filters did their job, the visual verification proved it, and the GPS'er learned a valuable lesson.
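If ArcView is not at hand, even a bare plot serves the purpose of visual verification. The sketch below assumes the day's features have already been exported as simple easting/northing pairs; reading shapefiles directly is beyond this example.

```python
# Minimal visual-verification sketch: scatter the day's points and eyeball
# them against where you expected to be. Input format is assumed, not a
# particular shapefile or ArcView project.
import matplotlib.pyplot as plt

def quick_plot(points, title="Today's GPS features"):
    """points: iterable of (easting, northing) pairs."""
    xs, ys = zip(*points)
    plt.scatter(xs, ys, marker="+")
    plt.title(title)
    plt.xlabel("Easting")
    plt.ylabel("Northing")
    plt.gca().set_aspect("equal")
    plt.show()
```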
I have also offered insight into the places where errors can occur, both unintentionally and intentionally. Armed with this knowledge, I hope you will look more carefully at GPS data. Do not take data that seems to be GPS-verified for granted. Ask serious questions about how the data was collected. Ask what the TDC1 configurations were. Ask for the original raw data. If the data supplier cannot answer your questions, it may be wise not to accept the data.
Remember that time is necessary to fully understand any technology. With proper training, practice, and long term exposure to GPS, you will get the knack, and you will not have to throw out your data. However, your shrimp tortellini may be another matter. I think you used a bit too much garlic.
Hofmann-Wellenhof, B., H. Lichtenegger, and J. Collins, "GPS Theory and Practice," 2nd ed. Springer-Verlag Wien, New York, 1993.
Leick, Alfred, "GPS Satellite Surveying." Department of Surveying Engineering, University of Maine. John Wiley & Sons, 1990.
Shrestha, Ramesh L., Bon Dewitt, and Matt Wilson, "Applications of GPS in GIS." Surveying and Mapping, University of Florida, July 1993.
Trimble Navigation Inc., "Training Manual: Pro XL System with Pfinder Software." Part Number: 30990-00. Revision: A. June 1996.
Trimble Navigation Inc., "Mapping Systems; General Reference." Part Number: 24177-00. Revision: B. April 1996.
Trimble Navigation Inc., "Pathfinder Office Software." Vol I - III. Part Number: 31310-00, 31311-00, 31312-00. Revision: A. October 1996.
Wells, David, "Guide to GPS Positioning." Canadian GPS Association, 1987.