"No exemplary models for integrating GIS into preservice teacher preparation programs exist."
(Bednarz & Audet, The Status of GIS Technology in Teacher Preparation Programs, 1999)

Since the inclusion of geography as a core subject in the Goals 2000: Educate America Act, there has been widespread acceptance among citizens of the United States of the goal of developing students who are internationally competitive as well as productive and responsible citizens in a global economy. In response to this desire for a geographically literate society, the National Geography Standards 1994 were developed. The function of the standards is to help students and teachers develop a clear understanding of what geography is and how to effectively apply that understanding to life (National Geography Standards Project, 1994).
The effective teaching of the National Geography Standards has been a focus of K-12 social studies curricula throughout the nation. Researchers have argued that, to teach the standards effectively, teachers require a clear understanding of geographic information systems (Bednarz, 1995; Sui, 1995). A geographic information system (GIS) is software that allows a user to store, retrieve, manipulate, and display geographic data about any place in the world (Environmental Systems Research Institute, 1998). Even though such an understanding of GIS is necessary, GIS has not been adopted in American K-12 classrooms at the rate that the National Science Foundation, the Environmental Systems Research Institute (Esri), and geography educators had once hoped (Environmental Systems Research Institute, 1998; Fitzpatrick, 2002). Fitzpatrick (2002) noted that Esri's goal in 1992 was for K-12 educators to be its largest single group of users by 1996, a goal still not achieved. The key reason for this slow pace of GIS integration, according to Bednarz and Audet (1999), is that no "exemplary models for integrating GIS into preservice teacher preparation programs exist" (p. 65).
In response to this lack of exemplary models for teaching GIS, the purpose of this study is to develop and research the effectiveness of three GIS instructional models for university-level instructors to use within preservice teacher education courses.
In response to the first goal, Geography for Life: National Geography Standards 1994 was written outlining eighteen standards that K-12 students should meet to become geographically literate. The National Geography Standards have been described as a "vital contribution" toward helping students "use their minds well, so they may be prepared for responsible citizenship, further learning, and productive employment in our Nation's modern economy . . ." (National Geography Standards Project, 1994). Unfortunately, the second and third goals have not resulted in much technology implementation or curriculum revision in college or K-12 geography classrooms (Bednarz & Audet, 1999).
Curricula in geography education focus on how individuals observe, conceptualize, analyze, and evaluate information from a spatial perspective (Fredrick & Fuller, 1997). Three types of technology that can provide this spatial dimension have been identified: exploratory systems such as atlas CD-ROMs; database systems such as geographic information systems; and simulation systems such as National Geographic's Weather Machine (Fitzpatrick, 1990). Of these, database systems and geographic information systems (GIS) have been recognized as critical for the implementation of computer-based technology in geography education (Bednarz & Audet, 1999; Keiper, 1999; Kerski, 2001). In fact, authors of the National Geography Standards have identified GIS as the only technology that can assist students in meeting all of the standards (Bednarz, 1995; Sui, 1995).
This study, in response to Brownwell's (1997) and Bednarz and Audet's (1999) appeals for the development of an effective GIS teaching model for K-12 classrooms and for university preservice teachers, develops and researches three GIS teaching models for preservice teacher education programs. The GIS pedagogies are based on three instructional models employed by teachers of the Jasper Woodbury Problem Solving Series published by the Cognition and Technology Group at Vanderbilt (CTGV). To help students develop effective problem-solving skills, the CTGV encourages teachers to use anchored instruction, which "provides a way to recreate some of the advantages of apprenticeship training in formal education settings involving groups of students" (CTGV, 1990, p. 6), by creating learning environments that emphasize scaffolding and generative thinking (CTGV, 1992). These instructional approaches are the same ones that many educators have identified as necessary for successfully teaching GIS to learners at all age levels (Bednarz, 1995; Keiper, 1999). Therefore, the goals of both the Jasper Woodbury Problem Solving Series and GIS are for students to "learn to become independent thinkers and learners rather than simply become able to perform basic computations and retrieve simple knowledge" and to develop their ability to "identify and define issues and problems on their own rather than simply respond to problems that others have posed" (CTGV, 1990).
The three instructional models CTGV (1990) identified for teaching the Jasper series are "Basics First, Immediate Feedback Direct Instruction," "Structured Problem Solving," and "The Guided Generation Model." The first model, "Basics First, Immediate Feedback Direct Instruction," focuses on the order in which content and tasks in a curriculum are presented. This model is grounded in an "extreme reductionist view that all components of a skill must be mastered before the components can be assembled into the skill they comprise" (CTGV, 1990). The second model, "Structured Problem Solving," focuses on the importance of learners making errors and struggling with a task. This model rejects the assumption "that errorless learning is ideal" in favor of the position that "important lessons of learning occur only when students make errors or reach impasses and are then helped to correct their initial misconceptions" (CTGV, 1990). The third model, "The Guided Generation Model," is grounded in the role of the teacher in the learning process, which can span the range from "authoritative provider of knowledge to a resource who at times is consulted by the students and at other times can become the student whom others teach" (CTGV, 1990).
In addition to developing and researching GIS instructional models, this study will analyze the effect of learners' field dependence or field independence on GIS learning, because GIS use is grounded in a learner's ability to successfully navigate and learn with a spatial perspective (Audet & Paris, 1997). Field dependence and field independence form an established spatial-ability cognitive style that correlates with the ability to perform successfully in a computer-based instructional environment (Riding & Cheema, 1991). The key difference between field independent and field dependent learners is their visual perceptiveness, that is, their ability to distinguish the parts of an image or visual environment from the whole (Riding & Cheema, 1991). Given that GIS is a visual environment where distinguishing details is critical, this variable may inform the development of a GIS model of learning for preservice teachers.
The design of the ArcVoyager software includes the "ArcVoyager Guide," which aids GIS learners at the learning levels mapped out in Table 1. It was developed to help learners understand what GIS is, provide users with ready-to-use data, and allow students to explore geography.
| GIS Learning Levels | Description |
|---|---|
| Level 1 | Engages just the help file, providing a tightly controlled text- and graphic-based introduction to basic concepts and skills of spatial thinking with a computer. |
| Level 2 | Engages an ArcView project, with a world atlas and a constrained interface that lets the user explore in a tightly controlled environment. Users learn easily about choosing, adding, and deleting themes; turning themes on and off; shuffling layers; zooming in and out; and identifying features, all while looking at about 25 different layers. |
| Level 3 | Engages a more open and powerful but still customized interface, with more capabilities involving tools that students and teachers use most often. Here, users start with a vast set of data at the ready, and can add an unlimited amount and modify it all in standard ArcView fashion. |
| Level 4 | Engages the Level 3 interface and provides an open set of data without pre-arranged starting layers. |
Teachers who participated in this study used ArcVoyager for their GIS software. This study focused on the teachers' abilities to perform GIS at levels one through three as shown in Table 1.
The "Basics First" GIS model was a web-based module developed with Macromedia Dreamweaver that included text and graphics to illustrate the main GIS subskills and subconcepts. This module was designed to help learners build their basic cognitive foundation in the use of a geographic information system prior to participating in the project exams. The content presented in the module consisted of ten sections:
A worksheet with ten complex practice questions (http://www.geographyeducation.com/gisweb/structuredindex.html) required students to use basic subskills and subconcepts in parallel with learning the ten sections of the "Basics First" web-based module. To minimize students' confusion within this module, QuickTime video tutorials (http://www.geographyeducation.com/gisweb/structuredindex.html) were developed. The QuickTime videos included graphics, animation, and narration that provided a web-based tutorial for successfully completing the ten complex practice questions. Because the videos could be played, stopped, and replayed, they reduced confusion and minimized errors while students completed the practice questions.
In addition to generative learning, scaffolding was a teaching strategy used by instructors of the guided generation model (CTGV, 1992). Vygotsky (1978) described scaffolding as occurring when an instructor identifies where a student is in his or her learning and then uses instructional tools to build upon the student's experiences and move the student to a higher level of understanding.
The materials in this module consisted of a video news clip (http://www.geographyeducation.com/gisweb/guidedindex.html) that imitated a news report announcing the "Case of the Missing Ship" (Esri, 1998): a message received in the "cartography" room from the "decoding" room reported that a ship had gone down somewhere on Earth, but no one knew the exact location of the disaster. The ship had been on a "highly secret" spying mission. Within this video, the news anchor provided nine clues that could be used with a geographic information system (GIS) to find the ship's location. The video provided "anchored" instruction for the GIS learners; anchored instruction "provides a way to recreate some of the advantages of apprenticeship training in formal educational settings involving groups of students" (CTGV, 1990).
In addition to the video news clip, in order to scaffold the students' learning, a web-based module featuring QuickTime videos was developed to instruct students on four domains needed to solve the problem of finding the ship. These domains consisted of the biosphere (people, plants, animals), atmosphere (air), lithosphere (rocks), and hydrosphere (water).
The split-half reliability of the GEFT, published in the test manual and using the Spearman-Brown formula, is 0.82 for men and women. The validity of the GEFT, as determined by comparing the GEFT with the Embedded Figures Test, is an r = -.82 for men and r = -.63 for women (Witkin, Oltman, Raskin, & Karp, 1971). The overall usefulness and validity of the GEFT with undergraduate, graduate, and adult populations has been supported by numerous examinations that have established internal consistency, reliability, and validity (Day, McRaie, & Young, 1990; Panek, Funk, & Nelson, 1980).
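The split-half figure above rests on the Spearman-Brown correction, which projects the correlation between two test halves up to the reliability of the full-length test. A minimal sketch follows; the 0.69 half-test correlation is back-calculated from the reported 0.82, not a figure from the test manual:

```python
def spearman_brown(half_test_r: float) -> float:
    """Spearman-Brown prophecy formula for split-half reliability.

    Given the correlation r between two test halves, estimate the
    reliability of the full-length test: r_full = 2r / (1 + r).
    """
    return 2 * half_test_r / (1 + half_test_r)

# A half-test correlation of about 0.69 yields roughly the 0.82
# full-test reliability reported for the GEFT.
print(round(spearman_brown(0.69), 2))  # → 0.82
```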
The project exams measured how well the teachers used GIS to identify a given location on a map when given nine clues. The exam questions were set in an authentic situation that the students could relate to in their everyday lives. Both project exams have been used extensively in GIS training throughout the nation and have been found to be valid and reliable (Fitzpatrick, 2002). Fitzpatrick (2002) noted that the project exams clearly measure how well students learn and use GIS at all age levels and have provided a consistent measurement over the last four years.
The first measurement of GIS learning was the project exam "Case of the Missing Ship" (Esri, 1997). This exam is "accomplished by applying good critical thinking" (Esri, 2002) while effectively using a geographic information system. The teachers were required to find a location on a map using a geographic information system, the data layers that were provided, and the clues from the exam. The exam develops a scenario in which a ship on a secret mission has crashed on Earth; however, no one knows the exact location. Given nine clues, the teachers were asked to find the exact location on a map using ArcVoyager (see http://www.geographyeducation.com/gisweb for the exam).
The focus of the second exam was to identify the delayed retention of GIS knowledge. The second measurement of GIS learning was "Magic Dan's Extreme Sea & Ski Resort" (Esri, 1998). This exam required using the GIS skills learned for the first exam, "Case of the Missing Ship" (Esri, 1997), within a dissimilar situation: a business siting. The exam called upon the teachers to assume the role of a site analyst and find the best location to develop a resort. Once again, they were given nine clues needed to find a successful business location and were asked to identify this location on the map (see http://www.geographyeducation.com/gisweb for the exam).
A GIS basic skills exam was also used to assess the teachers' ability to understand basic GIS procedural tasks. The basic skills exam consisted of ten multiple-choice questions (see http://www.geographyeducation.com/gisweb for the exam).
| Teaching Methods | GIS Problem Solving (Cognitive Engagement) | Problem Solving Delayed-Retention (Cognitive Engagement) | GIS Basic Skills (Procedural Knowledge) | Basic Skills Delayed-Retention (Procedural Knowledge) |
|---|---|---|---|---|
| Basics First | . | . | . | . |
| Structured Problem Solving | . | . | . | . |
| Guided Generation | . | . | . | . |
| ArcVoyager Guide (Control) | . | . | . | . |
To account for instructor bias in the delivery of each teaching method, an instructor observer was trained to make systematic observations based on Moore's (2001) "making systematic observations" guidelines. The observer collected descriptive data - "data that have been organized, or quantified by an observer, but do not involve a value judgment" (p. 294), and valued data - "data that involve the judgment of an observer" (p. 294), using frequency and duration measurements. The data from each module instruction were compared to make sure instructor bias was minimal.
Of the 160 teachers who were invited to participate in the study, 142 took part: 40 were instructed in the "Basics First" method, 35 in the "Structured Problem Solving" method, 35 in the "Guided Generation" method, and 32 received the control group instruction.
An example of the "Basics First" method was instructing the teachers how to add data layers within ArcVoyager. First, a description and graphic of a data layer were provided in the module, and the teachers were instructed on what a data layer is and how it is imported into ArcVoyager. Second, the teachers were asked if they had any questions about adding a data layer; if there were questions, the instructor made sure they were completely answered. Lastly, after all of the questions were answered, the teachers practiced the skills themselves.
This method of teaching continued until all subskills and subconcepts were covered for all ten sections of the module. Upon the completion of the module, the teachers were given 90 minutes to complete the exam identifying the location they believed was the answer. The teachers were able to ask questions of the instructor at any time throughout the exam. The instructor did not provide answers to the exam question, but focused on answering procedural knowledge questions. The teachers wrote the latitude and longitude of their answer and recorded their response confidence. Upon completion of the first project exam, teachers completed the GIS basic skills exam.
The teachers' delayed-retention was measured when they reconvened in the same computer lab two weeks later to take the second project exam, titled "Magic Dan's Extreme Sea & Ski Resort" (Esri, 1998). The teachers once again identified the location they believed was the correct answer by writing the latitude and longitude and recording their response confidence. The teachers also completed the second GIS basic skills exam. They were given 90 minutes to complete both exams.
The teachers completed the project exam titled "The Case of the Missing Ship" (Esri, 1997). The teachers wrote the latitude and longitude of their answer and recorded their response confidence. Upon completion of the first project exam, the teachers completed the GIS basic skills exam.
The teachers' delayed-retention was measured when they reconvened in the same computer lab two weeks later to take the second project exam, titled "Magic Dan's Extreme Sea & Ski Resort" (Esri, 1998). The teachers once again identified the location they believed was the correct answer by writing the latitude and longitude and recording their response confidence. The teachers also completed the second GIS basic skills exam. They were given 90 minutes to complete both exams.
Once the teachers had watched the "missing ship" news clip, they were immediately given the exam to work on, with access to the same web-based module as the "Structured Problem Solving" group. In addition to the "Structured Problem Solving" instruction, however, web-based QuickTime videos were developed to scaffold their learning by instructing them on the four content domains needed to solve the problem of finding the ship. These domains consisted of the biosphere (people, plants, animals), atmosphere (air), lithosphere (rocks), and hydrosphere (water). The videos scaffolded the teachers' learning by providing instruction on how to add the data layers of the earth science content knowledge. For example, one of the clues in solving the exam reads: "Records indicate that the January temperature is around 25 degrees Celsius, perhaps a little higher, perhaps a little lower. Data indicate that, on the day of the ship's disappearance in June, the site was between 20 and 30 degrees Celsius." The teachers therefore needed to know how to import the layer, "Atmosphere - Air", that contained the temperature data on the day the ship went down, June 1, 1995. One of the videos showed how to import, identify, and select the correct layer to find the temperature data.
The teachers used the clues from the news clip and the web-based modules on the use of ArcVoyager and the four content domains to solve the problem. The teachers wrote the latitude and longitude of their answer and recorded their response confidence. Upon completion of the first project exam, the teachers completed the GIS basic skills exam.
The teachers' delayed-retention was measured when they reconvened in the same computer lab to take the second project exam, titled "Magic Dan's Extreme Sea & Ski Resort." The teachers once again identified the location they believed was the correct answer by writing the latitude and longitude and recording their response confidence. The teachers also completed the second GIS basic skills exam. They were given 90 minutes to complete both exams.
Using the clues from the exam, the teachers wrote the latitude and longitude of their answer and recorded their response confidence. Upon completion of the first project exam, teachers completed the GIS basic skills exam.
The teachers' delayed-retention was measured when they reconvened in the same computer lab two weeks later to take the second project exam, titled "Magic Dan's Extreme Sea & Ski Resort" (Esri, 1998). The teachers once again identified the location they believed was the correct answer by writing the latitude and longitude and recording their response confidence. The teachers also completed the second GIS basic skills exam. They were given 90 minutes to complete both exams.
A test of group differences was the second step in the MANOVA. If the overall F test was significant, a test of group differences was used to identify where the differences occurred. The test used was Wilks' lambda (U), the most common statistic when the independent variable forms more than two groups, as in this study (Hand & Taylor, 1987).
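To illustrate the statistic itself (a hedged sketch on hypothetical data, not the study's actual computation), Wilks' lambda is the ratio of the determinant of the pooled within-group sums-of-squares-and-cross-products (SSCP) matrix to that of the total SSCP matrix; values near zero indicate strong group separation on the dependent variables:

```python
import numpy as np

def wilks_lambda(groups):
    """Wilks' lambda: det(W) / det(T), where W is the pooled
    within-group SSCP matrix and T the total SSCP matrix."""
    all_obs = np.vstack(groups)
    # Total SSCP: deviations of every observation from the grand mean
    dev_total = all_obs - all_obs.mean(axis=0)
    T = dev_total.T @ dev_total
    # Within-group SSCP: deviations from each group's own mean
    W = sum((g - g.mean(axis=0)).T @ (g - g.mean(axis=0)) for g in groups)
    return np.linalg.det(W) / np.linalg.det(T)

# Two hypothetical groups measured on two dependent variables;
# the second group is shifted, so lambda falls far below 1.
g1 = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 4.0], [2.0, 2.0]])
g2 = g1 + 5.0
print(round(wilks_lambda([g1, g2]), 3))  # → 0.074
```

When the group means coincide, the between-group SSCP vanishes and lambda equals 1.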
The background variables of the students who participated in this study were analyzed in order to establish the quasi-equivalence of the treatment groups. Within this study, random assignment to groups was impossible; therefore, it was important to demonstrate that the students in the groups had similar demographic characteristics. If the students in the groups can be shown to have similar background characteristics, then it is more likely that any significant gains in GIS performance are due to the treatments and not to some other background variable.
The results of the demographic questionnaire were analyzed using a t-test for the continuous variable and a Pearson chi-square test for the categorical variables. Because the four groups had different sample sizes, a pooled variance estimate was used. All analyses were performed using the Statistical Package for the Social Sciences (SPSS, 2000).
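The pooled variance estimate mentioned above weights each group's variance by its degrees of freedom before forming the t statistic. A minimal pure-Python sketch of that computation (illustrative only; the study performed these tests in SPSS, and the data below are hypothetical):

```python
from math import sqrt

def pooled_t(sample_a, sample_b):
    """Two-sample t statistic with a pooled variance estimate,
    appropriate when group sizes differ (as in this study)."""
    n1, n2 = len(sample_a), len(sample_b)
    m1 = sum(sample_a) / n1
    m2 = sum(sample_b) / n2
    # Unbiased sample variances
    v1 = sum((x - m1) ** 2 for x in sample_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample_b) / (n2 - 1)
    # Pooled variance weights each group's variance by its df
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    return (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))

# Hypothetical data: two small groups of questionnaire scores
print(round(pooled_t([1, 2, 3, 4], [2, 3, 4, 5]), 3))  # → -1.095
```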
The demographic results indicated that, for the most part, the groups were equivalent. There were no significant differences on any variable between any of the groups except between the "Basics First" and the "Guided Generation" groups, which differed significantly on the "total number of countries traveled to" and the "total number of states traveled to." Given this difference, all performance data in this study were analyzed against these two demographic variables to determine whether any differences were merely due to the number of places one had traveled; this effect was found to be nonsignificant.
| Treatment Group | N | Mean* | Std. Deviation | p** |
|---|---|---|---|---|
| Control | 32 | 47.701 | 62.9725 | |
| Basics First | 40 | 30.609 | 50.3295 | 0.205 |
| Control | 32 | 47.701 | 62.9725 | |
| Structured Problem Solving | 35 | 36.4222 | 53.4472 | 0.431 |
| Control | 32 | 47.701 | 62.9725 | |
| Guided Generation | 35 | 38.54107 | 38.5417 | 0.353 |

Note. * Lower number is better performance. ** Significant at the .05 level.
However, as shown in Table 4, when delayed-retention was measured two weeks later, the "Basics First" and "Structured Problem Solving" groups performed significantly better than the control group.
| Treatment Group | N | Mean* | Std. Deviation | p** |
|---|---|---|---|---|
| Control | 32 | 141.394 | 93.9213 | |
| Basics First | 40 | 55.107 | 50.3292 | 0.000 |
| Control | 32 | 141.394 | 93.9213 | |
| Structured Problem Solving | 35 | 36.422 | 53.4472 | 0.001 |
| Control | 32 | 141.394 | 93.9213 | |
| Guided Generation | 35 | 133.335 | 102.5423 | 0.748 |

Note. * Lower number is better performance. ** Significant at the .05 level.
When cognitive engagement was measured between groups, rather than only against the control group, there were no significant differences between the three treatment groups (Basics First, Structured Problem Solving, Guided Generation).
| Treatment Group | N | Mean* | Std. Deviation | p** |
|---|---|---|---|---|
| Control | 32 | 5.031 | 1.8749 | |
| Basics First | 40 | 7.700 | 1.5392 | 0.000 |
| Control | 32 | 5.031 | 1.8749 | |
| Structured Problem Solving | 35 | 7.7714 | 1.5546 | 0.000 |
| Control | 32 | 5.031 | 1.8749 | |
| Guided Generation | 35 | 6.600 | 1.4994 | 0.001 |

Note. * Higher number is better performance. ** Significant at the .05 level.
Two weeks after the first exam, the GIS basic skills test was given again. As shown in Table 6, the "Basics First" and "Structured Problem Solving" groups again performed significantly better than the control group at the .05 level, while the "Guided Generation" group's advantage did not reach significance (p = .066).
| Treatment Group | N | Mean* | Std. Deviation | p** |
|---|---|---|---|---|
| Control | 32 | 4.563 | 1.8826 | |
| Basics First | 40 | 5.820 | 1.8098 | 0.005 |
| Control | 32 | 4.563 | 1.8826 | |
| Structured Problem Solving | 35 | 6.531 | 1.8468 | 0.000 |
| Control | 32 | 4.563 | 1.8826 | |
| Guided Generation | 35 | 5.433 | 1.775 | 0.066 |

Note. * Higher number is better performance. ** Significant at the .05 level.
When procedural knowledge was analyzed between groups, rather than only against the control group, there were no significant differences between the three treatment groups (Basics First, Structured Problem Solving, Guided Generation).
Cognitive Engagement Performance - Exam 1

| Significant Correlations | Pearson | Sig. (2-tailed)* |
|---|---|---|
| # of Science Courses Taken | -0.196 | 0.022 |
| GPA | -0.557 | 0.001 |
| Response Confidence - Exam #1 | 0.438 | 0.000 |
| Response Confidence - Exam #2 | -0.187 | 0.029 |
| Procedural Knowledge - Exam #1 | -0.212 | 0.013 |
| Procedural Knowledge - Exam #2 | -0.363 | 0.049 |

Cognitive Engagement Performance - Exam 2

| Significant Correlations | Pearson | Sig. (2-tailed)* |
|---|---|---|
| Rating of GIS Knowledge | -0.213 | 0.009 |
| Response Confidence - Exam #2 | -0.452 | 0.000 |
| Procedural Knowledge - Exam #1 | -0.336 | 0.000 |
| Procedural Knowledge - Exam #2 | -0.219 | 0.010 |

Procedural Knowledge - Exam 1

| Significant Correlations | Pearson | Sig. (2-tailed)* |
|---|---|---|
| Rating of GIS Knowledge | 0.233 | 0.006 |
| Rating of Computer Skills | 0.203 | 0.017 |
| Response Confidence - Exam #1 | 0.218 | 0.011 |
| Response Confidence - Exam #2 | 0.360 | 0.000 |
| Cognitive Engagement - Exam #1 | -0.212 | 0.013 |
| Cognitive Engagement - Exam #2 | -0.336 | 0.000 |
| Procedural Knowledge - Exam #2 | 0.705 | 0.000 |

Procedural Knowledge - Exam 2

| Significant Correlations | Pearson | Sig. (2-tailed)* |
|---|---|---|
| Rating of GIS Knowledge | 0.208 | 0.015 |
| Rating of Computer Skills | 0.171 | 0.046 |
| Response Confidence - Exam #1 | 0.222 | 0.009 |
| Response Confidence - Exam #2 | 0.307 | 0.000 |
| Cognitive Engagement - Exam #2 | -0.279 | 0.010 |
| Test Time - Cognitive Eng. - Exam #2 | 0.221 | 0.010 |
| Procedural Knowledge - Exam #1 | 0.705 | 0.000 |

* Significant at the .05 level.
As Table 7 indicates, the correlations between procedural knowledge and cognitive engagement were significant for both cognitive engagement exams and both procedural knowledge exams. Both the number of science courses taken and GPA correlated significantly with the first cognitive engagement exam. Response confidence correlated significantly with all exams, except that for the second cognitive engagement exam only the second response confidence measure was significant. Self-rating of GIS knowledge correlated with both procedural knowledge exams and the second cognitive engagement exam. Self-rating of computer skills correlated with procedural knowledge exams one and two.
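The coefficients in Table 7 are Pearson product-moment correlations. As a reference for interpreting them, a minimal sketch of the computation (illustrative only, on hypothetical data, not the study's analysis pipeline):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists:
    covariance of the deviations divided by the product of the
    deviation norms."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# A perfectly linear positive relationship yields r = 1; a perfectly
# linear negative relationship yields r = -1.
print(round(pearson_r([1, 2, 3], [2, 4, 6]), 6))  # → 1.0
```

Note that several Table 7 coefficients involving cognitive engagement are negative because a lower cognitive engagement score indicates better performance.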
An in-depth analysis of why teachers may have performed better in the Basics and SPS treatment groups was provided in the personal interviews. Teachers in the Basics group identified two main reasons why they believed they performed well. They included 1) access to the instructor to ask questions and 2) opportunity to practice GIS procedural steps until they felt "comfortable." Even though the teachers had access to the web-based module, none of the teachers interviewed said they used it beyond the initial instruction. They noted that it was much "quicker," "easier," and "less stressful" to simply ask the instructor. One teacher said, "I would much rather wait until [he] had time to answer my question than go searching for it on the web-based module." They also felt that the majority of their questions were answered in the beginning because they were able to practice the "basics" until they had no further questions.
Teachers in the SPS treatment group identified two main reasons why they felt they were successful: 1) the opportunity to practice GIS within a situation that was relevant to the exam, and 2) the ability to watch the movie at any time throughout the exam. Four of the five students who were interviewed stated that the practice questions "definitely helped" as they were able to "see exactly why [they] were doing the GIS steps." All of the students mentioned that they used the QuickTime videos that were provided for them. They said they "bypassed" the graphics and "jumped right to the videos." One student commented, "It was great! I could fast forward, rewind, stop. I had complete control over finding the answer and matching what I was supposed to do in ArcVoyager with the QuickTime videos."
When the students in the Basics and SPS treatment groups were asked if they felt they understood what GIS is and how it could be used, all ten of them answered "yes." They believed they "knew enough to be successful" within their classrooms. They felt they knew the benefits of using GIS and commented on them in detail. They felt the next step was to start developing lessons that integrated GIS technology.
The Guided treatment group did not perform significantly better than the control group on either cognitive engagement exam. Within the interviews, the teachers commented that they understood the task, as the "video was excellent in setting the stage," but they felt "lost" and "frustrated" when they tried to answer the problem. The students commented that they felt they understood the GIS "functions, but didn't know how to use them." Four of the teachers commented that they would have liked some extra guidance.
The Basics treatment group performed extremely well on all exams. These teachers noted that access to the instructor was most important in their GIS learning; they would rather wait to ask the instructor than look for the answer in the web-based module. The quality of the GIS instructor is therefore very important when learning GIS. If instructors with different GIS backgrounds and understanding use the Basics pedagogy, one must ask, "Will the GIS learners perform at such a high level?" This concern also leads to a second point: online GIS learning.
The SPS treatment group also performed significantly better than the control group with access only to the complex practice questions and QuickTime videos. Therefore, if one were going to teach GIS online, or could not offer assistance at the level required in the Basics treatment group, the SPS approach would be most beneficial. Learners in the Guided treatment group identified this kind of assistance as the missing variable in their GIS learning.
The Guided treatment group felt that they understood what they were supposed to do after watching the anchoring video (http://www.geographyeducation.com/gisweb/guidedindex.html) but didn't "know where to start." This was true even though they had scored significantly better than the control group on the procedural knowledge exam. They noted that they needed to know how to apply their GIS skills to the task at hand. Therefore, a constructivist approach that provides no instructor guidance was not found to be effective for GIS learning.
This study strove to identify an effective pedagogy for teaching GIS to preservice teachers. Of the four approaches used, two were identified in which the teachers performed significantly better on both GIS cognitive engagement and procedural knowledge. These two pedagogies are very different and provide two methodologies to choose between, depending on whether GIS learners can be given assistance within the classroom throughout a four-hour period or are encouraged to learn on their own with only the tools provided to them. The initial findings of this study offer GIS educators two successful ways of teaching preservice teachers and other GIS learners. It is hoped that these pedagogies will encourage others to pursue avenues of research related to GIS pedagogy in preservice teacher education as well as in the K-12 classroom.
Bednarz, S. W. (1995). Reaching New Standards: GIS and K-12 Geography. Retrieved April 10, 2001, from the World Wide Web: http://www.odyseey.maine.edu/gisweb/spatdb/gislis95/gi95006.html
Bednarz, S. W., & Audet, R. H. (1999). The Status of GIS Technology in Teacher Preparation Programs. Journal of Geography, 98(2), 60-67.
Brownwell, K. (1997). Technology in teacher education: Where are we and where do we go from here? Journal of Technology and Teacher Education, 5, 117-138.
CTGV. (1990). Anchored instruction and its relationship to situated cognition. Educational Researcher, 19(6), 2-10.
CTGV. (1992). The Jasper Experiment: An Exploration of Issues in Learning and Instructional Design. Educational Technology, Research and Development, 40(1), 65-80.
Day, D., McRaie, L. S., & Young, J. D. (1990). The Group Embedded Figures Test: a factor analytic study. Perceptual and Motor Skills, 70(3), 835-839.
Environmental Systems Research Institute. (1998). White Paper: GIS in K-12 Education. Esri Press.
Esri. (1997). Case of the Missing Ship. Retrieved February 5, 2001, from the World Wide Web: http://www.Esri.com/k-12
Fitzpatrick, C. (1990). Computers in geography instruction. Journal of Geography, 89, 148-149.
Fitzpatrick, C. (2002). Personal Communication. In A. Doering (Ed.). St. Paul.
Fredrick, B., & Fuller, K. (1997). What we see and what they see: Slide tests in geography. Journal of Geography, 97, 63-71.
Keiper, T. (1999). GIS for Elementary Students: An Inquiry Into a New Approach to Learning Geography. Journal of Geography, 98, 47-59.
Kerski, J. J. (2001). The Implementation and Effectiveness of Geographic Information Systems Technology and Methods in Secondary Education. Paper presented at the Esri Education User Conference, San Diego, CA.
National Geography Standards Project. (1994). Geography for Life: National Geography Standards 1994. Washington DC: National Geographic Society.
Nellis, D. (1994). Technology in geographic education: Reflections and future directions. Journal of Geography, 93(1), 36-69.
Panek, P. E., Funk, L. G., & Nelson, P. K. (1980). Reliability and validity of the Group Embedded Figures Test across the life span. Perceptual and Motor Skills, 50, 1171-1174.
Riding, R., & Cheema, I. (1991). Cognitive Styles--An Overview and Integration. Educational Psychology: An International Journal of Experimental Educational Psychology, 11(3-4), 193-215.
SPSS. (2000). Statistical package for the social sciences (Version 10.0). Chicago: Author.
Sui, D. (1995). A pedagogic framework to link GIS to the intellectual core of geography. Journal of Geography, 94, 578-579.
Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Witkin, H. A., Oltman, P. K., Raskin, E., & Karp, S. A. (1971). A Manual for the embedded figures test. Palo Alto: Consulting Psychologists Press.
Wittrock, M. C. (1974). Learning as a generative process. Educational Psychologist, 11, 87-95.