FEATURE

Using Immersive Virtual Environments to Create Immersive Science Assessments

Authors: Diane Jass Ketelhut, Catherine Schifter, Melissa Karakus, Uma Natarajan, Angi Shelton, Chris Teufel, Brian Nelson, Cecile Foshee, Younsu Kim, and Kent Slack

Affiliations: University of Maryland, Temple University, Arizona State University

ABSTRACT:
Situated Assessment using Virtual Environments for Science Content and Inquiry (SAVE Science) is an NSF-funded study developing an innovative system for contextualized, authentic assessment of learning in science. In SAVE Science, we are creating, implementing, and evaluating a series of immersive virtual environment-based modules for assessing both science content and inquiry in the middle grades. The modules are designed to enable students to perform a series of assessment tasks that provide data about how well they have mastered and can apply content knowledge and inquiry skills taught via their regular classroom curricula. We hypothesize that through careful design of the virtual environment-based assessments, data can be collected and analyzed to produce meaningful and accurate inferences about student learning that provide additional insights about student understanding beyond what is possible from more traditional assessments. With videos and web links, this article describes the project, each module, and the associated structures facilitating implementation, concluding with a link to download an example module and opportunities to join the research team.

Introduction

How do we assess whether students are achieving our goals for scientific understanding? The National Research Council [1] in the United States suggests that tests should reflect the complexity of science. Unfortunately, the current climate in the United States places the burden of assessment on standardized tests, which do not reflect this complexity. Students are more often assessed on whether they understand terms such as “hypothesis” or “control,” while in-depth assessment of their abilities to formulate questions and hypotheses and to design and analyze experiments is often neglected [2].

In many states, the state assessment has tried to account for this by creating detailed open-ended questions. However, in order to set the stage, these questions rely on students’ reading abilities as well as their science knowledge. For example, consider this physical science question from the 2009 National Assessment of Educational Progress (NAEP) test.

Try to answer this question. Did you have any difficulties? What if English is not your first language, or you are a poor reader? It turns out that, according to the SMOG readability test, this 4th grade question is rated at a 10th grade reading level! Does this question then test your science knowledge, your reading comprehension, or your understanding of English? This problem is not limited to state and national tests. It has also been raised regarding the international TIMSS assessment, where one study showed that students could correctly answer science questions in an interview that they had answered incorrectly on a TIMSS administration because of poor reading and English language skills [3].
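To make the readability point concrete, the SMOG grade mentioned above is derived from a simple formula based on the number of polysyllabic words in a sample of sentences. The short Python sketch below is our own illustration of that calculation, not part of the SAVE Science software; the syllable counter is a deliberately crude heuristic.

# Rough SMOG readability estimate (illustrative only; not SAVE Science code).
# SMOG grade = 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291
import math
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def smog_grade(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    return 1.0430 * math.sqrt(polysyllables * 30 / len(sentences)) + 3.1291

A question whose estimated grade level far exceeds the grade being tested is measuring reading comprehension as much as science understanding.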

We suggest that assessments embedded in immersive virtual environments offer one approach to fulfilling the NRC’s recommendation for more authentic and complex science assessments. Situated Assessment using Virtual Environments for Science Content and Inquiry (SAVE Science) is an NSF-funded study developing an innovative system for contextualized, authentic assessment of learning in science. In SAVE Science, we are creating, implementing, and evaluating a series of digital game-based modules for assessing both science content and inquiry in the middle grades. The modules, based in an immersive virtual environment (IVE), are designed to enable students to perform a series of assessment tasks that provide data about how well they have mastered and can apply content knowledge and inquiry skills taught via their regular classroom curricula. We hypothesize that through careful design of the virtual environment-based assessments, data can be collected and analyzed to produce meaningful and accurate inferences about student learning that provide additional insights about student understanding beyond what is possible from more traditional assessments.

Module Descriptions

This section gives a brief overview of five SAVE Science modules: two introductory modules and three assessment modules in science for middle school children. The purpose of the introductory modules is to acclimate students to the virtual environment by having them navigate an avatar, interact with characters, use embedded tools, and solve scientific problems. The three assessment modules, Sheep Trouble, Weather, and Basketball, are designed to assess students’ understanding of adaptation and speciation, fronts and air masses, and gas laws, respectively. Video 1 provides a visual overview of the SAVE Science modules.

Video 1. An overview of the SAVE Science modules.


Introductory Module 1: Snake Trouble

Figure 1. A student carrying corn in his backpack while coming upon NPC Bill and his corncrib.

The Snake Trouble module is designed as a form of pre-training to introduce students to Scientopolis, a virtual medieval world. Unlike the related assessment modules that students complete later, Snake Trouble does not assess students on any previously learned curriculum. Instead, the purpose of this introductory module is to provide students with experience operating an avatar in a virtual world, interacting with objects they find, using the embedded investigation tools, called SciTools (see Figure 1), and gathering and applying evidence to solve a problem through scientific inquiry.

In Snake Trouble, students meet Farmer Brown and his brother Bill, two non-player characters (NPCs), who are featured in subsequent assessment modules. Farmer Brown and Bill have found snakes on their farm. Out of fear, a neighboring farm worker has killed most of the snakes, but now discovers that her corn is disappearing rapidly. Farmer Brown does not like the senseless killing of animals and wants the remaining snakes to be left alone. His brother Bill is fearful of the snakes and wants them all killed. Farmer Brown asks students to collect evidence to convince his brother that the remaining snakes should not be harmed. Students investigate the snake problem using their SciTools, building familiarity with the tools that they will need to use in subsequent modules. Students also learn how to interact with artifacts, such as newspapers, which they can use to gather information within this problem-solving quest.

Assessment Module 1: Sheep Trouble

Figure 2. Boy character looking at Farmer Brown, an original sheep, and an imported sheep.

The Sheep Trouble assessment module is designed to evaluate student understanding of concepts of beginning speciation and adaptation within a given environment. Using a cast of characters similar to that of the Snake Trouble module, Farmer Brown asks students to investigate why newly imported sheep, brought in from a flat island out in the middle of the ocean, are not thriving on his farm. Through interactions with the characters, farm, and sheep, students explore the problem, collecting and analyzing data and connecting it to their scientific understanding. Data collection includes talking to characters, observing the world, measuring physical attributes of the sheep, and determining the age of the sheep (see Figure 2). Students use inquiry, problem-solving skills, science content understanding, and their investigative SciTools to deduce that the recently imported sheep have legs adapted to flat terrain, which prevent them from traversing the hilly terrain to reach the only food source available on the farm.
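As a rough illustration of the kind of comparison students make with their SciTools measurements, the Python sketch below contrasts a measured attribute of the two sheep groups; the attribute, values, and names are hypothetical and are not drawn from the module’s actual data.

# Illustrative comparison of a measured sheep attribute (hypothetical values;
# not data from the SAVE Science module).
original_leg_lengths = [62.0, 64.5, 63.0, 61.5]   # cm, sheep raised on the hilly farm
imported_leg_lengths = [48.0, 47.5, 49.0, 46.5]   # cm, sheep from the flat island

def mean(values):
    return sum(values) / len(values)

difference = mean(original_leg_lengths) - mean(imported_leg_lengths)
print(f"Average leg-length difference: {difference:.1f} cm")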

Assessment Module 2: Weather

Figure 3. Female character observing the painter and newspaper girl and their artifacts.

This module is designed to assess weather concepts such as air masses, fronts, and precipitation. Scientopolis is in the midst of a severe drought; crops are dying and animals are starving. Because the townspeople believe the town is jinxed, they are packing up to leave. Farmer Brown, who has lived in the area his whole life and does not want to depart, tasks students with a mission to save his town by finding a scientific explanation for the drought and predicting when it might end. To accomplish this, students must collect data about the weather from several locations and sources. Students use their SciTools to measure wind direction, barometric pressure, and temperature, and then compare these measurements with the graph or notes tools. Further, students can interact with various characters and artifacts to obtain information about previous weather conditions in order to discover weather patterns. Once students deem they have collected enough information, they present their conclusions and support their claims with scientific evidence.
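As a simple illustration of the kind of pattern students look for in their barometric readings, the Python sketch below flags a sustained pressure drop, a classic signal of an approaching low-pressure system or front; the readings and the threshold are hypothetical and are not taken from the module.

# Illustrative check for a falling-pressure trend (hypothetical readings and
# threshold; not values from the SAVE Science module).
readings_hpa = [1021.0, 1018.5, 1015.0, 1011.5]  # barometric pressure over successive days

def pressure_trend(readings, threshold=2.0):
    """Label the trend as 'falling', 'rising', or 'steady' based on total change."""
    change = readings[-1] - readings[0]
    if change <= -threshold:
        return "falling"   # often associated with an approaching low-pressure system or front
    if change >= threshold:
        return "rising"
    return "steady"

print(pressure_trend(readings_hpa))  # -> "falling"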

Introductory Module 2: Mysteria

Figure 4. Female character interacting with the Transportation Expert NPC.

The Mysteria module introduces students to a cartoon-themed version of the Scientopolis virtual world. Mysteria and its related assessment modules are equipped with a different set of tools from those included in the medieval virtual world of the Snake Trouble, Sheep Trouble, and Weather modules. Mysteria is set in the present day, on a tropical island. Like the Snake Trouble introductory module, Mysteria is not designed to assess students on any particular curriculum. Instead, its function is to familiarize students with the world by providing them time to learn how to maneuver their character, interact with objects and characters, and take measurements. A researcher character asks students to investigate the newly discovered island to determine whether it is safe to use as a research base. During their investigation of the island, students complete mini-challenges that expose them to the investigation tools (again called SciTools), objects, and character dialog tools that they will need to use in related assessment modules set in this virtual world.

Assessment Module 3: Basketball

Figure 5. A student measuring the bounce height of a basketball inside the Jordan Gymnasium.

The main purpose of the Basketball module is to assess students’ knowledge of gas laws and related properties, as well as aspects of scientific inquiry. In this module, students are asked to help Julius, the manager of a local basketball tournament, find out why basketballs are not bouncing well at the outdoor tournament court while identical balls are functioning properly at the indoor court in the local gymnasium. To investigate this problem, students can interact with various characters from the outdoor and indoor courts to obtain information about conditions in each environment. Students can also gather information about objects found at each court (basketballs and balloons) using a number of SciTools, such as pressure gauges, scales, and tape measures. To support scientific investigations within the module, students can pick up and carry basketballs and balloons in a personal backpack between the outdoor and indoor basketball courts to observe and test any changes in the measurements when the objects are exposed to different environmental conditions.
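To make the underlying physics concrete, consider the gas law relationship students are expected to apply; the numbers here are illustrative and are not taken from the module. For a ball whose volume stays roughly constant, the ideal gas law reduces to P1/T1 = P2/T2, with temperatures in kelvin. If a ball is inflated indoors at T1 = 295 K (about 22 °C) to an absolute pressure P1 and then carried to a cold outdoor court at T2 = 275 K (about 2 °C), its pressure falls to P2 = P1 × (275/295) ≈ 0.93 P1, roughly a 7% drop, which is enough to noticeably soften its bounce.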

SAVE Science Implementation and Professional Development

SAVE Science has been implemented with more than 1000 middle school students in a large urban district and many suburban districts in the mid-Atlantic region of the United States. Implementation of the SAVE Science project in the classrooms has been an evolutionary process requiring a responsive, dynamic, and flexibly designed teacher professional development (PD) component. This section discusses the goals of the various PD sessions as they evolved over the first three years of the project.

The primary goal of early PD sessions was to introduce the teachers to the SAVE Science modules and to model the elements necessary for implementing SAVE Science in their classrooms. After the students completed the assessment modules, teachers led audio-taped classroom discussions called “wraparounds,” which were designed to help teachers link the students’ problem-solving processes to their classroom instruction. As more modules were designed and developed, subsequent PD sessions focused on introducing the teachers to the newly added features. Video 2 presents a historical synopsis of implementation, with a special emphasis on professional development.

Video 2. Historical synopsis of implementation (emphasis on professional development).


Thanks to a teacher retention rate of over 80%, the project has seen simultaneous growth in the average experience level of its participating teachers and in the volume of data collected to inform design improvements. In addition to performance results from the assessment modules, the data shaping the program’s future include teacher interviews, feedback from PD sessions, student focus groups, and written and verbal responses from the “wraparound” sessions. As we began to integrate teacher feedback into the modification of our professional development sessions, “delivery-oriented” instructional PDs grew into collaborative sharing sessions. Building an ongoing teacher-researcher community, now hosted on the SAVE Science Google Group, was a success in this regard. The intention is to build a more sustainable support system within which teachers rely upon each other, rather than on researchers, as the program grows beyond its original research parameters.

The SAVE Science team believes that, in addition to building teachers’ professional knowledge through the PD, personal investment is greater when teachers are engaged in other domains of project planning with the research team. As the project develops further, we have become more reflective about our designs at every stage and have moved from a “needs-driven” approach to a participant “autonomy-driven” approach. The SAVE Science dashboard, which serves as an “implementation portal” for teachers, adds a new dimension to the PD design: teachers are exposed to tools and resources that will empower them to take “ownership” and control of the entire implementation process in the future.

Monitoring Student Performance in SAVE Science

The SAVE Science teacher dashboard has been designed to house the tools and resources necessary for implementation and data management. It automates the creation of student logins and increases teacher ownership and control of the student learning process by providing access to information on student progress and performance. Both teachers and researchers use the dashboard to view the data collected from students during their virtual environment assessments. Through the dashboard, teachers have real-time access to their students’ answers and in-world actions (all actions students undertake in the assessment modules are automatically recorded with time and location stamps). Teachers and researchers can generate reports of students’ performance in the modules from their module answers as well as their in-world actions, which include tool measurements, collisions with in-world characters, graphs, and clipboard recordings. All of these data from the dashboard are analyzed to capture student understanding.
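As an illustration of how such time- and location-stamped events might be represented and summarized, the Python sketch below is a simplified model of our own; the field names and event types are assumptions, not the actual SAVE Science logging schema.

# Simplified sketch of time- and location-stamped event records (field names
# and event types are assumptions, not the actual SAVE Science schema).
from dataclasses import dataclass
from collections import Counter

@dataclass
class InWorldEvent:
    student_id: str
    module: str
    event_type: str          # e.g. "tool_measurement", "npc_dialog", "answer_submitted"
    timestamp: str           # ISO 8601 time stamp
    location: tuple          # (x, y, z) position in the virtual world

def summarize(events, student_id):
    """Count each event type for one student, as a basis for a simple report."""
    return Counter(e.event_type for e in events if e.student_id == student_id)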

The administrative functionalities in the dashboard allow teachers to do the following: upload student rosters, create student login accounts for access to the assessment modules, record and update the status of student participation consents in the system, and administer surveys. Teacher surveys are also conducted through the dashboard. The survey creation tool allows researchers to include an optional audio component in which the surveys are read aloud to the students; this is intended to reduce challenges for students with reading difficulties and/or language barriers.
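A minimal sketch of how a survey item with an optional read-aloud recording might be represented is shown below; the structure and field names are our assumptions, not the dashboard’s actual data model.

# Minimal sketch of a survey item with optional read-aloud audio (structure
# and field names are assumptions, not the dashboard's actual data model).
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SurveyItem:
    prompt: str
    choices: List[str]
    audio_url: Optional[str] = None   # if set, the item can be read aloud to the student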

The database management functionalities in the dashboard permit both teachers and researchers to view the data collected from students during their virtual environment assessments. For example, the “Classroom Views” feature allows teachers to monitor the “Progress” of their students while they are working through the assessment modules and to view the results once the students have “Completed” them. Video 3 details what a participating teacher can see and do in the dashboard.

Video 3. Monitoring student performance in the SAVE Science teacher dashboard.


For the research team, this dashboard serves both as an implementation management system and as a data management system. As primary project duties move from design to implementation and analysis, we have turned our attention to better understanding the data being generated, and the dashboard is our access window to these data. We are working closely with computer science experts to help interpret our data. Our first step has been to apply a variety of common data mining techniques. One finding that has emerged from initial analyses is that understanding how to solve the problems in the modules appears to be a separate knowledge base from being able to answer high-stakes multiple-choice questions on the same topic. Our analysis is quite preliminary, but this finding might indicate, as hypothesized in the proposal for our study, that contextualizing science problems allows students to demonstrate understanding in ways they cannot on a multiple-choice test. Gathering and analyzing such information should ultimately enable the creation of detailed assessment summary reports that help teachers better understand what their students know.
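One simple way to probe whether module performance and multiple-choice performance draw on separate knowledge bases is to examine how weakly the two sets of scores correlate. The Python sketch below shows that kind of check in outline; it is not the project’s actual analysis pipeline, and the variable names are assumptions.

# Outline of a simple check on how module scores relate to multiple-choice
# scores (not the project's actual analysis pipeline).
import numpy as np

def score_correlation(module_scores, multiple_choice_scores):
    """Pearson correlation between paired per-student module and multiple-choice scores."""
    return float(np.corrcoef(module_scores, multiple_choice_scores)[0, 1])

# A correlation near zero would be consistent with the two assessments
# tapping different knowledge and skills.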

Future Direction for SAVE Science

SAVE Science implementations are ongoing and will continue under the umbrella of the funded project through at least August 2013. We will continue to increase participation of schools, design more assessment modules, and create ways to share and visualize student data with the teachers.

For more information about the SAVE Science project, its modules, implementations, team, and related publications, or to get involved in the project, please visit SaveScience.net.

 


References and Notes

  1. Educating Teachers of Science, Mathematics, and Technology: New Practices for the New Millennium (National Research Council, Washington, D.C., 2001).
  2. America's Lab Report: Investigations in High School Science (National Research Council, Washington, D.C., 2005).
  3. A. Harlow, A. Jones, Why students answer TIMSS science test items the way they do. Research in Science Education 34, 221-238 (2004).
