![]() |
|
Matthew Realff
modeling, simulation, interface design, learners.
Creating educational software forces a difficult tradeoff. The software must be easy for students to use, yet not so simple that the parts students are meant to learn from are done for them by the computer. The challenge is deciding which parts of the problem are not necessary for students to learn and should be made easy, and which parts are critical enough that students should be allowed to struggle with them to gain mastery of the material.
Computer simulation is an area where this tradeoff is particularly acute. Simulation provides students realistic experience, even in domains where realistic activities are too complex to be performed by novices, too expensive to be offered in an undergraduate lab, or too dangerous to allow students to make mistakes. Our hypothesis is that merely watching a simulation is not enough to trigger learning; the student must have some hand in creating the model that drives the simulation. However, learning to build models in a fully functional simulation environment is equivalent to learning an entire programming language, and it is unrealistic to expect students to do this in addition to learning the domain knowledge.
The problem, then, is to create a simulation environment that lets students construct and execute complex simulations, helps them connect their theoretical knowledge to the real world, and does not force them to learn formalisms that will be of no later use. A key component of this environment is support for modeling: creating a conceptual representation of reality.
The objective of our project is to create a computer environment, called DEVICE (Dynamic Environment for Visualization in Chemical Engineering), which will facilitate student learning through the construction of equation-based models and the evaluation of those models executing as simulations. Modeling is a skill that engineers use frequently in practice, but it is rarely explicitly taught [1]. Existing modeling packages do not meet the specific needs of engineering students. The use of equation-based models in our system allows engineering students to work with a formalism they are already familiar with.
This paper will not discuss the engineering-education-specific aspects of DEVICE (for more information, see [6] and [10]). Instead, it will focus on how the interface can help, hinder, or help the student too much during a learning session.
The primary users of DEVICE are undergraduate engineering students. For the purposes of our initial studies, we have used sophomore and junior students in the Chemical Engineering department at the Georgia Institute of Technology. These students are still towards the beginning of the Chemical Engineering course sequence. They are expected, however, to have already taken courses in physics, calculus, and fluid dynamics.
The students involved are, based on their average SAT scores, considered to be somewhat above the national average for engineering students. Not surprisingly, they tend to have much stronger math skills than language skills. They are expected to have used a Macintosh computer, and to have had at least an introduction to the chemical engineering concepts being presented. Our analysis of student problem solving supports the perception that the students have difficulty with adapting theory to fit the real world.
Our initial problem domain is chemical engineering pumping systems. This domain was chosen because it is encountered early in the course sequence, and the equations that describe the system are simple compared to those for other pieces of equipment. The primary equation in this domain is the Mechanical Energy Balance (shown in Figure 5). This equation balances the energy entering the system, split into kinetic, potential, and frictional components, against the energy output by the system: the shaft work performed by the pump. To use this equation correctly, the student must define a consistent system, which can be thought of as a box drawn around the equipment such that input and output values can be defined.
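A conventional per-unit-mass textbook form of the mechanical energy balance is the following; this is our rendering, and the exact symbols and sign conventions in Figure 5 may differ:

```latex
\hat{W}_s \;=\;
  \underbrace{\frac{\Delta P}{\rho}}_{\text{pressure}}
+ \underbrace{\frac{\Delta v^{2}}{2}}_{\text{kinetic}}
+ \underbrace{g\,\Delta z}_{\text{potential}}
+ \underbrace{\textstyle\sum F}_{\text{friction}}
```

Here $\hat{W}_s$ is the shaft work per unit mass delivered by the pump, and each $\Delta$ denotes the change between the two boundary points the student has chosen for the system.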
There are two basic problems in this domain. The first is determining the flowrate of a system that contains a given pump. The second is to choose an appropriate pump for a system that will result in a desired flowrate. Both problems use concepts of energy balance that the student will use frequently throughout his or her career.
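To make the first problem type concrete, the following is a minimal sketch of solving the balance for flowrate in the simplest configuration (both tanks open to the atmosphere, so the pressure term vanishes, and friction neglected). The function name and numbers are illustrative, not DEVICE's actual code:

```python
import math

def flowrate_from_pump(work_per_kg, delta_z, pipe_diameter, g=9.81):
    """Solve a simplified mechanical energy balance for volumetric flowrate.

    Assumes open tanks (no pressure term) and no friction, so the balance
    reduces to: W_s = g * delta_z + v**2 / 2   (all terms per unit mass).
    """
    kinetic = work_per_kg - g * delta_z  # energy left over for motion, J/kg
    if kinetic <= 0:
        raise ValueError("pump work is insufficient to lift the fluid")
    velocity = math.sqrt(2 * kinetic)                 # m/s
    area = math.pi * pipe_diameter ** 2 / 4           # pipe cross-section, m^2
    return velocity * area                            # flowrate, m^3/s
```

The second problem type inverts this calculation: given a desired flowrate, the student works backwards to the shaft work, and from that to a pump.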
In order to measure student needs, we analyzed student performance on modeling problems solved on paper, without computer assistance. We gave students in two separate courses test problems in the pumping domain. Both classes had learned about the behavior of such systems, at differing levels of detail, and the problem was tailored to the expected student knowledge in each course.
Examination of student performance revealed that fewer than half of the students were able to apply the correct set of equations to the problem, while at least two-thirds made some mistake in their problem solving that indicated a misunderstanding of the connection between the real-world problem and the equations. Student mistakes tended to concentrate in three areas. First, students had difficulty defining a mathematical model of the system with consistent boundaries; they frequently made assumptions in the mathematical model that reflected contradictory choices of the boundary points to which the mechanical energy balance was applied. Second, students had trouble at places in the problem where it was necessary to convert between units, particularly dynamic units like flowrate and velocity. Finally, students were unfamiliar with the empirical guides to system performance used in actual practice, and with the relationship between these guides and the mathematical model.
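Conversions of this second kind are mechanical but error-prone. For instance, a flowrate quoted in U.S. gallons per minute must be converted to a mean pipe velocity before it can enter the kinetic term of the balance; a sketch (the function name and units are illustrative):

```python
import math

GAL_PER_M3 = 264.172  # U.S. gallons per cubic metre

def velocity_from_flowrate(q_gpm, pipe_diameter_m):
    """Convert a flowrate in gal/min to a mean pipe velocity in m/s."""
    q_m3s = q_gpm / GAL_PER_M3 / 60.0          # gal/min -> m^3/s
    area = math.pi * pipe_diameter_m ** 2 / 4   # pipe cross-section, m^2
    return q_m3s / area
```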
The results of this evaluation suggested to us that students needed a system that would encourage them to examine their assumptions as they worked through problems in their domain.
Previous attempts to support modeling in computer environments do not meet the needs of engineering students. Some environments, such as STELLA [9], use notational constructs that are difficult for novice users to understand and are not connected to the theoretical understanding that students are building [11] [12]. Engineering students require environments with quantitative precision, unlike Model-It [8] [7], which uses a graphical environment that allows students to model an ecosystem. More generalized simulation or programming environments are too complex to expect students to learn along with domain content, and have no direct tie to that content. Examples of such systems that do support simulation and modeling include Emile [5], an environment that allows students to build procedural simulations of basic physics, and Boxer [3] [4], a programming environment designed to allow novice programmers to create complex programs and simulations.

Figure 1 shows the above projects along a continuum representing the amount of interaction the student has with the simulation's underlying model. At one end, Maxis' SimCity [13] does not allow users to view or change the underlying model. At the other end, Boxer is a domain-independent system for creating models. Our experience suggests that fixed simulations, while easy to use, do not contribute greatly to learning. Full model-building environments, on the other hand, can help students learn, but require students to expend a great deal of energy learning to program. With DEVICE, we are exploring this tension between ease of use and learning, with the hope of focusing difficulty on the pieces that are most important for the student to learn.
DEVICE also differs from the other programs by providing support for the important task of testing the model against the real world. Novice engineering students need an environment that is easy for them to understand, yet has the numerical accuracy they need. They also need to be able to compare their model to the actual performance of the system.
The original version of DEVICE allowed students to calculate the flowrate for a given system with a pump already included. This discussion of the interface will focus on where, within the problem, students had the opportunity for interactivity.
In DEVICE 1.0, the student had the most flexibility in creating the physical layout of the system. The student created a physical representation of the system, laying out the tanks, pump and pipe on a tile grid not unlike SimCity. Dialog boxes associated with each object allowed the student to adjust the size and other characteristics of that object. Students could also control system level parameters such as the type of fluid being pumped. Figure 2 shows a student in the process of laying out the system.
This version of DEVICE only allowed students to see the equation model underlying the system; it did not allow them to change that model in any way. In particular, the student could not change which attribute of the layout contributed the value to a particular piece of the equation. The student could, however, get a textual description of where the value came from.
Figure 3 shows the Equation Notebook that the student saw in DEVICE 1.0. The upper portion of the window is the equation itself; each term in the equation is labeled, and its value is displayed. The bottom half of the window shows the values of the unknown pieces of the equation that the program has already calculated. Having stored these values, the program then uses them to calculate the final flowrate.

A number of system parameters within DEVICE 1.0 were not visible to the student. Most importantly, the equation that calculated the final flowrate from the pump in use, while technically visible to students, was in an awkward location and was never discovered by any of the test students.
DEVICE 1.0 was evaluated in a laboratory pilot study with three students who had already taken the undergraduate course that studies pump systems. The students were given a problem which required them to step through the design process of a pump system. Their goal was to achieve a desired flowrate. The task had the students perform the steps needed to calculate the flowrate in DEVICE, and then modify system parameters to change the flowrate. A similar problem had been used in a final exam in a previous quarter, and only four of 21 students were able to complete it successfully. Students were encouraged to think aloud while solving the problem with DEVICE. Each session was videotaped. The students' problem statement included high-level directions for using DEVICE. The program designer attended the sessions to function as a help system and field student questions about the system.
The students' ability to solve the problem using DEVICE was encouraging:
Our review of the pilot study videotapes focused on issues of usability and on meeting learners' needs. Despite a number of minor problems, the DEVICE interface generally worked well: every student was able to build a simulation and solve the problem. The tile-placing layout, however, was confusing; it was not clear to the students exactly how to begin placing their layout. All three students had difficulty tweaking parameters, and the object parameters, accessible by double-clicking on tiles, were particularly difficult for students to find. These usability lapses did not keep the students from completing the task in the time allotted, but they did ask questions of the designer frequently.
There is evidence that students learned some information about modeling pumping systems during the experience of using DEVICE. Quotes from the videotapes suggest that they were developing new understandings while working with the system:
We were concerned that the students may not have understood the workings of the simulation very well. For example, one student, asked after the test whether the simulation helped him understand the system equation, responded, "What equation?". Since this student did recognize the mechanical energy balance while using the system, it seems the student had come to see the equation only as an output mechanism, and not as an abstraction.
We concluded that DEVICE 1.0 did not meet the students' needs in two primary ways:
Our analysis of the pilot test data led us to decide that the 1.0 version of DEVICE would not be a useful learning tool for students, although it might be a useful calculating tool for practicing engineers.
Our goals in the redesign of DEVICE 2.0 were:
One goal in DEVICE 2.0 was to give the student better access to the parameters of the system. Figure 4 shows the current DEVICE 2.0 interface for entering physical parameters. The student can enter parameters either on the numerical palette or by dragging the components in the layout window. The interface allows consistent access to all the applicable parameters of the system. It has the secondary effect of allowing the layout parameters to be set more rapidly than before, without the tile-laying that gave the students trouble. Typically, most of the parameters will not change between runs of the problem and can be provided to the student, allowing even more rapid entry.

The biggest change in DEVICE 2.0 is that the equation model has been made malleable by the student. We hope that this is where the largest part of the student's interaction will take place. Ideally, in this step, the student is given pieces of abstract equations described verbally (such as Change In Pressure) and must combine them to create a system of equations that models the system being designed. Those equations must then be explicitly linked to the parameters of the physical layout that supply the values of the equation, in contrast to the first version, where the linkage was hard-wired into the program.
Figure 5 shows the current interface for building an equation model. As a simplifying step, the student is provided entire equations, but still must link the variables to their source attributes. Clicking on a variable in the equation opens a source window, shown in Figure 6. In this window, the student must state the type of attribute desired, and then select the actual attribute from the menu. The system can, at the student's option, prevent the student from placing an attribute of an inappropriate type (a height attribute in a pressure variable, for example). A future version of the program will allow students to create their own equations from component parts.
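The linking step described above amounts to binding each equation variable to a typed attribute of the layout, with optional type checking. A hypothetical sketch of that mechanism follows; the class and method names are illustrative, not DEVICE's actual code:

```python
class Attribute:
    """A named, typed value drawn from the physical layout."""
    def __init__(self, name, kind, value):
        self.name, self.kind, self.value = name, kind, value

class Variable:
    """An equation variable that must be linked to a source attribute."""
    def __init__(self, name, kind):
        self.name, self.kind = name, kind
        self.source = None

    def link(self, attribute, type_checking=True):
        # At the student's option, reject attributes of the wrong type,
        # e.g. a height attribute dropped onto a pressure variable.
        if type_checking and attribute.kind != self.kind:
            raise TypeError(
                f"{attribute.name} is a {attribute.kind} attribute; "
                f"{self.name} expects {self.kind}")
        self.source = attribute

    def value(self):
        return self.source.value
```

Leaving type checking switchable mirrors the design choice in the text: the system can catch a mismatched link for the student, or let the student discover the mistake when the model is tested.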


After completing the equation model, the student must test the model. This corresponds to the real-world task of building the system to the student's parameters and testing its performance. Within DEVICE, the student's layout is sent to the remote modeler, which returns data about how the real pump would perform. The student can compare the data to the prediction given by their equation model. There are three possible outcomes. First, the model could be completely correct, in which case the student is done. Second, the model could be completely wrong, in which case the student needs to return to the model-building step and correct the model; the system needs to provide support for fixing the model in this case. Third, the model could be slightly off due to factors not considered in the student's model. In that case the student can choose to create a more accurate model, or stick with the current model with an understanding of how to adjust for the factors that could have caused the discrepancy (friction, for example).
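The three outcomes above can be sketched as a simple comparison between the student's prediction and the simulated data; the function name and tolerances here are illustrative assumptions, not DEVICE's actual thresholds:

```python
def classify_model(predicted, simulated, exact_tol=1e-6, close_tol=0.05):
    """Compare a predicted flowrate to the remote modeler's 'real' data."""
    error = abs(predicted - simulated) / abs(simulated)
    if error < exact_tol:
        return "correct"       # model matches: the student is done
    if error < close_tol:
        return "slightly off"  # unmodeled factors, e.g. friction
    return "wrong"             # return to the model-building step
```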
This is potentially a more complicated process than just tweaking parameters. The students will need support for the process, as well as support in understanding interactions of various parts of the model. However, before we started to design these types of scaffolding, we decided to observe students using these features, and work from their needs. During the DEVICE 2.0 development cycle, we did a very brief round of user testing. Findings from that testing suggest:
During Fall Quarter 1996, we performed a full scale user test integrated into the curriculum of one of the Chemical Engineering courses that we had been targeting.
The students were introduced to the program during a one-hour lab session that took place during normal class time. In this session, they were to walk through a self-directed tutorial showing how to use DEVICE to solve pump problems. Following the session, they were given two weeks to work on three homework problems on their own time. The three homework problems were more open ended than the tutorial. The three problems also were progressively less similar to the tutorial.
We gathered data from four sources:
In addition, there was some informal observation of students using the program in the lab setting.
Analysis of the data is continuing as of this writing, but here are some of our current findings:
The requirements of interfaces designed to support learning differ from those of interfaces designed to support performance. There is a necessary tradeoff between ease of use and facilitating learning by creating opportunities for the student to think. Specifically, there is a difference between an interface that merely allows a student to observe a simulation and one that allows the student to actively build and run a simulation model.
In testing DEVICE 1.0, we found that a simulation-only interface does not facilitate learning. The ease of use that makes the program a useful support tool can work against it as a learning tool, by negating the learner's need to think about what they are doing. Our needs analysis shows that the chemical engineering students we are working with need to improve their modeling skills. To do that, we need to provide them an environment that encourages them to use those skills without overwhelming them with new types of models.
Early results on DEVICE 2.0 suggest that it may be successful in focusing student attention on areas of the problem that students typically have the most difficulty with. However, more scaffolding and support must be provided to compensate for the more complex environment.
DEVICE is funded by NSF grant #RED-9550458 and by the EduTech Institute.
1. Abelson, H. Computation as a Framework for Engineering Education.
2. Barton, P.I. and C.C. Pantelides, The Modelling of Combined Discrete/Continuous Processes. AIChE Journal, 1994. 40: p. 966-979.
3. diSessa, A.A. and H. Abelson, Boxer: A reconstructible computational medium. Communications of the ACM, 1986. 29(9): p. 859-868.
4. diSessa, A.A., H. Abelson, and D. Ploger, An overview of Boxer. The Journal of Mathematical Behavior, 1991. 10(1): p. 3-15.
5. Guzdial, M., Software-realized scaffolding to facilitate programming for science learning. Interactive Learning Environments, 1995. 4(1): p. 1-44.
6. Guzdial, M., et al. Simulated Environments for Learning Real World Contexts in Chemical Engineering. in International Conference on the Learning Sciences. 1996. Evanston, IL.
7. Jackson, S., et al. The ScienceWare Modeler: A Case Study of Learner-Centered Software Design. in CHI. 1995.
8. Jackson, S.L., et al. Model-It: A case study of learner-centered software for supporting model building. in Proceedings of the Working Conference on Technology Applications in the Science Classroom. 1995. Columbus, OH: The National Center for Science Teaching and Learning.
9. Mandinach, E., Model-building and the use of computer simulations of dynamic systems. Journal of Educational Computing Research, 1989. 5(2): p. 221-243.
10. Rappin, N., et al. Teaching Chemical Engineering Modeling through Simulated Environments. in American Educational Research Association. 1997. Chicago, IL: (submitted).
11. Tinker, R.F., Teaching theory building. 1990, Technical Education Research Centers, Cambridge, MA.
12. Tinker, R.F. and S. Papert, Tools for science education, in 1988 AETS Yearbook: Information Technology and Science Education, J.D. Ellis, Editor. 1988, Association for the Education of Teachers in Science. p. 5-27.
13. Wright, W., SimCity. 1989, Maxis.