Cognitive Complexity in System Dynamics Based Learning Environments

Pål I. Davidsen and J. Michael Spector

TEL +47 55 58 4134 / FAX +47 55 58 4107 / EMAIL mike@ifi.uib.no

University of Bergen, N 5020 Bergen, NORWAY

INTRODUCTION

The use of system dynamics to provide the foundation for meaningful learning environments has long been held out as one of its strengths (Forrester, 1992; Sterman, 1988). Indeed, many such learning environments have been implemented, generally described as management flight simulators (Sterman, 1988). However, there is inadequate evidence of their efficacy in promoting learning about complex, dynamic systems. Moreover, there is no well-established methodology to guide the disciplined design and implementation of these promising learning environments. Broadly stated, the missing links have been a firm foundation in cognitive learning theory and a tight coupling of that theoretical perspective with the disciplined practice of instructional design. We shall suggest how these missing links might be incorporated into system dynamics based learning environments.

Many excellent examples of system dynamics as the basis for teaching in complex domains do exist (Mellar, et al., 1994; Morecroft & Sterman, 1994). Our concern is the lack of consistent success with regard to learning effectiveness in system dynamics based learning environments. Inadequate attention has been paid to the establishment of reliable measures of learning effectiveness for such environments. We regard the identification of the relevant learning theory, an elaboration of its relevance to learning about complex and dynamic systems, and the specification of relevant instructional design principles to be the missing links in developing a complete scientific basis for system dynamics based learning environments.

Specifically, we proceed from the following critical principle: an interactive simulation is not the same thing as a learning environment. Many have confused, and continue to confuse, a simulator with a learning environment. This confusion also exists outside the system dynamics community. For example, some of those responsible for planning and managing flight training in the United States Air Force believe that building a physical flight simulator constitutes the creation of a learning environment. It is our view that a simulator may be part of a learning environment, but for learning to occur efficiently, there must be more than simple user interaction with a simulation. What more? Frequent and constructive feedback from a tutor, especially in the early stages of learning, is critical. Explicitly stated learning goals and mechanisms to facilitate progress towards those goals (e.g., linkage to things already known, assessment of progress, helpful guides to improve performance) are also critical to learning (Gagné, 1985).

ANALYSIS OF THE PROBLEM DOMAIN

Our analysis and results stem from a project to plan and implement an interactive learning environment for instructional project management. Analysis of the domain indicated that it was appropriate for a system dynamics based learning environment: instructional project management is a dynamic and complex system filled with uncertainties and feedback, not unlike, but somewhat more complex than, the domain of software project management. A pilot effort to build a model of the early phases of instructional project management was successful (Grimstad Group, 1995; Spector, 1994).
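
The pilot model itself is not reproduced here. As a purely illustrative sketch, loosely in the spirit of the software project models of Abdel-Hamid and Madnick (1991), the following Python fragment shows the kind of stock-and-flow structure, with a rework feedback loop, that makes such domains dynamically complex; all stock names and parameter values are hypothetical:

    # Illustrative only: a minimal stock-and-flow project model with a
    # rework loop, simulated by Euler integration. Stocks, flows, and
    # parameter values are hypothetical, not taken from the pilot model.

    DT = 0.25              # integration time step (weeks)
    WEEKS = 60             # simulated horizon
    productivity = 5.0     # tasks completed per week
    error_rate = 0.2       # fraction of completed tasks containing errors
    discovery_delay = 8.0  # average weeks before an error is discovered

    tasks_remaining = 200.0    # backlog of tasks still to be done
    undiscovered_rework = 0.0  # flawed tasks not yet detected
    tasks_correct = 0.0        # tasks done correctly

    for step in range(int(WEEKS / DT)):
        completion = min(productivity, tasks_remaining / DT)
        discovery = undiscovered_rework / discovery_delay  # first-order delay
        tasks_remaining += (discovery - completion) * DT   # rework feeds back
        undiscovered_rework += (completion * error_rate - discovery) * DT
        tasks_correct += completion * (1.0 - error_rate) * DT

    print(f"after {WEEKS} weeks: {tasks_correct:.0f} correct, "
          f"{tasks_remaining:.0f} in backlog, "
          f"{undiscovered_rework:.0f} awaiting rework discovery")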

System dynamics based learning environments, however, have typically been implemented without the disciplined analysis of the subject domain and target learners that instructional design practice prescribes.

Analysis of the subject domain and target learners is part of a well established instructional systems development practice (Tennyson, 1994). That practice occurs in the early phases of planning instruction and is typically called the analysis phase. It involves an analysis of what the targeted learners are expected to do upon completion of the learning. This can be established by a needs assessment, which might involve both a job analysis and a task analysis. In addition, an analysis of what the learners might already know, be able to do, or understand is critical in planning a learning environment. Finally, an analysis of the subject domain (key concepts, principles, procedures, relationships, etc.) is needed. Most instructional systems developers include a strong cognitive orientation in these analyses, especially when the domain is complex and the learning goals involve 'deep' understanding of complexities in a domain, as opposed to simpler kinds of learning of facts, concepts, and simple procedures.

We agree with Dörner (1996) and others who believe that it is important for learners to acquire an understanding of the structure of complex systems if the learning goal is to understand the behavior of such systems. In short, if learners are to be expected to reliably predict and make informed policy decisions, then the evidence gathered by Dörner (1996) and others indicates that learners must acquire some reasonably precise notion of relationships among key system variables and develop an understanding of the most influential delays and feedback mechanisms in the system. Missing from this psychological analysis is the issue of how the acquisition of this knowledge can be facilitated by a well-structured instructional design policy for constructing learning environments (Davidsen, 1996).
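
To illustrate why such structural knowledge matters, consider the classic combination of a balancing feedback loop and a perception delay. The short Python sketch below (all variable names and constants are our own illustrative choices, not drawn from any study cited here) shows how a delay in perceiving the state of a stock turns smooth goal-seeking adjustment into overshoot and oscillation, a behavior that learners without structural insight routinely misattribute:

    # A balancing loop with a perception delay: the decision rule closes
    # the gap between a target and the *perceived* level, but perception
    # lags the true level, so the system overshoots and oscillates.

    DT = 0.1                # integration time step
    target = 100.0          # goal for the stock
    adjustment_time = 2.0   # how quickly the perceived gap is closed
    perception_delay = 4.0  # smoothing time for the perceived level

    level = 20.0
    perceived = 20.0

    for step in range(int(40 / DT)):
        inflow = (target - perceived) / adjustment_time  # decision rule
        perceived += (level - perceived) / perception_delay * DT
        level += inflow * DT
        if step % int(5 / DT) == 0:  # report every 5 time units
            print(f"t={step * DT:5.1f}  level={level:7.2f}  perceived={perceived:7.2f}")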

We investigated successful uses of system dynamics based learning environments (Sterman, 1988) in domains of similar complexity, such as software project management (Abdel-Hamid & Madnick, 1991). We found that many of the more successful simulators depended on an informed and insightful instructor to perform preliminary preparation and basic instruction prior to use of the simulator. In addition, learning effects, especially those involving transfer of learning, appeared highly dependent on follow-on discussion and exercises, once again a factor external to a computer-based system dynamics learning environment. Most significant of all, we found a general lack of serious measures of learning effectiveness.

METHODOLOGICAL APPROACH

A relevant theory to facilitate learning in complex domains is cognitive apprenticeship (Collins, 1992). Cognitive apprenticeship involves the modeling of the knowledge and skills to be learned by a highly trained facilitator/instructor, supportive coaching of new learners, and the gradual fading of learning supports as learners progress towards expertise. An instructional design theory which is nicely compatible with such a learning theory is Reigeluth's elaboration theory (Reigeluth, 1983). According to elaboration theory, instruction will be more effective if it makes use of epitomes (simplifying examples, which could be system dynamics models in complex domains). These epitomes should include motivational elements (e.g., providing the knowledge to solve a challenging problem), use of analogy (e.g., linking a causal loop diagram or stock and flow diagram to a frequently encountered situation), and the incorporation of frequent summaries and syntheses into the curriculum. Much of that which contributes to learning exists outside an interactive simulation, in the form of interactions with peers, facilitators, competing groups, and teachers. These human-human interactions are especially neglected in the design of complex computer-based learning environments, yet they are critical to optimal learning outcomes.

We address these issues in the following ways. First, we use simple computer-based tutors not based in system dynamics for preliminary and refresher instruction (e.g., facts, concepts, relevant schemata). Doing this eliminates the cognitive complexity associated with a system dynamics based simulator and complies with sound instructional principles (i.e., use advance organizers to look ahead to more complex topics, use graduated complexity for teaching complex material, avoid overloading the working memory of learners). Second, we use learning effectiveness measures well established in the field of cognitive science. Specifically, we use concept mapping and mental modeling techniques to establish reasonably robust models of expert thinking in a complex domain. We then use the same techniques as pre- and post-test measures to determine the effectiveness of the simulator. We proceed on the assumption that learning is best thought of as the acquisition of expertise.
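
The scoring instruments themselves are beyond the scope of this paper. As a deliberately simplified sketch of the general idea (the representation and scoring rule below are hypothetical, not our actual instruments), a learner's concept map can be treated as a set of labelled links and scored against an expert referent map before and after instruction:

    # Hypothetical scoring of a learner's concept map against an expert
    # referent map. Maps are sets of (concept, relation, concept) links;
    # the score is the fraction of expert links the learner reproduced.

    def score(learner_map, expert_map):
        return len(learner_map & expert_map) / len(expert_map)

    expert = {
        ("schedule pressure", "increases", "error rate"),
        ("error rate", "increases", "rework"),
        ("rework", "increases", "schedule pressure"),  # a reinforcing loop
    }
    pretest = {("schedule pressure", "increases", "error rate")}
    posttest = expert | {("staffing", "reduces", "schedule pressure")}

    print(f"pre-test score:  {score(pretest, expert):.2f}")   # 0.33
    print(f"post-test score: {score(posttest, expert):.2f}")  # 1.00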

We prefer to have learners work in small groups, since we recognize the efficacy of collaborative learning in complex domains. When learners are introduced to our management flight simulator, they are first presented with a very simple browsing interface to a schema of the underlying model. This is used to familiarize learners with key model components. Next, learners are offered the opportunity to manipulate a single variable and observe the effects. Then, learners are offered a simple problem which requires a decision involving several variables. A synthesis of how the decision is linked to the observed behavior is then provided. We conclude with an exercise which demands the development of a decision policy, complete with a rationale and justification that makes use of the simulator experience. This exercise is presented, discussed, and critiqued with peers in a collaborative setting.
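
This sequence can be summarized as an ordered progression of stages, each gated on completion of the previous one. The Python sketch below encodes that progression; the stage names simply paraphrase the description above and are not taken from an actual implementation:

    # The graduated-complexity sequence as an ordered list of stages.

    STAGES = [
        ("browse",    "explore a schema of the underlying model"),
        ("one-var",   "manipulate a single variable and observe the effects"),
        ("decision",  "solve a simple problem involving several variables"),
        ("synthesis", "review how the decision produced the observed behavior"),
        ("policy",    "develop, justify, and present a decision policy to peers"),
    ]

    def next_stage(completed):
        # Return the first stage not yet completed, enforcing the order.
        for name, description in STAGES:
            if name not in completed:
                return name, description
        return None  # the sequence is finished

    print(next_stage({"browse"}))  # -> ('one-var', 'manipulate a single variable ...')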

It should be obvious that we take seriously the notion of carefully structuring the learning experience, designing according to the principle of graduated complexity, and properly preparing learners for more complex learning situations. We also consider it vital to train the facilitators using the same learning theory and instructional design principles. Someone with domain knowledge is not necessarily an effective facilitator.

CONCLUSIONS

It is too early to assess the overall effectiveness of our particular management flight simulator for instructional project management. We believe that early identification and elaboration of a learning theory which then informs a disciplined approach to the design of instruction are vital to the success of complex learning environments.

REFERENCES

Abdel-Hamid, T. K. & Madnick, S. E. (1991). Software Project Dynamics: An Integrated Approach. Englewood Cliffs, N.J.: Prentice Hall.

Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O'Shea (Eds.), New directions in educational technology. Berlin: Springer-Verlag.

Davidsen, P. I. (1996). Educational features of the system dynamics approach to modelling and learning. Journal of Structural Learning, 12(4), 269-290.

Dörner, D. (1996). The logic of failure: Why things go wrong and what we can do to make them right (R. Kimber & R. Kimber, Trans.). New York: Metropolitan Books. (Original work published in 1989)

Forrester, J. W. (1992). Policies, decisions, and information sources for modeling. European Journal of Operational Research, 59(1), 42-63.

Gagné, R. M. (1985). The conditions of learning (4th Ed.). New York: Holt, Rinehart, and Winston.

Grimstad Group (1995). Applying system dynamics to courseware development. Computers in Human Behavior, 11(2), 325-339.

Mellar, H., Bliss, J., Boohan, R., Ogborn, J., & Tompsett, C. (Eds.) (1994). Learning with artificial worlds: Computer based modelling in the curriculum. London: Falmer Press.

Morecroft, J. D. W. & Sterman, J. D. (Eds.) (1994). Modeling for learning organizations. Portland: Productivity Press.

Reigeluth, C. M. (Ed.) (1983). Instructional-design theories and models: An overview of their current status. Hillsdale, NJ: Erlbaum.

Spector, J. M. (1994). Integrating instructional science, learning theory and technology. In R. D. Tennyson (Ed.), Automating instructional design, development, and delivery. Berlin: Springer-Verlag.

Sterman, J. D. (1988). People express management flight simulator. Cambridge, MA: Sloan School of Management.

Tennyson, R. D. (1994). Knowledge base for automated instructional system development. In R. D. Tennyson (Ed.), Automating instructional design, development, and delivery. Berlin: Springer-Verlag.
