Abstract for: Automated Assessment of Learners' Understanding in Complex Dynamic Systems
Research on learning via system-dynamics-based learning environments depends on sound measurement of learning. Most such research considers at least two aspects of learning: participants' understanding of the models and problems, and participants' performance in the environment, e.g., the quality of their decision making. The former, understanding, is much more difficult to measure than the latter, performance. Understanding is often measured by eliciting verbal protocols from participants about the problem situation (i.e., the underlying model) and their planned solution strategy (i.e., decisions). Coding and analysis of participants' verbal protocols is highly subjective and time-consuming. To facilitate the measurement and analysis of understanding via verbal protocols, we investigate the utility of a software application that performs such analysis automatically. We assess this automated analysis methodology using data from two different system-dynamics-based learning environments, examining how participants' understanding compares to that of experts, how it changes over time, and how it correlates with performance.