dc.description.abstract |
Evaluation of the teaching method is of great importance in improving course quality. This evaluation is harder in courses that focus on the process of program development, since it requires observing the students' approach to problem solving. HtDP offers a "design recipe" that focuses on the process of program development. While a number of studies have examined the quality of this approach, there has not been any quantitative analysis. In this study, I first introduce a model and an implementation of a tool (Screen-Replay) that enables recording, replaying, and annotating programming sessions. The tool is implemented for the DrScheme environment using the Scheme programming language. It records and replays a programming session exactly as it occurred. Furthermore, while replaying, an observer may annotate the programming session by associating HtDP design recipe steps with specific time intervals. The resulting annotations form a sequence of design activity descriptions that characterizes the development process. To assess these sequences, a process scoring algorithm is proposed. Finally, the process scores and exam grades from a set of 61 development sessions are examined to gain insight into the impact of following the design recipe on exam grades. Screen-Replay was effective for observing how students develop their programs. In contrast to personal observation, this approach provided consistent and objective observation of students' development processes. |
|