Abstract
Assessment of educational outcomes through purchased tests is commonplace in the evaluation of individual student ability and of educational programs. Focusing on the assessment of writing performance in a longitudinal study of first-time, full-time students (n = 598), this research describes the design, use, and assessment of an open-source scoring platform. Augmenting usability testing, the research design relies on a framework of inter-reader agreement, inter-reader reliability, and coefficients of determination. The open-source, web-based portfolio assessment system yielded rates of agreement, reliability, and determination superior to the traditional paper-based portfolio assessment method. In addition, the system appears to be ideally suited to assessing ePortfolios created to showcase student ability in digital environments: agreement range = 70% to 85%; reliability range = κ = .67 (p < .01) to κ = .85 (p < .01); coefficient of determination = R² = .95, F(5, 34) = 118.59 (p < .01). This novel application of an open-source platform for outcomes assessment yields the foundation for a sound validity argument, the control of human error, and complete system transparency and flexibility. Future research directions point to the need for the design and assessment of an open-source system capable of capturing complex constructs as they emerge in digital environments.
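The three statistics named above can be illustrated with a short sketch. The following Python example (not the authors' platform code) uses invented reader scores and hypothetical trait data to show how percent agreement, Cohen's kappa, and a coefficient of determination of the kind reported in the abstract might be computed.

```python
# Illustrative sketch only: hypothetical data, not the study's scoring platform.
import numpy as np
from sklearn.metrics import cohen_kappa_score, r2_score
from sklearn.linear_model import LinearRegression

# Hypothetical holistic scores (1-6 scale) from two independent readers
# evaluating the same set of portfolios.
reader_a = np.array([4, 3, 5, 4, 2, 6, 3, 4, 5, 4])
reader_b = np.array([4, 3, 4, 4, 2, 6, 3, 5, 5, 4])

# Inter-reader agreement: proportion of portfolios given identical scores.
agreement = np.mean(reader_a == reader_b)

# Inter-reader reliability: Cohen's kappa, i.e., chance-corrected agreement.
kappa = cohen_kappa_score(reader_a, reader_b)
print(f"agreement = {agreement:.2f}, kappa = {kappa:.2f}")

# Coefficient of determination: variance in a holistic score explained by
# five trait scores (toy regression on invented data; 40 portfolios and 5
# predictors mirror the F(5, 34) degrees of freedom, but the values are made up).
rng = np.random.default_rng(0)
traits = rng.integers(1, 7, size=(40, 5)).astype(float)
holistic = traits.mean(axis=1) + rng.normal(0, 0.3, size=40)
model = LinearRegression().fit(traits, holistic)
r_squared = r2_score(holistic, model.predict(traits))
print(f"R^2 = {r_squared:.2f}")
```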
| Original language | English (US) |
|---|---|
| Pages (from-to) | 5-32 |
| Number of pages | 28 |
| Journal | Journal of Interactive Learning Research |
| Volume | 24 |
| Issue number | 1 |
| State | Published - 2013 |
All Science Journal Classification (ASJC) codes
- Education
- Human-Computer Interaction
- Computer Science Applications