This paper first briefly explores the current landscape of assessment and assurance management systems, before providing a case study of Review, a direct-marking and assessment management system in use at the University of New South Wales (UNSW), Sydney, Australia. The current state of theory and practice in systems facilitating assessment for learning is discussed, before features of future-oriented assessment management systems are proposed. The underpinnings and affordances of Review in our experience, for students, staff and administrators, are then described. Review, designed by academics for academics, is used in several Australian universities and in over a hundred courses each semester at UNSW. The system is used both for direct criteria-based marking and as a marks repository for course and Program (Degree) assurance. Review connects course-based learning events, judgment against criteria and feedback with degree and program learning goals (PLGs) across the length of student degree programs. Native affordances and limitations of the system are described. The focus of this paper is on how the system supports a holistic approach to assessment, described as a 'virtuous cycle' of activity for markers, students and administrators. This virtuous cycle supports and improves student learning, staff assessment, and Program (Degree) assurance and reporting. This is exemplified with detail of the system's learning/assessment data structures, the intuitiveness of its interfaces and workflow designs, marking efficiencies, and the personalised data recoverable by users according to their roles. Few university-wide assessment systems carry granular marks data that tracks meaningful and mappable student achievement, an essential attribute of a future-oriented assessment system. This paper advances Review as a successful example of a 'future assessment' system: agile, user-centric and holistically designed to accord with institutional values (the promotion of learning).
In Review, activity (framing assessments), achievement (judgment outcomes of assessment) and feedback are explicitly, efficiently and ubiquitously mapped to learning progress against degree/program learning goals, benefitting all parties. This national award-winning software supports direct, criteria-based marking and self-assessment by students, and improves clarity around assessment for staff and students. The software provides students with a more personalised learning experience via degree-long access to course feedback and the ability to run task-level or longitudinal self-reports at the criterion, task, course, year and Program (degree) level. Essentially, this establishes the basis of an 'inclusive' assessment feedback system, where learner needs are given equal weight to institutional needs. Staff report decreased marking times, easier administration and improved feedback quality. This is facilitated by a team-based Comment Library feature that accelerates the creation, collective improvement and re-use of feedback, one aspect of the 'virtuous cycle'. A range of visual 'home' screens that track marking activity, with high-level heat maps, visually signify the masses of data that lie beneath, easing administration, supporting in-built quality processes and enhancing staff experience of managing assessment. 'Review improves marking efficiency and helps me moderate and benchmark my tutors' marking more easily ... criteria-based feedback has reduced my post-assessment correspondence with students.' Finally, the in-built querying interface enables course and program assurance reporting via 'bottom-to-top' data mapping. Marking criteria data are mapped to Program and graduate attributes (such as Critical Thinking, Oral Communication, Digital Literacy, etc.), and this provides the basis for task data to be meaningfully represented at all levels, including course, School, Program or university graduate attribute reports.
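To make the bottom-to-top roll-up concrete, the following is a minimal illustrative sketch, not Review's actual schema or code: it assumes per-criterion marks are recorded per student, task and course, and that a separate map links each marking criterion to a graduate attribute. All names and data here are hypothetical.

```python
# Hypothetical sketch of bottom-to-top data mapping: per-criterion marks
# roll up to graduate-attribute averages via a criterion -> attribute map.
# None of these names or values come from the Review system itself.
from collections import defaultdict

# Each mark: (student, course, task, criterion, normalised score 0-1)
marks = [
    ("s1", "ENG101", "Essay 1", "Argument structure", 0.8),
    ("s1", "ENG101", "Essay 1", "Citation accuracy", 0.6),
    ("s1", "ENG102", "Report", "Data interpretation", 0.9),
]

# Assumed mapping of marking criteria to graduate attributes
criterion_to_attribute = {
    "Argument structure": "Critical Thinking",
    "Citation accuracy": "Digital Literacy",
    "Data interpretation": "Critical Thinking",
}

def attribute_report(marks, mapping):
    """Average criterion-level scores per graduate attribute."""
    buckets = defaultdict(list)
    for _student, _course, _task, criterion, score in marks:
        attribute = mapping.get(criterion)
        if attribute is not None:
            buckets[attribute].append(score)
    return {attr: sum(s) / len(s) for attr, s in buckets.items()}

print(attribute_report(marks, criterion_to_attribute))
```

The same aggregation, filtered by course, School or Program, would yield the corresponding report levels described above; the key design point is that every report is derived from the same granular criterion-level records rather than from separately entered summary grades.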
The reporting on students' actual achievement from the very granular base of marking criteria