
Technology supporting assessment and feedback at Keele


Background and context

From 2010 to 2011 Keele University was supported by the Jisc Building Capacity programme to undertake project STAF: technology supporting assessment and feedback. The University does not specifically use the term EMA but there was a general feeling that they were not making effective use of technology to support assessment and feedback processes: ‘With some notable exceptions, the take-up of technology to support improvements in marking and giving feedback has been slow. Our VLE supports objective testing and assignment submission, and is integrated with Turnitin for originality reports, peer review and online marking. Barriers to change in assessment practices include staff workloads with large courses, work habits, concerns about losing personal contact with students, the needs of external examiners, interpretations of the institutional regulations and the variety of needs of different programmes and assessment types. New technologies including audio, video (screencasts), and voice recognition are not widely known.’

The aim was to develop the uses of existing technologies and to support academic staff in their uses, so as to improve assessment and feedback processes for the benefit of staff, students and the institution. The objectives were to develop a portfolio of assessment and feedback processes that took advantage of existing technology, and support their adoption by academic schools and individual academic staff. This included the wider use of existing facilities for online assessment and feedback (Blackboard and Grademark), and the introduction of novel assessment practices especially handwriting recognition, audio and video.

The overarching goal was to better support the assessment strategy and improve student satisfaction with assessment processes, while meeting a number of pressing needs that had been identified:

  1. Addressing the poor legibility of written feedback provided to students by some staff
  2. Improving the quality and usefulness of feedback to students on their work
  3. Increasing students’ awareness of the criteria on which their work is assessed
  4. Preventing plagiarism
  5. Increasing the efficiency of processes in the light of a planned worsening of the staff/student ratio
  6. Supporting the sustainability agenda by reducing paper usage, photocopying and printing across the institution.

EMA implementation

The work focused on the parts of the assessment life-cycle concerned with marking, producing feedback, recording grades, and returning marks and feedback to students. The project restricted itself to coursework rather than examinations and, recognising that a number of assessment types could not be handled electronically, concentrated on the processes used for the most common types of substantial text assignment.

The University did not adopt a one-size-fits-all approach; instead it undertook a wide-ranging review of business processes. Discussions around these processes revealed that some assessment regulations were widely misunderstood, and prompted a review of aspects of the regulations. Following the review, the University developed a portfolio of three recommended assessment and feedback processes based on sound educational principles. The processes were designed to provide a framework that would allow staff to make the best use of existing technologies, while allowing some variation for the preferences of programmes, modules or individual markers. These processes were promoted across the institution (initially without compulsion to adopt them).

As well as addressing mainstream practice, the University supported 20 innovative assessment projects across the institution that used different methods of providing feedback to students, backing enthusiast academics who went on to influence other colleagues. The project blog shows the outcomes of staff and student evaluation of these initiatives.

‘We were aware that there was varied practice in delivering feedback to students, which was mostly paper-based and often not collected. Often assessment processes had not been designed holistically, and administrative staff time, paper and space costs were not considered. Therefore, in designing new coursework assessment processes, we adopted a holistic view, balancing the needs of all stakeholders: academics, administrators, students and the university.’

The project did however note the difficulty of achieving change in a short space of time in an area where existing practices were so deeply embedded:

‘The resistance to change in a few academic areas was very strong and not susceptible to the influence of a short project, even with senior management support.’

‘Recommendations that were acceptable, or even already in place, in some academic areas were unacceptable in others.’

Benefits and outcomes

The following benefits were achieved:

  • A positive cultural shift towards using technology in assessment in many parts of the university
  • Recommended processes that deliver cost savings for students and academic schools in printing and space, and save staff time once the new processes are embedded and familiar
  • Increased awareness of potentially useful technology for giving student feedback, especially audio files and Turnitin’s Grademark
  • Staff time saved through more efficient processes, once up-front training and practice are complete
  • Business process improvements: revised assessment processes will give a more consistent student experience between programmes and modules (for example, most feedback will be delivered through the VLE), reducing the institutional risk of student dissatisfaction
  • Increased resilience of the new assessment processes
  • Increased functionality: more options for providing feedback are available, especially audio
  • Improved interoperability between institutional information systems is being planned
  • Reduction of redundant data
  • Improved performance towards institutional targets, for example student retention, space usage, and maintenance of quality in the face of worsening staff/student ratios
  • Improvements to institutional governance: some regulations are being improved.

Looking to the future, the University would ideally like to find a way to pass student grades from the VLE directly to the student record system without manual intervention. It would also like to store all feedback in one place, so that students can see a longitudinal view of their progress rather than a module-by-module view, and have an online means of reflecting on and engaging with the feedback.
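The direct grade transfer described above is essentially an integration problem between the VLE and the student record system. The sketch below illustrates one way such an automated transfer might look, using generic REST calls; the endpoint paths, payload fields and environment variables are assumptions for illustration only, not a description of Keele’s actual systems or their APIs.

    # Minimal sketch of an automated grade transfer between a VLE and a student
    # record system. Endpoint paths, field names and environment variables are
    # illustrative assumptions, not Keele's actual systems or APIs.
    import os
    import requests

    VLE_BASE = os.environ["VLE_BASE_URL"]    # assumed VLE API host
    SRS_BASE = os.environ["SRS_BASE_URL"]    # assumed student record system API host
    VLE_TOKEN = os.environ["VLE_API_TOKEN"]  # assumed bearer tokens for both systems
    SRS_TOKEN = os.environ["SRS_API_TOKEN"]


    def fetch_vle_grades(module_id):
        """Fetch confirmed grades for one module from the VLE (hypothetical endpoint)."""
        resp = requests.get(
            f"{VLE_BASE}/api/modules/{module_id}/grades",
            headers={"Authorization": f"Bearer {VLE_TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["grades"]


    def push_grades_to_srs(module_id, grades):
        """Write each grade into the student record system (hypothetical endpoint)."""
        for grade in grades:
            resp = requests.put(
                f"{SRS_BASE}/api/students/{grade['student_id']}/modules/{module_id}/mark",
                headers={"Authorization": f"Bearer {SRS_TOKEN}"},
                json={"mark": grade["mark"], "source": "VLE", "status": "provisional"},
                timeout=30,
            )
            resp.raise_for_status()


    if __name__ == "__main__":
        module = "EXAMPLE-101"  # illustrative module code
        push_grades_to_srs(module, fetch_vle_grades(module))

In practice an integration of this kind would also need to handle authentication against both systems, mapping between student identifiers, error recovery and audit logging, which are omitted here for brevity.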

Since project STAF, in the academic year 2013/14 Keele University decided to introduce typed/electronic feedback (e-feedback) as the norm for all summative feedback, with Heads of School able to make cases for exceptions. Keele has a continued focus on assessment and feedback, engaging with initiatives such as TESTA (Transforming the Experience of Students Through Assessment) and the Transforming Assessment pilot scheme run through the Higher Education Academy.

Find out more:

Project STAF final report

Project STAF Blog

Project STAF review of relevant Jisc projects.

Project STAF evaluation information

Project STAF 3 recommended assessment and feedback processes

Contact m.j.street@keele.ac.uk