
EMA at Queen’s University Belfast

Background and context

Queen’s University Belfast (QUB) is a broad-based research-intensive institution with 20 Schools, 11 Institutes, 2 University Colleges and 8 Directorates. The student body is primarily full-time undergraduates from Northern Ireland. This case study looks at work undertaken in the period 2011 to 2014 as part of the Jisc assessment and feedback programme, and particularly the e-Assessment and Feedback for Effective Course Transformation (e-AFFECT) project.

The key drivers for the e-AFFECT project were:

  • The wish to build upon existing good practice developed with the support of the Higher Education Academy Enhancement Academy to enhance the student and staff experience of assessment and feedback.
  • The need to develop an effective institution-wide framework for the management of strategic change.
  • A desire to address a lack of consistency in assessment and feedback practice across the University as evidenced in external (NSS) and internal student surveys.
  • The wish to extend the use of technology already supported by the University to assessment and feedback.
  • The wish to support student attainment and retention in the University.

The project worked across a number of different Schools and began with a ‘baseline’ study of practice in each. Examination of these baselines revealed wide variation in the timing of assessment and feedback and in the ways feedback was provided on coursework and exams in different Schools. The approach taken was one of Appreciative Inquiry: the review of current practice was strictly non-judgemental, and good practice to build on going forward was identified collaboratively.

EMA implementation

The appropriate use of technology played an important part in this work and QUB ensured that pedagogy was always driving technology adoption rather than the other way round. The project set out to identify effective and efficient practices in assessment and feedback for learning across the institution, with a particular emphasis on the role of technology in enhancing these, and to build capacity in the use of assessment and feedback technologies.

It was considered important to ensure that the institution was making best use of the technologies it already had at its disposal before seeking to invest in, and support, new technologies. The technologies available at the beginning of the project were:

  • Queen’s Online VLE (SharePoint) used for e-submission, marking, uploading feedback, and discussion forums and wikis.
  • Questionmark Perception (QMP v5.7).
  • Personal Response Systems (TurningPoint).
  • Turnitin UK used for originality checking; during the life of the project the licence was extended to include PeerMark for peer review and GradeMark.
  • MS Office used for marking and feedback.
  • WordPress used for student blogs.

As a result of the enhancements to practice, QUB now also uses a range of additional tools that are not supported by its Information Services department:

  • WebPA
  • VoiceThread
  • Jing
  • Audacity
  • PeerWise

QUB has undertaken interventions in many parts of the assessment and feedback lifecycle. In the section on ‘specifying’ we look at the way in which an agreed set of educational principles has influenced assessment and feedback across the institution. In the other sections we look at how particular Schools have focused on particular elements of the lifecycle. In all, 14 programme teams took part in the project, directly involving 255 academic staff, 19 administrators and almost 4,500 students.

  1. Specifying:

Earlier work had already examined principles for good assessment and feedback practice – it was felt that, of the principles most commonly espoused in current literature, it would be best to focus on no more than about seven that QUB considered to be the most important. A conceptual model for the use of these principles was developed with the underlying rationale that all assessment and feedback activities should encourage positive motivational beliefs and self-esteem. This model formed the basis of designing project interventions to enhance practice and also now underpins assessment design at programme level.

QUB principles

To facilitate and engender dialogue with and by programme teams around the educational principles, eight cards were developed, each setting out the headline principle, the narrative behind it, suggested ways of accomplishing the principle and different technologies that might be used. The principles cards are available for others to download and use, along with a set of Technology cards. The technology cards are themed by functionality and each card provides information on:

  1. The type of technology
  2. Technology requirements – e.g. licence, permissions, download
  3. Benefits to students and staff in using the technology
  4. Tips for using the technology (where these have been gleaned)
  5. Implementation considerations
  6. Key features set out for easy comparison
  7. Accessibility considerations

There is also an Action Plan template for programme teams, designed to capture where, when and how an activity would take place in the programme. It also captures whether training for staff and/or students is required and any potential barriers to the completion of the proposed action.

  2. Setting:

As well as the implications for the broad area of curriculum design, application of the principles has important implications for the setting of individual assignments for a specific instance of delivery.

Another widespread activity across QUB was the use of the assessment timelines tool from the University of Hertfordshire’s Jisc-funded ESCAPE project to map the assessment and feedback landscape and help with the process of setting assignments (see our case study on the University of Hertfordshire). The timelines were particularly useful in facilitating discussions and action planning. They brought together information from module descriptions and other School data and presented a clear visual summary of the schedule and type of assessments (formative or summative; high stakes or medium stakes) facing students throughout the academic year. This approach has now been built into a Continuing Professional Development event for programme/course teams reviewing or preparing new degree programmes.

A number of different Schools have come up with ways of helping students understand the process of making evaluative judgements on an assignment by engaging them with assessment criteria and standards:

  • staff from the Centre for Biomedical Sciences Education produced a matrix of assessment, content and feedback opportunities across the programmes to identify patterns and demonstrate to students how the programme of work fits across the three years;
  • staff in Midwifery significantly developed the use of assessment criteria as a means of enabling their students to understand what was required and as a basis for the provision of feedback: assessment criteria (instead of guidelines) are provided for essays to make assessment more transparent to students and marking easier for staff; assessment criteria are used in feedback; an assessment rubric has been created using level descriptors; referencing is standardised and penalties are defined (a guide for students has been developed by students); a review of the timing of feedback has led to the publication of dates for students and externals; and generic feedback on common mistakes has been compiled into a bank, with students able to post questions on a discussion forum;
  • staff in Civil Engineering developed workshop materials for students on the criteria and standards for reports;
  • staff in Social Work carried out a review of module content and assessment, mapping the content, skills and assessment of the programme for staff and students;
  • staff in Law circulated exemplars of past work in an effort to engage students with standards and assessment criteria.

  3. Supporting:

The increased clarity about timescales, criteria and standards has benefited students in many Schools across the institution. As well as developing the workshop materials for students on the criteria and standards for reports described above, staff in Civil Engineering also supported students via:

  • on screen provision of feedback to students on a draft graph for coursework
  • on screen provision of feedback to students on a draft flownet for coursework

There was a significant difference between the mean module marks for 2012-13 compared to 2011-12 and a shift in the mark distribution with proportionately fewer fails and third class marks and more first class marks following the interventions. Detailed analysis of student marks over 2 years also confirmed that students who participated in the support activities were less likely to make errors in their final submission.

The programme team for Environmental Planning was particularly interested in developing their students’ feedback literacy and agreed that there should be workshops for students at all levels using exemplars and marking exercises as well as the use of VoiceThread to create tutorial and support materials. Interventions included:

  • Facilitated workshops on assessment and feedback: as a part of the assessment for the module students were required to indicate how they had used the feedback from the first assignment in the next.
  • Jing was used to provide screencasts to support subject specific skills development and to provide formative feedback on students’ design plans.
  • Four VoiceThread tutorial resources were developed, based around four themes, with questions for the students to answer. The aim was to encourage year 1 students to express an opinion on the question posed and then to discuss it effectively with their peers. Tutors provided feedback in VoiceThread on the students’ responses.

In the School of Law the action plan included student-led sessions on feedback and time management. Skills are now mapped throughout the degree programme in an effort to highlight where students have opportunities to be taught, to practise and to be assessed in the identified skills. In an effort to engage students with course material throughout the year, ten online ‘take home’ class tests will be developed using QuestionMark Perception: students must take and pass seven.

  4. Submitting:

QUB has an in-house EMA system used for e-submission: Queen’s Online assignment tool. Whilst policy is made locally, an increasing number of Schools are beginning to mandate e-submission following successful pilots and positive reports from other parts of the University.

Business Management decided to proceed with a trial of the Queen’s Online assignment tool as part of the e-AFFECT project. The trial covered 298 students on two campuses and identified the following advantages:

  • the same submission procedures could be followed by students at both campuses;
  • it was easy to upload and deliver feedback for students – the need for multiple individual emails was eliminated;
  • it was much easier to monitor submission times with e-submission and it was possible to monitor when students viewed their feedback;
  • part-time students did not need to take time off work to submit assignments.

The School of English took use of the EMA tool even further and established e-submission, e-marking and e-feedback for all coursework. An unexpected outcome has been the realisation of how powerful the experience of one School can be in influencing others. Two further Schools, Creative Arts and Education, have now adopted e-submission, e-marking and e-feedback as a result of the positive experiences in the School of English.

  5. Marking and production of feedback:

The School of Psychology delivered all of the following elements of its action plan: new guidelines for feedback on dissertations, an inventory of writing skills, new feedback sheets incorporating the University descriptors and an acceptance that staff should be exposed to each other’s feedback. Following a Review of Feedback workshop, other initiatives include:

  • attempts to standardise feedback across markers;
  • sharing of good practice;
  • tutorial exercises designed to help students interpret feedback;
  • use of the comments function only on documents, rather than track changes;
  • a new moderation policy that includes a view of the feedback provided to students;
  • revised feedback sheets on which staff highlight the single most important aspect to consider for the next assignment.

The QUB Feedback review template is available to download and can be used to initiate discussion among staff around the consistency and quantity of feedback.

Biomedical Sciences is taking steps to ensure feedback is readily comprehensible to learners by using summer studentships to enable students to collaborate with staff in the creation of feedback comment banks to be used with GradeMark and PeerMark.

  6. Recording grades:

Although related to storage of assignment data rather than the actual recording of grades, an initiative worth mentioning here is the creation of an online repository (Vimeo Business) for student films created in Film Studies. The repository overcame the file size limits of the University’s VLE and resolved issues of submission, archiving and access for External Examiners.

  7. Returning marks and feedback:

Audio feedback was trialled in Film Studies and received a very positive response from students who requested further use of this approach. Biomedical Sciences is planning to deliver audio feedback using Jing.

Environmental Planning used Acrobat Pro to provide feedback annotation on assignments in a second year design module. Students could access this feedback on their computers, smartphones or tablets and analysis of the module marks in 2012-13 and 2013-14 demonstrated an upward shift in the profile of marks.

Biomedical Sciences has introduced a ‘marks breakdown’ on exams for each student, with ranking, some statistics and a paragraph from the Module Convenor. In future, similar information on coursework performance will be added.

  8. Reflecting:

QUB has produced some detailed staff and student questionnaires on assessment and feedback. These were originally developed to gain insight into experiences and perceptions of assessment and feedback and the technologies used in order to provide the baseline for enhancement. They are however equally useful as reflective tools to stimulate thinking about individual practice and approaches.

Students’ capacity to reflect and make evaluative judgements has been supported at QUB by the use of peer-review techniques in a number of subject areas. The School of Computer Science used PeerMark (part of the Turnitin suite) to enable students to peer and self-review final project submissions in order to develop their skills in critically evaluating their own work and the work of others.

Benefits

QUB has seen many benefits from the use of EMA across the assessment and feedback lifecycle.

  • The overall approach of using Appreciative Inquiry and supporting teaching staff to plan and implement specific interventions has proven to be an effective means of introducing positive change and is an approach that the University intends to apply to other initiatives.
  • There have been savings in staff time as a result of introducing e-submission, e-marking and e-feedback e.g. the School of English calculated it had saved 20 days of administrative staff time.
  • Feedback is being delivered in time to have a positive impact on learning habits. This is particularly noticeable in subjects such as phonetics.
  • Student attainment has improved in many of the areas where interventions have been undertaken e.g. one Linguistics module has seen a 4% increase in the mean student mark since the introduction of online formative feedback opportunities; Civil Engineering has seen an increase in the mean mark and proportionately fewer fails and third class marks since introducing a range of formative support activities.
  • External Examiners have responded positively to the convenience of being able to access material well in advance of exam boards.
  • Student satisfaction with assessment and feedback has increased year on year since 2012 in both internal and external surveys.

Find out more:

  • The e-AFFECT project has produced a wide range of resources and reports available from the Jisc Design Studio.