
TRansforming Assessment + Feedback For Institutional Change (TRAFFIC) at MMU

Background and context

Manchester Metropolitan University (MMU) is the largest campus-based undergraduate university in the UK with a total student population of more than 37,000. The university offers over 1,000 courses and qualifications, the majority of which have a strong professional bias.

In 2010 MMU set up the EQAL Programme: ‘Enhancing Quality and Assessment for Learning’. The goal was to make a step-change improvement in student satisfaction by refreshing the entire undergraduate curriculum whilst simultaneously re-engineering administrative processes and creating a seamless personalised experience that wrapped the university’s information and online resources around each learner. The ambitious deadline was to deliver a brand new, technology-supported first year for September 2011 with the new second year starting September 2012 and the new final year September 2013.

The TRansforming Assessment + Feedback For Institutional Change (TRAFFIC) project formed part of the EQAL programme and carried out a thorough review of policies, procedures and practice relating to assessment. At the beginning of the project MMU had many separate systems for managing the various parts of the assessment process, some of which were digital, some of which relied on paper files, and some of which simply relied on people’s memories and transmission of custom and practice.

To support its work the project team developed an academic model which gave a lifecycle view of assessment and feedback (for details and further discussion of this model see the Jisc EMA blog). The lifecycle model offers a ready means of mapping business processes and potential supporting technologies against academic processes and has been used as the basis for the Jisc EMA research of which this case study forms a part.

EMA implementation

Despite the impressive technical developments, the team notes that technical work actually came quite late in the project and the real work was about getting the right policies and procedures in place to support effective academic decision-making. Aspects of the implementation that relate to specific areas of the assessment and feedback life-cycle include:

1. Specifying: as part of the overall undergraduate curriculum change, MMU standardised the credit size of modules (30 credits) and limited the number of summative assignment tasks per module to 2. This was in response to student feedback that there were too many assessment points: up to 20 in a year for some students. The maximum number of summative assignments per student is now 8 per year, including examinations.

An online proforma was designed to capture new unit specifications and store the standardised descriptions in a database. All undergraduate units are now included in this database and the records include information about the type and weighting of the assessment and the skills students are likely to demonstrate in completing the work. This information is linked directly to the student records system (SRS) in order to give a basic picture of the requirements for each unit. For other qualifications, assessment type and weighting information is added manually to the SRS.
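The standardised unit records described above can be sketched as a simple data model. The class and field names below are illustrative assumptions rather than the actual proforma or SRS schema, but the validation mirrors the stated rules (30-credit modules, at most two summative assessment tasks) plus the plausible, assumed check that element weightings account for the whole unit mark:

```python
from dataclasses import dataclass, field

MAX_SUMMATIVE_TASKS = 2   # MMU policy: at most two summative tasks per module
MODULE_CREDITS = 30       # MMU's standardised module credit size

@dataclass
class AssessmentElement:
    task_type: str                                  # e.g. "essay", "exam"
    weighting: int                                  # percentage of the unit mark
    skills: list[str] = field(default_factory=list) # skills demonstrated

@dataclass
class UnitSpecification:
    unit_code: str
    title: str
    credits: int = MODULE_CREDITS
    elements: list[AssessmentElement] = field(default_factory=list)

    def validate(self) -> None:
        """Check the record against the standardised rules before it is
        linked to the student records system (SRS)."""
        if len(self.elements) > MAX_SUMMATIVE_TASKS:
            raise ValueError("too many summative tasks for one module")
        # Assumption: weightings for a unit must total 100%
        if sum(e.weighting for e in self.elements) != 100:
            raise ValueError("weightings must total 100%")
```

A record built this way carries exactly the information the case study says the database holds: assessment type, weighting, and the skills students are likely to demonstrate.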

MMU guidance on Specifying.

2. Setting: at the beginning of the project, there were no particular institutional expectations about the content and format of assignment briefs although most programme teams had a standardised approach across the programme. MMU now has a consistent structure for assignment briefs, supported by clear guidance on assignment task design and size, developing appropriate assessment criteria, and best practice on feedback and moderation for different types of task.

Analytical work has also raised awareness about the problems of clustering of assignment deadlines and programme teams have been provided with guidance on effective planning of assignment deadlines.
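The deadline-clustering analysis mentioned above can be sketched as a simple check over a programme's deadline calendar. The threshold is an illustrative choice, not MMU policy:

```python
from collections import Counter
from datetime import date

def clustered_weeks(deadlines: list[date], threshold: int = 3) -> list[tuple[int, int]]:
    """Flag ISO (year, week) pairs in which a programme has `threshold`
    or more assignment deadlines -- the kind of clustering the project's
    analytical work surfaced for programme teams to plan around."""
    counts = Counter(d.isocalendar()[:2] for d in deadlines)  # (ISO year, ISO week)
    return sorted(week for week, n in counts.items() if n >= threshold)
```

Running this over a programme's approved submission dates would give teams a quick view of which weeks overload students.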

MMU guidance on Setting.

3. Supporting: ‘supporting’ covers everything that teams do to prepare students for assessment, but teams are encouraged to be specific about how aspects such as assignment tutorials, formative tasks (such as regular MCQ tests or the submission of drafts) and the related provision of feedback contribute directly to a student’s preparation. Improvements in the ‘setting’ stage, such as clearer guidance on the use of grade descriptors and assessment criteria, also mean better support for students. See also 6 below for information on support during reassessment.

MMU guidance on Supporting.

4. Submitting: in 2011, a system which had been developed in one faculty to log and track submissions was rolled out across the whole university. The Coursework Receipting System (CRS) takes a feed of approved submission dates for assessment elements from the Student Records System and provides students with bar-coded coversheets for tracking paper submission of those assessment elements. The CRS has also been enhanced so that it can record submission of assignments submitted online via the VLE (Moodle Assignments and Turnitin Assignments). The student-facing view was initially envisaged as an intranet-style, standalone, password-protected personalised website, but a web-service extension was soon developed to enable the personalised assessment information to be presented in the Moodle VLE, the SharePoint Portal and the CampusM mobile App. As the TRAFFIC project progressed, the data flowing through the CRS was expanded in scope from assessment dates to encompass feedback return dates (to reinforce the MMU Commitment – see 5 below) and provisional marks.
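A personalised feed of the kind the CRS web service exposes to Moodle, SharePoint and the CampusM app might look something like the following sketch. The field names and payload shape are assumptions for illustration, not the real CRS schema, but the four-week feedback date reflects the MMU Commitment described in 5 below:

```python
import json
from datetime import date, timedelta

FEEDBACK_WINDOW = timedelta(weeks=4)  # the MMU Commitment on feedback return

def assessment_feed(student_id: str, elements: list[dict]) -> str:
    """Build a personalised JSON feed of assessment information per student.
    Field names are illustrative, not the actual CRS web-service schema."""
    return json.dumps({
        "student_id": student_id,
        "assessments": [
            {
                # Consistent element identifiers are what let the data be
                # reused across Moodle, SharePoint and CampusM
                "element_id": e["element_id"],
                "submission_deadline": e["deadline"].isoformat(),
                "feedback_due": (e["deadline"] + FEEDBACK_WINDOW).isoformat(),
                "provisional_mark": e.get("mark"),  # null until marked
            }
            for e in elements
        ],
    })
```

The key design point the case study highlights is that a single service like this, keyed on shared identifiers, can feed several front ends rather than each system holding its own copy of the data.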

MMU guidance on Submitting.

5. Marking and production of feedback: at the beginning of the project there were no particular institutional expectations about marking or feedback, nor was there any guideline about the time to mark and return student work. The MMU Commitment to return feedback within four weeks was introduced towards the end of the first year of the project. New procedures to support consistent approaches to marking, moderation and feedback were also introduced as part of the project.

MMU guidance on Marking and production of feedback.

6. Recording grades: at the beginning of the project practice was varied; unit leaders now have clear responsibility for managing the marking and moderation process and ensuring that final grades have been entered into the Student Record System by four weeks after the assignment submission deadline, at which point the grades are automatically released to students.
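The automatic release rule described above reduces to a simple date calculation; this is a sketch of the rule as stated, not MMU's actual implementation:

```python
from datetime import date, timedelta

RELEASE_AFTER = timedelta(weeks=4)  # grades due in the SRS by this point

def release_date(submission_deadline: date) -> date:
    """Date on which final grades for an assignment are automatically
    released to students from the Student Record System."""
    return submission_deadline + RELEASE_AFTER

def is_released(submission_deadline: date, today: date) -> bool:
    """Whether grades for this assignment should now be visible to students."""
    return today >= release_date(submission_deadline)
```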

The project also acted on a recommendation to focus more effort on the re-assessment period: the time between the board of examiners’ meetings and the deadline for re-submission of failed work or the re-sitting of failed examinations. All faculties are now required to put in place a clear plan for this period including elements such as a consistent approach to providing reassessment information and resources in the VLE, a rota of staff to cover student queries, telephoning all students who have been given a reassessment opportunity and briefing student support staff about arrangements and staff availability.

MMU guidance on Recording grades.

7. Returning marks and feedback: at the beginning of the project practice was varied. There is now an institutional commitment to a four-week deadline; students now have clear, personalised information about when marks and feedback will be received and the return of marks to students is done automatically via a feed to their Moodle area from the Student Record System. Electronic submissions are normally returned to students using the same tool with which they were submitted for marking.

MMU guidance on Returning marks and feedback.

8. Reflecting: MMU encourages students to reflect on their own performance and make themselves a personal action plan for the future, as well as requiring tutors to reflect on the effectiveness of each part of the assessment cycle from setting to the return of work. As a result of this work MMU has also decided to include data about unit reassessments in its performance monitoring and provision of support for individual units. This should enable heads of department and programme leaders to target resources more quickly on units which staff and students perceive as ‘difficult’ in assessment terms.

MMU guidance on Reflecting.

Benefits

  • Progress at MMU on supporting the timely, personalised information agenda has exceeded expectations, with all students receiving personalised assessment schedules setting out their submission deadlines, the dates on which they can expect feedback, and provisional coursework marks. The service-oriented architecture adopted ensured that data from the custom Coursework Receipting System could be harvested and published in the Moodle VLE and on students’ mobiles using the CampusM App. The combination of web services and consistent use of identifiers for students, modules and assessment elements enabled integration across different systems, and other institutions should consider this approach.
  • Use of the assessment lifecycle has enabled MMU to map out the roles and responsibilities of teams and individuals throughout the whole lifecycle in order to identify key dependencies within the academic process from both a technical and administrative perspective and, as a result, colleagues in different teams are better able to take a joined-up approach to enhancing the overall assessment and feedback experience.
  • Key policy documents have been rewritten in order to ensure clarity and to remove any confusion between the requirements of the institutional framework which needs to support effective processes and the maintenance of academic standards, and decision-making about academic issues such as choices of assignment type and size, and feedback strategy, which need to be retained within programme teams. The institutional code of practice on assessment and associated procedures for assignment briefs, marking, moderation and feedback to students has been revised.
  • Staff development resources have been produced and made available online using Creative Commons licences.
  • Before the curriculum review the University was handling around 620,000 pieces of coursework annually. In 2012/13, this reduced to around 400,000.
  • The project has made good use of both qualitative and quantitative data to support decision-making and is continuing to make progress in the area of assessment analytics.
  • Delivering university-wide change at this pace has given MMU a new confidence about its ability to make institution-wide improvements. It has demonstrated to the whole UK HE sector that change on this scale is possible and Jisc funding has enabled MMU to share lessons with universities throughout the country.
  • The project has supported an institution-wide dialogue on improving assessment and feedback that coincides with improved NSS scores. Although the new final year only went live in September 2013, NSS results have shown progress in MMU’s organisation and management, assessment, learning resources and overall satisfaction scores:

    NSS 2012: organisation and management up 7%; assessment up 6%; learning resources up 6%; overall satisfaction up 6%

    NSS 2013: organisation and management up 2%; assessment up 2%; learning resources up 1%; overall satisfaction up 3%

Find out more: