
Data, Quality Assurance and Improving the Student Experience

Data is a fact of life for everyone in higher education. And there’s a lot of it. Among other things, there’s data about students, staff, estates and research. At a national level data turns into metrics, benchmarks and performance indicators. There are percentages, proportions, a standard registration population, headcounts and full person equivalents.

Data-driven decision making

This proliferation of internal and external data is leading institutions to take a more data-driven approach to strategic decision making, one that focuses on student outcomes and experiences.

The findings of Higher Education Reviews (HERs) undertaken by the Quality Assurance Agency (QAA) highlight some of these approaches. Data is being used to improve the student experience in many ways: from supporting changes in curriculum and assessment approaches, to strategic decision making on estates and improving key performance indicators. For example, De Montfort University has implemented a new suite of self-service reports and data visualisations that have led to more effective academic monitoring and strategic planning[1].
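To make this concrete, here is a minimal sketch of the kind of per-programme summary that self-service reporting of this sort might produce. It is illustrative only, not De Montfort's actual system: the column names, pass mark and figures are assumptions.

```python
# Illustrative sketch of a self-service attainment summary.
# The schema (programme, student_id, mark) and the pass mark of 40
# are assumptions, not any institution's real reporting model.
import pandas as pd

results = pd.DataFrame({
    "programme": ["BSc Computing", "BSc Computing", "BA History", "BA History"],
    "student_id": [101, 102, 201, 202],
    "mark": [62, 38, 71, 55],
})
results["passed"] = results["mark"] >= 40

# Aggregate to one row per programme: cohort size, mean mark, pass rate
report = results.groupby("programme").agg(
    students=("student_id", "nunique"),
    mean_mark=("mark", "mean"),
    pass_rate=("passed", "mean"),
)
print(report)
```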

A set of examples of good practice from reviews is given at the end of this post.

These new approaches are complemented by the recent development of the Jisc learning analytics service[2] and the Higher Education Statistics Agency's (HESA) current Data Futures work[3].

Data and quality assurance

Going forward, data is becoming even more important in quality assurance. HEFCE's 2016-17 Annual Provider Review (APR)[4] makes use of student and other data that providers already submit to HESA, funders and regulators. Some of these metrics will also be used in the Teaching Excellence Framework (TEF).

Used in both the APR and the TEF:

  • Student non-continuation rates (HESA and ILR)
  • National Student Survey (NSS), commissioned by HEFCE and administered by Ipsos MORI
  • HESA UK performance indicators, based on returns from the Destinations of Leavers from Higher Education (DLHE) survey

Used in the APR only:

  • Student recruitment patterns (HESES and HEIFES, from HESA)
  • Sub-contractual arrangements (HESES and HEIFES, from HESA)
  • Differential student degree outcomes (HESA and ILR)
  • Financial data (HEFCE, SFA)
  • Estates management record (HESA)
  • Assurance information (HEFCE)

Full details of how the metrics will be defined and used are given in the TEF specification for year 2[5]. Within the TEF, assessors will also be supplied with contextual data on each provider. This will include details of the student cohort, such as level of study, age, ethnicity, disability, domicile, entry qualification and subject.
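As a rough sketch of what breaking a metric down by contextual characteristics can look like in practice, the example below calculates an overall non-continuation rate and the same rate split by age group. The data, column names and groupings are hypothetical and do not follow the HESA or TEF definitions.

```python
# Hypothetical example: an outcome metric split by a contextual
# characteristic. Data and definitions are illustrative only and do
# not reflect the HESA/TEF methodology.
import pandas as pd

cohort = pd.DataFrame({
    "student_id": range(1, 7),
    "age_group": ["Under 21", "21 and over", "Under 21",
                  "Under 21", "21 and over", "Under 21"],
    "continued": [True, False, True, True, True, False],
})

overall = 1 - cohort["continued"].mean()
by_age = 1 - cohort.groupby("age_group")["continued"].mean()

print(f"Overall non-continuation rate: {overall:.1%}")
print(by_age.rename("non_continuation_rate").to_string())
```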

Data will also play a role in the ‘Verification of a provider’s approach to its own review processes’ that QAA will be undertaking[6]. The requirement is for a one-off, desk-based process to confirm providers’ readiness to operate within HEFCE’s new approach to quality assessment. The first year will involve pilot activity with a range of providers; verification activity for all remaining providers will be undertaken during 2017-18.

Beyond data and into context

It is important to point out that data isn’t the complete picture for either the APR or the TEF. As James Wilsdon explained in the Metric Tide report on Research Assessment[7]: “Carefully selected indicators can complement decision-making, but a ‘variable geometry’ of expert judgement, quantitative indicators and qualitative measures … will be required.”

Wilsdon’s review identified a number of principles for the use of metrics. One of these was humility: ‘recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment’. Data and metrics can be a way into discussing a particular issue – such as first year retention. They are a useful starting point, which then becomes part of a narrative that provides colour, richness and context to the issue or topic you are interested in. Hence the TEF includes a written submission.

So there is no escaping the use of data in quality assurance, or in most other areas of higher education. However, we need to ensure that ultimately this data helps us improve the learning and teaching delivered to students. Jisc’s co-design challenge[8] examines how institutions can do this and offers opportunities to explore the next steps in making data work for them.

Good practice case studies

  • Manchester Metropolitan University’s programme review and modification procedures provide proportionate and risk-based consideration of programme health and curriculum currency. The ‘Enhancing Quality and Assessment for Learning’ (EQAL) initiative provided the University with an opportunity to review its monitoring of taught programmes. As part of this initiative, a dashboard was developed to replace a more traditional annual monitoring approach. The dashboard provides unit and programme teams with a wide range of quantitative and qualitative real-time data relating to the attainment, achievement and satisfaction of the student cohort to identify trends or concerns.
  • Middlesex University reviews its course monitoring process regularly using a comprehensive range of data, including feedback from students via the NSS. The introduction of new software has enabled the presentation of data in a more accessible format and in a timely manner, such that results of KPIs like NSS and progression can be considered and actioned well before the formal AMR process and monitored at regular intervals.
  • At the Open University there is a well-developed approach to the use of data, with staff presenting analysis in an accessible graphical format to faculties, and for review purposes. In addition, the IET produces institutional-level analysis of key data which feeds into strategic enhancement projects. Reviews make use of data from an extensive range of sources, including surveys, associate lecturers and external examiners. This data is used extensively to monitor students’ progression, retention and achievement and provide interventions where necessary.
  • At Liverpool John Moores University monitoring of student performance and experience key performance indicators is now managed through the University’s centralised online dataset, WebHub. It provides extensive analytical capabilities and monthly updated reports on the University’s key performance indicators, alongside those of higher education sector comparators, which are regularly accessed by 1,400 staff across the University at all levels, from module leaders to the Senior Management Team. Content includes every type of data recorded on the University student records system, including average tariff points, employability and degree class outcomes of graduates, yearly retention and completion rates, and weekly updated student engagement and feedback data. These are accessible at University, faculty, school, programme and module level.
  • The University of Birmingham has well-embedded mechanisms for collecting and analysing data on student learning. A Student Access and Progress Committee considers the performance of different student groups. A range of data is also considered as part of the Annual Review process, with the outputs considered by the University Quality Assurance Committee and subject to high-level academic oversight. Student learning data also enables a local response and provides a means of identifying areas for enhancement. As a result of these processes, an assessment literacy project was identified in 2014-15. Data is also collected on non-academic activities, for example a space utilisation analysis and an annual survey of student and staff satisfaction with the University’s VLE.

Paul Hazell is the Evaluation and Analytics Manager and Marieke Guy is a Data Analyst in the Evaluation and Analytics team at the Quality Assurance Agency for Higher Education.

Links
