Analysing analytics

Being a linguist at heart, I chose to jump into the new Cetis Analytics Series of briefing papers with no. 5, What is analytics? Definition and essential characteristics, by Adam Cooper. Adam considers some definitions of analytics from other sources before plumping for this description:

Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data.

In his exploration of what this means, I was struck by the fact that, as ever, it comes down to knowing what you want to know (or at least having a clear understanding of the area and parameters you want to explore), and understanding what data will help you find that out to an acceptable degree of confidence. It reminds me in some ways of evaluation planning – always a challenging part of any innovation programme or project. I really like the focus on ‘actionable insights’ – analytics is about informing action. One example given of data capture that often fails to generate actionable insights is end-of-module surveys: better design and data analysis would improve them, but really they need to be seen as one piece in a larger action-research activity, complemented by other evaluation activities to inform module redesign.

Emboldened by my understanding of the definition paper, I turned my attention to a longer paper in the series, Analytics for Learning and Teaching, by Mark van Harmelen and David Workman. The authors define learning analytics as:

the analysis of educational data, including data about learner and teacher activities, to identify patterns of behaviour and provide actionable information to improve learning and learning-related activities.

Examples given are to:

  • Identify students at risk so as to provide positive interventions designed to improve retention.
  • Provide recommendations to students in relation to reading material and learning activities.
  • Detect the need for, and measure the results of, pedagogic improvements.
  • Tailor course offerings.
  • Identify teachers who are performing well, and teachers who need assistance with teaching methods.

They believe that this area is ripe for rapid growth, and predict the sector will see mainstream adoption in two to five years. Crucially, they believe that sensible use of analytics will enable institutions to make significant improvements in student learning services. ‘Analytics’ here covers a broad church of activities, from small-scale projects to large-scale activities requiring data warehouses and experienced analysts. Reassuringly, the paper suggests that institutions might want to cut their teeth on the former to build their skills in this area.

As ever, key enablers and challenges to this approach will be human and organisational, rather than entirely technical – it is a means to inform human decision-making in a given organisational context, and as such great data analysis alone will not deliver results.

In terms of pedagogy, one clear message to institutions is to make the analytics serve the pedagogical aims of the faculty or institution, rather than the other way round. It all comes back to knowing what you want to know and asking the right questions.

If you want to dig deeper into the use of analytics in learning and teaching, have a look at the paper, as it gives a number of examples of uses, and explores adoption issues and risks – much more than I can possibly do justice to here!
