I’ve always been fascinated by those old maps whose vast unknown spaces are populated with sea monsters or massive serpents. But I’ve never understood whether the monsters were intended as warnings to stay away or as enticements to explore. I get the same feeling about our co-design challenge of using data as a way to help students, teachers and managers improve learning and teaching. I know there are big, dangerous monsters in that territory, but is that a reason to keep out or a reason to visit? So I have listed five monsters lurking in the territory of data to improve learning and teaching. Which should we stay far away from, and which should we seek out?
- An interesting implication of Jisc’s learning analytics project is that all learning analytics data will be collected in a standard way and stored in the Jisc learning records warehouse. If agreement from all parties is obtained, this dataset could be used to explore all sorts of theories about how learners engage and the links between engagement and outcomes. The dataset could include some or all of VLE usage, attainment data, library and electronic resource usage, and data from attendance systems for students over a number of years. What questions could we try to address by analysing this data? What other data sources could we use alongside it? NSS? UKES? This is perhaps the least ugly of the monsters in this list, as I think in the right hands it could provide fascinating insights for teachers and administrators. But there is still the possibility that too much store is set by the data at the expense of other, equally important measures.
- One of the surprising things about the use of technology for learning and teaching is that there is not a lot of evidence available about what works and what doesn’t. This can make it difficult to choose where to focus efforts to improve teaching and learning using technology. Is it more productive to devote effort to electronic management of assessment or to ensuring widespread lecture capture? Which has the bigger impact on student experience? It could be possible for us to gather, collate and share this evidence more effectively across the sector, making it easier to make the case for investing in technology and to focus efforts on what really matters for students, teachers and ultimately the institution. Could this kind of evidence be used in concert with learning analytics to provide insights on changes in behaviour? Like monster number one, it seems to me that the real risk here is that people just focus on metrics and forget all the other important factors that need to be considered.
- It is not always the most popular comparator in the world of education, but can we learn lessons from the way data is used in sport? Even in football, a complex, dynamic sport not easily reduced to numbers, the use of analytics has had a massive impact. This has not changed the fact that the most important things in football are the human interactions: training, people management, motivation and so on. What data has done is allow those humans to be even more obsessive about identifying the fine details of where performance can be tweaked and improved. I suspect teaching is even more complex than football, and its goals are certainly less clear, but I still wonder whether giving teachers and administrators access to fine details about student engagement and performance in near real time could provide new levels of detail to explore for ideas for improvement. Like the other monsters on this list, the danger here is that we get too caught up in metrics and forget that, above all, human interaction is the most important part of education, or that metrics start to change behaviour as people chase certain numbers believed to be correlated with success.
- The increasing focus on employability could provide the seed for another monster. It could be possible to use learning analytics in concert with employment destination datasets and information about the local and national economy to closely tailor courses and the student experience to maximise employability, and to explicitly design courses to address global skills gaps and particular regional employer needs. While this may be a good idea from an employability point of view, preparation for work is not the sole purpose of all education. Do we risk losing those other valuable benefits of education by stressing employability too strongly?
- Perhaps the biggest, ugliest monster I can think of in this area is the exploration of technologies that monitor human reactions in real time such as Microsoft’s emotion API. The data enthusiast in me can see how that data would be fun to play with and how it could be useful for teachers to refine which parts of lessons were creating a reaction. But could this kind of data really tell you anything meaningful about the complex interactions that make up learning? And even if it could, is it worth the price of the creepiness of this kind of constant monitoring?
Should we stay well away or should we set off exploring? Let us know what you think, either by commenting on the blog or by tweeting with the hashtag #codesign16. If you’d rather express your opinions less publicly then feel free to email me at andy.mcgregor@jisc.ac.uk or complete our form.
3 replies on “Codesign challenge: Here be data monsters”
Thank you for sharing this Andy – this really is a broad and difficult topic. I do have real concerns about individual datasets in particular and collective datasets in general.
We are approaching the topic of learning analytics without first having a discussion about the rights of the individual and the ethics that should underpin any approaches impacting on our privacy and/or freedom of learning and expression. You mention there is some ‘creepiness’ in constant monitoring – it’s much more than this. Are we going to analyse the reaction of a student (perhaps funded by a foreign government) when a controversial political topic is discussed in a classroom?
I strongly believe that we need a fundamental review of what our rights and responsibilities as educators are in this new digital age before we then progress onto ‘exploring’ these uncharted waters. Otherwise it’s like setting off without a map!
Interesting, thanks Sarah. I agree that there are some troubling implications from these emerging approaches, especially reaction monitoring. There are a few people across the sector who are already exploring these areas. What kind of object would you imagine putting in the hands of these early explorers? A code of practice like we produced for learning analytics? A set of principles? Something else?
Hi Andy, these are really good questions.
A code like the learning analytics code of practice would be useful, but a consultation on key ethical principles to underpin any code would be essential to my mind. This could then inform multiple activities – I doubt we would want a code of practice for every emergent technology under consideration. I envisage the key ethical principles being similar to the role medical ethical guidance plays in medicine and healthcare issues.
This would help us understand and make informed decisions about what is ‘appropriate’ e.g. in the learning analytics code of practice it says ‘Analytics systems and interventions will be carefully designed and regularly reviewed to ensure that: Students maintain appropriate levels of autonomy in decision making relating to their learning, using learning analytics where appropriate to help inform their decisions’. I also notice that in Niall Sclater’s very thorough summary ‘A taxonomy of ethical, legal and logistical issues of learning analytics v1.0’ that in the list of stakeholders he comments ‘while students are potentially impacted by almost every issue here, they are primarily responsible themselves for dealing with a few of them.’ A further discussion around digital ethics in education may reveal that this needs to change for some groups of students – maybe dependent on age etc.
I envisage some tension between a code of practice developed for institutions and what we as educators feel should be the correct approach i.e. what does appropriate really mean in different contexts? So I’m feeling there needs to be more discussion around digital ethics within an education context and that includes issues such as:
- Unintended or unplanned consequences of learning analytics and other technologies, and of the collection of data (both personal and aggregated/anonymised) – including the impact on staff: if student data is scrutinised, is the next step looking at staff engagement with virtual learning environments?
- Third-party providers and the impact and risks this introduces for all stakeholders
- The ethical principles around ownership and safeguarding of personal individual data
I know you are considering these issues so my comments aren’t in any way suggesting that these important issues have been ignored. Keep up the good work and I’m now wondering if we are talking about needing a compass on our journey?