I’ve always been fascinated with those old maps with the vast unknown spaces populated with sea monsters or massive serpents. But I’ve never understood whether the monsters were intended as warnings to stay away or as enticements to explore. I get the same feeling about our co-design challenge of using data as a way to help students, teachers and managers improve learning and teaching. I know there are big, dangerous monsters in that territory but is that a reason to keep out or a reason to visit? So, I have listed five monsters lurking in the territory of data to improve learning and teaching. Which should we stay far away from and which should we seek out?
- An interesting implication of Jisc’s learning analytics project is that all learning analytics data will be collected in a standard way and stored in the Jisc learning records warehouse. If agreement from all parties is obtained, this dataset could be used to explore all sorts of theories about how learners engage and the links between engagement and outcomes. The dataset could include some or all of VLE usage, attainment data, library and electronic resource usage, and data from attendance systems for students over a number of years. What questions could we try to address by analysing this data? What other data sources could we use alongside it? NSS? UKES? This is perhaps the least ugly of the monsters on this list, as I think in the right hands it could provide fascinating insights for teachers and administrators. But there is still the danger that too much store is set by the data at the expense of other, just as important, measures.
- One of the surprising things about the use of technology for learning and teaching is that there is not a lot of evidence available for what works and what doesn’t. This can make it difficult to choose where to focus efforts to improve teaching and learning using technology. Is it more productive to devote effort to electronic management of assessment than to ensuring widespread lecture capture? Which has the bigger impact on the student experience? We could gather, collate and share this evidence more effectively across the sector, making it easier to make the case for investing in technology and to focus efforts on what really matters for students, teachers and ultimately the institution. Could this kind of evidence be used in concert with learning analytics to provide insights into changes in behaviour? Like monster number one, it seems to me that the real risk here is that people focus only on metrics and forget all the other important factors that need to be considered.
- It is not always the most popular comparator in the world of education, but can we learn lessons from the way data is used in sport? Football is a complex, dynamic sport not easily reduced to numbers, yet analytics has had a massive impact on it. Data has not changed the fact that the most important things in football are the human interactions: training, people management, motivation and so on. What data has done is allow those humans to be even more obsessive about identifying the fine details of where performance can be tweaked and improved. I suspect teaching is even more complex than football, and its goals are certainly less clear, but I still wonder whether giving teachers and administrators access to fine details about student engagement and performance in near real time might provide new levels of detail they could explore to find ideas for improvement. As with other monsters on this list, the danger is that we get too caught up in metrics and forget that human interaction is, above all, the most important part of education, or that metrics start to change behaviour as people chase numbers believed to be correlated with success.
- The increasing focus on employability could provide the seed for another monster. Learning analytics could be used in concert with employment destination datasets and information about the local and national economy to closely tailor courses and the student experience to maximise employability, and to explicitly design courses that address global skills gaps and particular regional employer needs. While this may be a good idea from an employability point of view, preparation for work is not the sole purpose of education, and do we risk losing its other valuable benefits by stressing employability too strongly?
- Perhaps the biggest, ugliest monster I can think of in this area is the exploration of technologies that monitor human reactions in real time such as Microsoft’s emotion API. The data enthusiast in me can see how that data would be fun to play with and how it could be useful for teachers to refine which parts of lessons were creating a reaction. But could this kind of data really tell you anything meaningful about the complex interactions that make up learning? And even if it could, is it worth the price of the creepiness of this kind of constant monitoring?
Should we stay well away or should we set off exploring? Let us know what you think, either by commenting on the blog or by tweeting with the hashtag #codesign16. If you’d rather express your opinions less publicly then feel free to email me at firstname.lastname@example.org or complete our form.