Institutional to Individual: realising the postdigital VLE?

Originally posted on lawrie : converged.

Digital is people – it’s not the first time this has been said on this blog, and I am sure it won’t be the last.

One of the unique things about people, according to Chomsky, is our ability to communicate through language. Institutions are made up of people, and there is now a huge diversity of spaces where conversations are happening. When it comes to learning and teaching, it is becoming less useful for students and staff to think of learning spaces as formal or informal. The delineation is artificial: students might consider any space that contains a member of staff to be formal, and they do not think "ah, this is a less formal learning situation than the one I was in earlier!" For the same reason it is inappropriate to think that learning spaces (especially digital ones) can be neatly categorised. There are many examples of students and staff setting up social media spaces, and of tutors using tools such as Slack and Skype in their teaching practice.

Learning is effective when we are connected, in conversations, in groups. And given the number of spaces where conversations can happen, inside and outside institutional systems, it is unrealistic to try to confine them.

Command and Control

Institutional VLEs provide functions that support learning, for example discussions, posting and sharing content, and wikis for group work. However, provision of and access to these tools are controlled by the institution, and they often do not align easily with the day-to-day practice of either teachers or learners. This may be one of the drivers behind the shift away from centrally provided systems and towards a more individualised approach for teaching staff and students.

New Approaches

In June Jisc and Leeds Beckett University hosted a workshop around the Codesign Challenge: Next Generation Digital Learning Environment. The event looked at approaches to working with existing virtual learning environments and third party applications such as the Known platform and Aula to extend their reach, to open up the opportunities for conversations.

At Leeds Beckett University Simon Thomson, Head of Digital Pedagogy, is leading a project to explore how the development of a personal student space can link up with institutional learning spaces. The HEFCE-funded Personalised User Learning & Social Environments (PULSE) research project is looking to develop a hub for connecting students’ existing spaces with institutional spaces and empowering students to take ownership of their content within and beyond their learning. They are not trying to develop an entirely new learning platform, but focusing on the architecture through which to connect existing spaces.

The Aula team have taken a slightly different approach. Recognising the trend toward disaggregation, and the possible motivations behind it, they have created a “conversational layer” that can run alongside an existing VLE and provide an ecosystem for a range of other tools.

PostDigital VLE?

In 2009 I co-authored “Preparing for the Postdigital”. The premise of the paper was a simple quote from Douglas Adams:

“We are stuck with technology when what we really want is just stuff that works”

The paper went on to articulate an approach that avoided the digital/analogue distinction and moved the discussion to something more nuanced and less divisive: ‘a human context that focuses on the essence of our work rather than the appearance.’

As the discovery phase of the Next Generation Digital Learning Environments work draws to a close and we collate our ideas and examples of practice, I have begun to revisit those initial ideas around postdigital practice. The original paper tried to frame a person-centred, flexible pedagogy nested within a set of social spaces. Postdigital aimed to remove digital dogma, where the language of a perceived digital elite, be that technorati or edupunks, drives development and skews innovation towards either the politicised or the latest technology. We are now witnessing the breaking down of the VLE: teachers use whatever tools they need to do what they need, and conversations happen in myriad online spaces both within and outside institutional control, driven by the user. I am struck by the initial premise of the paper, and the realisation that teachers and students just want stuff that works, for them. Is this emerging trend toward disaggregation an indicator of postdigital academic practice? The postdigital VLE made real?

Going Further

When we start to examine the trends around disaggregation for practice purposes, we also need to be mindful of other themes emerging within digital spaces: control, surveillance, “weaponized” metrics such as those used by corporate bodies outside of learning, and of course how the algorithms within those spaces use data to shape what our feeds show us.

The report is coming together nicely, but if you have use cases you think we should be using, get in touch and tell us about your practice, your digital.



Designing for Digital Capabilities in the Curriculum

Originally posted on Inspiring learning.

Over the last couple of weeks or so the Connect More events (#ConnectMore17) have been taking place across the UK. This year’s events focussed on the digital capabilities that every practitioner, teacher, librarian and adviser in UK higher and further education needs.

One recurring thread running through Connect More is ensuring that digital is fully embedded into the curriculum. This posed a number of key questions, such as:

  • What do we do well already and what lessons can we learn from peers?
  • How do we ensure that the digital we embed in the curriculum is aligned to student demand?
  • How can we support curriculum staff to develop the digital capabilities required to meet that student demand?

Image available on Pixabay under a Public Domain licence.

The sessions on Curriculum confidence: designing for digital capabilities in the curriculum addressed these very questions and presented attendees with a number of Jisc resources that can help institutions embed digital.

First of all, there’s the Digital Capability Activity Cards. These bitesize cards can be used with groups/individuals to surface the kinds of digital activities curriculum staff already do well with learners, mapped to the Jisc digital capability framework. We did an activity at the Birmingham and Sheffield Connect More events where we asked attendees to reflect on their own activities with learners and some of these reflections are captured here. Further case studies are also available from the Jisc guide Developing Organisational Approaches to Digital Capability.

Ensuring technology is aligned to student need requires a collaborative approach to technology adoption. Conversations with learners themselves are critical. Jisc has recently published the outcomes from the open pilot of the Student Digital Experience Tracker, which includes responses from over 22,000 students at 74 UK institutions. The report contains a wealth of data providing invaluable insights into students’ expectations, across both HE, and FE and Skills.

Jisc has also produced the Digital Capability Curriculum Mapping document, which encourages curriculum staff to explore their programmes of study through a ‘technology lens’ and identify areas where digital can be embedded to enhance learning outcomes.

Finally, there have also been further developments on helping institutions to assess and support the digital capabilities of staff. The HE Teacher question set and FE and Skills Teacher question set can either be used as standalone resources or to complement the existing work done to date on the discovery tool. Both are intended to identify how digital can augment the pedagogic aspects of teaching, from the creation of digital learning resources to ensuring that accessibility issues have been considered when introducing new technologies.


The McMaster Summer Institute for Students as Partners 2017: living and breathing partnership internationally

Originally posted on Change Agents' Network.

We are delighted to have a guest post from Dr Catherine Bovill, Senior Lecturer in Student Engagement, Institute for Academic Development, University of Edinburgh:

The McMaster Summer Institute for Students as Partners was first run in 2016 and brings together researchers and practitioners interested and active in Students as Partners (SaP) work across the world. The second Summer Institute was held in May 2017 in Hamilton, Canada, and focused on three strands of work: 1) SaP workshops to support students and staff to examine their existing SaP practice as well as planning for new SaP initiatives; 2) a Change Programme for teams of staff and students planning to enact SaP cultural change in their institutions; and 3) a writing retreat for staff and students collaborating in writing about SaP. I was fortunate to be invited to co-lead the workshops at this second Summer Institute with Sophia Abbott, Post-Baccalaureate Fellow for Collaborative Programs at Trinity University, Texas, USA, and Lucy Mercer-Mapstone, PhD student in the Centre for Social Responsibility in Mining and Project Lead for Students as Partners Program Design at the University of Queensland, Australia. We were a truly international facilitation team.

A vision of student-teacher partnership with ever increasing responsibility taken by students throughout their degree – thanks to Christel Brost, Malmö University, Sweden for the art work

Six months prior to the Summer Institute, Sophia, Lucy and I met for the first time via Skype to get to know one another and to start planning the two SaP workshops, focused on Course and Curriculum Design and on Teaching, Learning and Assessment. This could have been very challenging: three people from different backgrounds coming together to co-facilitate four full days of workshops for staff and students from around the world. In reality, the experience was phenomenally positive. I have often reflected upon how many people working in the SaP field are quite like-minded: they are not only interested in partnership, but keen to walk the walk and model good partnership in practice. Thankfully, this was our experience of co-facilitation: we shared many of our goals and approaches to working, whilst remaining intellectually challenging and critical with one another.

Indeed, this heady mix of collegiality and criticality seemed to engender further and deeper exploration and thinking during the Summer Institute.

The Summer Institute as a whole attracted nearly 80 participants from nine countries; half of whom were students and half, staff members. Creating an ethos of collegiality and criticality within the workshops was key to enabling individuals and groups from institutions around the world to work constructively with one another. The workshops included participants from the UK, Sweden, the Netherlands, USA, Canada and Grenada. Some participants were very experienced in undertaking SaP work and had quite well developed SaP initiatives within their institutions. Other participants were new to SaP, such as the colleagues from Grenada. One of the most rewarding parts of the workshops was when some of our Grenadian colleagues reported how they had moved from feeling like outsiders, completely new to SaP and with little to share, to feeling that they were included and valued members of the workshops with ideas and perspectives of great value to others. This sense of sharing one another’s experiences and supporting one another in thinking through the challenges of culture change within different institutions is absolutely at the heart of the Summer Institute for SaP.

I will share briefly three elements of the workshops that were highlights for me personally. First was a session I led on the importance of behaviour and language being consistent with the principles and values of SaP. This seemed to capture the imagination of participants and I appreciated the flexibility with which Sophia, Lucy and I worked together to provide more time for participants to explore this crucial aspect of SaP practice in more depth. According to feedback from the workshops, this session, and the extra time given to it, was clearly appreciated.

Twitter post demonstrating enjoyment of Sarah Dyer’s Appreciate cards – thanks to Rachel Braun, University of Calgary, Canada for the original Tweet

The second short session I enjoyed was when we used the Appreciate partnership cards produced by Sarah Dyer, University of Exeter. Participants were able to take time to try out some of the activities suggested by the cards as well as to reflect upon how they might use the cards in their own SaP work. Finally, I really enjoyed and learned from seeing the ways in which participants envisioned ideal forms of student-staff partnership. It was great to have space to discuss the thinking behind these visions and to see the visions inform the planning of real SaP activities across institutions internationally. It was a busy time in Canada – tiring but incredibly rewarding.

The Appreciate Partnership cards created by Sarah Dyer with support from the UK Higher Education Academy are available online:

Consent and the GDPR: what approaches are universities taking?

Originally posted on Effective Learning Analytics.

We’ve already published some practical guidance for institutions on how to interpret and apply the new EU legislation – the General Data Protection Regulation – with regard to requesting student consent for the use of their data for learning analytics. We suggested that institutions should:

  • Not ask for consent for the use of non-sensitive data for analytics (our current understanding is that this can be considered a matter of legitimate interest or public interest)
  • Ask for consent for use of sensitive data (which, under the GDPR, will be called “special category data”)
  • Ask for consent to take interventions directly with students on the basis of the analytics
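The three suggestions above amount to a simple decision rule. As a purely illustrative sketch (the function name, parameters and category labels here are hypothetical, not part of any Jisc system or legal advice), the logic might be expressed like this:

```python
def lawful_basis(is_special_category: bool, purpose: str) -> str:
    """Return the suggested lawful basis for a learning analytics use.

    is_special_category: True if the data is "special category" (sensitive)
        data under the GDPR, e.g. ethnicity.
    purpose: either "analytics" (collection and analysis) or
        "intervention" (contacting a student on the basis of the analytics).
    """
    if purpose == "intervention":
        # Taking interventions directly with students requires consent.
        return "consent"
    if is_special_category:
        # Special category data always requires consent.
        return "consent"
    # Non-sensitive data used for analytics can rely on
    # legitimate interest or public interest.
    return "legitimate or public interest"

print(lawful_basis(False, "analytics"))     # legitimate or public interest
print(lawful_basis(True, "analytics"))      # consent
print(lawful_basis(False, "intervention"))  # consent
```

The point of the sketch is simply that consent is only one of several lawful bases: it is reserved for sensitive data and for direct interventions, while routine analytics on non-sensitive data proceeds on a different footing.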

As this is an important topic, we asked seven of our pathfinder institutions – those which are moving forward fairly rapidly with implementing Jisc’s learning analytics architecture – how they are approaching the area of requesting consent for the collection and use of student data.

Are the institutions following our suggested approach?

Six of the seven universities said that they were basically taking our suggested approach, outlined above. One is going to seek consent for the use of special category data (e.g. ethnicity) from returning and newly enrolling students for the purposes of learning analytics. They will not yet be ready to implement alerts and interventions, so will not at present be seeking consent for interventions based on risk calculations.

Another institution is drafting a new data collection notice which includes specific reference to using engagement data. They’re not yet using special category data for learning analytics but will seek consent if they do so in the future. Neither have they yet agreed a formal approach towards obtaining consent for interventions.

The seventh university is being “more conservative” (their words) by requesting consent from students taking part in a pilot for the use of any of their data, not just their special category data. This institution says that it interprets the law such that if use of the data is not contractual then it must be based on consent, and it does not consider pilot projects to be part of the contract with students. With full-scale implementation of learning analytics, however, this university would look to tie its use to the learning and teaching contract instead, with consent being sought only for the use of special category data and for interventions.

What wording are they using?

Our draft learning analytics policy and student guide, based on those developed for the University of Gloucestershire, are being used by several of the institutions. One of these recently requested a clarification on the sensitive data part of our model LA policy, which we propose to adapt by adding the final clause in the following:

Any use of such data for learning analytics will be fully justified, documented in the Student Guide to Learning Analytics and require the consent of the student concerned.

Another university is awaiting further advice from the Information Commissioner’s Office before coming up with a form of words on the enrolment form which would clarify what student data they are processing and why.

One of our other pathfinder institutions is planning to update their consent notice to include a note about attendance capture, also adding a line about learning analytics with links to their LA policy and student guide.

A further institution said that it’s in the process of finalising the wording of its consent notice, which is short and concise, and preceded by an email from their student union president, explaining what learning analytics is and why consent is being requested.

How are staff and students being informed?

The approach varies from one university providing a basic information webpage on what learning analytics is and what they’re planning to do with it – to another which is about to discuss four documents at its learning, teaching and assessment committee: LA policy, LA student guide, FAQs and data collection notice. A few institutions have already put such documents through their relevant committees.

The university mentioned earlier, which is carrying out a pilot and currently requesting consent for all data to be used for learning analytics, provides a consent pop-up box at point of enrolment for those students on the pilot. This includes a link to its learning analytics policy. A director of ethics has been appointed and has oversight of the policies and wording. Their staff are kept informed through face to face engagement.


There seems to be broad agreement among our pathfinder institutions that our suggested approach relating to the GDPR and consent makes sense for them. We will obviously keep this under review and will also be keeping a close eye on further guidance from the Information Commissioner’s Office.

My main concern here is that any student who does not opt-in to the inclusion of their data in learning analytics potentially disadvantages both themselves and their peers, by reducing the size and coverage of the data set – and hence its usefulness.

Our proposed approach gives learners control over whether the institution can use their special category (sensitive personal) data for learning analytics and whether they’re happy to be contacted in the event that they’re deemed to be at risk. But it still allows the rest of the data to be collected and analysed by the institution in order to enhance education for current and future students.

There’s a balance to be struck here between student privacy and what the institution regards as in the best academic interests of learners. Importantly, this approach helps to protect students and means that institutions can proceed with learning analytics while staying within the law.


Get involved in shaping the Jisc digital capability service

Originally posted on Jisc digital capability codesign challenge blog.

Would you like to help shape the digital capability service that Jisc will be launching in late 2018? We would love to hear your views on what would make the service most useful to you and what you would expect to find there.

As the various strands of the Jisc Building Digital Capability project progress, we are looking at the best way to bring them together into a service for our members. We are currently doing some visioning work, trying to imagine what the Jisc digital capability service might look like and what other strands of work we need to take up to deliver it to our members.

Apart from the work we are doing internally (such as a cross-Jisc workshop for staff to develop ideas and prototype the service), we are very much interested in hearing the voice of our members to ensure their requirements are met.

We have been asking:

What is the one key thing that you would like Jisc to provide you and your institution with in relation to digital capability? What would it look like and what would you do with it?

The answer that came up again and again was a student-facing version of the discovery tool. This would be useful throughout the student journey: on entry, for students to self-reflect on their digital capabilities, and later on at various points to see how they are progressing.

Another part of the service recognised as important was the work already taking place around the framework, with the addition of practical approaches and stories about how people have implemented it at a strategic level; see our case studies of institutions who are developing their own approaches. It would be useful to have a template or an action plan setting out what worked well and what steps were taken to implement practice: the key people you need to speak to, where you get the funding from, and how you evaluate your progress. Our digital capability audit tool and curriculum checklists, linked from this page, are a starting point for supporting discussions at a strategic and curriculum level.

Some respondents envisaged the digital capability service as a one-stop shop for the full range of Jisc resources, case studies and well-researched content. Others included training materials and online workshops on how staff can learn and keep up to date with digital capabilities, both independently and through institutional support.

While we cannot promise that every single requested feature will be delivered, it is crucial for us to hear your ideas and be steered by our members. We would love to hear your views on what the Jisc digital capability service should provide for you.

You can fill in a short survey or get in touch directly with me (or via Twitter @alicja_shah). Follow us on Twitter #digitalcapabilties and join our community of practice.

We are also looking for volunteers for a focus group to provide feedback on the initial vision for the service in autumn 2017, and would be delighted if you would like to get involved!

New service agreement for discussion

Originally posted on Effective Learning Analytics.

Download – Learning Analytics Service Agreement_Consultation Version

The new service agreement for the Learning Analytics Service is available above, and we would like to invite you to review it and provide feedback. From August 2017 this agreement will replace the existing Data Processing Agreement currently in place with institutions implementing the learning analytics service.

The consultation period is outlined below. During it we will gather feedback, explain some of the trickier aspects of the agreement, and describe how we will be working with third-party vendors. We will be updating this page during the consultation period with further information to address any feedback.


  • 27 June: service agreement released for institutions to review; email invitation issued
  • 27 June to 11 July: consultation period, where institutions are invited to submit comments or feedback via comments on the blog post or via email
  • 11 July, 13:00–14:00: service agreement webinar to discuss comments and feedback
  • 11 July onwards: the service agreement will be available for institutions to sign

Overview of the new agreement

The service agreement:

  • Is simplified and easier to understand and issue
  • Has been written to meet the requirements of the GDPR
  • Addresses feedback from the pilot institutions
  • Includes an order form to specify which products you require when you sign up for the service

Notes and explanation

There are some points worth noting relating to changes and also work in progress:

  • Security requirements: clauses 8.2.2 and 9.1.3. We are also in the process of extending Jisc’s ISO 27001 certification to include the learning analytics service; this will be complete in early 2018.
  • Charges for the service: the order form and service agreement mention charges for the service being available on the Jisc website; however, these are still being finalised and will be accessible soon. Institutions that have been part of the initial pilot project (pathfinder sites) will not be charged until 1 August 2018. All new customers will get a six-month pilot implementation period free of charge, after which charges will apply.
  • Notification of a personal data breach: clause 9.1.5. Under Recital 85 of the General Data Protection Regulation (GDPR), a Controller has up to 72 hours to report a personal data breach to the regulator. If the Institution is not aware of the breach itself, then the 72-hour reporting deadline for the Institution as a Controller begins after it has received the breach notification from Jisc.
  • Liability: clause 15. Note that the maximum liability stated in 15.1, ‘as the greater of charges paid in a year and £10,000’, is also subject to clauses 15.2 and 15.3, which address larger liabilities under, for example, the GDPR.
  • Termination and deletion of data: clause 9.1.10 describes the process and timescales for the return or deletion of data and any back-ups upon termination of the agreement; the time period covers the deletion of data in backups.
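The 72-hour breach-notification window is easy to get wrong because the clock starts when the Institution receives Jisc’s notification, not when the breach itself occurred. A minimal illustration of the arithmetic (the dates below are invented for the example):

```python
from datetime import datetime, timedelta

# Under the GDPR, a Controller has up to 72 hours to report a personal
# data breach to the regulator once it becomes aware of it.
BREACH_REPORTING_WINDOW = timedelta(hours=72)

def reporting_deadline(notification_received: datetime) -> datetime:
    """Deadline for the Institution (as Controller) to report a breach,
    counted from when it received the breach notification from Jisc."""
    return notification_received + BREACH_REPORTING_WINDOW

# Example: Jisc's notification arrives at 09:00 on 11 July 2017.
received = datetime(2017, 7, 11, 9, 0)
print(reporting_deadline(received))  # 2017-07-14 09:00:00
```

Even if the breach occurred days earlier, the deadline in this reading is anchored to the moment the notification from the processor arrives.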

Please email any further questions or comments to

The Intelligent Library #CILIPConf17

Originally posted on e-Learning Stuff.

So what is the intelligent library? What is the future of the library?


At the CILIP Conference in Manchester this year, on Thursday 6th July, I am delivering a high-level briefing session on technology, specifically looking at the library within the intelligent campus space. The session will explore the potential technologies and the possibilities arising from developments in artificial intelligence and the internet of things.

There has been plenty of hype over artificial intelligence and the internet of things. Is it time to put aside the cynicism that this kind of hype generates and look seriously at how we can take advantage of these emerging technologies to improve the student experience and build an intelligent library?


The internet of things makes it possible for us to gather real-time data about the environment and usage of our library spaces. It is easy to imagine using this data to ensure the library is managed effectively, but could we go further and monitor environmental conditions in the library, or even, using facial recognition software, student reactions as they use the library so that we can continually refine the learning experience?

Most smartphones now make use of artificial intelligence to make contextual recommendations based on an individual’s location and interests. Could libraries take advantage of this technology to push information and learning resources to students? If so, it offers some interesting possibilities. On-campus notifications could nudge students to make the best use of available services such as the library. Off-campus notifications could encourage them to take advantage of the learning opportunities all around them. Could we use approaches like this to turn students’ smartphones into educational coaches, nudging students towards the choices that lead to higher grades and prompting them to expand their learning horizons?

As we start to use a range of tracking technologies (smart cards, beacons, sensors) we face a deluge of data about the use of buildings, spaces and equipment across a college or university campus. This breadth and depth of data can be challenging to use effectively and to greatest impact. Such tracking technologies are already widespread in environments such as airports and retail, often following users via their wifi-enabled devices and smartphones, while sensors track space utilisation and occupancy. Interpreting the data is fraught with challenges and difficulties, as well as potential ethical and legal issues. However, this wealth of data does offer the potential to deliver more satisfying experiences for students and staff, as well as ensuring the library is used as effectively as possible.


Looking in more detail, we can outline some potential use cases for the intelligent library, and we may want to consider which of these are desirable, but also which are possible with the current state of technology.

We can imagine an intelligent library which not only knows which seats and PCs are free, but can learn from history and predict when the library will be busy and when it will be quieter. The library could then provide this information to students via an app, promoting the library when more places and computers are available.
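As a toy sketch of the "learn from history" idea, assuming nothing more than past (weekday, hour, occupancy) observations, a first cut could simply average previous counts for the same time slot; a real system would of course use far richer models and data sources. The function and data here are illustrative only:

```python
def predict_occupancy(history, weekday, hour):
    """Predict library occupancy for a given weekday (0=Monday) and hour
    by averaging past observations for that same slot.

    history: list of (weekday, hour, occupancy) tuples.
    Returns the average past occupancy, or None if no data exists."""
    counts = [occ for wd, h, occ in history if wd == weekday and h == hour]
    if not counts:
        return None  # no history for this slot yet
    return sum(counts) / len(counts)

# Invented observations: two Monday mornings, one Monday afternoon,
# and one quieter Friday morning.
history = [
    (0, 9, 120), (0, 9, 140), (0, 14, 300),
    (4, 9, 60),
]

print(predict_occupancy(history, 0, 9))   # 130.0
print(predict_occupancy(history, 0, 14))  # 300.0
```

An app could compare this prediction against live sensor counts to decide when to nudge students towards the library.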

Having a deeper understanding of the utilisation of the library will allow for more effective and efficient use of space. Could this also mean a flexible library that expands and contracts as demand for space changes over the day or over the year?

Could we use wireless technologies, such as RFID, not just for issue and return, but also to track resources as they are used within the library itself? Could we use the same technologies to track resources across campus, identifying areas where they are being used or stored (or even lost)? Could we then enhance those spaces to improve learning?

Could we use facial recognition to monitor regular users of the library and provide insight and data into learning analytics? Could we go one step further and use facial recognition technology to discover when students are “troubled” or “in need of help” and then make appropriate interventions to support them in their studies?


If the library is getting full, could we identify those students who have been in there a long time, and push a notification, incentivising them to take a break with a free coffee from the library coffee shop? Could we go one step further, and promote wellbeing, by doing the same, but with a free coffee on the other side of campus, so they have to go outside and get some air and exercise?


Is there any benefit in providing a platform to help gather this data from a range of systems in a standard format that makes it easier to analyse and act upon? Would it be useful to have a national view over this data? Would that enable us to find new patterns that could help us discover the story behind the data, to make appropriate interventions and improve the use of our libraries? Could we build the tools and practices an institution would need to gather, organise and push this data to students’ smartphones, as well as exploring novel user interfaces such as chatbots?

Of course, all this tracking and data collection has huge ethical implications. We already gather large amounts of data in the library; sometimes this is anonymised, but sometimes it relates to individuals. At a basic level, we have seen physical counters used to determine the number of users in the library, as well as library management systems used to gather data about the usage of resources and the borrowing behaviour of library users. The intelligent library outlined above takes this initial tracking of users one step further.


As the technology in the intelligent library space grows, we need to consider why we want to use these technologies, how we use them and whether we should. We already use a range of systems to collect data; do we want to put in new systems to gather even more? Some data we need to collect regardless of our concerns: a library management system will, by definition, collect and store huge amounts of data about resources and users. What happens less often now, but may increase in the future, is the processing of that data: analysing it and displaying it in a way that tells a story. The next step is taking action on what that data shows. It could be an organisational action, but could equally be action relating to an individual user. How do we ensure that we have consent to collect data (sometimes this is implicit in using the library), how do we ensure we have consent to process that data, and finally do we have consent to take action on that data?

What is the future of the library? This session at the CILIP Conference will explore the potential technologies and the possibilities that can arise from the developments in artificial intelligence and the internet of things. Can we build an intelligent library? Do we want to?

Student digital experience tracker 2017: the voice of 22,000 UK learners

Originally posted on Jisc Digital Student.

Universities and colleges are investing large sums of money in their digital environment, in terms of infrastructure, learning materials and support for staff in developing their digital capabilities.
But how do we know if the investment being made in these areas is impacting on our students’ digital experience?
What do students expect in relation to technology? How are they using digital to support their learning? What are colleges and universities getting right, and where are the areas where further work is needed?

Let’s take a look at the key findings and their implications for institutions. You can access the full report here, and sign up for the 2018 Tracker here.

Using the student digital experience tracker:
Through extensive consultation with staff and students in further education and skills and higher education, our Digital student project developed the student digital experience tracker. The tracker allows universities, colleges and skills providers to:
• Gather evidence from learners about their digital experience, and track changes over time
• Make better informed decisions about the digital environment
• Target resources for improving digital provision
• Plan other research, data gathering and student engagement around digital issues
• Demonstrate quality enhancement and student engagement to external bodies and to students themselves.

The Tracker, delivered in BOS, an online survey service developed especially for the UK education sector, is based on a concise set of questions that have been intensively tested with learners for readability and ease of response. It builds on resources such as the Jisc/NUS Digital Student Experience benchmarking tool and the Jisc guide to Enhancing the Digital Student Experience: a strategic approach. The questions cover issues that are important to learners and/or to staff who have a focus on the digital learning experience.

This year, 74 UK institutions ran the tracker with their students, collecting 22,593 student responses. Institutions had the option of running surveys designed specifically for their learner types: HE, FE, adult and community learning, skills, and learners studying remotely online. The high-level findings from the 2017 report are outlined below, together with the implications for institutions in how we can better support our students’ digital experience.

Digital environment and services

The digital environment is key to a successful student experience. Colleges and universities should endeavour to make their systems easy to access and to provide a safe, secure and seamless online environment for learning. Many learners assume it’s a given that they will have access to free Wi-Fi whilst having the freedom to bring their own devices on campus.

What are our students saying in relation to the digital environment?

• Only 69% of FE and 80% of HE learners say they have reliable access to Wi-Fi in comparison with 90% of ACL and Skills and 96% of Online learners
• FE students use an average of 2 institutional devices (most commonly desktop and printer) and an average of 2 personal devices (most commonly smartphones and laptops)
• HE students use an average of 1.4 institutional devices (most commonly desktops and printers) and 2.7 personal devices (most commonly laptops, smartphones, tablets and printers)
• 83% FE students and 66% HE students use institutional desktops

What does this mean for institutions?

There is further work to be done in ensuring FE learners have seamless access to Wi-Fi and a parity of experience across all sites and areas of study. Despite the increasing number of learners bringing their own devices, they still like to feel that everything they need to succeed will be made available by their institution. This includes infrastructure (fixed computers and printers), access to online and offline learning resources, and a physical learning environment where staff and students can work both individually and collaboratively.
Jisc offers guidance on providing a robust, flexible, digital environment and how to develop policies for supporting students’ use of their own devices.

Using digital systems for learning

Our research on student experiences and expectations of technology (Beetham, H and White, D. 2013) has shown that students respond favourably to authentic, meaningful digital activities that are linked to or directly embedded in their learning and assessment, especially if those activities are relevant to their future employment ambitions. The integration of technology into so many aspects of our daily lives means that learners now enter university or college with increased experience of technology, and expect that technology will feature in their learning journey in some way.

What are our students saying in relation to how technology is supporting their learning?

• Over 90% of learners in all sectors have access to online course materials
• 70% HE and FE learners agree that when digital technology is used in their course they are more independent in their learning, and can fit learning into their lives more easily
• 80% HE and 62% FE learners agreed that submitting assignments electronically is more convenient

This suggests that learners value the convenience and flexibility that the use of digital technologies provides.

• 80% HE and 61% FE rely on the VLE for coursework, but only 40% agree they enjoy using the collaborative features or want their tutors to use it more
• Fewer than 50% of learners agreed that they feel connected to others when they use digital technology on their course
• A minority of learners (ca 10-15%) find digital systems increase their sense of isolation: they may have difficulty with distraction and information overload

What does this mean for institutions?

This suggests that learners tend to experience digital technologies that aim to solve practical issues rather than support alternative pedagogic practice. Technology is still being used to support the ‘transactional’ learning elements rather than ‘transformational’ learning opportunities.

Make it clear to students how and why technology is being used to support learning from induction and at the start of new modules to establish an institutional digital entitlement. Reinforce this by embedding digital activities and assessment opportunities as part of the curriculum design to set the expectation that students will use technology throughout their study. Accompany this with responsive support to establish a base level of digital capability and confidence and a platform to explore and develop subject and discipline specific uses.

Technology can be particularly useful in bridging the gap between study and work – apprentices and students on work placement can use technology to access resources, monitor their own progress and keep in touch with employers, tutors and assessors. Jisc offers guidance on how to develop a digital curriculum which offers students opportunities to develop digital skills that will also prepare them for a digital workplace.

Support for digital skills

Not all students have clear ideas on how digital technologies can support their studies or how they may be important in their lives beyond education. Technology is so pervasive in everyday life that ensuring students are digitally capable by the end of a programme of study has to be considered as one of the key employability skills that institutions need to help students develop. Where do students go to access support with their digital skills? Can we assume that learners are confident in using technology for their learning?

What are our students saying in relation to support for their digital skills?

• 46% of learners in FE/ACL and skills look to their tutors first for support with digital skills in comparison with only 16% HE learners
• HE learners most commonly look to online resources first (37%)
• Collectively, informal support (totalling across friends, family, other students and online resources) is more common than formal support (tutors and other organisational support options) for all learners in all sectors, but especially for HE and Online learners (76% and 80%)
• Few learners in any sector look to specialist support staff first, but 65% do know where to get help with digital skills

What does this mean for institutions?

The 2017 UCISA Digital Capabilities Survey identifies the importance of staff digital capabilities as a positive influence on students highlighting the need for staff who are confident and proficient in using technology and designing appropriate digital activities.

Threading the use of digital technologies throughout the whole learning experience from pre-entry to induction, to specialised and contextualised use and emerging professional practice will help students become familiar with common workplace practices and embed technology more naturally within personal practice.

From our digital learner stories with in-depth interviews with students, Helen Beetham reports:

‘As in the pre-digital age, learners still need access to rich resources, opportunities to practice, and supportive interactions with their tutors and peers. They make notes, organise ideas, prepare assignments, collaborate, express themselves, manage their time and motivation, revise and review, listen to feedback and showcase what they can do. But there are also some striking discontinuities. Learners are making more use of graphical, video and audio resources, both to learn and to express what they can do. They curate their personal learning resources in ways that were unimaginable in the days of paper. They share, comment, mix, remix and repurpose freely. They use digital networks to connect across boundaries, whether the barriers between learning and work, or between learners in different countries, or between formal learning and all the other opportunities and interests they have.’

The traditional approach to skills development by training staff and students separately is a model that is at odds with the fast pace of change and can result in delays in implementing new technologies and new approaches. A more agile approach where staff and students are supported to work in partnership is proving to be more effective. This helps to overcome difficulties of identifying separate time, resources and offer a more responsive approach. See our guidance on supporting organisational approaches to developing digital capability.

Preparing students for a digital workplace

‘The UK will need 745,000 additional workers with digital skills to meet rising demand from employers between 2013 and 2017, and almost 90% of new jobs require digital skills to some degree, with 72% of employers stating that they are unwilling to interview candidates who do not have basic IT skills.’
Digital Skills Crisis, House of Commons Science and Technology Committee, Second Report of Session 2016–17

Our research into developing student employability found that expectations from employers and education providers in relation to digital entrepreneurialism are low. To ensure students develop these skills, their learning experiences need to embrace these practices.

What are our students saying about being prepared for a digital workplace?

• While over 80% of HE learners and 63% of FE learners feel that digital skills will be important in their chosen career, only 50% agree that their course prepares them well for the digital workplace
• 40-50% of learners didn’t know or weren’t sure what digital skills their course required before they started it
• Fewer than 50% agreed that they have been told what digital skills they need to improve

What does this mean for institutions?

Learners need to understand the digital environment they are entering and the kinds of learning practices expected of them as they prepare for employment. These expectations and requirements should be embedded into induction processes as well as the curriculum and the wider learning experience. Our technology for employability toolkit (pdf) provides effective practice tips on incorporating technology-for-employability. Several universities have adopted digital capability, digital citizenship, or similar as a graduate outcome. Others have required digital activities and outcomes to be discussed during course design and review. Further work is required to ensure students are effectively supported with the development of their digital skills.

There is a range of approaches to using technologies in the development of students’ digital skills for the workplace. For example, some institutions are helping learners to enter into partnerships with employers around the world to identify and solve real world problems. This approach can prove highly motivating for learners, while also enabling providers to develop efficient and cost effective authentic learning experiences for learners, an approach that can bring benefits for employers too.

Engaging learners in planning for digital

Learners who feel a sense of belonging with their college or university and who feel the institution cares about their learning experience are more likely to succeed, to maintain good relationships beyond their initial course of study and contribute more through alumni activities. Engaging with learners in a genuine and meaningful way regarding their digital learner journey is one such way to build loyalty and enhance their learning experience.
What are our students saying on being involved in developing their digital environment?

• 35% HE students and 44% FE learners agree that they are involved in decisions about the digital environment

What does this mean for institutions?

This mirrors the 2017 UCISA Digital capabilities survey report, which stated that 43% of the universities that responded to the survey are working with students as change agents (another 38% said they were working towards this).
Through engaging in meaningful and collaborative dialogue and partnership whilst working with students as “change agents”, colleges and universities can encourage a deeper understanding of how digital technology can support learners’ needs. Effective use of technology can enhance the learning experience, for example, by providing additional channels of support or opening up enriched opportunities for learning and communicating to those who may otherwise find it difficult to participate.

When students and staff work together to combine their skills and expertise, results can exceed expectations. Here are some examples of how colleges and universities are engaging learners in the development of their own digital environment.

Further information:

You can read our report, Jisc Digital Experience Tracker 2017: the voice of 22,000 UK learners to gather further insights into what students are saying in relation to their digital student experience and join us for the Connect More events where further discussion around these findings will take place.

Sign up for 2018 Tracker here:

Join the tracker mailing list

Follow #digitalstudent and @jisc on Twitter

Using learning analytics to enhance the curriculum

Originally posted on Effective Learning Analytics.

At our Planning Interventions workshop at Aston University last week we spent the morning looking at interventions with individual at-risk students. This post reports on the afternoon session, where we examined the use of learning analytics to enhance the curriculum.

Workshop participants

Participants were given several simulated analytics outputs relating to the first year of a degree programme. Data concerned both the programme and individual modules, and included:

  • VLE activity for one module showing access of pages, quizzes and discussion forums
  • A graph of VLE activity correlating with grades for the programme of study (included below)
  • Attendance against achievement
  • Assessment dates and activity (included below)
  • The prediction of students at risk of failure over the year

This graph shows a correlation between VLE use and performance in assessments.

This graph shows online activity and attendance at classes for the group during the first year of the programme, with the number of assessments required to be submitted per month.

Groups were asked to review the various graphs and tables and then develop an intervention plan addressing:

  1. Questions – what should we be asking about the effectiveness of our curricula?
  2. Data – what data should we collect to help answer these questions? Is the data provided useful?
  3. Stakeholders – who are the main actors who can adapt the curriculum?
  4. Interventions – what changes might you make to enhance the curriculum and what would you expect to happen?
  5. Adoption – what will staff do differently? How will they be encouraged to change their working practices?
  6. Evaluation – how will the success of the enhancement be evaluated?
  7. Sustainability – how do we make it effective and on-going?


Questions

It’s a realistic scenario that staff are increasingly going to be presented with a selection of different visualisations of data about engagement with learning activities, grades etc. – and then have to see what questions these raise about the effectiveness of the curricula they’ve put together.

One of the most glaring issues with the simulated programme we presented was the bunching of assessments. “Which clown designed the assessment schedule?” asked one group. They felt that this was quite clearly driving attendance and online activities (or the lack of them). Another group wanted to know more detail about the assessments – were they formative or summative and was the feedback timely? It was also noted that a third of students hadn’t engaged in the quizzes at all.

There was a desire for greater detail about VLE use in general. Which tools are being used and which appear to be effective?

One group asked: “We observe that average grade correlates highly positively with attendance so how do we go about optimising consistent attendance patterns?” They also wondered how higher participation could be achieved in forums and quizzes as VLE engagement is correlated too with better grades.

We do of course need to keep reminding ourselves that correlation is not the same as causation. However if the pattern of a successful learner includes high engagement with the VLE and good face to face attendance, then we can at least attempt to design curricula which encourage this and advise students of the importance of engagement. We can probably assume that overall performance of cohorts will then improve if participation in the learning activities we’ve designed improves – though there are of course lots of factors to consider such as other aspects of the curriculum not standing still, different staff involved and new cohorts which vary in multiple aspects.

Another problem identified was low attendance in a module requiring studio attendance (it was a ceramics degree). The group I was in speculated as to whether attendance was being recorded accurately, or perhaps that students would do their studio work in a few long sessions rather than lots of short ones, the data thus suggesting lower participation than the reality.

One group noted that online engagement was low and asked “Why are we struggling to engage students with online activities? Should the course even be online? Is the VLE the best forum to share practical application? [it being a ceramics degree]”. Other fundamental questions people thought should be asked included “Does the teaching matter?” and “Does the VLE make a difference?”


Data

We asked the groups whether the data given to them was of use and what else should be collected to help them answer their questions about the curriculum. There was a wide variety of suggestions here.

Several mentioned the need for a greater breakdown of data on VLE activity, particularly the timing of access to content, quizzes and forums, and the duration of visits. Seeing patterns of use of quizzes, including numbers of students who’d accessed them multiple times, was also thought to be potentially useful – as well as further data on the scores achieved. Also, further detail on the nature of the assessments e.g. whether they involve group work was considered important.

Another group wanted to be able to drill down to see individual student data when required, as well as a cohort view.

Final module marks for the cohort along with retention and progression data were thought to be necessary pieces of information as well. Student satisfaction data would help to discover why some modules have much better engagement than others. Further insight could be obtained from focus groups asking questions such as “Why are we struggling to engage you with online activities?” or attempting to discover whether IT literacy is an issue.

Some groups suggested additional visualisations e.g. showing the relationship between quiz engagement and course grade, and between VLE engagement and attendance. Do those who attend less catch up at home by watching lecture videos, for example? Does that compensate for missing out on face to face contact?


Stakeholders

Students, academic staff, associate deans, programme directors, module leads, tutors, learning technologists, external examiners, alumni, employers, professional bodies and other institutions sharing best practice were all suggested. One group noted the need for students to be active here rather than passive – another proposed that the staff-student consultative committee could help provide ideas. Students could be asked, for example, “What would the VLE look like if you had a hand in it?”


Interventions

Several groups mentioned addressing the issue of assessments being clustered too closely. Assessments could be introduced earlier, to help students get used to them, and to provide more timely feedback to staff on student engagement. The assessment load, in this example, should clearly be spread more evenly across the year.

The learning design should ensure that online activities are more clearly linked to assessment, which should encourage better take-up of forums and quizzes. An early process to optimise student engagement with the VLE may be necessary. Study skills training at the start of the programme which shows students the most effective ways to use online learning might also help.

The data might even raise questions as to the effectiveness of the content on the VLE: is it “rich, dynamic, engaging and accessible?” The quality of the VLE platform itself might even be questioned. It may also be that there are too many quizzes and forums, and that they could be consolidated somehow – or non-online alternatives could be considered.

If use of the VLE is considered important by the institution then showing staff the correlations between its usage and assessment performance may be helpful in making the case that they should be using it more.


Adoption

In this section we were hoping that the groups would think about what staff would need to do differently to implement the use of analytics to enhance the curriculum. How would they be encouraged to change their working practices and what roles would be involved?

However the two groups which got to this point seemed to be thinking more about how to improve use of the VLE in particular.

One group suggested working with students as a partnership to redesign the VLE, and training staff in instructional design. The other suggested flipping the classroom, and moving from tutorials and seminars to a more personalised form of learning.


Evaluation

Some participants wondered what we should be evaluating here: the success of the programme or uptake of the VLE? If our intervention aims to increase VLE uptake because we see a correlation between that and performance, then measuring VLE use before and after the intervention would be important. Likewise, reviewing data on attendance may help to evaluate the success of the intervention. Comparisons could be made with historical data on attendance, engagement and satisfaction to see if these have improved.

Ultimately, reviewing the grades of future cohorts or the marks in individual assessments may give an indication of whether the intervention has been successful.


Sustainability

Pockets of good practice can disappear with individuals: one group suggested attempting to share ownership across the institution, and to capture any lessons learnt. Ensuring that the student voice is heard, and that there’s a process of feedback and iterative review, were also thought to be vital.


Conclusion

Participants worked hard and seemed highly engaged with the activities all afternoon. There is a strong desire to see how we can use the rapidly increasing sources of data on student activity to develop curricula based on evidence of what works. Most universities and colleges are only beginning to do this, but staff skills in interpreting such data and updating the curriculum as a result are, I believe, soon going to be essential. We now have the blueprint for a workshop which we can build on for institutions interested in using learning analytics for more than just addressing student retention.

Planning interventions with at-risk students

Originally posted on Effective Learning Analytics.

Last week Paul Bailey and I braved the sweltering heat which had briefly engulfed Birmingham and most of the rest of England to meet with staff and students at Aston University. 30 of us, including representatives from three other universities, spent the day thinking about how institutions can best make use of the intelligence gained from data collected about students and their activities.

We concentrated on two of the main applications of learning analytics:

  1. Intervening with individual students who are flagged as being at risk of withdrawal or poor academic performance
  2. Enhancing the curriculum on the basis of data in areas such as the uptake of learning content or the effectiveness of learning activities (which I’ll cover in a later blog post)

In the morning, each of the five groups which were formed chose a scenario and thought about the most appropriate interventions. Here is one scenario which was selected by two of the groups:

Student Data Example 4: Business student with decreasing attendance and online activity

Student: Fredrik Bell

Programme of Study/Module: Year 1 Business Computing and IT

Background: The data provides a view of attendance vs online activity from the start of the course for a BUS101 module. Fredrik has a low average mark and has made one request for a deadline extension. Attendance has been dropping and, following a peak in October, VLE use has dropped off. The IT modules show that his attendance is low but he had good marks in assignments up to January. [Note that the vertical scale represents this student’s engagement as a percentage of the average for the cohort – thus 150% online activity in October is well above average.]

Graph showing decreasing attendance and VLE activity by student against average


The groups were then asked to develop an intervention plan consisting of:

  1. Triggers – what precipitates an alert/intervention
  2. Assumptions – what other information might you consider and assumptions made before you make an intervention
  3. Types e.g. reminders, questions, prompts, invitations to meet with tutor, supportive messages
  4. Message and behaviour change or expectation on the student – suggest possible wording for a message and anticipated effect on student behaviour
  5. Timing and frequency – when might this type of intervention be made, how soon to be effective
  6. Adoption – what will staff do differently? How will they be encouraged to change their working practices? Which roles will be involved?
  7. Evaluation – how will the success of the interventions be evaluated


Triggers

The groups specified a variety of triggers for interventions. Attendance at lectures and tutorials was a key one. One group considered 55% attendance to be the point at which an intervention should be triggered.

A low rating in comparison to average is likely to be a better trigger. This is the case for accessing the VLE too – overall online activity well below average for the last 3 weeks was one suggestion. A danger here is that cohorts vary – and a whole class may be unusually engaged or disengaged with their module.
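That cohort-relative suggestion could be sketched as follows. This is illustrative only: the 50% threshold, the weekly-logins metric and the function name are assumptions for the sketch, not recommended values:

```python
# Minimal sketch of a cohort-relative trigger: flag a student whose online
# activity over the last three weeks falls well below the cohort average.
from statistics import mean

def at_risk(student_weeks: list, cohort_weeks: list, threshold: float = 0.5) -> bool:
    """True if the student's last-3-week activity is under `threshold`
    times the cohort average for the same weeks."""
    recent = mean(student_weeks[-3:])
    cohort = mean(cohort_weeks[-3:])
    if cohort == 0:
        # A whole cohort may be disengaged with a module - don't flag anyone
        return False
    return recent < threshold * cohort

cohort = [40.0, 42.0, 38.0, 41.0]        # average VLE logins per week (hypothetical)
flag = at_risk([35, 30, 12, 8], cohort)  # activity dropping off -> flagged
ok = at_risk([39, 41, 40, 38], cohort)   # tracking the average -> not flagged
```

The guard on a zero cohort average reflects the danger noted above: where a whole class is disengaged, an absolute threshold would flag everyone while telling us nothing about individual risk.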

Requesting an extension to an assignment submission was thought to be a risk factor that could trigger an intervention. Low marks (40%) in an assignment could also be used.

An issue which I was acutely aware of was that ideally we should have the input of a data scientist in helping to decide what the triggers should be. An attendance rate which indicates whether a student is at risk, for example, is likely to be more accurate if it’s based on the analysis of historic data. Combining statistical input with the judgement of experienced academics and other staff may be the best way to define the metrics.


Assumptions

It’s likely of course that engagement will vary across modules and programmes within an institution, where different schools or faculties will have different approaches to the importance of attendance – or even whether or not it’s compulsory. One student mentioned that some lecturers have fantastic attendance rates, while students quickly stop going to boring lectures. The fact that they’re not attending the latter doesn’t necessarily imply that they are at risk – though it might suggest that the lecturer needs to make their lectures more engaging!

Events such as half-term holidays, reading weeks etc would also need to be taken into account. Meanwhile, if Panopto recordings of lectures are being made, then student views of these might need to be considered as a substitute for physical attendance.

The use of the VLE varies dramatically too, of course, between modules. Is regular access necessary or has the student accessed the module website and printed out all the materials on day 1?

One group suggested that intervention in a particular module should consider engagement in other modules too. A common situation is that a student has high engagement with one module which has particularly demanding requirements to the detriment of their engagement in less demanding modules.

If low marks are being used as a trigger, it may be too late to intervene by the time the assessment has been marked. More frequent use of formative assessments was suggested by one group.

Another assumption mentioned was that low VLE usage and physical attendance do actually correlate with low attainment. A drop in attendance or in VLE use may not necessarily be a cause for concern. Again, analysis of historic data may be the best way to decide this.

Types of intervention

Push notifications (to a student app or the VLE), emails and/or text messages sent to the student by the relevant staff member were suggested. These might consist of an invitation to meet with a personal tutor. Some messages might best go out from the programme director (sent automatically with replies directed to a support team).

Other options were to send a written communication or to approach a student at a lecture to attempt to ensure they attend a follow-up meeting (though if they’re disengaged, you probably won’t find them in the lecture, and of course in large cohorts, lecturers may not know individual students…)

One risk that was noted was that multiple alerts could go out from different modules around the same time, swamping the student.

Ideally, one group suggested, the messages would be recorded in a case management system so that allied services would be informed and could assist where appropriate. Interventions would be viewable on a dashboard which could be accessed by careers, student services etc (with confidentiality maintained where appropriate).

Timing and frequency

A sequence of messages was suggested by one group:

  1. A message triggered by extension request or late submission
  2. If no response received from the student after a week, a follow up message expressing concern is sent
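That two-step sequence could be sketched as a simple rule (the function, its arguments and the seven-day window as a hard-coded parameter are my own illustrative framing):

```python
# Illustrative sketch of the suggested message sequence: an initial message
# on the trigger date, then a follow-up expressing concern if the student
# has not responded within a week.
from datetime import date, timedelta

def next_action(trigger_date, student_replied, today):
    """Decide what, if anything, to send today for one triggered student."""
    if today == trigger_date:
        return "send initial message"
    if not student_replied and today >= trigger_date + timedelta(days=7):
        return "send follow-up expressing concern"
    return "wait"

print(next_action(date(2025, 1, 1), False, date(2025, 1, 8)))
```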

Another group suggested that an intervention would need to be made in the first four weeks or it might be too late. If low marks are being used as a trigger, this would have to happen immediately after marking – waiting for subject boards etc to validate the marks would miss the opportunity for intervention.

Points during the module e.g. 25%, 50% and 75% of the way through could be used as intervention points, as happens at Purdue University.
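Scheduling checkpoints at fixed fractions of the module is easy to compute; a minimal sketch (the dates and fractions below are examples, and this is not a description of Purdue's actual system):

```python
# Illustrative sketch: calendar dates falling 25%, 50% and 75% of the way
# through a module's teaching period, usable as intervention checkpoints.
from datetime import date, timedelta

def intervention_dates(start, end, fractions=(0.25, 0.5, 0.75)):
    """Return the dates at the given fractions of the module's duration."""
    length = (end - start).days
    return [start + timedelta(days=round(length * f)) for f in fractions]

print(intervention_dates(date(2025, 1, 6), date(2025, 3, 31)))
```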


Some institutions put a lot of thought into the best wording for messages. One suggestion at our workshop, with the aim of opening a dialogue with the student to gauge whether further support is necessary, was:

I noticed that your coursework was submitted late. I hope everything is OK. Let me know how you are doing.

Another group thought that a text message could be sent to the student, which would maximise the chance of the message getting through. It would ask them to check an important email, which would say:

  1. We are contacting you personally because we care about your learning
  2. We would like to have a chat
  3. Why we would like to have a chat – concerns about attendance and online engagement, and risk that this might translate to poor performance in assessments coming up
  4. Make student aware of support networks in place

The use of "soft" or "human" language, which relates to the student, was thought to be important. Short, sharp, concise and supportive wording was recommended. A subject line might be: "How to pass your module" or "John, let's meet to discuss your attendance".

I did have a conversation with one participant at lunchtime who thought that we need to be clear and firm, as well as supportive, with students about their prospects of success. If we beat about the bush or mollycoddle them too much, they’re going to get a nasty shock when they encounter firmer performance management in the workplace.


A couple of groups managed to get onto this part of the exercise. Numerous projects involving learning technologies have failed at this hurdle – it doesn't matter how useful and innovative the project is if it's not adopted effectively into the working and study practices of staff and students. What I was hoping to see in this part was more mention of staff development activities, staff incentives, communication activities etc., but most people had run out of time.

One group did however suggest defining a process to specify which staff are responsible for which interventions when certain criteria are met. A centralised approach, they thought, could be less effective than a personal one – so ideally a personalised message derived from a template would be sent by a personal tutor. Retention officers would be key players in the planning and operationalisation of intervention processes.

It was also thought that the analytics should be accessible by more than one member of staff e.g. the module tutor, programme director, personal tutor and retention officer in order to determine if the cause for concern is a one-off or whether something more widespread is happening across the cohort or module.


One of my personal mantras is that there’s no point in carrying out learning analytics if you’re not going to carry out interventions – and that there’s no point in carrying out interventions unless you’re going to evaluate their effectiveness. To this end, we asked the groups to think about how they would go about assessing whether the interventions were having an impact. Suggestions from one group included monitoring whether:

  • Students are responding to interventions by meeting with their personal tutor etc as requested
  • Students submit future assessments on time
  • Attendance and/or online engagement improve to expected levels

Another group proposed:

  • Asking students whether the intervention has made a difference to them
  • Analysing the results of those who have undergone an intervention
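Checks of this kind could be recorded against each intervention; a minimal sketch, where the field names and the notion of "recovered" engagement are illustrative assumptions rather than a real schema:

```python
# Illustrative sketch: summarise the follow-up observations for one student
# some weeks after an intervention, to help judge whether it had an impact.
def evaluate_intervention(followup):
    """followup: dict of observations recorded after the intervention."""
    return {
        "responded_to_intervention": followup.get("met_personal_tutor", False),
        "submitted_on_time": followup.get("next_submission_on_time", False),
        "engagement_recovered": (followup.get("weekly_vle_events", 0)
                                 >= followup.get("expected_weekly_vle_events", 0)),
    }

print(evaluate_intervention({"met_personal_tutor": True,
                             "next_submission_on_time": True,
                             "weekly_vle_events": 12,
                             "expected_weekly_vle_events": 10}))
```

Aggregating these records across a cohort, and comparing results for students who did and did not receive an intervention, would be one way of approaching the second group's suggestion.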

Engagement by the participants during the morning session would no doubt be rated as “high”, with some great ideas emerging, and some valuable progress made in developing the thinking of each institution in how to plan interventions with individual students. In my next post, I’ll discuss how the afternoon session went, where we examined using learning analytics to enhance the curriculum.