Designing learning analytics dashboards

Originally posted on Effective Learning Analytics.

Universities and colleges are accumulating significant amounts of data about student engagement, potentially enabling early warning about students at risk of failure – and the enhancement of many aspects of course provision. But how should that data be presented to the various stakeholders in institutions? Who are these stakeholders and what would be the most useful visualisations for each of them?

Torchbox workshop meeting

These are some of the questions being addressed by Jisc as we roll out our learning analytics architecture for educational institutions in the UK. Yesterday we met with Torchbox at their Bristol offices to work on user experience design for Jisc’s Data Explorer. This tool is not aimed at doing everything or competing with the more sophisticated data crunching tools emerging for learning analytics. It’s being designed to enable institutions to explore the data accumulating in their learning record stores, to perform simple correlations, and to gain experience before investing in a fully-fledged system.
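As a purely illustrative sketch of the kind of “simple correlation” an institution might run against an export from its learning records store – this is not the Data Explorer itself, and the file and column names (lrs_export.csv, vle_logins, module_mark) are assumptions – something along these lines would do the job:

```python
import pandas as pd

# Hypothetical export from a learning records store: one row per student,
# with a count of VLE logins and the student's module mark.
df = pd.read_csv("lrs_export.csv")  # assumed columns: student_id, vle_logins, module_mark

# Pearson correlation between engagement (VLE logins) and attainment (module mark).
correlation = df["vle_logins"].corr(df["module_mark"])
print(f"Correlation between VLE logins and module mark: {correlation:.2f}")
```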

With the help of Torchbox and colleagues from two universities (Gloucestershire and Newman) we looked at user goals, behaviours and experiences for a range of user types, as well as some of the associated organisational goals and barriers.

Out of this emerged two distinct hierarchies of roles:

  1. Modules: module leader / tutor -> subject / course / programme leader -> management
  2. Pastoral: personal tutor -> senior tutor -> head of tutors

A third grouping contains a number of roles, not necessarily arranged in any hierarchy:

Central support: e.g. librarians, IT, learning technologists

Students have their own visualisations through a separate app, Study Goal, which itself went through an extensive consultation and design process. However, one use case is for a personal tutor to discuss the dashboard for an individual student with that student during a meeting, as a way of framing the discussion.

Modules hierarchy
For each module there will be a leader who needs data on the students taking that module, resource utilisation and so on. Modules are grouped into subjects, courses or programmes, led by someone who is likely to require aggregated data from the component modules but may also want to drill down to module-level data. The management layer above that will be interested in further, higher-level aggregations, again perhaps needing to drill down to lower levels.

Pastoral hierarchy
Personal tutors will require information on the individual students they’re responsible for and may be interested in aggregated data on all of them. A senior tutor role exists in many institutions, someone who is responsible for multiple tutors and will want to see data at a higher level. There may also be a head of tutors, responsible for the whole process, and therefore interested in further aggregations of data. Again there may be requirements for each of these to be able to drill down to lower levels of data.

Central support
What we termed “central support” is less a hierarchy than a collection of different roles for staff in different parts of the institution, generally providing support across the board to students and/or staff. These may include librarians, disability support staff, counselling staff, IT support, learning technologists and academic designers.

One challenging design issue is that these roles will be constituted differently at each institution. They’ll also have a variety of names. Can we come up with generic enough roles that they can be mapped easily onto existing institutional structures and renamed as required in each institution? Or do we work on collections of lower level roles and allow institutions to form their own higher level groupings, perhaps building a series of default roles for them to select from?
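To make the second option more concrete, here is a minimal sketch of what a configurable role mapping could look like – the role names, scopes and local job titles below are hypothetical illustrations, not part of the actual Data Explorer design:

```python
# Generic roles shipped as defaults, each with the level of aggregation it sees
# and the level (if any) it can drill down to. All names are illustrative.
GENERIC_ROLES = {
    "module_leader":    {"scope": "module",      "drill_down_to": None},
    "programme_leader": {"scope": "programme",   "drill_down_to": "module"},
    "management":       {"scope": "institution", "drill_down_to": "programme"},
    "personal_tutor":   {"scope": "tutees",      "drill_down_to": None},
    "senior_tutor":     {"scope": "tutor_group", "drill_down_to": "tutees"},
    "central_support":  {"scope": "service",     "drill_down_to": None},
}

# Each institution maps (and renames) its own job titles onto the generic roles.
INSTITUTION_ROLE_MAPPING = {
    "Course Director":         "programme_leader",
    "Head of School":          "management",
    "Academic Personal Tutor": "personal_tutor",
    "Learning Technologist":   "central_support",
}

def dashboard_role(local_title: str) -> dict:
    """Look up the generic dashboard configuration for a local job title."""
    return GENERIC_ROLES[INSTITUTION_ROLE_MAPPING[local_title]]
```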

I suspect this will become clearer quite rapidly as we work on it. Torchbox is now setting up interviews with stakeholders at institutions to test out and refine the ideas we had yesterday before we build them into the Data Explorer tool in order to provide the most useful visualisations to staff. Stay tuned for further updates on this blog.

Early bird registration for CAN 2017: Extended until 5 March

Originally posted on Change Agents' Network.

Registration for CAN 2017 is open and the early bird rates are available until 5 March. The 5th annual CAN conference on 20 & 21 April is a fantastic opportunity for students and staff from across the UK to share good practice and honest reflections on the importance of working in partnership to improve the education experience. This year’s conference is hosted by the University of Exeter and supported by Jisc.

There are 50 confirmed sessions and a plenary panel session. The conference will open with a keynote address presented jointly by staff and students at the University of Exeter, charting the role that Exeter students have played as “Change Agents” over the last 10 years, and what developments and challenges lie ahead for Exeter.  The Conference’s second keynote address, “Authentic Partnership – what, how and why?” will be provided by Colin Bryson (Newcastle University, Chair of the RAISE Network) in partnership with Fanni Albert, a combined honours student from Newcastle University.

The final programme will be posted shortly but if you want to know what this CAN conference is building on, have a look at Birmingham 2015 and Lincoln 2016.  One participant reflected “This event has been a great opportunity to share ideas and discuss common goals, challenges and ideas.”  We hope you’ll join us and discover that the ‘buzz’ happens throughout the 2 days and isn’t limited to the coffee breaks!


Development not Training: an approach to social media for leaders

Originally posted on lawrie : converged.

The Jisc Digital Leaders course is running again in May. Whenever we have run the course we have always had lots of questions about social media, especially Facebook, Twitter and LinkedIn.

These three particular platforms perhaps embody one of the key issues – and causes of angst – for many people both working in education and beyond: where do you draw the line between personal and professional? For some people the following is a rule of thumb:

  • Facebook: Personal
  • LinkedIn: Professional
  • Twitter: complicated – with some people saying “I only use it for work”, and others saying “I only use it for personal stuff!”

At first glance it makes some sense, and I have heard variations on these distinctions in many workshops. There are two problems with it. Firstly, the affordances of the platforms are very different. Secondly, it creates an artificial divide in your life where there is a hard line between work (professional) and personal activities. Most people have friends of one degree or another in the workplace; it is after all where we spend a large part of our waking life.

One of the projects I was managing under the 2006 Users and Innovation Programme – aimed primarily at looking at how education was influenced by Web 2.0 – came up with the idea of social media behaviours being a continuum.

The Open Habitat team posited that at one end of the continuum, individuals treated the web as a tool box, going online to do something, completing it, and going offline, leaving almost no social media trace. At the other end, some individuals had their identities more embedded in their social media use, actively engaging, having conversations, leaving digital footprints. Framing these differing practices on a continuum allowed the team to think about how students engaged in online learning environments, especially – at that time – Second Life. The team, led by Dave White, coined the term Visitor – Resident for the continuum.

Canal Boat

At the time I was working with Dave Cormier around the idea of scenario planning for education. We recognized that whilst the continuum was useful for identifying how some students behaved in social media spaces for their learning, more nuance was needed. Whilst navigating a canal in the North of England, and joined at various times by Mark Childs, Richard Hall and Dave White, I reflected on my own practice and on discussions I had had with Craig Wentworth (formerly at Jisc) around attitudes to work-life balance. Craig described his own practice as a work-life blend: he and I, and others in our team and beyond (such as Dave Cormier), kept open social media and communication channels and were by nature flexible in our working hours. With this in mind, we reflected on the digital spaces we occupied – and how and why – and decided to map our various practices along the visitor – resident continuum.

We also realized that more granular value and understanding could be achieved by adding an axis of personal / institutional use, turning the continuum into a tension pair. Later this was to become personal and professional (where professional could also be applied to a student’s formal learning activities).

This grid – and the leaders’ mapping of their own and organisational practices on it – is now established as part of the VandR toolkit. It is also a key part of the approach we take on the Jisc Leaders Course to delegates’ use of social media (and other online tools). We are not training leaders to use social media; the tools we use are open ended, adaptable and focused on personal and professional development in the context of the individual. There are no template maps that can be elicited, no best practice that can be duplicated. The process is a journey, and the maps that leaders create are merely a snapshot of where their current practice stands. The Leaders Course uses the maps created as a development tool, allowing delegates to understand and reflect, with their peers, on what they do and, importantly, on what their aspirations are.

At the start of this post I talked about the angst over whether particular social media and online tools are for work or for personal use. The process of creating a map of practice allows you to reflect on this, not dictating right or wrong, but allowing choices to be better understood. Look at 10 social media accounts of senior leaders in education and you will probably see 10 different approaches to how they project themselves, communicate and engage online. The Jisc Leaders Course approach to advice around social media, as well as providing help with getting started with the tools for those that need it, looks at the behaviours that leaders exhibit, providing advice and guidance to enable delegates to develop a more nuanced use of social media and online tools that meets their needs and communicates effectively.

Encouraging informal learning

Originally posted on e-Learning Stuff.

So how do we encourage students to learn outside the formal structures and processes we put in place across our institutions?

Informal learning, in my opinion, is learning that happens outside the “control” of the institution, but is part of the learning towards a qualification that a learner will undertake. This learning may happen within the institution, but will also happen outside: at home, at work or in a coffee shop. This definition of informal learning differs from non-formal learning in that the activity of learning is still tied to the institution and the qualification, but is not a prescribed activity set down by a practitioner or an academic.

So can you design informal learning?

No!

There we go, that was easy, wasn’t it?

You see, when you design informal learning, you formalise it, and as a result it becomes formal learning.

So if you can’t design informal learning, then how do you design informal learning?

It’s not about designing informal learning, it’s about institutions facilitating and encouraging it. If this happens, then with encouragement from practitioners (rather than set activities) we should see more learners learning informally.

So how should institutions encourage informal learning?

Well the key really is to think about what actually facilitates and encourages informal learning.

It’s a combination of factors and can include design of learning spaces and the learning activities undertaken by the learners.

Creating the right contexts and environments for informal learning will ensure that learning anywhere and anytime is encouraged and enhanced.

Don’t forget the coffee, well of course that could be tea, soft drinks, even cakes and chocolate. Having refreshments can aid the learning process, but also encourages people to be within an informal learning space.

So where is it written that learning has to be uncomfortable?

After I put some sofas into the libraries when I worked in a college, I was asked a few times why there were sofas in the library when the library is a learning environment.

I would then ask: where is it written down that learning has to be uncomfortable? Where is the rulebook that states learners should sit at desks on hard chairs? Is it not possible for a learner to learn whilst sitting on a sofa? Why can’t a learning environment be enticing, comfortable and even a little bit social?

Sometimes you want to take learners out of their comfort zone, but I am not sure that means making them sit on hard benches! Providing spaces that learners like to be in, ones they will spend time in, combined with other factors could encourage informal learning. If all other factors were implemented, why would you spoil it all, by having an uncomfortable environment?

With the dependency on the internet and connectivity for learning these days, ubiquitous, fast and dependable wifi is critical if you want to encourage learning. Any space will need the capacity for multiple connections, as many learners will have two or more devices that use wifi.

Dropped connections and insufficient bandwidth can result in learners going elsewhere or doing something other than learning.

Another factor that often gets ignored is the impact building construction can have on 3G and 4G signals. If learners are using their own connections, then building construction should be considered with respect to that issue.

When creating spaces that will encourage informal learning, you need to provide different furniture for different activities.

Sofas for calm, individual, reflective thinking; tables and chairs for small group work; quiet, secluded places for focused work; and appropriate furniture for small group discussion.

As well as the physical aspects of the space, it is also useful to think about the temperature, the lighting and ambient noise. Use furniture, walls and plants to create quiet and less quiet areas, for example. Having the same kind of lighting across a space may be efficient, but using different kinds of lighting in different areas can encourage different kinds of activities.

As well as physical spaces, it is also useful when encouraging informal learning to provide access to virtual collaborative spaces. This could be the VLE, but other options are available, such as Slack, WhatsApp or even a Facebook group. It’s not just about providing access (through the firewall) but also about providing guidance and best practice so that learners have a better understanding of the benefits (and limitations) of these virtual collaborative tools. It would also make sense to check that the organisation has a sensible social media policy that reflects the use of social media tools for learning.

Think about any non-formal activity and ensure that students have access to appropriate resources (digital and non-digital). Is access to those resources mobile friendly? Will they work on the kinds of devices those learners are using when learning?

One thing to ensure is you have an appropriate Bring Your Own Device (BYOD) policy to facilitate informal learning.

So how are you creating spaces for and facilitating informal learning?

This blog post is inspired by a blog post on informal learning, that I wrote in 2010, and a cookery book activity from the ALT Winter Conference 2016.

Notes and presentations from the 9th Jisc Learning Analytics Network meeting at Exeter University

Originally posted on Effective Learning Analytics.


Our network meeting this week in Exeter was again fully booked; there does seem to be a growing interest in learning analytics in the UK. This was also a particularly informative meeting, I felt, and we were able to absorb a huge amount of expertise from our presenters.

Exeter University is ahead of much of the sector in this area, and has been preparing itself for learning analytics with some rigorous research activities as well as working on its data sources and technical infrastructure. We heard from three key members of the Exeter team.

Prof Wendy Robinson presenting

Prof Wendy Robinson, Academic Dean for Students, was first (Slides ppt 420KB). Wendy introduced Exeter’s learning analytics project, which is led by academics in partnership with students and aims to help students monitor and benchmark their own performance. Enhancing the information available to academic staff on their students, and ascertaining the factors affecting student success, are also key aims.

Wendy discussed the University’s “My Grades” feature of their student app, which shows module results to date, and is proving popular with students. There are plans to add more sophisticated functionality.

Joanne Smith from Psychology was next (Slides 1 ppt 733KB | Slides 2 ppt 525KB). Joanne had led a systematic review of the effectiveness of learning analytics interventions. This was a comprehensive and impressive piece of work. She and a colleague assessed peer-reviewed studies as “strong”, “moderate” or “weak” based on various aspects of their methodology. Out of 547 publications identified, only 20 were retained for inclusion in the analysis.

Joanne concluded from the literature review that the three key factors in predicting student “success” are social/demographic factors, academic history and engagement (usually in a VLE). She also pointed to the evidence being limited due to the small number of research studies to date and issues around the quality of some of them.

Hywel Williams showing where Exeter’s data sources for learning analytics are coming from (green), aren’t coming from (red) or will do in the future (amber).

Joanne’s colleague, Hywel Williams (Slides pdf 2.5MB) then gave another fascinating insight into research carried out with colleagues Carmel Kent and Chris Boulton at Exeter surrounding the links between engagement and success. The data showed that use of the VLE at Exeter (and, he believed, by extension other bricks and mortar universities) was not a good predictor of student success.

They had identified many potential data sources for learning analytics. They ruled out the use of social media, private email and WiFi records as overly-intrusive. Sources that could be used included lecture recording data, access to past papers, use of the iExeter student app, VLE access, module feedback forms and careers data. Future data sources included attendance data, IT logins etc, use of the online helpdesk and assignment / assessment submissions.

Hywel and colleagues found that predictors varied across disciplines and also between high and low performers.

Shri Footring made an amateur but acceptable recording of the Exeter team’s presentations using Periscope.

Paul Bailey and Rob Wyn Jones then gave an update on activities in the Jisc Learning Analytics Project (Slides pdf 2.8MB | Recording). The slides give a good summary of the current status of the project. Paul also demonstrated the Data Explorer tool, which has been developed by Jisc to enable institutions to perform quick and easy analyses of the data they hold in the learning records warehouse.


Our afternoon sessions were on the theme of the connection between learning analytics and learning gain. We began with Dr Camille Kandiko Howson, Academic Head of Student Engagement at King’s College London (Slides pdf 1.25MB).

Camille is an expert in learning gain and has been working across the 13 HEFCE-funded pilot projects in the area. She discussed the many different aspects of learning that can potentially be measured, such as intellectual skills, communication skills, interpersonal skills, vocational and employment preparedness and personal life quality enhancement. There are various ways to attempt to measure these, and different measures for students, subjects and institutions.

Dr Ian Scott from Oxford Brookes University was next (Slides pdf 5.8MB). He discussed the data sources in use at his university, and the ABC Learning Gains project, carried out with Surrey University and the Open University.

The project had encountered a number of ethical concerns, in particular with students opting in or out of the data collection. A literature review found that the concept of learning gain is mainly used to measure the effect of particular educational interventions. Research carried out by the project on Open University students showed that socio-demographic factors were the strongest predictors of variance in learning gain, in particular ethnicity and prior educational level. For Business students, engagement with the VLE correlated with higher learning gains, while for Arts students it did not.

There is also a recording of Camille’s and Ian’s slot.

Students at Exeter

Our final presentation was by Dr John Whitmer, Director for Analytics & Research at Blackboard, who came in by Skype from California. We heard from John, getting on for two years ago now, at our network event at Nottingham Trent. He’s been leading on some fascinating research since then, examining the data on large numbers of students and their use of their VLE, and exploring new ways of visualising the data (Slides to come). Recording

Our next session will be on 3rd May 2017, at the University of Strathclyde in Glasgow.


12 new learner stories now available

Originally posted on Jisc Digital Student.

The idea for the Digital Learner Stories came from feedback at a consultation event at the end of the Jisc Digital student: Skills sector study. Participants remarked that it would be useful to have real learner voices talking briefly about their digital experiences in various post-16 sectors. This would be a resource to help groups within institutions hear directly from learners and have better conversations about the role that technology plays in the present and in planning for the future – from teacher/lecturer training to library spaces to access to hardware and robust wifi.

And it has been a joy to work on the Digital Learner Stories and hear twelve stories from learners across the sectors. They provide inspirational snapshots of digital experiences in HE, FE and Skills: adult learners in evening classes, apprentices, part-time and full-time learners and employees, learning in a physical classroom or online, at a college, at university or as part of a continuing education institution. As individuals talk about how technology supports or makes a difference to their studies, what comes through is a love of learning – formally and informally – and a realisation that digital opportunities loosen some of the constraints of traditional education. Some of the participants would have been unlikely to be able to study at all in earlier times.

We are grateful to the staff who helped us find volunteers who were willing to share their digital stories.  We are indebted to the 12 learners who gave their time to talk with us and to record a short video of their thoughts, sharing apps and enthusiasm for the role that digital technology plays in their lives. We hope that the stories and videos will  stimulate discussion  and promote individual and institutional reflection on the access and opportunities that are highlighted in these 12 stories.

Consent for learning analytics: some practical guidance for institutions

Originally posted on Effective Learning Analytics.

What information do students need about the use of their data for learning analytics? When should students be asked for their consent for this? How is it best to obtain that consent? What happens if a student wishes to opt out?

Consent continues to be one of the main concerns for universities and colleges when thinking about deploying learning analytics. We covered some of the issues in a podcast last year but it’s become clear that what institutions really need is concrete guidance on how to deal with these questions.

After talking to Andrew Cormack, Jisc Technologies’ Chief Regulatory Officer, I’ve put together the following guidance. This should not be taken as legal advice; we would welcome commentary from others who have been considering these issues.

The Data Protection Act
The UK Data Protection Act 1998 (DPA), based on the EU’s Data Protection Directive, has set the context for data collection and use from a legal perspective for nearly 20 years. Institutions should already have in place policies for processing the personal data of students, ensuring compliance with the DPA.

In order to process personal data, one or more conditions must be met. Obtaining the free, informed consent of the individual is one of these. However, there are two other conditions which may be relevant in the case of learning analytics. Processing can also be justified on the basis that:

  1. It is necessary in relation to a contract that the student has entered into, or
  2. The processing is assessed as being in the “legitimate interests” of the organisation

Taking a course can be regarded as the student entering a contract. To fulfil this contract, the university or college needs to process certain data, such as the student’s name, date of birth, address, and the modules they are taking. Particularly where modules are delivered digitally, activity records may be a necessary part of providing the contracted service, or kept as part of the organisation’s legitimate interest in ensuring that systems and resources are not misused. If, in addition, students are invited to submit records of the hours they spend studying, for example, this could be based on their free, informed consent.

When processing using legitimate interest as justification, the law provides additional protection by requiring that the interests of the institution must be balanced against any risk to the interests of the individual. Individuals may request an individual assessment of this balance (and exclusion of their data from processing) if their circumstances involve an increased risk. To satisfy this balancing test, processing should be designed to minimise the impact on individuals. Learning analytics may, for example, help to identify improvements that can be made to a course. That could be regarded as being in the legitimate interests of the organisation and to benefit both current and future cohorts without impacting the rights or freedoms of any individual, thus satisfying the balancing test.

European data protection regulators have explained that a single transaction may involve processing under several different conditions, and that trying to squeeze these into any single condition may actually weaken the protection of the individuals concerned. The data and processing involved in learning analytics may require using different conditions for the stages of data collection, data analysis, and individual intervention.

Sensitive personal data
Data which is of a more sensitive nature is defined separately in the law and is subject to additional protections. This includes attributes such as a person’s religion, ethnicity, health, trade union membership or political beliefs. Some of these may be irrelevant for learning analytics and can be ignored (or perhaps should be ignored from an ethical perspective). However, if it is identified that people from a particular ethnic group, for example, are at greater academic risk, then there is a strong argument that it could be justified to use that characteristic in the predictive models in order to target additional support more effectively.

In the case of sensitive data, the legitimate interests of the organisation cannot be used as justification for its processing. It is likely that the only justification for processing this data will be if the individual has given their explicit consent. The student should also be told exactly what the data will be used for: they have the right not to provide this data, or to have it excluded from any particular type of processing.

The forthcoming EU General Data Protection Regulation
The Data Protection Act 1998 and other national legislation in EU member states will be replaced imminently by the General Data Protection Regulation (GDPR). This will apply across the whole EU and will not be customised for individual countries as with the previous legislation. The UK Government has stated that it expects organisations to comply with the GDPR when it comes into force on 25th May 2018, irrespective of the UK’s plans to leave the EU.

The GDPR will continue to allow data processing to be carried out, as at present, on the basis of the legitimate interests of the organisation, or if it is necessary for the performance of a contract with the data subject.

Using consent as the basis for processing under the GDPR
However if consent is used as the legal basis the GDPR attaches new conditions. It requires clear, affirmative action – pre-ticked boxes, for example, would not be sufficient. A record must also be kept of how and when the consent was provided. In addition, students will have the right to withdraw their consent at any time. The GDPR also strongly disapproves of attempts to obtain “consent” as a condition of providing a service.
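As an illustration of what keeping such a record might involve – the field names below are assumptions for the sake of the sketch, not a prescribed schema – an institution could store something like:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Illustrative record of a student's consent decision (field names are assumed)."""
    student_id: str
    purpose: str                   # e.g. "personalised interventions based on learning analytics"
    granted: bool                  # must result from a clear, affirmative action (no pre-ticked boxes)
    method: str                    # how and where consent was captured, e.g. "opt-in web form"
    recorded_at: datetime          # when consent was captured
    withdrawn_at: Optional[datetime] = None  # consent can be withdrawn at any time

    def withdraw(self) -> None:
        """Record the withdrawal of consent."""
        self.withdrawn_at = datetime.now(timezone.utc)
```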

One challenge with using consent as the legal basis for learning analytics is that the request must make explicit to the student all the consequences of either providing or withholding their consent. This is perhaps not really feasible or fair when asking students to sign a document on the first day of their course.

A second issue with using consent as your basis for data collection is that – because of the requirement to fully explain the consequences – at the moment you request consent you freeze the activities to which that consent can apply. New types of analysis or intervention cannot be added if they were not envisaged at the time consent was obtained. Given the rate of development of learning analytics, this may prevent both organisations and students obtaining its full benefits.

A hybrid approach: using “legitimate interest” for analysis and consent for intervention
Andrew has argued in his paper “Downstream Consent: A Better Legal Framework for Big Data” that a hybrid approach is the best way forward. This considers collection, analysis and intervention as distinct stages under data protection law. We don’t need to request additional consent for most of the data collection if it is “data debris” which is being collected lawfully anyway e.g. the log files in a VLE. We can use legitimate interest as the justification for further analysis of the data, provided students are aware of this and it is done in ways that minimise the impact on individuals. This might include identifying groups or patterns of experience or behaviour that might benefit from a specific intervention or different treatment.

What we will need, though, is consent from students to intervene on the basis of these analytics, since here the intention is to maximise the (beneficial) effect on the individual. By postponing this request to the time when specific interventions are known, we will be in a much better position to explain to the student the consequences of granting or refusing their consent.

Over time, it seems likely that learning analytics processes currently regarded as add-ons will become an integral, normal and expected part of how education is conducted. I have argued this in a blog post entitled extreme learning analytics. If (or when) students contract with universities or colleges for a personally-tailored education, much of this collection, analysis and intervention will change from a legitimate interest of the organisation to a necessary part of its contract with the individual.

Note: There have been some suggestions that the Information Commissioner’s Office (ICO) may class universities and colleges as “public authorities” for the purposes of the GDPR, in which case they may be prohibited from using legitimate interests for some activities. If this were to occur, the alternative justification that those activities are “necessary for a task in the public interest” could be used, though this requires no balance of interests test so provides less protection for individuals and their data.

Privacy notice
Before any personal information is collected, organisations must inform individuals of all the purposes for which it will be used, who (if anyone) it may be disclosed to and the individual’s rights. These privacy notices (also known as fair processing notices) are required so that individuals know what processing will result from their decision, e.g. to become a student. The privacy notice is distinct from any request for consent – indeed it is required even when consent is not the basis for the processing. See the ICO Guide on privacy notices.

Institutions should already have a privacy notice, which will describe the data required for a student to study there. Text should be added to this which explains what additional purposes learning analytics may be used for – for example, to improve the provision of education and to offer personalised recommendations – and declares these as secondary purposes of processing.

The notice may refer to more detailed policy documents such as an institutional learning analytics policy and student guide – see examples. These should explain the measures taken to protect students, staff and their data, and the circumstances in which consent will be sought.

Requesting consent
Unlike the collection and processing of data, taking interventions with students on the basis of the analytics will require their explicit consent. A common example would be for a personal tutor to contact a student if it appears that they are unlikely to pass a module – in order to see if anything can be done to help. Students could be enabled to opt-in to such interventions via a web-based form, or to refuse interventions at the time they are offered. They will also need to have the opportunity of opting out subsequently if they change their minds. The consequences of opting in or out must be explained to them.

Conclusion
Institutions must inform students and staff of the personal information being collected about them and the purposes, including any results of learning analytics, for which it may be used. Provided data collection and analysis are done in ways that minimise the risk of impact on individuals, however, it may not be necessary to obtain individual consent for these stages. Indeed relying on consent risks leaving holes in the dataset and students missing out on the benefits of the analytics for their learning, thus potentially disadvantaging them.

Under both the DPA and the GDPR data collection and processing can often be justified, and the interests of institutions and students better protected, by other legal grounds, such as legitimate interest. Provided institutions ensure that any remaining risk is justified by the benefits to the organisation and its members, this will enable a range of learning analytics to take place, e.g. identifying problems with modules.

Using sensitive data or taking personalised interventions with learners on the basis of the analytics will require their explicit consent. The student should be enabled to opt in to the type of intervention(s) they would prefer, and subsequently to opt out again, or refuse individual interventions, if they wish.

News from the Building digital capability project team

Originally posted on Jisc digital capability codesign challenge blog.

Although it has been quiet on the blog recently, we have been busy behind the scenes with some new developments. We will also be starting a series of blog posts in March 2017 to launch a suite of resources to support colleges and universities with the development of the digital capability of their staff and students. So bookmark this site and look out for the series of blog posts from Helen Beetham over the forthcoming weeks. We are pleased to be presenting on this work at DigiFest together with colleges and universities that are taking forward their developments on digital capabilities.

Piloting the discovery tool

We are delighted to be working with 14 institutions on a closed pilot of a beta version of our Discovery tool aligned to the digital capability framework.

Discovery tool

The tool has been designed to support individuals and managers in a range of roles by helping them to identify and reflect on their current digital capability and make plans to improve their capability through a set of recommended actions and resources.

The following institutions are working with us over the next 6 months to pilot the discovery tool and our wider set of digital capability resources:

  • Coleg Y Cymoedd
  • Derwentside College
  • Hartpury College
  • North Lindsey College
  • Hull College Group
  • School of Pharmacy, Cardiff University
  • University of Derby
  • University of East London
  • Glasgow Caledonian University
  • University of Hertfordshire
  • University of Hull
  • Institute of Education, University of Reading
  • The Open University
  • University of Southampton

The findings from the pilot will inform the further development of the discovery tool, which will move to a more sustainable platform for the roll-out of an open pilot in autumn 2017.

Developing organisational approaches to digital capability

6 Elements of digital capabilities model

In March we will be launching a suite of resources to support colleges and universities with the development of the digital capability of their staff and students. We are creating an online guide on ‘Developing organisational approaches to digital capability’, authored by Clare Killen and Helen Beetham, which will be launched in late March. The online guide aims to support organisational leads with responsibility for developing staff and student digital capabilities in FE and HE. It offers a structured approach showing how our digital capability framework can be used alongside a suite of tools and resources to help you build a contextualised model for developing digital capability in your organisation.

The guide will link through to the following resources which have all been updated following feedback from an extensive consultation with practitioners and managers across further and higher education:

  • Updated digital capability framework
  • Organisational lens on the digital capability framework – which will provide guidance on how to approach digital capability across four key areas within an educational organisation: teaching, research, content and communications.
  • Strategic steps towards organisational digital capability – a 4 step model
  • An audit tool and checklist – a valuable starting point for conversations within the organisation
  • Seven digital capability ‘profiles’ outlining the digital capabilities required by different roles, including HE and FE teacher, learner, library and information professional, learning technologist, researcher and leader
  • Series of case studies highlighting how universities and colleges are developing staff digital capability

These resources will all be published in March and linked from the Building digital capability project page with supporting blog posts here.

If you have any queries please contact us at digitalcapability@jisc.ac.uk

We look forward to your feedback on these forthcoming resources.

Lisa Gray, Heather Price and Sarah Knight

Show me the evidence…

Originally posted on e-Learning Stuff.

I think this line, from a recent discussion on the ALT Members mailing list, is really interesting.

…in particular to share these with academics when they ask for the evidence to show technology can make a difference.

When demonstrating the potential of TEL and learning technologies to academics, the issue of evidence of impact often arises.

You will have a conversation which focuses on the technology and then the academic or teacher asks for evidence of the impact of that technology.

In my experience, when an academic asks for the evidence, the problem is not the lack of evidence but something else.

Yes, there are academics who will respond positively when shown the “evidence”; however, experience has taught me that even when that happens, another reason, problem or lack of evidence emerges which means that the academic will still not start to use technology to “make a difference”.

When an academic asks “for the evidence to show technology can make a difference” the problem is not the lack of evidence, but one of resistance to change, fear, culture, rhetoric and motivation.

You really need to solve those issues, rather than find the “evidence”, as even if you find the evidence you will then get further responses such as: it wouldn’t work with my students, it’s not appropriate for my subject, it wouldn’t work here, it’s not quite the same, it’s not transferable, and so on.

Despite years of “evidence” published in a range of journals, and case studies from Jisc and others, you will find that whatever evidence you “provide” won’t be good enough to justify that academic starting to embed that technology into their practice.

As stated before, when someone asks for the “evidence”, more often than not this is a stalling tactic so that they don’t have to invest the time, energy and resources into using that technology.

Sometimes it can be “fear”: they don’t have the capabilities to use technology and lack the basic ICT confidence to actually use various learning technologies, and as a result, rather than fess up to their lack of skills, they ask for the “evidence”, again to delay things.

Just turn it around: when you ask those academics who do use technology, you find that the “evidence” generally plays little or no part in their decisions to make effective use of it.

So what solutions are there to solve this issue? Well we need to think about the actual problems.

A lot of people do like things to remain as they are: they like their patterns of work, they like to do what they’ve always done. This is sometimes called resistance to change, but I think it’s less resistance to change and more sticking to what I know. I know what works, it works for me, and anything else would require effort. This strikes me as being more about culture – a culture where improvement, efficiency and effectiveness are seen as unimportant and the status quo is rarely challenged.

Unless an organisation is focused strategically and operationally on improvement, widening participation and becoming more efficient, it is hard to get people to think about changing their practice.

When it comes to embedding learning technologies we often talk about changing the culture of an organisation. This can be hard, but it doesn’t necessarily have to be slow. I am reminded, though, of a conversation with Lawrie Phipps in which he said we have to remember that academics often like the current culture; it’s why they work in that place and in that job. So don’t be surprised when you are met with resistance!

Creating a culture which reflects experimentation, builds curiosity and rewards innovation isn’t easy, but it isn’t impossible either. There are various ways in which this can be done, but one lesson I have learnt in making this happen is that the process needs to be holistic and the whole organisation needs to embrace the need to change the culture. What I have found is that you need to identify the key stakeholders in the organisation, the ones who actually have the power to make change happen. I found in one college I worked in that the real “power” wasn’t with the Senior Leadership Team (who often had the same frustrations I had when it came to change) but with the Heads of Faculty, the managers who led and managed the curriculum leaders. They had the power to make things happen, but they didn’t always realise they held that power.

Getting the rhetoric right, and understood across the organisation, is critical for success in embedding learning technologies. Often messages are “broadcast” across an organisation, but staff don’t really understand what is meant by them and many don’t think they apply to them. Getting a shared understanding of what is required from a key strategic objective is challenging. I have done this exercise a few times and it works quite well: pick a phrase from your strategic objectives and ask a room of staff or managers what it means and to write it down individually. You find that everyone usually has a different understanding of what it means. A couple of examples to try include buzz phrases such as “the digital university” and “embrace technology”.

Finally, consider what motivates people to use technology to improve teaching, learning and assessment.

When I was teaching, I would often experiment with technology to see if it made a difference; if it did, I adopted it, and if it didn’t I stopped using it. The impact on the learners was minimal, as I didn’t continue to use technology that didn’t make a difference or was even having a negative impact. I applied the same process and logic to all my teaching. So when I created games to demonstrate various economic processes, if they made a difference I used them again, and if they didn’t I would ask the learners how they would change or improve them. When I gave out a reading list of books, I would ask the learners for their feedback, and those that didn’t make a difference or had no positive impact would be removed from the list! I was personally motivated, but we know you can’t just make that happen.

When I was managing a team I ensured that any experimentation or innovation was part of their annual objectives and created SMART actions so that they would be “motivated” to do this. Again, you need to identify the key stakeholders in the organisation, the ones who actually have the power to make this happen.

So when someone asks you to show them the evidence what do you do?

Seven weeks to go…

Originally posted on Jisc Digital Student.

There are just over seven weeks to go before we close the door on data collection for this pilot version of the Digital Student Experience Tracker.


There have been a few questions about this deadline, so I’ll try to answer them here. First, though, it’s important to say that seven weeks is a long time! It is absolutely not too late to launch the Tracker and collect a useful body of data from your learners. We only recommend a window of about two weeks for the survey anyway – unless you have a strategy for reviving interest, for example around a specific initiative or event. So even if you are further behind than you hoped to be, there is still plenty of time to hit that deadline.

Why do we need a deadline at all? Although it’s hard sometimes to remember, this is actually a pilot process. We are learning about the Tracker, how best to run it, how best to support it, and what value it can offer. So we need a deadline for closing the pilot, analysing the data, and asking you how it has gone. We already have lots of ideas for improvements, for example around greater flexibility in the questions asked, better integration with other data sources and so on. Unless we take time to really consider the evidence, we won’t be able to make the changes you want and (hopefully!) launch an improved Tracker service later in the year.

We are very grateful to everyone who has been part of the process. Even if you find that you run out of time in the pilot phase, we hope that the thinking and planning has been useful – for example has helped you to identify champions and to raise awareness. Perhaps you are in a better place to run the Tracker in the future, or perhaps you have decided that it’s not for you. We will still want to ask you about your experience – as far as we are concerned, it is all learning.

Why did we choose this particular time? The November to March window was the most popular with our first pilot phase institutions when we polled them (there is a FAQ about timing which explains this). It has proved not to be ideal for our Australian colleagues due to their different academic year. And if you planned to include final year HE students in your population – and missed earlier opportunities to launch the survey – then these last 7 weeks are not ideal because of the NSS. But people are finding ways around these problems. Running right up to the deadline, targeting students that are not involved in other surveys, using selective rather than random sampling – these are all options that you have.

How about strategies for encouraging more participants to complete the Tracker in the time available? Here are some fantastic posters from Vikki and the team at Epping Forest College. They’re using QR codes so learners can go straight to the survey, and they’ve also been using differential communications to reach different groups of learners – helping them to achieve over 200 responses in just a few days.

There are tips for engaging FE, HE and online learners in our Guide to Engaging Learners and some real world experiences in our Tracker case studies. For example:

  • Make sure you are reaching learners through different channels – posters, flyers, email, social media, texts, via the student desktop or VLE or app.
  • Tell learners about something specific you are already doing to improve their digital experience, or (if you are starting to collect data) something specific you have already found out. This encourages them to focus on positive outcomes and to think that their opinion will really make a difference.
  • Emphasise that the survey is quick and easy to complete.
  • Generate some real-world activity e.g. with a launch event, or focus on live completions, with helpers taking the survey out into student areas on mobile devices.
  • Ask learners to help you design communications about the Tracker, e.g. short video, infographic.
  • Ask sympathetic staff to allow learners to complete the survey live at the end of taught sessions.
  • Emphasise the benefits to students of an enhanced digital environment and employability skills.

Finally, I recently ran a Q and A session for our Australia/New Zealand pilot sites. I’ve uploaded the slides here in case they are of interest (they may be a bit slow to load).

Happy Tracking!