More support for tracker reporting – and your feedback please!

Originally posted on Jisc Digital Student.

Feedback on Tabetha’s powerpoint template and excel file has been overwhelmingly positive – thank you to everyone who has got back to us to say how useful these have been. You can download both files from our previous blog post. (The excel spreadsheet now has a minor update to improve the formulae – the links remain the same.)

As we’re obviously onto a winner with resources to support reporting, Tabetha has also designed two posters you can customise for reporting your key messages around your university or college. You can download the posters as one customisable .pdf file here.

We also want to offer you another resource in draft, and discuss the possibility of a further toolkit – still at the concept stage. We’d like your feedback on both of these.

Digital student experience benchmarking tool

This DRAFT benchmarking tool enables students to audit their own digital experience. This was first developed by Helen back in 2015 with the support of the NUS, student change agents, and the Student Engagement Partnership (you can download the original here). It has been used by student unions and guilds to initiate conversations about their digital experience.

Working with students on the benchmarking tool

However, the existing tool really needed an update to reflect what we have learned in piloting the tracker, and to tie in more closely with the questions we are asking. We’re working with student organisations to refine this new version and make sure it meets their needs. We’d also like to ask our tracker pilot institutions for feedback. Is this a useful resource? How might it be used in practice?

Sarah looking at digital student posters

Some ideas we have are:

  • Student focus groups could identify the areas where they want to see progress: these can now easily be mapped to findings from the tracker survey.
  • Students could work from this resource to produce their own ‘blueprint’ for what they’d like to see happen, whether or not the tracker survey is run.
  • Students and staff working together on the benchmark might decide they need to run the student tracker, to canvass student views more widely.
  • Tracker leads can map findings from the student survey onto this tool to see what ‘next steps’ could be planned: these could be communicated with students in feeding back the tracker findings (e.g. in a ‘you said, we did’ style).

 

Digital toolkit

For some time we’ve been wondering how Jisc could support organisations to build a ‘digital toolkit’ for arriving students. Organisations are different and it’s not for Jisc to prescribe what students need to succeed in different places. However, we know this is an area where a bit of support can really make a difference.

One option is simply to provide a checklist of the information and materials that incoming students find useful. This would relate to the systems you have in use, devices that can be used on the network (and how to connect them), your local learning and teaching practices with technology, and how students can best prepare to study digitally. You could use this to ensure your pre-arrival resources are up to scratch.

Another option is to provide a template (perhaps web based?) into which this information could be embedded.

Either way, your tracker findings could allow you to offer the ‘voice of current students’. For example you could include:

  • common digital learning activities, e.g. online research, online discussion, using polling devices in lectures
  • how current students find digital learning to be of benefit, e.g. making them more independent in their study habits
  • what digital apps and resources might be useful, based on free text responses
  • assistive and adaptive technologies other students find useful, again based on free text responses
  • helpful quotes from current students – perhaps from free-text responses or focus groups.

Your feedback

We need your feedback on these resources and ideas. You are welcome to use the Jiscmail list for any comments. We will also be launching an evaluation survey (on 4th June) to find out about the experience of running the tracker. This will include questions about the resources we have produced, or might produce, to support you in analysing, reporting and responding to tracker findings.

Please do tell us your views – your input drives the decisions we make and the direction the tracker project goes next.

“Accessible” is not the same as Inclusion

Originally posted on lawrie : converged.

Do we consider all learners when embedding digital into teaching, learning and assessment? If they don’t have the newest device, good connectivity or unlimited data, can they still participate, engage and learn? What if they have a disability? When practitioners start to use innovative practices and technologies, is everyone included?

It’s not often that I talk about accessibility, inclusion and technology; and it’s even less frequent over the last 10 years that I have written about it. But between 2001 and 2006 I wrote over 20 papers, chapters and other miscellaneous publications covering a broad range of issues around inclusion and technology. It was a formative time for me: I had been tasked with building a new service that would have a positive impact on disabled students’ educational experience. So TechDis was born (with huge thanks for the support of Rich Townend and Mike Adams).

Every so often I find myself looking back and wondering how much of an impact we had. Initially I think we did. I think we worked hard at moving the agenda forward using a social model of disability, and we also worked hard to disabuse the sector of the notion that “accessibility standards” could create an inclusive environment – when you set a minimum standard, that is what people achieve.

A lot of what I brought to that role was based on my own experiences. At the time of dropping out of school I had no qualifications. My school years were unpleasant: I was constantly terrified of looking stupid in front of the other kids. I was an “undiagnosed” dyslexic, and anyone with dyslexia will know those feelings well. I was in my late 20s when I entered education again. I had lots of support from family and friends, although even at graduation I still had not been identified as dyslexic (I didn’t even understand what it was).

Most recently I have been involved in the Next Generation Digital Learning Environments work at Jisc. Some of the emerging trends have been especially interesting: we are seeing lecturers gain access to a range of tools outside of institutional control, tools that allow them to build their own apps for example, and alongside these new tools and apps they have been adapting their practices.

This is what brings me back to my own experience, and my experience at TechDis. Most tools and commercial apps will have inbuilt features that support disabled students; legislation across Europe and the Global North embeds the principles of ensuring accessibility, and by and large this has been of benefit. But recently I have come into contact with lecturers who have been doing great things with the plethora of apps and tools, shaking up their teaching and engaging students.

It all feels great.

But I started thinking about some of the answers to questions I have been asking.

“How many of your students engage with the tools?”

    “About 90%”

“Are the tools accessible?”

    “Yeah, the students love them, they access on their smartphones”

The questions go on.

One lecturer has an app that is a quiz tool. The app times the students, and the students who finish fastest, with the right answers, win prizes.

Is it Inclusive?

Reading textbooks with complex equations and diagrams is hard for me; it takes me a little longer than some people. I am not going to win a prize. And I only have mild dyslexia.

The apps, the tools, the websites, the e-books, are all getting more accessible. They are flexible and adaptable. Technology is helping me, I can easily change fonts, backgrounds, as I get older I find myself increasing the size of the text! Technology can help disabled students; but it doesn’t mean your teaching practice is inclusive.

I felt stupid at school, while the smart kids were lauded, handed prizes. I thought we’d left those days behind. But the technology, the access to technology, and the way the technology may be being used is again raising those issues.

“How many of your students engage with the tool?”

“About 90%”

If you don’t have the best device or the fastest data, and if the tools are used to reward the smartest and most engaged students at the expense of the 10% who are not engaged, then what are we doing as educators?

I am hoping that some of the practices that are stratifying students, creating a divide, are exceptions, transitional approaches. But in a race for recognition for innovation, where new and shiny is rewarded, I think we may be failing many students. Hearing about these practices, so many years after I graduated, has shaken me; a practice now being used at university that would have rocked my confidence and hurt me academically has really brought home to me the importance of inclusion, not just accessibility.

#CAN 18 Resources – Presentations and Blog Posts

Originally posted on Change Agents' Network.

We are gradually receiving the presentations from CAN 2018 and they are being added to the Resources page.

Along with the resources, there is a Storify, and we’ve just received a link to a comprehensive blog post by Brad Forsyth and Jake Forecast, who presented last year at CAN 2017 as students from Epping Forest College. Their blog post gives a wonderful overview of the range of events and topics discussed during this year’s two-day conference at the University of Winchester.

VLEs – Value, Learning and Engagement?

Originally posted on Inspiring learning.

One area that the Student Experience team have been looking at is the role of the Virtual Learning Environment (VLE) in universities, colleges and the skills sector.  Many organisations are looking at their current VLE with an eye to the future, assessing where they are with its use as a digital platform.

students working online

Students accessing their VLE

Value is a very useful word when assessing a VLE and needs to be viewed from a variety of perspectives.  This poses a number of questions to consider, such as:

  • Do staff feel it is worth their time to upload or create content?
  • Why should staff invest time to learn new skills in creating quizzes or other interactive resources?
  • Do students value the VLE as a support to their studies? Does the VLE enhance learning and offer a ‘valued’ experience?
  • Does the organisation as a whole value having an online environment for their learning content?

Many organisations take pride in their buildings and surrounding areas.  Life on campus is aimed at providing a good student experience.   We would encourage organisations to take the same pride in their digital environments and apply the same principles and vision.

In our pilots so far, we have found quite a distinct difference in that ‘value’ between institutions.  If you would like to know more about how Jisc can help you review your VLE when the service goes live, you can contact us via our consultancy page https://www.jisc.ac.uk/consultancy.

We have found that while most students do value 24/7 access to learning material, especially for revision or missed sessions, they do have an expectation of what a system can provide.  Social media presents non-stop fresh material and engaging content.  Responsive platforms and notifications come as standard.  How does a VLE compete for attention?

Devices in use by students

Students use different devices to access the VLE

A lot of the ‘value’ will come from the user experience: navigation, ability to use on mobile devices, speed and so on. But we have also heard that content is very relevant.

While students have been unanimous in their appreciation of an online connection to their studies, some staff have shown less enthusiasm about the role the VLE currently plays.  All the main platforms on the market are capable systems, but falling back on uploading documents is the preferred option, as these are either already created or can be produced in familiar applications.  So a level of confidence and capability are key factors in engaging staff in creating content to enhance learning further.

It is obvious that many of the staff we have spoken to understand what a VLE ‘could’ achieve, but the common reason given for not engaging with it is ‘time’, or a lack thereof. This brings the ‘value’ element to the attention of senior management.  How important is it to senior management to provide the training and time for staff to gain skills and produce a more engaging and enhanced experience?

One thing we feel is important about the output from the review is that it is not all about answers: it should also provide the right questions an organisation should ask when thinking about vision and strategy, and when looking at their VLE and how it meets their expectations now and in the future.


Three emerging insights from the Digital discovery pilot

Originally posted on Jisc Innovation in Further Education and Skills.

Co-authored by Clare Killen

Map showing locations of UK pilots

Over one hundred universities, colleges and other providers are piloting the Jisc Digital discovery tool in the UK and overseas. The design of this tool encourages individuals to reflect on and develop their digital capabilities. It provides a summary of their self-assessment in response to nuanced question prompts as well as suggestions for further development with links to relevant, interactive resources. Whilst it is very much a personal tool, additional features allow institutional leads tasked with supporting digital capabilities development to gain insights from anonymised data and translate them into the institutional context.

Jisc team members have visited several pilot institutions to support the implementation process. In doing so, and through our in-depth conversations, we have learned about what works, at a practical level, when it comes to providing opportunities to develop the digital capabilities of staff and students in the various organisations. Further insights have emerged from conferences, events and meetings featuring presentations from our pilots, for example, the Student Experience Experts meeting and the Digital capabilities session at Digifest18.

As the roll-out gathers pace, we are starting to gain some fascinating insights into how institutions are using the opportunities offered by this tool to bring about positive change in their organisations. There are some clear themes emerging around what organisations benefiting from the Digital discovery process typically have in place:

 1. Clear strategic vision

We are seeing that a clear message about the importance of digital technologies, communicated and understood by everyone, provides a meaningful context for the use of the discovery tool.

“It is important to have a clear strategy and people need to know that digital is part of the strategy and part of what they do. You need to engage people in it, allow them to see how it affects them and why it is important to them. It needs to be exciting, so for example, we have run several big events that inspire and excite people around the idea of using technology to support teaching and learning and the college business.”
Penny Langford, head of e-learning, Milton Keynes College

2. Culture

Having a safe space in which teams can explore their thinking about their own priorities for development creates an environment in which individuals can thrive.

“The individual reports which each member of my team had, generated discussions and comparisons, with staff considering their different roles and how that has had an impact upon their individual percentage. More than that though, it made them consider how they might acquire skills where they didn’t score as highly. I have eLearning Technologists and Librarians in my team and each had different scores, particularly the Information Literacy category. Which prompted all manner of discussion around the fake news agenda and critically evaluating information sources.”
Sarah Crossland, academic services manager, Doncaster College and University Centre

3. Connections

Organisations also benefit from establishing connections between individuals’ self-identified aims, the overall picture for all staff, and the resources available to support professional development in meeting organisational strategic aims.

“We wanted to identify gaps in staff confidence in their digital skills and use this information to target staff training and support. We looked at other products but there was nothing really out there to meet those requirements. We were looking for a standardised tool and wanted something to self-motivate staff. The approach taken by the Digital discovery tool supports that.”
Joseph Pilgrim, digital learning co-ordinator, ACT Training

Digital capability community of practice

The next digital capability community of practice event is being hosted in partnership with the University of Leicester on 22 May 2018. This provides an opportunity to learn about related initiatives and hear more from the wider community, including many members who are taking part in the pilot of the Digital discovery tool.
While registration for this event has now closed, the keynote sessions will be live streamed. Follow the hashtag #digitalcapability on the day and presentations and any outputs will be available from the event page.

There is still time to engage staff

If you are part of the pilot, you still have time to engage staff, perhaps through end of term staff development events. Remember that feedback is required by the end of May but the Digital discovery tool will continue to be available until 13 July 2018.

How HR teams support staff digital capability

Originally posted on Jisc Building Digital Capability Blog .

At the end of 2017 we began a short review into how Human Resources (HR) departments support staff to develop their digital capability. We developed an online survey and interviewed some of the respondents to try to capture a snapshot of current practice.

Initial results

The results of these activities confirmed our initial expectation that many HR teams have been working across several areas of the digital capability framework, often in partnership with other teams within their institutions. However, for both HE and FE respondents there were quite significant variations in responses to the questions about HR team involvement in the 6 core digital capability areas. Whilst 90% of people said they were involved in supporting the ICT proficiency of staff, only 50% said they were involved in supporting staff with information, data and media literacy; digital communication, collaboration and participation; or digital learning and teaching. 84% said they were not involved in digital creation, problem solving and innovation, and 58% said they were not involved in digital identity and wellbeing.

Later questions and in-depth interviews revealed that many HR teams are in universities or colleges which are just starting to take an institution-wide approach to staff and student digital capabilities. One of the challenges for HR teams is identifying their roles and the potential areas where they could input to institution-wide initiatives and the development of strategies for developing digital capabilities. Whilst some HR teams were aware of the Jisc tools and resources to support this work, many had not seen them before or had not engaged with them. It became clear to us that there was a need for some practical materials to help HR teams map their various activities (often split into specialist sections) to the digital capabilities framework.

The original survey is still open so if you did not get a chance to respond earlier we would still welcome your input.

https://jisc-beta.onlinesurveys.ac.uk/hr-support-of-staff-digital-capabilites

New materials for HR teams

HR teams cover a wide range of activities that require them to consider and/or support staff digital capabilities across their institutions. These include recruitment and selection, onboarding, appraisal/performance review, learning and development, relationship management and health and wellbeing. Data management and analytics, increasingly sophisticated institutional systems and the impact of social media mean that Human Resource teams themselves need a range of digital capabilities to effectively carry out their work.

We have produced two sets of powerpoint slides that could be used within HR teams and we are interested to find out if they are useful. Thanks are due to Abi Mawhirt, Head of People and Organisational Development at Dundee and Angus College who worked with us to refine these slides and to make sure we did not have any serious omissions. Abi will be using the slides within her own institution and we have some other HR teams who have said they might try them out.

HR teams could use the slides (or select the ones that they feel are most relevant to their context) to consider their activities, identify and build on strengths, and identify any gaps or areas where they could enhance their support of staff digital capabilities. This may highlight areas where HR teams could take the lead, for example in the area of Digital identity and wellbeing.

HR-ppt-screen

This set maps HR activities and roles to the Jisc digital capabilities framework. It highlights where HR teams can input to institution wide approaches to staff digital capabilities and offers some suggestions for activities where they could get involved. Some of these areas involve other teams and would encourage HR input to support teams leading on a particular area.

HR-ppt-DigID

This set offers a view of HR activities through the Jisc digital capabilities framework. Each area of HR activity is mapped to the 6 key elements of the Digital capabilities framework, highlighting where HR teams can have an impact on the digital capabilities of staff (and, to a lesser extent, students).

We have also highlighted those activities that relate to digital capabilities of staff in HR teams.


Please pass these on to your own HR team and ask them to try them out. We have produced a brief pdf document which offers ideas for how they might be used.

Here are some of the suggestions:

  1. Use the slides to deliver a team presentation highlighting areas of most relevance to the team.
  2. Use the slides or a selection of slides in a presentation to focus on particular aspects – either a particular area of HR activities such as recruitment and selection or on a specific area of the digital capabilities framework such as Digital wellbeing.
  3. Use the slides as a pdf document to share within teams and follow up with workshops to consider them within your own context.
  4. Get different teams within HR to focus on specific slides (or pdf pages) and ask them to come up with an action plan following their discussions.
  5. Use the slides or some of the content to present to different teams within the organisation to highlight what you are doing in different areas of digital capability or what you would like to do.
  6. Use the materials to highlight areas for joint working or partnership approaches to other teams or departments within the institution.
  7. Link to other Jisc digital capabilities, guidance, tools or resources to highlight possible HR roles across the institution.

We would like to gather some feedback about these so that we can adapt or enhance them. Link to a brief survey.

Let us know what you think. Help us make them better.

Discovery tool: understanding the questions

Originally posted on Jisc Building Digital Capability Blog .

We have just been through an interim analysis of feedback from staff users of the Digital discovery tool. Thank you for directing so many staff to complete the feedback form – 225 general staff and 150 teaching staff have done so already, and it has been an invaluable resource.

The feedback so far has been very positive, with some interesting perceptions that we will report in the next blog post. This post is about some of the changes we have made to the content of questions. It also seems like a good opportunity to explain a bit more of the thinking that goes into the three question types, and into the reasons for designing the discovery tool in the way we have. There is some general information at the top of the post, and more detail further down for those who are interested in the different question types.

Development rather than testing

At the start of the design process we had to make a significant decision. We could have written ‘testing’ questions, as in a typical assessment test, to find out what users really  understand about digital applications and approaches. But we decided to write ‘developmental’ questions instead. These are designed to develop understanding, for example by making clear what ‘better’ (deeper, better judged) performance looks like. Rather than hiding the ‘right’ answer, they make transparent what expert digital professionals do and ask users to reflect and report: ‘do I do that?’

We have gone down this road partly because we are not convinced that testing abstract understanding is the best indicator of actual practice, and partly because this approach is more acceptable to end users. Staff want to be treated as professionals, and to take responsibility for assessing and moving forward their own practice. Also, we are not designing in a platform that supports item-by-item matching of feedback to response. So it’s not possible for the feedback itself to be closely matched to users’ input – as it would be in an assessment system – and our questions themselves have to do a lot of the work.

This has important implications for the meaning of the scoring ‘bands’ that we use to assign feedback to users (more of this shortly).

Where do the question items come from?

Essentially, to design the questions we first developed a wide range of real-world activities that digital professionals do. We’ve tested those out with expert panels, and also against the relevant professional profile(s) – which have had professional body involvement.

Of course we could just have presented these activities in a random order, and this was an early design idea. But the digital capabilities framework already had good recognition in the sector, and we needed a navigational aid. So in the case of the generic assessments (for staff and students) we allocated activities to the different framework areas, e.g. ‘data literacy’. In the case of role-specialist assessments, we used specialist requirements from the relevant profile, such as ‘face-to-face teaching’ or ‘assessment and feedback’ in the case of the teaching assessments.

We then took one activity that was central to the area in question and framed it as a ‘confidence’ question (‘How confident do you feel about doing x?’). We developed another activity into a mini-scenario or example to create a ‘depth’ question, with four levels of response possible (‘Which of these best reflects your response?’). Six further activities became options in a ‘breadth’ question (‘Which of these can you do? Select any or all that apply to you’). This provides us with three questions, covering eight activities, for each area of practice. There is more about the different question types below.
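
To make that structure concrete, here is a minimal sketch in Python of how one area of practice could be represented as a confidence question, a depth question and a breadth question, and how the three responses might be folded into a single banded score. The names, activities, weights and band thresholds are entirely hypothetical illustrations; they are not taken from the actual discovery tool.

    # Hypothetical sketch only: the real tool's content, weights and bands differ.
    from dataclasses import dataclass

    @dataclass
    class PracticeArea:
        name: str
        confidence_prompt: str   # one central activity, self-rated 0-3
        depth_options: list      # four responses, ordered least to most expert
        breadth_options: list    # six further activities, 'select all that apply'

    data_literacy = PracticeArea(
        name="data literacy",
        confidence_prompt="How confident do you feel about working with data?",
        depth_options=[
            "I rarely need to work with data",
            "I can read simple charts and tables",
            "I check data for errors before I use it",
            "I can judge the credibility of statistics used in public debate",
        ],
        breadth_options=[
            "collate data in a spreadsheet",
            "filter and sort records",
            "summarise data with formulae",
            "produce a chart from data",
            "anonymise data before sharing it",
            "combine data from more than one source",
        ],
    )

    def banded_score(confidence: int, depth: int, breadth_count: int) -> str:
        """Fold the three responses into one total and map it to a feedback band."""
        total = confidence + 2 * depth + breadth_count  # illustrative weighting only
        if total <= 4:
            return "developing"
        if total <= 9:
            return "capable"
        return "proficient"

The point of the sketch is simply that feedback hangs off a band derived from all three answers together, rather than off each individual activity, which is why the feedback can feel approximate to some users.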

We have not statistically tested to discover whether responses to all three questions in one area hang together to create a distinct and separate factor. There is the opportunity to do that with system data at this point, but our first aim was to create a navigable user experience – making sense and generating helpful feedback – rather than to validate a model.

Ideally the feedback we give to users would relate to their responses for each of the eight different activities. Without this option, we have used scoring bands to allocate roughly appropriate feedback to users, based on their responses to the three questions. It’s not exact, and some users have picked that up. However, most users rate the quality of feedback highly – it has the most positive comments of any feature – so we know we are getting it more or less right. We hope we have dealt with the lack of specificity by offering a range of ‘next steps’ that participants can choose from, according to their own interests and self-assessed development needs.

You’ll understand from this that scoring is an artefact of the system we are using and the design choices we have made within it, not an objective measure of any kind.

We were pleased when we analysed system data from the first two months of use to see that in all but three of the 45 generic staff questions, and in all the teaching staff questions, the scoring bands were evenly distributed. This means that the questions were doing a good job of discriminating among staff according to their (self-declared) expertise, and the full range of scoring bands and feedback was being used. Three questions had median scores outside of the normal range, and a couple of sections elicited comments that users did not feel their feedback reflected their actual capability (‘information literacy’ was one). Rather than changing the underlying scoring model for these questions, we decided it was more appropriate to work on the content to try to produce a more even distribution of responses around a central median point. So if users’ scores differ from the median, that should mean something – but we can’t say that it means anything about their objective performance.
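
For readers curious about what this kind of check looks like in practice, the sketch below shows one way it could be done with an anonymised export of responses. The file name, column layout and band values are assumptions for illustration only; they do not describe the format of any actual Jisc data export.

    # Hypothetical sketch: assumes a CSV with one row per respondent and one
    # column per question (q01..q45), each holding the scoring band (1-4).
    import pandas as pd

    responses = pd.read_csv("discovery_responses_anon.csv")
    question_cols = [c for c in responses.columns if c.startswith("q")]

    # Median band per question: a median far from the centre of the band range
    # suggests a question that is not discriminating evenly across respondents.
    medians = responses[question_cols].median().sort_values()
    print(medians)

    # Proportion of respondents in each band, per question, to see whether the
    # full range of bands (and hence feedback) is actually being used.
    for col in question_cols:
        shares = responses[col].value_counts(normalize=True).sort_index()
        print(col, shares.round(2).to_dict())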

Of course users who answer the questions after the changes were made on 5 May will not be scoring in the same way as users who answered the questions before. (It’s also possible that in making the changes suggested by user feedback, we have inadvertently shifted the scoring for some other questions – we will be checking this.) This will need to be communicated to any staff who are returning to use the discovery tool again. It will also need to be taken into account when looking at data returns, since data from before and after the changes can’t be treated as one data set. This is one reason we have cautioned against using scoring data to draw any firm conclusions, particularly during this pilot period when the content is still evolving.

We hope you will convey to all the staff who took the time to complete a feedback form that we have listened to their views – and that you and they will feel that the revised questions are an improvement. This is why this pilot process is so valuable.

How have the questions changed in response to feedback?

(Some changes to wording and options are based on findings from early user testing and not on the more general feedback we gained via the user feedback forms.)

We’ve slightly changed the lay-out of questions and added some more navigational text to clarify how to answer them.

We’ve removed or clarified some terms that were not well understood. Overall we know there is a need for a glossary – ideally with examples and links. That is something Lou will be working on for the future service. We’ve also changed a couple of examples we were using for illustration. There have been many discussions about the pros and cons of examples. Some people find generic terms difficult to understand without examples: but more people object when examples are used, because they favour some applications or approaches over others that are equally valid. Examples can confuse further: ‘if I don’t use that tool, I’m obviously not doing it (right)’. Overall we have gone light on examples, and we hope users’ understanding of terms will improve when we have a detailed glossary we can link to.

We have tried to focus more on activities users do at work, in an educational organisation (college or university). There were some negative comments about references to digital practices beyond this space. However, because of the need to cover a very wide range of roles – and because some roles don’t allow people to express digital capabilities they actually have – we can’t avoid offering some examples from beyond a narrowly-defined work role. For example, one of the activities under ‘digital identity’ is ‘manage social media for an organisation, group or team‘, and under ‘data literacy’ we have ‘judge the credibility of statistics used in public debate’. This is to allow users who don’t manage social media or evaluate statistics as part of their job to reflect on whether they have these capabilities anyway – perhaps gained in their personal life or another role. And indeed to consider whether these activities might be useful to them.

We’ve changed several references to social media, as a number of users objected to what they felt was an underlying assumption that social media would or should be used, and that this was a positive sign of capability. There are still several ways that users can show they are making wise judgements about the appropriateness of social media.

We’ve tried our best to use prompts that reflect capability (‘could do’, ‘would do’, ‘have ever done’) rather than current practice (‘do’, ‘do regularly’), which may be constrained by organisational issues or may reflect judgements not to use. However, we are also mindful that self-reported practice (‘I actually do this’) is usually more accurate than self-reported ability (‘I could do this if I wanted to’). Where we feel it is justified, we have continued to ask about actual use. So long as users understand that they are not being judged, it seems appropriate for the questions and feedback to indicate areas where they are not as capable as they might be if their organisation were more supportive of different practices, or their job role offered more digital opportunities.

There have been changes to the teaching questions, again to focus on pedagogical judgement rather than digital practice. There are now quite a number of caveats e.g. ‘if appropriate to my learners‘, which were suggested by more expert users. Of course we always listen to our experts (!) but as designers we’re aware that introducing caveats like this makes the questions longer and more complex, creating more cognitive load for users, and potential annoyance. We will monitor completion rates to see if this is a problem.

We have particularly reviewed the assessment questions and the online learning questions to be sure we are covering the very wide range of good practice in these areas.

There follows more detail on specific question types and the changes we have made to each of these.

‘Confidence’ questions

Why have we included questions that ask users ‘How confident do you feel about...?’ when we know that self-assessed confidence is generally unreliable? We do this at the start of each element to give users an orientation towards the questions that follow – ‘this is the area of practice we are looking at next’ – and a sense that they are in control. By trusting users to rate themselves, we are both reassuring them that they are not being ‘tested’, and asking them to be honest and searching in their responses. We have weighted the scoring for this question at a low level to reflect users’ tendency to answer inaccurately – though in fact when we came to compare confidence scores with scores on the other two question types in the same area of practice, there was a positive match.
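
As an illustration of the kind of comparison described here, the sketch below checks whether self-rated confidence lines up with the depth and breadth responses for one area, and applies a lower weight to confidence when combining the three into a total. The column names, the 0.5 weight and the file name are hypothetical; the tool’s actual weighting is not reproduced here.

    # Hypothetical sketch: assumes per-area columns for the three question types.
    import pandas as pd

    df = pd.read_csv("discovery_responses_anon.csv")

    # A positive correlation is what is described above: confidence broadly
    # lining up with the depth and breadth responses in the same area of practice.
    print(df["data_literacy_confidence"].corr(df["data_literacy_depth"]))
    print(df["data_literacy_confidence"].corr(df["data_literacy_breadth"]))

    # Weight the confidence item lower when combining into one total, to reflect
    # that self-assessed confidence is the least reliable of the three responses.
    df["data_literacy_total"] = (
        0.5 * df["data_literacy_confidence"]
        + df["data_literacy_depth"]
        + df["data_literacy_breadth"]
    )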

In feedback, quite a number of users mentioned the tone of these questions positively. However, some felt that they were too ‘subjective’, or ‘vague’. We have tried to deal with this in the update by focusing some questions more tightly on specific practices within the overall area we are looking at. So for example in the generic staff set, under ‘digital creativity’ we ask: ‘How confident are you creating digital content e.g. video, audio, animations, graphics, web pages?’ In the teaching set, under ‘learning resources’, we ask ‘How confident are you about using digital resources within the rules of copyright?‘ We have to find a practice that is generic enough to be available to staff in a wide variety of different roles, but specific enough for the response to feel rooted in a real-world activity.

We have had internal discussions about whether to move the confidence questions to the end of each set, or to remove them altogether. For now they stay where they are.

 

‘Depth’ questions

These questions are the most difficult to write and currently the most troublesome to end users. There are some ongoing issues with how they are presented on screen, and we are looking into whether any improvements are possible, but for now we have reworded the questions to make the steps to answer them as clear as we can.

These questions offer a short situation or example. Users select the one response that best matches what they would do or what expertise they have. The lay-out of the question reflects the progression logic: the first option reflects the lowest level of judgement or expertise, and the fourth option reflects the highest. There is no trickery here. We describe how progressively more expert practitioners think or act, and ask users to report where they sit on that scale. (At the moment, the visual cues do not make clear that it is a scale, or that higher levels of judgement encompass and include the lower ones.)

 

Beyond the difficulties some users had in ‘reading’ the answer logic for these questions, it is clear that we have to get the progression logic right in each case. When people disagree with our judgement about what is ‘more expert’, they don’t like these questions. When they agree, they say they are ‘nuanced’, ‘thoughtful’, and ‘made me think‘. We know that our users expect us to reflect issues of judgement and discrimination (‘how well is digital technology being used?’) at least as much as extent of use (‘how many different digital tools?’). So we know these questions have to be in there. They have to reflect important issues of digital thinking or mindset, and we have to get them right – in a very small number of words!

Our recent updates aim to clarify the focus on judgement and experience rather than extent of use. And we have added modifiers such as ‘when appropriate’ or ‘if appropriate for your learners’ (teaching staff) to emphasise that we don’t believe technology is always the answer – but good judgement about technology is. This creates more words on the screen, which will put off some users, but we want our champions to feel that our words represent thoughtful practice and not a shallow checklist of skills.

‘Breadth’ questions

These are in many ways the most unproblematic. They offer a range of digital activities that staff may do already, may want to do, or may not even have thought about. As before, we try to clarify that we don’t think digital practices are always the best, but we do want people to extend their repertoire so they have more experience of what does (and doesn’t) work. We try to use wording that values skills users have, even if they can’t use them currently due to their role or organisational context. We have tried to avoid very role-specific activities, but not to preclude the possibility that people might develop some professionally-relevant skills in their personal lives, or take on tasks from ‘other’ roles that they enjoy. We include fairly basic activities that many users will be able to select, and quite advanced activities that offer something to aspire to. The ‘nudge’ information is obvious: think about doing some of these things if you don’t or can’t already.

 

What next?

We are always interested in your views on the questions and other content. The user feedback forms will remain live until the end of the pilot project and we expect to make some further updates to content at that point. If you are an institutional lead, you will shortly have an opportunity to give us feedback via your own detailed evaluation survey.

14th UK Learning Analytics Network meeting, University of Gloucestershire, Cheltenham 12 June 2018

Originally posted on Effective Learning Analytics.

Jisc’s next learning analytics network meeting is in Cheltenham at the University of Gloucestershire on 12th June 2018. These popular events comprise a range of presentations and discussion sessions – and an opportunity to network with colleagues involved in learning analytics projects at other institutions.


The theme for this meeting is “Beyond Retention”. We will see some examples of new data sources around student feedback, explore ideas for supporting student success and improving teaching, and get some suppliers’ views on interventions. Consider the following and come prepared to share your thoughts.

How can we enhance students’ performance through analytics? What are the key indicators of successful students? What information should we gather? What parameters can we use, and what is ethical? How can we capture the uncapturable, the unmeasurable learning that goes on in the café, the corridor and in halls? How can we reward good behaviour, and can we gamify these systems? As humans we respond well to tasks; can we adapt our systems to incorporate these tasks?

There is a growing debate about interventions and how these can be applied within the Learning Analytics context. How far should interventions go, and how measurable are they? Should interventions be automated or manual? Can recording interventions be effective, and what format should they take? Students are barraged with notifications, alerts and messages from a host of apps and social media. Will interventions and alerts just contribute to that noise?

The draft agenda is below. Please register early to ensure a place at the event.

Agenda

10:00 – 16:00, Tuesday 12 June 2018
Hosted by the University of Gloucestershire

University of Gloucestershire, The Park, Cheltenham, GL50 2RH

Registration form   Transport & parking     Campus Maps

09:30 – 10:15 Arrival and coffee
10:15 – 10:45 Arrangements for the day & welcome to the University of Gloucestershire
10:45 – 11:15 Update on Jisc effective learning analytics project  Michael Webb, Rob Wyn Jones, Jisc
11:15 – 12:00 Unitu – engaging with realtime student feedback Anish Bagga, CEO and Founder of Unitu
12:00 – 12:15 Coffee
12:15 – 13:00 Using student questionnaires in analytics – HEFCE catalyst A project
Christine Couper, University of Greenwich
13:00 – 14:00 Lunch
14:00 – 15:00  Interventions: how, when and why (Panel Session)   Invited experts, chaired by Steve Hoole, Jisc
15:00 – 15:15 Coffee
15:15 – 15:55 Beyond Retention Workshop Sarah Davies, Jisc
15:55 – 16:00 Farewell

 

Why does no one care about my digital strategy?

Originally posted on e-Learning Stuff.


So have you ever been tasked with writing a digital strategy? Do you know where to start? Do you know what is going to ensure it will work and be successful?

So if you are tasked with writing a digital strategy, you could write it in isolation, but prepare for it to be a low priority for people higher up. Also expect people in other directorates or departments to ignore it as they focus on their own strategies.

Jisc have recently published a leadership briefing written by Lawrie Phipps and me. A key aspect is aimed at those tasked with writing strategies, where we argue that in order to get stronger “buy-in” there is a need to apply a digital lens to all strategies.

Jisc Senior leaders’ briefing paper

The paper proposes the concept of using a digital lens when approaching strategy, practice and process. The lens is made up of different aspects that need to be considered when applying digital to existing and intended structures.

digital lens

It is necessary to identify which element will be looked at in digital contexts – for example, a particular teaching practice. Different digital options should then be explored to gain a thorough understanding of the range of possibilities. The benefits and risks of each possibility should be carefully weighed before deciding to deploy. As with all change, it is important to reflect and evaluate the nature and impact of the changes caused by the incorporation of digital.

There is a history of people talking about applying a lens to things, to look at them differently and to give a different perspective on what has been written or talked about. These are sometimes called strategic lenses and can cover different areas such as design, customer focus, resources and culture, amongst others.


In this blog post I want to reflect on my own experiences in designing, developing and writing my own digital strategies. My initial experience was frustrating: a strategy that took a lot of my time was then ignored, or certainly felt like it was ignored. It was almost a tick-box exercise, and the end result was that the strategy was put into a lever arch file and left on a shelf until the following year, when it would be reviewed, revised and published again.

As a TEL Manager in a college I was asked to deliver a digital learning strategy, although back then it was called the Information and Learning Technology (ILT) strategy. Historically it had come about because funding from Becta was given to colleges on the basis of their writing an ILT strategy. This was often distinct from the IT strategy. The IT strategy was usually focused on the technical infrastructure to support the college business, whereas the ILT strategy was focused on embedding technology into teaching and learning. What often happened, though, was that the two strategies weren’t linked to each other and weren’t always linked to the corporate strategy, or if they were, those linkages weren’t always clear.

The end result was that sometimes these strategies were at odds with each other. You had an ILT strategy advocating a student BYOD policy while the IT strategy was clear that non-organisation devices could not be connected to the wireless network.


It wasn’t just the IT strategy: I am aware of heated discussions between managers where the ILT strategy advocated a student BYOD policy and the Estates strategy was clear that non-organisation devices could not be plugged into the power sockets.

On top of all this was the core corporate strategy that was focused on something completely different.

I remember my ILT strategy talking about the use of the VLE by students and stating that all courses would have a presence on the VLE. Sounds fine, but academic staff didn’t see that as a priority, because the corporate strategy was talking about widening participation, improving teaching and learning, and better student outcomes. Staff saw the improvement of teaching and learning as a priority; they saw using the VLE as something extra, more work, so they a) didn’t use it or b) would often say they didn’t have the time (which we now know means they didn’t consider it a priority). So a lot of my time was taken up “selling” the use of edtech. What I didn’t realise at the time was that what I was often doing was applying a digital lens to the existing strategy in order to “sell” the VLE or other edtech to academics. I would talk about how the VLE would enable them to “improve” teaching and learning, or could be used to “widen participation”. I started to realise that a strategy focused on tools was never going to be successful; one which focused on outcomes would be more easily understood by managers and staff, and more easily achieved.

In a later role I had to write a combined IT, Libraries and Learning Technology strategy. We were being supported by an external consultant and I do remember one of the key things she said was that anything in our departmental strategies had to stem from the core corporate strategy.

A typical IT strategy will often say something like this:

Enable a secure, robust and stable network.

What this does is focus the minds of the IT and network teams to ensure that the network has high resilience, low downtime and is secure. As a result when academics and learning technologists want to try something new, they are “refused” because it could impact on the security and reliability of the network. Over the years I remember many times being told we couldn’t use this tool, access this service, because of the “importance” of enabling a secure, robust and stable network.

The problem was that the corporate strategy said:

We will develop and deliver high quality teaching and learning, across a wide range of subjects and qualifications.

This meant developing new ways of teaching and improving learning. Academics wanted to try new and innovative practices, but the IT strategy was acting as a barrier.

If the IT strategy was linked to the corporate strategy and said:

Enable a secure, robust and stable network to allow high quality teaching and learning through the use of technology.

What this does is focus the minds of the IT and network teams to ensure that the primary focus and use of the network is on allowing innovative use of technology for teaching and learning. Yes they still need to ensure that the network is secure, resilient and stable, but their primary focus will be on ensuring that teaching and learning can make effective use of technology.

Any departmental or methodology strategy should always link back to the organisational strategy and how the objectives and actions will support the organisational strategic aims.

So how do you do that then?

Well that’s where the lens comes in.

So if you are tasked with writing a digital strategy, you could write it in isolation, but prepare for it to be a low priority for people.

If you apply a digital lens to the corporate strategy, you can demonstrate how digital technologies can enable that strategy. So rather than talk about how you are going to increase the use of digital technologies, the strategy talks about how the use of digital technologies will enable the strategic aims.

The leadership briefing we published provides a mechanism for doing just that. The next stage will be to distil the strategy into an operational plan; again, applying a digital lens will demonstrate how digital technologies can be an enabler and not a barrier.