Designing for the digital curriculum

Originally posted on Inspiring learning.

In May, Chris Thomson and I had the pleasure of working with staff at Queen's University Belfast. We delivered a series of workshops to explore what the student of the future might look like, how we can support them effectively and what we need to consider when designing for the digital curriculum.

The workshops built on much of the previous work Jisc has carried out in this area, but the activities themselves were tailored to fit Queen’s particular context. Having said that, many of the activities we did are equally relevant to other universities and colleges too.

Here’s a snapshot of our experiences.

What does the student of the future look like?

Future-gazing about trends in education and technology is challenging, but it's also a lot of fun! It's fun because it gives your imagination and creativity free rein to explore the unknown.

Articulating what our students of the future may look like is also a worthwhile activity. It helps at the curriculum design stage when building in meaningful digital aspects that nurture students and prepare them for when they leave their course.

We used Jisc's learner profile as a starting point to help inform our thinking, which led us on to a great activity created by Helen Beetham. This involved posing a hypothetical situation of a student from the future contacting their former lecturer, due to a glitch in the space-time continuum. Yes, of course it couldn't really happen (except perhaps in episodes of Star Trek), but it does get staff thinking about what the future holds for learners.

We did this on the day by making use of discussion forums on the university's Virtual Learning Environment (VLE). This was largely because we wanted to use the systems that staff themselves would be using with their own students, but it also allowed for fruitful discussions about how we build in collaboration activities for students.

One of the key points I took away from this was the need to scaffold digital activities appropriately beforehand, especially ones that involve online collaboration and discussion. Students need to be clear on what is expected from them and understand how they will be supported if they need help. It is perhaps all too easy to assume that students are familiar with online networking and that they’re already proficient with it. This may be true in some cases, but not all. What if, for example, the student is from outside of the UK and English isn’t their first language? Or if they have previously experienced trolling online due to their race, religion or sexual orientation?

We also did a fun activity where we got staff to draw their student of the future. This provided a visual way for staff to articulate how they perceive their students and produced some very impressive drawings on the fly!

Artistic impressions of the "student of the future."

Interestingly, the key themes that arose in the drawings included barriers to engagement, online language, digital values, balance between life and work/study, and stability and sustainability for our students, as well as the threats and challenges ahead.

This activity challenged many of the assumptions we sometimes have about our students and emphasised the need to have those conversations with them. Jisc has done a lot of work in this area, and we are thrilled that the Student Digital Experience Tracker service, which helps with this particular challenge, will become available later this year.

Discovering your own digital capabilities

Meeting student expectations when it comes to designing for the digital curriculum also requires an understanding of your own digital capabilities.

Queen's University, along with many other universities and colleges across the UK, is currently piloting Jisc's discovery tool. This tool encourages individuals to reflect on and develop their own digital capabilities. It also provides a summary of their self-assessment in response to nuanced question prompts, as well as suggestions for further development with links to relevant, interactive resources.

After staff had run through the tool, we asked them to plot their individual digital capabilities using a 'tension pairing' activity. This involved individuals thinking more deeply about which capabilities they need to develop for specific courses and how easy or difficult each would be to master.

I really like this activity because it provides individuals with a sense of perspective. The whole area of digital capability can be a little overwhelming – there are so many tools out there and so many different approaches one could take.

Mapping your digital capabilities in this way helps you to prioritise and identify the quick wins as well as the longer term goals to strive towards.

How do we deliver a digital curriculum?

Having reflected on what our students of the future may look like, we moved on to the more practical question of how we meet and support their needs.

Jisc has produced a number of resources, which we used on the day, to help teaching staff think about where technology might fit in a particular course or module. These can all be seen through the lens of Jisc's digital capability framework, which includes the six elements model.

Incidentally, Jisc is running a public Curriculum Confidence workshop on 18 June in Belfast and again in Manchester on 10 October. There's more information on the Jisc Training web pages.

We rounded the workshop off by spending an hour or so looking at the ABC Curriculum Design resources created by Clive Young and Nataša Perović at UCL. These are openly available and we highly recommend them to anyone working on module design. They are a great way of getting teaching teams together, mapping out the balance of different learning types used during a module and thinking constructively about which elements to blend.

We found that these resources worked well when complemented by the Jisc Curriculum Confidence materials as they can give different perspectives on the same module.

We wish the staff at Queen’s all the very best.

Notes and presentations from the 3rd Digital capabilities community of practice event – 22 May 2018

Originally posted on Jisc Building Digital Capability Blog.

University of Leicester, College Court fountain

Photo credit: Heather Price

The University of Leicester provided a great setting for our third community of practice meeting. With eighty-five delegates participating in person and many more joining in online (using the hashtag #digitalcapability), this was one of our most vibrant and productive meetings to date.

This is a brief summary of the event. Links to slides, recordings and all other outputs from these sessions are available from the Jisc event page.

Dr Ross Parry, Associate Professor and Deputy Pro Vice Chancellor (digital) at the University of Leicester, set the scene for the day with his opening keynote, Digital capabilities as a strategic priority. He talked about the importance of creating a shared vision and offered a number of insights gained from his experience of developing and implementing the University's digital strategy. He said: "You can have all the tech in the world, but it'll make little difference if you don't also have a community with the confidence and fluency to use it in creative and exciting ways".

The three parallel community-led sessions focussed on practical strategies for engaging students, senior leaders and human resource teams. This was an opportunity for participants to share their experiences, discuss with colleagues and identify opportunities for collaboration.

These were followed by the first set of four pecha kucha sessions:

  1. Future facing learning – Paul Durston, Teesside University
  2. Digital Leadership for Students: Development of an online resource – Vikki McGarvey, Learning and information services manager, Staffordshire University Library
  3. Can student-staff partnerships support the development of digital teaching and learning practices? – Alex Patel and Bethany Cox, University of Leicester
  4. Digital Leaders – Integrating digital in York’s leadership programmes – Michelle Blake, head of relationship management, University of York

Kerensa Jennings

The second keynote, How iDEA is developing digital citizens, was delivered by Kerensa Jennings of The Duke of York Inspiring Digital Enterprise Award (iDEA). Kerensa gave an overview of this international programme, which aims to help address the digital skills gap. She explained that all iDEA resources are free to use and are increasingly being taken up by UK FE colleges and other learning providers.

Sarah Knight and Heather Price then gave a brief update from the Jisc digital capability team, comprising four sessions:

  • Digital Discovery tool surgery – Heather Price, Jisc
  • How can we support students with the development of their digital capabilities using the Jisc discovery tool for learners? – Helen Beetham and Sarah Knight, Jisc
  • Mapping of Microsoft resources to the digital capability framework – Shri Footring (Jisc), Nevin Moledina (University of Leicester) and Clare Riley (Microsoft)
  • Building digital capability service site – Clare Killen and Alicja Shah, Jisc

The event closed with the second set of four pecha kucha sessions:

  1. Practising Digitally @ NTU – Elaine Swift, Digital practice manager, Nottingham Trent University
  2. MedEd meets the real world – building capability in HE and NHS workplaces – Cath Fenn, Senior academic technologist, University of Warwick
  3. To infinity and beyond: achieving the University's ambitions through digital capability – Mike Quarrell, Workforce development co-ordinator and Alison Small, Head of registry services and change, University of Derby
  4. Can you escape the digital challenge? – A Pecha Kucha in rhyme about our Digital Escape Room event – Mark Hall, digital learning developer, Bishop Grosseteste University

Overall, I was struck by the sense of energy throughout the day. It was evident in the keynotes, presentations and workshops, as well as in the depth of questions and conversations. Delegates mentioned that they found the keynotes, presentations and the opportunities to network and share ideas particularly valuable.

This is a community-led event and we are really keen to work in partnership to run the next one, due to be held in November 2018. Please get in touch with the team if you might be interested in hosting this event.

Digital discovery tool: please give us your feedback!

Originally posted on Jisc Building Digital Capability Blog.

Over the last few weeks we've been immersed in individual feedback on the experience of using the Digital discovery tool. This has led to some significant revisions to the content and format of the questions for staff, as described in an earlier post. As we are now at the end of the pilot, we'll be able to compare feedback from before and after the changes were made and see whether users find them an improvement. (Remember, you'll still have access to the tool until 13 July.)

Some of the same issues have been recorded by our student users, along with some new ones such as relevance to different subject areas. We'll be reporting back on this feedback shortly, with our planned response. You'll have an opportunity to hear more about individual staff and student responses in our webinar at 12.30 – 14.00 on Tuesday 19th June (links to follow).

Now we are keen to hear about the experience of our lead contacts and how the Discovery tool has been used at organisational level. We have just launched the evaluation form (you will receive a personal email with a link to this). All the questions are optional, to help you focus on those areas where you really have something to say. But of course the more you can tell us, the more we can improve.

In particular, we ask about any evidence you have that use of the Discovery tool has led to change, either for individual users or more generally in the organisation. It's really helpful if you have carried out a focus group or consultation event, and there are resources to help you do this on the evaluation page of this site. There's also a handy reminder here of the evaluation process overall. And Shri's recent blog post covers some of the organisational issues you might be thinking about as you compose your feedback.

There is a whole section of feedback about your experience of using the organisational data dashboard, so it's a good idea to have downloaded your most recent data and thought about how it might be used. See our guide on how to download the data, and our blog post on Making use of your data.

We'd appreciate all organisational responses by 29 June, as we'll be analysing the results shortly after. There'll be an opportunity to hear and discuss our findings at our second webinar on Thursday 19 July, 12.30–14.00.

GDPR and Learning Analytics – Frequently Asked Questions

Originally posted on Effective Learning Analytics.

Universities and colleges are understandably concerned about the European General Data Protection Regulation and its impact on their processes. There is quite a lot of uncertainty about how the new legislation applies to learning analytics initiatives. We believe it is perfectly possible to carry out learning analytics in the interests of students while complying with the new legislation, though careful consideration needs to be given to various issues such as whether and when to ask students for consent.

The following is our attempt to answer some of the main questions that have been cropping up in this area. My thanks to Andrew Cormack, who provided useful, detailed feedback on my initial drafts, much of which was based on his earlier work, and to Paul Bailey for further helpful comments.

Please let me know if any of these issues remain unclear or if you think there are additional important issues relating to GDPR and learning analytics which need to be clarified.

Q1. Should we ask students for their consent to collect their data for learning analytics?

Much of the data used for learning analytics is being collected anyway and may be necessary for providing students’ education or for statistical purposes e.g. date of birth, gender, prior qualifications, modules studied, grades and use of IT facilities.

Often it's not possible to opt out of the collection of such data, so asking for students' consent is not meaningful and would not be acceptable under GDPR.

You must ensure, though, that the collection is justified under one of the lawful bases for processing provided by GDPR, such as it being a legal obligation, in your institution's legitimate interests, or necessary to fulfil your contractual obligations to the student.

If you're collecting data specifically for learning analytics, such as asking students how much time they're spending studying, you must ensure that the students have consented to this.

Also, if you plan to collect Special Category Data (e.g. ethnic origin) you must first obtain the consent of your students.

Q2. Should we ask students for their consent to carry out learning analytics on their data?

The Information Commissioner’s Office is clear that organisations should avoid over-reliance on consent as a justification for data processing and that it’s often better to use a different lawful basis.

Reasons why you should not normally ask students for their consent to carry out learning analytics on their data include:

  1. Using a justification such as “legitimate interests” for the processing of student data provides students with better safeguards than using consent. In this case the institution takes on the burden of ensuring that all such processing is done in ways, and subject to policies, that minimise the risk to individual students.
  2. If consent is requested, you freeze the activities to which it can apply: new types of analysis cannot be added if they were not envisaged at the time consent was obtained.
  3. Enabling students to opt out of data collection may create ‘holes’ in the data set which reduce the effectiveness of learning analytics and disadvantage students overall.

There are, however, two exceptions where students' consent must be obtained (summarised in the short sketch after this list):

  1. Where Special Category Data is used
  2. When you decide to take interventions with individual students on the basis of their analytics.
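
To make the rule concrete, here is a minimal sketch of that decision in Python. It is an illustration only, with hypothetical names, not a prescribed implementation: consent is the right basis only in the two exceptional cases, and everything else should rest on another lawful basis.

    def consent_required(uses_special_category_data: bool,
                         is_personal_intervention: bool) -> bool:
        """Encode the two exceptions above: consent must be obtained for
        Special Category Data and for interventions with individual students.
        All other learning analytics processing should rely on another
        lawful basis (e.g. legitimate interests) rather than consent."""
        return uses_special_category_data or is_personal_intervention

For example, routine analysis of VLE activity under legitimate interests needs no consent (the function returns False), while phoning a flagged student to offer an extra class does (it returns True).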

Q3. Should we ask students for their consent to take personal interventions with them as a result of learning analytics?

The analytics may, for example, suggest that a student is at academic risk. Initial contact with the student can be justified under the legitimate interests of the institution; however, when you intend to carry out an intervention on the basis of this data then, yes, you do need to request the student's consent. Examples might include:

  • An email to the student (justified under legitimate interests) which offers an informed choice of whether to attend an extra class (the intervention to which they would need to give their consent)
  • Discussion in a routine tutorial meeting (legitimate interests), suggesting that an extra class could be helpful (consent).

Q4. When should consent be obtained from students to collect and use their data?

We recommend that consent is sought at the point where Special Category Data is collected or at the point where an intervention is offered. Requesting consent from students to collect and use their data for learning analytics on their first day of study as part of registration processes is unlikely to be legally valid and could result in incomplete, and hence less useful, data sets.

Q5. How should we ask students for their consent?

The requirements of GDPR for requesting consent include:

  1. Consent requests should be kept separate from other terms and conditions.
  2. Clear and specific information must be given to the students about what they are consenting to.
  3. Students should be informed of any third party data controllers who will rely on the consent.
  4. The consequences of either providing or withholding their consent must be made clear.
  5. Clear, affirmative action is required by the student; the use of pre-ticked boxes would not be acceptable.
  6. Students have the right to withdraw their consent at any time, so mechanisms must be put in place to enable them to do so easily – with the consequent removal of their Special Category Data from all databases or withholding of any interventions.
  7. You should keep records of any granting, withholding or withdrawal of consent by students (see the sketch below).
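
As one concrete, purely hypothetical way to satisfy points 6 and 7, an institution might keep an append-only log of consent events, so that a later withdrawal always overrides an earlier grant. A minimal Python sketch, assuming a simple in-memory store:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from enum import Enum
    from typing import List, Optional

    class ConsentAction(Enum):
        GRANTED = "granted"
        WITHHELD = "withheld"
        WITHDRAWN = "withdrawn"

    @dataclass
    class ConsentEvent:
        """One auditable consent decision by a student."""
        student_id: str
        purpose: str              # e.g. "special_category_data" or "intervention:extra_class"
        action: ConsentAction
        recorded_at: datetime

    @dataclass
    class ConsentLog:
        """Append-only record of grants, refusals and withdrawals (point 7)."""
        events: List[ConsentEvent] = field(default_factory=list)

        def record(self, student_id: str, purpose: str, action: ConsentAction) -> None:
            self.events.append(
                ConsentEvent(student_id, purpose, action, datetime.now(timezone.utc)))

        def has_consent(self, student_id: str, purpose: str) -> bool:
            """Consent holds only if the most recent event for this student and
            purpose is GRANTED, so withdrawal (point 6) takes effect immediately."""
            latest: Optional[ConsentEvent] = None
            for event in self.events:
                if event.student_id == student_id and event.purpose == purpose:
                    latest = event
            return latest is not None and latest.action is ConsentAction.GRANTED

A real implementation would persist the log to a database and trigger the removal of Special Category Data on withdrawal; the point here is simply that every grant, refusal and withdrawal is time-stamped and auditable.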

Q6. What happens if a student doesn’t consent to the collection or use of their data for learning analytics?

Students should normally only be asked for consent if Special Category Data is collected or used or if interventions are to be taken with them (e.g. being phoned by a tutor).

If the student refuses to provide Special Category Data, then it clearly can't be collected or processed in any way; if they refuse the particular interventions offered, then they should not be subject to them.

Q7. What should we tell students?

Complete transparency about the processes of learning analytics, the data used etc. is important to ensure legal compliance as well as acceptance by staff and students. We’ve produced a model student guide to learning analytics and a model institutional learning analytics policy which can be adapted for institutional use and discussed in relevant committees with student representation.

You should provide additional information to students when inviting them to provide Special Category Data or when seeking their consent to carry out interventions based on learning analytics. See the response to Q5, 'How should we ask students for their consent?', for suggestions on what additional information to provide.

Q8. Is learning analytics automated decision making as defined under the GDPR?

The GDPR protects individuals from solely automated decision making (i.e. ones without any human involvement) that has legal or similarly significant effects on them. Most instances of learning analytics would not fall into this category. Significant decisions should be taken by humans, who could potentially use learning analytics to help inform the decisions.

Q9. Is learning analytics profiling as defined under the GDPR?

Profiling is defined as:

"any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements"

Many applications of learning analytics, including the identification of at-risk students, would come under this definition. Profiling is considered a type of automated decision-making so, if it has legal or similarly significant consequences, it is regulated in the same way as automated decision making.

Ensuring that humans are involved in decisions with significant consequences for individual students will help to ensure compliance with GDPR.
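
To illustrate that last point in code (hypothetical names again, not a prescribed design), a workflow can separate what the analytics suggests from what a named human decides:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class RiskFlag:
        """Output of a (hypothetical) learning analytics model."""
        student_id: str
        risk_score: float        # e.g. predicted probability of non-completion
        explanation: str         # which signals drove the score

    @dataclass
    class Intervention:
        student_id: str
        action: str              # e.g. "offer extra class"
        approved_by: str         # the named human decision-maker

    def propose_intervention(flag: RiskFlag, threshold: float = 0.7) -> Optional[str]:
        """Analytics may only *suggest* an action for a human to review."""
        return "offer extra class" if flag.risk_score >= threshold else None

    def approve_intervention(flag: RiskFlag, action: str, reviewer: str) -> Intervention:
        """The significant decision is taken by a named human, keeping the
        process out of GDPR's 'solely automated decision-making' category."""
        if not reviewer:
            raise ValueError("a human reviewer must take this decision")
        return Intervention(flag.student_id, action, approved_by=reviewer)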

Q10. Lecturers and tutors have always known about students who are at academic risk and been able to intervene: isn't learning analytics just supplementing what we did anyway, so why should we have to do things differently?

The GDPR is concerned with personal data. If a student’s data is used to inform interventions then this does require the student’s consent, even if the data is basically confirming what the lecturer or tutor already knows.

More support for tracker reporting – and your feedback please!

Originally posted on Jisc Digital Student.

Feedback on Tabetha's PowerPoint template and Excel file has been overwhelmingly positive – thank you to everyone who has got back to us to say how useful these have been. You can download both files from our previous blog post. (The Excel spreadsheet now has a minor update to improve the formulae – the links remain the same.)

As we’re obviously onto a winner with resources to support reporting, Tabetha has also designed two posters you can customise for reporting your key messages around your university or college. You can download the posters as one customisable .pdf file here.

We also want to offer you another resource in draft, and discuss the possibility of a further toolkit – still at the concept stage. We’d like your feedback on both of these.

Digital student experience benchmarking tool

This DRAFT benchmarking tool enables students to audit their own digital experience. This was first developed by Helen back in 2015 with the support of the NUS, student change agents, and the Student Engagement Partnership (you can download the original here). It has been used by student unions and guilds to initiate conversations about their digital experience.

Working with students on the benchmarking tool

However, the existing tool really needed an update to reflect what we have learned in piloting the tracker, and to tie in more closely with the questions we are asking. We’re working with student organisations to refine this new version and make sure it meets their needs. We’d also like to ask our tracker pilot institutions for feedback. Is this a useful resource? How might it be used in practice?

Sarah looking at digital student posters

Some ideas we have are:

  • Student focus groups could identify the areas where they want to see progress: these can now easily be mapped to findings from the tracker survey.
  • Students could work from this resource to produce their own ‘blueprint’ for what they’d like to see happen, whether or not the tracker survey is run.
  • Students and staff working together on the benchmark might decide they need to run the student tracker, to canvass student views more widely.
  • Tracker leads can map findings from the student survey onto this tool to see what ‘next steps’ could be planned: these could be communicated with students in feeding back the tracker findings (e.g. in a ‘you said, we did’ style).

Digital toolkit

For some time we’ve been wondering how Jisc could support organisations to build a ‘digital toolkit’ for arriving students. Organisations are different and it’s not for Jisc to prescribe what students need to succeed in different places. However, we know this is an area where a bit of support can really make a difference.

One option is simply to provide a checklist of the information and materials that incoming students find useful. This would relate to the systems you have in use, devices that can be used on the network (and how to connect them), your local learning and teaching practices with technology, and how students can best prepare to study digitally. You could use this to ensure your pre-arrival resources are up to scratch.

Another option is to provide a template (perhaps web based?) into which this information could be embedded.

Either way, your tracker findings could allow you to offer the ‘voice of current students’. For example you could include:

  • common digital learning activities, e.g. online research, online discussion, using polling devices in lectures
  • how current students find digital learning to be of benefit, e.g. making them more independent in their study habits
  • what digital apps and resources might be useful, based on free text responses
  • assistive and adaptive technologies other students find useful, again based on free text responses
  • helpful quotes from current students – perhaps from free-text responses or focus groups.

Your feedback

We need your feedback on these resources and ideas. You are welcome to use the Jiscmail list for any comments. We will also be launching an evaluation survey (on 4th June) to find out about the experience of running the tracker. This will include questions about the resources we have produced, or might produce, to support you in analysing, reporting and responding to tracker findings.

Please do tell us your views – your input drives the decisions we make and the direction the tracker project goes next.

“Accessible” is not the same as Inclusion

Originally posted on lawrie : converged.

Do we consider all learners when embedding digital into teaching, learning and assessment? If they don't have the newest device, good connectivity or unlimited data, can they still participate, engage and learn? What if they have a disability? When practitioners start to use innovative practices and technologies, is everyone included?

It's not often that I talk about accessibility, inclusion and technology, and it's been even less frequent over the last 10 years that I have written about it. But between 2001 and 2006 I wrote over 20 papers, chapters and other miscellaneous publications covering a broad range of issues around inclusion and technology. It was a formative time for me: I had been tasked with building a new service that would have a positive impact on disabled students' educational experience. So TechDis was born (with huge thanks and support to Rich Townend and Mike Adams).

Every so often I find myself looking back and wondering how much of an impact we had. Initially I think we did: we worked hard at moving the agenda forward using a social model of disability, and we also worked hard to disabuse the sector of the notion that "accessibility standards" could create an inclusive environment – when you set a minimum standard, that is what people achieve.

A lot of what I brought to that role was based on my own experiences. At the time I dropped out of school I had no qualifications. My school years were unpleasant; I was constantly terrified of looking stupid in front of the other kids. I was an "undiagnosed" dyslexic, and anyone with dyslexia will know those feelings well. I was in my late 20s when I entered education again. I had lots of support from family and friends, although even at graduation I still had not been identified as dyslexic (I didn't even understand what it was).

Most recently I have been involved in the Next Generation Digital Learning Environments work at Jisc. Some of the emerging trends have been especially interesting: we are seeing lecturers gain access to a range of tools outside institutional control – tools that allow them to build their own apps, for example – and alongside these new tools and apps they have been adapting their practices.

This is what brings me back to my own experience, and my experience at TechDis. Most tools and commercial apps have inbuilt features that support disabled students; legislation across Europe and the Global North embeds the principles of ensuring accessibility, and by and large this has been of benefit. But recently I have come into contact with lecturers who have been doing great things with the plethora of apps and tools, shaking up their teaching and engaging students.

It all feels great.

But I started thinking about some of the answers to questions I have been asking.

“How many of your students engage with the tools?”

    “About 90%”

“Are the tools accessible?”

    “Yeah, the students love them, they access on their smartphones”

The questions go on.

One lecturer has an app that is a quiz tool. The app times the students, and the students that finish fastest, with the right answers, win prizes.

Is it Inclusive?

Reading textbooks with complex equations and diagrams is hard for me; it takes me a little longer than it takes some people. I am not going to win a prize. And I only have mild dyslexia.

The apps, the tools, the websites and the e-books are all getting more accessible. They are flexible and adaptable. Technology is helping me: I can easily change fonts and backgrounds, and as I get older I find myself increasing the size of the text! Technology can help disabled students; but it doesn't mean your teaching practice is inclusive.

I felt stupid at school, while the smart kids were lauded, handed prizes. I thought we’d left those days behind. But the technology, the access to technology, and the way the technology may be being used is again raising those issues.

“How many of your students engage with the tool?”

“About 90%”

If you don't have the best device or the fastest data, and if the tools are used to reward the smartest and most engaged students at the expense of the 10% that are not engaged, then what are we doing as educators?

I am hoping that some of the practices that are stratifying students, creating a divide, are exceptions – transitional approaches. But in a race for recognition for innovation, where new and shiny is rewarded, I think we may be failing many students. Hearing about these practices so many years after I graduated has shaken me; a practice in use at university that would have rocked my confidence and hurt me academically has really brought home to me the importance of inclusion, not just accessibility.

#CAN 18 Resources – Presentations and Blog Posts

Originally posted on Change Agents' Network.

We are gradually receiving the presentations from CAN 2018 and they are being added to the Resources page.

Along with the resources, there is a Storify, and we've just received a link to a comprehensive blog post by Brad Forsyth and Jake Forecast, who presented at CAN 2017 as students from Epping Forest College. This year's blog post gives a wonderful overview of the range of events and topics discussed during the two-day conference at the University of Winchester.

VLEs – Value, Learning and Engagement?

Originally posted on Inspiring learning.

One area that the Student Experience team have been looking at is the role of the Virtual Learning Environment (VLE) in universities, colleges and the skills sector. Many organisations are looking at their current VLE with an eye to the future and assessing where they are with its use as a digital platform.

Students accessing their VLE

Value is a very useful word when assessing a VLE and needs to be viewed from a variety of perspectives. This poses a number of questions to consider, such as:

  • Do staff feel it is worth their time to upload or create content?
  • Why should staff invest time to learn new skills in creating quizzes or other interactive resources?
  • Do students value the VLE as a support to their studies? Does the VLE enhance learning and offer a ‘valued’ experience?
  • Does the organisation as a whole value having an online environment for their learning content?

Many organisations take pride in their buildings and surrounding areas.  Life on campus is aimed at providing a good student experience.   We would encourage organisations to take the same pride in their digital environments and apply the same principles and vision.

In our pilots so far, we have found quite a distinct difference in that 'value' between institutions. If you would like to know more about how Jisc can help you review your VLE when the service goes live, you can contact us via our consultancy page: https://www.jisc.ac.uk/consultancy.

We have found that while most students do value 24/7 access to learning material, especially for revision or missed sessions, they do have an expectation of what a system can provide.  Social media presents non-stop fresh material and engaging content.  Responsive platforms and notifications come as standard.  How does a VLE compete for attention?

Students use different devices to access the VLE

A lot of the 'value' will come from the user experience: navigation, the ability to use it on mobile devices, speed and so on. But we have also heard that content is very relevant.

While students have been unanimous in their appreciation of an online connection to their studies, some staff have shown less enthusiasm about the role the VLE currently plays. All the main platforms on the market are capable systems, but falling back to uploading documents is the preferred option – documents that either already exist or can be created in familiar applications. So a level of confidence and capability are key factors in engaging staff in creating content to enhance learning further.

It is obvious that many of the staff we have spoken to understand what a VLE 'could' achieve, but the common reason given for not engaging with it is 'time', or a lack thereof. This brings the 'value' element to the attention of senior management. How important is it to senior management to provide the training and time for staff to gain skills and produce a more engaging and enhanced experience?

One thing we feel is an important output from the review is not just answers, but also the right questions an organisation should ask when thinking about vision and strategy, and when looking at their VLE and how it meets their expectations now and in the future.

Three emerging insights from the Digital discovery pilot

Originally posted on Jisc Innovation in Further Education and Skills.

Co-authored by Clare Killen

Over one hundred universities, colleges and other providers are piloting the Jisc Digital discovery tool in the UK and overseas. The design of this tool encourages individuals to reflect on and develop their digital capabilities. It provides a summary of their self-assessment in response to nuanced question prompts as well as suggestions for further development with links to relevant, interactive resources. Whilst it is very much a personal tool, additional features allow institutional leads tasked with supporting digital capabilities development to gain insights from anonymised data and translate them into the institutional context.

Jisc team members have visited several pilot institutions to support the implementation process. In doing so, and through our in-depth conversations, we have learned about what works, at a practical level, when it comes to providing opportunities to develop the digital capabilities of staff and students in the various organisations. Further insights have emerged from conferences, events and meetings featuring presentations from our pilots, for example, the Student Experience Experts meeting and the Digital capabilities session at Digifest18.

As the roll-out gathers pace, we are starting to gain some fascinating insights into how institutions are using the opportunities offered by this tool to bring about positive change in their organisations. There are some clear themes emerging around what organisations that are benefiting from the Digital discovery process typically have in place:

1. Clear strategic vision

We are seeing that a clear message about the importance of digital technologies, communicated and understood by everyone, provides a meaningful context for the use of the discovery tool.

“It is important to have a clear strategy and people need to know that digital is part of the strategy and part of what they do. You need to engage people in it, allow them to see how it affects them and why it is important to them. It needs to be exciting, so for example, we have run several big events that inspire and excite people around the idea of using technology to support teaching and learning and the college business.”
Penny Langford, head of e-learning, Milton Keynes College

2. Culture

Having a safe space in which teams can explore their thinking about their own priorities for development creates an environment in which individuals can thrive.

“The individual reports which each member of my team had, generated discussions and comparisons, with staff considering their different roles and how that has had an impact upon their individual percentage. More than that though, it made them consider how they might acquire skills where they didn’t score as highly. I have eLearning Technologists and Librarians in my team and each had different scores, particularly the Information Literacy category. Which prompted all manner of discussion around the fake news agenda and critically evaluating information sources.”
Sarah Crossland, academic services manager, Doncaster College and University Centre

3. Connections

Organisations benefit from establishing connections between individuals' self-identified aims, the overall picture for all staff, and the resources available to support professional development to meet organisational strategic aims.

"We wanted to identify gaps in staff confidence in their digital skills and use this information to target staff training and support. We looked at other products but there was nothing really out there to meet those requirements. We were looking for a standardised tool and wanted something to self-motivate staff. The approach taken by the Digital discovery tool supports that."
Joseph Pilgrim, digital learning co-ordinator, ACT Training

Digital capability community of practice

The next digital capability community of practice event is being hosted in partnership with the University of Leicester on 22 May 2018. This provides an opportunity to learn about related initiatives and hear more from the wider community, including many members who are taking part in the pilot of the Digital discovery tool.

While registration for this event has now closed, the keynote sessions will be live streamed. Follow the hashtag #digitalcapability on the day; presentations and any outputs will be available from the event page.

There is still time to engage staff

If you are part of the pilot, you still have time to engage staff, perhaps through end-of-term staff development events. Remember that feedback is required by the end of May, but the Digital discovery tool will continue to be available until 13 July 2018.