Learning Analytics Adoption and Implementation Trends

Originally posted on Effective Learning Analytics.

This is a guest blog by Lindsay Pineda, who is currently a Senior Implementation Consultant for Unicon and has a rich background in learning/predictive analytics. In her previous position, she helped develop, implement, and execute a proprietary predictive modeling technology that has proven successful in predicting student course persistence on a week-to-week basis. Lindsay has immersed herself in learning/predictive analytics research, practical applications, and implementation. Since coming to Unicon, she has been working with institutions to provide Learning Analytics solutions, both technical and non-technical, with a focus on visiting institutions onsite to provide Readiness Assessments. She helps institutions work through issues of change management, resourcing, challenges, and concerns relating to the adoption of Learning Analytics.

Identifying Organizational and Technical Patterns

Key Takeaways

  • What are the top learning analytics challenges among institutions in the UK?
  • What organizational and technical considerations need to be addressed to pave the way for a successful learning analytics initiative?
  • When it comes to learning analytics, are institutions ready?

Over the past year, I had the pleasure of traveling throughout the UK with a colleague, Patrick Lynch of the University of Hull, jointly conducting Readiness Assessments as part of Jisc’s learning analytics project (see “Useful Reading” section below for more information). The institutions varied in student population and demographic representation, and were located across the UK. Institutions included both HE (higher education) and FE (further education), with private and public institutions represented.

Yet for all of their diversity, the institutions shared many similar organizational and technical readiness trends. My hope is that by sharing these trends in the aggregate, institutions interested in learning analytics (whether through Jisc or through another means) will find that they are not alone in their concerns or challenges, organizationally and technologically.

The Readiness Assessment process, outlined below, is designed to be collaborative and conducted onsite with a variety of key stakeholders across the organization. Typically, the onsite visit is three days long and consists of larger-scale meetings involving several departments, smaller-scale meetings including focus groups, and one-on-one meetings with individuals. At one institution, for example, we gathered the key stakeholders to facilitate a discussion using activities designed to help the participants generate productive and collaborative conversations.

Some questions we asked participants were:

  • How do you think learning analytics will impact your daily job activities?
  • What policies, procedures, and practices do you believe will need to adapt or be created to accommodate the adoption of a learning analytics solution within your institution?
  • What ethical considerations are there to using the data to provide guidance to students?

The different processes involved in assessing organisational readiness

We typically spent one day doing larger-scale training/activities and two days meeting with organizational staff. Meeting topics included organizational structure, policies, procedures, and ethical concerns. We met with technical staff to discuss data management practices, integration, and maintenance challenges/concerns. We gave senior-level leadership an opportunity to express their concerns, feedback, and overall goals for the institution. We also conducted focus groups with students (of varying degree levels and academic focus), academic staff (including teachers), and professional staff (including academic advisers, strategy groups, and learning/teaching staff).

After the onsite visit, a fully comprehensive report was delivered back to the institution containing all observations, direct feedback from participants (collected as anonymous quotes, with no names attached), both qualitative and quantitative measures, and recommended next steps.

Institutions provided positive feedback on the Readiness Assessment, reporting it was extremely beneficial in bringing together individuals from different departments throughout the institution. This allowed the departments (e.g., learning and teaching, academics, senior leadership, students, IT, etc.) to speak freely and openly with each other about their concerns; the ways in which they might use the information gained; how one department’s activities can and do affect another; and more. Each institution received a comprehensive report outlining the information gained within the sessions to help illustrate the institution’s current state and the steps needed to get to a place where learning analytics could continue to be explored.

In our discussions, we used the Unicon Readiness Assessment Matrix (closely coordinated with the EDUCAUSE Maturity Index) to assess the readiness of institutions. The matrix rates institutional readiness based on six criteria: Data Management/ Security, Culture, Investment/ Resources, Policies, Technical Infrastructure, and IR (institutional research) Involvement. These criteria provided an outline for the qualitative elements discussed within the comprehensive report and throughout the onsite visits.

Our readiness process also provided a forum that enabled institutions to collaborate on ways to overcome the concerns and challenges that they identified. A summary of the most common challenges and concerns, “real-life” examples, and potential solution suggestions (directly from institutions) will be covered in a series of future articles, beginning in March 2017.

Trends: Organizational

“The institutions discovered discrepancies in how staff and leadership perceived management of institutional policies and practices.”

The following are observed trends regarding organizational aspects such as culture, process, and communication.

  • Level of change management comfort/ willingness
    • Notable challenges arose throughout all visits around the level of comfort with, and willingness to accept, change management, including changes to job roles, additional responsibilities, and current practices
    • Significant variance among leadership members in understanding what would be required in terms of level of effort
    • The bulk of resistance came from academic/teaching staff, who have prescriptive allocations of their time for teaching and advising
  • Organizational support for analytics
    • Staff (academic and university/college staff alike) were particularly concerned with the impact on their current job requirements, roles, and workloads
    • The consistent message from staff at all levels was that a “top down” directive from leadership would be necessary to properly implement learning analytics efforts
  • Organizational infrastructure
    • Most institutions did not currently have the organizational infrastructure to support the implementation and adoption of learning analytics technology
    • Several did not have formalized organizational structures and many did not know what other departmental staff did on a daily basis or how their jobs affected each other
    • Most were very concerned about issues of redundancy, additional workload, and time management
  • Policy/ Practice management
    • Overall, the institutions did have their own policies and practices in place and under active management; however, at most institutions there were great discrepancies in how staff and leadership perceived the management of those policies and practices
    • Several institutions were concerned about issues of micromanagement, current inconsistencies of policies in place, the execution/ accountability of those policies and practices, and whether learning analytics would add to their already overloaded requirements
  • Ease of integration with existing organizational structure
    • There were many concerns expressed about how learning analytics would actually work in practice. All institutions liked the theory and the idea behind it, but were uncertain about what it would mean to actually implement it
    • Most of the organizational structures in place did not support learning analytics implementation. For example, institutions varied widely in their use of personal tutors (some did not have this role at all) and instructors/lecturers, and determining who would be responsible for implementation was a topic of much debate

Organizational Trends Summary

Overall, from an organizational perspective, there was a high level of support for learning analytics. While there were concerns, as expressed above, the learning and teaching, academic staff, and student groups all felt strongly that learning analytics would benefit their institutions, but only if it were implemented correctly and with the full participation of all required groups. High-level buy-in varied from institution to institution, but supportive members of leadership outnumbered unsupportive ones. Very few institutions had organizational challenges and obstacles that could not be overcome with communication, training, and involvement from several departments throughout the institution.

Trends: Technical

The following are observed trends regarding technical aspects such as data, infrastructure, and integration.

  • Demonstrates sufficient learning analytics knowledge
    • Most technical departments throughout the institutions had some knowledge of learning and predictive analytics; however, much of their knowledge and experience was in academic analytics
    • The bulk of the time spent on education about learning analytics was within the technical departments themselves
  • Institutional infrastructure
    • Many of the institutions already had a data warehouse, but it housed limited integrated data from only one or two systems, for example the VLE (virtual learning environment) and the SIS (student information system)
    • All of the institutions still had many “manual processes” (e.g., collecting attendance manually on Excel spreadsheets) that were not being captured or housed in a collective place
    • All of the institutions expressed interest in and a desire to have collective information in one place that was easily accessible (i.e., a “single source of truth”)
  • Data management
    • All institutions were using data, but for many different purposes, and none of these uses were in sync. Most were not aware of which data other departments were using and for what purpose
    • They were all compliant with data protection laws and policies; however, each department appeared to have its own interpretation of those laws and policies
    • They expressed a desire for a unified way of managing data across the entire institution
  • Ease of integration with existing infrastructure
    • With so many different systems and data sets, integration appeared to be a challenge for most institutions. Institutions were using an average of 8-10 different systems, and many more different ways of collecting and interpreting data
    • Many had also previously purchased commercial systems that could pose challenges for integrating learning analytics technology
    • The integration of systems depends highly on both the expertise of the institutions’ technical teams and their ability to comply with the xAPI and UDD (Unified Data Definitions) requirements; a minimal sketch of an xAPI statement follows this list. Due to the many inconsistencies in data collection, much work will be needed in this area for the institutions
  • Maintenance
    • This was a major concern for all technical teams throughout the institutions. None of the institutions had the breadth of skill sets among their staff needed to manage a learning analytics implementation in-house, nor the expertise to maintain the technology afterwards. Data scientists, data engineers, business analysts, quality assurance, and other technical roles were not found at any of the institutions
    • Institutions would be in need of constant support during and after the implementation of the technology
  • Resource support
    • While there was buy-in for the implementation of learning analytics, all institutions had smaller technical teams with concerns related to resource allocations, workload additions, and time management
    • Most also expressed the desire for explicit prioritization of learning analytics from leadership to help them guide and direct current resources and work efforts
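
To make the integration requirement concrete, here is a minimal sketch (in Python) of what recording a learning event as an xAPI statement might look like. The actor/verb/object structure and the X-Experience-API-Version header come from the xAPI specification; the endpoint URL, credentials, and identifiers below are placeholders, not any institution’s actual learning records warehouse.

    import requests

    # Placeholder endpoint and credentials -- a real deployment would use the
    # institution's learning records warehouse and its issued keys.
    LRS_ENDPOINT = "https://lrs.example.ac.uk/data/xAPI/statements"
    AUTH = ("username", "password")

    # A minimal xAPI statement: actor (who), verb (did what), object (to what)
    statement = {
        "actor": {"objectType": "Agent", "mbox": "mailto:student@example.ac.uk"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/attended",
            "display": {"en-GB": "attended"},
        },
        "object": {
            "id": "https://vle.example.ac.uk/course/101/lecture/5",
            "definition": {"name": {"en-GB": "Week 5 lecture"}},
        },
    }

    response = requests.post(
        LRS_ENDPOINT,
        json=statement,
        auth=AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},
    )
    response.raise_for_status()  # the LRS replies with the stored statement id(s)
    print(response.json())

Even a toy example like this shows why unified data definitions matter: every system feeding the warehouse must agree on identifiers for students, verbs, and activities before the records can be joined up.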

Technical Trends Summary

Technical trends centered around the availability, access, and use of data, as well as the staffing needs to deploy and maintain a learning analytics environment. However, very few institutions had technical challenges and obstacles that could not be overcome with communication, training, and involvement across departments throughout the institution.

Conclusion

In the aggregate, our measurement of “readiness” across the institutions we served yielded an average score between “somewhat ready” and “ready”. This indicates that the majority of the institutions visited were reasonably ready to implement some element of a learning analytics technology solution, provided they address the more glaring challenges first. As these institutions move forward, it will be key for them to keep open communication with all of the Readiness Assessment participants throughout the project; without this, there will be a significant decrease in buy-in, enthusiasm, and willingness to accept/adopt change.

The following selection of quotes comes from a few of the institutions we visited and demonstrates the value of the Readiness Assessment approach:

“Just getting the groups together to talk to each other has been a valuable experience. Most have never even met and the discoveries made about how different departments affect each other has been brilliant.”

“We are extremely grateful for the guidance and assistance in aligning the university and encouraging genuine conversation on how we take learning analytics forward.”

“As a senior member of staff, it was valuable to look at this process from both a top-down and bottom-up approach to determine the best way to provide support and leadership around the [learning analytics] initiative.”

Based on client feedback, Unicon now offers a service to help institutions get started with learning analytics that includes similar readiness activities. LA Quick Start provides institutions with an environment to help get them off the ground, which can evolve into a full learning analytics solution at each institution’s own pace. The service includes data integration with the LMS into a Learning Record Warehouse (on-premise or in the client’s cloud environment), and the ability to view student activity through dashboard visualizations. In addition, Unicon includes consulting hours with this service to help with an analytics Readiness Assessment and/or roadmap planning.

LA Quick Start can be used to get conversations started across various groups on campus, which is one of the key factors in getting onto the path towards success with learning analytics. View the LA Quick Start datasheet for more information.

Download the PDF version of this article.

Useful Reading

Jisc, Effective Learning Analytics, 5th UK Learning Analytics Network Event (2016)
Jisc, Learning Analytics Discovery Service (2015)

Case studies: journeys towards digital expertise

Originally posted on Jisc digital capability codesign challenge blog.

New and updated resources in this post:

  • Case studies
  • Summary report: Journeys towards digital capabilities

Digital capability is an agenda for organisations across the sectors of education. But how best to take it forward in your own setting? Time and again we hear that examples from practice are what people need to turn inspiration into action. There is no substitute for learning from people who have tried and succeeded, and although we can’t bring you those people directly, we have done the next best thing. Through interviews with key players and a look at the background evidence for each case we have produced a series of written reports on organisations that are making a difference.

You can explore the full list of case studies from the links above and from the Digital capability project page. Each one starts with a general overview so you can judge how relevant the lessons might be in your own setting. Some of the ideas, though, will be relevant to everybody. We have drawn these together in a summary report, Journeys towards digital capabilities, which lists the lessons learned at the different case study sites. These cover: frameworks and definitions; other strategic approaches; development strategies (personal and curriculum change); motivation and reward for staff; and ideas for working with students.

In the next post we hear from some key change agents whose stories feature in the case studies, and who also presented their experiences at Digifest.

Follow #digitalcapability for updates over the next days and weeks, and beyond.

Digifest: Learning analytics interventions should always be mediated by a human being

Originally posted on Effective Learning Analytics.

I was at Jisc Digifest 2017 last week, involved in several sessions around data and analytics. For me, though, the most interesting session was the Humans vs Machines debate on the motion that “Learning analytics interventions should always be mediated by a human being”.

In case you are new to learning analytics, it is the use of data to support learners, enhance other educational processes, or improve the curriculum, usually involving some sort of data analysis that a normal human would struggle to do. The intervention is the action a human or machine takes in response to the analytics, in the form of an alert message (a text, email, phone call, meeting, etc.), to attempt to support the student.

My two debaters were Richard Palmer from Tribal, for the Machines, and Sheila MacNeill from Glasgow Caledonian University, for the Humans.

Jisc events started the debate early with Learning analytics: ditch the humans, leave it to the machine – a Digifest debate, which led with some arguments from Richard on the reliability of humans and the potential of machine-based learning and interventions. This was quickly countered by a blog post from Sheila, Time for Analytics of the Oppressed? – my starter for 10 for #digifest debate, which argued both the practical and ethical benefits of human interventions.

Some took the news story literally and missed that it was a debate. One quote from Twitter: “I thought this piece was sarcasm. Unfortunately, its not. I couldn’t disagree more. Really?” But this helped get the debate going.

The session was a lively debate (slides: https://www.slideshare.net/JISC/learning-analytics-interventions-should-always-be-mediated-by-a-human-being) and was captured after the event in this short video.

A number of points were raised by the audience. One in particular was the need to give students control over interventions: they should be able to say how interventions are made and whether they are contacted by a human or a machine. This argument reflects the suggestions in an earlier post on consent for learning analytics. I would much rather purchase goods online or interact with a web form than deal with a salesperson or some online support services. Sheila argued it is easier to ignore a machine. Another suggestion was that it may come down to cost of delivery: a machine-based intervention is cheaper than a human-based one.

So what was the outcome? Not what I expected: the voting was so close that I had to count hands to find it was a 49/51 split. It all came down to the word “always”. Richard, in summing up, agreed with all the moral, ethical and human arguments but swayed enough people by saying there would be times when a machine-based intervention is better.

However, the debate has continued, and Sheila posted a further blog post reflecting on the discussions, “I wish I’d said that . . . reflections from #digifest17”, which suggests that the only intervention that will ever work is when a human decides to make a change: either a student changes their behaviour, or a staff member provides support or changes the learning design or environment to assist the learners.

I am sure the debate will continue as we implement learning analytics. Automated decision making has legal implications, and I’m not suggesting that learning analytics should actually make the decisions for us. However, for good or bad, we increasingly trust and rely on machines to make better decisions in our lives.

Amplifying Events through Social Media

Originally posted on Inspiring learning.

How can social media be put to good effect to amplify events?

This week Jisc held its annual Digifest in Birmingham (#Digifest17) and the Subject Specialist team were working hard behind the scenes to ensure the event trended on social media and reached out to those ‘attending’ the event virtually, as well as complementing the activities for those physically present over the two days.

#digifest17

I’m thrilled to report that this year’s #Digifest17 trended third in the UK (just behind the Cheltenham races and Donald Trump’s tax returns!) and experienced in the region of 5,000 tweets under the hashtag!

If you’re looking to amplify your own events with a social media backchannel here are a few pointers to consider that I picked up from #Digifest17.

Preparation, preparation, preparation!

Yes, social media is by its very nature serendipitous, but that doesn’t mean you should leave everything to chance and hope it gives your event the coverage it deserves – there’s a lot you can do in advance of the event to maximise engagement on the day itself.

It’s always advisable to create a hashtag early on in the planning process and use it from day one with all content to do with the event. This not only gives attendees a central source of up-to-date news about the event, it also empowers your potential attendees to do some of that all-important marketing for you!

Make sure you do your research: find out whether speakers have social media accounts they’re happy to include, and gather any links to presentations and so on too. Prepare a few template tweets to fire off on the day with this kind of information and you will save yourself a lot of work later.

Creating social media sharing buttons on any sites you use to provide information about the event also gets the message out there. There are plenty of sites that allow you to do this, such as ClicktoTweet; use them to encourage your speakers and attendees to announce to their followers that they will be attending your event.
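
Under the hood, services like these simply generate a pre-filled tweet link. Here is a rough Python sketch of the same idea; the text, url and hashtags parameters are part of Twitter’s documented web intents, while the event details are invented for illustration.

    from urllib.parse import urlencode

    def tweet_intent_link(text, url=None, hashtags=None):
        """Build a Twitter web-intent URL that opens a pre-filled compose box."""
        params = {"text": text}
        if url:
            params["url"] = url
        if hashtags:
            # Comma-separated list, without the leading '#'
            params["hashtags"] = ",".join(hashtags)
        return "https://twitter.com/intent/tweet?" + urlencode(params)

    # A ready-made announcement link for an event page or email signature
    print(tweet_intent_link(
        "Join me at Digifest in Birmingham on 14-15 March!",
        url="https://www.jisc.ac.uk/digifest",  # illustrative URL
        hashtags=["Digifest17"],
    ))

Anyone clicking the resulting link lands in Twitter’s compose window with the message, link and hashtag already filled in – one less barrier between your attendees and that all-important marketing.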

It’s not just about Twitter …

The Open University have developed a social media toolkit that includes a really useful overview of many social media tools, such as Periscope, Flickr and Storify, that all link well with Twitter and other social media platforms to amplify your events; it’s worth checking out.

On the day of the event, assign dedicated people to manage your social media channels so attendees’ posts are responded to and even incorporated into the feel of the event. There are logistics to consider, like ensuring the mobile devices being used are fully charged, that all sessions at the event have some coverage, and that the people doing the social media amplifying are well placed in workshop rooms to hear everything, take photos and capture videos too. If you have access to widescreen TVs dotted around the event venue, there are plenty of sites that allow you to display the social media buzz in a visually impactful way too.
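
If you’re curious what those display sites are doing behind the scenes, the sketch below pulls recent tweets for a hashtag using Twitter’s v1.1 search API. The bearer token is a placeholder you would obtain by registering an application with Twitter; everything else follows the documented endpoint.

    import requests

    # Placeholder token from a registered Twitter application
    BEARER_TOKEN = "..."

    def recent_hashtag_tweets(hashtag, count=20):
        """Fetch recent tweets for a hashtag, e.g. to feed a live display screen."""
        response = requests.get(
            "https://api.twitter.com/1.1/search/tweets.json",
            headers={"Authorization": "Bearer " + BEARER_TOKEN},
            params={"q": "#" + hashtag, "count": count, "result_type": "recent"},
        )
        response.raise_for_status()
        return [status["text"] for status in response.json()["statuses"]]

    for text in recent_hashtag_tweets("Digifest17"):
        print(text)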

Tom Mitchell, Digital Communications Manager at Jisc, also recommends creating a Facebook event page to generate a buzz before, during and after an event. Take a look at the #Digifest17 event page here on Facebook for some ideas on the kinds of things to include.

In a previous post Chris Thomson provides a closer look at Periscope and explains how it can be easily put to good effect to live stream presentations and workshops at events – again, the point about setting up beforehand in a good viewing position away from unnecessary background noise is doubly important here.

The key is to include a range of media to keep the content fresh and stimulating. Photos, videos, links to presentations and further content all add value and are more likely to amplify your event.

Some may even scoff at the seemingly trivial nature of sharing photos of cakes and the like at events, but sometimes quirky can generate interest and help boost your event in ways you’d never even have anticipated. I remember being at a Connect More event last summer where a little toy monkey kept mysteriously cropping up in many of the photos and it certainly added a sense of fun and got plenty of retweets (it became a little bit like a game of ‘Where’s Wally!’).

Remember to keep your attendees engaged before, during and after the event too. Storify is a great way of bringing together your portfolio of social media around a particular event into one place by creating a digital record (or story) that can be shared with attendees afterwards. It also includes your attendees in that social media buzz around the event by acknowledging their posts – take a look at the Storify from Jisc’s Connect More event in Liverpool last summer for ideas.

Find out more!

I’ve created a draft checklist for using social media that you are welcome to use as a starting point at your own events. Feel free to repurpose this as required and let us know your own experiences of using social media at events in the comments section below.

All the resources from #Digifest17 will soon be available here, and for more tips on using social media in teaching and learning take a look at Jisc’s 50 most influential Further Education professionals and Higher Education professionals using social media.


Four years on from FELTAG

Originally posted on Jisc Innovation in Further Education and Skills.

This guest post is by Ros Smith, author of the updated Evolution of FELTAG guide.

Over the last few months, I have had the privilege of ‘talking FELTAG’ with some leading senior managers and practitioners in further education (FE). Why? It was time to take a fresh look at the impact of the FELTAG recommendations on the sector before updating our Evolution of FELTAG report.

The report provides an important insight into the digital landscape of the UK FE and skills sector. And this latest glimpse reveals a sector that is agile and robust, and more than capable of responding to the very tough challenges it faces with its own blend of visionary thinking and hard-nosed astuteness. The good news is that the spirit of FELTAG is very much alive and well, and in a college near you!

Here are some of the key findings from our 2017 FELTAG update.

Keeping abreast of change

Many organisations are addressing their FELTAG goals by drawing up a digital strategy. This in itself recognises the importance of technology in the sector. However, things are never as simple as they sound. The first issue you have to confront is the connection between your organisation’s digital aspirations and its other strategic aims. Interconnectedness is key. There is no value, after all, in procuring a new VLE or learning management system without a programme of staff development to ensure the technology is used to its full potential. Nor is there any point in having a strategy for innovative curriculum delivery without first having the right tools in place.

Getting the equation right depends on a holistic vision, careful planning, and a lot of determination – the next ingredient you need when implementing your strategy:

“It’s like an eco-system or a fine Swiss watch. Each cog is important individually but even more important is how it links into the others…. To achieve these goals, we have been on a three-year journey.” Dr Ken Thomson, principal and chief executive, Forth Valley College.

BYOD
Wi-Fi across the campus is not a cheap option, but forward-thinking organisations have put this at the top of their shopping list to support an increase in curriculum delivery via online and blended learning. Effectively located Wi-Fi hotspots and a robust BYOD policy can make a real difference to learners, and there can be benefits all round. Learners using their own devices have a learning environment they can personalise to suit their needs; replacing an ageing stock of desktop PCs becomes less of a problem to the organisation, but perhaps the biggest winner of all is learning:

“One of the biggest milestones for us has been the installation of Wi-Fi on all campuses. With Wi-Fi access everywhere, the entire college can become a learning space.” Graham Razey, principal, East Kent College

Preparing for an even brighter digital future
The next generation of digital platforms will be nothing if not user-friendly. Providers want to see their teaching and support staff able to create blended learning resources with as much ease and fluency as they use social media. More than one provider had changed key learning platforms for that reason:

“Choose a learning management system that learners and staff feel at ease with. Technology has to work first time and feel comfortable. Only then will your staff move from partial buy-in to full take-up.” Pete Gallop, head of ICT and LRC, The Isle of Wight College

Cornwall College Group took an altogether different tack, keeping Moodle as its cross-campus VLE but ensuring that users’ needs were met by altering the look and feel of the VLE. Their case study describes how Moodle can be adapted to enhance its functionality and refresh its appeal.

The common thread in all accounts, however, was the need to inspire digital confidence and competence in staff. Our interviewees were aware that they needed to support all – the unconfident as well as the front runners – and had taken action to ensure this could happen:

“I said to staff two years ago when we started the drive for creative learning and teaching that I was taking the responsibility for what happened. This meant they were free to experiment without blame.” Ken Thomson, chief executive, Forth Valley College.

Where are you on the digital scale?
The most common answer to this question was somewhere between partially and fully embedded. Of course, there is a difference between being competent in handling IT in everyday life and using it with confidence in the classroom or as a blended learning option. Nonetheless, there were promising signs for the FELTAG agenda:

The majority of our senior managers enthused about the willingness of staff to try new approaches
“We have some highly innovative teachers in the Group. The creativity and passion of our staff, and their willingness to engage with new methods, is making a real difference to learners here.” Michelle Swithenbank, deputy chief executive, the Hull College Group

Senior managers and governors provide firm backing for innovation in learning and teaching
“The spirit of innovation and experimentation amongst our staff, leadership and governors has been without a doubt our most valuable asset on this journey.” Simon Barrable, deputy principal, Portsmouth College

Innovating with less familiar technologies such as augmented and virtual reality is taking root in day-to-day teaching and training
“Learning this way is so much more interesting, but learners also expect it. We would be wasting our time if we didn’t make use of the technology that’s out there.” Michael Grundy, engineering programme leader, Goole College

There are indications of improvements in terms of grades, digital capabilities and employability skills when technology is at the centre rather than on the periphery of curriculum delivery
“I have always said this would be a five-year project but already we are seeing tangible benefits. The digital capabilities of students have improved and so have their independent learning skills. You can also see this in the improved grades.” Simon Barrable, Portsmouth College

Digital is less often viewed as a separate entity
The titles of digital strategies now more frequently feature creativity and innovation rather than technology. A small thing in itself, but could it indicate technology is no longer seen as the new kid on the block? The real benefit of digital technology has always been its capacity to transform the way things are done for the better, and that is what the strategies of many organisations across the sector are now aiming for.
“The days of always doing what you have always done are no longer acceptable, either to learners or employers.” Neil Bates, chief executive, PROCAT

Join speakers from PROCAT, the Hull College Group, Forth Valley College and Portsmouth College at our sessions at Digifest on 14-15 March, and follow #feltag for the discussions.

Digital innovation: choosing what’s likely to fly

Originally posted on Inspiring learning.

Drone with camera flying over mountains

Image: Pexels http://bit.ly/2n0PgFc

Thoughts are turning to the next Jisc Digital Leaders Programme which happens in May. Something that I’ve been reflecting on since last year’s event was the role of leaders in supporting digital innovation.

It’s important to foster a culture of play and experimentation in an organisation which allows pockets of innovation to flourish. The problem we find a lot of institutions face is how to take the discoveries of those small-scale projects and embed them across the organisation. This is one of the topics of the Digital Leaders programme.

What’s been occupying me, though, is how you determine which technologies are likely to embed successfully across an organisation.*

The pace of change

Technology advances at a blistering pace. No sooner do you think you’ve got your head around something like WhatsApp than someone mentions Signal and it can feel like you’re back to square one! It might feel safer to revert to more established channels like Facebook, although this could mean you miss out on the opportunity to engage with new audiences and in new ways.

One thing our team of subject specialists gets asked about a lot at the moment is virtual and augmented reality. There’s a feeling that these emerging technologies have potential for education but a way of using them in the mainstream seems elusive.

Universities and colleges don’t have the time or resources to invest in everything, and even if they did, it would test the capabilities and resilience of staff.

Institutions should focus on the things that are likely to actually make a difference, whether by transforming learning, professional practice or the effectiveness of the organisation. Success isn’t guaranteed and some level of failure is inevitable, but we should always be trying to shorten the odds.

Some rules of thumb

This is obviously a very complex process but I’ve found it useful to boil things down into some basic rules of thumb.

The first: always focus on the use, not on the features.

Take Twitter as an example. As a basic premise (140-character micro updates sent and received by a network of followers) it doesn’t exactly sound earth-shattering; it’s the different uses that have emerged over the years that have made it a successful platform.

Unpromising looking technology can have a real transformative effect when put to the right use.

Once you focus on use and the impact on the user, you can start to apply some other simple tests.

Balancing costs and benefits

When considering the potential impact a technology might have, it helps to start by asking two simple questions.

  • To what extent does the use of this technology change our relationship with information and knowledge?
  • And to what extent does it affect our relationship with other people?

Unless something significantly impacts one or both of these, it’s probably not worth spending much time worrying about – or maybe you just haven’t identified the best way of using it yet. It took a long time for mobile networks and then their customers to cotton on to the potential of SMS – the tech was there, we just didn’t know how to use it (or monetise it!).

Relationship with information and knowledge

Technology can affect our relationship with information and knowledge in many different ways. It can make information cheaper and quicker to access. It can present it in contexts where it is timely or location-specific, or in ways that a physical disability previously prevented. It can uncover information that was previously withheld or difficult to find, making what was closed, open. It can aggregate it from a number of sources. It can help us to synthesise it and come to new understandings.

Relationships with people

Technology can allow us to connect with people that were previously out of reach for physical, social or economic reasons. We can communicate with them more often and in greater numbers. It can help us to collaborate more easily or on more complex tasks and co-create knowledge. It can change or subvert the relationships of power.

The question for any new technology is whether it will do these things in a way that is currently impossible, or do them better than the systems and tools we already have. It also needs to do so in a way that is valuable to the organisation.

If the answer is no to any of these questions then we should be thinking critically about the value of taking it forward.

If the answer is yes it means we can start balancing it against the potential costs which might include:

  • Financial cost
  • Developing user skills and capabilities
  • Accessibility
  • Privacy and security
  • Safeguarding and ethics
  • Longevity

Virtual Reality as an example of innovation

Woman wearing virtual reality headset

Image: Unsplash http://bit.ly/2n13G8j

Take Virtual Reality as an example as this comes up a lot when we talk to colleges and universities about emerging technologies.

Much of the argument I hear made for the adoption of VR (usually from people selling it) is that it is a uniquely immersive experience and a great tool for improving learner engagement. I’m not convinced that immersion and engagement are synonymous, but the important thing is that this argument focuses on features, not use, so it doesn’t help us.

But let’s say we were investigating using VR and 360-degree video to support fieldwork activity. Video shot in the field could be accessed through a VR headset either in the run-up to the field trip or as a tool for reviewing a visit back at base.

At the moment VR doesn’t give us very much when it comes to interacting with others so I’ll focus on how it changes our relationship with information. The information here is the visual appearance of a location and getting access to that without the expense of repeated visits to the same site may be of value. The ability to look round in 360 degrees could potentially give a better idea of scale and distance that might previously have required the use of a number of different information sources. It could also be that the risks to safety of taking students into particular environments are intolerable for an institution charged with their care.

Assuming we can use this to test the value of the technology, we can now balance that against all the other aspects of using it, like the cost of the equipment, the redesign of fieldwork activity, health and safety issues and so on.

Whether in this context VR offers anything more than marginal gains over more established working methods is something I’d have to leave to more experienced fieldwork leaders.

Incidentally, if you want to know what Jisc is doing to develop innovative uses of VR and AR in education you should check out the work of Matt Ramirez and Suhad Aljundi.

In summary

This approach does have shortcomings. It’s only meant as a rule of thumb, but how would it apply to new technologies that you are discovering?

  • Always think about uses and users, not features.
  • How does this technology affect our relationship to information and knowledge?
  • How does it affect our relationship to other people…
  • …and are these benefits of value to you?
  • How do they stack up against the real world costs?

If it still looks like the technology is “in credit” after this, it’s worth moving on to the next phase to develop and then embed its use.

Jisc’s process for innovation involves a thorough discovery phase that ensures these sorts of questions are considered before moving on. It’s by no means the end of the process, and failure is still a possibility.

But hopefully, you’ve put yourself in a much better position to choose the sorts of technology projects that are more likely to take off.

* I’m deliberately taking a top-down view of this topic. The challenges of developing the uses of technology in a much less controlled fashion will have to wait for another blog post.


Digital capabilities: a whole-organisation approach

Originally posted on Jisc digital capability codesign challenge blog.

New and updated resources in this post:

In 2009/10, through its Learning Literacies for a Digital Age (LLiDA) project, Jisc first flew a flag for what has become ‘digital capabilities’ today. That project focused on the skills and mindset needed to become a digitally capable learner. It concluded that there is no one way of being digitally literate. Instead there are a number of interlocking practices that individuals express differently depending on their settings, on the aspirations they have, and on the opportunities and constraints they experience.

LLiDA prepared the ground for the more ambitious Developing Digital Literacies (DDL) programme. Twelve universities and colleges were supported to put in place different strategic approaches, and together they explored what difference they could make to staff and student digital practices over a two-year period. Once again, DDL showed that there was not one ‘right’ way of doing development around a digital capabilities agenda. What one university or college needs to succeed and what its opportunities are for development are not the same as for the university or college down the road. Jisc also worked alongside a wide range of professional bodies to consider how digital technologies were changing practice across educational roles.

Since those projects reported we have seen the launch of the UCISA Digital capabilities survey (more on this in a following post), been through several iterations of the FELTAG agenda, and responded to the QAA Higher Education Review theme. Digital literacy, or capability (or fluency, or confidence, or skills), has become a mainstream concern for strategies and organisational leaders to address. Digital technologies have continued to reshape organisations – for example by enabling the collection and analysis of learning data, by opening up new kinds of student market, and by unbundling roles and functions. No-one who works in education has been untouched by these effects.

Six elements: organisational lens

This is why I feel it’s important not to focus on people’s digital capabilities without considering the organisation they study or work in. Tools such as the Digital Capabilities (‘six element’) framework and the profiles might help individuals to identify skills they have that are valuable in their work, or to consider what areas they would like to develop further. But the responsibility for development has to be shared. We know from those earlier projects, for example, that the digital environment and infrastructure play an important role. People are more likely to develop advanced skills if they have access to a personal device and web services, robust classroom technologies and connectivity, specialist software, and rich digital media to hand.

Individuals also need support to keep acquiring and updating their skills. Online, just-in-time materials are fine for those who have some confidence, but fall down when people are anxious about their basic skills, or are trying to master complex and challenging practices. Time and freedom to experiment, especially with peer support, are necessary for deep, embedded learning. More important still are credible forms of recognition and reward.

What is Jisc doing to help organisations?

Developing digital capability: an organisational framework is a new resource, covering the six elements from an organisational perspective. It looks at what enables individuals to thrive and develop in their digital practice. It is designed to be used by digital leaders and change agents in whatever roles they are working.

From a highly practical point of view, the Digital capability case studies synthesis report describes what 15 organisations are doing to support digital capability and what they find works for them. It is a taster for the case studies themselves, which we will be launching tomorrow alongside our session on the same theme at DigiFest.

We are also launching a brand new Guide: Developing organisational approaches to digital capability. Clare Killen will be blogging about the Guide at its launch, but you can catch a preview right now in this Briefing paper and Poster.

Follow #digitalcapability for updates over the next days and weeks, and beyond.

Digital capabilities framework: an update

Originally posted on Jisc digital capability codesign challenge blog.

This is the first in a series of posts to bring you up to date with developments on the Digital Capabilities Framework and associated resources from Jisc. Each post will start with a list of resources so you can go straight to those links if you prefer.

New/updated resources in this blog post:

From now through DigiFest and beyond there will be plenty of resources coming your way, with further developments promised through the spring and summer 🙂 So if you are into digital capability (literacy, fluency, confidence or skills) you might want to get your party bag ready.


This post focuses on updates to the Digital Capability Framework for individuals. If you don’t know the framework already you can get a quick refresher from this blog post: framing the digital capabilities of staff (November 2015).

The aim of the Framework is to provide a high-level, general account of the digital capabilities that we (in post-16 education) aim to develop, in our staff and in our learners. For the first time in this update we have evidence of how the high-level framework is being used in practice. Drawn from survey evidence and consultations, and from case study interviews, these uses include:

  • To support discussion and build consensus about the capabilities needed in an organisation (described as ‘a common language for development’)
  • To inspire a local version
  • To plan, ‘benchmark‘ or review staff/educational development
  • To plan or review a curriculum, or to develop new learning materials, with digital capability as an outcome (there’s more on curriculum uses in a later post)
  • To structure and signpost development opportunities – videos, content playlists, workshops, communities of practice
  • To design digital badges or ‘missions’ for staff and/or students to evidence their digital capability
  • To map digital expertise across different staff roles within a team, department, or the organisation as a whole, identifying gaps and recognising where digital expertise adds value

Updates since 2015


The 2015 Digital Capabilities review produced a Framework that is now well recognised and used (the UCISA 2016 Digital Capability survey will provide more evidence of this). However, we know that the original descriptions for the six elements were too complex. They have been cut down and simplified, and organised into 15 sub-elements for ease of reference. You can download the updated Framework and descriptions here.

If you are very alert – or know the original very well – you will notice some minor changes. There is no problem with continuing to use the 2015 version! But you might be interested in why these changes were made, especially if you have been involved in any of the professional body consultations or have given feedback on resources. (Yes we have listened!) This is what we were asked to do, and why.

Clarify the difference between proficiency and productivity
The first element of the Framework is now more clearly split into two parts, the first (‘ICT proficiency’) meaning functional access to digital technologies, and the second (‘ICT productivity’) meaning the choice and use of those technologies to meet personal needs and the demands of different tasks. Proficiency is essentially a set of technical skills. Productivity is the ‘mindset’ and experience to apply those skills in practice. This involves confidence and curiosity, openness, judgement and discrimination, and the ability to deal with technical setbacks.

We have not included a long list of current tools, apps and technologies in this element because they are constantly changing. But we do have a new resource (coming soon!) that maps the six elements to current tools in use. You keep asking for it, and thanks to Jisc’s Subject Specialists you can now have it. And you will be able to adapt it and add your own favourites too.

Extend the idea of ‘scholarship‘ to evidence-based problem solving
The framework is meant to apply across professional roles in HE, FE and the skills sector. And one of the big toe-stubbing moments for people in non-HE-academic roles was always ‘scholarship’. We knew there was something important here about using digital evidence and tools. We wanted to keep the sense that digital capability is about thinking differently and not just doing differently. But it had to be expressed in a more inclusive way. This sub-element now appears as ‘problem solving’. The Profiles for Researchers and for Library and Information Professionals show that this element can be interpreted in ways that are highly scholarly and research-based. But other staff (and learners!) also use digital evidence to make decisions, solve problems, and arrive at innovative solutions. Increasingly, practices such as survey design and finding patterns in data are needed across roles. We hope that in making this term more inclusive we have managed to keep that sense of intellectual engagement.

Notice that we do ‘development‘, not just self-development!
The Framework was intended to be generic, with the special skills of digital teachers being explored in the relevant teacher profiles. While everyone can learn, reflect and develop in their role – the thinking went – not everyone is a teacher.

Feedback on question sets for the Discovery Tool told us that teaching staff did not find enough in the generic framework that addressed their expertise in developing students and the curriculum. At the same time, staff in other roles pointed out that they also develop others, whether that’s supporting students with advice and guidance, contributing their expertise to the curriculum, or mentoring other staff in their team. Learners too can act as mentors and collaborators in the curriculum. So we’ve fixed it. The relevant element is now Learning and development and there is a new sub-element called ‘teaching‘. Just as the original framework implied that everyone should be a ‘learner’ – and have those habits of digital exploration and self-development – so now it implies that everyone in an educational organisation should be able to develop other people. And in the context of our framework that means everyone must appreciate how digital tools can help in this, even if they are not using those tools every day, or as a core part of their role.

We hope that these changes enhance the Framework and make it more usable in practice. We look forward to hearing your views! Please come and discuss digital capability with us if you are attending DigiFest on 14/15 March.

In the next update we discuss the new and updated digital capability profiles for different roles.

Follow #digitalcapability

Digital capability profiles for different roles

Originally posted on Jisc digital capability codesign challenge blog.

This is the second post in our rolling update on the Building digital capabilities challenge and associated resources from Jisc.

New and updated resources in this post:

 

In the first post I explained that the generic Framework has its limits when it comes to considering the specialist skills needed by staff in different roles. We have removed much of the detail from the high level framework, and it is now explored in a series of role-specific profiles or mappings. Each profile is an example of how the six elements can be interpreted and implemented in role-specific ways.

What does it mean to say that they are ‘examples’? Although they are more detailed, these profiles or mappings are still at a very general level. They don’t include any of the digital specialisms that we see emerging in different roles. They aren’t organisationally specific. They don’t include any indication of level. Why not? Well, we are not trying to create competence frameworks, standards, or role descriptions. Those already exist and are quite rightly owned by the relevant professional bodies. We are – working with the relevant bodies where we can – providing examples of how digital expertise is emerging in different roles.

There is no suggestion that individuals should be able to do everything that is in the relevant profile. The profiles show how new areas of practice are emerging, and how individuals might use their digital skills in different areas of their designated roles.

The profiles might be used:

  • by individuals to review their own development, and/or to ensure their digital capabilities are fully recognised and credited e.g. in appraisal and review;
  • by teams and team leaders to assess collective strengths and priorities, and identify areas in which new skills need to be developed or recruited;
  • as the basis for a local version, with language and examples relevant to local requirements;
  • to develop or curate resources relevant to people in specific roles.

Working on the profiles

I’m very grateful to have had input from members of several professional bodies and expert working groups on these resources. That doesn’t mean that the professional bodies overall have approved the content – unless it says so on the tin!

  • Teacher profile for further education and skills, with thanks to Jisc’s Digital Launchpad Working Group and Jisc subject specialists for accessibility and inclusion, for their comments and improvements
  • Teacher profile for higher education, with thanks to the Higher Education Academy for their comments and for supporting the mapping to the UK Professional Standards Framework (UKPSF)
  • Library and information professional profile, with thanks to Jane Secker and the Information Literacy Working Group of the Chartered Institute of Library and Information Professionals (CILIP) for their comments and improvements
  • Learning technology mapping, developed in collaboration with the Association for Learning Technology (ALT), with special thanks to Maren Deepwell for coordinating this
  • Leader profile, with thanks to participants on the Jisc Digital leaders course for their feedback
  • Learner profile, with thanks to members of the Association for Learning Development in HE and especially to Debbie Holley, Bournemouth University; also to student members of the Change Agents’ Network (CAN)
  • Researcher profile, with thanks to Vitae for supporting and enabling the mapping to their Researcher Development Framework (RDF)

We look forward to hearing your views about these resources – about how they could be useful or are already being used. Please come and discuss how you are developing digital capability in your organisation with us if you are attending DigiFest on 14/15 March.

 

In the next update we look at some new organisational resources for planning and developing digital capability.

Follow #digitalcapability

“The digital leaders programme was immensely valuable…”

Deborah Millar, the IT director at Salford City College, explains how we are helping to transform her world through digital.

Deborah said of the Jisc Digital leaders programme: “It was immensely valuable… it was really reassuring to know that we were all encountering the same sort of things, whether we were from FE or HE, and to see how the work could be embedded in my own college…”

The Jisc Digital leaders programme will be running again in May 2017 in Manchester.

Over two residential workshops, we will equip you with the tools, knowledge and skills to:

  • Become a more effective digital leader through your own personal and professional development
  • Explore how organisations can engage more effectively with the digital technology at their disposal – at both strategic and operational levels
  • Discover and reflect on how digital technology is changing the way your organisation operates – creating new leadership challenges and strategic opportunities
  • Learn to lead, manage and influence digitally-driven change across organisations, departments, services and teams.

Aimed at current and aspiring leaders and managers working in higher and further education, our programme is suitable for both individuals and organisational teams.

Early bird rates are available for those booking before 13 March 2017 so act now to secure your place.

Book now 

Further information about the Jisc digital leaders programme can be found on our website.

We hope you can join us and look forward to welcoming you in May.