Trends Unpacked: Organizational Challenges and Learning Analytics (Part 1)

Originally posted on Effective Learning Analytics.

This is another guest post from Lindsay Pineda, Senior Implementation Consultant, Unicon, Inc., with Patrick Lynch, Technology Enhanced Learning Advisor, University of Hull.

Earlier this month, I posted an article about the learning analytics readiness trends observed over the last year as I traveled to several UK HE (higher education) institutions with my colleague, Patrick Lynch from the University of Hull. Patrick and I are co-authoring a series of articles titled “Trends Unpacked.” In this series, we will expand upon some of the trends discussed in the article, “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns”.

Patrick’s knowledge is twofold: he not only works within HE, but he is also a leader at his university regarding learning analytics. Our combined expertise brings the added value of two different perspectives when visiting institutions.

This article characterizes common concerns expressed by individuals we spoke with at the institutions. We will also provide potential solutions for these concerns, based on institutional feedback.

We are focusing on the first two aspects of organizational challenges and trends:

  • Level of change management comfort/willingness
  • Organizational support for analytics

In future posts, we’ll cover additional organizational challenges and trends.

Level of Change Management Comfort/Willingness

As highlighted in the article “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns”, some of the challenges and trends we observed were related to change management:

  • Staff expressed concern in areas including level of comfort and willingness to accept change. This included job roles, additional responsibilities, and changes to current practices
  • Institutional leaders sometimes did not have a clear understanding of the level of effort required for a larger scale initiative such as learning analytics
  • Academic and teaching staff expressed resistance around prescriptive allocations of their time related to teaching and advising

The following examples illustrate the types of change management challenges most often expressed at the institutions:

  • “Change fatigue” – This particular feeling was expressed by many at the institutions we visited. At one particular institution, we were told, “People are used to things changing all the time. Resistance is futile.” While this is somewhat comical, it is unfortunately very true. Some viewed this attitude as a positive thing because individuals within the institution are not “scared” by change. However, the fact remains that too many changes happening at the same time, or in short succession, can lead to wariness about what’s to come. Often, those charged with acting on the changes are not the ones actually mandating the changes. As one institution told us, “They (the changes) come from the third floor where the executives live”.
  • The “something else for me to do” syndrome – At other institutions, we experienced resistance from a group of academics who voiced significant frustration. When we asked the question, “How do you think learning analytics will affect your job duties?” we were met with heavy sighs. One participant stated, “We only have a certain number of hours allotted for teaching, advising, etc. A learning analytics implementation would add to workload and that would need to be planned for.”

Institutions shared with us their ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Clarifying job roles and duties – Engaging in this type of exercise is advantageous for any institution. We noticed that many institutions had old, outdated job descriptions and duties. Clarifying these job descriptions and duties can be a large undertaking, but we saw a lot of excitement, passion, and willingness from staff members to help leadership with these types of assessments. In the words of one participant, “It helps with job role, task clarification, and professional growth for those who are motivated to be involved.” Another approach to managing change fatigue is to reduce the overall number of changes people need to deal with by packaging change into larger programs, whereby learning analytics isn’t a change in itself, but rather a part of other initiatives. In general, we recommended institutions look for ways to incorporate learning analytics into existing initiatives for the best chance of success.
  • Involve others in the changes – Every institution we met with expressed a common challenge regarding implementing change. Those who would actually do the new tasks had no involvement in the planning phases. A participant at one institution stated, “Perhaps selecting a member from each department that will be affected by the change to sit on a steering committee or change management panel would be a way to involve us.” Delegation is the key takeaway here.
  • Communicate the changes before they happen – At many institutions, we were advised that changes were often relayed via email and often very shortly before they were to be implemented—some even cited only a week’s notice. There were even examples of particular groups who were not told about changes, but were still held accountable for them. One individual told us, “I never know when a change is going to happen or if I’m even doing my job right. Most of the time, I’m in a constant state of anxiety about what I might be doing wrong.” Employees who feel this on a daily basis are not able to focus on the real reason they are there: to help students succeed. Involving staff from the start of the initiative, and having their input upfront, helps establish the message and ensures that it is well understood among the “target audience” (e.g., advisors, academics, students, etc.). This allows the message to be tailored to the audience by someone from the particular group, which helps with understanding and buy-in. The same can be said for the actual training to accommodate the changes. Most institutions did not have a formalized training plan for changes that were rolling out to staff. “Well thought out training is not something we excel at,” said one institutional leader. Recognition and awareness are a great first step and help reinforce the importance of communicating with and training staff on a more regular basis.
  • Institutional approaches to project management – Having established project management processes helps the institution identify and manage what work is required to reach the end goal. Many of the challenges we identified could be addressed through well thought out and well-implemented institutional project management approaches. It isn’t just learning analytics initiatives that institutions struggle with; establishing project management processes is also a challenge. Creating a project management structure that can be applied to other initiatives helps set up the institution for success.

Organizational Support for Analytics

Organizational support is another challenge we observed while visiting institutions.

  • Staff (inclusive of both academic and university/college roles) shared that they were particularly concerned about the impact on their current job requirements, roles, and workloads
  • Staff at all levels told us that a “top down” directive from leadership is necessary to properly implement learning analytics efforts

The following examples illustrate the types of leadership support challenges most often expressed:

  • “I’m not doing anything until I’m told by leadership to do so” – This is a direct quote from one member of an IT group at an institution. He advised that he and his colleagues would not engage in anything new unless leadership told them it was important. His statement was made regardless of his personal opinion on whether or not learning analytics would be of benefit to the institution. Looking to leadership for the “OK” is very important at institutions, and not having a clear direction can make it difficult to navigate the leadership’s priorities.
  • “We already have so many things we are asked to do in a day, how can we possibly manage one more?” – Within all departments, there was concern about how to manage something else; another tool, another “thing.” One participant told us, “I have to spend 15 minutes searching for something and then [I] forget what I was looking for.” This was more common than one would like to think.

Institutions shared with us some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • “Single source of truth” – This can be both a technical and organizational solution. Institutions expressed that having one place to go for most, if not all, information would be the most effective way to mitigate the “one more thing to manage” concern. This could be a centralized data warehouse, a Learning Records Store, or a centralized place to house policies and procedures. For institutions, having one place to go to get the information needed is universally said to save time, energy, and effort. An individual at one institution said, “If we had one system it would be quicker, more efficient, and easier to find information and then you’d have more time to actually help students, rather than finding the information itself.”
  • Leadership buy-in is key – Those within the institution look to leadership to help guide overall strategic direction and vision. At some institutions we visited, the leadership was completely on board with the idea of implementing some sort of learning analytics solution. Others still needed convincing. One leader told us, “I see some of the overall benefits of learning analytics, but we have so many departments throughout the university that I’m not sure it would even work here.” When leadership buy-in is unclear upfront, or is not shared among all leaders, it transmits a sense that the initiative is not a priority and does not need attention. Working with leadership to demonstrate the benefits of learning analytics, and how it can impact the “bottom line,” is needed at the beginning and all throughout the initiative. As one senior leader told us, “I can understand the reluctance of my peers, as I know this will be a lot of work, take some time, and will result in some changes, but what I can’t understand is why anyone wouldn’t want to take that chance to help better our students and our institution as a whole. We have a duty to provide the best service possible and if this helps us do it, then we owe it to everyone to at least try it.”

We learned quite a bit from our journey across the UK, and we are excited to continue sharing our findings. Please be on the lookout for another article in April regarding what senior leadership needs to know about learning analytics, including the importance of information sharing, expectation setting, and collaborative thinking (a further extension of the discussion above).

Here’s to continuous growth and improvement!

College Analytics Lab – A Digital Modelling Environment for the FE and Skills Sector

Originally posted on Jisc Innovation in Further Education and Skills.

Further Education Colleges are adept at dealing with change and challenge. They have to be. Identifying, evaluating and capitalising on opportunities which arise in the rapidly changing environment we work in is a core skill of all successful college leaders.

Recent conversations with college principals have shown that meeting local employers’ needs; improving student experience and outcomes; helping to bridge the skills gap; and serving the local population, particularly school leavers, are all high on their list of priorities.

Alongside the economic, political and regulatory challenges faced by colleges, there have been other significant technological and data developments which can help us to address them. For example, high quality, detailed data sources (both open and securely held) are now more accessible than ever before. The tools to process, link and visualise these data sets are becoming more sophisticated and easier to use all the time.

Interactive Dashboard Example: Jobs by Industry and Region

We know that using accurate local area data to inform strategic planning and decision making is crucial to success. Presenting the information in a visual and interactive way helps leaders to communicate their vision and ideas to their funders, staff teams, students and the wider community. However, what we have found is that there is a wide variation in the effectiveness of the way in which colleges make use of the data available to them. We also found significant duplication of similar core processes across colleges.

How can the FE and Skills community benefit from the availability of new data sources and state-of-the-art data visualisation technologies?

Jisc’s well-established Analytics Lab environment provides a secure technical, legal and project management framework to enable the creation of new, experimental data dashboards. As part of the Jisc HESA Business Intelligence project, participants use a mix of open and secure data from both new and established sources, to create visualisations and dashboards which address key business questions.

In January, we extended this project to include the FE and Skills sector, working with two groups of colleges – one from South East Wales and the other from the Greater Manchester City Region. This pilot project used our unique specialist Analytics Lab development environment, with a few adjustments made to meet specific FE needs, and was underpinned by our knowledge of the Jisc Learning Analytics data definitions and architecture. More details about the two pilot projects will appear in future blog posts.

College Analytics Lab – An experimentation and data dashboard development environment

The College Analytics Lab can be thought of as having three components:

  • Technical infrastructure
    A secure, remote desktop environment which contains state-of-the-art data visualisation tools including Tableau and Alteryx.
  • Data and legal
    The secure environment is used to house a wealth of data collected from a variety of sources, cleaned and in a format that is ready to use. Carefully agreed and signed legal agreements enable colleges to use sensitive data securely, as needed.
  • People
    The most important element. Teams are made up of sector experts (who understand the important business questions facing them and what data will help them to find the answers); consultants who are able to transform and link data to create powerful interactive visualisations and dashboards; and Jisc staff who provide agile project management.

Our proven Analytics Lab agile process

Teams of sector experts work together on a focused 3-month project. At the start of the project they work together to identify and prioritise their most pressing business questions. They then consider which data sources would help them address these. Typically, these are a mix of internal, external, public, private, existing and new data sources.

The next step is to work with specialist data analysts and a Jisc project manager to explore identified data sources which, when combined, can help to answer their business questions.
It is a “lab” process, so is experimental by definition. It may or may not lead to the creation of dashboards which are generically useful to the sector, but they will certainly be useful for the participants.

College Labs Model

Data sets from participating FE Colleges are analysed alongside data from a variety of other external sources to produce visualisations and dashboards which can be used by a wide range of college staff, enabling data-informed planning, benchmarking and strategic decision making.

Potential for the future

A few years ago, interrogating data was an expensive, specialist skill limited to a few within a typical FE college. Static reports would have been used by senior managers at specific times during the planning and evaluation cycle.

We hope the Analytics Lab environment will help open up the field of data science and bring it within the reach of all who have an interest in the college. In addition to the senior management teams, this could include subject / curriculum leads, support staff, Local Enterprise Partnerships, students, parents, governors and the wider community.

We have the facility to bring new open data sources (such as local transport routes) into the environment and build up a comprehensive catalogue to improve the use of data in the sector as a whole. Understanding underlying data trends and using the information for planning has the potential to improve the data literacy of all involved.

The Jisc Business Intelligence projects have enabled a wide range of HE professionals to engage critically with data in powerful ways to address key business questions facing them. I believe the same can be achieved in the FE and Skills sector, where this model will enable many more people to use data to inform strategic decision making.

Full programme for CAN 2017 now available

Originally posted on Change Agents' Network.

The full programme for CAN 2017 is now available.  Links to the Day 1 and Day 2 abstracts for each session can be found here.

Proudly hosted by The University of Exeter, the 5th annual Change Agents’ Network Conference presents a fantastic opportunity to share effective practice and honest reflections on student and staff partnership working and how this is driving curriculum innovation.

The full programme offers an engaging line-up of students and staff including:

  • Two keynote addresses from the Academic Skills and Engagement Team and Student Guild at The University of Exeter (Day 1) and Colin Bryson and Fanni Albert from Newcastle University, who will talk about the ‘how, what and why’ of Authentic Partnership (Day 2)
  • Plenary student led panel
  • 51 workshop, presentation and poster sessions from a diverse range of UK Higher and Further Education providers exploring the conference themes of:
    • Developing digital capability of staff and students
    • Employability and graduate attributes
    • Students as Researchers
    • Curriculum co-design
    • Impact of student partnerships
    • International perspectives and practice
  • Opportunities for networking – regional clusters and informal meet-ups

A notable feature of the conference is that all workshops involve students as workshop leaders or co-presenters.

Hurry – places going fast! Registration extended until noon on 7 April 2017.

Secure your place at

Teaching without speaking

Originally posted on Inspiring learning.

Have you ever tried to teach without speaking?

Twenty years ago I decided to train as a teacher of English as a foreign language (EFL) and I enrolled on the CELTA course at Harrow College, which was a Cambridge Centre of Excellence. It was more like boot camp – one night a week for a year, working with real students – and all our classes employed the experiential teaching methods to teach us that we then used with our own learners.

We were trained to teach at various levels, beginner, intermediate and advanced, but the hardest to teach were the beginners. Why? Because they didn’t speak English and I didn’t speak their languages. I had to employ a whole range of skills and resources to be able to teach the target language without being able to explain it. These included mime, acting, drawing, flashcards, props and target language drilling.

“Tell me and I forget. Teach me and I may remember. Involve me and I learn.” – attributed to Benjamin Franklin

We have all seen this quote from Benjamin Franklin, and we all nod wisely and say of course! But I think that if we reflected on it we might all find ourselves speaking too much in our teaching and learning sessions and expecting learners to know things just because we have told them.

On my CELTA course I learned that too much speaking can be a hindrance rather than a help to learners – it can be like white noise cluttering up the audio environment without serving any useful purpose. I learned that we can teach with minimal speaking in order to achieve our purpose – speaking needs to be sparingly used and carefully targeted. I also learned that we need to use a lot of different methods and tools to facilitate learning and that the most important thing is that learners are doing, building, researching, collaborating, practising and sharing. It was up to me to prepare beforehand so that when we all went into the classroom the activities would be in place for the learners to work together, discuss, play games and tackle tasks.


So I always used a kind of blended learning in my practice and for all my teaching, training and presentation sessions I went in with the intention of facilitating with minimal ‘telling people things’ from me. When I became an eLearning advisor with Jisc I was thrilled to discover a world of new technologies, tools and techniques to add to my repertoire and share with colleagues and fellow teachers and trainers.

My favourite definition of blended learning is just the simple approach of using technology and social media alongside traditional techniques to support and enhance teaching and learning. This could happen inside or outside a classroom and it might also include gathering evidence for assessment in digital formats. Technology makes learning more accessible and students can use their own devices to make choices about formats, apps and settings that work for them. Using blended learning helps staff and learners to develop their digital capability and increase their opportunities for gaining employability skills, knowledge and experience.

Digital tools and resources can expand and modernise our methods of engaging with learners and enable them to participate and communicate in new and exciting ways. We can set the scene, provide the tools, give them the guidance and let them go! And all with just a few words from us…


Jisc guides on blended learning and more can be found here

This post is based on my keynote at the Colegau Cymru FE Teaching and Learning Conference – ‘Step Up to the Future’ and the resources and slide decks are shared here

You can find the Storify here

And the open access resources for digital literacy in Wales and more are here



Key themes from the Digital Learner Stories now available

Originally posted on Jisc Digital Student.

Helen Beetham has now completed a comprehensive summary of the key themes that have emerged from the Digital Learner stories. The summary takes key themes from the 12 different learners’ stories and allows the learners’ voices to explore each theme in turn.

Highlights of the summary include the following.

  • Key benefits of digital learning for these learners were inclusion, independence, and flexibility (or ‘making time’).
  • Tablets are a game-changer for many learners: convenient, lightweight and connected to all their digital services. These learners loved their tablets.
  • These learners also loved some very traditional features of formal education, including libraries, the virtual learning environment, on-site IT support and fixed computing facilities.
  • Confident teaching staff were critical to the positive experiences these learners had, especially when they were introducing professional practices and networks.
  • New digital learning habits that learners explore in these stories include digital reading and writing, note-making, curation, learning from video and other media, sharing, coding and making.
  • These learners responded to digital learning with feelings of curiosity, enthusiasm, excitement, freedom and a sense of fun. However, they also wanted their digital learning to be safe and for other learners to be respectful in digital spaces.
  • For some of these learners, digital technology represented a ‘second chance’ or even ‘the only chance’ that they had to engage with education.
  • Some of these learners are mixing public and private spaces (such as learning groups and professional networks). Others engage in formal and informal learning in tandem. These are confident digital behaviours that not all learners will feel happy to try.
  • Most but not all of these learners saw their digital skills as assets for work.

In her conclusion Helen writes:

There are many continuities in the stories learners tell. As in the pre-digital age, learners still need access to rich resources, opportunities to practice, and supportive interactions with their tutors and peers. They make notes, organise ideas, prepare assignments, collaborate, express themselves, manage their time and motivation, revise and review, listen to feedback and showcase what they can do. But there are also some striking discontinuities. Learners are making more use of graphical, video and audio resources, both to learn and to express what they can do. They curate their personal learning resources in ways that were unimaginable in the days of paper. They share, comment, mix, remix and repurpose freely. They use digital networks to connect across boundaries, whether the barriers between learning and work, or between learners in different countries, or between formal learning and all the other opportunities and interests they have.

Increasingly, learners expect their digital skills to be a resource for getting on in life, and getting an education. They have innovative learning habits of their own, and they have creative ideas about how educators could better support them. Through stories like these we are learning to listen.

You can download the summary (PDF) here. The stories and the videos that accompany them are useful resources in their own right. Use them to open up discussions about staff-student partnerships, institutional provision of IT, digital skills support and much more.


Ten Years on the Twitter

Originally posted on e-Learning Stuff.


I have now been on the Twitter for ten years…

Five years ago, I wrote a post about five years on Twitter.

Twitter actually started eleven years ago; like most people, I probably joined when it became “big” after SXSW in 2007.

I didn’t use it much in the first year, partly as there were very few people on it, more so because I was using Jaiku a similar service available back then.

In the ten years of using the service I have posted nearly 43,000 tweets and have nearly 5000 followers. I currently follow just under 700 people; in the past I tried to keep that figure under 500, as otherwise I feel the stream becomes too quick and overloaded. I probably need to cull a few of the accounts I follow…

Back in 2012 I said about the Twitter,

In the main looking at Twitter I usually use it to post links about my blogs, links to news and sites I have found interesting, photographs (usually via Instagram) and importantly conversations.

I still post links to my blog, interesting news and other links. I went through a phase of not posting photographs via Instagram, until I re-discovered IFTTT and created a “recipe” that posts my Instagram images as proper pictures to my Twitter stream.

My tweeting patterns haven’t changed too much, when I am travelling or at an event the amount of postings I make really increases. At events I will tweet about the presentations, discuss and also post links related to the sessions I am in. However I do appear to be tweeting more when at work.

My favourite conversation was singing Spandau Ballet’s Gold with BBC iPlayer.

I have a reputation for tweeting about coffee and in reality it only accounts for 3% of my tweets! Though a day after joining Twitter I did post a tweet about coffee!

I did once say Twitter would die…

Ten reasons why Twitter will eventually wither and die…

Well I certainly was wrong on that one.

Though Twitter is now mainly about mainstream and traditional media accounts, which in the main use it for broadcasting, I still think there is a community there that uses it for conversations and sharing.

We are seeing many more competitors out there, stuff like Snapchat, WhatsApp and even services such as Slack.

I am surprised that not only is Twitter still going after eleven years, but I am still engaging with it. Will I still be on it after another five or ten years… I have no idea!

Digital capabilities at Digifest: reports from Brighton, North Lindsey and Nottingham

Originally posted on Jisc digital capability codesign challenge blog.

We were at Digifest in Birmingham last week to launch the new and updated Digital capability resources, including a brand new series of organisational case studies and a synthesis report on their experiences.

Key players from three of our case study institutions were there with us on Tuesday to speak about their experiences. Fiona Handley from the University of Brighton explained how a digital literacies framework for academic staff has supported their professional development. Elaine Swift from Nottingham Trent University outlined how their framework has evolved over time, and how a ‘continuum of support’ ensures nobody falls through the net. Ross Anderson from North Lindsey College showed us how digital badges and digital ‘missions’ are motivating teams of staff to work together on their digital skills. Here they say a bit more about their different approaches.

Ross Anderson

e-learning ambassador, North Lindsey College

Our approach had to be something that supported and nurtured our staff in their digital skills development. Our College strategy and e-learning action plan reflected this. We wanted to make sure that ALL staff were able to develop their skills and ultimately be rewarded for their work. We saw their skills development as a continuum and wanted to have a framework or process that reflected that. Jisc’s Digital Capabilities Framework was a great starting point for us as it helped break down the skills areas into their own subsets. From there we were able to develop this further and put the skills criteria across a stage-by-stage framework in order to give staff a progressive approach. This meant that staff can pick a starting point that suits their current skills or confidence and can not only be rewarded for the skills they have but can see a clear route to further progression.

The final element was to gamify this approach and promote it to curriculum areas. So DPD Go! was created. Staff are rewarded for completing digital ‘missions’ related to using technology in their teaching, learning and assessment. A team approach has proved the most successful method of engaging staff with e-learning skills development, so it made sense to use this for DPD Go! too. The team approach creates a collaborative (and competitive) atmosphere to help drive participation.
Video introduction to DPD Go!

Fiona Handley
Senior Lecturer in Learning and Teaching, University of Brighton
with Fiona MacNeill, Learning Technology Adviser

The evaluation of our original Digital Literacies Framework highlighted the topics that really seemed to engage academic staff. The webpages under the Learning and Teaching category (for example Finding and Creating Resources) had many more hits than the other three categories, especially Administration, which covered topics such as Managing Time. During face-to-face sessions introducing the Framework we also found that when groups discussed literacies under each category, it was the ones under Learning and Teaching that sparked the most enthusiasm and discussion. The bespoke sessions that were requested focused on similar topics such as using social media in teaching, flipped classroom, and using mobile devices.

At the University of Brighton improving the learning environment is a key way of getting staff to consider their digital literacy. The updated Framework which was launched in 2016 reflects this, with more literacies under the Learning and Teaching category and fewer overall. The literacies that remain under Administration are more focussed on demonstrating clear expectations about digital skills such as using calendars and formatting documents. The Framework now attempts to inspire and support good practice while also setting out a baseline of knowledge and skills that support institutional policy and initiatives.

Fiona and Fiona’s presentation on Sway
Brighton’s Digital Literacies blog

Elaine Swift
Digital Practice Manager, Nottingham Trent University

Developing digital talent at NTU is an evolving process. It started with the LFHE-led Changing the Learning Landscape project, which initiated the strategic discussions at NTU about how to embed digital literacy as a core competence for both staff and students. By first looking at the continuum of support we had in place for a variety of digital literacies, a framework of digital skills and competencies was developed and adopted. This framework is now being embedded throughout the institution in a variety of ways, including curriculum refresh activity, online support and case studies, and links to other key initiatives with a digital element, such as our Respect at NTU initiative. Reflecting on the work at NTU, I think there are a few key points to consider when looking to introduce a framework such as the Jisc Digital Capability Framework:

  • A framework can be a useful starter for conversations and offers a common vocabulary.
  • Think about the support that wraps around it: this often involves numerous areas of the organisation.
  • Be open to different opportunities to embed a framework. It can work in many different ways.
  • Don’t be afraid to try approaches that have not worked in the past. Timing and readiness for change occur at different stages depending on the organisation.

Thanks to all three speakers and to all our case study sites for their valuable insights and inspiration. All the slides from the session are available on slideshare.

Follow #digitalcapability and check back here soon for more resources you can use to develop digital capability in your organisation.

Learning Analytics Adoption and Implementation Trends

Originally posted on Effective Learning Analytics.

This is a guest blog by Lindsay Pineda, who is currently a Senior Implementation Consultant for Unicon and has a rich background in learning/predictive analytics. In her previous position, she focused on helping to develop, implement, and execute a proprietary predictive modeling technology that has proven successful in predicting student course persistence on a week-to-week basis. Lindsay has immersed herself in learning/predictive analytics research, practical applications, and implementation. Since coming to Unicon, she has been working with institutions to provide learning analytics solutions, both technical and non-technical, with a focus on visiting institutions onsite to provide Readiness Assessments. She helps institutions work through issues of change management, resourcing, challenges, and concerns relating to the adoption of learning analytics.

Identifying Organizational and Technical Patterns

Key Takeaways

  • What are the top learning analytics challenges among institutions in the UK?
  • What organizational and technical considerations need to be addressed to pave the way for a successful learning analytics initiative?
  • When it comes to learning analytics, are institutions ready?

Over the past year, I had the pleasure of traveling throughout the UK with a colleague, Patrick Lynch of the University of Hull, jointly conducting Readiness Assessments as part of Jisc’s learning analytics project (see “Useful Reading” section below for more information). The institutions varied in student population and demographic representation, and were located across the UK. Institutions included both HE (higher education) and FE (further education), with private and public institutions represented.

Yet for all of their diversity, the institutions shared many similar organizational and technical readiness trends. My hope is that by sharing these trends in aggregate, institutions interested in learning analytics (whether through Jisc or through another means) will find that they are not alone in their organizational and technical concerns and challenges.

The Readiness Assessment process, outlined below, is designed to be collaborative and conducted onsite with a variety of key stakeholders across the organization. Typically, the onsite visit is three days long and consists of larger-scale meetings involving several departments, smaller-scale meetings including focus groups, and one-on-one meetings with individuals. At one institution, for example, we gathered the key stakeholders to facilitate a discussion using activities designed to help the participants generate productive and collaborative conversations.

Some questions we asked participants were:

  • How do you think learning analytics will impact your daily job activities?
  • What policies, procedures, and practices do you believe will need to adapt or be created to accommodate the adoption of a learning analytics solution within your institution?
  • What ethical considerations are there to using the data to provide guidance to students?

The different processes involved in assessing organisational readiness

We typically spent one day doing larger scale trainings/ activities and two days meeting with organizational staff. Meeting topics included organizational structure, policies, procedures, and ethical concerns. We met with technical staff to discuss data management practices, integration, and maintenance challenges/ concerns. We gave senior leadership an opportunity to express their concerns, feedback, and overall goals for the institution. We also conducted focus groups with students (of varying degree levels and academic focus), academic staff (including teachers), and professional staff (including academic advisers, strategy groups, and learning/ teaching staff).

After the onsite visit, a fully comprehensive report was delivered back to the institution containing all observations; direct feedback from participants (collected anonymously as quotes, with no names attached); both qualitative and quantitative measures; and recommended next steps.

Institutions provided positive feedback on the Readiness Assessment, reporting it was extremely beneficial in bringing together individuals from different departments throughout the institution. This allowed the departments (e.g., learning and teaching, academics, senior leadership, students, IT, etc.) to speak freely and openly with each other about their concerns; the ways in which they might use the information gained; how one department’s activities can and do affect another; and more. Each institution received a comprehensive report outlining the information gained within the sessions to help illustrate the institution’s current state and the steps needed to get to a place where learning analytics could continue to be explored.

In our discussions, we used the Unicon Readiness Assessment Matrix (closely coordinated with the EDUCAUSE Maturity Index) to assess the readiness of institutions. The matrix rates institutional readiness based on six criteria: Data Management/ Security, Culture, Investment/ Resources, Policies, Technical Infrastructure, and IR (institutional research) Involvement. These criteria provided an outline for the qualitative elements discussed within the comprehensive report and throughout the onsite visits.

Our readiness process also provided a forum that enabled institutions to collaborate on ways to overcome the concerns and challenges that they identified. A summary of the most common challenges and concerns, “real-life” examples, and potential solution suggestions (directly from institutions) will be covered in a series of future articles, beginning in March 2017.

Trends: Organizational

“The institutions discovered discrepancies in how staff and leadership perceived management of institutional policies and practices.”

The following are observed trends in regards to organizational aspects such as culture, process, and communication.

  • Level of change management comfort/ willingness
    • Notable challenges throughout all visits regarding the level of comfort with, and willingness to accept, change, including changes to job roles, additional responsibilities, and changes to current practices
    • Significant variances in the understanding of what would be required in terms of level of effort among leadership members
    • Bulk of resistance experienced from academic/ teaching staff who have prescriptive allocations of their time related to teaching and advising
  • Organizational support for analytics
    • Staff (inclusive of academic and university/ college individuals) were particularly concerned with the impact on their current job requirements, roles, and workloads
    • The consistent message from staff at all levels was that a “top down” directive from leadership would be necessary to properly implement learning analytics efforts
  • Organizational infrastructure
    • Most institutions did not have the organizational infrastructure currently present to support the implementation and adoption of learning analytics technology
    • Several did not have formalized organizational structures and many did not know what other departmental staff did on a daily basis or how their jobs affected each other
    • Most were very concerned about issues of redundancy, additional workload, and time management
  • Policy/ Practice management
    • Overall, the institutions did have their own policies and practices in place that were currently being managed; however, there were great discrepancies in most institutions regarding how the staff and leadership perceived the management of policies and practices
    • Several institutions were concerned about issues of micromanagement, current inconsistencies of policies in place, the execution/ accountability of those policies and practices, and whether learning analytics would add to their already overloaded requirements
  • Ease of integration with existing organizational structure
    • There were many concerns expressed about how learning analytics would actually work in practice. All institutions liked the theory and the idea behind it, but were uncertain about what it would mean to actually implement it
    • Most of the organizational structures in place did not support learning analytics implementation. For example, many institutions were divided about their use of personal tutors (some did not have this role at all) and instructors/ lecturers. Determining who would be responsible for implementation was a topic of much debate

Organizational Trends Summary

Overall, from an organizational perspective, there was a high level of support for learning analytics. While there were concerns, as expressed above, the learning and teaching, academic staff, and student groups all felt strongly that learning analytics would be of benefit to the institutions; however, only if it were implemented correctly and with the full participation of all required groups. High-level buy-in did vary from institution to institution, but there were more supportive members of leadership than not. Very few institutions had organizational challenges and obstacles that could not be overcome with communication, training, and involvement from several departments throughout the institutions.

Trends: Technical

The following are observed trends in regards to technical aspects such as data, infrastructure, and integration.

  • Demonstrates sufficient learning analytics knowledge
    • Most technical departments throughout the institutions had some knowledge of learning and predictive analytics; however, much of their knowledge and experience was in academic analytics
    • Most of the education about learning analytics had taken place within the technical departments themselves
  • Institutional infrastructure
    • Many of the institutions already had a data warehouse, but it housed only limited integrated data from one or two systems, e.g., the VLE (Virtual Learning Environment) and the SIS (student information system)
    • All of the institutions still had many “manual processes” (e.g., collecting attendance manually on Excel spreadsheets) that were not being captured or housed in a collective place
    • All of the institutions expressed interest and a desire to have collective information in one place that was easily accessible (i.e., a “one source of truth” concept)
  • Data management
    • All institutions were using data but for many different purposes and none of them were in sync. Most were not aware of which data other departments were using and for what purpose
    • They were all compliant with Data Protection laws and policies; however, each department appeared to have their own interpretation of those laws and policies
    • They expressed a desire to have a unified way of managing data across the entire institution
  • Ease of integration with existing infrastructure
    • Due to having so many different systems and data sets, the ease of integration appeared to be a challenge for most institutions. Institutions were using an average of 8-10 different systems and many more different ways of collecting and interpreting data
    • Many had also previously purchased commercial systems that may pose challenges for integration with learning analytics technology
    • The integration of systems depends heavily on the expertise of the institutions’ technical teams and their ability to comply with the xAPI and UDD (Universal Data Definitions) requirements. Due to the many inconsistencies in data collection, much work will be needed in this area for the institutions
  • Maintenance
    • This was a major concern for all technical teams throughout the institutions. None of the institutions had the breadth of skill sets among their staff to manage a learning analytics implementation in-house, nor the expertise to maintain the technology afterwards. Data Scientists, Data Engineers, Business Analysts, Quality Assurance, and other technical roles were not found at any of the institutions
    • Institutions would be in need of constant support during and after the implementation of the technology
  • Resource support
    • While there was buy-in for the implementation of learning analytics, all institutions had smaller technical teams with concerns related to resource allocations, workload additions, and time management
    • Most also expressed the desire for explicit prioritization of learning analytics from leadership to help them guide and direct current resources and work efforts
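The xAPI requirement mentioned in the integration trends has a concrete shape: every statement records an actor, a verb, and an object. As a minimal sketch (names, IDs, and the helper function below are invented for illustration, not drawn from any institution's data), here is roughly what a manually collected attendance record might look like once expressed as an xAPI statement for a learning record warehouse:

```python
import json

# Illustrative xAPI statement: "this student attended this seminar".
# The actor/verb/object trio is required by the xAPI specification;
# the specific names, email, and activity ID here are made up.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.ac.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/attended",
        "display": {"en-GB": "attended"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://vle.example.ac.uk/course/101/session/3",
        "definition": {"name": {"en-GB": "Week 3 seminar"}},
    },
}

def has_required_fields(stmt):
    """Check the three fields every xAPI statement must carry."""
    return all(key in stmt for key in ("actor", "verb", "object"))

# Serialize for sending to a learning record store/warehouse
payload = json.dumps(statement)
print(has_required_fields(statement))  # True
```

The point of the shared format is exactly the "one source of truth" idea above: once attendance spreadsheets, VLE clicks, and SIS events are all expressed as statements like this, they can land in a single store and be queried together.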

Technical Trends Summary

Technical trends centered around the availability, access, and use of data, as well as the staffing needs to deploy and maintain a learning analytics environment. However, very few institutions had technical challenges and obstacles that could not be overcome with communication, training, and involvement across departments throughout the institution.


In the aggregate, our measurement of “readiness” across the institutions we served yielded an average of a “somewhat ready” to “ready” score. This denotes the majority of the institutions visited were reasonably ready to implement some element of learning analytics technology solutions, provided they address the more glaring challenges first. As these institutions move forward, it will be key for them to keep open communication with all of the Readiness Assessment participants throughout the project; without this, there will be a significant decrease in buy-in, enthusiasm, and willingness to accept/ adopt change.

The following selection of quotes comes from a few of the institutions we visited and demonstrates the value of the Readiness Assessment approach:

“Just getting the groups together to talk to each other has been a valuable experience. Most have never even met and the discoveries made about how different departments affect each other has been brilliant.”

“We are extremely grateful for the guidance and assistance in aligning the university and encouraging genuine conversation on how we take learning analytics forward.”

“As a senior member of staff, it was valuable to look at this process from both a top-down and bottom-up approach to determine the best way to provide support and leadership around the [learning analytics] initiative.”

Based on client feedback, Unicon now offers a service to help institutions get started with learning analytics that includes similar readiness activities. LA Quick Start provides institutions with an environment to help get them off the ground, which can evolve into a full learning analytics solution at each institution’s own pace. The service includes data integration with the LMS into a Learning Record Warehouse (on-premise or in the client’s cloud environment), and the ability to view student activity through dashboard visualizations. In addition, Unicon includes consulting hours with this service to help with an analytics Readiness Assessment and/or roadmap planning.

LA Quick Start can be used to get conversations started across various groups on campus, which is one of the key factors in getting onto the path towards success with learning analytics. View the LA Quick Start datasheet for more information.

Download the PDF version of this article.

Useful Reading

Jisc, Effective Learning Analytics, 5th UK Learning Analytics Network Event (2016)
Jisc, Learning Analytics Discovery Service (2015)

Case studies: journeys towards digital expertise

Originally posted on Jisc digital capability codesign challenge blog.

New and updated resources in this post:

Case studies:

Summary report: Journeys towards digital capabilities

Digital capability is an agenda for organisations across the sectors of education. But how best to take it forward in your own setting? Time and again we hear that examples from practice are what people need to turn inspiration into action. There is no substitute for learning from people who have tried and succeeded, and although we can’t bring you those people directly, we have done the next best thing. Through interviews with key players and a look at the background evidence for each case we have produced a series of written reports on organisations that are making a difference.

You can explore the full list of case studies from the links above and from the Digital capability project page. Each one starts with a general overview so you can judge how relevant the lessons might be in your own setting. Some of the ideas, though, will be relevant to everybody. We have drawn these together in a summary report, Journeys towards digital capabilities, which lists the lessons learned at the different case study sites. These cover: frameworks and definitions; other strategic approaches; development strategies (personal and curriculum change); motivation and reward for staff; and ideas for working with students.

In the next post we hear from some key change agents whose stories feature in the case studies, and who also presented their experiences at Digifest.

Follow #digitalcapability for updates over the next days and weeks, and beyond.

Digifest: Learning analytics interventions should always be mediated by a human being

Originally posted on Effective Learning Analytics.

I was at Jisc Digifest 2017 last week, involved in several sessions around data and analytics. For me, though, the most interesting session was the Humans vs Machines debate on the motion that “Learning analytics interventions should always be mediated by a human being”.

In case you are new to learning analytics, it is the use of data to support learners, enhance other educational processes, or improve the curriculum, usually involving some sort of data analysis that a normal human would struggle to do. The intervention is the action a human or machine takes, in the form of an alert (via text, email, phone call, meeting, etc.), in response to the data analytics, to attempt to support the student.
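To make the Machines side of the debate concrete, a fully machine-mediated intervention is ultimately just an automated rule applied to the analytics. The toy sketch below (thresholds, signal names, and messages are all invented for illustration, not taken from any real system) shows the kind of decision a machine would make with no human in the loop:

```python
# Toy sketch of an unmediated, rule-based intervention.
# The engagement signals and cut-offs here are purely illustrative.
def choose_intervention(weekly_vle_logins, attendance_rate):
    """Return an automated action based on simple engagement signals."""
    if weekly_vle_logins == 0 and attendance_rate < 0.5:
        return "email: urge student to contact their personal tutor"
    if weekly_vle_logins < 3 or attendance_rate < 0.75:
        return "text: nudge with links to support resources"
    return "no action"

print(choose_intervention(0, 0.4))  # escalates to an email
print(choose_intervention(5, 0.9))  # no action
```

The Humans side of the debate is, in effect, arguing that a person should sit between the output of a rule like this and the student, deciding whether and how to act on it.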

The two debaters were Richard Palmer from Tribal, for the Machines, and Sheila MacNeill from Glasgow Caledonian University, for the Humans.

Jisc events started the debate early with Learning analytics: ditch the humans, leave it to the machine – a Digifest debate, which led with some arguments from Richard on the unreliability of humans and the potential of machine-based learning and interventions. This was quickly countered by a blog post from Sheila, Time for Analytics of the Oppressed? – my starter for 10 for #digifest debate, which argued for both the practical and ethical benefits of human interventions.

Some took the news story literally and missed that it was a debate. One quote from Twitter: “I thought this piece was sarcasm. Unfortunately, its not. I couldn’t disagree more. Really ?” But this helped get the debate going.

The session was a lively debate and was captured after the event in this short video.

A number of points were raised by the audience; one in particular was around the need to give students control over interventions: they should be able to say how interventions are made and whether they are contacted by a human or a machine. This argument reflects the suggestions in an earlier post on consent for learning analytics. Personally, I would much rather purchase goods online or interact with a web form than deal with a salesperson or some online support services. Sheila argued it is easier to ignore a machine. Another suggestion was that it may come down to cost of delivery: a machine-based intervention is cheaper than a human-based one.

So what was the outcome? Not what I expected: the voting was so close that I had to count hands to find it was a 49/51 split. It all came down to the word “always”. Richard, in summing up, agreed with all the moral, ethical, and human arguments, but swayed enough people by saying there would be times when a machine-based intervention is better.

However, the debate has continued. Sheila posted a further blog post reflecting on the discussions, “I wish I’d said that . . . reflections from #digifest17”, which suggests that the only intervention that will ever work is when a human decides to make a change: either a student changes their behaviour, or a staff member provides support or changes the learning design or environment to assist the learners.

I am sure the debate will continue as we implement learning analytics. Automated decision making has legal implications, and I’m not suggesting that learning analytics should actually make decisions for us. However, for good or bad, we increasingly trust and rely on machines to make better decisions in our lives.