Strategy is multimodal

Originally posted on lawrie : converged.

One of the things I hear working with leaders in organisations is “I need to write the <insert term> strategy” — it doesn’t actually matter which strategy it is, or what flavour of strategy it is. When you start working at a certain level, you have to start writing strategies, or parts of strategies; and the more senior you get, the more of the strategy you are responsible for, until you reach the tipping point where you tell people the vision and ask them to write the strategy.

On Saturday Laura Gogia wrote a post, “Facilitating the paradigm shift”, in which she spoke about getting a faculty to think strategically. It’s a great post, and it relates how her own journey towards strategic thinking happened.

The thing with Laura’s journey is that her thinking started from a place of practice, which led to strategy. But I wonder how many people see strategic thinking as the opposite of practice, or at least as being in some dissonance with it.

How often do we hear “think strategically” or “be more strategic”?

Leaders need to be able to think about strategy in a “meta” way, as Laura describes it, and they need to think about strategy as practice. For leaders, especially the ones we are working with in the higher and further education sectors, strategy is both thinking and doing.

Some of this is at the centre of the Jisc Digital Leaders course. The time we spend over the four days of the course is about leaders having enough information to be able to think through and develop strategies around digital. It is also, and crucially, setting leaders up to be in a place where they can situate their own digital practices as a model for their staff and peers.

“Being strategic” means practising strategy as well as thinking strategically. Effective strategy is multimodal, and effective leaders ground their thinking in a reflective approach to their own practice and the practices within their organisation.


Our new Community of Practice sets sail

Originally posted on Jisc digital capability codesign challenge blog.


More than 90 digital capability specialists and enthusiasts came together for the launch of our Community of Practice at Aston University last week.

Delegates heard from Sarah Knight about the resources Jisc has developed for the community, including the new Developing organisational approaches to digital capability online guide launched last week. But this was very much a working meeting, with the focus on sharing ideas and developing things together. In a rapidly emerging field, strategies need to be kept in development, and resources produced in an agile way. So a lot of the day was given over to workshop sessions where we mapped the landscape and planned next steps.

I gave an overview of where we are in the UK, and how we got here (including some international perspective). Then we heard a pair of inspiring organisational case studies from John Hill of the University of Derby, and Richard Fullylove of Coleg y Cymoedd. John described an integrated approach to staff and student development, incorporating the Jisc framework and profiles, and self-assessment tools. Richard showed how the use of infographics is helping to get the message across in his college, along with 30 full hours of CPD provision for staff.

You can access the Periscope recordings from these sessions here and also review a Storify of reactions on Twitter.

Next we split into groups to paper-prototype four ‘toolkits’ for embedding digital capabilities. You can find our ideas and resources on these four Padlets:

After lunch, Gillian Fielding gave an overview of the findings of the 2017 UCISA Digital Capabilities Survey. You can access the Periscope recording here.

Then it was back to groupwork, with groups working in parallel on the same three issues this time – barriers and drivers to digital capability development, and available resources. A write-up from the activity is available here and will be further analysed in the coming days to decide on priorities.

At a final panel, representatives were put on the spot with challenging questions such as ‘what makes for a digitally capable organisation?’ and ‘what is the one thing an organisation needs to get right?’. Luckily the day had lent us all inspiration and ideas.

You can find a Periscope recording of the plenary panel session here.

The findings of our Mentimeter polling are also an important resource for thinking about where we go next.

  1. What one thing should the community do? (Open question)
  2. Which activities would you prioritise for the community? (Closed question based on results of 1.)

To join the Building Digital Capability community of practice, follow #digitalcapability on Twitter, and for more information on the Building digital capability project, get in touch by email.

FE and Skills Coalition Meeting – The Apprenticeship Journey in a Digital Age

Originally posted on Jisc Innovation in Further Education and Skills.

On the 10th May over 50 representatives from further education and skills across the UK attended the Jisc-hosted FE and Skills coalition meeting in London. The theme of the meeting was the apprenticeship journey in a digital age. The aims of the meeting were to explore how we can embed technology throughout apprenticeship design, delivery and assessment, and to hear from participants on their experiences of using technology in support of apprenticeships.

Apprenticeships are going through a period of great change, with government targets to deliver 3 million new starts by 2020 and a move from provider-led frameworks to the delivery of new employer-led standards. Many providers now see technology as core to their delivery model if they are to be cost-effective.

All the slides from the meeting will be available here, and you can access the Storify summary of the day.

Highlights from the meeting include:

The meeting began with an overview of how Jisc is supporting colleges and skills providers with their use of technology, and highlighted the new resources available to help organisations develop their digital environment. New resources include:
• Online guide on developing organisational approaches to developing digital capability
• Briefing paper on developing organisational approaches to developing digital capability
• The evolution of FELTAG: a glimpse at effective practice in UK further education and skills
• Enhancing the digital experience for skills learners online guide
• Area review toolkits available here

Joe Wilson sharing his views on apprenticeship landscape

We heard from Joe Wilson, @joecar, on his views around the current apprenticeship landscape and the challenges facing colleges and providers with the anticipated changes.

Apprenticeship toolkit from Jisc

Lisa Gray updated the group on the work Jisc has been doing on how technology can support the design, delivery and assessment of apprenticeships. Jisc is developing guidance for providers moving to these new models of delivery, and Lisa provided an overview of work to date. The guidance is based on a roadmap of effective practice, from the preparatory stages through delivery of training to end-point assessment, and highlights the key role of technology throughout. Read Lisa’s blog post about the project; you can also access the apprenticeship toolkit here. We would welcome your views and feedback on this resource; please contact Lisa Gray for more information.

Nick Poyner, Real Apprenticeship Company presented on how technology supports their delivery

We heard from Nick Poyner, from the Real Apprenticeship Company, on how they are making effective use of technology to support the delivery of their apprenticeships. Nick’s slides will be made available from the events page, and you can read more about their work here.

The Education and Training Foundation have a programme of support for providers and colleges delivering apprenticeships and Alison Morris and Dawn Buzzard presented to the group on what support and training opportunities are available.

Dawn Buzzard presenting on ETF support programme for apprenticeships

In addition, Ufi run the Blended Learning Essentials course and are developing a new course aimed specifically at supporting staff delivering apprenticeships with the role of technology. Further information is available from Ufi.

To end the day Rob Bristow and Sarah Dunne ran an activity on developing user stories to capture user requirements, to feed into the developments on the Jisc digital apprenticeship work. You can view the user stories here.

Thank you to all those who contributed to the event and to the rich discussion. The next meeting of the group is on 11th October. You can join the mailing list for this group by visiting FELTAGIMPLEMENTATIONGROUP, and continue the discussions on Twitter using #FELTAG.

Online language – How are communities using it?

Originally posted on Inspiring learning.

So we have a new species of language – neither written nor spoken but with elements of both.

New protocols, changing spellings, new words and phrases, emojis and icons, memes and gifs – new and evolving ways to express ourselves. Our evolving online language is creative, challenging, exciting and sometimes baffling. And like any language it is used in different ways by different communities, groups and individuals. Often it is used as a badge of identity. Like clothes, cars, hobbies and accessories, language can make up part of who we are and how we present ourselves.


Different communities use language in different ways. I got really interested in the work of Penny Eckert a few years ago when I discovered her linguistic studies of the Jocks, Burnouts and Geeks in American high schools. You can get more of a flavour of her work in this later paper on adolescent language. My lightbulb moment happened when Penny described the three groups of young people distinguishing themselves from each other by clothes, accessories, behaviour and language.

Slang often originates with young people, multicultural communities, armed forces, political movements and so on, and becomes assimilated into mainstream language, only to be replaced by the next generation of alternative language! Think of the multitude of words from a range of past cultures which sound perfectly ordinary to us now – youth words like cool, wicked, like; rhyming slang like butcher’s, porky, barnet, raspberry; military terms like doolally, lousy, snapshot, scrounger; criminal underworld terms like chum, rumble or knock off.

In fact we can often identify an online community by the way the language looks. Think of the Twitter community, for example. We see a piece of text under 140 characters long; it perhaps has a #hashtag; the name of the poster is preceded by @name. They may mention other community members by their @names, and a tweet will often contain a shortened link and an image, video or gif. If you are in the process of learning how to use Twitter for professional practice, this is an excellent guide.
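Those surface features (the length limit, #hashtags, @mentions, shortened links) are regular enough that even a crude script can spot tweet-like text. Here is a minimal sketch in Python; the example strings are invented for illustration, and the heuristic is mine rather than anything Twitter publishes:

```python
import re

def looks_like_tweet(text: str) -> bool:
    """Rough heuristic: does this text show the surface features of a tweet?"""
    if len(text) > 140:  # the character limit at the time of writing
        return False
    has_hashtag = re.search(r"(?:^|\s)#\w+", text) is not None
    has_mention = re.search(r"(?:^|\s)@\w+", text) is not None
    has_link = re.search(r"https?://\S+", text) is not None
    # Any one marker is enough for this rough check
    return has_hashtag or has_mention or has_link

print(looks_like_tweet("Great session on #digitalcapability with @jisc https://t.co/abc"))  # True
print(looks_like_tweet("A plain sentence with none of those markers."))  # False
```

Of course, a real classifier would need far more than this, but it makes the point: the community's language has a recognisable visual grammar.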


Mother and son conversation – Snapchat

You can also identify a Snapchat message by the way it looks – it will have an image with a text box over it, often containing words and emoticons. You can apply filters to the images and draw on them too. One of the interesting things about this is that the images are part of a real-time conversation, not special records of special times. The messages disappear within a few seconds of being opened, or they can be added to a shared group story which lasts for 24 hours. It’s a tool for now – no comments and no likes… There’s a good article here if you want to know more.

There are lots of communication apps and social media sites that are identifiable by the way the language looks and feels, but I’ll describe just one more.

Facebook status and comment

Facebook users will recognise the layout of this status and comment. The first comment includes a profile photo and the poster’s name is blue highlighted so we can click on them and visit their profile page. Under the comment there is a time stamp and blue highlighted ‘like’ and ‘comment’ for friends to interact. (There are more interaction choices on Facebook now.) We can see that three people have liked it, and one has commented. The commenter’s profile picture is also shown and we can click on their name which is also blue highlighted. There is space to write your comment too.

It’s possible to identify communities such as Reddit, Imgur, Tumblr and many others by their use of particular words and phrases, and some communities will even mock others about their use of language! Imgur has a handy glossary so that new arrivals can find their way around the site, translate the acronyms and understand the in-jokes. Users have also come up with their own glossaries, like this one, designed to make newcomers feel more welcome.

If you are looking to become part of an online community, lurking is a good way to get used to its language protocols and behaviours. I have been lurking around some communities for so long now, it’s embarrassing… I don’t know if I will ever make a comment on Imgur, even though the community comes across as loyal, comradely and welcoming to newcomers. But one wrong move – posting a selfie, promoting a product, using Tumblr language, reposting too soon, plagiarism – and the wrath of the community is upon you.

We should legitimise and support various communities’ development of language as we move through time, but we also have to be aware of the dangers of extremism and the existence of malign communities that also use a sense of identity to draw vulnerable people in. Use of language can be a warning sign in some cases that a person is identifying with a community that might cause us concern.

Language use is part of the identity of communities – you can be included or excluded depending on your understanding and use of language and behaviour, and people have to be able to learn the language etiquette that will enable them to be a part of the community they want to belong to. Online language evolution enriches all our languages and it helps our communities to thrive.


(If you would like to see some of my favourite examples of creative online language, here’s a link to my Tumblr collection of artefacts, articles and amusing memes.)

(You can read sections of the PhD here.)

(Have a look at Jisc’s work on digital capability here.)


  1. Online language – Journey to a PhD
  2. Online language – What does it look like?
  3. Online language – A new species of language

Coming next:

  1. Online language – Why do we need to teach it?
  2. Online language – Bilingualism
  3. Online language – Somewhere along the line




Trends Unpacked (Part 3): Technical Challenges and Learning Analytics

Originally posted on Effective Learning Analytics.

This is a guest post from Lindsay Pineda, Senior Implementation Consultant, and Amanda Mason, Senior Business Analyst, at Unicon, Inc.

How are you approaching a learning analytics implementation at your institution? Is the impression that it is mostly a technical implementation? Or is it mostly an organizational/cultural one? As we discuss in this installment of the “Trends Unpacked” series, it is actually both, and it is important that your institution believes this as well.

Amanda Mason, a Senior Business Analyst with Unicon, has extensive experience in recognizing technological challenges related to systems integration, investigating technical requirements, and strategic analysis. All of these skills have been a vital piece of connecting the technical side of a learning analytics implementation to the organizational one. You can read more about Amanda’s experience in her bio at the end of this article.

In the past few “Trends Unpacked” articles, the focus has been mainly on the organizational challenges, observations, and recommendations. In this third installment, Amanda and I are going to focus on the first three aspects of technical challenges and trends:

  • Demonstration of sufficient learning analytics knowledge
  • Institutional infrastructure
  • Data management

In future posts, we will cover the remaining technical challenges and trends.

Demonstration of Sufficient Learning Analytics Knowledge

As highlighted in the article “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns,” some of the challenges and trends we observed were related to demonstrating sufficient learning analytics knowledge:

  • Many individuals within technical departments expressed concern about their collective learning analytics knowledge. Most felt they had some knowledge; however, much of it had been focused on institutional analytics.
  • The bulk of the time spent on education about learning analytics was experienced within the technical departments themselves.

The following examples illustrate the types of learning analytics knowledge challenges most often expressed at the institutions:

  • What do we do with the data? – This was a common theme expressed by many at the institutions we visited; at one particular institution we were told, “How is this different than what we already do? We already have loads of data, we just don’t do anything with it.” This is a consistent statement among institutions. Most do have loads of data they have been collecting for decades, but that is not the difficulty. The difficulty lies in where the data is located: spread across several different systems and departments, and often stored in the minds of tenured individuals who are sought out to advise in specific situations. Some viewed this wealth of data as a positive thing because so much information was being captured about students. However, the fact remains that a lot of data does not equal a lot of knowledge about how to use it. As one institution pointed out, “There is the perception that having loads of data is a good thing, but we’re not sure if it’s at all useful or valuable.”
  • We need a systematic way to collect data – At other institutions, we experienced concern from a technical group of individuals who voiced frustration regarding the sheer volume of data collected. They expressed that it is difficult to determine how to collect the right data in the right format for the right practices, from all of the varying systems they currently have in place. This same group advised us, “We need to be very clear about the data we collect currently and how this is different than what we are going to collect in the future. There needs to be a structure in place to systematically collect data moving forward.”

Institutions shared some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Define a clear purpose for data collection – Institutions recognized the need to use data to help inform effective delivery of a positive student experience. One institution told us, “We need to do something with the data. We can’t just let it sit there.” Questions need to be asked in regards to several individuals and departments to determine where priorities lie for data collection. Having a clear idea of what data should be collected and for what purpose is imperative. This ensures the processes and policies will reflect the goals of the institution as it pertains to learning analytics. As another institution communicated, “We have a good sense, at an institutional level, of what data is needed, but we need to determine where the priorities lie at the learner level and what power there is in that data.”
  • Define the data points for collection – Most institutions have policies and procedures according to government requirements, but few have actual definitions of data points according to their specific usage on campus. For example, at one institution, “student engagement” meant how many times a student logged into their Learning Management System (LMS). At another institution, this meant how many times a student physically showed up to the classroom. It is paramount that data points such as student engagement, retention, completion rates, and employability are well defined at the start of the initiative. Asking questions, brainstorming with others from different disciplines, and taking the time to define the specific data points that will be collected can benefit the collaborative development of clearly defined policies and processes moving forward.

Institutional Infrastructure

Many of the institutions we visited already had a data warehouse, but it housed limited integrated data from only one or two systems, for example, the Virtual Learning Environment (VLE), LMS, and Student Information System (SIS). All of the institutions still had many “manual processes” (e.g. collecting attendance manually on Excel spreadsheets) that were not being captured or housed in a collective place. Additionally, the institutions expressed an interest and a desire to have collective information housed in one place that was easily accessible (e.g. a “one source of truth” concept).

The following examples illustrate the types of institutional infrastructure challenges most often expressed at the institutions:

  • Data gaps and limited data availability – We were informed, at several institutions, that the readiness assessment process was the first time some staff had investigated the data the institution currently collects. In preparation for the onsite visits, many began to identify gaps in data or take notice of the limited amount of data available for use with learning analytics technology. One institution told us, “Not having certain data hasn’t been a problem in the past because it hasn’t been required for use.” This seemed to be a shared theme among the institutions visited. Another institution illustrated a specific example of missing or limited data related to library information, “We’ve noticed there are quite a few bits of data the library doesn’t capture. And the information it does collect often can’t be shared due to privacy policies.” These types of situations were common and certainly a point of frustration for many institutions.
  • Data ownership concerns – We found that several departments within an institution collect data, but they do not own the information itself. The institution, as a collective body, often officially owns the data, but does so without a clear gatekeeper for access to that data. This can pose the problem of who should own the data collected and how one gains access to the information for learning analytics purposes. One institution articulated to us, “There is no clear data ownership within the institution. We are unclear on who decides what data is used and collected and what isn’t. There is nothing systematic about the way we approach data.”

Institutions shared with us some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • “Single source of truth” – This can be both a technical and organizational solution. Institutions expressed that having one place to go for most, if not all, data collected would be the most effective way to mitigate the questions surrounding data collection processes and data ownership. One institution advised, “We only have ‘single points in time’ for data right now, like the VLE activity. What we need is a more dynamic view with more systems included and housed in one place to access it.” The solution could be a centralized data warehouse or Learning Records Warehouse (LRW) that is linked to several systems for data collection. For institutions, having one place to go to get the information needed saves time, energy, and effort. An individual at one institution said, “The University has never had a holistic viewpoint on what data is needed and collected over time. This would be hugely beneficial for everyone if we found a way to do this.”

Data Management

Individuals at the institutions collectively expressed their frustration regarding inconsistent data management practices and policies; we found that all institutions were using data, but for many different purposes. However, not all of the processes were in sync with each other. Most individuals within the institutions were not aware of which data other departments were using and for what purpose.

The institutions we visited were generally compliant with the UK’s Data Protection laws and policies; however, each department appeared to have their own interpretation of those laws and policies. They also expressed a desire to have a unified way of managing data that was implemented across the entire institution.

The following examples illustrate the types of data management challenges most often expressed at the institutions we spoke with:

  • “Lack of confidence” when it comes to data collection policies – Several institutions communicated a lack of confidence in the data that was currently being collected. This includes whether or not the data was of good quality for a learning analytics initiative purpose. There was also communicated concern regarding the lack of unified policies to help guide the collection of data itself. We heard several discussions around issues such as not having a universal policy for VLE/ LMS usage, attendance policies being unclear, and data not being collected uniformly across departments. Individuals within a large group at one institution had no awareness of what role each held within the university and how their job duties affected each other. They were unclear about how the data they needed impacted and overlapped with each role as well. One individual at that institution stated, “We have a vast array of ways we collect data at the moment and so many issues around that. We are seeing missing data, needs for data which is not currently gathered, attendance data that varies across programs and we have no idea if the data that actually is available is any good.”
  • Data protection and privacy policy confusion – An individual at one institution told us, “We have so many different policies and procedures for every department and program across the university right now. I have no idea how we are going to control that moving forward.” We found the same to be true across other institutions we visited. Most had policies and procedures regarding data protection and privacy, but they felt things were inconsistently enforced and poorly executed. Another individual voiced, “We need to be really clear about what data the institution is using and why we are using it. How in depth does it need to be? And what is the risk of doing it versus not doing it? We just have no idea right now.”

Institutions shared some ideas with us regarding potential solutions and recommendations that they feel would be beneficial:

  • “It is important to determine how data is captured and that it is standardized” – This is a direct quote from one member of an Information Technology (IT) group at an institution. This person advised that the group regularly discusses their frustrations about this topic. They informed us that the bottom line for them was to have “controls about confidentiality levels, data that is captured and how it is used need to be determined upfront. This needs to be done the same way for everyone as well; not just IT.” Standardizing, in the initial stages of a learning analytics initiative, what data will be collected, who will gather it, and for what purpose it will be used helps outline the process for future iterations and allows for a smooth transfer of responsibility should any of those resources leave the institution.
  • Establish a searchable “policy bank” – The individuals within one institution’s technical group shared that “having a searchable ‘policy bank’ would greatly cut down on the confusion about where a policy is and how to follow it.” This particular institution explained that training new staff in the policies and procedures to be followed was difficult, due to the vast number of policies that existed and the fact that no one executed those policies in the same way. Another institution we visited had spent a significant amount of time and effort developing a very clear outline for a policy regarding student consent, including what “consent” meant to that institution, who can give consent to whom, and when consent is required. After all of this effort, there was still nowhere for someone newly onboarded to look the policy up. As it was pointed out to us, “This is not an efficient, or even practical way, to house policies. If we can’t find it and it’s not readily available to everyone, how can we make sure we are adhering to it?” Establishing a centralized, searchable policy bank or storage hub would significantly benefit institutions.

Throughout our visits, we found a common, prevalent misconception that an initiative such as learning analytics is either a technical or an organizational solution. It is both, and the two need to be addressed concurrently. Technical challenges are very real and prevalent; however, focusing only on those particular challenges can lead to the fallacy that a learning analytics solution is a “silver bullet” or “magic cure.” Our message to institutions, based on our experience, is that there is no “fix-all” solution that will resolve the entirety of the systemic challenges an institution faces. Learning analytics technology is a tool designed to help provide a richer student experience. It is not meant to be the solution; it is meant to be a part of the solution.

Please be on the lookout for another article coming later this month regarding the quantifiable findings from the readiness assessments conducted and highlighted in the article, “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns.”

Here’s to continuous growth and improvement!

Trends Unpacked (Part 2): Four Things Senior Leaders Need to Know About Their Role in a Learning Analytics Initiative

Originally posted on Effective Learning Analytics.

This is a guest post from Lindsay Pineda, senior implementation consultant for Unicon.

One of the most common questions I get during onsite readiness assessment visits is, “What can I do, as a leader, to support learning analytics across my institution?” While working with varying institutions, I observed several key ways in which senior leaders such as deans, department heads, and vice chancellors can support the progress of a learning analytics initiative.

  1. Information Sharing: The first aspect to consider when engaging early with an initiative is spreading the word: sharing true and clear information about the initiative within the institution. Others look to you, as a leader, to help guide the overall strategic direction and vision. This doesn’t mean one leader alone within the institution can or should lead a larger-scale initiative like learning analytics. The institutional leader cannot be the only one sharing information with other departments or creating a collective strategy; it has to trickle down throughout the rest of the leadership team. For example, senior management needs to feed it to Deans, Deans to Associate Deans, Associate Deans to Department Chairs, etc. This sort of information dissemination creates a strong foundation for building the initiative. It is not always easy to control the interpretation of messaging as it spreads across campus, but ensuring you deliver the true, correct message helps dampen the potential gossip surrounding it. Information sharing also empowers those who work under leadership to continue to share the message across departments. More sharing of the true message is always for the better.
  2. Expectation Setting: This next step is one that I found many leaders perform to help properly prepare staff for a larger initiative such as learning analytics. Even if you feel you have expressed support for the initiative in other ways, actually setting expectations for the work makes all the difference. For example, setting an expectation during budget meetings that resources will be allocated to the new initiative for a particular duration is an excellent way to express support, and it reinforces the initiative’s priority. Vocal expression of support, followed by the action of setting expectations around what work will be required, is essential if staff are to believe that the initiative is a priority and will happen.
  3. Consistent Messaging: Once information has been shared, transparency established and expectations set, it is helpful for leaders to provide consistent messaging to their staff. Message consistency reiterates the clear, correct message that has been established. In my observations, some leaders were happy to support an initiative when first speaking about it, but the message of support seemed to wane as time went on, perhaps due to changes in leadership priorities. Consistent messaging can be difficult when many different priorities compete for your time, effort, and approvals. However, an initial statement of commitment, backed up with consistent messaging throughout the project, demonstrates a connected, involved, and dedicated innovator. Keeping the focus pointed in the right direction lets everyone involved know that the initiative is still important and a priority. These are likely things you already do on a daily basis. Holding true to your word goes a long way with those who are watching and taking cues from your guidance. Share your conviction.
  4. Collaborative Thinking: Every learning analytics initiative grows and changes over time, as it should. Continuing to involve staff, colleagues, and even students along the way is fundamental. Welcoming new ideas and fresh takes on current processes and implementation plans is a dynamic way to lead change within your institution. If you want to change something, you have to do it, show it, and lead it, at every opportunity. Your students, your staff, and your colleagues are all watching and will, ultimately, see and do the same. Allow others to feed their passion, curiosity, and enthusiasm into the next steps; collaborative thinking is contagious.

One of the best things we can do in higher education (HE) is to learn from each other. Most senior leaders are attempting to do what is best for those they lead, for their institution, and collectively across HE. This article is just a reminder to continue on with those efforts.

Here’s to continuous growth and improvement!


Lindsay Pineda is currently a Senior Implementation Consultant for Unicon and has a rich background in learning/predictive analytics. In her previous position, she focused on helping to develop, implement, and execute a proprietary predictive modeling technology that has proven to be successful in predicting student course persistence on a week-to-week basis. Lindsay has immersed herself in learning/predictive analytics research, practical applications, and implementation. Since coming to Unicon, she has been working with institutions to provide Learning Analytics solutions, both technical and nontechnical, and has had a focus on visiting institutions onsite to provide Readiness Assessments. She helps institutions work through issues of change management, resourcing, challenges, and concerns relating to the adoption of Learning Analytics.

Launch of new digital capability community and new guide to developing organisational approaches to digital capability

Originally posted on Jisc digital capability codesign challenge blog.

Technological change and evolving digital capabilities – how can we keep up?
Digital know-how (Beetham, 2015) is an acknowledged asset to our future economy (House of Lords select committee on digital skills, 2015) that is changing the way we work, communicate and conduct our personal business. The sustained rate of growth in digital innovation and creativity offers what seems like a limitless array of new opportunities and discoveries, but also raises challenges (Ecorys UK, 2016) – how can we keep up with changing demands and rapidly changing skill sets?

Starting the conversation: building the digital capability community
Our understanding of what it means to be digitally capable continues to evolve – it is not a static thing and will mean different things to different people at various stages of their academic and working lives. One of the key messages from the case studies Journeys towards digital capabilities is how important it is for organisations and teams to explore frameworks and models and use these as conversation starters to establish shared understanding and a common vocabulary.

It is therefore very timely that this week sees the inaugural meeting of the digital capability community of practice for those responsible for developing digital capability in their organisations. The aim is to engage community members in co-creating resources, sharing knowledge and experiences.

Although the event is fully booked, those unable to attend can join the discussion on Twitter at #digitalcapability @jisc and join the digital capability mailing list.

New guide: Developing organisational approaches to digital capability
If you have read recent posts on this blog by Helen Beetham and Sarah Knight you will be aware that Jisc has recently released a collection of new resources designed to support you to develop digital capabilities in your organisation. These include a briefing paper, an updated framework of digital capabilities, a series of digital capability role profiles (some linked to professional association requirements), an audit tool and checklist, a curriculum design checklist together with 14 new case studies from both Higher and Further education providers.

Jisc’s new guide, Developing organisational approaches to digital capability, launched last week, provides a structured route through these new and updated resources. We hope it will be of value to those starting out, as well as supporting those already working in this field with an approach based on the experience of others.

Why the focus on organisational approaches?
The guide specifically focuses on organisational approaches because the identity, culture and infrastructure of an institution have a significant impact on the extent to which digital practices are facilitated and, in turn, determine how enabled individuals feel in developing their own skills, knowledge and practices.

“When we were looking at the Digital Lancaster strategy [we considered] whether people would have the skills to undertake what we were expecting, and above and beyond that, did they have a culture that encouraged them to actually do it?”
Rachel Fligelstone, head of service, strategy and communications, Lancaster University.

Looking beyond individual capabilities: The digitally capable organisation
In addition to the recently updated digital capabilities framework for individuals, we have introduced a new model that articulates what the digitally capable organisation might look like. It is similar in design and structure to the individual framework, but looks beyond the capabilities of individuals, acknowledging the importance of a strategic approach, the impact of organisational culture and infrastructure, and the relevance of digital capabilities to all areas of university and college business.

Strategic steps towards digital capability
Based on the experiences and lessons learned from a diverse range of case study contributors, the guide also includes a customisable 4-step model which suggests practical activities designed to help you build a contextualised model appropriate to your needs.


Keeping the conversation going
The resources and outputs from the digital capability community event and any other relevant research and information will be posted to this blog site. Follow #digitalcapability @jisc on Twitter and join the digital capability mailing list to continue discussions and share your ideas.

Notes and presentations from the 10th Jisc Learning Analytics Network event at Strathclyde University

Originally posted on Effective Learning Analytics.

Glasgow, a great city even when it’s raining, was bathed in sunshine this week for our tenth network meeting, held at the University of Strathclyde. It was good to see how the learning analytics community has grown since we started having these events – there were 80 attendees at this one from institutions all over the UK mainland. Not surprisingly, we had more Scottish representation this time, from quite a few colleges as well as universities.

Helyn Gould introduced the day on behalf of Strathclyde, an institution which has achieved a lot in learning analytics in a short time. We heard about their interesting pilot projects from Helyn’s colleagues later in the day.


A video recording of all the morning’s sessions is available.


Michael Webb updated us on the latest developments with Jisc’s Effective Learning Analytics project [PPT 569 KB]. He described the data sources we’re going to be collecting next with institutions, with a focus on attendance, library and interventions. New xAPI recipes will be built for these. The data can be transferred to the Learning Records Warehouse, and then used in tools such as Jisc’s Study Goal and Data Explorer, and the learning analytics applications of the range of vendors Jisc has now partnered with.
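To make the xAPI recipes mentioned above more concrete, here is a hypothetical sketch of what an attendance statement might look like. xAPI statements always follow an actor–verb–object structure; the specific verb, activity identifiers and names below are illustrative assumptions, not the actual Jisc recipe definitions.

```python
import json

# Hypothetical xAPI "attended" statement for a lecture, using an
# ADL-style verb URI. Real recipes define their own verbs, activity
# types and extensions; identifiers here are made up for illustration.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.ac.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/attended",
        "display": {"en-US": "attended"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.ac.uk/modules/ABC101/lecture/2017-05-04",
        "definition": {"name": {"en-US": "ABC101 lecture"}},
    },
    "timestamp": "2017-05-04T09:00:00Z",
}

# Statements are sent to a learning record store as JSON
print(json.dumps(statement, indent=2))
```

Statements in this shape would be posted to a learning record store such as the Learning Records Warehouse, where downstream tools can query them.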

Michael also outlined plans for benchmarking learning data across institutions (with their consent) and is now looking for people wishing to try this. There’s been a fair bit of interest in this possibility.

Patrick Lynch [Twitter] from the University of Hull was up next [PPT 3MB]. I’d heard good things about his recent presentation at Digifest on the connections between learning analytics and learning design – so I invited him to do something similar here. It turned out to be one of the most informed and reflective presentations I’ve seen for a while.

Patrick described how he’d started to play with 25 million records from the VLE and how he’d experimented with ways to visualise these. But he found conversations with staff and students more useful in working out what best to show them: a typical comment from a student was “I just want to know how I’m doing”.

He then showed some different models of learning design and learning analytics (see his ppt file if you want to explore any of these further), including Doug Clow’s oft-quoted Learning Analytics Cycle and my recently developed model, which he hammered for being too American and not student-centred enough. I now hope to refine it jointly with Patrick!

Patrick proposed that we look at using Chickering and Gamson’s 7 Principles for good practice in undergraduate education, and asked whether we can find data to measure these:

  1. encourages contacts between students and faculty
  2. develops reciprocity and cooperation among students
  3. uses active learning techniques
  4. gives prompt feedback
  5. emphasizes time on task
  6. communicates high expectations
  7. respects diverse talents and ways of learning
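As a purely hypothetical illustration of Patrick’s question, the seven principles above could each be paired with candidate proxy measures from institutional data. The pairings below are illustrative suggestions only, not an agreed recipe:

```python
# Illustrative mapping of Chickering and Gamson's principles to data
# sources an institution might already hold; every pairing here is an
# assumption for discussion, not a validated measure.
principle_proxies = {
    "student-faculty contact": ["staff forum replies", "office-hour bookings"],
    "reciprocity and cooperation": ["peer forum interactions", "group submissions"],
    "active learning": ["quiz attempts", "interactive resource use"],
    "prompt feedback": ["time from submission to grade release"],
    "time on task": ["VLE session duration", "attendance records"],
    "high expectations": ["uptake of optional stretch activities"],
    "diverse talents and ways of learning": ["variety of resource formats accessed"],
}

for principle, proxies in principle_proxies.items():
    print(f"{principle}: {', '.join(proxies)}")
```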

Our final morning session involved some of us who were at the recent LAK17 in Vancouver, reflecting on the conference.

A few of us who were at LAK17

Sam Ahern, Michael Webb, Niall Sclater, Adam Cooper, Ainsley Hainey (Lee Baylis not in picture)

Two themes that had leapt out at Michael were multi-modal analytics and self-regulated learning. The first of these was the subject of two of the three keynotes. In laboratory conditions it’s increasingly possible to measure physiological aspects of learner behaviour such as heart rate, eye movements and facial expression, in an attempt to understand learning processes. Self-regulated learning is a move away from us intervening towards the student taking more responsibility for their own learning.

Ainsley noted how there were a lot of new presenters at the conference from different areas around the world and from different backgrounds including data scientists, senior managers and technologists. She was also intrigued by a presentation on the one laptop per child project in Uruguay where there was now an attempt to analyse learning data on a national basis.

Adam pointed out that, while SoLAR, which organises the conference, is primarily a research organisation, it has put a lot of effort into organising a parallel practitioners’ track – its proceedings are available. Lee noted too that there was a lot more for practitioners at this conference than at the previous one in Edinburgh. Lee also described the Hackathon we organised, where programmers got together with others to carry out some rapid developments to describe data, transfer it between applications, and develop new applications to visualise it.

LAK18 will be held in Sydney on 5-9 March, 2018.

At lunchtime, people had the chance to see demonstrations of the latest product innovations from Blackboard, DTP Solutionpath, Excelsoft, Jisc and Tribal. Throughout the day, I had immense problems in getting people to stop talking so we could get through the agenda – a sign, I guess, that we had some particularly enthusiastic presenters, and a lot of networking going on. After dragging people back from their lunchtime conversations, we heard from Brian Green and Ainsley Hainey on their institutional learning analytics project at Strathclyde.

A video recording of the initial afternoon sessions is available.

The University already had a fairly sophisticated use of institutional data, but found engagement with SoLAR’s LAK16 conference in Edinburgh and with Jisc’s readiness assessment to be helpful external influences. There was also good senior management buy-in, which Brian considered critical in securing support and funding for the learning analytics pilot projects – the key recommendation from the readiness assessment carried out by Blackboard.

Ainsley described the institutional steering group for learning analytics and the various pilot projects at Strathclyde. One of these combined VLE usage and attendance data, and helped the academic involved to keep her students on track, particularly some of the foreign students who needed additional support. Another project used library data, providing useful information to the lecturer, with weekly intervention emails sent out to students. In a further pilot, learners were given data on their learning: while there were no differences in attainment on the module from previous cohorts, the students felt it helped to keep them on track and made them feel better supported. The final project involved trialling Jisc’s Study Goal app.

Future work aims to provide highly personalised interventions for students, the timing of which seems to be key. The recording of this session is well worth viewing for any institution starting out on learning analytics.

Next we had Tony Sceales, an entrepreneur currently advising Jisc on business aspects of the learning analytics project [PPT 7.83MB – I removed the massive video clips but you can see them in the recording]. One of the things Tony has been looking at is how institutions can make the business case for a learning analytics project. He outlined various measurable benefits of learning analytics such as improving retention, attainment and operational efficiency. He has developed a tool for institutions to quantify these benefits, and Jisc is happy to work with individual institutions to help them make the case to the relevant committees and budget holders.
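To give a flavour of what quantifying such benefits involves, here is a hypothetical back-of-envelope retention calculation. The formula and every figure in it are illustrative assumptions for this post, not Tony’s actual tool or real institutional data:

```python
# Hypothetical sketch: extra fee income from students retained by a
# learning analytics intervention. All parameters are assumptions.
def retention_benefit(cohort_size, dropout_rate, reduction, annual_fee):
    """Estimate annual fee income saved by reducing dropout.

    cohort_size   -- number of students in the cohort
    dropout_rate  -- baseline fraction of students who drop out
    reduction     -- relative reduction in dropout from the intervention
    annual_fee    -- fee income per retained student per year
    """
    students_lost = cohort_size * dropout_rate       # baseline dropouts
    students_saved = students_lost * reduction       # dropouts avoided
    return students_saved * annual_fee

# e.g. 2,000 students, 8% dropout, a 10% relative reduction, £9,250 fees
benefit = retention_benefit(2000, 0.08, 0.10, 9250)
print(f"Estimated annual benefit: £{benefit:,.0f}")
```

Even a rough model like this gives budget holders a concrete figure to weigh against the cost of the project, which is the essence of the business case.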

Tony also showed a way of evaluating the different approaches to developing a learning analytics solution:


Finally we had a series of presentations from some of the vendors present on where learning analytics is heading next, followed by a panel session. There were some great ideas from Richard Burrows of Blackboard [PPT 8.85MB], Richard Gascoigne of DTP SolutionPath [PPT 2.77MB], Adam Cooper of Tribal [PPT 1.61MB] and Gabriel Englehard of Excelsoft. These are all in the recording.


The panel session itself involved a lively discussion with the audience and the vendors, and has a separate recording. Tune in to that for the panel’s response to questions such as “Is there a role for humans in education in the future?”

Our next meeting is likely to be in September. As with this and our most recent network events it’ll probably be over-subscribed so sign up early if you’re interested. Stay tuned to this blog for details or better still subscribe to our Jiscmail list.



Digital professionals in education: an update

Originally posted on Jisc digital capability codesign challenge blog.

As part of the framework project in 2015 I was lucky enough to have conversations with key people in many of the professional organisations in UK HE and FE. It was great to find so much interest in the six elements model and so many different ideas for embedding it into practice.

In addition to updated digital profiles for teachers in different sectors, we have worked with the HEA to produce a new digital lens on the UK PSF. This will help teaching staff build evidence of their digital capabilities while meeting the requirements of HEA-accredited courses or individual applications for HEA fellowship. It can also support the embedding of digital activities into accredited courses (we will soon be releasing other curriculum resources too). This builds on an earlier digital lens, coordinated by SEDA, which included links to content (Jisc/third party) and to a number of useful case studies from professional bodies (not updated for this version).

Another piece of work I carried out with the SEDA executive last year mapped the SEDA values to the digital domain, asking whether the digital revolution in education throws up new ethical questions, and whether SEDA’s existing values base is fit for purpose. The answer seems to be ‘yes, but…’: educators must remain aware of the new contexts in which they are being asked to practise. The same document offers 13 scenarios for discussion – for example by applicants for SEDA senior and principal fellowship.

An employability lens on the six elements was produced in consultation with the Association of Graduate Career Advisory Services (the professional body for careers and employability staff) and the Centre for Recording Achievement. James Clay presented this at the AGCAS conference in September. Careers and employability advisers are often at the forefront of helping students to develop social media skills and a positive digital profile. They are also key to managing several key performance metrics, particularly for universities, and so have developed data literacies that are important to the organisation overall. Look out for an updated version of this lens very shortly as part of a release of resources for curriculum teams.

In the last few months I’ve done new work with professional bodies, particularly on the digital capability profiles. In the next few posts we’ll hear from Maren Deepwell of the Association for Learning Technologies about developing the new Learning technology professional mapping, from Jane Secker of CILIP about the revised Library and information professional profile, and from Gillian Fielding of UCISA. There is also new work in the pipeline with health educators, researchers (via the parallel Jisc challenge on Research Skills), with professional staff in different roles, and with teaching staff in FE, skills and apprentice learning.

Follow #digitalcapability for more news, and look for outcomes of the Digital Capabilities Community Launch Event next week.