Presentations – from Yawn to Yay 🙌

Originally posted on Inspiring learning.

Have you ever wondered if your audience is listening or taking it in when you are delivering a presentation?

Why does your audience seem to glaze over, even though you tried to put every useful bit of information on your slides?

Have you attended a really interesting presentation and thought ‘I wish I could do that’?

You can make your presentations engaging and interactive by using a few free tools and simple techniques. At the same time you can help audience members who find it difficult to concentrate, or who don’t have the confidence to speak out in front of others.

Take a standard Prezi, Keynote, Google Slides or PowerPoint slide deck and drop in a couple of the following activities for a memorable and energising presentation that colleagues and audiences will be talking about for the rest of the day, and longer.


Image: Barney Moss – Flickr (Creative Commons 2.0)

Getting Started

All of the tools mentioned below are free or have a free basic version (freemium). Visit YouTube and search for tutorials if you would like to have a go and set one up.

Make sure your participants have a mobile device – a smart phone or tablet (they can share if necessary) – and that they have good internet access for maximum engagement.

All these tools can be used in a browser – no apps necessary.

Participants don’t need accounts for these tools – just enter a password or PIN and go.

Not all tools are accessible for everyone, so find out about your audience beforehand.

Practise with friends or colleagues first.


A poll can get a presentation off to a great start. How does your audience feel about your topic? What do they already know? At the end of the presentation you can gauge any changes in thinking, or get some feedback about your content.

Try Mentimeter for a colourful, visually engaging poll which displays results in real time as various charts or word clouds. In the free version you can ask two questions per presentation.

Another freemium tool, Poll Everywhere is a popular alternative, and also has a range of question types including multiple choice, ranking and word clouds.


Everyone loves a quiz, and you can energise the room with a well-timed snappy quiz which addresses some of your key themes. You can find out what people know, or what they have learned from your session.

Kahoot! is a firm favourite and brings a bit of fun and lively competition to the room. Participants can win points by answering quickly and correctly. It’s visually appealing and simple to set up and use.

Another favourite is Quizizz which is also easy to set up, with a sharable bank of ready-made quizzes and amusing memes in the feedback.

For a more sedate experience try Socrative which can be set to the pace of the presenter or the audience. Like the others, it offers a range of question types and instant feedback.

All these quiz tools generate an Excel spreadsheet with the results for you.


Audiences appreciate being able to participate in a presentation, whether by asking questions, posting comments and ideas or sharing links.

TodaysMeet is a simple and easy way to do this and is always popular. Participants can post up to 140 characters per message, so they have to be succinct. All they need is a link to take part, but you can set up your TodaysMeet with a password if you need more security.


Example of participants sharing ideas in TodaysMeet 2017

Lino and Padlet are both excellent tools if you want to get your audience collaborating and collating. The virtual sticky notes allow people to post text, links, images and videos. These can be organised into categories or left in freeform.

Set up a Tricider if you would like participants to address the pros and cons of a particular topic. They can add suggestions and comments, agree or disagree, vote on others’ ideas and make decisions.

Presentation tools

You might want your audience to follow along with your presentation in real time on their mobile device.

Zeetings allows you to create a slide deck or import an existing PowerPoint presentation. You can drop interactions in – a poll or a survey, links to websites, quizzes or activities, or videos. Participants can post messages to the group in a chat pane, and they have a personal notebook for the session. You can create five Zeetings in the free version. Nearpod, another freemium tool, has similar functions.

Slide design

Presentation slides are not engaging if they contain everything you want to say. Try using images instead of words. Scott Hibberson’s recent blog post gives some great advice about where to find free high quality images. Try not using a slide deck at all, and just include activities in your session. If you need a slide deck – make only one point per slide. Better to have lots of short interesting background slides than a few crammed, distracting or unreadable ones. You want the audience to pay attention to you!

One point per slide

Free image adapted from

At the same time – let the audience have your content in another format. You could put the script into notes and share it beforehand or afterwards. You could write a blog or make an eBook, by adapting the slide deck or making a PDF of your content. You could create a short video of the presentation using built-in tools like PowerPoint narration or free screencasting tools, or create an Adobe Spark, Office Mix or Sway, for those who missed it or who want a recap. This could include a script and subtitles, for good access to your content during and after the event.

Go for it!

Using just one or two of these tools and techniques in your presentation will make a big difference to your audience, making sure everyone is engaged and no one is excluded. After all, you are doing it for them.

The post Presentations – from Yawn to Yay 🙌 appeared first on Inspiring learning.

Seven sites for sourcing free-to-use images

Originally posted on Inspiring learning.

Finding quality images that are also free-to-use for that all important presentation can be a time-consuming process, so here are my top seven sites for saving time on finding that perfect image.

(Image by Ryan McGuire, freely available on Gratisography)


  1. Gratisography: I was first put on to this site via a Twitter chat and it’s great for shots that are unusual and quirky! All of the images on the site are taken by Ryan McGuire under a Creative Commons 0 licence and are all high resolution photographs (so you don’t get images that pixelate easily like you do from some other sites). The main drawback is the number of images on there – a basic search returns some good images, but if you’re looking for something fairly niche you may struggle.
  2. Unsplash: Also a site where all the photos can be used for free (for commercial or non-commercial purposes). You don’t even need to ask permission or attribute the photographer, although it’s good practice to do so. If Chrome happens to be your browser of choice there’s also a handy extension you can add to get quicker access to the site. To get all the latest news and images from the site you can also follow them on Twitter and other social media channels.
  3. Pixabay: My colleague, Esther Barrett, absolutely loves this site and it’s easy to see why. With over 1,090,000 free stock photos there’s plenty to explore and the search options are relatively sophisticated (the site uses Boolean logic and there are options to search specifically for photos, vector graphics, illustrations or videos; by orientation – vertical or horizontal; by pixel size; and whether you want colour or black and white). Again, all images are available under a Creative Commons 0 licence.
  4. Flickr Creative Commons: No list of image searching sites would be complete without Flickr. Everyone loves Flickr – don’t they? With over 1,600,000 public domain images it’s easy to see why Flickr is often the first stop for researchers and bloggers looking for that all important image. Apps are also available on iOS and Android which is perfect for people on the go.
  5. Haiku Deck: Granted, this is not technically an image search site per se, but more of a site for creating your own presentations. Haiku deck essentially follows the mantra of small amounts of text with large amounts of images with the presenter filling the gaps with the narration. So, why have I included it in this list? Well, Haiku Deck does a great job in linking seamlessly with Flickr to ensure all images embedded in the presentations are under a Creative Commons licence. For more tips on this theme see Esther’s earlier post on interactive presentations.
  6. Xpert Attribution: This site also allows you to get an auto-generated attribution embedded in the image and allows people to upload their own content for CC use – a practice that’s good to encourage.
  7. Creative Commons Search: Finally, there is of course the search on the Creative Commons site itself, which draws in free-to-use images from a range of other sites (some of which I’ve already listed above!).

Not an exhaustive list, by any means, so if you have come across any other good ones that you’d recommend feel free to add them to the Comments section below 😊

The post Seven sites for sourcing free-to-use images appeared first on Inspiring learning.

Survivorship Bias

Originally posted on e-Learning Stuff.

In my recent blog post I reflected on the wealth of news articles about highly successful people who failed their A Levels, or how everyone can be a millionaire. I was reminded of this great XKCD cartoon.

They say you can't argue with results, but what kind of defeatist attitude is that? If you stick with it, you can argue with ANYTHING.

Every inspirational speech by someone successful should have to start with a disclaimer about survivorship bias.

Survivorship bias or survival bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways.

These stories are designed to bring hope to some people, but I also feel they send a stark message to others: that they don’t need to worry about working hard for exams, because regardless of the result, they will become millionaires!

News outlets, at this time of year, never tell the stories of those who failed their A Levels and never achieve financial success – the majority of those students who failed to make the grade. Many of these will, though, have successful and happy lives. They also never tell the stories of those who did succeed and went on to happiness and financial success.

So what’s your failure story?

Trends Unpacked: Organizational Challenges and Learning Analytics (Part 4)

Originally posted on Effective Learning Analytics.

Lindsay Pineda

Patrick Lynch

This is a guest post by Lindsay Pineda and Patrick Lynch. Their bios are at the end of the article.

What organizational challenges have you identified at your institution in regards to implementing learning analytics? Do you wonder if they are similar to those of other institutions? Do you worry that you are alone in facing those challenges? We are here to assure you that, not only are you not alone, but that those challenges can be overcome.

In March 2017, the first installment of the “Trends Unpacked” series was posted. The article focused on the first two organizational trends Patrick and I observed while onsite at several institutions in the UK. In today’s article, we will expand upon the next three organizational challenges and trends discussed in the original article, “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns.”  We will also offer potential solutions and recommendations to solve these challenges, based on institutional feedback.

This article will focus on the below aspects of organizational challenges and trends:

  • Organizational infrastructure
  • Policy, process, and practice management
  • Ease of integration within the existing organizational structure

Organizational Infrastructure

Some of the challenges and trends we observed while onsite at various institutions were related to organizational infrastructure:

  • Most institutions we visited did not have the organizational infrastructure to support the implementation and adoption of learning analytics technology.
  • Several institutions did not have formalized organizational structures. Many did not know what other departmental staff did on a daily basis or how their job duties affected each other.
  • Institutions were very concerned about issues of redundancy, additional workload, and time management as it related to staff requirements.

The following examples illustrate the types of organizational infrastructure challenges most often expressed at the institutions:

  • “Is that what your department does? I had no idea.” – Surprisingly enough, this was a common statement made when larger groups were gathered to discuss organizational infrastructure and institutional operating procedures. At several institutions, this was the first time they had met as a group to learn how each department/ role functioned and how this would likely affect a larger-scale implementation such as learning analytics. One institution told us, “We obviously need better sharing mechanisms to ensure we are working holistically toward the students’ success.”
  • “Development of a collective vision is vital.” – The notion that departments and individuals were often “working on their own thing without an understanding of how it impacts another department or individual” was a collective concern. At one institution we were advised, “Senior leadership needs to be part of this collective vision. Without their support, it will certainly fail.” We agree with this statement.

Institutions shared with us their ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Create a shared vision – “Creating a shared vision also helps develop a strong institutional rationale for doing a project like learning analytics. It will also likely help with other areas such as resourcing, budget planning, and task allocation.” This quote was taken from an institution that understood the importance of developing a collective/ shared vision of where the institution wanted to go and creating a plan to get there. Working with various departments, and representatives from multiple departments, will be necessary when developing the collective/ shared vision. As one institution told us, “It cannot be successfully done in a vacuum.” Sharing the vision and understanding the roles each department, or individual, plays in achieving success reduces duplication of effort. In addition, it enhances the feeling of learning analytics as a collaborative institutional endeavor, thus increasing the chances of a successful adoption.
  • Sort out the fundamentals – Sorting out, upfront, what an institution really wants to gain from a larger-scale initiative such as learning analytics is very important to help continue the journey toward a successful implementation. As one institution put it, “We need to know, what do we have and what don’t we have? I would think we’d need to know this upfront before we even start.” This statement can apply to which resources are available, what financial allocations are necessary to set aside, and what additional tasks might be required from staff. Having a clear picture of the fundamentals helps to further define the details as the initiative progresses.
  • Create an implementation plan – This is likely something that will need to be done with the support of senior leadership, along with the staff responsible for executing the tasks required. One institution advised us, “Senior leadership has a bigger picture view of the institution so their input will be necessary when developing any type of project or implementation plan.” This is absolutely true. Within the implementation plan, it is necessary for senior leadership to set the tone for the overall initiative. Aspects such as the value and importance of using data to drive decision making; creating a shared significance for the goals of the initiative; and establishing the importance of continuous, transparent communication within the institution are a necessity.

Policy, Process, and Practice Management

Policy and practice management is another challenge we observed while visiting institutions.

  • Overall, the institutions we visited did have policies, processes, and practices in place that were currently being managed. However, there were discrepancies evident in most institutions regarding how the staff and leadership perceived the oversight along with the implementation of those policies, processes, and practices.
  • Several institutions were concerned about issues including micromanagement; current inconsistencies; the execution and accountability of those policies, processes, and practices; and whether learning analytics would add to their already overloaded requirements.
  • Several institutions identified a gap between policies set by the institution, different processes adopted across the institution to fulfill those policies, and individual practice, on occasion, being completely at odds with both.

The following examples illustrate the types of policy, process, and practice management challenges most often expressed:

  • Standardization is necessary, but not always welcome – One institution we visited stated that they “Would like to have some standardized practices and guidelines across all departments while still keeping some amount of individuality. We don’t have that now.” Standardization can be difficult for larger institutions as they have many individuals on staff. Most institutions did not want to have standardized policies without considering the importance and value of individuality when implementing those policies and practices. An individual at one institution advised, “We don’t want to become a machine that doesn’t teach our staff how to think critically. Part of the joy of working within an institution of higher learning is the importance of thinking. That cannot be removed.”
  • “No one is monitoring the institution’s policies, processes, and practices” – We found that this concern was shared across several institutions. As one institution put it, “We have lots and lots of policies and we do our best to enforce them, but when they aren’t done properly, no one is held accountable or taught how to [enforce them].” If no one is monitoring or upholding the policies, processes, and practices at an institution, how do you know if they are being implemented properly? The answer is: You do not. One institution brought to light a comical, but quite important question, “Sometimes I feel as though I’m aboard the Titanic. Is anyone manning this ship? Or are we all slowly and quietly going down with it?” This is an essential insight to start considering, and to ask, does my staff feel this way as well?

Institutions shared with us some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Create a communication strategy – Part of creating a shared vision, collective path forward, and common set of goals is to create a strategy based on information and communication. This type of plan needs to involve staff and their input in such areas as training plans, communication dissemination strategies, and task responsibility. Once the strategy is developed, it then needs to be upheld and accountability needs to be established. What will happen if someone does not implement a policy, process, or practice in the agreed upon way? How will they be further trained to avoid the same situation again? As one institution told us, “If we aren’t sure if our staff is providing accurate advisement to students, how can we know where they need more training to improve?” Being aware of what your staff knows, does not know, and is or is not doing well is essential to helping define a communication strategy. It is also fundamental in providing proper training for support and holding individuals accountable.
  • Empower staff – In balance to the approach expressed above, another recommendation was to empower staff. This concept was summed up by an individual who said, “Information given to staff isn’t just about improving their students, but also [about] helping to further [develop] themselves”. Having staff that feel they are empowered to make decisions and have input in what the institutional strategies are is a formidable motivator. Involving them in important activities such as the development of policies, processes, and practices, along with strategies regarding implementation and accountability, are ways that staff reported feeling empowered. Checking in with the staff regularly to collect their thoughts and concerns also helps drive home how essential their input is to the overall mission.
  • Develop a centralized policies, processes and practices repository – We have discussed creating a centralized repository for data, such as a data warehouse or learning records warehouse. This works well for the technical side of things, but what about for the organizational side? One institution advised us, “Having a centralized place for all of the institution’s policies, processes, and practices would be very helpful. One that is updated regularly and communicated often would be extremely beneficial for staff to understand what they are required to do and where they can go for a central point of reference.”

Ease of Integration within the Existing Organizational Structure

The ease of integrating technology like learning analytics within the existing organizational structure is a challenge at the forefront for many of the institutions we visited.

  • We heard many concerns expressed about how learning analytics would actually work in practice. All the institutions visited liked the theory and the idea behind it, but were uncertain about what it would mean to actually put it into practice.
  • Most of the current organizational structures at the institutions did not appear to support a learning analytics initiative in present form.

The following examples illustrate the types of integration challenges most often expressed:

  • “Who is responsible for what?” – Most institutions have many roles within them and those roles may or may not easily accommodate the implementation of learning analytics. For example, many institutions were divided about their use of personal tutors and instructors/ lecturers. Others did not have the personal tutoring role at all. Determining who would be responsible for implementation and interventions was a topic of much debate. Many questions arose, including, which department “owns” interventions and the responsibility for delivering them to students? Should there be a new tutor/ advisor policy to incorporate the changes in the requirements of the role?  How will things be enforced? These are all very valid questions to ask and consider.
  • “We already know who is at risk. Why do we need a formal process or solution to tell us what we already know?” – This is a sentiment we received from more than one institution. Many of the institutions visited expressed concern about how another process or technology solution was going to be implemented when tutors/ advisors already know who is at risk and who is struggling. One institution advised, “This seems redundant. We have the experience to know who is having trouble. Why do we need a piece of software to tell us? Shouldn’t that be part of our jobs already?”

Institutions shared with us some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Setting clear expectations – Clearly outlining and setting expectations early on for anyone who will be responsible for the interventions related to learning analytics becomes very important in many ways. Receiving full participation and buy-in from staff is an essential element of a successful implementation. Staff knowing what their responsibilities are and what accountability looks like also helps increase empowerment and ownership. An institution expressed to us, “It seems we will need to revisit our tutoring expectations to ensure our staff is provided with the right kinds of guidance and training to help them better assist their students. After all, the goal is to provide a better student experience. How can we do that if our staff isn’t provided the same opportunity?”
  • “Proactive, not reactive” – All institutions expressed the desire to be “proactive, not reactive” when it came to students. One institution told us that a learning analytics initiative would “Share what’s already in our heads with others and will allow us to provide more targeted support to students along with creating a better use of our time.” Learning analytics is not just “another piece of software,” nor is it intended to replace human thought and action; it is meant to support what tutors and advisors already know and do. Students look to their tutors and advisors to help guide them on their journeys, to provide them with well-informed support, and to have necessary information to help with decision-making. Another institution advised us that “Learning analytics seems like a tool we as tutors are given to help students build their houses. We aren’t doing it for them, but with more information and the right tools, we can help them build a stable house to grow in.” This is an excellent way of illustrating the purpose and goal of learning analytics information.
  • Standardize the level of support – Patrick and I met with student groups while onsite to gain their perspectives on how learning analytics might look or benefit them. Quite often, students reported that the quality of support they received varied between staff. One student told us, “When I call in to have a chat with a personal tutor, I expect that they would all respond the same way to a question, but they don’t. I will often get two or three different answers to the same question. And when I call back, it appears that no one knew I had already spoken with someone else. It seems to me like they should all know the same information about me and the institution in order to help guide me.” Creating a standard as it relates to areas such as advisement, support, and guidance would greatly benefit multiple institutions in the quality of the overall support they are providing to their students.

While organizational challenges within institutions are prevalent, they are still manageable and should not stand in the way of maximizing the benefits of learning analytics. There is an underlying desire to create consistency in advisement and support, along with a need to standardize policies, procedures, and practices. Institutions seek a proactive culture in order to ensure an equivalence of experience for all involved. Constant and consistent communication regarding learning analytics initiatives is the most important aspect among institutions. To help with this effort we recommend institutions involve stakeholders from all relevant departments. The collaboration resulting from this involvement is paramount to the overall initiative.

Please be on the lookout for our next article in September, which will offer real, practical activities that can be used at your institution to achieve the suggestions and recommendations outlined above.

Here’s to continuous growth and improvement!

You too can be a millionaire!

Originally posted on e-Learning Stuff.

millionaires shortbread

Yesterday was A Level results day; over 800,000 students received a letter explaining the outcome of two years of studying. For some it will be an amazing result and they will progress on to the next stage of their lives. For others there will be disappointment, and uncertainty.

For another 800,000 young people, September will see the start of their A Level journey whether that be at Sixth Form or at an FE College. I wish them luck and hope they work hard to achieve the success they desire.

One thing that they do need to realise is that despite the BBC News publishing stories like this one, The A-level failure who became a multi-millionaire, you do need to study and work at your A Levels.

The day Giles Fuchs learned he had failed his A-levels, his family gathered around the dining table for dinner as normal.

His father didn’t say a word during the meal, waiting until the plates had been cleared to turn to his son and say: “Giles, I hope you’re good with your hands.”

Hoping to prove his dad wrong despite the dismal results, the next day Mr Fuchs knocked on the door of the biggest estate agent chain in Northamptonshire to ask for a job.

Today a multi-millionaire 52-year-old, and co-founder and boss of UK serviced office business Office Space In Town (OSIT), Mr Fuchs says that the three years he spent working for that estate agency in the East Midlands gave him an invaluable grounding.

I do find that news outlets, like BBC News, often publish these stories, and I am sure they are all published with good intentions of giving “hope” to those learners whose A Level results weren’t as good as they hoped.

I think they also have a negative aspect to them too, which is the impact it has on learners who have yet to start their A Levels (or even their GCSEs). The message appears to be don’t worry about studying, even if you fail to get the results, you will still be a millionaire!

Lots of successful people, such as Richard Branson and Jeremy Clarkson, messed up their exams but still found success and became millionaires!

Looking back you can see stories across the news media on how it’s okay to fail, but you can still be a millionaire! Here is a list of just 15 people who succeeded despite exam failure.

News outlets, at this time of year, never tell the stories of those who failed their A Levels and never achieve financial success – the majority of those students who failed to make the grade. Many of these will have successful and happy lives.

They also never tell the stories of those who did succeed and went on to happiness and financial success.

Many people for whom GCSEs and A Levels were not the route to academic success may find it later through Access courses and university, or by studying with the Open University. Apprenticeships offer another route to success.

We can all dream of being millionaires, but the reality is that most of us won’t be. Only 1% of the UK population are millionaires, and a third of those live in London!

So do you want to be a millionaire?

Let’s not give up hope, but let’s celebrate success, celebrate hard work and effort. Let’s give a realistic hope to those who weren’t successful, show them alternative routes to academic success, or vocational routes into employment.

Let’s go digital – an extra slice

Originally posted on Inspiring learning.

One of the main topics of conversation with our members over the last few months has been how college library and learning resources services can best support the increasingly digital needs of students and staff. You could say that’s hardly surprising, given the importance of digital skills for student transition into employability or higher education. What’s more, restructures and mergers can prompt managers to question what the role of library and learning resources services should be in the college’s digital future. Clearly they need to keep evolving, but how?

With these questions in mind, I was delighted to be invited by CoLRiC (Council for Learning Resources in Colleges) to one of their summer events. Scott Hibberson reported on the first event in London in this post, and a few weeks later I was on the train from Swansea to Huddersfield for the second event at Kirklees College.

Once again the programme kicked off with a Jisc session on digital capability before going on to feature some of this year’s CoLRiC best practice award winners, topped and tailed by two sessions from Salford City College. Rather than cover everything, this post will focus on my main takeaways and also look in detail at how we used the LearningWheel approach to collect ideas on how digital tools and resources are being used by college libraries to support teaching and learning. For a full digest of the day with even more tips and photos, pop over to my Storify.

How are learning resources services evolving?

When I first saw the event title ‘Let’s go digital!’ I must admit my initial reaction was to question it: surely libraries have been digital for a long time? After all, back in 2000 when most UK FE colleges first got connected to the Janet network, librarians were among the first to embrace the potential of the web for collaboration and to build new services. However, I think it’s fair to say that this pioneering zeal didn’t always grab headlines, perhaps because so much of it was local and incremental. Since then libraries have continued to experiment on many fronts with digital technologies and have built world-leading services on the back of those early projects. Now we seem to be at a point where digital capability, in one form or another, is part of everything a library does. With digital provision surely high on the agenda of every successful college, there’s never been a better time for libraries and learning resources services to demonstrate the value they bring to the digital student experience.

Best practice

Scott’s earlier blog post touches on several of the CoLRiC best practice award winners from both the London and Kirklees events, and you can find a full list of award winners and commended entries here. The list gives a sense of the sheer breadth of college library activity across the UK, ranging from teaching digital skills on the front line to the behind-the-scenes integration of library services with other core systems (e.g. through single sign-on to resources or embedding in the VLE). Clearly not all excellence has a primary focus on digital, but even with “non-digital” activity, such as reader development, it’s a safe bet that digital capability is still needed to manage the process and achieve maximum impact.

One of the colleges our team have been working with recently is RNN Group in Rotherham, so it was great to see their liaison librarian Rachel Stone pick up first prize for their “exceptional and comprehensive information skills scheme of work, clearly demonstrating impact with good evidence of a flexible and innovative approach.” I was particularly keen to note the focus on skills development for HE learners as this can be a group with some distinct needs in colleges. HE-FE collaboration was further demonstrated in a cross-sector presentation from Anne-Lise Harding of South Essex College and Kate Grigsby of Sheffield University.

Well done to all prizewinners and commended entries! I will certainly be encouraging other colleges to tap into this wealth of experience in the CoLRiC community.

Developing digital capability

I ran a session updating the group on Jisc’s building digital capability project and it was good to see that many people were already aware of some of its work and had even started to put its resources to use. I focussed particularly on the role profile for library and information professionals, which I’ve been finding really useful as a starting point, not just for thinking about staff skills but also as a way of rethinking a vision for future library services and their role in the organisation. The role profile can open our eyes to the ways library and learning resources teams support digital learning, whether on the front line, through provision of specialist tools and content, or in support of college-wide innovation.

We started with the six elements of digital capability as shown in this diagram:

Digital capability framework

Digital capability framework @Jisc and Helen Beetham CC BY-NC-ND

Then, working in groups, we used Mentimeter polls to discuss three key questions based on those six elements. These are the responses:

Mentimeter poll 1

Mentimeter poll 2

Interestingly, these results varied somewhat from the responses to similar questions we asked at the London event! But even so, it is clear that in all six areas, librarians are active and keen to know more.

I also asked for ideas on future requirements and had some interesting responses:

Mentimeter poll 3

These will help to shape our support from Jisc in the coming year – many thanks to all participants for their lively contributions!


We had two sessions from Salford City College, by Deborah Kellsey Millar and Andy Eachus, exemplifying what can happen when library/learning resources experts and e-learning enthusiasts join forces. Deborah passed on some lessons from her experience of developing a strategic approach for digital learning, based on the LearningWheel model of digital pedagogy.

I’d come across the LearningWheel concept a few years ago but this was my first chance to engage with it closely in a group. On one level it’s a pedagogical model to support teaching and learning to make the most effective use of digital possibilities, working across four ‘modes of engagement’ as shown in this image:

LearningWheel model

But on another level, the LearningWheel is a graphic way of presenting digital tools and resources to meet specific teaching needs. These tools and resources are crowdsourced and curated by practitioners and you can see examples on the LearningWheel site under the ‘Collections’ tab.

This image gives a glimpse of a LearningWheel for teacher training, focussed on tools for collaboration:

LearningWheel example

At the centre of the “wheel” is the learner, and radiating out from that are the ‘spokes’, each made up of a digital tool/resource you could use along with an idea for how to use it. So, for example, you could have a wheel for English, hairdressing or drama; other subject examples include maths and sport. LearningWheels can also be centred on a specific technology (eg Twitter) with the spokes indicating the many ways you could use it. Yet another type of LearningWheel is one created for a conference, allowing members of a community of practice to share the resources and tools they advocate.

A key feature of the LearningWheel approach is the principle that a digital tool/resource/activity supports one or more of four key ‘modes of engagement’ i.e. Collaboration, Learning Content, Communication and Assessment. This makes a lot of sense for libraries as everything they do fits – directly or indirectly – into at least one of those engagement categories. And it means a library resource – maybe a service like Hairdressing Training or library screencasts on how to find copyright-cleared images – can be added to the mix.

How do you create a LearningWheel?

In a nutshell:

  • Someone sets up a LearningWheel to meet an identified need, using a standard Google spreadsheet template available from the LearningWheel site (under the ‘About’ tab). At the CoLRiC event Deborah set up a LearningWheel spreadsheet so the group could begin to build up a list of digital tools and resources they use to support teaching and learning. You can view it here.
  • The wheel has a designated ‘Captain’: this means there is someone taking responsibility for its curation. However, it’s important to stress a LearningWheel is very much a collaboration and is open to all to participate.
  • The LearningWheel is put out for contributions as widely as possible. As you might expect, social media plays a big part: have a look at my Storify of the day to see how it was rapidly shared on Twitter.
  • We’ve already got some familiar tools listed (like Camtasia and QR codes) but also some newer ones like ExplainEverything and Piktochart. You’re bound to find something new to try and we’d love to build on this collection further.
  • Eventually, when we have got all the spokes we need, the spreadsheet can be converted into a colourful graphic. I can see it having a potential place e.g. in staff induction, staff development, service planning and awareness raising with stakeholders to show what libraries do.
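If it helps to picture the structure, the spreadsheet-to-wheel workflow can be sketched in code. The following is a purely illustrative Python model: the tools and usage ideas below are examples of the kind mentioned in this post, not entries from the actual CoLRiC spreadsheet. Each ‘spoke’ is a tool plus an idea for using it, grouped under one of the four modes of engagement:

```python
from collections import defaultdict

# Hypothetical spokes, mirroring the columns of a LearningWheel spreadsheet:
# (tool/resource, mode of engagement, idea for how to use it).
spokes = [
    ("Camtasia", "Learning Content", "Record screencasts on finding copyright-cleared images"),
    ("QR codes", "Communication", "Link print posters to online library guides"),
    ("Piktochart", "Learning Content", "Create infographics summarising referencing rules"),
    ("ExplainEverything", "Collaboration", "Co-annotate a shared whiteboard during inductions"),
    ("Mentimeter", "Assessment", "Poll learners before and after a skills session"),
]

def build_wheel(spokes):
    """Group spokes under the four LearningWheel modes of engagement."""
    wheel = defaultdict(list)
    for tool, mode, idea in spokes:
        wheel[mode].append((tool, idea))
    return dict(wheel)

wheel = build_wheel(spokes)
for mode in ("Collaboration", "Learning Content", "Communication", "Assessment"):
    for tool, idea in wheel.get(mode, []):
        print(f"{mode}: {tool} – {idea}")
```

The real process, of course, happens collaboratively in the shared Google spreadsheet; this just shows how naturally the data falls into the four engagement categories.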

What next for our LearningWheel?

We got off to a great start but the CoLRiC LearningWheel could do with a few more spokes! If you work in a college, or work with FE libraries/learning resources, you can help. Here’s how:

  • visit the Google spreadsheet here
  • Choose the most relevant mode of engagement: Learning Content, Collaboration, Communication or Assessment (it might fit more than one, just pick the one you feel comes nearest!)
  • Add your resource and give an example of how you’ve used it
  • Remember the focus is on tools and resources for colleges (any level is fine, including sixth form and HE-in-FE).

I like to think the spreadsheet can remain open for some time until we have a good set of “spokes”. Scott and I, as joint ‘Captains’, hope to work with Deborah and the team at Salford City College to complete the curation process and let everyone know when it’s finished. We might even feature some of the tools on this blog over the coming months if we get enough interest.

Thank you!

A big thank you to CoLRiC for welcoming Jisc to their events and for the work to encourage excellence in college library services. 2018 will be CoLRiC’s 25th birthday year and from what we saw this summer, there will be plenty more achievement – including digital – to celebrate.

Thank you also to our hosts at Kirklees College! We had a guided tour of the Huddersfield site with the Head of Learning Centres, David Scott; it is one of seven award-winning Learning Resource Centres in the college. I got a clear sense that it is a college where learning resources sit very much at the heart of things, in terms of both digital and face-to-face (f2f) learning.

Looking ahead

This summer, as we pause for breath and get ready for the coming year, I hope everyone reading this will take a moment to consider what makes your learning resources service or library stand out. How is it inspiring learning in a digital world? What new skills do staff need and what else needs to change? Whether you’re bursting with ideas or struggling for answers, I hope you’ll want to share them with us in the coming year. Maybe you could even be one of CoLRiC’s award winners next summer?

“Let’s go digital!” was our starting point, and on reflection I think it’s a pretty apt headline: college learning resources staff are showing that they are eager and well-placed to collaborate on creating the best possible digital environment for our students.

The post Let’s go digital – an extra slice appeared first on Inspiring learning.

Résumé vs Network: what do they say about you?

Originally posted on lawrie : converged.

It might be the time of year, or it might be that there are some interesting jobs appearing in the sector, but whatever the reason, my inbox is full of LinkedIn requests, and no small number of requests to write LinkedIn recommendations. In addition, I noticed that some institutions and organisations have reverted to asking for CVs rather than completed job application forms. There was also an interesting article by Joshua Kim on Inside Higher Ed, asking if LinkedIn will ever replace the CV.

I remember when I signed up for LinkedIn, it was described to me as a “Facebook for work” (sic). I don’t particularly like it, and even though I know people who use it very effectively, I have never deeply engaged with it. But the recent activity got me thinking: what does someone’s network tell us about a person, compared with what a traditional CV or job application tells us?

At the most basic level a CV or job application gives the following information:

  • Who you are (name)
  • Education and Qualifications
  • Skills
  • Experience and Work History
  • Sometimes extra information such as voluntary work, hobbies and interests

But the important thing about a CV or a résumé is that it is autobiographical: it is you portraying yourself. It can be as simple as a list of facts and dates, or it can be a narrative, and the narrative can carry as much gloss or spin as the author decides.

I started thinking about this more over the weekend as I was also doing some social media analysis, mostly around Twitter, and mostly around my voluntary work with birds. I started to visualise my own (active) network on Twitter in two ways: first visualising who I am having conversations with, and then the hashtags I was using. August in the education sector is probably not the ideal time to be having that conversation!

The hashtags showed what I cared about – unsurprisingly in my case, and it being August, it pretty much split into four areas:

  • Education, Digital, and Technology
  • Birds / Wildlife / Conservation
  • The Athletics World Championships (I had tickets and a trackside seat)
  • Canals and Narrowboats (I took a two-week break on a boat)
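For anyone curious to try a similar exercise, the first step of this kind of analysis can be done with nothing more than the standard library. This is a minimal, hypothetical sketch (the tweets are invented; a real analysis would pull them from the Twitter API or an archive export, and a proper visualisation tool would draw the graph):

```python
import re
from collections import Counter

# Invented sample tweets standing in for a real Twitter archive.
tweets = [
    "Great session on #digital capability with @colleague1",
    "Spotted a red kite this morning #birds #conservation",
    "@colleague2 @colleague1 thoughts on the new #edtech report?",
    "Trackside at the #athletics 200m final!",
    "Moored up for the night #narrowboat #canals",
]

def network_counts(tweets):
    """Count who I'm talking to (@mentions) and what I care about (#hashtags)."""
    mentions = Counter(m.lower() for t in tweets for m in re.findall(r"@(\w+)", t))
    hashtags = Counter(h.lower() for t in tweets for h in re.findall(r"#(\w+)", t))
    return mentions, hashtags

mentions, hashtags = network_counts(tweets)
print(mentions.most_common(3))
print(hashtags.most_common(3))
```

The mention counts give you the edges of a conversation network; the hashtag counts give you the themes, which is essentially how the four areas above fell out of my own data.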

But who was I having those conversations with?

Twitter network image

When you look at someone’s network, and the interactions within it, is it possible to answer questions such as:

  • Who they are? (beyond the spin of a CV)
  • Who values them?
  • What they have done?
  • What they are doing now?
  • What they care about?
  • Is it possible to get a feel for whether someone has the skill set to do a job by their network activity?

And if it is possible to do that, is it of more or less value than their CV or résumé? And how can we assess or evaluate that network? What should we be looking for?

There is no denying that having certain degrees from certain universities looks good on a CV, but does looking at a person’s network reinforce that privilege? Does it reproduce inequality? Or is it a way of moving beyond those structural advantages?

There is also the influence of negative networks. The recent case of Cole White in the US is an example of where being associated with a group of white supremacists resulted in him losing his job (and just so we are clear here, I am glad).

Tweets about Nazis

If the people hiring him had looked at his network, would they have identified this “character trait”, and if so, would they even have interviewed him? The answer is probably no, but whilst this kind of right-wing network is obviously repugnant, are there networks that would split interviewers? My own network clearly shows, as well as my education and technology interests, that I am very much driven by animal welfare and lean toward the left of politics. None of the networks I participate in are “extremist” or “illegal”, but to some my network will be at odds with their own.

There seem to be more companies and organisations moving toward CVs and résumés rather than application forms, certainly at middle and senior management level; this may be because of the changing nature of the job market. Is it more challenging to recruit? Is this shift a way to attract potential recruits and provide an initial sifting process? Or might LinkedIn be looking at automating some of that sifting?

In 2016 Microsoft bought LinkedIn for $26 billion, giving them access to the data, employment records and education records of around 500 million users. It also gives them access to the network dynamics within LinkedIn: who users are connected to, who they interact with. Could Microsoft develop a tool that uses that data to provide an enhanced CV, one that looks at network interactions? They will certainly be looking for ways to get a return on that $26 billion.

With increased visibility through digital, our networks are saying things about us, about what we believe and what we value. The data of those networks is available to be interrogated and used. What does it say?



Why you should sign up for the 2017-18 Tracker!

Originally posted on Jisc Digital Student.

Last month we published a report from this year’s Student Digital Experience tracker project, representing the voice of 22,000 learners from across the UK. Findings from the project have attracted national and international interest, and we’re starting to see some exciting lessons about what makes a difference to the learning experience. But the main purpose of the tracker has always been to help institutions understand and make a difference to their own students.

sign-up word cloud

Word cloud showing responses to the question ‘what is your main reason for engaging with the Tracker?’

We asked participants in the 2016-17 Tracker project what they wanted from the process, and what they had gained. We found that the main reasons for engaging with the Tracker were (in no special order):

  1. Inform ourselves about the student digital experience
  2. Gather evidence to support specific actions (e.g. in respect of the curriculum, student digital support, and/or digital infrastructure)
  3. Demonstrate that we are engaging with students and responding to their feedback
  4. Evaluate ourselves – against other institutions (benchmark), or over time (monitor), or in relation to our aims for a specific initiative/strategy/project etc (evaluate)
  5. Gain intrinsic value from the project process (e.g. student engagement, other stakeholder engagement, benefits of working with Jisc/other institutions)

When we asked whether the tracker had helped them to meet these goals, 45 institutional leads responded. They told us that the tracker process had been either useful (n=21) or very useful (n=22). All but one was keen to use the tracker again in the future. Of the 45 who reported back, 37 described concrete actions they had taken in response to their tracker findings, from wifi upgrades to new digital learning strategies.


How useful was the tracker process? Click to zoom in.

We have also collected a range of case studies and practical examples to show the Tracker making a difference in a range of different contexts. You can read some of the feedback from participants here, and see more quotes below.

Quotes from 2017 participants – click to read

So rather than waiting to read another report about our national findings, why don’t you get involved and start collecting your own data? As well as being a great way of engaging with students and showing you care about their views, the tracker process connects you with change agents at other institutions. Find out how others are working in partnership with their students to implement the tracker, analyse the data, and act on the findings.

Nearly 95% of users find our guidance valuable, and the whole process is based on a body of research and evidence. Factor analysis has confirmed that the survey is robust, and we have refined the questions still further for 2017-18 to ensure you only ask learners about issues that are clear, relevant, valid, and actionable.

New features for this year will include:

  • overall measures of digital satisfaction
  • better correlation with the NSS for HEIs
  • questions for institutional leads about organisational drivers
  • opportunity to send out individualised links to learners
  • data tracking by anonymised IDs to support merging with other sources of learner data
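On the anonymised-ID feature in that list: the sketch below is my own illustration of the general technique, not a description of the Tracker’s actual implementation. The salt, student IDs and field names are all invented. The idea is that a salted hash gives every dataset the same stable key for a student without exposing the raw identifier:

```python
import hashlib

# Assumption: the salt is a shared institutional secret, managed securely
# (hard-coded here only for the sake of the example).
SALT = b"institution-secret-salt"

def anonymise(student_id: str) -> str:
    """Derive a stable anonymised ID from a student number via salted SHA-256."""
    return hashlib.sha256(SALT + student_id.encode("utf-8")).hexdigest()[:16]

# Two hypothetical datasets keyed by the same anonymised ID.
tracker_rows = {anonymise("S1001"): {"digital_satisfaction": 4}}
vle_rows = {anonymise("S1001"): {"vle_logins": 57}}

# Merge on the anonymised ID: records line up without either dataset
# ever containing the real student number.
merged = {k: {**tracker_rows[k], **vle_rows[k]}
          for k in tracker_rows.keys() & vle_rows.keys()}
print(merged)
```

Because the same salt and hash are applied consistently, the two sources join cleanly while the raw identifier never leaves the institution.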

How can you get involved?
The 2017/18 tracker project will launch in September with the Tracker surveys going live to students any time from October 2017 to April 2018, to suit the needs of institutional users. The project will continue to be supported by Jisc with a full range of updated guidance. BOS is transferring to Jisc in October 2017 so there will be an excellent technical infrastructure and the reassurance of data security throughout the process.

If you’re interested in participating in the Student Digital Experience Tracker for 2017/18 please sign up before the end of September 2017 using the following online form:

You will complete a second form in September when we will ask for more details about your organisation and your plans for the tracker. This is to help you plan the process, engage your stakeholders, ensure a representative sample of students respond to the survey, and set you up for success.

From October you will be able to customise and launch your own survey using our detailed online guidance and resources. You will also be added to the tracker pilots Jiscmail list that allows you to share ideas and experiences with other participating institutions, and that gives you first access to new resources and FAQs.

Learning Analytics Service Agreement – Final

Originally posted on Effective Learning Analytics.

The Jisc Learning Analytics Service Agreement is available to download and we would like to thank everyone who provided feedback and comments. Here is a version showing the changes that have been made to the agreement: Learning Analytics Service Agreement_Consultation-Final_Compared

Existing implementation institutions have already been contacted regarding the process for signing the new agreement. New institutions should contact if you are interested in the service.

The consultation process is now complete.

The table below shows a summary of the feedback and the changes or responses to each suggestion.

Agreement, Section or Clause | Query | Jisc response / significant changes to the documentation
Data Processing Agreement (DPA) – Clauses 5.1.7, 7.7 and 11.2 If we don’t sign the new Service Agreement by 31 July 2017 is my data still securely stored in the learning records warehouse?

Will I have to instruct Jisc to stop processing my Institution’s data?

Will there be a gap between the Data Processing Agreement my institution has previously signed and the new Service Agreement?

Pathfinder institutions will be offered the chance to extend their current DPA to 31 Aug 2017 to allow time for the new Service Agreement to be reviewed and signed off.

This extension to the term of the DPA will ensure that the data provided under the agreement will remain with Jisc and the obligations under it will continue until the Service Agreement is entered into.

Service Agreement How will you ensure compliance with the new EU General Data Protection Regulation (GDPR)? A new Service Agreement has been produced to replace the Data Processing Agreement that pathfinder sites signed with Jisc during the R&D Learning Analytics Project. The new Service Agreement has been updated with clauses to ensure compliance with GDPR.

We have also ensured that our contracts with sub-contractors (eg HT2 Ltd, which provides the Learning Records Warehouse) are GDPR compliant.

Service Agreement Are there issues around obtaining consent to use student sensitive data, such as ethnicity, for learning analytics and for interventions? At this point we’re not processing sensitive personal data in the Jisc learning analytics service.

It’s worth remembering that an Institution, as the Data Controller, retains control over what data it passes into the Learning Records Warehouse (LRW).

We’re currently looking at the best approach to manage consent whether via new functionality in the LRW, or via some other mechanism.

We’re also continuing to gather and share the different approaches institutions are using to gain student consent for learning analytics.

A recent post on the Jisc Effective Learning Analytics blog describes the work on consent that is currently underway at seven pathfinder sites, see

Service Agreement Some of the wording that describes arrangements during the initial term and pilot implementation period could be clearer. Jisc will produce a FAQ to clarify these points.
Service Agreement My institution’s current data processing agreement ends on 31 July 2017. What’s the process and timescale for signing the new Service Agreement? As a pathfinder site, Jisc will email a new Services Agreement to you in the week of 17 July 2017.

An Order Form has been added to the front of the Service Agreement so that pathfinders can easily see the Terms & Conditions and any costs associated with the service.

Pathfinders will need to confirm the components they wish to order for the year ahead. They have the option to continue with the Jisc components they began implementing during the R&D Learning Analytics Project, or they can add new ones if they wish.

Pathfinder sites currently implementing Tribal Student Insight can indicate that they wish to continue to receive this service during AY2017-18. There will be no charge for this service.

Please sign and return a scanned copy of the agreement to

Service Agreement – Order Form Why has the contracting authority been changed to Jisc Services Limited? As learning analytics is now being offered as a service, the contracting authority needs to change to Jisc Services Limited, which is one of the Jisc group companies (see )
Service Agreement – Order Form What’s included in the Jisc service? The key components of the Jisc service are:
– Learning Records Warehouse Core
– Study Goal Core
– Data Explorer Core
– Student Success Plan
– Tribal Student Insight (existing pathfinder sites only)

Additional information about the service will be available on the Jisc website.
Service Agreement – Order Form What will my Institution pay for the Jisc service? Existing pathfinder sites (including those using Tribal Student Insight) will not pay for the service during AY2017-18.

All new customers will be able to pilot a selection of Jisc products for 6 months at no charge.

Charges after the free pilot period and 100% discount period for pathfinder sites are currently being finalised. In the coming months we’ll discuss service charges with each Institution.

Service Agreement – Definitions The agreement defines sensitive personal data but the information about the types of student personal data covered by the agreement needs clarification. Clause 7.2 Data Protection Particulars in the Services Agreement describes at a high level the types of personal data that are processed under the agreement.

Sensitive personal data is defined under GDPR (see ).

The GDPR equivalent of sensitive personal data is ‘special categories of personal data’ which includes data about a person’s racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data, health data, their sex life or sexual orientation.

Sensitive Personal data will not be processed in the Learning Analytics service.

Service Agreement – Definitions The agreement refers to information on the Jisc website. Our legal team find this rather vague. Will it be collected together into one place? Ultimately our aim is to have one area of the Jisc website that contains all of the information you’ll need about the Learning Analytics service.

Learning Analytics is a new service and we don’t have all the internal processes in place within Jisc to achieve this at the moment, but we are working towards it now.

Service Agreement – Clause 2 Could auto-renewal after the initial 12 months be changed to a rolling 30 day agreement with 3 months notice? Given that we received only one request about changing the renewal terms, we decided to retain the original wording.
Service Agreement – Order Form and Clause 2 What is my financial commitment during the Initial Term and what termination options do I have? The Initial Term of the Service Agreement covers one year (1 August 2017 to 31 July 2018).

Within the Initial Term there are two points at which a customer can terminate their service: the first is at any point during the 6-month Pilot Implementation Period; the second is during the remainder of the Initial Term, when a customer can terminate their contract with 3 months’ notice.

In practice, this means:
– Pathfinder sites have no financial commitment to Jisc during the Initial Term.
– New customers commit to the charges payable in the 6 months following the Pilot Implementation Period.

Service Agreement – Clause 3.1 What SLAs will be in place for the Jisc Learning Analytics Service? We are now in a position to define a set of SLAs for Learning Analytics as it transitions from an R&D project into a Jisc service.

We’ll share our approach with you over the coming months.

The Service Levels will be set out on the Jisc Site. These Service Levels are subject to change and we will notify you should this happen. The Service Agreement does give your Institution considerable flexibility to terminate during the Initial Term if you are not satisfied with the service you are receiving. We want to retain customers and ensure they’re satisfied with the service we’re providing, so in this interim period we’ll be working with you to ensure any concerns you have are promptly addressed.

Service Agreement – Clause 4.8 Only giving institutions 1 month to provide written notice of termination if Jisc amends charges is not enough. The period that Institutions can give notice to terminate if Jisc changes prices has been increased to 2 months.
Service Agreement -Clause 9 What will happen to historical data in the learning records warehouse? Will it routinely be cleared after a period of time, and will this period of time be on a per institution basis? When students leave at the end of an academic year, their historical data can be de-identified if the institution wishes to retain it for data modelling purposes.

If instructed to do so by an Institution, Jisc can also delete all historical data.

Service Agreement – Clauses 8.2.6, 9.1.2, 9.1.5 and 9.1.6. Both parties to the Service Agreement may struggle to fulfil their obligations to notify each other within a 48hr period as set out in Clauses 8.2.6, 9.1.2, 9.1.5 and 9.1.6. A commitment to ‘promptly’ respond would be sufficient for the purposes of this agreement. We’ve received a number of queries about the 48hr response time, particularly in relation to Clause 9.1.5.

We’ve noted that some institutions would struggle to accept a term like ‘promptly’ without it being linked to a stated timescale.

Given that Learning Analytics is a new service for customers and Jisc, we may need to review these response times in the light of our experiences once the service has been up and running for a time.

For now, we believe that 48hr is a reasonable starting point for these notice periods.

Also see the query about Clause 9.1.5 below.

Service Agreement -Clause 9.1.3. What safeguards are in place to protect personal data?

What security standards will be in place to protect my Institution’s data? When will I be able to see details about them?

The new Service Agreement identifies the obligations on Jisc and the sub-contractors it uses to ensure personal data is safely stored and processed.

To ensure we meet these obligations we are extending the scope of Jisc’s ‘ISO 27001 – Information Security Management’ certificate to include the Learning Analytics Service. This work will be completed in early 2018. We’ll share details of our approach and progress over the coming months.

Details on the scope of Jisc’s current ISO 27001 certificate are available on the Jisc website, see

Clause 9.1.5 How does the timescale for Jisc notification of a data breach to the Institution fit with the deadline an Institution has for reporting a breach to the regulator?

Can the 48hr notice period in Clause 9.1.5 be changed to 24hr, as we must report any breaches to ICO within 72hr?

Under Recital 85 of the General Data Protection Regulation (GDPR) a Controller has up to 72hr to report a personal data breach to the regulator (see )

If the Institution is not aware of the breach itself, then the 72hr reporting deadline for the Institution begins after it has received the breach notification from Jisc.

Jisc did consider whether the notification period could be reduced to 24hr. Based on the current availability of support resources we decided that 48hr was the shortest timescale we could reasonably expect to meet, particularly over a weekend.
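To make the arithmetic of these notice periods concrete, here is a small worked example (purely an illustration of the timings discussed above, not legal advice; the dates are invented). In the worst case, Jisc notifies at the 48hr deadline and the institution’s own 72hr GDPR clock starts from that point:

```python
from datetime import datetime, timedelta

# Invented example: Jisc becomes aware of a breach at 09:00 on 1 Sept 2017.
jisc_aware = datetime(2017, 9, 1, 9, 0)

# Clause 9.1.5: Jisc notifies the institution within 48hr.
jisc_notify_deadline = jisc_aware + timedelta(hours=48)

# Worst case: the institution only becomes aware at that deadline,
# and its 72hr GDPR reporting window starts then.
institution_aware = jisc_notify_deadline
ico_deadline = institution_aware + timedelta(hours=72)

print(jisc_notify_deadline)
print(ico_deadline)
```

So even in the worst case the institution still has its full 72 hours after notification, although the total elapsed time from the original incident can reach 120 hours.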

Service Agreement – Clause 9.1.8 Under GDPR, can an Institution rely on consent being sufficient to cover the transfer of data outside the EEA? This query arose in relation to the transfer of personal data outside the EEA by a Jisc sub-contractor. The sub-contractor, based in the USA, would need to use terminal services sessions to access data stored in the EEA. Access to data via a terminal services session involves a data transfer.

Our approach has been to replace an institution’s reliance on providing consent for a data transfer, with a commitment to ensure that data is transferred under a compliant mechanism, such as Privacy Shield or through the adoption of EU Model Clauses.

Service Agreement – Clause 9.1.9a The current clause does not take into account whether there have been any breaches, near misses, etc. Clause 9.1.9a has been amended to allow an Institution to undertake an audit following an actual or ‘near miss’ personal data breach.

Service Agreement – Clause 9.1.10 What are the arrangements for the return/deletion of data at termination? The process is described in Clause 9.1.10.

The time allowed for the return and deletion of data and any back-ups upon termination is now 60 days.

Service Agreement – Clause 17 Can a clause be added to allow us to exit the agreement with immediate effect if we do not approve of a sub-contractor Jisc has appointed? An Institution may not be using the services of the newly appointed sub-contractor. Allowing an automatic right to terminate in this situation could introduce the risk that termination may be triggered without Jisc first having the chance to discuss the practical implications of the appointment with the Institution.

As a result, we decided not to incorporate this change.

Service Agreement – Clause 11 I’m an existing pathfinder site. We’re implementing Tribal Student Insight as part of the cohort that Jisc is funding. My institution would like Tribal to process the 5 years of historic data we have. Three years of historic data was the maximum amount we’d previously discussed under these arrangements. Is there anything I’m contractually obliged to do before I pass this additional data to Tribal? You are not contractually obliged to do anything further in order for your data to be passed to Tribal.

Tribal is an existing sub-contractor of Jisc and Clause 11 of the Service Agreement allows Jisc to disclose an Institution’s data to its sub-contractors solely for the performance of Learning Analytics services.

You’ll probably need to discuss the extra years of historic data you’d like to use with Tribal and Jisc, so that you can confirm the practicalities of when and how the data will be made available.

It is also worth noting that, as the Data Controller, each institution can decide what data to release for use by the Jisc Learning Analytics service or any Add-On services.

Service Agreement – Clause 11.3 Why has Amazon Web Services been singled out for a unique sub-clause in Clause 11.3? We included separate wording for the Jisc sub-contract with Amazon Web Services (AWS) because:
• AWS provides the hosting service for many components of the Jisc Learning Analytics architecture, so is an existing core Jisc sub-contractor.
• AWS does not negotiate on, or vary, its standard terms and conditions of service. With other Jisc sub-contractors, such as HT2 Ltd (which maintains the Learning Records Warehouse), we have been able to negotiate to ensure that the data protection terms in our sub-contracts with them are compliant with the new EU General Data Protection Regulation (GDPR). In the case of AWS, although we cannot control the data protection clauses it uses in its contract with us, we expect that, as a world leader in hosting services, it will have GDPR-compliant processes and agreements in place to meet its obligations under GDPR when it comes into force in May 2018.
Service Agreement – Clauses 15.2 and 15.3 What in ‘Clause 15: Liability’ addresses the larger liabilities under GDPR, such as fines?

Does the liability position in the agreement allow an Institution to recover from Jisc the cost of any regulatory fines that it may receive for a data protection breach?

Is the liability position in the agreement suitable for the period prior to implementation of GDPR (May 2018)?

GDPR places specific legal obligations on Data Processors (in this case Jisc). This marks a significant change from the position under the Data Protection Act, where the Data Controller (in this case the Institution) was responsible for ensuring legal compliance.

Under GDPR the Data Processor (Jisc) will have a statutory obligation to implement appropriate security measures to protect the personal data made available to it by the institutions (the Data Controller). As such, under GDPR, Jisc (rather than the Data Controller) can be directly fined for a breach of these statutory obligations.

The liability position in the new Services Agreement reflects that situation and is in line with what other suppliers currently offer in the marketplace.

Service Agreement – Clause 15 Do Jisc and the Institution offer to indemnify each other? No. Also see the liability position response above.
Sub-contract Agreement Will there be a pro-forma agreement that Institutions can use for Add-On Services they procure under the DPS? If an Institution opts to buy Add-On Services via a mini-competition in the Jisc Dynamic Purchasing System (DPS) for Learning Analytics, then the contract for those services will be directly between the Institution and supplier.

In a static procurement framework suppliers generally sign up to use a standard pro-forma contract with their customers. The DPS operates slightly differently. Under the DPS an Institution can use its own contract wording or the supplier’s.

In a bid to ensure that all suppliers adopt a consistent approach to data protection/GDPR-compliance, Jisc has provided a standard set of data protection clauses and mandated that suppliers use these in their contracts with Institutions.

Jisc is also in the process of reviewing some final changes to our sub-contract agreement. Once finalised, we can make this available as a starting point for contracts Institutions may enter into with suppliers under the DPS.

11th UK Learning Analytics Network meeting, Aston University, Birmingham, 5th Sept 2017

Originally posted on Effective Learning Analytics.

The 11th UK Learning Analytics Network meeting is being hosted by Aston University in central Birmingham on Tuesday 5th September 2017.

Booking form

People working in groups at tables

The room lends itself to working in small groups, so we thought we’d have fewer presentations and a bit more hands-on activity this time. In the morning Paul and I will be facilitating a cut-down version of our interventions workshop. After lunch there’ll be a session on Jisc’s Data Explorer tool, where you’ll be able to try out the system and help improve its usability and feature set. We also hope to get feedback from the latest user meeting of institutions implementing Tribal Student Insight. Finally, we’re looking forward to hearing how Aston is getting on with its own project.

As usual there’ll be plenty of opportunity for networking with people from other institutions who are implementing learning analytics.

You’re advised to book early as there are limited places available and the event is likely to be oversubscribed.


Draft Agenda

10:00 – 16:00, Tues 5th September 2017
Hosted by Aston University

Room 512, Main Building
Aston University
Aston Triangle
Birmingham B4 7ET

Directions and parking      Campus Map [PDF]

09:30 – 10:15 Arrival and coffee
10:15 – 10:25 Arrangements for the day & welcome to Aston University
10:25 – 11:00 Update on Jisc’s Effective Learning Analytics project Paul Bailey, Lee Baylis, Rob Wyn Jones
11:00 – 12:00 Planning interventions – interactive session Niall Sclater, Paul Bailey
12:00 – 13:00 Lunch and networking
13:00 – 13:50 Implementing Tribal Student Insight – feedback from user meeting [TBC]
13:50 – 15:00 Enhancing Data Explorer – interactive session Torchbox [TBC]
15:00 – 15:15 Tea / coffee
15:15 – 15:55 Aston’s learning analytics project
15:55 – 16:00 Farewell