Are robots taking our jobs?

Originally posted on Inspiring learning.

Yes, and they have been for generations

American union leader and civil rights activist Walter Reuther (1907 – 1970) led the United Automobile Workers union of five million auto workers, retired workers and their families. He was recognised by Time Magazine as one of the 100 most influential people of the twentieth century.

Walter was being shown around the Ford Motor plant in Cleveland in the 1950s. A company official proudly pointed to some new automatic machines and said, “How are you going to collect union dues from these guys?”

Walter replied, “And how are you going to get them to buy Fords?”


For my keynote at the second annual NTfW teaching and learning conference, I wanted to focus on robots and artificial intelligence (AI). I asked our Jisc Sanbot to do the introductions, which she did by video (above). She greeted the audience in Cardiff, emphasised the importance of digital skills for learners and described the coming session. She demonstrated that she could do at least part of my job as a presenter.

I asked the audience which of their past jobs or tasks were now done by robots, computers or machines. You can see their answers in this word cloud.


I shared a short series of videos, highlighting some of the latest developments in robot employment. Have a look at this bricklaying machine working with humans, cocktail mixing robot arms in Las Vegas, Amazon’s army of little orange bots which bring the shelves to the human packers, and the latest demonstration of driverless cars by Waymo.

The aim was to paint a picture of the modern, evolving workplace, and to ask some questions about our delivery for work-based learners. Does our practice support twenty-first-century apprentices and trainees? Are we embedding into programmes the digital skills that enable learners to perform and compete globally?

During this Radio Three discussion in 2015, a panel of experts talked about the kinds of jobs that could be automated over the next few years. They argued that, in addition to lower-skilled jobs, many of the tasks currently carried out by doctors, health professionals, lawyers, teachers, accountants, journalists and others could be done by AI. These would be the more routine and time-consuming parts of the role, such as marking for teachers or collating statistical news stories for journalists. We have all seen our doctors’ surgeries employ triage, online prescription ordering, phone consultations and other services that save time and resources. In 2015, eBay resolved 60,000 disputes online using no traditional lawyers.



The panel agreed that humans will continue to do tasks where they need to be creative, and have good perception and social intelligence in fields like mental health and social work. Stephen Hawking echoed this in 2016 saying,

“The automation of factories has already decimated jobs in traditional manufacturing, and the rise of artificial intelligence is likely to extend this job destruction deep into the middle classes, with only the most caring, creative or supervisory roles remaining.”

Despite this, two years later robots started to appear in situations that we associate with emotional and moral ability. Here’s a Pepper robot being sold in Japan as a Buddhist funeral priest, at a quarter of the cost of a human priest (per funeral).

I asked the audience if we should be optimistic or pessimistic about our future with robots and AI.  You can see the overall scores from the room in the chart below.


The first two statements were optimistic views of automation and the third and fourth were more pessimistic. The statements were taken from this 2017 Guardian article asking the same question – Meet your new cobot: is a machine coming for your job? You can see that the audience was quite well balanced between optimistic and pessimistic, and this demonstrates one of the challenges. There are so many good things and so many dangers that it’s difficult to choose a side.

Whatever our view, though, we need to lay the foundations for our learners and prepare them for the twenty-first-century workplace with up-to-date skills and experiences. As learning providers, we must work together and share the latest thinking, the latest ideas and the latest tools and techniques.


You can find the resources from the conference here in English and here in Welsh.


Dragon hunting at #Digifest18

Originally posted on Inspiring learning.

Last week was Digifest, Jisc’s annual conference in Birmingham. It’s always a good opportunity to take the temperature of what’s happening in the sector and meet up with members.

This year I was running a workshop with a few people from Higher Education institutions on how digital storytelling (DS) can enhance the student experience. I called it Going Dragon Hunting, the reason for which was made clear in the workshop but will otherwise have to remain a mystery unless you want to talk to me about it. 😉

The aim was to introduce participants to what digital storytelling could achieve by showcasing some excellent practice and then giving people an opportunity to explore storytelling for themselves.

Showcasing good practice

Dr Liz Austen from Sheffield Hallam University gave a valuable insight into the results of a project coordinated by Yorkshire Universities to use DS to work with students from traditionally “hard to reach” groups such as people from black and other ethnic minority families, mature students and so on. You can see the outputs on the Yorkshire Universities website.

This has been an excellent project and they have shared guidance (pdf) for others looking to undertake a similar approach.

Richard Beggs joined us virtually, having been laid low by some evil lurgy which meant he couldn’t travel from Belfast, where he’s a curriculum design consultant at Ulster University. You can’t keep Richard down, though: he recorded his presentation specially from his sick bed (or near it), and you can watch it below. Richard focused more on the teaching and learning aspects of storytelling, and he’s helping Ulster Uni take it in very interesting directions.

From my perspective, it was great to see what is happening in the sector. Usually I deliver workshops to people at their institutions, but there’s little opportunity for detailed follow-up, so I seldom see where people take their skills and knowledge.

It’s always gratifying seeing people applying DS to situations and in ways that I’d never anticipated.

Exploring opportunities

Once we’d spent some time discussing the above examples, participants engaged with the raw act of telling stories and trying to visualise them. Liz and I then posed the question: what opportunities could they see for DS in their institution, and what would they have to bear in mind to do it successfully?

You can see the output of the activity on this Padlet. There were quite a lot of ideas!

Jisc’s new Digital Storytelling workshop

This was all quite timely, as I’m going to be running our first public online version of the digital storytelling workshop that I usually run on a bespoke basis for institutions. If you’re interested in finding out how to create your own digital stories and manage storytelling projects, this is a really good opportunity.

Final reflection

On the train to Birmingham, I stumbled across an article on the BBC website where a former student was talking about trauma they had experienced as a student. This was a story about an all too common occurrence and it was a stark reminder that the “student experience” for some is not all about learning, development and growth but about something much harder to face. I wrote a personal blog post on it if you want to know more.



Piloting the Digital discovery tool with students

Originally posted on Jisc Building Digital Capability Blog .

While our current pilot projects have been getting the Discovery tool into the hands of staff, we’ve been working behind the scenes on the student version. We’re pleased to say that it is testing well with users, with students particularly keen on the detailed final report. We’ll be promoting this more positively to learners as the prize at the end of their journey. Meanwhile we’re making some final improvements to the content, thanks to all the feedback from users and experts.

All this means that we’re looking for existing pilot institutions that are keen to extend the experience to students. You can express an interest by completing this sign-up form, and you can read more about what’s involved below.

About the student Digital discovery tool


The student version is designed exactly like the staff version, as described in this blog post: users answer questions of three types, receive a detailed feedback report with suggested next steps, and get links to resources.

The content is designed to be:

  • Practice based: users start with practical issues, and the language is designed to be accessible and familiar
  • Self-reported: we trust users to report on their own digital practices. We attach very little weight to self-reported confidence, but we do expect learners to report accurately on what they do in specific situations (depth), and on which digital activities they undertake routinely (breadth).
  • Nudges and tips: the questions are designed to get users thinking about new practices and ideas before they even get to their feedback report.
  • Generic: different subject areas present very different opportunities to develop digital skills – and make very different demands. We aim to recognise practices that have been gained on course (after all these make an important contribution to students’ digital capability!) but where possible we reference co-curricular activities that all students could access.

Student users will find only one assessment on their dashboard, unlike many staff who will find a role-specialised assessment alongside the generic assessment ‘for all’. Most of the elements in the student assessment are the same as in the staff generic assessment, mapped to the digital capabilities framework. But the content is adapted to be more relevant to students, and the resources they access are designed to be student-facing, even where they deal with many of the same issues.

The ‘learning’ element of the framework is split across two areas to reflect its importance to students. These are ‘preparing to learn‘ with digital tools (mainly issues around managing access, information, time and tasks), and ‘digital learning activities‘. There is also an additional element, ‘digital skills for work‘, that sits at the same level as ‘digital identity’ and ‘digital wellbeing’ in the framework, reflecting the importance of the future workplace in learners’ overall motivation to develop their digital skills.

The feedback encourages learners to think about which elements they want to develop further, based on their own course specialism and personal interests. Where they score low on issues such as digital identity that we know are critical, we prompt them to seek advice. So use of the discovery tool may lead to higher levels of uptake of other resources and opportunities – and we hope this is seen as a successful outcome!

There is some minor variation between the versions for HE and FE students, but we have done our best to keep these to a minimum. Our research and consultations don’t suggest that sector is an important factor in discriminating the digital skills students have or need. However, we do recognise that students vary a great deal in their familiarity with the digital systems used in colleges and universities. So we’ve designed this assessment to be suitable for students who are some way into their learning career, right up to those preparing for work.

It is not intended for arriving or pre-arrival students. We are considering a special assessment  for students at this important transition, but there are some problems with developing this:

  1. These students vary much more in their experience of digital learning, so it is much harder to design content that is not too challenging (and off-putting) for some, yet not too basic for others.
  2. We are concerned that organisations might see it as a substitute for preparing students effectively to study in digital settings – this is not a responsibility that can be delivered by a self-reflective tool.
  3. We have learned from students that the most important content of an induction or pre-induction ‘toolkit’ is institution-specific – depending on the specific systems and policies in place.

So at the moment our focus for arriving students is to work with Tracker users to design a digital induction ‘toolbag’. The ‘bag’ is simply a framework that colleges can use to determine for themselves – from their Tracker findings and other data – how they want arriving students to think about digital learning, and what ‘kit’ of skills, devices etc they will need. More on this over on the Tracker blog soon.

What the Digital discovery tool for students is not

As above, the Discovery tool is not an induction toolkit, or any kind of toolkit. It doesn’t deal with local systems and policies, which are critical to students becoming capable learners in your institution. It does prompt learners to think about a whole range of skills, including their general fluency and productivity with digital tools, which will support them to adopt new systems and practices while they are learning.

The Discovery tool offers access to high-quality, freely available resources, in a way that encourages learners to try them. In future you may be able to point students to your own local resources as well. But it isn’t a course of study and there’s no guarantee that learners will follow up the suggestions or resources offered.

The scoring system is designed to ensure students get relevant feedback, and to motivate them to persevere to the report at the end. It has no objective meaning and should not be used for assessment, either formally or informally. We have deliberately designed the questions to be informative, so it’s always clear what ‘more advanced’ practice looks like. Users who want to gain an artificially high score can do so easily, but we don’t find this happening – so long as they see the development report as the prize, rather than the score itself.

About the pilot process

Just like the staff pilot, we’re looking for quality feedback at this stage. If you’d like to be part of the journey, we’d be delighted to have your support. You’ll need to complete this sign-up form before the end of Wednesday 21st March – it’s a simple expression of interest – after which we’ll notify participants and send out your log-in codes. Our Guidance has been updated to ensure it is also relevant to the student pilot, and you’ll have dedicated email support. Access will be open to students until the end of May 2018.

Because this is a pilot, we are still improving the content and still learning how best to introduce it to students to have the most positive outcomes. This means changes are likely. It also means we’ll ask you and your students to give feedback on your experiences, as with the staff pilot.

Join us now: complete the sign-up form.


Next generation [digital] learning environments: Community Voices

Originally posted on lawrie : converged.

As Technology Enhanced Learning continues to develop, it is clear that some form of digital learning environment will remain core to institutional practices; the levels of integration, features and porosity will continue to change, driven by, and potentially driving, the behavioural shifts we see in staff and students.

Digifest18 was where Jisc launched “Next generation [digital] learning environments: present and future”. The report looked back at the hundreds of discussions generated from Co-design 16-17. To launch the report, a panel of delegates who had been involved in the co-design process discussed its implications and their personal perspectives.

The authors of the report were myself, Rob Allen and Dave Allen, with substantial advice and contributions from our co-authors Damian Chapman, Nicola Whitton, Simon Thomson, Peter Bryant, Aftab Hussain, Simon Wood, Steve Rowett, Anne-Marie Scott and Anders Krohn.

To create an accessible document, the report was broken down into seven key themes:

  • Current good practice
  • Large enterprise approaches
  • From institution to individual
  • Self-starter and individual approaches
  • Analytics and learning environments
  • Emergent models
  • Disruptive approaches in online UX futures

The views in the report are also very diverse, ranging from the technology-focused to the political. Some of the quotes are below, but have a look at the whole report for context:

Policy, competition and the marketization of learning open up the sector to the influence of platforms and practices that argue they have all the answers. The important question for all of us who lead educational change, who teach classes both face-to-face and online, who have skin in the game of the future of higher and further education is: what kind of experience do we want for our students?
Peter Bryant, London School of Economics

The use of machine learning in the education sector is making personal learning environments smarter as they advance the delivery of personalised, adaptive and contextualised services to students. This marks the start of an exciting period where schools, colleges and universities in partnership with technology companies will take their first steps towards developing a personal digital teacher for every lifelong learner.
Aftab Hussain, Bolton College

The development of safe social spaces is at the heart of building meaningful and inclusive learning environments. Online spaces that support learner community-building and the facilitation of deep and trusting relationships are necessary for a feeling of safety in the presence of peers. In these learning spaces students can engage with others in new and playful ways, take risks and learn from failure, build resilience and confidence, be creative, and learn to work with others to solve problems in truly innovative ways.
Nicola Whitton, Manchester Metropolitan University

Learning analytics approaches have significant potential, but institutions need to move past the dominant focus on retention use cases and data dashboards to realise this. Analytics need to take into consideration the context in which learning is occurring, and better support all students.
Anne-Marie Scott, The University of Edinburgh

The remarkable range of technological developments and appetite to interact both with a growing diverse student voice and diverse approaches to learning are evident in every aspect of cyberspace and our world of analogue. The next generation of digital learning environments must reach deeper and further into supporting deprived areas of the social spectrum. New cultures of collaboration and participation need to engage, connect knowledge flow… and enable increased diversity of lifestyles and the richness they bring to learning and knowledge acquisition.
Damian Chapman, University of West London

The report is available from the Jisc Repository, “Next generation [digital] learning environments: present and future”, and is published under the CC BY 4.0 licence.


Organisational issues – what are we learning?

Originally posted on Jisc Digital Student.

Today we’re discussing the third element of our 360 degree perspective. Alongside student and teaching staff feedback, we have data from the organisational perspective. If you’re engaged in the Tracker pilot this year, you will have answered questions about ten organisational factors when you confirmed your place on the project. We’ve been keeping this data quiet for a while, as we think it’s most useful viewed together with your findings from learners. But in case you have a bit of thinking space between launching your surveys and analysing your data, here are some highlights for you to chew over.

Testing the questions

Our main goal this year was to test out the questions themselves. We were delighted that you chose to answer most or all of them, as we only asked for a minimum of four responses. This suggests that you found them interesting and potentially valuable, but not too difficult to answer. Having said that, most questions relied on self-assessment, and we had a second concern that responses would cluster towards the more positive end of every scale. We were happy to find the full range of responses was used, even when the options were quite negative (e.g. ‘no clear strategy’).

Once we have some learner data to put alongside them we will test out how well each question performs in relation to key metrics. Meanwhile if you have comments on the questions or ideas for improvement please contact us, share your views on the tracker jiscmail list, or use the evaluation process to give us your feedback.

Who are our Tracker institutions?

We had 180 responses in total: the breakdown of institution types (below) shows 84 Universities, 66 FE colleges, 11 ACL providers, and a mix of others including a fairly high number of dual providers. Taking dual providers into account, we were able to identify 93 Universities/HEIs (or dual providers) running the HE tracker, 68 FE Colleges (or dual) running the FE tracker, and 18 Skills (skills + adult and community) providers running the relevant trackers for their sector(s). These three groups form the basis of the charts that follow. Other groups were not large enough to analyse without risk of identifying individual organisations.
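For anyone doing a similar grouping exercise on their own confirmation data, here is a minimal sketch of the approach described above: map each respondent’s institution type to a sector group and suppress any group too small to report safely. The type labels, the mapping (dual providers are simply counted with HE here, a simplification) and the minimum group size are illustrative assumptions, not the actual values used in this analysis.

```python
from collections import Counter

MIN_GROUP_SIZE = 20  # assumed reporting threshold, not Jisc's actual rule

# Illustrative mapping from institution type to the sector group analysed
SECTOR_GROUP = {
    "University/HEI": "HE",
    "Dual provider": "HE",        # simplification: dual providers counted with HE
    "FE college": "FE",
    "ACL provider": "ACL/skills",
    "Skills provider": "ACL/skills",
}

def group_counts(responses):
    """responses: iterable of dicts with an 'institution_type' key (hypothetical)."""
    counts = Counter(
        SECTOR_GROUP.get(r["institution_type"], "Other") for r in responses
    )
    # Only report groups large enough that individual organisations
    # cannot be identified from the published charts.
    reportable = {g: n for g, n in counts.items() if n >= MIN_GROUP_SIZE}
    suppressed = sorted(g for g, n in counts.items() if n < MIN_GROUP_SIZE)
    return reportable, suppressed
```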

Eighteen non-UK universities are included in the HE group, enough to influence these results considerably. We are running a second analysis using only UK data, so please bear this in mind when looking at the charts.

Institution types

Organisations confirming participation in the tracker 2017-18

Organisations varied hugely in scale, with student numbers ranging from 400 to 175,000 in HE, 520 to over 55,000 in FE, and 172 to 16,500 in the Adult, Community and Skills sector. This last group includes a particularly diverse set of providers. One thing we can say with certainty is that variation within each sector group is more significant than variation across them.

Strategic environment

TEL strategy

BYOD policy

These two charts show the status of the current TEL strategy (or equivalent) and Bring Your Own Device policy (or equivalent) in the organisations that answered these questions. Only a handful did not, so responses are likely to be representative. On the face of it FE colleges seem marginally more confident about their BYOD policies – the mode response was ‘policy is emerging’ compared with ‘no clear policy’ in HE and ACL/skills. Given that tracker institutions have demonstrated that they care about the student digital experience, these low norms for strategy development are quite surprising.

Tech adoption

Pace of new technology adoption

Respondents from the FE sector were slightly more likely to say that their organisation was an ‘early adopter‘ of new technologies (mode and median) than to say they adopted at the ‘pace of their peers‘ (the mode and median for the other sectors): again this has not been tested for significance.

Support for digital students

We used your responses to calculate the number of TEL support staff available per 1,000 students (counting full- and part-time students as equally in need of support). The figures varied dramatically: it seemed that some respondents were counting their entire complement of teaching and student-facing staff as TEL support, which is an interesting position. There is a whole other blog post to be written about how you arrived at your numbers and definitions. Excluding these outliers, and the very large numbers who did not answer this question (do people actually know?), the median comes in somewhere under 1 per 1,000 across the sectors, but still with large variations that deserve further exploration.
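If you want to reproduce this calculation with your own figures, here is a minimal sketch of the ratio and the outlier-excluded median. The field names and the plausibility cut-off are assumptions for illustration; they are not the exact rules applied to the pilot data.

```python
import statistics

def tel_support_per_1000(tel_staff, ft_students, pt_students):
    """TEL support staff per 1,000 students, counting full- and part-time
    students equally, as in the analysis above."""
    return 1000 * tel_staff / (ft_students + pt_students)

def median_support_ratio(responses, max_plausible=20):
    """Median ratio across providers, skipping unanswered questions and
    excluding implausibly high figures (e.g. where all teaching staff appear
    to have been counted as TEL support). The cut-off is illustrative only."""
    ratios = [
        tel_support_per_1000(r["tel_staff"], r["ft_students"], r["pt_students"])
        for r in responses
        if r.get("tel_staff") is not None
    ]
    kept = [x for x in ratios if x <= max_plausible]
    return statistics.median(kept) if kept else None
```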

We asked you about four different strategies for engaging students in their digital experience, and four different modes of student digital support. The different sectors had very different profiles of support, as shown in these charts.

Student digital support

Sources of digital support for students

Student engagement

Engaging students in their digital experience

Universities were more likely to point students at online support than other sectors: they were also more likely to engage with course reps on digital issues. Looking at the volumes of these charts, universities appear to be delivering better on both student digital support and student engagement. Seventeen percent of HE respondents said they used all four means of engaging students, and 36% said they made all four means of digital support available (the corresponding figures for FE and ACL/skills were much lower). But these findings need to be tested and explored further, and of course we need to test the institutional perspective against what students say!

Staff CPD

% of teaching staff undertaking TEL-related CPD in previous 2 years

Reversing the trend, when it comes to the percentage of teaching staff who have undertaken TEL-related CPD in the last two years, the HE sector looks less committed to staff development than the other two. Only in ACL does the median come in above 50%, though this is on a small sample of providers.

The digital environment for learning

Finally we asked you to assess the effectiveness of your VLE/online learning environment, the standard of course-related software, and the digital readiness of learning spaces – three issues that directly affect the digital learning experience. All have corresponding questions in the staff and student tracker, allowing direct comparisons to be made.

VLE use

% of courses making effective use of a virtual/online learning environment

Up to date software

% of courses using industry-standard, up-to-date software

Learning spaces

% of learning spaces adapted for digital learning

On all three issues, HE and FE responses are remarkably similar. Around 40% of organisational leads felt that (at least half of) their courses offered up-to-date software provision and effective use of the learning environment. At 60%, confidence in the status of learning spaces was higher. However, none of these results show confidence that the majority of learners are having a high quality digital experience. Responses from ACL were almost certainly skewed by the small, self-selecting sample.

As with other questions, it will be interesting to know how you assessed these issues. Who did you engage with? What existing data was valuable? How accurate do you think your own assessment will prove to be when measured against the perspectives of teaching staff and students? Please share any methods that were particularly effective or that led to interesting conversations and engagement that you might not otherwise have had.

With all the issues raised by this very interesting exercise, please comment on this blog post or on our closed email list!


Notes and presentations from the 13th Learning Analytics Network meeting at the University of Edinburgh

Originally posted on Effective Learning Analytics.

Our latest UK Learning Analytics Network meeting was kindly hosted by the University of Edinburgh. Sessions varied from details of the range of innovations Edinburgh is involved with, to using assessment data, to student wellbeing and mental health, to current developments with Civitas Learning’s products and Jisc’s learning analytics service.

Arthur's Seat, Edinburgh

The hashtag was #jisclan if you want to check the tweets. Video recordings of the sessions are also available.

The day was introduced by Sian Bayne, Professor of Digital Education & Assistant Principal for Digital Education at Edinburgh. One of the things she discussed was the University’s agreed principles for learning analytics, which are worth repeating:

  1. As an institution we understand that data never provides the whole picture about students’ capacities or likelihood of success, and it will therefore not be used to inform significant action at an individual level without human intervention;
  2. Our vision is that learning analytics can benefit all students in reaching their full academic potential. While we recognise that some of the insights from learning analytics may be directed more at some students than others, we do not propose a deficit model targeted only at supporting students at risk of failure;
  3. We will be transparent about how we collect and use data, with whom we share it, where consent applies, and where responsibilities for the ethical use of data lie;
  4. We recognise that data and algorithms can contain and perpetuate bias, and will actively work to recognise and minimise any potential negative impacts;
  5. Good governance will be core to our approach, to ensure learning analytics projects and implementations are conducted according to defined ethical principles and align with organisational strategy, policy and values;
  6. The introduction of learning analytics systems will be supported by focused staff and student development activities to build our institutional capacity;
  7. Data generated from learning analytics will not be used to monitor staff performance, unless specifically authorised following additional consultation.

The separate purposes for learning analytics defined by Edinburgh include: quality, equity, personalised feedback, coping with scale, student experience, skills and efficiency.

Next we had an update on Jisc’s data and visualisation services from Michael Webb, Rob Wyn Jones and Lee Baylis (ppt 5.6MB).

Working with the Higher Education Statistics Agency (HESA), Jisc has developed a business intelligence shared service for education, which includes analytics labs and community dashboards. These tools allow institutions to review data comparatively from quality sources – and compare themselves with peer organisations.

The community dashboards released by Jisc


Our final session before lunch was a workshop on learning analytics and student mental health. Sam Ahern from University College London kicked this off with a presentation (pdf 1.1MB). Sam had examined data relating to Moodle accesses, library and laptop loans, together with student records – and showed details of some of her findings. With a particular interest in student wellbeing, her proposed research questions for institutions with an interest in analytics and mental health are:

  1. What current wellbeing or personal tutoring policies are in place?
  2. What data, if any, is supposed to support these?
  3. What are the data flows?

We then opened things out to the audience and asked them to discuss the question:

What data sources would be most helpful for supporting student wellbeing?

The results, below, show that attendance and VLE usage were thought to be the main potential sources. Nothing to do, I’m sure, with the fact that I gave these as examples just before asking the question… However, there’s a rich collection of other suggestions here which I shall be poring over in greater detail as we develop our thinking in this area with colleagues.

Data sources for student wellbeing





We then asked the group what challenges or issues they thought would be faced when gathering and using this data.  Responses included:

  1. Resource: do we have the resource to support students using this data? To what extent do we want to use data and algorithms as a sticking plaster for lack of academic resource?
  2. How comfortable will students be with the data collected, the way it’s shared, etc.?
  3. Will some students game the data? Some of them will not be thinking or acting rationally.
  4. What’s the legality of sharing the data?
    1. The involvement of a health professional would help
    2. There’s a balance between duty of care and culpability
    3. We could be generating sensitive data
  5. Whose responsibility is it to follow up? Everyone’s? We need coherent processes and should augment those already in place rather than starting from scratch.
  6. Damage could be caused by poor quality data in dashboards
  7. Fear of acting: around a third of institutions do not have a wellbeing strategy in place.
  8. Getting permission to access social media data would be helpful but is logistically difficult

Finally we asked the question:

Should learning analytics be used to support student wellbeing?

We received 30 responses to this question: 25 said yes, 5 said no. Admittedly this is a self-selected group of people who are likely to be positive about the idea. My question is: how can we not at least explore the potential of using this data to help the very many students with mental health issues – and potentially even to save lives? As Sheila MacNeill points out, though, it’s not just staff who should be consulted about this:
Sheila MacNeill's tweet






View from Appleton Tower

After lunch it was the turn of Anne-Marie Scott & Yi-Shan Tsai, with Anne-Marie telling us about the wealth of learning analytics projects at Edinburgh, and Yi-Shan talking in-depth about the Supporting Higher Education to Integrate Learning Analytics (SHEILA) project (ppt 10.3MB).

Anne-Marie discussed the analytics carried out on some of the University’s MOOCs. Almost all participants were studying them “to learn new things”. Many were doing so for career purposes too though this varied depending on their age. She showed various other interesting visualisations of the MOOC data. They’d found that the data required a lot of cleaning before analysis.

Using Civitas Learning’s Illume product, Edinburgh had also carried out predictive analytics on students’ likelihood of success.

Yi-Shan then discussed the SHEILA project which is building a policy framework for learning analytics, looking at adoption challenges, institutional strategy and policy, and the interests and concerns of stakeholders.

The next session was from Michael Webb, Jisc’s director of technology and analytics, and Kerr Gardner, who is working with Jisc and Turnitin to look at the use of assessment data for learning analytics. In this interactive session, the group fed back their suggested requirements for the Turnitin data that would be of most use to them. Stay tuned to this blog for updates on Kerr’s progress in this area.

Finally we had an interesting presentation from Mara Richard and Chris Greenough of Civitas Learning (pdf 9MB). They explained how their predictive modelling system works and suggested that there are big differences between the UK universities they’d worked with. They looked particularly at modules attempted versus modules earned to help understand student burn-out and resilience at different institutions.
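As a very rough illustration of the kind of signal involved (and emphatically not Civitas’s actual modelling), a modules-earned to modules-attempted ratio can be computed like this:

```python
def completion_ratio(modules_earned, modules_attempted):
    """Fraction of attempted modules successfully completed: a simple,
    illustrative persistence signal, far cruder than the predictive
    models described above."""
    if modules_attempted == 0:
        return None
    return modules_earned / modules_attempted

# Example: a student who earned 4 of the 6 modules they attempted
print(round(completion_ratio(4, 6), 2))  # 0.67
```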

Mara suggested developing “mindset principles” such as belonging, normalising, goal setting and empathy to help students persist, along with nudges to encourage certain behaviours. She also put out a call to institutions that wish to work with Civitas’s data platform and predictive modelling in an integrated way with Jisc’s learning data hub.

Our next meeting is likely to be in May somewhere in England North of Watford Gap. Details will be announced on this blog and via the analytics@jiscmail list.


VLE Modelling

Originally posted on e-Learning Stuff.

Lego bricks

Despite many people talking about the death of the VLE over the years, the institutional VLE is still an important component of what most colleges and universities offer in the online space, whether this is supporting existing programmes of study, a blended approach, or fully online programmes.

For the purposes of this blog post, I see the VLE as a concept, more a combination of tools, that includes the institutional LMS/VLE alongside other tools such as Padlet, WordPress, Twitter and Adobe Connect.

The challenge for many academics and staff is the assumption often made by managers and learning technologists that they are able to create curriculum models that incorporate the VLE in a way which flows and is integrated for the learner. This is exacerbated if the VLE is more than just a Learning Management System (LMS) and incorporates other web tools and services.

Why would they?

Unless they have a core understanding of the potential of the different functions and tools within the VLE, how are they able to ensure these are fully integrated into the curriculum flow?

The result, more often than not, is that the VLE is bolted or duct-taped onto an existing curriculum model. This creates extra work for academics, who find the whole process of adding (not embedding) the VLE a chore, an extra, so it’s no wonder we occasionally see resistance.

Now I am not saying that academics are not capable of building curriculum models where the use of the VLE is embedded and integrated. What I am trying to say is that when it comes to embedding the VLE, it takes more than just training and development in the use of the functions and tools. That will certainly enable academics to start the process of developing curriculum models. However, providing some exemplar curriculum models where the VLE is embedded will enable academics to reflect and think about how to embed the VLE at a faster pace. Once academics are creating their own models, or adapting those provided, these can then be shared back.

Lego people

I’ve always thought that when it comes to change, the question is how you can make it easier. If something you’re doing isn’t working, then do something differently.

Finally, always reflect on why you are doing this. As I posted recently, though we talk about embedding digital technologies into practice, the reality is that what we want to do is undertake practices differently, and one way of doing this is through the use of digital. This isn’t about trying to increase the use of the VLE; it’s about using the VLE to solve a range of other issues, such as how to ensure learners can access a range of materials, resources, activities and conversations at a pace, time and place that suits them, on a device of their choosing.

New developments #2: mini surveys, teacher landings and more

Originally posted on Jisc Digital Student.


A mini-survey could save staff time

As well as launching our staff tracker this week, we’re putting another idea to the test. Alongside the standard staff survey we’ll be offering a mini survey with around half the questions. The 12 providers who are involved in the staff pilot will be assigned one of two options – litre or pint-sized – to trial with a number of teaching staff. By piloting both at the same time we hope to find out:

  • How much actionable data is lost by using only half the questions
  • What is gained in terms of ease of use, engagement, and ease of analysis
  • Actual time/cost savings from running a smaller survey with no customisation
  • Whether there are sector or other differences that make different versions preferable in different settings

Depending on the results, we’ll consider trialling a mini survey for students too. And of course we’ll be asking pilots to have their say.

Why mini surveys?

Surveys occupy precious resources. We are constantly trying to balance the value of their feedback with respect for staff and student time. We also know how much time our institutional leads devote to running the tracker and analysing the data. Ideally, mini surveys can achieve many of the same outcomes in less time. Even if they don’t work for everyone, they may be useful as a complement to the standard survey, perhaps to run in alternate years or as a low-risk starting point for new institutions.

Users of mini trackers will be able to benchmark their findings with other organisations in their sector, just as users of the standard tracker can do. It won’t be possible to include standard and mini users in the same benchmarking group, even where they are in the same sector and using the same questions. However, it will be possible to compare data across the two surveys outside of BOS, as we will do our own cross-analysis and publish the combined results. So users can if they wish benchmark their findings with the wider group of respondents.
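To make the idea of cross-analysis outside of BOS concrete, here is a hypothetical sketch that combines exports from the two survey versions on the questions they share and reports agreement rates overall and by version. The file names, question identifiers and answer labels are all assumptions for illustration only.

```python
import pandas as pd

SHARED_QUESTIONS = ["q_vle_reliable", "q_training_offered"]  # assumed question IDs

standard = pd.read_csv("standard_staff_tracker_export.csv")  # hypothetical exports
mini = pd.read_csv("mini_staff_tracker_export.csv")

# Stack the two exports on their shared questions, keeping track of the version
combined = pd.concat(
    [standard[SHARED_QUESTIONS].assign(version="standard"),
     mini[SHARED_QUESTIONS].assign(version="mini")],
    ignore_index=True,
)

# Proportion answering 'Agree' to each shared question, overall and by version
for q in SHARED_QUESTIONS:
    overall = (combined[q] == "Agree").mean()
    by_version = combined.groupby("version")[q].apply(lambda s: (s == "Agree").mean())
    print(q, round(overall, 2), by_version.round(2).to_dict())
```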

What’s in the mini surveys?


Mini surveys offer a number of key questions from the standard survey

We’ve chosen equivalent questions from the staff and student trackers to include in the mini trackers. They are questions that map well across the two different user groups and in the case of the student tracker they are questions with proven value.

You can now try out the teaching staff trackers online: standard staff tracker | mini staff tracker

These are open access test versions: you can take them as often as you like. Asterisks show where questions are substantially the same as in the student tracker, which you can also try out here: standard student tracker (HE version). The mini student tracker will be available shortly.

We welcome feedback on the staff questions using this googledoc pro-forma, as we will be making improvements after the first pilot.

Further support for teaching staff

There is now a landing page for teaching staff to explain more about the tracker and how it’s being used. This is partly for teachers who encounter the staff survey in the 12 pilot institutions, and partly so that teaching staff at any institution can understand more about the project. As outlined in the previous post, teachers need to know that their viewpoint is valued. And teachers are often the gatekeepers to student engagement. You’ll notice that we’re moving away from calling this project the ‘tracker’ because of negative associations with monitoring. Instead we are using its proper title going forward: ‘Student digital experience service’.


Teaching staff landing page

We welcome feedback on this page. It’s not meant as a substitute for local messages and engagement activities, and it doesn’t appear in menus. Organisational leads can decide whether or not to promote it, and whether or not to use some of the material in local communications.

More on the staff tracker: custom questions


Customisable page in the teaching staff tracker

One of the most-liked features of the student tracker is the option to include custom questions. This means organisations can maximise the value of engagement and reduce survey load. Custom questions can be added to the standard versions of both the student and teaching staff trackers, and we include guidance to ensure the survey experience is seamless for users and the data collected is of a high quality.

Our parallel experience of developing the Digital discovery tool and talking to users about what they want from it has really helped us to appreciate the value of custom questions in the tracker. The discovery tool provides individuals with reflection and feedback. But we know that at an organisational level there’s a need for accurate data about issues such as:

  • what proportion of staff are proficient users of organisational systems, e.g. Echo360, SharePoint, etc.
  • what forms of training and development staff want
  • what features of the VLE are used by staff (perhaps tracking differences across subject areas)
  • how staff behaviour and/or attitudes change over time or over the course of a strategic initiative
  • how digital practice differs across the organisation

BOS is the perfect tool to answer these questions, as it is designed to provide accurate data from anonymous survey responses and summarise it at volume. Project leads will be able to add in exactly those questions they need to answer and can compare the answers over different broad groups of staff, or track them over time. Some of the questions asked in the standard staff tracker can also be used as key indicators.


Standard questions on teaching practice could be used as key indicators, alongside custom questions on local issues

Lightweight guidance on using the staff tracker will be produced in early March and circulated on the jiscmail list as well as on this blog.

New developments #1: launching our teaching staff pilot

Originally posted on Jisc Digital Student.

We started the new year by talking about a new, ‘360 degree’ approach to understanding and  enhancing the student digital experience. Here is a snapshot of our thinking to date.

New service model

Draft diagram created by Jess

Over the next few blog posts we’re going to introduce some new members of the tracker family that together make up a virtuous circle of engaging stakeholders, gathering data, and supporting change. Today we’re launching our pilot survey for teaching staff.

The pilot process

  • Sign-up form opens today – this is a simple form asking for basic information only and a rationale for why you are interested in the staff project
  • Sign-up form closes 5 March. Six FE and six HE institutions will be chosen to participate. You will then be contacted with guidance on how to run the pilot project.
  • Staff trackers close on 31 May. You will be asked to complete a short evaluation form during June to help us improve the staff tracker for service.

Why a teaching staff tracker?

Why are we suggesting that you should survey teaching staff as well as students?

We all know that student surveys have a mixed press at the moment. One reason is that students see data being siphoned off to create a national benchmark, rather than being owned and analysed locally, with their input at every stage. We hope we’re addressing that concern with our guidance on engagement and our data co-ownership model.

Another problem is that teaching staff can feel at the mercy of student opinion, with their own perspective being ignored. Teaching staff have their own experience of the digital environment for teaching and learning. They are key to enabling students’ digital experience, but students may not know what constraints and opportunities they have. For these reasons we see teaching staff as providing an essential piece of the overall picture.

This diagram from Tabetha explains how the two surveys are connected. We ask students to rate the quality of digital provision (the digital environment provided by the organisation) and the quality of the digital learning and teaching experience. We ask staff to rate the same digital provision from their perspective. We also ask them about how they are supported, especially through professional development, to provide a high quality digital teaching and learning experience to students.

360 degree graphic

Linking the staff and student tracker surveys

We follow this up with guidance on analysing the data from the staff tracker, and making sense of it either separately or alongside your student data. (Don’t forget that there is now updated guidance to analysing your staff tracker data, if you are at that stage in the process.)

In future we imagine that organisations might want to run staff and student trackers alternately, both together, or just focus on one or the other instrument as suits their needs. There are two key metrics in each survey for tracking and benchmarking.

What questions are teaching staff asked?

We have blogged before about how we arrived at our questions for teaching staff and you can experience them for yourself in BOS here.

The questions are mapped both to the questions for students and to the questions for organisations. This allows you, as an institution, to explore some of the same issues from three different perspectives. If you have any feedback on the staff questions please use this open googledoc to comment.

What is involved in the pilot?

The process will be very similar to the student tracker, which is why, to begin with, we are only opening up the pilot to organisations that have experience of running that survey in BOS. You will have access to a master tracker from which you copy, customise and launch your own version. There will be some lightweight additional guidance. For this pilot we are asking all participants to distribute the link to their staff via the BOS email system (so that we all have access to the response rate data). If you haven’t used this system before then don’t worry, we will give you support. You can then, if you wish, use the same system for any future runs of the student tracker.

The timescale for this pilot is short, to allow us to analyse the findings alongside the student tracker. We are not expecting you to engage large numbers of staff – this is a first run to find out how the staff tracker can best be used and to help us make any necessary improvements. We hope involvement in the pilot will give you an opportunity to engage with teaching staff about the tracker – perhaps in a way that makes it easier to engage with students through them.

What next?

At the start of March we’ll launch a landing page for teaching staff interested in the tracker. This is particularly important for staff pilot institutions, but it’s also a place that all pilots can send teaching staff for information (and hopefully reassurance) about the project.

In coming posts we’ll be telling you about our new ‘micro’ versions of the staff and student surveys, with some quick guides for those wanting to collect a smaller amount of data more easily. And we’ll be launching ‘360 degrees’ guidance to help you use both surveys, along with your organisational data (remember those 10 questions on the confirmation form?) to really understand your institution in the round and create the best opportunities for change.

Don’t forget that to sign up for the staff tracker pilot you must complete the sign-up form by 5 March at the latest. This is a very simple form – you don’t have to repeat any of the organisational questions we put you through when you signed up to run the student tracker!