CAN 2018: Championing student-staff partnership in an age of change: Call for Contributions

Originally posted on Change Agents' Network.

We are delighted to announce the call for contributions for CAN 2018, hosted by the University of Winchester and supported by Jisc. This year the CAN conference will focus on student-staff partnerships at universities and colleges in an age of change. With significant developments happening across the UK educational landscape, the conference themes ask us to be both reflective and dynamic in our practice, in order to facilitate effective partnerships and improve the student educational experience.

We particularly welcome contributions for the themes below. A brief description of these themes and the types of submissions can be found on this blog page.

  1. Theme One: Keeping Student Engagement and Partnership Relevant in an Age of Change
  2. Theme Two: Researching, Evaluating and Evidencing effective Engagement and Partnership
  3. Theme Three: Developing Digital Capabilities in an Ever Changing Landscape
  4. Theme Four: Ensuring the Student Voice is heard and the Feedback Loop is closed
  5. Theme Five: Student-Staff Partnerships to support Innovation and Inclusivity in the Curriculum
  6. Theme Six: (posters only) Entrepreneurship and Innovation Showcase

Please submit your proposal to CAN@winchester.ac.uk by the 12th January 2018. Forms can be found here.

Please note:

  • We will give priority to contributions which are student-led or collaboratively delivered by staff and students
  • We will confirm which proposals have been accepted by Monday 9th February 2018

Registration for the conference will open in January. Please visit https://can.jiscinvolve.org/wp/2018-can-conference-winchester/ to keep updated on conference developments.

If you have any questions please contact: CAN@winchester.ac.uk

Get Involved

To follow developments about CAN2018 and the Change Agents’ Network, follow @CANagogy and the Twitter hashtag #CAN2018 for this event. You can join the Change Agents’ Network mailing list: http://www.jiscmail.ac.uk/CAN

Notes and presentations from the 12th Learning Analytics Network meeting at the University of Greenwich

Originally posted on Effective Learning Analytics.

The University of Greenwich provided a great setting for our most recent UK Learning Analytics Network meeting on 23rd Nov 2017. The meeting was recorded in three sections:

Part 1: David Maguire (the latter part of his address), Phil Richards, Rob Wyn Jones, Mark Harrington (40 mins 32 secs)

Part 2: Suzanne Owen, Michael Webb, Panel Session (2hrs 5 mins 30 secs)

Part 3: Christine Cooper & Karl Molden (29 mins 10 secs)

Unfortunately Andrew Cormack’s presentation was not recorded. However, much of his thinking on GDPR is captured in his blog posts on the subject.

Prof David Maguire

We started off with a welcome address by Prof David Maguire, Vice Chancellor of the University, who has a strong interest in learning analytics and is Chair of Jisc.

We then had a series of updates from Jisc colleagues on the Effective Learning Analytics Project [ppt 704KB]. Dr Phil Richards, Jisc’s Chief Innovation Officer, kicked this session off with the latest news and his vision for the future. He announced a new joint exploration with Turnitin of how to use assessment data held in that system for learning analytics.

Phil also discussed how SolutionPath Stream will now be fully integrated into Jisc’s learning analytics architecture.

Dr Phil Richards

Phil’s vision is of learning analytics transitioning through various stages to become ever more sophisticated, with personalised and adaptive learning becoming increasingly feasible.

Rob Wyn Jones then updated us on Jisc’s plans for its Learning Analytics Service. The Beta Service will be active until 31st July 2018. The fully supported service will have a dedicated helpdesk, user groups and product development roadmaps. There are currently 18 Pathfinder Institutions implementing the service, and there are 15 further free-of-charge places available for 2018. From August the service will be charged for. See Rob’s presentation [ppt 704KB] for details of the prices, which have now been agreed.

The Data Explorer and Study Goal products will be continuously enhanced, with Data Explorer gaining basic case management/CRM functionality soon. JLAP, the Jisc Learning Analytics Predictor, which carries out predictive analytics on questions such as which students are at risk, is about to be rolled out with five institutions.

Rob Wyn Jones

Other technical developments include a new Unified Data Definition v1.3.2 and new plugins for Moodle and Blackboard to extract their data to the Learning Data Hub (the new name for the Learning Records Warehouse). Discussions are also underway with Canvas to extract data from that VLE to the Learning Data Hub.
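
To give a feel for what “extracting data to the Learning Data Hub” involves, here is a minimal sketch of sending a single xAPI-style activity statement to a learning records store from Python. The endpoint URL, credentials and verb/activity identifiers below are placeholder assumptions, not the Learning Data Hub’s actual interface; the sketch only illustrates the shape of the data flow from a VLE event to a central store.

```python
# Illustrative only: endpoint, credentials and verb/activity IDs are
# placeholder assumptions, not the Learning Data Hub's real interface.
import requests

LRS_ENDPOINT = "https://lrs.example.ac.uk/data/xAPI/statements"  # hypothetical
AUTH = ("client_key", "client_secret")                           # hypothetical

statement = {
    "actor": {"account": {"homePage": "https://vle.example.ac.uk",
                          "name": "student-12345"}},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/launched",
             "display": {"en-GB": "launched"}},
    "object": {"id": "https://vle.example.ac.uk/course/econ101",
               "definition": {"name": {"en-GB": "ECON101 course area"}}},
    "timestamp": "2017-11-23T10:15:00Z",
}

# Post the statement to the learning records store
resp = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
resp.raise_for_status()
print("Statement stored:", resp.json())
```

In the Jisc setup the VLE plugins handle this extraction automatically; the point of the sketch is simply the shape of one activity record travelling from the VLE into the central store.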

There’s a growing portfolio of vendors involved in the architecture, both supplying data (e.g. Turnitin, Blackboard) and providing analytics services (e.g. SolutionPath, Tribal).

Finally, Mark Harrington told us about the new Jisc Learning Analytics Purchasing Service, an online portal for products, services and infrastructure that are compatible with Jisc’s learning analytics architecture and service.  This is designed to make purchasing easier for institutions, and to add value. Further details are available.

Andrew Cormack, Jisc’s Chief Regulatory Adviser, was next up. His hot topic was the forthcoming General Data Protection Regulation (GDPR) and how to handle student consent for learning analytics.

Andrew Cormack

In Andrew’s presentation (ppt 1.03MB) he argues that using consent as the basis for collecting and analysing student data is not always necessary or sensible. The legitimate interest of the organisation may be a better justification: in this case the legitimate interest would be in improving learning.

Legitimate interest, however, cannot be used if the data is ‘sensitive data’, now known in the GDPR as ‘special category data’, such as ethnicity or health data. To use this data, either consent must be sought from the student or there must be a legal obligation to do so.

Intervention with a student (e.g. a tutor phoning them up because they appear to be at risk) would also require the student to provide their free, informed consent in advance.
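
As a very rough summary of the distinctions drawn above (an illustration of the blog post’s argument only, not legal advice and not taken from Andrew’s materials), the decision logic could be sketched like this:

```python
# Schematic summary of the lawful-basis logic described above.
# Illustration of the argument in this post only; not legal advice.

def suggested_lawful_basis(uses_special_category_data: bool,
                           involves_intervention: bool) -> str:
    """Return a (simplified) suggested GDPR basis for a learning analytics activity."""
    if involves_intervention:
        # Acting on an individual student (e.g. a tutor phoning them because
        # they appear to be at risk) needs free, informed consent in advance.
        return "consent (free and informed, obtained before intervening)"
    if uses_special_category_data:
        # 'Special category' data such as ethnicity or health data cannot rely
        # on legitimate interest: consent or a legal obligation is needed.
        return "consent, or a specific legal obligation"
    # Routine collection and analysis to improve learning can often rest on
    # the organisation's legitimate interest.
    return "legitimate interest (improving learning)"

print(suggested_lawful_basis(uses_special_category_data=False,
                             involves_intervention=False))
```

The real decision of course depends on documented purposes, impact assessments and institutional policy; the sketch only mirrors the three cases described in the talk.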

After lunch Suzanne Owen from Turnitin presented on ‘Unlocking Turnitin Data with Jisc Learning Analytics’ (ppt 2.52MB). She discussed Kerr Gardiner’s recent survey of UK universities, which found that all of those questioned saw a requirement to access all student-related data to support the implementation of learning analytics and to inform metrics such as those in the Teaching Excellence Framework (TEF).

Suzanne Owen

Suzanne discussed how Turnitin works after students submit assignments to it, analysing the assignments’ originality. She looked at how plagiarism, essay mills and ghost writing are growing problems, and how Turnitin is adding functionality to tackle them.

Michael Webb

Next up was Jisc’s Director of Analytics and Technology, Michael Webb, explaining how predictive modelling works. He used Jisc’s Learning Analytics Predictor (JLAP) as an example of the various processes involved.
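
JLAP’s own models aren’t detailed in these notes, so the following is only a generic sketch of the kind of pipeline predictive modelling involves: train a classifier on historical engagement features and known outcomes, then score current students for risk. The feature names, data and model choice below are invented for illustration and are not JLAP’s.

```python
# Generic illustration of predictive modelling for "at risk" students.
# The features, data and model below are invented; this is not JLAP's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Pretend historical data: weekly VLE logins, attendance %, assessments submitted
X = rng.normal(loc=[10.0, 80.0, 5.0], scale=[4.0, 15.0, 2.0], size=(500, 3))
# Pretend outcome: 1 = withdrew or failed, 0 = completed (low engagement -> risk)
y = (X[:, 0] + 0.2 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 5, 500) < 25).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", round(model.score(X_test, y_test), 2))

# Score two current (made-up) students: probability of being at risk
current = np.array([[2.0, 55.0, 1.0], [12.0, 90.0, 6.0]])
for features, p in zip(current, model.predict_proba(current)[:, 1]):
    print(f"logins={features[0]:.0f}, attendance={features[1]:.0f}%, "
          f"submitted={features[2]:.0f} -> risk={p:.2f}")
```

Real services combine far more data sources and validate their models much more carefully; the point here is only the shape of the workflow: historical features plus known outcomes in, per-student risk scores out.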

We then had a panel session “Getting your data right for learning analytics” with Adam Cooper (Tribal), Neil Price (University of South Wales), Lee Baylis & Josh Ring (Jisc) and Richard Gascoigne (SolutionPath).

Adam started this with some interesting reflections (pdf 883KB). He suggests it’s important to have someone with good attention to detail and domain knowledge take a critical, analytical view of the data, along with someone who knows what the data means in each source system, and to take an iterative approach to assembling and cleaning your data. Good written data definitions, he says, are essential.
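
As a concrete (and entirely hypothetical) illustration of the “written data definitions” point, a short validation pass that checks an extract against an agreed definition can catch many problems before the data reaches an analytics system. The field names and rules below are invented for the example.

```python
# Hypothetical illustration of checking a source extract against a written
# data definition before loading it into an analytics system.
import pandas as pd

# The agreed definition: field -> validation rules (invented for this example)
DATA_DEFINITION = {
    "student_id": {"required": True},
    "module_code": {"required": True, "pattern": r"^[A-Z]{3,4}\d{3}$"},
    "enrolment_status": {"required": True,
                         "allowed": {"active", "withdrawn", "completed"}},
}

def validate(df):
    """Return a list of human-readable problems found in the extract."""
    problems = []
    for field, rules in DATA_DEFINITION.items():
        if field not in df.columns:
            problems.append(f"missing column: {field}")
            continue
        if rules.get("required") and df[field].isna().any():
            problems.append(f"{field}: {df[field].isna().sum()} empty values")
        if "allowed" in rules:
            bad = set(df[field].dropna()) - rules["allowed"]
            if bad:
                problems.append(f"{field}: unexpected values {sorted(bad)}")
        if "pattern" in rules:
            mismatched = ~df[field].dropna().astype(str).str.match(rules["pattern"])
            if mismatched.any():
                problems.append(f"{field}: {mismatched.sum()} values don't match pattern")
    return problems

extract = pd.DataFrame({
    "student_id": ["s001", "s002", None],
    "module_code": ["ECON101", "bio-2", "HIST210"],
    "enrolment_status": ["active", "enrolled", "completed"],
})
for problem in validate(extract):
    print(problem)
```

A shared definition such as the Unified Data Definition mentioned earlier can play the role of the hand-written dictionary in this sketch, and running checks like these on each iteration supports the iterative assemble-and-clean approach Adam describes.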

Adam Cooper

Lee, Josh and Richard all added their perspectives, and the audience chipped in with various points and questions. It was great too to have Neil’s perspective from the University of South Wales, as someone who has been encountering the issues at first hand and has been consistently proactive in moving things forward. The panel offered a lot of experience and good tips, and the recording is worth watching.

Karl Molden from Greenwich presenting

Our next meeting will be at Edinburgh University on 22nd Feb 2018; the agenda and booking form will be posted to this blog in advance. Sign up to the analytics@jiscmail list if you want to receive details of that meeting and other developments in learning analytics by email.

The Learning Evolution – Books, Augmented Reality, Narrative and Beyond…

Originally posted on Inspiring learning.

In June 2017, I was invited to speak at the UA Reloaded conference in St. Leon-Rot, Germany. The conference’s theme was user assistance (UA), and it brought together invited experts from a range of fields to explore, and disruptively change, the future of UA. The conference had a refreshing approach, combining talks, forums, exhibits and UA Camp, an open-ended activity during which participants could engage and collaborate with other experts.

Delivering a workshop about visual learning resources that convey threshold concepts

On the first day of the event, I was fortunate enough to deliver a workshop looking at the application of visual learning resources that convey threshold concepts using bite-sized video and Augmented Reality (AR). Examples from education included using an airway management training dummy in combination with AR to enhance the learning experience for paramedic students.

During the second day, I gave a talk entitled The Learning Evolution, which looked at how education can leverage the power of technology (in this case AR and Mixed Reality (MR)) to the benefit of the student learning experience.

AR classroom management by Computer-Human Interaction in Learning and Instruction (CHILI)

A great example of how AR can be used in education is the work developed by Computer-Human Interaction in Learning and Instruction (CHILI). They have used AR to manage classroom groups and activities. You can see this in action by watching the video.

I also covered the current availability of applications that allow non-technical practitioners to engage with and use AR in education. Many of the conference attendees were unaware of the potential of AR and its application to education. You can see just some of the applications by watching the video below.

The Learning Evolution

During the event I spoke with many participants who were world leaders in their specific fields, from people working at large multinational companies such as Google and SAP to individuals who had become experts in their particular area.

Companies big or small can learn from each other.

What struck me most was how much companies big or small can learn from each other. An example of this was the excellent talk given by Alison Norrington from Story Central. Alison spoke about how stories are timeless and global vehicles that communicate, explore, persuade and inspire. The insights Alison offered during her talk could be applied to companies large or small.

At the end of the event, I spoke with several people who had been inspired by the topics covered, including some who would now be looking to try AR in their own practice.

If you would like to know more about how AR or other technologies can help your organisation, as a subject specialist I can provide practical support and assistance to Jisc members, as well as consultancy services in the form of thought leadership, training, guidance and facilitation as a ‘critical friend’ to external organisations.

References:

Alison Norrington / Story Central
Computer-Human Interaction in Learning and Instruction (CHILI)
UA Reloaded
Zapbox

Quick guide to customising and launching the tracker

Originally posted on Jisc Digital Student.

As you’ll know by now, there’s plenty of ‘slow’ guidance on customising and launching the tracker on our Guidance page. But if you’re confident about your aims for the tracker, you’ve got your key stakeholders involved, and you’ve used BOS before, you might not need that much detail. This six-step quick guide is for you (with thanks once again to Tabetha and Mike for the content).

1. Log into your BOS account and take a look at the master tracker(s)

  • The BOS login page is https://admin.onlinesurveys.ac.uk/accounts/login/
  • Sign in using your log-in details: if you’ve mislaid them, just click on “lost invitation” to get new credentials.
  • Once you’re in the BOS dashboard, de-select “just my surveys” to see the master tracker(s) that you requested (see the handy green text box at the top of the dashboard if you’re stuck).

2. Copy from the master tracker(s)

  • Copy your master tracker(s) using the purple “copy” icon on the dashboard
  • Name your new tracker(s) with your institution name AND whether it is the FE, HE, Skills, Online or ACL version
  • Remove the word “locked”

3. Customise your survey questions

Most questions cannot be edited, because we need them to be identical so we can group everyone’s data and allow you to compare your institution against “all data” in the group (after surveys close at the end of April 2018). However, you can make the following edits:

  • Page 6 (name your VLE in the section header above Q15)
  • Add questions to Page 9 and remove the red text instruction boxes
  • Edit the “thank you” text on Page 10

4. Check the survey before you launch it!

  • Once you press LAUNCH you can’t edit the survey, so make sure you check it carefully first
  • You can create and share a PDF of the survey by going to “Preview Survey” and clicking on the cog in the top right of the screen – this allows you to test it with users and run it past key stakeholders before launching.

5. Launch your tracker

  • From the dashboard click on DISTRIBUTE
  • Check that you are happy with the “public URL” link address (if not, you can change it under “distribution settings”)
  • You can change the close date if you want to. Surveys close for good on 30th April 2018, when we merge the data into “benchmark” groups, but you can choose a date before then. We recommend a survey window of 2-3 weeks.

6. Get a reliable and valid sample size from your learners

  • To get data that reflects true opinions from your learner population as a whole, you should aim to collect responses from at least 20% of all students, or target a sub-sample that contains all student variation and get responses from everyone in this sub-sample (e.g. by walking into some lectures with a box of chocolates and asking everyone to complete the survey on their smartphones in return for a sweet reward!). Or you could do both. (A rough sample-size sketch follows this list.)
  • You could share the public URL link with your learners e.g. in lectures, on Twitter, via the student union, or via their tutors (see blog post on engaging students)
  • Several institutions have decided to run their surveys in a single month and to make this “digital student” month. They are engaging learners with posters and events, and sharing initial findings whilst the tracker is still live. Is this something you want to consider?
  • If you don’t get enough responses in the time allocated, consider extending by a week and trying a different engagement strategy.

And finally… a reminder that there is lots more help available on our Tracker website pages. The blog and loads of guidance material all cascade from the tracker landing page.
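
As a footnote to step 6: the 20% rule of thumb can be sanity-checked with a standard sample-size calculation. The sketch below assumes a 95% confidence level, a 5% margin of error and the usual finite-population correction; it is our own illustration, not part of the official tracker guidance.

```python
# Rough sample-size check for surveying a finite student population.
# Assumes 95% confidence and worst-case proportion p=0.5; not official guidance.
import math

def required_sample(population: int, margin: float = 0.05,
                    z: float = 1.96, p: float = 0.5) -> int:
    """Sample size for estimating a proportion, with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)           # finite-population correction
    return math.ceil(n)

for students in (500, 2000, 10000):
    n = required_sample(students)
    print(f"{students:>6} students -> {n} responses (~{100 * n / students:.0f}%)")
```

On these assumptions a 500-student college needs roughly 44% of learners to respond, while a 10,000-student university needs only about 4%, which is why the 20% target is paired with the advice about a representative sub-sample.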

Digging deeper with the discovery tool

Originally posted on Jisc Building Digital Capability Blog .

We now have almost 100 organisations signed up to pilot the new Digital capability discovery tool. The new platform and content will be available to trial with staff from January to May 2018, and a student version will also be available in the trial period.

If you’re not part of the pilot, you can still follow the progress of the project from this blog and on the new dedicated Discovery tool pages (launching early December).

I’ve been digging through the undergrowth of the sign-up data from all 100 pilot sites, trying to get a clearer picture of who is leading the project and what their motives are. This is a brief report from my explorations.

  1. Job roles

Lead contacts for the discovery tool have a wide and interesting range of job titles.

Job roles chart

The largest category works in Digital education/e-learning/TEL (39), followed by Education/learning without a specific digital component (18). A separate cluster can be defined as specialists in Digital, IT or information literacy (8); these job titles included ‘Head of Digital Capability’, ‘Manager, information and digital literacy’ and ‘Digital Skills training officer’. Library/Learning Resources specialists accounted for another 8 sign-ups, Student experience/student services and IT/tech for 7 each, and HR/organisational/staff development for 5. There were also 5 subject specialist staff, of whom 3 were in English – a slightly surprising result.

These totals suggest that the bias of intended use is strongly towards teaching staff and learners.

  2. Sector

Nearly 60% of sign-ups came from HE providers and 33% from FE. The ‘other’ responses (9) came from work-based, professional and adult learning, combined HE and FE institutions, and routes into HE.

Sector breakdown chart

  3. Reasons for using the Discovery Tool

Users selected a mean of 3.5 different responses, with almost all selecting the motive to ‘Help staff to improve their own digital capabilities’.  Eighty-three percent were using the discovery tool to support a strategic change agenda, and there were also high scores for identifying and monitoring staff digital capabilities overall, and for raising awareness.

Reasons for using the discovery tool (chart)

It’s interesting that 37% of respondents hope to use the discovery tool to ‘monitor individual staff’, a feature that is not offered currently. We included this option to assess whether our original aims – to produce a self-reflective tool – match with those of institutional leads. It is these tensions that our evaluation will have to explore in more detail (more under ‘next steps’ below).

  4. Current approaches to supporting staff

There were 104 responses to the question: ‘What approaches do you currently have in place to support staff with development of their digital capabilities?’

Current approaches to supporting staff (chart)

  • 43 (41% of) respondents mentioned voluntary staff development, especially of academic or teaching staff.
  • 14 (13% of) respondents mentioned some form of mandatory support for staff i.e. at appraisal, PDR, or induction.
  • 27 (26% of) respondents said provision for staff was currently insufficient, declining or ‘ad hoc’: ‘Institutional support for staff to develop digital capabilities was removed about 6 years ago’; ‘[Staff training] has been very stop, start in recent years due to changes in management’; ‘There is no overall institutional approach to digital capabilities.’
  • 20 (19% of) respondents mentioned IT or similar training, and a further 13 (12%) mentioned TEL support
  • 20 (19% of) respondents mentioned online materials, of which 15 were using subscription services such as Lynda.com and 5 had developed their own
  • A sizeable number (14) had a specialist digital capability project under way, though often in the early stages.

Nine respondents mentioned the Jisc Framework or (in two cases) another digital capability framework. Frameworks were being used: to support curriculum review; in teaching staff CPD; to design training (with open badges linked to the Jisc framework); and to identify gaps in provision. Two institutions mentioned the earlier pilot of the discovery tool as a source of support, and three were using alternative self-assessment tools.

  5. Current collection of data about staff digital capabilities

Asked ‘Do you already collect any data about staff digital capabilities?’ 58 respondents out of 101 (57.4%) said ‘no’ or ‘not at present’ or equivalent.

Among those who responded ‘yes’ (or a qualified ‘yes’) a variety of processes were used. These included:

  • Anonymous surveys
  • General feedback from staff training
  • Feedback from appraisal or CPD processes
  • Data from staff uptake of training or online learning
  • Periodic TEL/T&L reviews (at institutional or departmental level)
  • Use of the discovery tool (n=10 – though in some cases this was prospective only)
  • Use of the digital student tracker (n=1)
  • Teaching observations

  6. Numbers involved and approaches to engaging staff

Organisations had very diverse ambitions for the pilot, from user testing with 6 staff to a major roll-out in the thousands. Strategies for engaging staff were also very different in the different cases. There were 104 responses to this question, and a lot of overlap with responses to the previous question about staff support in general.

Some 27 respondents decided that communication was key, and described the channels and platforms they would use. Strategies for gaining attention included having senior managers initiate messages, engaging students to design arresting visual messages, and involving a professional promotions team.  Timing was sometimes carefully considered e.g. to coincide with another initiative, or to avoid peaks of workload. In addition, 9 respondents considered the content of communications and the majority of these planned to focus on the immediate benefits to end-users: the opportunity to reflect, develop confidence, find out more, get advice and feedback, identify existing strengths. Other incentives were digital badges (2), support for an HEA fellowship application, and chocolate!

In all, 46 out of 104 or over 40% proposed completion of the discovery tool in live, shared settings, either at existing events such as meetings or at specific events designed for that purpose. Although we have emphasised the personal nature of the discovery process, it may be that shared, live events of this kind will prove more effective at developing trust, and providing on-the-spot support at a moment when users are receptive to it. One said:

‘Institutions that took part in the previous pilot of this tool identified the value of providing face to face support possibly as a lunch time ‘Digital Capabilities’ drop in to help staff complete the tool, explain those questions that staff didn’t understand and to discuss digital capabilities.’

  7. Planned support for the Discovery Tool

Users selected a mode of 3 different kinds of support (mean 2.8) – so there is a clear understanding that users will have contextualised support of various kinds. All but 3 were planning to offer users the ‘opportunity to share and discuss their results’. Almost as many proposed to offer ‘links to local resources/development opportunities’, in addition to resources provided as an outcome of the discovery process. Around half expected to offer support through a staff network or community of practice, and slightly fewer expected to offer accreditation or badging.

Less than 3% chose to tell us about ‘other’ forms of support, suggesting that the support activities we identified in the last pilot still cover most of the sensible options. The main form of support noted in the ‘other’ responses (not already captured in the closed options) was the offer of specific training opportunities related to the discovery tool content.

What next?

Information from this report is being used to inform the guidance for pilots of the discovery tool, and to identify issues for further exploration. A following post will have more detail about both the guidance and how we will be evaluating the pilot.

An important issue for institutional leads to consider is how much they can expect – and what they would do with – fine-grained data about staff capabilities. Data such as staff take-up of training opportunities, the quality of VLE materials, or student assessments of their digital experience, may be more reliable than self-assessments by staff using the discovery tool. This information could be obtained using the digital experience tracker, also available from Jisc.

We know from the earlier evaluation that the discovery tool can help staff to become more aware of what confident digital practice looks like, and more self-directed in their professional learning. And organisations will be able to collect aggregate data about the number of staff who complete the discovery process, and other information to be determined during the pilot phase.

We will be exploring what data is really useful for planning interventions around staff digital capability, how it can be generated from the discovery tool, and how it can be reported without compromising staff willingness to engage.

For more information about the discovery tool please contact digitalcapability@jisc.ac.uk.

Who wants change?

Originally posted on lawrie : converged.

Image by Alan O’Rourke

This week I have been preparing for the second residential of the Jisc Digital Leaders course. Whilst the course is premised on the role of digital, digital is actually a lens through which we look at institutional strategy and practice. We started off the course with a brief framing of digital and leadership, referring to Ron Barnett’s description of the current education landscape: a time of uncertainty, unpredictability, challenge and change – a situation to which he applies the term “supercomplexity”. The digital leaders programme seeks both to frame the challenges created by digital, and to show digital as providing the capacity and capabilities to respond to supercomplexity.

One of the workshops we use to explore responding to challenge and change is based around Schein’s model, sometimes referred to as the onion model, applied within an educational context.

Schein’s Onion Model

The model describes three layers: at the surface are the artefacts and symbols, beneath that lie the expressed values, and at the core are the tacit assumptions and underlying values:

Artefacts and symbols mark the surface of an institution: visible elements such as logos, buildings and other parts of the estate, digital platforms (such as the VLE), rooms and lecture theatres, the library, the institutional vision, straplines and so on. They are visible not only to staff, but also to students, prospective students and the wider public.

The expressed values within the model may include the strategy, reward schemes, the underpinning philosophies, and the pedagogical approach of the institution. This layer also covers how these values are expressed.

Tacit assumptions and underlying values are deeply embedded in the culture, experienced as self-evident and unconscious behaviour. These are the unspoken rules: the things it doesn’t occur to you to explain until something goes wrong, the things you wish someone had told you when you first started working at your institution, the things you might only realise after you’ve left the job. They are not always easy to recognise, and they can derail change initiatives and block progress.

All of these layers, and many other aspects, are part of the culture of an institution. In the workshop we get people thinking about the tacit assumptions and underlying values; sometimes we prompt them to discuss their own, sometimes we prompt them to put themselves in a different role in the institution. Almost all of the participants we have worked with engage in this activity with a lot of energy!

We do have prompts (such as below) if we need them, but we rarely do.

Institutional Culture is... Diagram

Thinking about the digital implications and practices can lead to some startling revelations about ourselves, and I have frequently found myself reflecting on my own practices and engagements.

One of the most common digital misbehaviours I indulge in is the abuse of email!

“Can you pop that in an email to me?”

At any time I have multiple devices about my person on which I can make a note or set a reminder. What I am really saying is that I can’t be bothered to make a note, or that I’m too busy to do so. The reality is that I am valuing my time more than that of my colleague. The consequences? Eventually they will probably stop telling me things, or stop involving me, because it just adds to their workload.

Breaking these habits is hard, but it’s worth it, and if we want digital change, then the key is to model the behaviours you want to see.

What are the tacit assumptions and underlying values in your institution? What are they stopping from happening? There are a few below to get you started…

Tacit Assumptions

Engaging students in the tracker

Originally posted on Jisc Digital Student.

This post is guest authored by Tabetha and Mike from the tracker team. We thought you might like to see the faces behind tracker.support@jisc.ac.uk. Not only are they keeping the project on the road, they also take time out to think about the bigger issues such as student engagement.

Tabetha

Mike

We see the trackers as a way for you to engage – not just monitor – your learners. Students are increasingly concerned about surveys that are imposed on them without explanation, so it is important for you to explain clearly why your tracker can help improve their teaching and learning experiences.

To help with this, you might want to:

  • Engage learners in planning & communicating the tracker project
  • Identify student leaders and ask them to look at the tracker, and consider positive promotion within their networks
  • Engage learners in other conversations about their digital experience
  • Work in partnership with students to respond to the findings

Here are six practical ways you could improve your student communication and engagement.

#1 Have a conversation with your students whilst your survey is live

Our experience has shown that one of the easiest but most powerful things you can do to increase student engagement and response rates is to feed back anonymous results whilst your tracker survey is still live. For example, last year we saw some institutions put the results of single questions onto screens around campus.

University of Adelaide Library 2017

You can do this by logging into your BOS dashboard and looking at the data as it comes in via the Analyse area. More information is available at bit.ly/trackerguide in the ‘Analyse’ section. Your message might say something like: “80% of students rate the quality of this university’s digital provision as good or better than good … what do you think? Join the discussion here <insert your tracker link>”

#2 Promote the tracker widely

Distribute promotional materials via Twitter, Facebook, WhatsApp, or posters/screens around your institution. You can download and use the graphics at the bottom of this post to include in your promotions.

#3 Read about what other institutions did last year

The Guide: engaging learners includes some quick examples and you could also browse the snapshot case studies from last year’s successful tracker projects. For example, Epping Forest College (p6-7) involved students in designing their own posters to promote participation in the Tracker, and also used QR codes for quick access to the survey.

Epping Forest

#4 Plan how you will share your tracker findings, and continue the conversation with your students

Why not plan some events for next April/May that allow you to liaise with your learners and staff to discuss your tracker findings?

#5 Share your examples of student engagement at CAN2018

If you already have some good examples of student engagement practices then why not submit a proposal for the Change Agents’ Network conference, to be held at the University of Winchester on 19 and 20 April? This is a network of staff and students working in partnership to support curriculum enhancement and innovation. For more information see the CAN18 submission form.

#6 Share your ideas and queries about student engagement on the mailing list

The STUDENT-DIGITALTRACKER@JISCMAIL.AC.UK list is becoming an active community of practice. If you are registered to use the tracker you should already be a member of this list – but you can always ask us to add other members of your institution. If you would just like to keep up with the project, you can also request to join. There are over 200 subscribers who work across HE, FE, work-based learning and adult community learning. Your ideas can help others, and help you to connect with like-minded people out there, in roles like yours.

Jisc Tracker four areas graphic

Participate in our study into how HR departments support staff to develop their digital capability

Originally posted on Jisc Building Digital Capability Blog .

Would you like to contribute to a new Jisc study and shape future developments in this area?

Jisc wants to find out what HR departments are doing to support staff in dealing with the challenges, and making the most of the opportunities, offered by technologies. We are also keen to find out how confident HR teams are in their own digital capabilities to support staff in their institutions.

We have just started our short study and hope to find out from HR staff how their activities link to institutional strategies and activities around digital capabilities.

By participating in the study HR staff will have the chance to inform future developments and to highlight the good practice that is already happening.

We have produced a short (5 minute) online survey to help us create a snapshot of current practice in the UK and to find out about HR staff levels of confidence around digital capability.

Please let your HR teams know about the study and ask them to complete the survey.

https://jisc-beta.onlinesurveys.ac.uk/hr-support-of-staff-digital-capabilites

Assessing Your Readiness for Implementation of Learning Analytics: Making a Start

Originally posted on Effective Learning Analytics.

Lindsay Pineda

Patrick Lynch

This is a guest post by Lindsay Pineda and Patrick Lynch. Their bios are at the end of the article.

As an institution, you may find yourself asking, “How do I know if we are ready for learning analytics? Is there a way to ‘feel out’ where we are before having someone come onsite for a more official assessment? What kinds of things can I do on campus to prepare for an on-site Readiness Assessment?”

One of the main advantages of engaging in an on-site Readiness Assessment is that the event is tailored for your institution’s particular requirements. Our clients have confirmed for us that the “gold standard” in this work is to bring in people from outside of the organization to help facilitate this process. Institutions have told us:

  • “This is honestly the first time we’ve had all of these people together in one room to talk about a common goal.”
  • “Bringing someone in from the ‘outside’ onsite allows for an impartial view of the institution and it’s not muddied by an emotional connection to one thing or another. That is extremely valuable.”
  • “I have no idea how I would have been able to convince this many people to be in the same room together for this long without using the pull of Consultants coming in to assess us.”
  • “With someone coming in, no one seems particularly ‘suspicious’ about the intent behind the workshops. People are curious about who is here and what they intend to help us do so they actually show up.”

Making a Start

Through our experiences, we found that there are activities an institution can undertake as “prep work” for the facilitated on-site Readiness Assessment. Getting people together for a sustained period is difficult, getting disparate groups together is challenging, and organizing senior leadership into any narrow time frame can be near impossible. We experience these obstacles ourselves while planning on-site visits with institutions.

In this article we present two activities that will help an institution better prepare for a shorter, more focused, external engagement with their on-site facilitators. This also provides the facilitators with richer information to help deliver the most value while onsite at your institution.

Areas to Investigate

One of the key elements of readiness is to consider the fit of a learning analytics project within the culture of the organization. This includes policies, processes, and practices that may be affected. Taking a broader view can help identify where existing projects and processes can support the initiative. We have encountered change fatigue in a number of institutions, and wrapping activities together into one broader program is a great way to reduce the perceived number of changes, as well as to help realize the advantages of interrelated projects. For example, an existing project focused on ethics and privacy policies could also include those policies needed to support a learning analytics initiative.

Stakeholder Involvement

Given the broader approach to implementing learning analytics, a wide stakeholder group needs to be involved. Assessing readiness acts as a solid start to the project: it is a way to introduce colleagues to the goals of the project and the benefits of implementing the initiative effectively, and it assists with gaining buy-in from the start. Key staff groups we have identified are:

  • Learning and Teaching support/development
  • eLearning/TEL
  • IT Services
  • Academic Tutoring/Advisement Services
  • Student Support Services
  • Research and Planning
  • Registrar’s Office
  • Key Thought Influencers
  • Senior Leadership (“Decision Makers”)
  • Faculty/Teachers
  • Library/Information Services

And last, but by no means to be considered least:

  • Key Student Union/Government Representatives

For each area, as appropriate, we would suggest involving both academic and administrative staff.

Having this broad range of departments together offers a good overall representation of the institution and allows impact and ideas to be shared more widely. Each of these areas brings a different perspective: together they include the individuals who will see the effects on the institution’s bottom line, those who will carry out the actual work, those who will provide interventions, and the end users.

Activity Recommendations

The two activities we recommend are:

  • Setting Institutional Goals and Objectives
  • Determining Challenges and Obstacles

The first activity can be used as a starting point for goal setting, which will be iterated upon in subsequent activities. These two activities are specifically aimed at narrowing down the institution’s short and long term goals while identifying any major challenges/obstacles. This allows for the on-site efforts to center around the already identified goals, challenges, and obstacles.

Activity #1: Institutional Goals and Objectives

  • Why Is This Important?
  • Intended Outcomes of the Session
  • Duration
  • Setting and Structure
  • Resources and Supplies
  • Prerequisites
  • Who Should Be Involved?
  • Outputs
  • Summary of Hints and Tips

Activity #2: Challenges and Obstacles

  • Why Is This Important?
  • Intended Outcomes of the Session
  • Duration
  • Setting and Structure
  • Resources and Supplies
  • Prerequisites
  • Who Should Be Involved?
  • Outputs
  • Summary of Hints and Tips

Why Use an Outside Consultant?

While conducting the above activities can be very valuable for an institution to glean some initial insights, it is not realistic for an institution to undertake an entire Readiness Assessment on its own. The outcomes from the “prep work” activities above will provide the outside consultants with baseline information to build upon with institutional staff and students while onsite. Our recommended “prep work” activities are meant to assist an institution in preparing for an on-site, facilitated Readiness Assessment, not to replace it. There are several additional activities that experienced facilitators will guide an institution through. Institutions have advised us that having someone come from the “outside” creates a sense of urgency and importance around the learning analytics project and the Readiness Assessment activities, which can be valuable to leverage.

Some of the other valuable areas that we provide for institutions as experienced Readiness Assessment facilitators are:

  • Introductions to learning analytics, including real world experience and examples
  • Student requirements scoping
  • Policies, processes, and practices insights
  • Ethics and privacy considerations
  • Technical/data considerations
  • Intervention considerations, guidance, and management

These areas, and many others, are discussed in a guided, collaborative, and facilitated manner through workshops, sessions, and activities. The findings from each Readiness Assessment are shared in a confidential final report format that is meant to provide broad insights into the institution’s collective readiness. There are also recommendations made for next steps in the learning analytics initiative journey.

We heard from several institutions that an on-site facilitated Readiness Assessment is an essential first step to engaging in a learning analytics initiative. One institution told us, “An on-site facilitated Readiness Assessment is an efficient, effective, and valuable part of starting a learning analytics initiative.”

Useful Reading: