Engaging users with the Digital discovery tool

Originally posted on Jisc Building Digital Capability Blog.

There are only a few weeks to go before we wrap up this pilot phase of the Digital discovery tool, but there is still time to get new users involved. Some pilot sites have finished engaging users and are now evaluating how things have gone, but others are still looking for staff and students to give the discovery tool a try.

There are five new promotional posters from the Jisc team that can help. These can be adapted with an institutional logo and the details of any live workshops or support materials.


Download your posters here:

There are other ideas for engaging users on our Guidance page: Engaging users.

Thinking ahead, lead contacts at all the pilot sites will be sent a survey about their experience on 5 May. The survey is quite comprehensive, as this is our best source of information about how the Digital discovery tool is being used in practice. There are 15 questions, covering user engagement, support and follow-up for the discovery tool, and whether there have been any individual or organisational benefits. We ask for this to be completed by 30 May.

Before completing the form, we suggest that leads run a focus group or consultation event with users, so they can gather evidence to help answer the evaluation questions. There are materials for running consultation events on our Guidance page: Evaluating with users. It doesn’t have to be complicated: it could be as simple as getting some users together and exploring a couple of the questions on the evaluation form.

Just now, we are using all the valuable feedback from users to make some refinements. You may notice these in the questions and feedback for staff. There will be more significant updates once the pilot has finished. It’s really helpful if you can point your users to these feedback forms, which are found on their dashboards. We can only make things better with their help – and yours!

 

#CAN18 resources

Originally posted on Change Agents' Network.

The Change Agents’ Network conference at the University of Winchester on 19 and 20 April was an opportunity for staff and students to share their experiences in organised sessions as well as in many informal ways; the breaks and meals in the warm sunshine encouraged us to sit and talk. The conference was excellently organised and supported by the Winchester team, whom we thank for the considerable effort involved in putting it all together.

In the coming week, we will be adding resources from the sessions and photos. While you’re waiting, here is a small sample of images from the event.


 

Analytics lab: teaching quality benchmarks

Originally posted on Effective Learning Analytics.

Exciting new Jisc Learning Analytics opportunity

At our most recent network event in Edinburgh, Michael Webb and I introduced ‘Learning Analytics Labs’, an experiment to see whether we can use selected learning analytics data to improve mandatory data returns and performance metrics. This is your opportunity to get involved in an initial exploration to improve teaching quality metrics.

We want to explore whether teaching quality benchmarks can be automated and enhanced from learning analytics data. We’ll determine whether the data can serve up some rich and reliable insights to help institutions to better address regulatory requirements (such as TEF2 and subsequent iterations) as well as contribute insights to related institutional uses.

This will only happen if we undertake some experimentation, and with your help we can!

Jisc will provide a safe and secure data processing environment, developers and a data sharing agreement to undertake the work. But we also need your help.

We’re seeking:

  1. Champions (institutional lead learning analytics contacts) to join a team and advise us on direction
    Effort expectation: 2 hours per week (remote via Skype / Skype for Business) and 3 days face to face, over 12 weeks
  2. Data
    To be decided by the team, but likely to be pseudonymous (where identifiers have been replaced) historic data as used by the Jisc Learning Analytics Service, including student (UDD) data and activity data

You’ll bring expertise and knowledge of the data, the sector, the challenges and the potential for benchmarks which might be created from the data. You will join a small team of peers and be supported in identifying the most promising benchmarking areas for exploration and the data detail required.

In return you’ll get the opportunity to steer future mandatory collections / returns, helping to make these less burdensome and more useful. We’ll also invite you to meet 10 other teams working on different areas, and you’ll see a range of data visualisations for a variety of purposes.

Our target date for launching this initiative is 16 May at a face-to-face event in Manchester (Jisc will cover your travel costs for both F2F meetings).

At this stage we’re mocking up proofs of concept only – no institutional data will be published for benchmarking. We’re hoping that once we show what is possible, it will spark more interest in the potential, ultimately benefiting the whole sector.

If you’re interested in taking part, email lee.baylis@jisc.ac.uk by 30 April, outlining in 200 words or fewer how your experience would contribute to the task. We’ll shortlist and invite successful applicants to the 16 May event, and you’ll join a planning F2F meeting with your team the week after.

You can read a little more in Niall’s blog post at

https://analytics.jiscinvolve.org/wp/2018/03/02/notes-and-presentations-from-the-13th-learning-analytics-network-meeting-at-the-university-of-edinburgh/

or watch the morning video of our presentation, which is among those at

https://www.pscp.tv/w/1lPKqnglkPdGb

and the slides I used are at

https://analytics.jiscinvolve.org/wp/files/2018/02/2018-02-21-UKLAN-Webb-Jones-Baylis.pptx

Thanks for reading!

Lee Baylis

Senior Analytics Innovator

Jisc

Organisational data available

Originally posted on Jisc Building Digital Capability Blog.

If you are part of our organisational pilot of the Digital discovery tool, you will now have access to your data dashboard with visual results from your staff users. Guidance for accessing and reading your data visualisations can be found here.

There is also a Collaborate webinar on Tuesday 17 April at 13:00 which will walk you through the process and help you to make use of your data. You can join the webinar live here, or access the recording here after the event.

The rest of this post is about how you might make use of the data in your organisation. Please remember that the data provided as part of the pilot is still in development. We are in the process of finding out what data is useful. You should not rely on these data visualisations as a definitive source of information about staff training needs.

Making use of your data

You may want to use the number of staff completions – possibly broken down by department – to compare the number of staff who have fully engaged with the number of staff you hoped to reach at the start of the project. Who has and who has not engaged? Do you have feedback from your engagement sessions or a follow-up process (e.g. focus group) to explain any differences? How might you encourage engagement from other groups of staff?
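If you can export or simply transcribe the per-department completion counts from your dashboard, a few lines of Python make this comparison against your original targets straightforward. This is a minimal sketch only: the file names, the column names and the idea of a separate ‘targets’ file are hypothetical stand-ins for however you hold these figures locally.

```python
import pandas as pd

# Hypothetical exports: per-department completion counts, and the number of
# staff you originally hoped to reach in each department.
completions = pd.read_csv("completions.csv")   # columns: department, completed
targets = pd.read_csv("targets.csv")           # columns: department, target

summary = completions.merge(targets, on="department", how="outer").fillna(0)
summary["reach_pct"] = (
    (100 * summary["completed"] / summary["target"]).where(summary["target"] > 0)
).round(1)

# Departments with the lowest reach are the obvious places to start asking why.
print(summary.sort_values("reach_pct").to_string(index=False))
```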

You could also compare the number of staff who have completed the general (‘all staff’) assessment with the number completing the specialist teaching assessment(s). How would you explain any differences? Again consult with your users: were teaching staff more motivated and satisfied by the role-specific assessment?

The ‘in progress’ data allows you to see if there is a significant drop-off as staff are going through an assessment. This is a figure Jisc is looking at closely, as the user experience needs to be easy and supportive – that is our responsibility. But if you find differences in the drop-off rate across different staff groups, could this be because of differences in the support you make available to them?

Scoring band data should be interpreted with great caution. Jisc is using this data to ensure that the questions we ask produce a reasonably even spread of medians across the different areas of digital capability. But this is a broad aspiration: it is inevitable that some areas will prove more challenging to users than others. Also, some areas are essential for all staff (such as digital wellbeing), while others, such as information, media or data literacy, matter more in some roles than in others.

This is why all our feedback to individual users asks them to reflect on their role and its demands before deciding how to prioritise their next steps. It is also why you should not compare scoring bands across completely different areas of digital capability and conclude that your staff have a ‘deficit’ in one area as compared with another. If you want to make comparisons, look at overall sector scoring bands and compare with the relevant banding in your organisation. But even this should be done with great care, particularly if you have a low number of users overall or in one departmental group, as this will skew the results.

Scores are all self-assigned, and their purpose is to ensure that users get appropriate feedback. If staff believe that their scores are being used for another purpose, they may not answer questions honestly, and the value of the Digital discovery tool will be severely limited.

Jisc encourages you to use the Digital discovery tool to support a dialogue with staff about the training and development they need. The spread of scoring bands across different departments may encourage you to target training in specific areas towards specific groups of staff. Because of the caveats above, you should not do this without consulting with the staff involved. Where staff score lower than others in their sector, this is definitely a cue for you to investigate whether they would appreciate more training and support, but it is not a performance measure and should never be used as such.

Following up and closing the feedback loop

The information you gather from the Digital discovery tool can be used to start conversations:

  • with HR and staff development about overall staff training and development needs;
  • with teaching staff about their confidence with digital teaching, learning and assessment, and their further development needs;
  • with IT and e-learning teams about support for specific systems and practices;
  • with budget-holders about investing in staff development resources and in online services.

You should report back to your staff users about how you are using this data, and what you are doing to support them more effectively in the future.

Using Discovery tool data to refine the questions and scoring

Originally posted on Jisc Building Digital Capability Blog.

Thanks to the aggregate data we are getting from our first pilot users, we have been able to compare the median scores for each of the questions asked, and look at some other stats across the different assessments.

We were pleased to see from the first data returns that ‘depth’ and ‘breadth’ questions produce the medians we would expect, with one or two exceptions. We’ve worked on these outlying questions to make it a bit easier (or in one case a bit harder) to score in the middle range. This should bring the medians more into line with each other, making it easier and more valid to look across aggregate scores and compare areas of high and low self-assessment.

Median scores, ‘all staff’ assessment, snapshot from early March 2018

There will always be some natural variation in average scores, because we are asking about different areas of practice, some of which will be more quickly adopted or more generally accomplished than others.
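For anyone curious about the mechanics, the median comparison itself is simple to reproduce once responses are in tabular form. The sketch below is illustrative only: it assumes a hypothetical long-format file with one row per answer, which is not how the pilot data is actually held or supplied.

```python
import pandas as pd

# Hypothetical long-format data: one row per answer, with a numeric score.
responses = pd.read_csv("responses.csv")   # columns: user_id, question_id, score

# Median score per question, sorted so outliers at either end stand out.
medians = responses.groupby("question_id")["score"].median().sort_values()
print(medians)

# Flag questions whose median sits well away from the overall median, i.e. the
# outlying questions that may need rewording or rescoring (the 1-point
# threshold is purely illustrative).
overall = responses["score"].median()
print(medians[(medians - overall).abs() >= 1])
```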

We were particularly pleased to find on testing that there is a positive correlation between confidence and responses to other questions in the same area (i.e. expertise and range). We would expect this, but it is good to have it confirmed. However, although there was a meaningful range of responses, almost no users rated themselves as less than averagely confident, so we are looking to adjust the scoring bands to reflect this. We don’t attach a great deal of weight to this question type, precisely because users are known to over-state their confidence, but it is included to encourage reflection and a sense of personal responsibility.
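The correlation check is just as easy to sketch. Again this is illustrative, under assumptions: a hypothetical per-user table for one capability area with the confidence, ‘depth’ and ‘breadth’ scores side by side, and Spearman’s rank correlation chosen here because the scores are ordinal rather than continuous.

```python
import pandas as pd

# Hypothetical per-user scores for one capability area.
scores = pd.read_csv("area_scores.csv")   # columns: user_id, confidence, depth, breadth

# Spearman rank correlation suits ordinal self-assessment scores.
print(scores[["confidence", "depth", "breadth"]].corr(method="spearman"))

# The skew behind the rescoring: how the confidence answers are distributed.
print(scores["confidence"].value_counts(normalize=True).sort_index())
```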

You will see the impact of this work when we reach the mid-April review point, along with some further changes to the content and platform indicated by our user feedback. More about this below.

Scoring is designed to deliver appropriate feedback

As you see, we’re doing what we can to ensure that the scores individuals assign themselves are meaningful, so they allow relevant feedback to be delivered. The question types available don’t allow us to match selected items with feedback items (e.g. items not chosen in the grid or ‘breadth’ questions with ‘next steps’ suggestions in the personal report). This means relying on aggregate scores for each digital capability area. The pilot process is allowing us to find out how well the scoring process delivers feedback that users feel is right for them, and how the different areas relate to one another (or don’t!). However, the questions and scoring are not designed to provide accurate data to third parties about aptitude or performance. So scoring data, even at an aggregate level, should be treated with a great deal of caution. We are issuing new guidance on interpreting data returns very shortly.


The radial diagram gives a quick overview of individual scores

The aim of the Digital discovery tool is developmental, so it’s clear what progress looks like and ‘gaming’ the scores would be simple. Our contextualising information is designed to remove this temptation, by showing that the discovery process is for personal development and not for external scrutiny. Our feedback from staff in particular suggests that if there is any suggestion of external performance monitoring, they won’t engage or – if required to engage – they won’t answer honestly. Which of course would mean there is no useful information for anyone!

 

The ongoing evaluation process


Showing where to find the evaluation form on the dashboard

As well as examining user data, of course, we have access to the individual evaluation forms that (some) respondents fill out on completion. This is giving us some really useful insights into what works and what doesn’t. However, at the moment we think the sample of respondents is weighted towards people who already know quite a lot about digital capability as a concept and a project. The views of people with a lot invested are really important to us, but we also need feedback from naive users who may have a very different experience. Please encourage as many of your users as possible to complete this step. The evaluation form is available from a link on the user dashboard.

In addition we have taken a variety of expert views, and we are just about to launch a follow-up survey for organisational leads. This will ask you about what you have found beneficial about the project, what has supported you to implement it in your organisation, what you would change, and how you would prefer Jisc to take the Discovery tool project forward. Please look out for the next blog post and launch!

Do I need a new license? Creative Commons, Cambridge Analytica and Ethics

Originally posted on lawrie : converged.

Earlier this year Robin DeRosa and Rajiv Jhangiani launched the Open Pedagogy Notebook, a resource for any educator to both use open resources and learn more about the underpinning idea of open. I first met Robin in August 2016 at DigPedPEI, where we had lots of conversations, and in one particular break-out session Robin DeRosa, Daniel Lynds, Scott Robison and I sat around in some comfy chairs and started talking about open. Eventually we got to talking about analytics and the data that is generated by students (and staff), and the tools that can take that data and, using a variety of algorithms, add some context, thereby either giving the student a representation of what they have done or predicting what they are likely to do.

Most of the current LMS/VLE offerings have some sort of data collection, and I imagine that even if the institution doesn’t use it, the vendor has access to it, if only so that they can understand and refine things like usability. But does the student have access to that data? Does the student even know what data is collected and how it is used?

First, let me state that I believe analytics can be a great force for helping students succeed, catching those at risk and supporting them earlier in their learning. At the same time, I definitely agree that we should ask, as Amy Collier does: "How might colleges and universities shape, rather than simply adopt, the ways that companies treat data?"

Back in 2016 the four of us looked around and asked "What would open analytics look like?" We brainstormed many possibilities for what an environment where students had open access to their data would look like: not just the data, but also the system itself, the algorithms it used to nudge student behaviour, and what pitfalls any of this would have.

In the last month the Cambridge Analytica story pushed the issue of data and analytics onto all the major news sites. The data that they had stolen ("harvested") had probably been used unethically: they were nudging the behaviour of users on Facebook. Underpinning this issue is our data: how and where it is collected, how it is accessed, and who can access it. In the case of Cambridge Analytica and Facebook, a simple app and quiz pretty much gave them access to everything you posted and read, and to your friends’ data as well.

So what has this got to do with Creative Commons? My posts on this blog and my images on Flickr and elsewhere are all licensed under a Creative Commons license that allows:

  • Share — copy and redistribute the material in any medium or format
  • Adapt — remix, transform, and build upon the material for any purpose, even commercially.

This is the license needed to make my work “open”, as currently defined.

I believe in the open movement, and I think open textbooks and open educational resources are excellent initiatives, and I am currently working on a chapter for an open textbook.

Data and algorithms are an issue. They can be good, but I believe they should be open and transparent, or at the very least opt-in, with all of the caveats and warnings explained to students (and staff) as part of their learning about data literacy.

If I create an open object, whether it is an educational resource, a blog post or a photo, I want to know that if it is going into a VLE / LMS or other educational tool, it will only appear in tools that have open and transparent analytics and algorithms, or opt-in data for students and staff that they can also access. But by current definitions, that would mean my work is no longer open. Creative Commons is also something I believe in, and I want my work to be seen, used and adapted, but I want it done in a way that ethically aligns with my values. I do not mind if it is used commercially, but I do want to hold people to an ethical standard. Do I want an ethics rider for my Creative Commons license?

Currently the license states my moral rights for the work are not affected. But that is not clear language. “The preserving of the integrity of the work allows the author to object to alteration, distortion, or mutilation of the work that is “prejudicial to the author’s honor or reputation”. Arguably the work could still be used for “evil” if the people adapting it make it clear that they are the ones changing the context and I had nothing to do with it.

I’m looking for an answer, and probably there isn’t one. It’s possible there isn’t even a problem. If there is one good thing that has come out of the Cambridge Analytica and Facebook story, it is that we are talking about these issues, and that people are realising that data and algorithms are not neutral: they have political bias, either unconscious or deliberately placed there. I do believe that the Open Movement needs to look at analytics and algorithms and decide how open objects can be used in these closed systems, and what the implications are.

Resources in the Digital discovery tool

Originally posted on Jisc Building Digital Capability Blog.

The Digital discovery tool provides links to a wide range of resources for each of the digital capability framework areas.

The platform delivers these resources in two ways.

Browse resources on your dashboard

When people log in to the tool they are presented with a tailored welcome page/dashboard offering appropriate assessments based on the selections they make during log-in. The dashboard also includes sets of resources for each of the six broad digital capability areas. You can scroll through these sets and browse the resources that we have mapped to these areas. We offer a brief description of the resource in this view.


Once you see a resource that looks interesting you can click on it to find out more. For each resource we have identified key audiences and level as appropriate and provide a brief description to help you decide how relevant it is to you. When you click on the URL in the resource page you will be taken directly to that resource outside of the discovery tool.

For some resources we offer suggested activities or reflections, and a space to record and save them for the future.


Find resources in your assessment report

When you complete an assessment, you receive a personal report which offers results and feedback, and suggests some next steps that you could take. You are also offered links to selected resources for each area. These are offered in the same kind of scrolling list, with a summary about each resource. When you print your assessment results report, the resources are presented as a simple list of links so that you can revisit them at a time convenient to you.

Resource selection

Resources included in the discovery tool come from a wide range of publishers. They are checked for accuracy, relevance and quality. They are all free to use although some may require users to register.

These publishers include:

  • national or international bodies (such as Jisc, Nesta, HEFCE, SCONUL, EU bodies)
  • professional bodies (such as CILIP, AoC, UUK)
  • educational institution resources produced for staff or students but which could be of interest to a wide range of users
  • individual academics who have set up websites or blogs
  • educational consultants or specialists who have websites or blogs
  • networks of educators or specialist collaborators (e.g. supporting citizenship, research, innovation)
  • Wikipedia and Wikiversity
  • commercial companies (such as Microsoft, Adobe, Google)

Jisc has been working closely with some publishers, including the Microsoft educator community and the Duke of York Inspiring Digital Enterprise Award (IDEA), to map their resources to the digital capabilities framework and include them within the tool. Jisc is also working with the subscription-based online learning platform Lynda.com to map their resources to the framework.

Jisc is aware that many educational institutions subscribe to resource collections and may want the discovery tool to link out to them. This is something we are thinking about and hope to implement in the future.

Each resource included in the discovery tool is reviewed for relevance to the framework area, content and quality. Many of the resources also reflect the next steps suggestions.

Following feedback from our pilot phases we have attempted to limit the number of resources that are offered to prevent overload. The collection is not meant to be comprehensive – it has been selected to map to the digital capability framework, the questions and the feedback.

While we only have limited space, we are always looking for great new resources so please let us know if you can recommend one. Even if we can’t include it straight away we will review it for future use.

Resource description

We provide information to help you decide how relevant the resource might be for you. Each resource has a description of the aims and content.

We highlight if a resource is aimed at a specific audience, sector or level. Several resources are aimed at a specific audience but could also be of value to people in other sectors or with other roles. For example, a resource aimed at students may be of value to a staff member whose capability levels are just developing in that area.

All the resources are mapped to the digital capability framework and to the different areas covered in the assessments. For example, the same resource may appear in the section about media literacy, or in the teacher assessment on creating learning resources.

Some of the resources have a very specific focus such as ‘managing your emails’ while others are broader and cover a range of digital literacies.

We have included a wide range of formats – from whole courses or sections of courses to downloadable learning resources. We have links to videos, websites, networks, screencasts, toolkits, reports and guides. We have included links to the Jisc guides as these often offer links to further resources. Some of the resources are in PDF format, which will require you to download a PDF reader such as Adobe Acrobat.

Resource management

Jisc has longstanding experience of managing resource collections and will be updating and maintaining this collection. This means that if you go back to an assessment report you may sometimes find different resources listed. Dead links will result in resources being removed from the collection. If you find any links that do not work, please report them to us.

Podcast on Leadership #jiscdiglead

Originally posted on e-Learning Stuff.

Over the last three years I have been developing and delivering the Jisc Digital Leaders Programme, part of a wider team including Lawrie Phipps and Donna Lanclos.

Those two were recently interviewed by Chris Rowell. Though the focus of the podcast was supposed to be a chapter of a book that Lawrie and Donna had written, in the end it was mainly about the leaders programme.


DELcast #4 Interview with Lawrie Phipps & Donna Lanclos about Digital Leadership and Social Media – @Lawrie @DonnaLanclos @JISC

I really enjoyed listening to these two talk about the Jisc Digital Leaders Programme. The conversation reminded me how much the programme has changed since the initial pilots back in 2015, and what improvements and changes we have made to the programme.

Well worth a listen.

Making use of your tracker data: a flower-arranger’s guide

Originally posted on Jisc Digital Student.

As the last of the snow melts (we hope!) and the daffodils straighten up for spring, you probably also have a fresh crop of tracker data blooming in your BOS dashboard.

Although this is a beautiful and heart-lifting sight, we know that data also brings a sense of responsibility. How will you interpret the findings, and how will you justify all the hard work that has gone into them? How should you prepare and present your data flowers so they are really appreciated?

Think of this blog post as your flower-arrangers’ guide. (Note that benchmarking data will not be available until after the UK trackers are closed on 30 April. So even if you have data to play with now, it’s worth going back to the garden in early May and downloading the benchmark data. This is the contrasting foliage that will show your flowers off to their best effect.)

Arrangement 1: as nature intended

BOS produces readable charts for all the closed questions. The default .pdf file is not the most beautiful presentation, but it is good enough for many uses. You could use the Question by Question Guide to identify which stakeholders are likely to be interested in which question(s), and make sure they see the relevant results. Showing the right data to the right people is the most important thing to do.

For a more advanced arrangement, download all the closed data as a .csv file and open it in Excel. This gives you a wider range of charting options. A special commendation if you choose a colour swatch that matches your organisation’s brand colours.
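If you would rather script the charts than point and click, the same arrangement takes only a few lines of Python. This is a sketch under assumptions: the file name, the column name and the response labels are placeholders rather than the actual BOS export format, so adjust them to match your own download.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export: one row per respondent, one column per closed question.
data = pd.read_csv("tracker_closed_questions.csv")

# Response counts for one agreement-scale question (Q9a is used as an example;
# the labels below stand in for whatever appears in your export).
counts = data["Q9a"].value_counts().reindex(["Disagree", "Neutral", "Agree"])

counts.plot(kind="bar", color="#9d2235")   # swap in your organisation's brand colour
plt.title("Q9a: My institution supports me to use my own digital devices")
plt.ylabel("Number of respondents")
plt.tight_layout()
plt.savefig("q9a_responses.png")
```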

Arrangement 2: focus on a few signature blooms

We recommend looking at the results for the two new summary questions: overall scores of institutional digital provision (Q13) and quality of digital teaching and learning (Q18). If you want to run more advanced statistical tests, these two metrics can be compared with other indicators to look for trends and contrasting results.

In each of the four areas of the tracker, why not choose one or two sub-questions that are of particular interest to your organisation, or that differentiate your student respondents in an interesting way? For example:

  • You and your digital: Q5d (‘How often do you use digital tools or apps to look for additional resources not recommended by your tutor/lecturer‘)
  • Digital at your institution: Q9a (‘My institution supports me to use my own digital devices’)
  • Digital on your course: Q14f (‘As part of your course, how often do you produce work in digital formats other than Word/Powerpoint?’)
  • Attitude to digital learning: Q22: (‘How much would you like digital technologies to be used on your course?’)

Arrangement 3: give your free text room to shine

In our experience the free text questions provide a lot of specific and actionable detail at institution level. It’s worth downloading and focusing your analysis on these. Coding or marking up your data does not require special software. You can use colour highlighting, or group responses into themes. It helps to have more than one person do this so you can compare your results.

Two useful questions are:

  • Q11 ‘To improve your experience of digital teaching and learning… what one thing should we DO?’
  • Q14a ‘Please give an example of a course digital activity that you have found really useful’
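If you want a little automation for a first pass over answers like these, a simple keyword tagger can pre-sort responses before anyone reads them in detail. The sketch below is purely illustrative: the themes, keywords and example responses are invented, and nothing here replaces actually reading the comments.

```python
import pandas as pd

# Illustrative themes and keywords; replace with whatever emerges from a
# first read-through of your own responses.
themes = {
    "wifi": ["wifi", "wi-fi", "connection", "eduroam"],
    "vle": ["vle", "moodle", "blackboard", "upload"],
    "lecture capture": ["record", "capture", "playback"],
}

def tag(response):
    """Return every theme whose keywords appear in a free-text response."""
    text = response.lower()
    return [name for name, words in themes.items() if any(w in text for w in words)]

# Invented examples of answers to a 'what one thing should we DO?' question.
answers = pd.Series([
    "Fix the wifi in the library",
    "Record all lectures and put them on Moodle",
    "More training on the VLE please",
])

coded = answers.apply(tag)
print(pd.Series([t for tags in coded for t in tags]).value_counts())
```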

This is an arrangement that students will really appreciate, so invite them to view it at the first opportunity.

Arrangement 4: foreground local species

Some questions provide data that is very particular to your institution. The free text questions are one example; so are the questions about the digital environment (‘digital at your institution’). Questions about digital course activities and the VLE can be localised further by partitioning responses according to students’ broad subject area (if you used this question).
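Partitioning like this is a single groupby once the responses are in a data frame. Another sketch under the same assumptions as before (a hypothetical export with one row per respondent, placeholder column names, and a subject-area column only if you included that question):

```python
import pandas as pd

# Hypothetical export: one row per respondent, including broad subject area.
data = pd.read_csv("tracker_closed_questions.csv")

# Distribution of answers to a course-level question (Q14f is used as an
# example), split by subject area, as a proportion of each subject's respondents.
breakdown = (
    data.groupby("subject_area")["Q14f"]
        .value_counts(normalize=True)
        .unstack(fill_value=0)
        .round(2)
)
print(breakdown)
```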

Some of the other questions are better analysed at a sector level. For example, the Jisc team and expert panels will be looking at broad trends in issues such as personal device use and attitudes to digital learning. We will explore the impact of gender, stage of course, and organisational type, as well as some other organisational factors. These findings are likely to be relevant across institutions, and the analysis will be more robust across a larger sample size. So, unless these questions are of specific interest to you – or you have good reason to believe that your institution is different to the norm – feel free to leave this work to us. National sector reports will be published in or before September 2018.

Arrangement 5: colour-by-numbers

Over the next month we will be developing short slide decks and formatted single-page templates that allow you to plug in your data and present the findings quickly and efficiently. We will email to let you know when these are available. We’d appreciate knowing what formats would work best for you – please share your ideas on the jiscmail list.

Some technical advice

This year we have removed all ‘don’t know’ options from the agreement scale questions. If you want to compare your percentage ‘agree’ figures with last year’s results, remember to adjust last year’s figure first: remove the ‘don’t know’ responses and re-calculate the percentage based on just the agree/disagree figures.
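As a worked example of that adjustment: if last year 60% agreed, 25% disagreed and 15% answered ‘don’t know’, the comparable figure is 60 / (60 + 25), roughly 70.6% agreement. The tiny helper below (a sketch, assuming the only other responses were ‘don’t know’) makes the arithmetic explicit.

```python
def adjusted_agree(agree_pct, disagree_pct):
    """Recalculate last year's % agree after dropping 'don't know' responses.

    agree_pct and disagree_pct are last year's published percentages; anything
    left over is assumed to have been 'don't know'.
    """
    return 100 * agree_pct / (agree_pct + disagree_pct)

# Example: 60% agree, 25% disagree, 15% don't know -> about 70.6% on the new basis.
print(round(adjusted_agree(60, 25), 1))
```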

If you want to get into more advanced analysis, or just understand what your data means in more detail, we offer a wealth of practical advice in the Guide to analysing your data and the Question by Question Guide, both available at bit.ly/trackerguide.

Are robots taking our jobs?

Originally posted on Inspiring learning.

Yes, and they have been for generations

American union leader and civil rights activist Walter Reuther (1907 – 1970) led the United Automobile Workers union of five million auto workers, retired workers and their families. He was recognised by Time Magazine as one of the 100 most influential people of the twentieth century.

Walter was being shown around the Ford Motor plant in Cleveland in the 1950s. A company official proudly pointed to some new automatic machines and said, “How are you going to collect union dues from these guys?”

Walter replied, “And how are you going to get them to buy Fords?”

From QuoteInvestigator.com

For my keynote at the second annual NTfW teaching and learning conference, I wanted to focus on robots and artificial intelligence (AI). I asked our Jisc Sanbot to do the introductions, which she did by video. She was able to give her greetings to the audience in Cardiff, emphasise the importance of digital skills for learners and describe the coming session. She demonstrated that she could do at least part of my job as a presenter.

I asked the audience which of their past jobs or tasks were now done by robots, computers or machines. You can see their answers in this word cloud.


I shared a short series of videos, highlighting some of the latest developments in robot employment. Have a look at this bricklaying machine working with humans, cocktail mixing robot arms in Las Vegas, Amazon’s army of little orange bots which bring the shelves to the human packers, and the latest demonstration of driverless cars by Waymo.

The aim was to paint a picture of the modern, evolving workplace, and ask some questions about our delivery for work based learners. Does our practice support twenty first century apprentices and trainees? Are we embedding the digital skills into programmes that enable learners to perform and compete globally?

During this Radio Three discussion in 2015 a panel of experts talked about the kind of jobs that could be automated over the next few years. They argued that in addition to lower skilled jobs, many of the tasks currently carried out by doctors, health professionals, lawyers, teachers, accountants, journalists and others could be done by AI. These would be the more routine and time consuming parts of the role such as marking for teachers or collating statistical news stories for journalists. We have all seen our doctors’ surgeries employ triage, online prescription ordering, phone consultations and other services that save time and resources. In 2015, eBay solved 60,000 disputes online using no traditional lawyers.

Pepper robot: image from Wikimedia.org

The panel agreed that humans will continue to do tasks where they need to be creative, and have good perception and social intelligence in fields like mental health and social work. Stephen Hawking echoed this in 2016 saying,

“The automation of factories has already decimated jobs in traditional manufacturing, and the rise of artificial intelligence is likely to extend this job destruction deep into the middle classes, with only the most caring, creative or supervisory roles remaining.”

Despite this, two years later robots started to appear in situations that we associate with emotional and moral ability. Here’s a Pepper robot being sold in Japan as a Buddhist funeral priest, at a quarter of the cost of a human priest (per funeral).

I asked the audience if we should be optimistic or pessimistic about our future with robots and AI.  You can see the overall scores from the room in the chart below.


The first two statements were optimistic views of automation and the third and fourth were more pessimistic. The statements were taken from this 2017 Guardian article asking the same question – Meet your new cobot: is a machine coming for your job? You can see that the audience was quite well balanced between optimistic and pessimistic, and this demonstrates one of the challenges. There are so many good things and so many dangers that it’s difficult to choose a side.

Whatever our view though, we need to lay the foundations for our learners and prepare them for the twenty first century workplace with up to date skills and experiences. As learning providers, we must work together and share the latest thinking, the latest ideas and the latest tools and techniques.

 

You can find the resources from the conference here in English and here in Welsh.
