The Intelligent Library #CILIPConf17

Originally posted on e-Learning Stuff.

So what is the intelligent library? What is the future of the library?

At the CILIP Conference in Manchester this year, on Thursday 6th July, I am delivering a high level briefing session on technology, specifically looking at the library within the intelligent campus space. The session will explore the potential technologies and the possibilities that can arise from the developments in artificial intelligence and the internet of things.

There has been plenty of hype over artificial intelligence and the internet of things. Is it time to put aside the cynicism that this kind of hype generates and look seriously at how we can take advantage of these emerging technologies to improve the student experience and build an intelligent library?

The internet of things makes it possible for us to gather real-time data about the environment and usage of our library spaces. It is easy to imagine using this data to ensure the library is managed effectively, but could we go further and monitor environmental conditions in the library, or even, using facial recognition software, student reactions as they use the library so that we can continually refine the learning experience?

Most smartphones now make use of artificial intelligence to make contextual recommendations based on an individual’s location and interests. Could libraries take advantage of this technology to push information and learning resources to students? If we could, it offers some interesting possibilities. On-campus notifications could nudge students to make best use of the available services, such as the library. Off-campus notifications could encourage them to take advantage of the learning opportunities all around them. Could we use approaches like this to turn students’ smartphones into educational coaches, nudging students towards the choices that lead to higher grades and prompting them to expand their learning horizons?

As we start to use a range of tracking technologies (smart cards, beacons, sensors), we face a deluge of data about the use of buildings, spaces and equipment across a college or university campus. The breadth and depth of this data can make it challenging to use effectively and to greatest impact. These tracking technologies are already widespread in environments such as airports and retail, which often track users via their wifi-enabled devices and smartphones, and which use sensors to monitor space utilisation and occupancy. Interpreting the data is fraught with challenges and difficulties, as well as potential ethical and legal issues. However, this wealth of data does offer the potential to deliver more satisfying experiences for students and staff, as well as ensuring the library is used as effectively as possible.

Looking in more detail we can outline some potential use cases for the intelligent library, and we may want to think which of these are desirable, but also which are possible with the current state of technology.

We can imagine an intelligent library which not only knows what seats and PCs are free, but can learn from history and predict when the library will be busy and when it will be emptier. The library then provides this information to students via an app, promoting the library at times when more seats and computers are available.
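
As a rough illustration of the "learn from history" idea, the sketch below averages past sensor readings by weekday and hour to estimate how busy the library is likely to be. It is a minimal sketch only: the file name, column names and threshold are assumptions, not a description of any particular library system.

```python
# Minimal sketch: estimating busy periods from historical occupancy data.
# The file name, columns and threshold below are illustrative assumptions.
import pandas as pd

# Hypothetical log: one row per sensor reading, with a timestamp and the
# number of occupied seats at that moment.
log = pd.read_csv("occupancy_log.csv", parse_dates=["timestamp"])
log["weekday"] = log["timestamp"].dt.day_name()
log["hour"] = log["timestamp"].dt.hour

# Average occupancy for each (weekday, hour) slot across the history.
profile = log.groupby(["weekday", "hour"])["occupied_seats"].mean()

def expected_occupancy(weekday: str, hour: int) -> float:
    """Historical average occupancy for a slot, falling back to the overall mean."""
    key = (weekday, hour)
    if key in profile.index:
        return float(profile.loc[key])
    return float(log["occupied_seats"].mean())

# An app could use this to nudge students towards quieter times.
if expected_occupancy("Tuesday", 14) > 0.8 * log["occupied_seats"].max():
    print("The library is usually busy on Tuesday afternoons; mornings are quieter.")
```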

Having a deeper understanding of the utilisation of the library will allow for more effective and efficient use of space. Could this also mean we have a flexible library that expands and contracts as demand for space in the library changes over the day or over the year?

Could we use wireless technologies, such as RFID, not just for issue and return, but also track those resources as they are used within the library itself? Could we also use the same technologies to track resources across campus to identify areas where they are being used or stored (or even lost)? Could we then enhance those spaces to improve learning?

Could we use facial recognition to monitor regular users of the library and provide insight and data into learning analytics? Could we go one step further and use facial recognition technology to discover when students are “troubled” or “in need of help” and then make appropriate interventions to support them in their studies?

If the library is getting full, could we identify those students who have been in there a long time, and push a notification, incentivising them to take a break with a free coffee from the library coffee shop? Could we go one step further, and promote wellbeing, by doing the same, but with a free coffee on the other side of campus, so they have to go outside and get some air and exercise?
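
A toy version of that kind of rule might look like the sketch below; the thresholds, the data structure and the idea of a separate notification service are all assumptions made purely for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical in-memory record of when each student's card was swiped in.
entry_times: dict = {}

LONG_STAY = timedelta(hours=4)   # assumed threshold for "a long time"
NEARLY_FULL = 0.9                # assumed occupancy fraction for "getting full"

def students_to_nudge(occupancy_fraction: float, now: datetime) -> list:
    """Students who have been in the library a long time, but only when it is nearly full."""
    if occupancy_fraction < NEARLY_FULL:
        return []
    return [sid for sid, entered in entry_times.items() if now - entered > LONG_STAY]

# A separate (hypothetical) service would then send each of these students a
# coffee-shop voucher notification, ideally for the cafe on the other side of campus.
```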

Is there any benefit in providing a platform to help gather this data from a range of systems in a standard format that makes it easier to analyse and act upon? Would it be useful to have a national view over this data? Would that enable us to find new patterns that could help us discover the story behind the data, to make appropriate interventions and improve the use of our libraries? Could we build the tools and practices an institution would need to gather, organise and push this data to students’ smartphones, as well as exploring novel user interfaces such as chatbots?
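
To make the "standard format" idea concrete, a gathering platform might normalise events from different systems into one record shape before analysis. The sketch below is purely illustrative; the field names are assumptions rather than any existing Jisc or library schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class LibraryEvent:
    """One normalised event, whichever system it originally came from."""
    student_id: str              # pseudonymised identifier
    source: str                  # e.g. "gate_sensor", "vle", "wifi", "rfid"
    event_type: str              # e.g. "entry", "loan", "pc_login"
    timestamp: datetime
    location: Optional[str] = None

def from_gate_sensor(raw: dict) -> LibraryEvent:
    """Map a hypothetical gate-sensor payload onto the common record."""
    return LibraryEvent(
        student_id=raw["card_hash"],
        source="gate_sensor",
        event_type="entry" if raw["direction"] == "in" else "exit",
        timestamp=datetime.fromisoformat(raw["time"]),
        location=raw.get("gate"),
    )
```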

Of course all this tracking and data collection has huge implications from an ethical perspective. We already gather large amounts of data in the library; sometimes this is anonymised, but sometimes it relates to individuals. At a basic level, we have seen physical counters used to determine the number of users in the library, as well as library management systems used to gather data about the usage of resources and the borrowing behaviour of library users. The intelligent library as outlined above takes all this initial tracking of users one step further.

As the technology in the intelligent library space grows, we need to consider various questions about why we want to use these technologies, how we use them and whether we should. We already use a range of systems to collect data; do we want to put in new systems to gather even more? Some data we need to collect regardless of our concerns: a library management system will, by definition, collect and store huge amounts of data about resources and users. What happens less often now, but may increase in the future, is the processing of that data: analysing it and displaying it in a way that builds up a picture. The next step is taking action on what the data shows. That could be an organisational action, but could equally be action relating to an individual user. How do we ensure that we have consent to collect data (sometimes this is implicit in using the library)? How do we ensure we have consent to process that data? And finally, do we have consent to take action on that data?

What is the future of the library? This session at the CILIP Conference will explore the potential technologies and the possibilities that can arise from the developments in artificial intelligence and the internet of things. Can we build an intelligent library? Do we want to?

Student digital experience tracker 2017: the voice of 22,000 UK learners

Originally posted on Jisc Digital Student.

Universities and colleges are investing large sums of money in their digital environment: in infrastructure, in learning materials and in supporting their staff with the development of their digital capabilities.
But how do we know if the investment being made in these areas is having an impact on our students’ digital experience?
What do students expect in relation to technology? How are they using digital to support their learning? What are colleges and universities getting right, and where are the areas where further work is needed?

Let’s take a look at the key findings and their implications for institutions. You can access the full report here, and you can sign up for the 2018 Tracker here.

Student digital experience tracker 2017: the voice of 22,000 learners

Using the student digital experience tracker:
Through extensive consultation with staff and students in further education and skills and higher education, our Digital student project developed the student digital experience tracker. The tracker allows universities, colleges and skills providers to:
• Gather evidence from learners about their digital experience, and track changes over time
• Make better informed decisions about the digital environment
• Target resources for improving digital provision
• Plan other research, data gathering and student engagement around digital issues
• Demonstrate quality enhancement and student engagement to external bodies and to students themselves.

The Tracker, delivered in BOS, an online survey service especially developed for the UK education sector, is based on a concise set of questions that have been intensively tested with learners for readability and ease of response. It builds on resources such as the Jisc/NUS Digital Student Experience benchmarking tool, and the Jisc guide to Enhancing the Digital Student Experience: a strategic approach. The questions cover issues that are important to learners and/or to staff who have a focus on the digital learning experience.

This year, 74 UK institutions ran the tracker with their students, collecting 22,593 student responses. Institutions had the option of running surveys designed specifically for their learner types: HE, FE, adult and community learning (ACL), skills learners and those studying remotely online. The high-level findings from the 2017 report are outlined below, together with the implications for institutions in how we can better support our students’ digital experience.

Digital environment and services

The digital environment is key to a successful student experience. Colleges and universities should endeavour to make their systems easy to access and to provide a safe, secure and seamless online environment for learning. Many learners assume it’s a given that they will have access to free Wi-Fi whilst having the freedom to bring their own devices on campus.

What are our students saying in relation to the digital environment?

• Only 69% of FE and 80% of HE learners say they have reliable access to Wi-Fi in comparison with 90% of ACL and Skills and 96% of Online learners
• FE students use an average of 2 institutional devices (most commonly desktop and printer) and an average of 2 personal devices (most commonly smartphones and laptops)
• HE students use an average of 1.4 institutional devices (most commonly desktops and printers) and 2.7 personal devices (most commonly laptops, smartphones, tablets and printers)
• 83% FE students and 66% HE students use institutional desktops

What does this mean for institutions?

There is further work to be done in ensuring FE learners have seamless access to Wi-Fi and in ensuring parity of experience across all sites and areas of study. Despite the increasing number of learners bringing their own devices, they still like to feel that everything they need to succeed will be made available by their institution. This includes infrastructure (fixed computers and printers), access to online and offline learning resources, and a physical learning environment where staff and students can work both individually and collaboratively.
Jisc offers guidance on providing a robust, flexible, digital environment and how to develop policies for supporting students’ use of their own devices.

Using digital systems for learning

Our research on student experiences and expectations of technology (Beetham, H and White, D. 2013) has shown that students respond favourably to authentic, meaningful digital activities that are linked to or directly embedded in their learning and assessment, especially if those activities are relevant to their future employment ambitions. The integration of technology into so many aspects of our daily lives means that learners now enter university or college with increased experience of technology, and expect that technology will feature in their learning journey in some way.

What are our students saying in relation to how technology is supporting their learning?

• Over 90% of learners in all sectors have access to online course materials
• 70% HE and FE learners agree that when digital technology is used in their course they are more independent in their learning, and can fit learning into their lives more easily
• 80% HE and 62% FE learners agreed that submitting assignments electronically is more convenient

This suggests that learners value the convenience and flexibility that the use of digital technologies provides.

However,
• 80% HE and 61% FE rely on the VLE for coursework, but only 40% agree they enjoy using the collaborative features or want their tutors to use it more
• Fewer than 50% of learners agreed that they feel connected to others when they use digital technology on their course
• A minority of learners (ca 10-15%) find digital systems increase their sense of isolation: they may have difficulty with distraction and information overload

What does this mean for institutions?

This suggests that learners tend to experience digital technologies that aim to solve practical issues rather than support alternative pedagogic practice. Technology is still being used to support the ‘transactional’ learning elements rather than ‘transformational’ learning opportunities.

Make it clear to students how and why technology is being used to support learning from induction and at the start of new modules to establish an institutional digital entitlement. Reinforce this by embedding digital activities and assessment opportunities as part of the curriculum design to set the expectation that students will use technology throughout their study. Accompany this with responsive support to establish a base level of digital capability and confidence and a platform to explore and develop subject and discipline specific uses.

Technology can be particularly useful in bridging the gap between study and work – apprentices and students on work placement can use technology to access resources, monitor their own progress and keep in touch with employers, tutors and assessors. Jisc offers guidance on how to develop a digital curriculum which offers students opportunities to develop digital skills that will also prepare them for a digital workplace.

Support for digital skills

Not all students have clear ideas on how digital technologies can support their studies or how they may be important in their lives beyond education. Technology is so pervasive in everyday life that ensuring students are digitally capable by the end of a programme of study has to be considered as one of the key employability skills that institutions need to help students develop. Where do students go to access support with their digital skills? Can we assume that learners are confident in using technology for their learning?

What are our students saying in relation to support for their digital skills?

• 46% of learners in FE/ACL and skills look to their tutors first for support with digital skills in comparison with only 16% HE learners
• HE learners most commonly look to online resources first (37%)
• Collectively, informal support (totalling across friends, family, other students and online resources) is more common than formal support (tutors and other organisational support options) for all learners in all sectors, but especially for HE and Online learners (76% and 80%)
• Few learners in any sector look to specialist support staff first: but 65% do know where to get help with digital skills

What does this mean for institutions?

The 2017 UCISA Digital Capabilities Survey identifies the importance of staff digital capabilities as a positive influence on students, highlighting the need for staff who are confident and proficient in using technology and designing appropriate digital activities.

Threading the use of digital technologies throughout the whole learning experience from pre-entry to induction, to specialised and contextualised use and emerging professional practice will help students become familiar with common workplace practices and embed technology more naturally within personal practice.

From our digital learner stories, based on in-depth interviews with students, Helen Beetham reports:

‘As in the pre-digital age, learners still need access to rich resources, opportunities to practice, and supportive interactions with their tutors and peers. They make notes, organise ideas, prepare assignments, collaborate, express themselves, manage their time and motivation, revise and review, listen to feedback and showcase what they can do. But there are also some striking discontinuities. Learners are making more use of graphical, video and audio resources, both to learn and to express what they can do. They curate their personal learning resources in ways that were unimaginable in the days of paper. They share, comment, mix, remix and repurpose freely. They use digital networks to connect across boundaries, whether the barriers between learning and work, or between learners in different countries, or between formal learning and all the other opportunities and interests they have.’

The traditional approach to skills development, training staff and students separately, is a model at odds with the fast pace of change and can result in delays in implementing new technologies and new approaches. A more agile approach, where staff and students are supported to work in partnership, is proving to be more effective. This helps to overcome the difficulty of finding separate time and resources, and offers a more responsive approach. See our guidance on supporting organisational approaches to developing digital capability.

Preparing students for a digital workplace

‘The UK will need 745,000 additional workers with digital skills to meet rising demand from employers between 2013 and 2017, and almost 90% of new jobs require digital skills to some degree, with 72% of employers stating that they are unwilling to interview candidates who do not have basic IT skills.’
Digital Skills Crisis, House of Commons Science and Technology Committee, Second Report of Session 2016-2017 (https://www.publications.parliament.uk/pa/cm201617/cmselect/cmsctech/270/270.pdf)

Our research into developing student employability found that expectations from employers and education providers in relation to digital entrepreneurialism are low. To ensure students are developing these skills, their learning experiences need to embrace these practices.

What are our students saying about being prepared for a digital workplace?

• While over 80% of HE learners and 63% of FE learners feel that digital skills will be important in their chosen career, only 50% agree that their course prepares them well for the digital workplace
• 40-50% of learners didn’t know or weren’t sure what digital skills their course required before they started it
• Fewer than 50% agreed that they have been told what digital skills they need to improve

What does this mean for institutions?

Learners need to understand the digital environment they are entering and the kinds of learning practices expected of them as they prepare for employment. These expectations and requirements should be embedded into induction processes as well as the curriculum and the wider learning experience. Our technology for employability toolkit (pdf) provides effective practice tips on incorporating technology-for-employability. Several universities have adopted digital capability, digital citizenship, or similar as a graduate outcome. Others have required digital activities and outcomes to be discussed during course design and review. Further work is required to ensure students are effectively supported with the development of their digital skills.

There is a range of approaches to using technologies in the development of students’ digital skills for the workplace. For example, some institutions are helping learners to enter into partnerships with employers around the world to identify and solve real world problems. This approach can prove highly motivating for learners, while also enabling providers to develop efficient and cost effective authentic learning experiences for learners, an approach that can bring benefits for employers too.

Engaging learners in planning for digital

Learners who feel a sense of belonging with their college or university, and who feel the institution cares about their learning experience, are more likely to succeed, to maintain good relationships beyond their initial course of study and to contribute more through alumni activities. Engaging with learners in a genuine and meaningful way regarding their digital learner journey is one such way to build loyalty and enhance their learning experience.

What are our students saying about being involved in developing their digital environment?

• 35% HE students and 44% FE learners agree that they are involved in decisions about the digital environment

What does this mean for institutions?

This mirrors the 2017 UCISA Digital capabilities survey report, which stated that 43% of universities that responded to the survey are working with students as change agents (another 38% said they were working towards this).
Through engaging in meaningful and collaborative dialogue and partnership whilst working with students as “change agents”, colleges and universities can encourage a deeper understanding of how digital technology can support learners’ needs. Effective use of technology can enhance the learning experience, for example, by providing additional channels of support or opening up enriched opportunities for learning and communicating to those who may otherwise find it difficult to participate.

When students and staff work together to combine their skills and expertise, results can exceed expectations. Here are some examples of how colleges and universities are engaging learners in the development of their own digital environment.


Further information:

You can read our report, Jisc Digital Experience Tracker 2017: the voice of 22,000 UK learners, to gather further insights into what students are saying about their digital experience, and join us at the Connect More events, where further discussion of these findings will take place.

Sign up for 2018 Tracker here: http://bit.ly/trackersignup18

Join the tracker mailing list http://www.jiscmail.ac.uk/jisc-digitalstudent-tracker

Follow #digitalstudent and @jisc on Twitter

Using learning analytics to enhance the curriculum

Originally posted on Effective Learning Analytics.

At our Planning Interventions workshop at Aston University last week we spent the morning looking at interventions with individual at-risk students. This post reports on the afternoon session, where we examined the use of learning analytics to enhance the curriculum.

Participants were given several simulated analytics outputs relating to the first year of a degree programme. Data concerned both the programme and individual modules, and included:

  • VLE activity for one module showing access of pages, quizzes and discussion forums
  • A graph of VLE activity correlating with grades for the programme of study (included below)
  • Attendance against achievement
  • Assessment dates and activity (included below)
  • The prediction of students at risk of failure over the year

This graph shows a reasonable correlation between VLE use and performance in assessments.

This graph shows online activity and attendance at classes for the cohort during the first year of the programme, together with the number of assessments due each month, suggesting that assessment load may have an impact on engagement.

Groups were asked to review the various graphs and tables and then develop an intervention plan addressing:

  1. Questions – what should we be asking about the effectiveness of our curricula?
  2. Data – what data should we collect to help answer these questions? Is the data provided useful?
  3. Stakeholders – who are the main actors who can adapt the curriculum?
  4. Interventions – what changes might you make to enhance the curriculum and what would you expect to happen?
  5. Adoption – what will staff do differently? How will they be encouraged to change their working practices?
  6. Evaluation – how will the success of the enhancement be evaluated?
  7. Sustainability – how do we make it effective and on-going?

Questions

It’s a realistic scenario that staff are increasingly going to be presented with a selection of different visualisations of data about engagement with learning activities, grades etc.  – and then have to see what questions these raise about the effectiveness of the curricula they’ve put together.

One of the most glaring issues with the simulated programme we presented was the bunching of assessments. “Which clown designed the assessment schedule?” asked one group. They felt that this was quite clearly driving attendance and online activities (or the lack of them). Another group wanted to know more detail about the assessments – were they formative or summative and was the feedback timely? It was also noted that a third of students hadn’t engaged in the quizzes at all.

There was a desire for greater detail about VLE use in general. Which tools are being used and which appear to be effective?

One group asked: “We observe that average grade correlates highly positively with attendance so how do we go about optimising consistent attendance patterns?” They also wondered how higher participation could be achieved in forums and quizzes, as VLE engagement is also correlated with better grades.

We do of course need to keep reminding ourselves that correlation is not the same as causation. However if the pattern of a successful learner includes high engagement with the VLE and good face to face attendance, then we can at least attempt to design curricula which encourage this and advise students of the importance of engagement. We can probably assume that overall performance of cohorts will then improve if participation in the learning activities we’ve designed improves – though there are of course lots of factors to consider such as other aspects of the curriculum not standing still, different staff involved and new cohorts which vary in multiple aspects.

Another problem identified was low attendance in a module requiring studio attendance (it was a ceramics degree). The group I was in speculated as to whether attendance was being recorded accurately, or perhaps that students would do their studio work in a few long sessions rather than lots of short ones, the data thus suggesting lower participation than the reality.

One group noted that online engagement was low and asked “Why are we struggling to engage students with online activities? Should the course even be online? Is the VLE the best forum to share practical application? [it being a ceramics degree]”. Other fundamental questions people thought should be asked included “Does the teaching matter?” and “Does the VLE make a difference?”

Data

We asked the groups whether the data given to them was of use and what else should be collected to help them answer their questions about the curriculum. There was a wide variety of suggestions here.

Several mentioned the need for a greater breakdown of data on VLE activity, particularly the timing of access to content, quizzes and forums, and the duration of visits. Seeing patterns of use of quizzes, including numbers of students who’d accessed them multiple times, was also thought to be potentially useful – as well as further data on the scores achieved. Also, further detail on the nature of the assessments e.g. whether they involve group work was considered important.

Another group wanted to be able to drill down to see individual student data when required, as well as a cohort view.

Final module marks for the cohort along with retention and progression data were thought to be necessary pieces of information as well. Student satisfaction data would help to discover why some modules have much better engagement than others. Further insight could be obtained from focus groups asking questions such as “Why are we struggling to engage you with online activities?” or attempting to discover whether IT literacy is an issue.

Some groups suggested additional visualisations e.g. showing the relationship between quiz engagement and course grade, and between VLE engagement and attendance. Do those who attend less catch up at home by watching lecture videos, for example? Does that compensate for missing out on face to face contact?
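
As a rough sketch of the kind of additional visualisation being suggested here, the snippet below plots quiz engagement against module mark and prints simple correlations; the column names and file are made-up assumptions against a hypothetical per-student extract.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical per-student extract for one module: quiz attempts, VLE sessions,
# attendance and final mark. Column names are illustrative assumptions.
students = pd.read_csv("module_engagement.csv")

# Simple correlations between each engagement measure and the module mark.
for measure in ["quiz_attempts", "vle_sessions", "attendance_pct"]:
    r = students[measure].corr(students["module_mark"])
    print(f"{measure} vs module_mark: r = {r:.2f}")

# One of the suggested visualisations: quiz engagement against course grade.
students.plot.scatter(x="quiz_attempts", y="module_mark")
plt.title("Quiz engagement vs module mark (simulated data)")
plt.show()
```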

Stakeholders

Students, academic staff, associate deans, programme directors, module leads, tutors, learning technologists, external examiners, alumni, employers, professional bodies and other institutions sharing best practice, were all suggested. One group noted the need for students to be active here rather than passive – another proposed that the staff-student consultative committee could help provide ideas. Students could be asked, for example, “What would the VLE look like if you had a hand in it?”

Enhancements/interventions

Several groups mentioned addressing the issue of assessments being clustered too closely. Assessments could be introduced earlier, to help students get used to them, and to provide more timely feedback to staff on student engagement. The assessment load, in this example, should clearly be spread more evenly across the year.

The learning design should ensure that online activities are more clearly linked to assessment, which should encourage better take-up of forums and quizzes. An early process to optimise student engagement with the VLE may be necessary. Study skills training at the start of the programme which shows students the most effective ways to use online learning might also help.

The data might even raise questions as to the effectiveness of the content on the VLE: is it “rich, dynamic, engaging and accessible?” The quality of the VLE platform itself might even be questioned. It may also be that there are too many quizzes and forums, and that they could be consolidated somehow – or non-online alternatives could be considered.

If use of the VLE is considered important by the institution then showing staff the correlations between its usage and assessment performance may be helpful in making the case that they should be using it more.

Adoption

In this section we were hoping that the groups would think about what staff would need to do differently to implement the use of analytics to enhance the curriculum. How would they be encouraged to change their working practices and what roles would be involved?

However the two groups which got to this point seemed to be thinking more about how to improve use of the VLE in particular.

One group suggested working with students as a partnership to redesign the VLE, and training staff in instructional design. The other suggested flipping the classroom, and moving from tutorials and seminars to a more personalised form of learning.

Evaluation

Some participants wondered what we should be evaluating here: the success of the programme or uptake of the VLE? If our intervention aims to increase VLE uptake because we see a correlation between that and performance then measuring VLE use before and after the intervention would be important. Likewise reviewing data on attendance may help to evaluate the success of the intervention. Comparisons could be made with historical data on attendance, engagement and satisfaction to see if these have improved.

Ultimately, reviewing the grades of future cohorts or the marks in individual assessments may give an indication of whether the intervention has been successful.

Sustainability

Pockets of good practice can disappear with individuals: one group suggested attempting to share ownership across the institution, and to capture any lessons learnt. Ensuring that the student voice is heard, and that there’s a process of feedback and iterative review were also thought to be vital.

Conclusion

Participants worked hard and seemed highly engaged with the activities all afternoon. There is a strong desire to see how we can use the rapidly increasing sources of data on student activity to develop curricula based on evidence of what works. Most universities and colleges are only beginning to do this but staff skills in interpreting such data and updating the curriculum as a result are, I believe, soon going to be essential. We now have the blueprint for a workshop which we can build on for institutions interested in using learning analytics for more than just addressing student retention.

Planning interventions with at-risk students

Originally posted on Effective Learning Analytics.

Last week Paul Bailey and I braved the sweltering heat which had briefly engulfed Birmingham and most of the rest of England to meet with staff and students at Aston University. 30 of us, including representatives from three other universities, spent the day thinking about how institutions can best make use of the intelligence gained from data collected about students and their activities.

We concentrated on two of the main applications of learning analytics:

  1. Intervening with individual students who are flagged as being at risk of withdrawal or poor academic performance
  2. Enhancing the curriculum on the basis of data in areas such as the uptake of learning content or the effectiveness of learning activities (which I’ll cover in a later blog post)

In the morning, each of the five groups which were formed chose a scenario and thought about the most appropriate interventions. Here is one scenario which was selected by two of the groups:

Student Data Example 4: Business student with decreasing attendance and online activity

Student: Fredrik Bell

Programme of Study/Module: Year 1 Business Computing and IT

Background: The data provides a view of attendance vs online activity from the start of the course for the BUS101 module. Fredrik has a low average mark and has made one request for an assignment deadline extension. Attendance has been dropping, and VLE use, following a peak in October, has dropped off. The IT modules show that his attendance is low but that he has good marks in assignments up to January. [Note that the vertical scale represents this student’s engagement as a percentage of the average for the cohort – thus 150% online activity in October is well above average.]

Graph showing decreasing attendance and VLE activity for the student against the cohort average.

The groups were then asked to develop an intervention plan consisting of:

  1. Triggers – what precipitates an alert/intervention
  2. Assumptions – what other information might you consider and assumptions made before you make an intervention
  3. Types e.g. reminders, questions, prompts, invitations to meet with tutor, supportive messages
  4. Message and behaviour change or expectation on the student – suggest possible wording for a message and anticipated effect on student behaviour
  5. Timing and frequency – when might this type of intervention be made, how soon to be effective
  6. Adoption – what will staff do differently? How will they be encouraged to change their working practices? Which roles will be involved?
  7. Evaluation – how will the success of the interventions be evaluated

Triggers

The groups specified a variety of triggers for interventions. Attendance at lectures and tutorials was a key one. One group considered 55% attendance to be the point at which an intervention should be triggered.

A low rating in comparison to average is likely to be a better trigger. This is the case for accessing the VLE too – overall online activity well below average for the last 3 weeks was one suggestion. A danger here is that cohorts vary – and a whole class may be unusually engaged or disengaged with their module.

Requesting an extension to an assignment submission was thought to be a risk factor that could trigger an intervention. Low marks (40%) in an assignment could also be used.

An issue which I was acutely aware of was that ideally we should have the input of a data scientist in helping to decide what the triggers should be. An attendance rate, for example, which indicates whether a student is at risk is likely to be more accurate if it’s based on the analysis of historic data. Combining statistical input with the experience of academics and other staff may be the best way to define the metrics.
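
To make the suggested triggers concrete, here is a toy sketch of how they might be combined in code. The 55% attendance figure, the three weeks of low VLE activity, the extension request and the 40% mark come from the suggestions above; the data shape and the "well below average" cut-off of 50% are my own assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class StudentWeek:
    """Hypothetical weekly engagement summary for one student (illustrative fields)."""
    attendance_pct: float            # attendance over the module so far
    vle_activity_vs_cohort: float    # this week's VLE activity as a % of the cohort average
    extension_requested: bool
    latest_mark: Optional[float]     # most recent assignment mark, if any

def should_trigger(history: List[StudentWeek]) -> bool:
    """Return True if any of the suggested triggers fires for the latest week."""
    latest = history[-1]
    if latest.attendance_pct < 55:
        return True
    # Online activity "well below average" (assumed here to mean under 50% of
    # the cohort average) for each of the last three weeks.
    if len(history) >= 3 and all(w.vle_activity_vs_cohort < 50 for w in history[-3:]):
        return True
    if latest.extension_requested:
        return True
    if latest.latest_mark is not None and latest.latest_mark <= 40:
        return True
    return False
```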

Assumptions

It’s likely of course that engagement will vary across modules and programmes within an institution, where different schools or faculties will have different approaches to the importance of attendance – or even whether or not it’s compulsory. One student mentioned that some lecturers have fantastic attendance rates, while students quickly stop going to boring lectures. The fact that they’re not attending the latter doesn’t necessarily imply that they are at risk – though it might suggest that the lecturer needs to make their lectures more engaging!

Events such as half-term holidays, reading weeks etc would also need to be taken into account. Meanwhile, if Panopto recordings of lectures are being made, then student views of these might need to be considered as a substitute for physical attendance.

The use of the VLE varies dramatically too, of course, between modules. Is regular access necessary or has the student accessed the module website and printed out all the materials on day 1?

One group suggested that intervention in a particular module should consider engagement in other modules too. A common situation is that a student has high engagement with one module which has particularly demanding requirements to the detriment of their engagement in less demanding modules.

If low marks are being used as a trigger, it may be too late to intervene by the time the assessment has been marked. More frequent use of formative assessments was suggested by one group.

Another assumption mentioned was that low VLE usage and physical attendance do actually correlate with low attainment. A drop in attendance or in VLE use may not necessarily be a cause for concern. Again, analysis of historic data may be the best way to decide this.

Types of intervention

Push notifications (to a student app or the VLE), emails and/or text messages sent to the student by the relevant staff member were suggested. These might consist of an invitation to meet with a personal tutor. Some messages might best go out from the programme director (sent automatically with replies directed to a support team).

Other options were to send a written communication or to approach a student at a lecture to attempt to ensure they attend a follow-up meeting (though if they’re disengaged, you probably won’t find them in the lecture, and of course in large cohorts, lecturers may not know individual students…)

One risk that was noted was that multiple alerts could go out from different modules around the same time, swamping the student.

Ideally, one group suggested, the messages would be recorded in a case management system so that allied services would be informed and could assist where appropriate. Interventions would be viewable on a dashboard which could be accessed by careers, student services etc (with confidentiality maintained where appropriate).

Timing and frequency

A sequence of messages was suggested by one group:

  1. A message triggered by extension request or late submission
  2. If no response received from the student after a week, a follow up message expressing concern is sent
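
A minimal sketch of that two-step escalation is below; the one-week wait comes from the sequence above, while the function, field names and message labels are assumptions for illustration.

```python
from datetime import datetime, timedelta
from typing import Optional

FOLLOW_UP_AFTER = timedelta(days=7)  # "after a week", per the sequence above

def next_message(extension_or_late: bool,
                 first_sent: Optional[datetime],
                 student_replied: bool,
                 now: datetime) -> Optional[str]:
    """Decide which message, if any, should go out next.

    Step 1: an initial message triggered by an extension request or late submission.
    Step 2: a follow-up expressing concern if there has been no reply within a week.
    """
    if extension_or_late and first_sent is None:
        return "initial_check_in"
    if first_sent is not None and not student_replied and now - first_sent > FOLLOW_UP_AFTER:
        return "follow_up_concern"
    return None
```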

Another group suggested that an intervention would need to be taken in the first 4 weeks or it might be too late. If low marks are being used as a trigger, this would have to happen immediately after marking – waiting for subject boards etc to validate the marks would miss the opportunity for intervention.

Points during the module e.g. 25%, 50% and 75% of the way through could be used as intervention points, as happens at Purdue University.

Messages

A lot of thought is put into the best wording for messages by some institutions. One suggestion at our workshop, with the aim of opening a dialogue with the student to gauge if further support is necessary, was:

I noticed that your coursework was submitted late. I hope everything is OK. Let me know how you are doing.

Another group thought that a text message could be sent to the student, which would maximise the chance of it getting through to them. It would ask them to check an important email which has been sent to them. The email would say:

  1. We are contacting you personally because we care about your learning
  2. We would like to have a chat
  3. Why we would like to have a chat – concerns about attendance and online engagement, and risk that this might translate to poor performance in assessments coming up
  4. Make student aware of support networks in place

The use of “soft” or “human” language, which relates to the student, was thought to be important. Short, sharp, concise and supportive wording was recommended. A subject line might be: “How to pass your module” or “John, let’s meet to discuss your attendance”.

I did have a conversation with one participant at lunchtime who thought that we need to be clear and firm, as well as supportive, with students about their prospects of success. If we beat about the bush or mollycoddle them too much, they’re going to get a nasty shock when they encounter firmer performance management in the workplace.

Adoption

A couple of groups managed to get onto this part of the exercise. Numerous projects involving learning technologies have failed at this hurdle – it doesn’t matter how useful and innovative the project is if it’s not adopted effectively into the working and study practices of staff and students.  What I was hoping to see in this part was more mention of staff development activities, staff incentives, communication activities etc. but most people had run out of time.

One group did however suggest defining a process to specify which staff are responsible for which interventions when certain criteria are met. A centralised approach, they thought, could be less effective than a personal one, so ideally a personalised message derived from a template would be sent from a personal tutor. Retention officers would be key players in the planning and operationalisation of intervention processes.

It was also thought that the analytics should be accessible by more than one member of staff e.g. the module tutor, programme director, personal tutor and retention officer in order to determine if the cause for concern is a one-off or whether something more widespread is happening across the cohort or module.

Evaluation

One of my personal mantras is that there’s no point in carrying out learning analytics if you’re not going to carry out interventions – and that there’s no point in carrying out interventions unless you’re going to evaluate their effectiveness. To this end, we asked the groups to think about how they would go about assessing whether the interventions were having an impact. Suggestions from one group included monitoring whether:

  • Students are responding to interventions by meeting with their personal tutor etc as requested
  • Students submit future assessments on time
  • Attendance and/or online engagement improve to expected levels

Another group proposed:

  • Asking students whether the intervention has made a difference to them
  • Analysing the results of those who have undergone an intervention

Engagement by the participants during the morning session would no doubt be rated as “high”, with some great ideas emerging, and some valuable progress made in developing the thinking of each institution in how to plan interventions with individual students. In my next post, I’ll discuss how the afternoon session went, where we examined using learning analytics to enhance the curriculum.

Learning Analytics Adoption and Implementation Trends: Quantitative Analysis of Organizational and Technical Trends

Originally posted on Effective Learning Analytics.

This is a guest post from Lindsay Pineda, Senior Implementation Consultant, Unicon.

  • What tool is used to quantify organizational and technical areas of readiness?
  • What can institutions learn from quantitative analysis of organizational and technical aspects of readiness?

For many institutions, the idea of beginning a learning analytics initiative may seem overwhelming and complex. There is often the perception that too much work needs to be done before any type of initiative can be explored. You might be surprised to learn that most institutions are already prepared for some type of learning analytics initiative. Whether your institution is ready to use student data and visualize it on a dashboard, or even pursue a small-scale pilot, the quantitative and qualitative analysis of organizational and technical trends supports an overall sense of institutional readiness for learning analytics.

The article “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns” outlined some of the key trends surrounding organizational and technical patterns obtained through Readiness Assessments conducted as part of Jisc’s learning analytics project. Similar to the qualitative trends outlined previously, organizational and technical quantitative trends across institutions are evident. By sharing these quantitative trends in aggregate, institutions interested in learning analytics may find value in the quantitative analysis of organizational and technical readiness.

The Readiness Assessment process is designed to be collaborative and conducted onsite with a variety of key stakeholders across the institution. After the onsite visit, a comprehensive report is delivered back to the institution; it contains all observations, direct feedback from participants (collected anonymously, as quotes without names attributed to individuals), both qualitative and quantitative measures, and recommendations for next steps. The main goal of a Readiness Assessment is to gather key stakeholders to facilitate discussion, using activities designed to help the participants generate productive and collaborative conversations.

As introduced in the previous article, a matrix is used to measure readiness. The Unicon Readiness Assessment Matrix rates institutional readiness based on six criteria: Data Management/Security, Culture, Investment/Resources, Policies, Technical Infrastructure, and IR (institutional research) Involvement. These criteria provide an outline for the qualitative and quantitative elements discussed within the comprehensive report and throughout the onsite visits.

Quantitative Analysis

To gather information, we used a series of pre-visit electronic questionnaires, pre-visit phone calls, onsite visits, collaborative conversations with varying stakeholders, and post-visit communications. After collecting information from each institution, we assigned numeric values using a three-point Likert scale against a number of specific criteria. Then, once we had a robust sample of institutions, we created an aggregate for the sample.

Rating Scales

For each criterion, a rating of “one” indicated an institution was “not ready” due to the type and difficulty of the obstacles present. For example, when assessing organizational readiness, an institution may not have “high level buy-in for learning analytics”, as evidenced by insufficient senior leadership support for the initiative, or leadership that needs significant convincing of the initiative’s value.

When an institution had this type of rating, several recommendations were made to assist an institution with addressing the concerns for each of the areas outlined. An example of a recommendation made for an institution with a rating of “one” in the area of “leadership support” might be to set aside time with individual leaders to communicate the value of the initiative. Demonstrating potential retention benefits, understanding each leader’s goals for the institution, and involving the leader in communications and consultations moving forward may assist with overall understanding.

A rating of “two” indicated an institution was “somewhat ready” to move forward with a learning analytics initiative. When this type of rating occurred, it was often due to a combination of more difficult and less difficult obstacles being present. For example, when assessing both organizational and technical readiness, an institution may not have “resource support available”, as evidenced by a small number of staff possessing the skill sets needed; however, it may have a plan to address this by hiring outside resources for the required roles.

As with the rating of a “one,” several recommendations were made to assist the institution with moving forward. An example of a recommendation made for an institution with a rating of “two” in the area of “resourcing” might be to work with department leadership to establish the skills needed for each role. Involving stakeholders by giving them a forum to express their needs while communicating the benefits of added resources to the initiative could increase commitment for future efforts.

A rating of “three” indicated an institution was “ready” to proceed with a learning analytics initiative. When this type of rating occurred, it was often due to the institution demonstrating strengths such as having the needed resources to implement a learning analytics initiative, full senior leadership support, and a culture supporting the importance of data-driven decision making.

Recommendations for these institutions were made to help guide them toward a full-scale implementation. An example of a recommendation made for an institution with a rating of “three” in the area of “a culture supporting the importance of data-driven decision-making” might be to implement a small pilot. Choosing a department or school within an institution to implement a smaller pilot can provide the evidence necessary to proceed with a full-scale implementation.

Quantitative Trends Summary

The specific criteria and scales used to quantitatively assess readiness are presented below. All scores are based on hypothetical institutional data.

Organizational readiness for implementation (cumulative average):
1 = not ready, 2 = somewhat ready, 3 = ready

Assessment Criteria Score
Demonstrates sufficient learning analytics knowledge 2.0
Level of change management comfort/willingness 1.0
Level of student engagement demonstrated/communicated 2.0
High level buy-in for learning analytics demonstrated/communicated 1.5
Organizational support demonstrated/communicated 1.5
Organizational infrastructure currently available 1.5
Policy/Practice management – Current processes 1.0
Obstacles/Challenges demonstrated/communicated 2.0
Ease of integration with existing organizational structure 1.5
Policy changes required 1.0
Level of understanding of ethics and privacy concerns 2.0
Level of enthusiasm demonstrated/communicated 2.0
Resource support available 1.5

Technical readiness for implementation (cumulative average):
1 = not ready, 2 = somewhat ready, 3 = ready

Assessment Criteria Score
Demonstrates sufficient learning analytics knowledge 2.0
Level of change management comfort/willingness 1.0
High level buy-in for learning analytics demonstrated/communicated 1.5
Institutional level of analytics – Current processes 2.0
Institutional infrastructure currently available 1.5
Data management knowledge/processes demonstrated/communicated 1.5
Data security knowledge demonstrated/communicated 1.5
Ease of integration with existing infrastructure 2.0
Maintenance support/feasibility (long term) 1.5
Resource support available 1.5
Obstacles/Challenges demonstrated/communicated 2.0
Level of enthusiasm demonstrated/communicated 2.0

The data from the hypothetical institution presented above demonstrates an overall rating of “somewhat ready” to implement a learning analytics initiative on its campus. What this indicates is that, while the institution may have obstacles related to areas such as change management, these can be overcome with planning and collaborative effort. For example, this institution may be ready to pilot a small-scale technology solution, provided it addresses some of the other areas outlined, including policy/practice management and data security.
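
As a simple illustration of how a cumulative average like the ones in the tables above could be turned into an overall rating, the sketch below reproduces a few of the hypothetical organizational scores; the function and the cut-off points are my own assumptions, not Unicon's.

```python
# A few of the hypothetical organizational readiness scores from the table above.
org_scores = {
    "Demonstrates sufficient learning analytics knowledge": 2.0,
    "Level of change management comfort/willingness": 1.0,
    "High level buy-in for learning analytics demonstrated/communicated": 1.5,
    "Resource support available": 1.5,
}

def readiness_label(scores: dict) -> str:
    """Map a cumulative average onto the three-point scale used above.
    The 1.5 and 2.5 cut-offs are assumptions for illustration only."""
    avg = sum(scores.values()) / len(scores)
    if avg < 1.5:
        return f"not ready ({avg:.2f})"
    if avg < 2.5:
        return f"somewhat ready ({avg:.2f})"
    return f"ready ({avg:.2f})"

print(readiness_label(org_scores))  # -> somewhat ready (1.50)
```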

Conclusion

Combined with qualitative analysis, quantifying readiness can be very effective in helping institutions assess how prepared they are for a learning analytics initiative. The quantitative analysis also assists institutions with moving toward a data-driven decision making culture. The Unicon Readiness Assessment Matrix not only provides both a quantitative and qualitative look at specific organizational and technical readiness criteria, but also presents an overall institutional readiness level for learning analytics.

Look for the next article in the related “Trends Unpacked” series to come in August 2017, where we will continue to discuss how the quantitative and qualitative data guides institutions in overcoming organizational-specific challenges.

Education is not the filling of a pail, but rather the lighting of a fire.*

Originally posted on lawrie : converged.

It’s been over a month of events and workshops.

I’m trying to reflect on all of the things that have happened; delivering a course in Manchester the morning after the tragic event of 22nd May, talking about Rhubarb as a metaphor for digital strategy in Southampton and about Digital Leadership in an age of supercomplexity (not my term) with my colleague and friend Donna Lanclos in Sligo. Finally last week, I attended University of Gloucester’s Festival of Learning where Donna keynoted and later in the week I delivered a workshop trying to make sense of the future of the VLE.

It’s been busy.

I am tired. I am sat writing this first draft after spending a day making a cobb oven from clay, mud and straw, at the Centre for Alternative Technology in Powys.

There is a common thread, I am trying to find it, it might just be that I was in these places, and that as a result of events and the places and things that I am doing I am more reflective. Or there might actually be, you know, a thread.

I’ve already written about Manchester and Southampton, and rhubarb. This arc of the story starts with Sligo. For anyone who hasn’t been to Sligo, you should, it is a fine place with fine people, and a fine institute – IT Sligo (best institutional Vegan food of the year – so far). I could talk about the paper that we gave, “Leading with digital in an age of supercomplexity” – but we are writing that up for elsewhere. The only thing I would say is that when you’re presenting with someone, and they are video conferencing in, it is extremely difficult to remain upbeat when delegates keep expressing their disappointment that your co-presenter is not actually in the room, but they might stay and listen anyway.

No, Sligo was the start of the story arc, because I sat in a keynote and heard all about competency based education, and how it was important to acquire skills as soon as you can and “complete” your education so you can get a job and start earning. I sat quietly, an invited guest at the conference; I gave a paper, but I was still a guest. But inside I was very uncomfortable, I was upset and I could not articulate it. And as I sat and I watched the faces of those around me, I wondered if others felt the same. This was the employability agenda at its worst. I have excellent colleagues who speak eloquently about employability; they speak of skills to participate in society, they speak of skills to continue to learn, and of skills to recognise and adapt to opportunities in their chosen careers and other paths.

But this competency based education seemed to focus on ‘learn the requisite skill’, the minimum for a credential, and move on. I was uncomfortable.

I saw this image walking out of Sligo, to the station.

Education is not the filling of a pail, but rather the lighting of a fire.

The sentiment remains no matter the provenance of the quote.

Donna Lanclos came to stay later in the week, bearing gifts: gin for my wife, and for me Tressie McMillan Cottom’s book, Lower Ed. I have been looking forward to reading it; it feels part biography, part political campaign, and all education.

Arriving at the Festival of Learning in Gloucester, I had already read part of Tressie’s book, and knew it would feature in the keynote Donna was giving.

Education, Employability, and Citizenship: What is at Stake?

In the first half of the keynote Donna told the story of how the design of spaces can impact on learning, using the active learning spaces at UNC Charlotte. She talked about how the design challenged lecturers and students to think differently about learning and teaching, how even when presented with ‘traditional’ learning spaces the lecturers continued with the active practices they had established in the other spaces, and how this was the coming together of the beginnings of a community engaged in those practices.

Staff feedback included:

  • I get their full attention.  They [students] are very engaged
  • They interact with each other & build a stronger relationship/friendship.
  • I feel more connected to the students.  A reward for me as the instructor.

This was a good story about educational change, a story about good returns for both students and staff. Then she switched. Up to this point she had been reporting on a small-scale innovation (although a big estates investment in the classrooms); now she highlighted the kind of innovation that was lacking at institutional scale, and put up a slide that just said:

#TEF

She talked about how teaching can be homogenised across Quality Assurance frameworks without talking about diversifying access to education, expanding the range of effective practice, or the processes of education and the complex ways they can prepare our students to be citizens. She challenged the audience, pointing out that checkboxes and metrics reduce teaching and learning to commodities that we sell our students.

Whilst a fees system is still relatively new in the UK, the US is an old hand at it, and not just fees but “for profit”. Donna referenced Tressie’s work in “Lower Ed”, where Tressie calls out the logical progression of “employability”, making the point that Lower Ed is a state of mind, not just a kind of institution.

Tweet: What message about employability are we giving students? That it's THEIR fault, not the economy's? Skills edu as bad education? #GlosLearn17 

 

I left school at 16, with no qualifications, in 1982. Thatcher had devastated the industry of the Midlands and was about to destroy the coal and steel industries. As I read the book it struck me that the process of employers shifting responsibility for their workers had its roots back then (if not before). With massive unemployment – cue UB40’s “1 in 10” – my only option was the Youth Opportunities Programme (YOP). I started in May 82 working in a foundry, pouring castings and doing any general dogsbody jobs that were required of me. I still have some scars from the hot metal. The “promise” was a chance of a job with prospects, part-funded by the state to provide us with training and skills, where at the end of 6 months there was a good chance of a job. At 5 months the cohort that started with me were all told that, sadly, there was no job available. The following week I joined the military, and not long after I was standing on a beach 8000 miles away. The firm I worked for, RMI Steel (now long gone), got rid of the others in my cohort at 6 months, and started another group.

standing on a beach

Standing on a beach 8000 miles from the Black Country

Education should be a collective social good; Tressie argues we have lost that thread by making it an individual good. I twitch when I see companies telling universities that they need to “train” graduates. I aspired to education, eventually engaging with the Open University, and then, after a chance visit to Plymouth University, I got a degree in Environmental Science. I was inspired to learn, not to be trained. This shift of education into “training” feels wrong; it feels like another form of the YOP scheme.

I began this post telling you that this weekend I built a cobb oven. A group of us had come on the course to learn a skill. Everyone on that course that I spoke to had the privilege of education, an education that made them curious and gave them the ability to continue to learn. One couple on the course came to learn how to build the oven so they could build one for their local community garden. We had an education that had enriched our lives to the point where we wanted to experience this, not just exist. Our education (and most of us were not young) had not been focused on trading competencies, skills and training for jobs; we had been privileged to participate in an education that, perhaps, was more focused on the social good and community.

The Centre for Alternative Technology is very much focused on community and education. It was good to be around those people and in that place. It reminded me that I would like institutions to be like this: learning communities.

*Endnote: yes, I know he probably didn’t say it, and even though I make reference to that here, I am sure that someone will Yeats’splain anyway.

The mind is not a vessel to be filled, but a fire to be kindled.

 

Also on: Medium

Consultation to inform Jisc’s research on Designing technology enhanced curricula

Jisc has embarked on a project to update its advice and guidance to help you design effective technology enhanced curricula – whether that be at course level, module level or lesson level.

We know that our existing guidance in this area is well used and we plan over the coming months to refresh and add to these materials.

This work will be undertaken by Gill Ferrell (gill@aspire-edu.org) and Ros Smith (ros.smith@gpisolutions.co.uk).

To get us started we would like to find out more about how we can best meet your needs. We would be grateful for 5 minutes of your time to fill in a short survey on the topic. We are interested in the views of colleagues who lead on supporting staff with designing technology enhanced learning, as well as of anyone developing their own practice.

We are also looking for a few key contacts to interview to hear more about your practice and how we can support you, and the survey asks about this.

You can find the survey here; it will remain open until 30th June.

Thank you for your assistance and we look forward to hearing from you.

It’s not what you say you do, it’s the way that you do it!

Originally posted on e-Learning Stuff.

language

I was thinking the other day that I don’t have enough readers of the blog, and that there isn’t enough engagement.

So the solution has to be that the name of the blog isn’t right. The first idea would be to change the name from “e-learning stuff” to “blended learning stuff”.

Then again, maybe I could choose “e-pedagogy stuff”, or what about “threaded learning stuff”? How about “hybrid pedagogy stuff”?

Do you think that changing the name will significantly increase readership and engagement on the blog?

No.

If I want more readers and more engagement, then maybe, just maybe, I should think more about the content I write, the style, the questions I ask, the quality of the writing, the frequency of posting and so on…

So when we start thinking that the problem with embedding digital and learning technologies is the name we use, such as “blended learning” or “e-learning”, then we probably have a bigger issue.

If staff aren’t engaging with digital and learning technologies as part of their continuing professional development, then changing the term we use will have some impact, but not a significant one. It may encourage some to participate, but it may confuse others. However, the language we use, though it can be powerful in some contexts, is not the reason why people decide not to engage with digital.

It’s like the reason people often give about lack of time, when the solution is not about providing more time but about setting and managing priorities. It really comes back to why people choose to engage or not, and the reasons they give.

If you are having challenges in engaging staff in the use of digital and learning technologies and you think that changing the “name” we use is the solution, I would suggest you may actually want to spend the time and effort thinking about the approaches and the methodology you are using.

Of course, the real reason people choose to change the language is that it is much easier to do that than to actually deal with people!

What do you think? We know language is important, but is the problem the terms we use, or is it something else?

Get involved with ALT #altc

Originally posted on e-Learning Stuff.

ALT have issued an open call to get involved with the association.

The Association for Learning Technology (ALT) represents individual and organisational Members from all sectors and parts of the UK. Our Membership includes practitioners, researchers and policy makers with an interest in Learning Technology. Our community grows more diverse as Learning Technology has become recognised as a fundamental part of learning, teaching and assessment.

Our charitable objective is “to advance education through increasing, exploring and disseminating knowledge in the field of Learning Technology for the benefit of the general public”. We have led professionalisation in Learning Technology since 1993.

They are seeking expressions of interest from:

  • all sectors of education and training including schools, vocational education, Higher Education and work-based learning
  • people in research, practitioner, management, technical and policy roles as well as learners who have a special interest in Learning Technology
  • the private and public sectors
  • within and outside the UK (subject to reaching agreement on an effective method of participation).

They are looking for new members of the Editorial Board as well as reviewers for Research in Learning Technology.

There are also vacancies on the main operational committees:

  • Committee for Membership Development – seeking two new members
  • Committee for Further Education and cross-sector engagement – seeking two new members
  • Committee for Communication and Publications – seeking two new members

They are also looking for new editors for the #altc blog editorial team.

There’s also the opportunity to get involved in the 2017 Online Winter Conference. With a focus on the work of Special Interest and Members Groups from across the community, and run over two days, the conference will include live online sessions and other online activities. Visit the 2016 Winter Conference online platform for reference.

We also still have places available on the OER18 conference planning committee. OER18 presents an opportunity for open practitioners, activists, educators and policy makers to come together as a community to reflect on the theme ‘Open to All’. Here is more information about joining the conference planning team for OER18.

You can find more information on any of these opportunities here: http://go.alt.ac.uk/2st4rJS.

Those interested in getting involved can complete this short form: http://go.alt.ac.uk/2rEx9GM by 25 August 2017.

Online language – Why do we need to teach it?

Originally posted on Inspiring learning.

I’m in a WhatsApp group with the women from my gym class. We are from different age groups and backgrounds, but for us it’s the obvious tool to communicate – quick and easy for a spur of the moment chat.

The language is easy and relaxed, full of written sounds, awww, emojis, kisses xx, huns and babes. I love being part of it, but I’m too scared to make a comment most of the time. If I make a comment it sounds a bit wrong, a bit too formal – like I can’t really do it properly. And if I tried to add in the appropriate huns and emojis I think it would come across as fake because they know me in person. So I am happy to lurk in the group and pick up recipes and put in the odd comment when it’s needed.

suitcase

Free images: Pixabay.com

It’s not a question of being a so-called digital native. That theory has been well and truly replaced by White’s visitors and residents. People of all ages travel around the internet between places which we visit briefly and leave no mark (which is no problem) – and places we reside in, having relationships and developing a presence. My presence in the WhatsApp group barely registers – I’m a visitor and that’s fine. But my presence in Facebook is quite strong – I have a residence there and I come back and forth frequently, engaging with friends and family. When I’m there I use my language confidently!

The problem is that visitors can feel excluded from online communities that they might like to move in to, and one of the reasons could be because they don’t have the literacy skills to participate. Literacy happens in a lot of different contexts – families, work, schools, social groups, political or activist movements, hobbies and interests.  Multiple literacies are needed for individuals to thrive in multiple communities and the same is true online. The way we communicate using Twitter might be vastly different from the language we see in Facebook or when using instant messaging, WhatsApp or Snapchat.

I think we need to teach online language as part of literacy and digital literacy courses. More and more people are getting online – by May 2016 nearly 88% of adults in the UK had recently used the internet and only 10.2% had never used the internet (figures from the ONS). But how many of those are put off from the places that they want to visit or spend more time in by inaccessible language and confidence issues?

Let’s have a look at three big areas where people are encouraged to participate and language can be a barrier.

social media

Free images: Pixabay.com

Social networking

Because people live not only in geographical communities but in online communities too, social skills can be vital. Whether we are keeping in touch with family or old friends, new acquaintances or colleagues, fellow students or enthusiasts, we usually prefer to fit in with the conventions and practices of the group so that we can feel comfortable and accepted.

Goods and services

Many of our goods and services are now online and UK governments are keen for citizens to access public services using digital means. We teach digital literacy and skills to help people gain experience, wisdom and confidence online and we need to teach language etiquette and social literacy skills as part of this.

application form

Free images: Pixabay.com

Employability

Most jobs have to be applied for online; we have to build our professional networks, advertise and promote our businesses, research and learn online; we need digital skills in our own jobs and to be able to support others in their work. There are a multitude of ways in which technology supports our employability skills from starting our careers or companies to collaborating with colleagues or keeping up to date with developments in our field.

Debbie Edmondson, Talent Director at Cohesion Recruitment, recently spoke at Jisc’s Digifest about what employers want. You can see her slides, from slide 17 to slide 29. She said that what they really want is new recruits who can fill in application forms, write appropriate emails, talk on the phone confidently and demonstrate good face-to-face communication skills. They want people who are literate in online language and linguistic etiquette as well as offline social skills.

So why do we need to teach online language?

Because we want citizens to be included; to participate; to benefit from access to digital services; and to feel empowered to use their voices and express their ideas when it’s the right time and place for them.

bridge

Free images: Pixabay.com

 

(We included online language and behaviour etiquette in the Essential Skills Wales Digital Literacy learner qualifications. You can read an FE News article from 2015 here for more information, or check out this Pinterest board.)

(If you would like to see some of my favourite examples of creative online language, here’s a link to my Tumblr collection of artefacts, articles and amusing memes )

(You can read sections of the PhD here )

(Have a look at Jisc’s work on digital capability here )

 

Previously:

  1. Online language – Journey to a PhD
  2. Online language – What does it look like?
  3. Online language – A new species of language
  4. Online language – How are communities using it?

Coming next:

  1. Online language – Bilingualism
  2. Online language – Somewhere along the line
