Online language – What does it look like?

Originally posted on Inspiring learning.

It’s an exciting time for online language. It has been evolving since communication on the internet began in the 1960s, from the development of ASCII through text speak and on into tweets, instant messages, Snapchats, status updates, memes, likes and omg, lol…

This new language is creeping into all aspects of our lives, whether we socialise on Facebook, text our kids and grandkids, or email for work. Our digital capability skills, knowledge and experience have to be broad and flexible – and part of this involves understanding and communicating using new kinds of language.

emoticons

Image from: Pixabay.com

There are many different forms of this language, and I have divided them into four main ones for this blog post. The first one I am really interested in is emoticons. A couple of years ago I was writing a Kahoot quiz on emoticons and did some research looking for inspiration. Incidentally, I have now delivered this quiz well over a hundred times to countless teachers and managers across HE, FE and Skills in the UK. Let me know if you have a Kahoot account and want me to share it with you.

Among the many fascinating and surprising things I found was that Moby Dick has been translated into emojis. You can find the hardback version under the name Emoji Dick and the free pdf version here. Some time later I found that the 200 most common words in the Christian Bible have also been translated into emojis, and you can access the Emoji Bible website here. If you are the sort of person who likes to verify an emoji’s credentials and find out more about it – please check out Emojipedia for a world of information and background.

Here is one of my quiz questions: what do you think this sentence means?

emoticons sentence

Answer: What time is coffee? Are you buying?

A second creative and exciting form of online language is acronyms. Acronyms have played an important part in English for centuries – where would we be without SOS, DIY, RSVP, ID and a thousand others? Common acronyms I come across in business circles include BRB, ATM, TBH, IMO and DM, to name a few. In my various social circles the list is endless! I have great fun in my quiz asking participants how many of them use the acronym lol for ‘lots of love’ instead of ‘laugh out loud’ (which is correct). There is always either one person in the room, or somebody who knows someone, who does this – much to the confusion of friends, colleagues and loved ones.

A third form is punctuation, which can be used to create emoticons :-S or to add body language or tone to a chunk of text ¯\_(ツ)_/¯ Punctuation can be used cleverly to add *emphasis* or … to recreate speaking (um) pauses… and [smiling politely] imitate voice!!! Like SHOUTING… One of my quiz questions is this – what is the tone of this status? “You did a great job on that presentation! ;-)” Is it genuine, sarcastic or a joke? The room is always split – half choose genuine, half choose sarcastic or joke. I have to point out that if one half of the room used this winky face in a message, the other half of the room who received it would misconstrue the tone, and could potentially be offended…

The fourth creative and fun form is writing the spoken word as it sounds. Many examples can be found of accents being reproduced in online messages, statuses and tweets. Some of my favourite examples are from Scottish tweeters – Google it, you won’t be disappointed. For example: “the police came tae ma door and told me my dugs were chasing people on bikes ma dugs don’t even have bikes” @darylgaughanx.


When these forms of language (together with others, such as images, memes and gifs) come together into chunks of text, messages or statuses, the results can be creative, funny, moving and inspiring. But to some this language is completely inaccessible. The rules are being made up as we go along, and people can feel alienated and excluded. And some, whose views on language are purist and driven by an insistence on traditional spelling, grammar and punctuation, would rather not engage at all.

 

(If you would like to see more examples of creative online language, here’s a link to my Tumblr collection of artefacts, articles and amusing memes.)

(You can read sections of the PhD here.)

(Have a look at Jisc’s work on digital capability here.)

Previously:

  1. Online language – Journey to a PhD

Coming next:

  1. Online language – A new species of language
  2. Online language – How are communities using it?
  3. Online language – Why do we need to teach it?
  4. Online language – Bilingualism
  5. Online language – Somewhere along the line


Online language – Journey to a PhD

Originally posted on Inspiring learning.

It’s been a roller coaster of a journey which began in 2007 at a seminar for learning technologists.

The speaker had done some fascinating research about online learning and was giving a great presentation about it. I remember thinking ‘I could do that…’ and sketched out a PhD proposal on the back of the programme then and there.

Free images: Pixabay.com

I had been an ESOL and literacy teacher in community learning and had recently landed a job with Jisc in Wales as an eLearning Advisor. I was really interested in bilingualism and literacy, especially as part of digital capability. Community learners at the time tended to be older and predominantly women. Would they face language barriers when using online services and media? What about Welsh speakers and other bilingual learners? Would they find their literacy skills helping or hindering them online?

I couldn’t find any research which combined the areas of community learning, bilingualism and learning using technology so I brought it all together in my own research project. My intention was to help practitioners and policy makers by providing some evidence around how bilingual learners feel about using their languages online and what would help to make technology more accessible for them.

The twelve-word title I came up with was: An exploration of community learners’ attitudes around biliteracy and learning using technology. I spent the next eight years researching, interviewing, writing, talking and thinking about it.

Welsh emoji 1               Welsh emoji 2

Emojis for Wales: now available for Apple and Android

Jisc and Swansea University supported me all the way, and many colleagues and friends spent countless hours in conversations about digital capability, language evolution, literacy skills and confidence, online services and sites, emojis and icons, and issues around Welsh and minority languages and cultural identity. I couldn’t have seen it through without their support – and if you are thinking of doing a part-time PhD, make sure you have a patient network of people around you!

I was awarded my PhD in 2016 and I couldn’t be happier. My weekends and holidays are now free and my family have got to know me again. But it was all worth it – and now I want to share some of the ideas that came out of it as part of my work on digital capability. Look out for more blogs in this series.


(You can read sections of the PhD here )

(Have a look at Jisc’s work on digital capability here )

Coming next:

  1. Online language – What does it look like?
  2. Online language – A new species of language
  3. Online language – How are communities using it?
  4. Online language – Why do we need to teach it?
  5. Online language – Bilingualism
  6. Online language – Somewhere along the line

 

 

 


Clickbait, Lies and Propaganda

Originally posted on lawrie : converged.

So, this is not a new subject. Or a new phenomenon. And what sparked me off this morning was a tweet from Eric Stoller.

There are so many things going on in this tweet that I struggled to decide where to start. But then I, like Eric, decided to challenge the headline, because I think it needed challenging. So what is my issue with it? As someone who was bullied at school, I would like to know why teachers are not saying “hey, now we can see the bullying!”. When I was at school, teachers were at best unaware of it, but if I am honest, I think some of them just didn’t care or thought it “part of the culture of schools”. Now we have a culture of bullying in schools and in wider society that is both more visible and possibly more enabled.

Whining that it is the fault of technology is not just pointless, it is an abdication of responsibility.

It is that blaming of the technology that really annoyed me to start with and got me digging into the story. Blaming technology for things is not uncommon; those of us who work in ed-tech are familiar with stories about students being distracted by laptops in class and similar headlines. CNN once ran the headline “E-mails ‘hurt IQ more than pot’” and the Telegraph “Facebook and MySpace generation ‘cannot form relationships’” (and yes, I know that last headline will set all sorts of digital native bullshit klaxons off across the world). But this is not new; it is the same recycled alarmism that has been around since the 18th century, when Malesherbes complained that getting news from the printed page socially isolated readers and detracted from the spiritually uplifting group practice of getting news from the pulpit. Or the complaints in the 1930s that radio would distract and over-excite children (other technologies to complain about are available, depending on the decade). These arguments have all been made before. Eric was right to call it out.

Malesherbes railed against the fashion for getting news from the printed page, arguing that it socially isolated readers and detracted from the spiritually uplifting group practice of getting news from the pulpit.

 

Clickbait, Lies and Propaganda

But there is another issue in play here. Brexit and Trump have been foremost in my mind for a long time – the lies that were told, the clickbait stories and the propaganda machines they employed. But the headline writer of the news story from the original tweet is not that different. Let’s look at it again:

@schoolsweek tweeted “Teachers are warning that apps like Snapchat and Facebook make it easier for pupils to abuse each other”  with a link to the story.

I was going to write a blog about the anthropomorphism of technology – how we blame “things” instead of our own culture. So I went off to read the story.

The story comes from a press release from the Association of Teachers and Lecturers. The headline they use is somewhat different – Pupils subjected to hate crime and speech while at school – ATL poll.

They asked 11 questions (shown at the end of this blog post). The first eight make no reference to any form of “cyber bullying”. The first question asks about awareness of incidents of hate crime/speech: 22% of respondents were aware of such incidents in their school, 60.1% weren’t aware and 17.7% weren’t sure. In question 9 they give a list of forms of bullying, and almost 66% of respondents think that cyber bullying has increased over the last two years.

The next question focused on the types of bullying; the results are below.

Form of bullying                                     % of 243 respondents   Number of respondents
Verbal abuse                                         84.8                   206
Cyber bullying via text/email/social media           51.9                   126
Physical abuse                                       21.4                   52
Emotional abuse (isolating, ignoring, humiliating)   60.5                   147

The story put out by the ATL was about hate crime and speech; it was about what support teachers needed; it was about what was going on with young people in schools. For me, the table is really worrying. 85% of respondents said there was verbal abuse and 60% said there was emotional abuse. Then they mention the cyber abuse – but of course cyber bullying can be both verbal and emotional; it describes the medium, not the behaviour. Which means that among the 22% of teachers aware of incidents, 85% report verbal abuse (hate crime and speech). And more than 1 in 5 report physical abuse!

At the start of this piece I was angry at teachers for blaming technology for something they need to take responsibility for. They are taking responsibility – the whole story is about teachers highlighting bullying. The poll is a cry for help in our schools, and the press release is well written and referenced, but the way the news story was written made me mad at teachers!

A great quote from Dr Mary Bousted, general secretary of ATL, is included in the press release, demonstrating how seriously and sensitively they are addressing the issue:

“There are a number of complex reasons why pupils bully other pupils and schools try their best to work with both pupils and parents to deal with these incidents.

ATL calls for awareness to be raised about the discrimination faced daily by many. Schools need to play their role in educating children to build a culture of tolerance and respect. All schools should have robust bullying policies in place that cover how to deal with incidents of hate crime and speech. We hope that schools can support staff to educate young people in recognising and challenging hate crimes and hate speech wherever they occur.”

Let’s remind ourselves of the two headlines and first line side by side:

From Schools Week

Snapchat and Facebook make it easier for pupils to abuse each other, teachers warn

Teachers have warned that social media apps, such as Snapchat and Facebook, are making it easier for pupils to abuse each other, with mild insults escalating into “serious threats” which disrupt learning.

From the Press release

Pupils subjected to hate crime and speech while at school – ATL poll

Over a fifth (22%) of education staff believe that pupils have been subjected to hate crime or hate speech* while at school in the last academic year, according to a poll of 345 members of the Association of Teachers and Lecturers (ATL).

The social media story was inferred from a few anecdotes within the press release, and from other sources – although those sources have nothing to do with bullying.

There are three lines that make this point particularly well:

[Referring to a bullying incident] “Social media’s role in such incidences is likely to grow as usage of such networks expands among children.”

According to Business Insider, a recent survey of teenagers by PiperJaffray found that picture and video messaging app Snapchat was the most popular social network, used by 81 per cent of respondents. It was closely followed by picture app Instagram, used by 79 per cent.

One teacher from North Yorkshire said mobile phones were a “contentious issue” and that apps such as Snapchat have “made it easier than ever to send abuse”.

One might think that the Business Insider story also highlights bullying or abuse, and is being used as supporting evidence? Actually, the story is about how teens are moving to larger social networks instead of smaller ones, and it makes no reference to bullying or abuse.

We are all focused on the larger propaganda news stories that are doing the rounds, and I’ll be honest: I don’t know if this is propaganda, clickbait or bad reporting. But it is from inside the education sector – and if you are writing news about education, I will hold you to the same standard that you hold our teachers to. And if you ignore the real issues in favour of drivel like this, you are damaging education and those who work in it. ATL are really concerned about hate crime, hate speech and bullying.

You went for a cheap headline about Facebook and Snapchat.

ATL Poll questions

  1. Are you aware of any incidents of hate crime/speech in your school/college involving pupils in this academic year?
  2. What are the main reasons pupils in your school/college are bullied? Please mark all that apply.
  3. Do you believe there has been an increase in hate crime/speech, and bullying associated with hate crime/speech, among pupils in your school/college in the last year?
  4. Do you think that hate crime, hate speech and discrimination should be covered in mandatory PSHE and age-appropriate SRE?
  5. Do you think that your school/college provides enough support if anyone needs to report incidents of hate crime/speech?
  6. Have you received training on how to deal with incidents of hate crime/speech?
  7. Have any pupils in your school/college been subject to bullying in the current academic year? Please mark all that apply.
  8. How often does bullying occur? Please mark all that apply.
  9. Do you think that these forms of bullying have increased over the last two years?
  10. What form does this bullying take? Please mark all that apply.
  11. Do you feel you receive enough support from your school/college/SLT/head when dealing with incidents of bullying between pupils?

 


Second pathfinder meeting: addressing common institutional challenges

Originally posted on Effective Learning Analytics.

Group discussion in Jisc's London offices

At our recent pathfinder meeting, a number of institutions involved in implementing Jisc’s learning analytics architecture came together to work on issues of common concern. This followed a get-together in Bristol last December where we looked at institutional culture, ethical and legal issues, and data.

Participants from the universities present discussed a different range of issues this time, which I describe below (with thanks to Paul Bailey, Tony Sceales and Tim Stratton for their notes).

Business transition to using learning analytics
Tony Sceales, an entrepreneur brought in by Jisc to work on areas such as assisting institutions in building their business cases, facilitated discussions in this area. Tony has developed a tool which helps to quantify the costs and benefits of an institutional project.

One participant described how his university is at a fairly advanced stage and has put together a student panel to ensure that learners’ voices are incorporated. Learning analytics is currently being considered there as an R&D project to discover whether it can enhance student success. The institution has been looking at eduroam logins as a proxy for attendance, which has helped it understand building utilisation at different times of the day by staff and students. Library logins and accesses to digital content are also being analysed. There is particular interest in attempting to understand which curriculum structures lead to the best outcomes.

Another member of the group described how her university was still at a relatively early stage. She felt that the relevant pro vice chancellors had not been engaged early enough, and that executive level engagement is critical for success. This is a finding that is borne out in a number of studies, including a useful Australian report by Colvin et al.

She was also seeing wide variations in engagement across different schools due to cultural and practical issues such as budget reductions. She felt that “Do no harm” is a helpful guiding principle for learning analytics and that the forthcoming EU General Data Protection Regulation will be a driver for institutions in sorting out their processes around student data.

Adapting predictive models
This group also discussed whether it would be possible to turn on and off the factors used in the predictive model and the interventions they drive. Whilst this is theoretically possible, the group felt that the accuracy of the model would be likely to be reduced with each metric removed. However, studies such as that reported in the seminal paper from Marist College by Sandeep Jayaprakash and colleagues have shown that some indicators may have very limited impact on the predictive abilities of a particular model.
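
To make that trade-off concrete, the sketch below retrains a model with each indicator removed in turn and reports the change in cross-validated accuracy. It is a minimal illustration only: Python with scikit-learn, the file name and the column names are all assumptions made for the example, not a description of how Jisc’s or any vendor’s model is actually built.

```python
# Minimal feature-ablation sketch: compare model accuracy with and without
# each indicator. All names and the data file here are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

features = ["vle_logins", "attendance_pct", "assignments_on_time", "library_visits"]
df = pd.read_csv("student_engagement.csv")        # hypothetical data set
X, y = df[features], df["completed_module"]

baseline = cross_val_score(RandomForestClassifier(), X, y, cv=5).mean()
print(f"all indicators: {baseline:.3f}")

for dropped in features:
    kept = [f for f in features if f != dropped]
    score = cross_val_score(RandomForestClassifier(), df[kept], y, cv=5).mean()
    print(f"without {dropped}: {score - baseline:+.3f}")
```

An indicator whose removal barely moves the score is exactly the kind the Marist study suggests could be switched off at limited cost.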

Jisc is developing a single/common predictive model to include in its standard offering, but it will be possible to procure further models to be built either by vendors or by Jisc. We expect vendors to start to differentiate themselves in this space, and some may decide to share their models freely.

The models will of course be imperfect and will deliver some level of false positive predictions for students at risk.  These will need to be carefully managed in the context of the relationship between teachers and students, given the ‘Do No Harm’ principle. As we transition from an R&D project to an operational service the group members considered it important to recognise the need to mediate between the predictions and the student.  This is especially true where the prediction is based on environmental or demographic factors rather than metrics for attainment or engagement. The role of the student retention office(r), where one exists, in determining intervention processes and the model of recording interventions was thought to be key.

Developing a picture of a successful end-state for learning analytics
It was also requested that pathfinder institutions be helped to develop a picture of what a successful end-state looks like for learning analytics, and what the journey is to get there. User stories, case studies, testimonials and best practice would all be useful. Knowing which factors most heavily impact the outcomes and accuracy of predictions would help to build an evidence base we can rely on and build upon. Key metrics were thought to be VLE usage, attendance, timely assignment completion and assessment results.

It was a desire to cover just these sorts of issues for institutions that led me to write my recent book, Learning Analytics Explained. A growing range of publications may also prove helpful in this regard, including our review of UK and international practice, Learning analytics in higher education. Insight gained as we roll out Jisc’s architecture will no doubt continue to be shared too via this blog.

Budget implications
Tony Sceales presented his business costing tool for learning analytics, which prompted a number of thoughts from participants. One of these was that you will always lose some students regardless of what you put in place to try to prevent this; moreover, there is a cost to intervention, and you do save some costs when students drop out.

Someone else made the point that investment in the institution is required over and above the costs of the software. Well, this is no surprise to anyone who has tried to implement new software at educational institutions – and in the case of learning analytics, it will be essential to help staff understand how to interpret the data and what interventions they should be carrying out that are likely to result in the biggest impact. A way to calculate such costs is included in Tony’s costing tool.
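
The post doesn’t describe the internals of the costing tool, but the arithmetic any such tool has to weigh can be sketched in a few lines: the cost of intervening across all flagged students against the fee income retained from those who would otherwise have left. Every figure below is invented purely for illustration.

```python
# Illustrative break-even arithmetic for a retention intervention.
# All numbers are invented; the actual costing tool is not public.
students_flagged = 400        # students predicted to be at risk per year
intervention_cost = 150.0     # cost per student contacted (staff time etc.)
fee_income_per_year = 9000.0  # income retained per student who stays
success_rate = 0.05           # share of flagged students retained who
                              # would otherwise have dropped out

cost = students_flagged * intervention_cost
benefit = students_flagged * success_rate * fee_income_per_year
print(f"cost £{cost:,.0f}, benefit £{benefit:,.0f}, net £{benefit - cost:,.0f}")
# -> cost £60,000, benefit £180,000, net £120,000
```

The contested number is, of course, the success rate: how many flagged students stay who would otherwise have left, which is exactly what the evidence base discussed above is needed to establish.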

People also wanted to know about ongoing costs for the Jisc solution. These are being worked on at the moment and will be shared soon.  Jisc’s offering will include the Learning Records Warehouse (LRW), the Apereo open source predictive engine and a basic predictive model – as well as Study Goal and Data Explorer. More sophisticated analytics tools which integrate with the LRW will be able to be procured from leading vendors.

Consent
In this discussion we looked at issues around obtaining student consent and complying with the forthcoming EU General Data Protection Regulation, which will need to be implemented by universities. Our suggested way forward is fully explained in our earlier post: Consent for learning analytics: some practical guidance for institutions.

Key points were:

  1. It’s unreasonable to expect students to understand what they’re signing up to by consenting to data collection and interventions on the basis of learning analytics on day 1 of their studies
  2. Using consent as the justification for collecting (most) data is not necessary or appropriate anyway, and seriously restricts the possibilities for its use further down the line
  3. Justify data collection on the basis of the legitimate interests of the organisation, with the exception of sensitive data, where consent must be obtained
  4. Obtain consent for any interventions you wish to offer to students

Interventions
Paul Bailey led this discussion, which looked at issues around integrating learning analytics with existing student support and tutorial processes. Most institutions have a personal tutor system in place but are concerned about the timing and frequency of tutor meetings to provide timely interventions. There is also a potential increased workload for tutors if they receive regular alerts on students at risk.

Student support services offer another important source of interventions to help students at risk. Some institutions are putting in place specific roles e.g. student retention officers who focus on supporting the students most at risk or those identified as likely to benefit most from timely interventions.

The key challenge in any institution seems to be to develop a holistic view of the support and interventions being targeted at individual students and to ensure a coordinated approach from all staff involved. Many institutions have tutor dashboards and/or student services systems but these are rarely joined up. The Jisc learning analytics service offers a centralised place to hold and share information on interventions made with individual students. The Data Explorer tool will provide a reference implementation to record and view interventions on students and allow local and vendor solutions to integrate with the Learning Records Warehouse.
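
As an illustration of what recording an intervention centrally might look like: the LRW holds activity data as xAPI statements, so an intervention record could plausibly take a shape like the sketch below. The actor/verb/object structure is standard xAPI, but every URI, name and value here is hypothetical rather than Jisc’s actual vocabulary.

```python
# Hypothetical xAPI-style statement recording a tutor intervention.
# The actor/verb/object shape follows the xAPI specification; the URIs
# and values are invented for illustration only.
intervention_statement = {
    "actor": {"name": "Personal Tutor", "mbox": "mailto:tutor@example.ac.uk"},
    "verb": {
        "id": "https://example.org/verbs/intervened",
        "display": {"en-GB": "intervened"},
    },
    "object": {
        "objectType": "Agent",               # the student receiving support
        "name": "Student 12345",
        "mbox": "mailto:student12345@example.ac.uk",
    },
    "context": {
        "extensions": {
            "https://example.org/ext/intervention-type": "email follow-up",
            "https://example.org/ext/trigger": "at-risk prediction",
        }
    },
    "timestamp": "2017-04-03T10:15:00Z",
}
```

Holding records like this in one place is what would let a tutor dashboard and a student services system see the same intervention history for a student.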

Jisc Learning Analytics Service
Jisc is now finalising plans for moving from project mode to the provision of a learning analytics service. Michael Webb outlined the plans for this. One participant wanted to know how to request changes and how institutions would be notified about changes. Michael pointed out the difference between fixes and changes. Bugs are fixed as quickly as possible. The data structures will be renewed around Easter, with tools updated accordingly at the start of summer. More substantial changes will take place during the summer when there will be a new major release. Less significant “non-breaking” changes will occur throughout the year. Participating institutions will be informed of the processes for submitting change requests imminently.

Study Goal, Jisc’s student app for learning analytics, is now available in demo mode (versions for Android and iOS) for institutions to try out. One participant asked for the ability for other apps and systems to be integrated with the Learning Records Warehouse (LRW). Attendance monitoring is one such application, and this is currently being trialled with some institutions. Someone pointed out that it is nearly impossible to record all attendance at universities; however, even partial records of attendance (or non-attendance) could be sufficient to flag at-risk students.

 

10th UK Learning Analytics Network meeting, University of Strathclyde, 3rd May 2017

Originally posted on Effective Learning Analytics.

Royal College building, University of Strathclyde

The 10th UK Learning Analytics Network meeting is taking place at the University of Strathclyde in Glasgow on Wednesday 3rd May. Strathclyde has a vibrant learning analytics initiative – see Ainsley Hainey’s blog for further details. We’ll be hearing about that from Brian Green, Deputy Associate Principal, in the afternoon.

Booking form

The main theme of the day is: what next for learning analytics? The day will kick off with a welcome from Professor Helyn Gould and a brief update on Jisc’s learning analytics project. Next up, Patrick Lynch from the University of Hull presents his insights on the promising potential of learning analytics to inform learning design. Colleagues who attended LAK17 in Vancouver, the recent conference showcasing worldwide research in the area, will then share their reflections.

During lunchtime, there’ll be an opportunity to view the latest offerings from some of the leading vendors in this space. After lunch, anyone struggling with an institutional business case for learning analytics can benefit from the experience of entrepreneur Tony Sceales, Jisc’s consultant, who has developed a useful costing tool. We also hope to learn from Jisc’s recently appointed data scientist about what to look out for when analysing data on learners and their activities. Finally, we finish with a panel session where leading vendors will present their visions for where learning analytics is heading next.

You’re advised to book early as there are only 80 places available and the event is likely to be oversubscribed.

Agenda (draft)

10:00 – 16:00, Wed 3rd May 2017
Hosted by the University of Strathclyde

University of Strathclyde
Glasgow

Directions and parking      Campus Map

09:30 – 10:15  Arrival and coffee
10:15 – 10:25  Arrangements for the day & welcome to the University of Strathclyde – Professor Helyn Gould, Deputy Associate Principal, Learning & Teaching
10:25 – 10:40  Update on Jisc’s Effective Learning Analytics project – Michael Webb, Paul Bailey, Rob Wyn Jones
10:40 – 11:25  Informing learning design with learning analytics – Patrick Lynch, University of Hull
11:25 – 12:00  Reflections on LAK17, the recent Learning Analytics & Knowledge conference – Michael Webb, Lee Baylis, Ainsley Hainey, Niall Sclater
12:00 – 13:00  Lunch, networking and product demos (confirmations so far from DTP Solutionpath, Hobsons and Jisc)
13:00 – 13:40  The University of Strathclyde’s Learning Analytics initiative – Brian Green, Deputy Associate Principal, Learning & Teaching
13:40 – 14:10  Developing an institutional business case for learning analytics – Tony Sceales, Jisc
14:10 – 14:45  Insights for learning analytics from data science – Jisc data scientist (TBC)
14:45 – 15:00  Tea / coffee
15:00 – 15:55  Where next for learning analytics? – panel session with Ben Stein, Hobsons; Richard Gascoigne, DTP Solutionpath; Michael Webb, Jisc & others TBC
15:55 – 16:00  Farewell

College Analytics Lab – Enabling Cross Sector Collaboration – The “Team Manchester” Story

Originally posted on Jisc Innovation in Further Education and Skills.

Co-authored by Martin Hall and members of the Manchester College Analytics Lab team.

Colleges across the UK are facing ever more complex demands to make their case for funding and other forms of reporting. At the same time, government policies both here and across Europe and North America acknowledge the need for a new deal for building technical education, training and the skills that are essential for re-securing economic growth. How can the huge and diverse sources of data now available be better used to address and inform key policy decisions, in ways that meet both the requirements of national and regional agencies and the local nuances and concerns of colleges serving immediate communities?

The Jisc College Analytics Lab digital modelling environment provides a means to address complex practice and policy questions using highly diverse sources of data. By engaging both with colleges, with their command of the details of learner data, and with regional and national planning agencies, which need to aggregate intelligence across wider areas in order to generate policy recommendations, we hope to make a step change in the quality of provision in this key sector of post-compulsory education.

From January to March this year, we ran a three-month pilot to explore the possibilities with the involvement of key organisations in the Greater Manchester City Region.

Members of the Manchester team:

  • Christian Spence, Subrahmaniam Krishnan Harihara, Alex Davies – Greater Manchester Chambers of Commerce
  • James Mortlock – Salford City College
  • Britta Berger-Voigt – Greater Manchester Combined Authority
  • Martin Hall, Shri Footring – Jisc; Phillip Lowe – The Information Lab

Supporting team members:

  • Derek O’Toole, Matthew Taylor – Hopwood Hall College
  • Rob Wyn Jones, Sue Attewell – Jisc

Business Questions

We explored two separate areas of need.

  1. How can information about the content of FE programmes be made accessible, in a meaningful way, to both learners and employers? How can we describe the relationship between college programmes, the attainment of competencies and formal qualifications more accurately?
  2. What do you do, as a college leader, when you need to be part of the city region plan for accelerating apprenticeship starts, but there is a severe shortage of local firms to take apprentices on? How do you reconcile learners’ dependency on campus proximity and affordable transport routes with the rationalisation of provision across the city region?

To answer these questions we used internal college data and, crucially, we were able to “superimpose” detailed employment and other local data shared by GMCC and the New Economy.

Communicating College Programme Content to Employers and Learners

We know that employers value the practical, competence-based aspects of a qualification when recruiting. For some qualifications, such as NVQs, the competence element is clear. However, other qualifications often also include significant competence-based elements which are difficult to communicate and, because of the sheer number available, are not well understood by employers.

We used college data on the module content of qualifications and combined it with detailed data gathered by the college on what percentage of each module was competency-based.

The visualisation below clearly highlights which courses have a larger competency element than others, as well as showing that a simplistic approach does not give an accurate indication of the competence element of courses. This allows employers to readily identify relevant courses, and learners to make informed choices.
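
As a rough sketch of the aggregation behind that visualisation (the pilot itself used Tableau and Alteryx; pandas and all the data below are assumptions for illustration), weighting each module’s competency percentage by its size gives a per-qualification figure that a simple average of module percentages would distort:

```python
# Sketch of the aggregation behind the competency visualisation, in pandas.
# The qualifications, hours and percentages are invented for illustration.
import pandas as pd

modules = pd.DataFrame({
    "qualification": ["Qualification A", "Qualification A", "NVQ B"],
    "module_hours": [60, 40, 120],
    "pct_competency_based": [25, 80, 100],  # gathered per module by the college
})

# Weight each module's competency share by its size, rather than taking a
# simple average of the module percentages.
modules["competency_hours"] = (
    modules["module_hours"] * modules["pct_competency_based"] / 100
)
summary = (
    modules.groupby("qualification")
    .apply(lambda g: 100 * g["competency_hours"].sum() / g["module_hours"].sum())
    .rename("competency_pct")
)
print(summary)  # Qualification A: 47.0, NVQ B: 100.0
```

For Qualification A the weighted figure is 47%, where a naive average of its two module percentages would report 52.5% – the kind of inaccuracy a simplistic approach produces.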

Competency content of qualifications

Student travel between home, college and employer locations

hopwood hall heatmap

Heatmap showing employer locations in the area. Students’ home locations can be superimposed.

A user story from the college point of view:

“As the Director of Curriculum Development, when planning next year’s offer, I would like to have an accurate picture of which employers are within easy reach of college students, with vacancies at the appropriate level, so that I can take an informed decision about which curriculum areas to grow / invest in.”

A student point of view:

“How far would I have to travel to get to work / college? How long will the journey take? (How much will it cost?)”

“As a prospective apprentice, when applying for a job / college place, I want to ensure that I am able to travel from home to the location in a reasonable time frame”.

The data used for these visualisations include:

  • Student postcode (or outcode) – each postcode was turned into a map coordinate so that it could be used to find the nearest bus stop location (see the sketch after this list)
  • Public transport routes – the local bus routes needed to be analysed to see which routes passed by the college
  • Public transport timetables – the timetables for all stops along the routes going to the college needed to be analysed to calculate how long each trip would take for buses at each stop along the route
  • Employer locations – nearby projects were mapped to see how many viable employers were in the local area
  • Employer vacancy information, where available
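
The post doesn’t say how these steps were implemented (the labs used Tableau and Alteryx); as a rough illustration of the first two, the sketch below finds the nearest stop to a postcode centroid using the haversine formula. All coordinates and stop names are invented.

```python
# Illustrative sketch: find the nearest bus stop to a student's postcode
# centroid using great-circle (haversine) distance. Data is invented.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

bus_stops = {                      # hypothetical stop -> (lat, lon)
    "Stop A": (53.6136, -2.1561),
    "Stop B": (53.5547, -2.1976),
}

student_home = (53.5900, -2.1700)  # centroid of a hypothetical postcode
nearest = min(bus_stops, key=lambda s: haversine_km(*student_home, *bus_stops[s]))
print(nearest, f"{haversine_km(*student_home, *bus_stops[nearest]):.1f} km")
```

Journey times would then come from the timetables for the routes serving that stop, as described in the list above.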
construction company locations

Location of construction companies with predicted skills shortages for projects in Greater Manchester

Potential for the Future

This pilot has demonstrated that by combining colleges’ internal data with local and national data sets it is possible to create compelling visualisations that enable data-informed decision-making relating to the construction sector.

  • By clearly showing the competence element of courses offered by the college, employers are better able to identify which courses they should favour, and learners are better able to choose courses to enrol on.
  • By overlaying national employer data and local project data the data dashboards can make it easier for colleges to find appropriate apprenticeship partners that aren’t necessarily locally based.
  • By highlighting challenges that learners face, such as transport, local area planning can be better informed.

The College Analytics Lab enables a wide range of organisations who need to work together to do so meaningfully through secure sharing of data, with dashboards that promote a shared understanding of complex issues.

 

Data visualisation and digital capability

Originally posted on Inspiring learning.

data visualisation

In a world where data is in abundant supply, the ability to understand, visualise and communicate the messages in that data is becoming more and more important. It’s also an important element of Jisc’s Digital Capabilities Framework.

This morning I attended a webinar delivered by Andy Kirk on the 7 Hats of Data Visualisation which was a fascinating way to spend an hour. He’s made his slides available openly via his excellent blog.

I’ve been on one of Andy’s workshops before back in the day and what’s great about his approach is how he takes a step back from the technicalities of data visualisation to focus on fundamental principles. It’s something that has relevance for loads of different people that we support in Jisc, for example:

  • People working with dashboards and data analytics
  • Researchers and support staff
  • Teaching staff and students
  • Administrators

The 7 hats


Someone wearing a data viz hat, contemplating normal distribution

The thrust of Andy’s argument was that in order to create effective data visualisation you need to take a leaf out of Edward de Bono‘s book and wear 7 different “hats” relating to the different responsibilities and capabilities involved in a data viz project. In a nutshell:

  • Director – determining the purpose of the visualisation and understanding things like the social, technical and physical context of the audience.
  • Communicator – the audience advocate. Determining what the audience’s needs are (and what they don’t need!) in understanding the visualisation.
  • Journalist – the person that sniffs out the “story” behind the data. Looks for patterns and exceptions.
  • Data analyst – responsible for collecting, cleaning and being familiar with the data set.
  • Scientist – the role that considers the integrity of the project, setting the benchmark for trustworthiness and rigour in the use of data.
  • Designer – responsible for critical design-related decision-making. Using their visual vocabulary to choose the best way to communicate with the audience.
  • Technologist – constructs the solution, balancing technical ability with discernment.

(Andy wrote a blog post based around 8 “hats” back in 2012 and this presentation was an update.)

These don’t all have to be worn by the same person. It’s actually important to know which “hats” need to be worn by others if you don’t have the skills or experience needed.

Data and digital capability

For me, Andy’s approach is a neat summary of the thinking behind the digital capability framework. To thrive in a digital world it is not sufficient to have technical skill; you need to understand the context within which the technology operates and the processes needed to make best use of it.

From an educational point of view, we need to help learners develop critical discernment of the data that is presented to them, either to appraise its quality or to understand the message it is trying to communicate. A good illustration of this was the Washington Post article analysing the Donald Trump election campaign’s use of data, which simultaneously skewed the data to make political points and demonstrated a certain level of data illiteracy, in that it often actually undermined some of those same points!

This isn’t about criticising one political view over another, though. My main take-home point from the webinar was that visualising data is never objective. It is trying to tell a story or help people to form their own interpretation. Whether we are creating or engaging, this is something we need to be awake to.

Jisc has some archived advice and guidance on creating effective data visualisation.

 


Collaborative Working Through Data Sharing – The College Analytics Lab “Team Wales” Story

Originally posted on Jisc Innovation in Further Education and Skills.

When a group of colleges form a strategic partnership, they typically work together to share resources, expertise and a collaborative approach to planning.

Could sharing data help the partnership to become more effective? For example, could sharing information about applications, enrolments, outcomes and destinations help the colleges to work together to provide a more structured, joined-up offer to young people in the area and secure the economic development of the region?

This post is co-authored by members of a forward-looking partnership from South East Wales who set out to explore just that, working as part of the Jisc College Analytics Lab project.

Project Structure

This short, three-month pilot project started with a face-to-face workshop to establish which business questions and data sources we would explore first. Business requirements were expressed as user stories.  Following agile principles, we then worked in time-bound ‘sprints’, with regular online meetings to review and reflect on progress and adjust project priorities if required.

High Level User Story (‘Epic’)

As a director of learner services and support, when monitoring application and recruitment for 2017 / 18, I want to target recruitment activity so that I can increase participation / recruitment and understand conversion rates by geographical area and subject type.

Data Sources

Internal college data from the two participating colleges

  • Institution / college statutory returns data (LLWR – Welsh ILR)
  • Internal student application, acceptance and enrolment data (historical)
  • Internal student application data (current live / in-year recruitment)

External

  • School data – Populations and Schools / Pupil Locations (PLASC / Careers Wales)
  • Schools achievement and performance (Stats Wales)
  • NOMIS – employment and industry demographics
  • EduBase – school locations

Data preparation and ETL

  • Coding and grouping of courses / students to align with industry standard codes
  • Matching pupil census data to college application data
  • Matching school location data to college application data

Project Sprints

We broke the high level user story down into manageable chunks to work on each month, summarised below.

Sprint 1 – January

Priority questions for this sprint:
Which schools are we receiving applications from? What are the pupil numbers at those schools and how many are applying?

school locations and pupil age

Welsh School Locations and Pupils by Age Group

“Creating a visual representation of the geographic distribution of source schools to the colleges required some advanced data blending techniques. The data provided by colleges highlighted which schools learners came from; however, the location of these schools wasn’t included. Another piece of useful information was the total number of pupils at each school by age group. The difficulty here was that, to join each of these data sets together, the school name needed to be an exact match – however, only 30% of the school names were an exact match.
Fuzzy matching is a more advanced way to match data sets as it looks at individual words and can match based on different criteria (e.g. the phonetic sound of each word). This fuzzy matching approach improved the matched schools to roughly 85% and enabled the joining of both location data and pupil census data to the colleges’ application data.” – Phillip Lowe, Tableau and Alteryx consultant, The Information Lab.
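
As a toy illustration of the idea (the actual work was done with Alteryx’s fuzzy-match tools; this standard-library Python sketch is only an assumption of the general approach, and the school names are invented):

```python
# Toy fuzzy matching of school names with Python's standard library.
# Real workflows add normalisation and phonetic comparison on top of this.
import difflib

census_schools = ["Ysgol Gyfun Cwm Rhymni", "St Alban's R.C. High School",
                  "Lewis School Pengam"]
application_school = "St Albans RC High"

# get_close_matches returns the best candidates above a similarity cutoff;
# lowering the cutoff trades precision for recall.
match = difflib.get_close_matches(application_school, census_schools,
                                  n=1, cutoff=0.6)
print(match)  # expected: ["St Alban's R.C. High School"]
```

Layering refinements like these over exact matching is the kind of step that lifted the team’s match rate from 30% to roughly 85%.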

Sprint 2 – February

Priority questions for this sprint:
How are enrolled learners distributed across different subject areas? Is there a particular subject area that seems more popular in one college vs. others? Are there some schools in which there are more learners going to one college over another? What are the trends in course popularity?

course comparison

Course comparison dashboard

Understand enrolment trends by course

Sprint 3 – March

Priority questions for this sprint:
Can we predict enrolment trends? Can we identify areas where recruitment trends from specific schools are increasing? Can we understand enrolment trends by course? How are applications from certain schools comparing to last year’s applications?

Recruitment trends

Identify areas where recruitment trends from specific schools are increasing

Application target dashboard

How are applications from certain schools comparing to last year’s applications?

Outstanding Benefits, Issues and Conclusions – Ivan Gregory

  • Having access to Tableau expertise, along with support (Alteryx) for ETL of external / additional data sources, was invaluable;
  • The ability to benchmark against previous years’ recruitment (at a point in time) for geographically co-located campuses / colleges was useful;
  • There is potential to look further at student attendance / recruitment, mapping to public transport routes / maps, to influence curriculum plans / space planning;
  • The availability of in-year or near-live data for internal and external data sources would be extremely powerful and enable business decisions (focus or re-focus of recruitment effort or resource) to be made “just in time”;
  • There is potential to expand to an all-Wales BI and labs project (Colleges Wales);
  • We would like to work further with WAG to obtain more detailed regional data, and also automated data provision above ILR from a central source.

Forty Four Years Ago

Originally posted on e-Learning Stuff.

using a mobile phone and a laptop

The first handheld mobile phone call was made forty-four years ago, on the 3rd of April 1973. There had been mobile phones before, in cars and lorries, but forty-four years ago saw the first phone call from a handheld cellular mobile phone. Well, you also needed to carry a bag (for the battery).

I suspect most (if not all) of the people reading this blog post have a mobile phone, or if they don’t, they did at one time.

It’s interesting that a technology which has reached such a milestone is still seen by many teachers and practitioners as disruptive, and as something that should be banned in classrooms.

Six years after this first experiment, we saw on Tomorrow’s World how mobile phones were being introduced to the UK, with some barriers and problems coming from the Post Office (whose telephony arm was eventually spun off and sold as British Telecom).

Even then, you could see the usefulness of the device as a way of making phone calls on the move, and whilst mobile. What was less apparent was the potential of the device as a mobile portable computer; even in 1979, personal computing was very much in its infancy. Even the first few pocket and portable computers didn’t have connectivity.

It is the smartphone’s connection to the internet and the web which makes it a very different device to those early handheld mobile phones. The mobile phone today is a transformative and enabling device, and in many ways a different concept to the one we saw back in 1973. It is much more than just a voice communication device. I have done this exercise at many mobile learning workshops: I ask the participants to list all the different things they do on their phones. Interestingly, making phone calls is either not mentioned or very low on the list. The sorts of things that people today do on their phone include (and are certainly not limited to) texting, social networking, photography, film making, audio recording, playing games, reading books, looking at magazines, listening to music and other recordings, watching video, streaming video, doing quizzes, creating content, and so much more…

It is this functionality that makes the mobile phone so much more than what was first seen back in 1973, and it is this functionality that teachers see as disruptive and challenging to manage.

using a mobile phone

The reality is that learners don’t use mobile phones in classrooms in the way they were envisaged – for making actual phone calls! The problem many practitioners have with mobile phones is not with the phones themselves, nor with learners making phone calls in lessons; the problem is a very different issue.

Banning mobile phones, or asking students to turn them off, is not a real solution. At most conferences and events, when delegates are asked to turn off their phones, most will switch them to silent mode – so much so that conference organisers now seem to ask people to turn them to silent rather than off. I am sure many learners in a classroom situation will do something similar.

The question you have to ask is: why are learners switching off in lessons and using their mobile phones? Yes, there will be the odd learner who is addicted to their phone and can’t help using it, but these learners are a very small minority. If this were the case for all learners, then in every lesson all learners would be disengaged and on their phones – and that doesn’t happen.

Rather than blame the learners, the key is to think about why they are disengaging in your lessons. Why are they switching off from learning and switching on their phones?

Another possible solution is to embrace the use of the mobile phone and make it part of the learning process, as well as making the learning engaging and interesting. The very functionality that can be so disruptive or attractive to learners can also be effective in supporting learning and assessment.

Engaging doesn’t always mean interactive and doesn’t mean that it can’t be hard or difficult. Thinking about challenging problems is an effective learning process.

using a mobile phone

The mobile phone is forty-four years old. In many ways its disruptive nature is new, but only because it has evolved into something very different from a device used to make mobile phone calls.

A version of this article was first published in 2013, when the handheld mobile phone call was forty years old.

Trends Unpacked: Organizational Challenges and Learning Analytics (Part 1)

Originally posted on Effective Learning Analytics.

Lindsay Pineda

This is another guest post from Lindsay Pineda, Senior Implementation Consultant, Unicon, Inc., with Patrick Lynch, Technology Enhanced Learning Advisor, University of Hull.

Earlier this month, I posted an article about the learning analytics readiness trends observed over the last year as I traveled to several UK HE (higher education) institutions with my colleague, Patrick Lynch from the University of Hull. Patrick and I are co-authoring a series of articles titled “Trends Unpacked.” In this series, we will expand upon some of the trends discussed in the article, “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns“.

Patrick’s knowledge is two-fold, as he not only works within HE, but he is also a leader at his university regarding learning analytics. Our combined expertise brings the added value of having two different perspectives when visiting institutions.

This article characterizes common concerns expressed by individuals we spoke with at the institutions. We will also provide potential solutions for these concerns, based on institutional feedback.

We are focusing on the first two aspects of organizational challenges and trends:

  • Level of change management comfort/ willingness
  • Organizational support for analytics

In future posts, we’ll cover additional organizational challenges and trends.

Level of Change Management Comfort/ Willingness

As highlighted in the article “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns”, some of the challenges and trends we observed were related to change management:

  • Staff expressed concern in areas including level of comfort and willingness to accept change. This included job roles, additional responsibilities, and changes to current practices
  • Institutional leaders sometimes did not have a clear understanding of the level of effort required for a larger scale initiative such as learning analytics
  • Academic and teaching staff expressed resistance around prescriptive allocations of their time related to teaching and advising

The following examples illustrate the types of change management challenges most often expressed at the institutions:

  • “Change fatigue” – This particular feeling was expressed by many at the institutions we visited. At one particular institution, we were told, “People are used to things changing all the time. Resistance is futile.” While this is somewhat comical, it is unfortunately very true. Some viewed this attitude as a positive thing because individuals within the institution are not “scared” by change. However, the fact remains that too many changes happening at the same time, or in short succession, can lead to wariness about what’s to come. Often, those charged with acting on the changes are not the ones actually mandating the changes. As one institution told us, “They (the changes) come from the third floor where the executives live”.
  • The “something else for me to do” syndrome – At other institutions, we experienced resistance from a group of academics who voiced significant frustration. When we asked the question, “How do you think learning analytics will affect your job duties?” we were met with heavy sighs. One participant stated, “We only have a certain number of hours allotted for teaching, advising, etc. A learning analytics implementation would add to workload and that would need to be planned for.”

Institutions shared with us their ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Clarifying job roles and duties – Engaging in this type of exercise is advantageous for any institution. We noticed that many institutions had old, outdated job descriptions and duties. Clarifying these job descriptions and duties can be a large undertaking, but we saw a lot of excitement, passion, and willingness from staff members to help leadership with these types of assessments. In the words of one participant, “It helps with job role, task clarification, and professional growth for those who are motivated to be involved.” Another approach to managing change fatigue is to reduce the overall number of changes people need to deal with by packaging change into larger programs, whereby learning analytics isn’t a change but rather a part of other initiatives. In general, we recommended institutions look for ways to incorporate learning analytics into existing initiatives for the best chance of success.
  • Involve others in the changes – Every institution we met with expressed a common challenge regarding implementing change. Those who would actually do the new tasks had no involvement in the planning phases. A participant at one institution stated, “Perhaps selecting a member from each department that will be affected by the change to sit on a steering committee or change management panel would be a way to involve us.” Delegation is the key takeaway here.
  • Communicate the changes before they happen – At many institutions, we were advised that changes were often relayed via email and often very shortly before they were to be implemented – some even cited only a week’s notice. There were even examples of particular groups who were not told about changes, but were still held accountable for them. One individual told us, “I never know when a change is going to happen or if I’m even doing my job right. Most of the time, I’m in a constant state of anxiety about what I might be doing wrong.” Employees who feel this on a daily basis are not able to focus on the real reason they are there: to help students succeed. Involving staff from the start of the initiative, and having their input upfront, helps establish the message and ensures that it is well understood among the “target audience” (e.g., advisors, academics, students, etc.). This allows the message to be tailored to the audience by someone from the particular group, which helps with understanding and buy-in. The same can be said for the actual training to accommodate the changes. Most institutions did not have a formalized training plan for changes that were rolling out to staff. “Well thought out training is not something we excel at,” said one institutional leader. Recognition and awareness are a great first step, and help fuel the drive to communicate with and train staff on a more regular basis.
  • Institutional approaches to project management – Having established project management processes helps the institution identify and manage what work is required to reach the end goal. Many of the challenges we identified could be addressed through well thought out and well-implemented institutional project management approaches. It isn’t just learning analytics initiatives that institutions struggle with; establishing project management processes is also a challenge. Creating a project management structure that can be applied to other initiatives helps set up the institution for success.

Organizational Support for Analytics

Organizational support is another challenge we observed while visiting institutions.

  • Staff (inclusive of academic and university/ college individuals) shared that they were particularly concerned with the impact on their current job requirements, roles, and workloads
  • The message we received from staff at all levels was that a “top down” directive from leadership is necessary to properly implement learning analytics efforts

The following examples illustrate the types of leadership support challenges most often expressed:

  • “I’m not doing anything until I’m told by leadership to do so” – This is a direct quote from one member of an IT group at an institution. He advised that he and his colleagues would not engage in anything new unless leadership told them it was important. His statement was made regardless of his personal opinion on whether or not learning analytics would be of benefit to the institution. Looking to leadership for the “OK” is very important at institutions, and not having a clear direction can make it difficult to navigate the leadership’s priorities.
  • “We already have so many things we are asked to do in a day, how can we possibly manage one more?” – Within all departments, there was concern about how to manage something else; another tool, another “thing.” One participant told us, “I have to spend 15 minutes searching for something and then [I] forget what I was looking for.” This was more common than one would like to think.

Institutions shared with us some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • “Single source of truth” – This can be both a technical and organizational solution. Institutions expressed that having one place to go for most, if not all, information would be the most effective way to mitigate the “one more thing to manage” concern. This could be a centralized data warehouse, a Learning Records Store, or a centralized place to house policies and procedures. For institutions, having one place to go to get the information needed is universally said to save time, energy, and effort. An individual at one institution said, “If we had one system it would be quicker, more efficient, and easier to find information and then you’d have more time to actually help students, rather than finding the information itself.”
  • Leadership buy-in is key – Those within the institution look to leadership to help guide overall strategic direction and vision. At some institutions we visited, the leadership was completely on board with the idea of implementing some sort of learning analytics solution. Others still needed convincing. One leader told us, “I see some of the overall benefits of learning analytics, but we have so many departments throughout the university that I’m not sure it would even work here.” When leadership buy-in is unclear upfront, or is not shared across the whole leadership team, it transmits a sense that the initiative is not a priority and does not need attention. Working with leadership to demonstrate the benefits of learning analytics, and how it can impact the “bottom line,” is something that is needed at the beginning and all throughout the initiative. As one senior leader told us, “I can understand the reluctance of my peers, as I know this will be a lot of work, take some time, and will result in some changes, but what I can’t understand is why anyone wouldn’t want to take that chance to help better our students and our institution as a whole. We have a duty to provide the best service possible and if this helps us do it, then we owe it to everyone to at least try it.”

We learned quite a bit from our journey across the UK, and we are excited to continue sharing our findings. Please be on the lookout for another article in April regarding what senior leadership needs to know about learning analytics, including the importance of information sharing, expectation setting, and collaborative thinking (a further extension of the discussion above).

Here’s to continuous growth and improvement!