Journal of Educational Innovation, Partnership and Change
We are delighted to announce the launch of the special issue of the Journal of Educational Innovation, Partnership and Change: Vol 3, No 1 (2017): Reacting to the ‘Hard to Reach’ through Student Engagement Initiatives.
This special issue of The Journal of Educational Innovation, Partnership and Change is slightly different from previous issues in that it focuses on a particular programme, known as ‘REACT’, funded by the Higher Education Funding Council for England (HEFCE). ‘Realising Engagement Through Active Culture Transformation’, or REACT, looks closely at the engagement of so-called ‘hard-to-reach’ students in Higher Education, and this issue of the journal provides a kaleidoscope of views and standpoints, starting points and conclusions, through both qualitative descriptions and reporting of quantitative data. It is not a ‘recipe book’ for ‘student engagement’. There is no clear-cut, neat picture of what ‘student engagement’ is, nor of what characterises a ‘hard-to-reach’ student. However, overall, it gives a rich picture of the many complexities of engaging with students who are less likely to engage, and of the many ways in which universities are working to understand the issues and consequences and to engage all students more effectively.
In all, forty-four contributions make up this issue, in the form of research articles, case studies and opinion pieces. Much has already been written on the topic of ‘student engagement’, but the importance of this particular set of pieces is that they narrow the focus of ‘student engagement’ by concentrating specifically on ‘hard-to-reach’ students. This does not mean narrowing or ‘closing down’ any aspect of discussion on the topic, but it provides a particular lens with the potential to inform wider debates.
Please do take a moment of your time to have a look at the journal and read some of the exciting work from the institutions across the programme.
Along with new question sets for students we have fully updated the guidance for organisations using the tracker in 2017-18. From today you can access the updated Guides and a full walk-through of the tracker process on our Guidance page.
Even if you have already committed to using the tracker there is useful information in the Guide: Planning to use the tracker. This covers start-up issues such as bringing key stakeholders together, allocating time and resources, and creating a communications plan. New for this year there is also advice on providing the organisational information that is now asked for in the confirmation form.
The first step after receiving your master surveys is to copy and customise them. We strongly recommend that you discuss this with key stakeholders before you start. Customisation allows you to make the tracker really relevant to your students and your organisational situation. Our Guide: Customising your tracker makes the whole process simple.
From this month, Jisc has taken over management of BOS, the system in which trackers are managed and delivered. We’ve updated the Guide: How to use the tracker to cover all the BOS basics, and this year we’ve included more advanced functionality such as uploading student email addresses to manage participation.
Also on the guidance page are details of who you can turn to for help, and updated advice on data protection. Trackers will be available to customise from 11th October, so why not browse some of the guides this week and be ready to hit the ground running?
In September a call for submissions appeared on the SEDA list that was a little different. To quote Kieran Fenby-Hulse:
The Doctoral College and Centre for Research Capability and Development will be holding a one-day Conference on Friday 19th January 2018. Our Call for Something is currently open and closes on 12th November.
A call for something – something different? Some of us had a chat, and something was created.
If you’ve been following the 2017-18 Tracker you’ll know that we’ve released a version of the new question set (HE) for you to peek at. The FE questions use ‘college’ instead of ‘university’ but are otherwise the same. The questions for ACL, skills-based and online learners have some minor differences in the wording and the options, particularly to account for the fact that many learners will not be attending on campus. But they are as similar as possible to the other question sets to support valid comparisons across sectors and modes of learning.
If you’ve signed up for the tracker this year you’ll soon have access to the master surveys you requested so you can copy and customise them for your students. If you haven’t, we hope that a review of the questions will convince you that the answers will be useful. Guidance is also being completely updated to help you through the process and give you more ideas for engaging students. You can still sign up for this year’s tracker here.
If you used the tracker last year you will notice a few changes to the question set, and the rest of this blog post describes what they are and why we’ve made them. If you are interested in the tracker as a research tool or if you (or your organisation) want evidence of the research that has gone into its design, this post is also for you. Please comment on our thinking at the end – we are always keen to have this conversation!
Structure and navigation
We’ve grouped the questions slightly differently this year for a better flow. Each section is colour-coded so that students can see ‘where they are’ in the question set and broadly what their answers are about.
You and your digital
The first section includes the fixed demographic questions (these can’t be customised) and a question about use of personal devices for learning. This has been split off from the question about institutional devices for learning, as last year we found that BOS displayed the two results in a confusing way. The question about stage of course can’t be customised so that we can look for any course-stage effects across the data. Please note that the options here are different for the different sectors.
We’ve kept the question ‘In your own learning time, how often do you use digital tools or apps to…’ (covering various activities), plus the free-text question nominating a favourite app for study. These were popular with organisations and informative to us. However, we’ve dropped the self-assessment questions about basic digital skills. Students’ judgement of their own skills was so overwhelmingly positive that responses did not provide any useful differentiation, and it is not obvious in any case how institutions should interpret the results. We think the student Discovery Tool will do the job much better, at a greater level of detail, and with more actionable feedback for individual users (more information from this link).
The questions on assistive technology, very slightly modified, appear in this section too.
Digital at your uni (college)
This is where we now ask about access to digital services (Wifi, course materials, e-books and -journals, file storage) and about students’ use of institutional devices.
Questions about university/college provision have been grouped together and a new prompt has been added: ‘The university/college supports me to use my own digital devices’. Like last year we ask who students turn to for help with their digital skills. We’ve kept the opportunity for students to tell universities/colleges ‘What one thing should we DO’ and ‘What one thing should we NOT do‘ to support their digital learning experience.
Finally, in this section we ask the first of two new questions about satisfaction: ‘Overall, how would you rate the quality of this university’s digital provision?’ (see below). The second question, further on in the survey, asks ‘Overall, how would you rate the quality of this university’s digital teaching and learning?’
Both questions use the seven-point ‘System Usability Scale’, which has been shown to pick up relatively subtle changes over time and is supported by recent research. We’re particularly excited to introduce these items to the tracker. We hope they will stand as two key performance indicators that can be used to assess the impact of different organisational initiatives over time (see our recent blog post on the organisational data we are now collecting).
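As a rough illustration of how a seven-point satisfaction item like this could be turned into a key performance indicator and compared across years, here is a minimal sketch in Python. The function name and the sample ratings are hypothetical; the tracker itself delivers and reports results within BOS.

```python
from statistics import mean

def digital_provision_kpi(ratings):
    """Average a set of seven-point scale responses into a single score.

    ratings: iterable of ints, each between 1 and 7.
    Returns the mean rating rounded to two decimal places.
    """
    ratings = list(ratings)
    if any(r < 1 or r > 7 for r in ratings):
        raise ValueError("ratings must be on the 1-7 scale")
    return round(mean(ratings), 2)

# Hypothetical responses from two survey years, compared over time
print(digital_provision_kpi([5, 6, 4, 7, 5]))  # earlier cohort
print(digital_provision_kpi([6, 6, 5, 7, 6]))  # later cohort
```

Tracking the same aggregate measure year on year is what would let an organisation assess whether a given initiative shifted the student experience.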
Digital on your course
This section starts with a question about the regularity of different course activities, which last year’s pilot sites found informative and which gave us some great headlines for the overall report. We have slightly changed the order and wording to improve reliability of interpretation. After this comes the question set about the virtual learning environment that was optional last year but so widely used that we now recognise it is core to the digital learning experience. You’ll still be able to add in the name of your own VLE, as we realise this is how most students will know it.
The following two sets of questions bring in some completely new issues. These are based on the same research we carried out to inform the design of the second sign-up form.
Teaching spaces (‘Teaching spaces are well designed for the technologies we use’). There is plenty of evidence that this matters to the overall digital learning experience, and it will be interesting to compare the answers of students with the judgements of staff.
Course software (‘The software used on my course is industry standard and up to date’). We find that this is important to many students and again we are asking organisational leads to assess the quality of course provision.
Digital skills (‘I have regular opportunities to review and update my digital skills’; and ‘Before I started my course I was told what digital skills I would need‘). These are based on research findings about students’ readiness to learn, digital fluency, and the reinforcement of digital skills through regular, meaningful practice.
The second overall satisfaction measure appears at the bottom of this page.
Attitude to digital learning
This page includes two sets of positive and negative statements about digital learning, identical to last year’s pilot because of the very strong validity of the response data and the popularity of the questions. There are also two new questions:
The question about preferences for individual or group work was trialled with online learners last year but this year makes it into the main question set. The second question was included because the query ‘what one thing should we NOT do’ last year elicited many similar answers, along the lines of ‘(don’t) increase the amount of digital learning (relative to f2f)’. A smaller but important number of learners responded ‘(don’t) take away the digital services we have’. Overall there was much higher agreement with prompts about relying on digital learning than about the enjoyment of digital learning, and we want to help organisations untangle what learners really feel about the amount of digital learning they do.
There is now just one page of customisable questions and more freedom to add/substitute questions of your own design. We’ve included a recommended question type if you are less adventurous or if you just want to keep the user experience smooth. New guidance on customisation will be available very shortly from the guidance page. This page is also where you can use the option to add a hidden question and upload student email addresses, triggering separate emails to them for survey completion. More about this in my next post.
In November my colleagues and I will be delivering the next iteration of the Jisc Digital Leaders course, the fifth, and we also have dates in the diary for January and February. One of the elements of the course is change management; we use examples of digital, but the key is the change.
This week I picked up on a paper by Andrew Balmford, Lizzy Cole and Chris Sandbrook from Cambridge University and Brendan Fisher from the University of Vermont. “The environmental footprints of conservationists, economists and medics compared” does exactly what the title says. I’d suggest reading it, but it’s behind a paywall, so I haven’t linked to it.
The narrative, in what is essentially a critique of people who should know better, is sensitively handled, and from the conclusions the big takeaway for me was:
Increased exposure to information does not lead to behavioural change
Even though conservationists know what is bad for the environment – for example, the links between climate change and meat production and consumption – their behaviours don’t reflect their knowledge, in much the same way that smokers know that smoking is bad for their health. What does that mean for those of us who are trying to create change in organisations? What are the things that we can do that will lead to behavioural changes?
There are lots of papers, frameworks and models out there, but they can basically be distilled down to several simple ideas.
Adoption of the behaviour needs to be easy
When I was at TechDis (a disability advice service) one of the early accessibility lessons I learned was that colour and font are important for a range of needs. Getting IT departments to set the default font of the “normal template” for new documents to a sans serif font (such as Arial) is an easy step towards accessible documents – as long as you communicate it to staff.
Reward the change
It doesn’t have to be financial, but some sort of recognition of when the new behaviour is adopted should be used.
Use the power of other people
Make it a socially desirable change, encourage peer sharing of the behavioural change, or embed the change in commitments to other people.
Think about time
Time is our most valuable asset; it is the one thing we can’t make more of. Demonstrating how the change can save time, or free up time for something else, will make adoption easier. Also think about when you communicate the change: people are receptive to different things at different times.
Model the change
There’s an often-told story (and it may be apocryphal) about Mahatma Gandhi. A mother is so worried about her son’s health and his excess consumption of sugar that she decides to take him to see Gandhi, his hero. In the hot sun she walked for many miles, and when she got there she explained her worry to Gandhi. She asked him to tell her son to stop the behaviour, to eat less sugar.
Gandhi refused to tell the boy to change his eating habits, but told the mother to return with the boy in two weeks. Perplexed, the mother left.
When two weeks had passed the mother walked many miles again, but this time Gandhi looked at the boy and told him that for the sake of his health he should eat less sugar. Hearing this from his hero, the boy committed to change.
Before she left, the mother asked Gandhi, “Why did you make me take the journey twice? Could you not have told him on my first visit?”
Gandhi replied, “I needed those two weeks so that I could cut back on my sugar.”
Modelling the change is important; it lends credibility to your reasoning. Even if the change is not part of your role, you need to demonstrate your commitment to it and let it affect you personally. Conservationists telling us to drive less and eat less meat, whilst driving to a burger bar, will have less credibility than the vegetarian cyclist. The staff developer running a workshop about the VLE who never uses it in their own practice will not be able to empathise with the staff who need to, or have been told to. “Go do Twitter” is something staff have been told in terms of engaging more widely with their subject – but have they been told by people who are effective in their own social media practice?
These are some of the things that we discuss on the leaders course in much more depth, looking at the barriers and enablers to change, the tacit assumptions of our organisations, and looking for ways to model the changes we want to see.
Some of you have been asking about the changes to liability in the new Service Agreement, so I thought it was worth a blog post to highlight the main factors we had to bear in mind.
The new Service Agreement for Institutions marks the transition of Learning Analytics from a relatively small-scale R&D project to a national Jisc service with many customers. It also spans a change in data protection legislation from the Data Protection Act to the new General Data Protection Regulation (GDPR), which comes into force on 25 May 2018.
When defining the liability position in the new agreement we had to take into account the increase in the number of potential customers, the risks involved in delivering a major service, and the significant changes that GDPR introduces. Our goal was to ensure we offered a liability position that was reasonable and similar to that offered by other suppliers in the marketplace.
For pathfinder institutions, who have been helping us to develop our solutions over the past two years, we have also been able to offer services for free for a period of up to 12 months. In addition, when fees become payable we’ve committed to keeping the early adopter price as low as possible. With this approach to pricing we’ve had to strike a balance between the fees we charge and a liability position that’s commercially sustainable for us.
In terms of the change in data protection legislation, the liability position addresses the new legal obligations that GDPR places on Data Processors (in this case Jisc). Under the Data Protection Act the Data Controller (in this case the Institution) was responsible for ensuring legal compliance.
Under GDPR the Data Processor (Jisc) will have a statutory obligation to implement appropriate security measures to protect the personal data made available to it by Institutions (the Data Controller). As such, under GDPR, Jisc (rather than the Data Controller) can be directly fined for a breach of these statutory obligations.
In the Service Agreement the liability position we set out is that each party’s aggregate liability to the other party for all claims arising in a given year will not exceed in aggregate a sum equal to 100% of the charges paid in that year or £10,000, whichever is the greater.
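To make the arithmetic of that cap concrete, here is a small illustrative sketch (the function name is mine, not part of the agreement, and the figures below are examples only):

```python
def liability_cap(charges_paid_in_year: float) -> float:
    """Each party's aggregate liability for all claims arising in a
    given year: the greater of 100% of the charges paid in that year
    or 10,000 pounds."""
    return max(charges_paid_in_year, 10_000.0)

# An institution paying 4,500 pounds in fees is still covered by the
# 10,000 pound floor, while one paying 25,000 pounds is covered up to
# the full charges it paid.
print(liability_cap(4_500.0))   # 10000.0
print(liability_cap(25_000.0))  # 25000.0
```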
In summary, the new liability position reflects:
A commercial approach based on the fees paid by an Institution for the service.
The change in GDPR where Jisc has new obligations and liabilities as a Data Processor.
From the outset of the Jisc Learning Analytics project we were aware that institutions were likely to have requirements that went beyond what Jisc could offer. With this in mind we developed an architecture and service structure so that complementary services from other suppliers could easily be added on top of the services provided by Jisc.
In addition, we were seeing the learning analytics sector undergo rapid development with new products, services and suppliers regularly entering the marketplace. As a result, we also wanted institutions to have the opportunity to exploit new innovations as they became available.
Earlier this month Jisc launched a new procurement mechanism, the Learning Analytics Purchasing Service, so that institutions had access to a ready-formed marketplace for innovative learning analytics solutions, services and infrastructure that could be used alongside the core Jisc Learning Analytics Service.
The Learning Analytics Purchasing Service operates as a fully electronic Dynamic Purchasing System (DPS). New suppliers can join at any time during the term of the service, thereby allowing buyers swift access to new innovations in the marketplace. The Learning Analytics Purchasing Service also reduces the timescale, and therefore cost, of procurement, when compared to traditional static frameworks.
Buyers and new suppliers can access the Learning Analytics Purchasing Service via the Jisc procurement portal at http://tenders.jisc.ac.uk. Use the find opportunities feature, select ‘JISC’ from the list of portals and look for ‘Opportunity Id: DN289989 Title: Learning Analytics Purchasing Service’.
The service is currently live with two suppliers confirmed at this time and a further batch of 8 or so in the process of being added over the coming weeks. Once the initial batch of suppliers has been added we’ll post an update to this blog with details about them. We’ll also regularly update the suppliers list as new suppliers get added to the service over time.
Institutional Buyers can view the Buyer’s Guide, which has more details about the range of services available and how mini-competitions are run.
Today we’re launching the confirmation form for the Digital Student Experience Tracker 2017-18. You’ll automatically receive this form if you completed the first sign-up form. If you complete that first form now you’ll get the confirmation form within days – so if you haven’t committed yet, now is a good time to take the plunge.
‘Confirmation form’ may not set your pulse racing, but we are feeling pretty excited about it. Here’s why.
First, we already have over 100 institutions signed up. There has been so much interest that we are considering extending the sign-up window to the end of October. We have been asked to present this data at a number of national and international forums, and to support the Welsh Government in exploring the digital experience of college students across Wales. So we expect this year’s tracker to provide a bigger and richer picture of the student digital experience than ever before.
Second, with so much valuable data from learners, we’ve decided to collect some organisational data to put alongside it. That way we can see what universities and colleges are doing that relates to the student digital experience. We can create powerful messages about what makes a difference. And you as participants can compare your approach with those of your peers.
Student digital experience tracker 2017: the voice of 22,000 learners
So as well as asking which tracker surveys you want, and clarifying data responsibilities, the second sign-up form asks ten new questions about issues in your organisation. You only have to answer four, but we hope you’ll like them enough to go further. You’ll get your results by email, so you can compare them with your student responses later, and start using them to have conversations with your colleagues straight away.
We haven’t pulled these questions out of the air. We’ve spent the summer doing a literature search, reviewing other international surveys, and consulting with a panel of tracker users about the issues they think are important (thank you to everyone who took part). We also wanted to know what issues can be assessed by our lead contacts without them having to leave their desks. Many of the existing surveys are long and detailed. Producing something compact but meaningful has been a challenge.
Finally, as we were developing the latest version of the tracker questions (blog post follows), we grouped the questions into four new categories. We have followed these categories through into the data we collect from organisations, and we will also be using them to shape the new staff tracker we are designing for this year. So we are offering three ways you can collect data – from staff, from students, and at an organisational level – that build into a rich picture of how the digital environment supports learning and teaching.
Here you can explore the ten key questions we are asking:
Which best describes the state of your digital learning/technology-enhanced learning (TEL) strategy?
How many full-time equivalent TEL support staff does your organisation employ (including centrally located and in departments)?
Which best describes the state of your ‘bring your own device’ (BYOD) policy?
Which best characterises your organisation’s approach to adopting new technologies for learning and teaching?
How do you usually engage your students in improving the digital environment for learning?
How do you usually prepare students for using digital technologies on their course?
What percentage of your learning and teaching staff have undertaken TEL-related CPD over the past 2 years?
What percentage of the courses you offer make effective use of the functions available in the Virtual Learning Environment (VLE)?
Sample questions from the Tracker sign-up process
What percentage of the courses you offer make use of industry-standard and up-to-date digital software and systems?
What percentage of learning spaces have been designed or adapted to support effective digital learning?
This year, also for the first time, we are bringing together panels of experts from the different sectors to review the evidence coming out of the tracker. This will include data from the sign-up process. If you have an interest in the questions we’re asking, and if you want your students to be part of this big conversation, please join us in the Tracker programme. You could also consider joining our expert review panels later in the year.
Sign up for the 2017-18 Tracker here
Find out more about the tracker project here
Read about how others are using the tracker and explore the key findings from 2017 here
You can also contact Tracker Support for more details of the sign-up process, or Helen Beetham if you’d like more information about the background research that informed these questions.
The opening keynote at the ALT Conference this year was by Bonnie Stewart.
Bonnie Stewart is an educator and social media researcher fascinated by who we are when we’re online. An instructor in the Faculty of Education at the University of Prince Edward Island, Canada, and Founder/Director of the media literacy initiative Antigonish 2.0, Bonnie explores the intersections of knowledge, technology, and identity in her work.
Bonnie’s presentation was entitled, The new norm(al): Confronting what open means for higher education.
I wasn’t sure what to expect from the keynote, as I do like to be surprised, so hadn’t read the abstract. For those that do want to read it, here it is.
This talk opens up the intersection of learning technologies, open practice, and the idea of “norms” in learning and education. An exploration of the tensions around gatekeeping in higher education, the keynote examines our histories of norms and gatekeeping and the current trajectory and possibilities that openness offers learners and scholars, via learning technologies and digital practice. It also examines some of the dark corners of society opened up by the digital, and considers what this “new norm(al)” means for higher education. The talk frames our current moment as one of constant confrontation, and offers ideas for navigating confrontation overload while still preserving the spirit of openness and learning.
For me there were some key messages that came out. One of the main ones was that just saying you work openly doesn’t necessarily mean you are open to everyone: open can sometimes be a solution, but it can also sometimes be a problem. Listening to Siân Bayne the following day, it was clear that the importance of anonymity (by definition not open) is something we need to recognise.
I do share much of my work openly, my Flickr images are Creative Commons licensed CC BY-NC 2.0 for example. However I also recognise as a white middle class, middle aged male that I have privileges and opportunities to be open that may not be available to others.
Bonnie recounted her early career up in the Arctic Circle, and she said that one thing that struck her when she started was that she was white!
This resonated with me and reminded me of my early teaching career. I was brought up in Cambridge (not a real place), which in the 1970s and 1980s wasn’t a culturally or ethnically diverse place. I started teaching in Somerset, first in Weston-super-Mare and then Bridgwater; back in the early 1990s, both places had predominantly white working-class cohorts. I then got a job at Brunel College (now City of Bristol College), which is based in Ashley Down, literally a stone’s throw from the inner-city district of St Pauls in Bristol. I don’t know why I didn’t anticipate it, but I was surprised when 90% of my students were not white. Like Bonnie, I suddenly realised I was white!
The keynote also reminded me that the “norm” isn’t necessarily the “norm” for some people. Normal may be familiar, but reflecting on my time working in Bristol, the norm there was not familiar to me. My teaching needed to change to reflect the diversity and background of my learners and not my own background, which would have been inaccessible and unknown to the people I was teaching. We don’t always fit under a bell curve.
Another thing that came out of her keynote for me was the essence of open working in a closed bubble. I know that my network, which is made up of lots of people who work openly, is very much a bubble; for many outside that bubble, despite the protestations of openness, it is as closed to them as if the people were working in a closed manner. Even within the bubbles, open practice can be a barrier for many. Some people do not have the advantages or privileges that others have and cannot afford to share and be open.
I also liked her slide on technical problems versus adaptive challenges, which is something I recognise from my work with academic staff in various colleges, embedding the use of learning technologies.
It was never about the technology, it was always about the people. Interestingly I also found it was never about the pedagogy either, it was always about the people too.
My sketch notes are really for me, rather than other people. The process of sketching allows me to digest for myself what is being talked about and demonstrated; it gives me a process for forming my own interpretation of what is being said and what I understand from the talk. Sketching engages me in a talk in the way that note-taking, or conversing on the Twitter, does for others. The notes are not done for other people; if other people find them useful then that’s just a bonus. Having said that, I do share them online, through Twitter (and Flickr).
Quite a few people came up to me to ask what I was doing, what app I was using and if I was sharing them. I had similar questions on Twitter as well.
Bonnie’s keynote, The new norm(al): Confronting what open means for higher education, was recorded and is out on the YouTube.
I really enjoyed this talk about the meaning of open and how though we may think we are open, that may not necessarily be true of what we do, or how others perceive us.
My next sketch note was from Lawrie Phipps and Simon Thomson’s session, VLE to PLE – The next generation of digital learning environment, which was a forty minute session. One of my children called it, after seeing the sketch note, the Vile Pile.
This was a challenge to draw, partly as it was very much discussion based, but also it was quite a short session. My sketch note was very much about drawing out some of the main themes that came out, the core for me was about how the VLE is getting bloated (becoming a Swiss Army Knife, lots of tools, but not good at doing anything well) and that maybe we should move to a learner centred “system” which the VLE could be part of – this reminded me very much of the VLE is Dead debate we had back in 2009.
At Leeds Beckett University they are exploring the development of a PLE “space” through a HEFCE funded research project into Personalised User Learning & Social Environments (PULSE). This project explores the development of a hub for connecting students’ existing spaces with institutional spaces and empowering students to take ownership of their “content” within and beyond their learning.
The difference here is that they do not seek to develop an entirely new learning platform, but just an architecture through which to connect existing spaces.
On the Wednesday I did a sketch note of Siân Bayne’s keynote, The death of a network: data and anonymity on campus.
I did initially wonder where the talk was going, as Siân recounted her tale about a research project involving Yik Yak, but I found the end of the keynote fascinating as she spoke about the importance of anonymity in a world of big data.
This keynote will talk about a recent research study which traced the slow death of the anonymous, geosocial app Yik Yak at our university. I will provide a description of its use and decline but, more importantly, use it to understand what is at stake in the loss of the possibility of anonymity within universities in an age of data profiling, extraction and personalisation. Linking to the conference theme which explores issues at the forefront of innovation, I will use theory drawn from literatures on surveillance capitalism and the data economy to focus on developing our institutional values surrounding anonymity through and within our learning technologies.
I really enjoyed sketching this talk; it just worked for me from a sketching perspective. I think drawing the gravestones was the heart of what made the drawing work for me.
You can watch this keynote on YouTube.
I think this is something that needs to be considered by all looking at the use of data and analytics, and will certainly inform my work at Jisc on the Intelligent Campus.
I also did a sketch note in the session, Kevin Costner is a liar: Field of Dreams and other EdTech fallacies, led by Kerry Pinny, Marcus Elliott and Rosie Hare. This was another forty-minute session, and I was also “forced” to participate, so it was a challenge to complete the sketch note in that short time. This is the reason why I didn’t do sketch notes for the shorter twenty-minute sessions I attended.
I had originally intended to “paste” an image of Kevin Costner into my sketch note, but I don’t think that this is a feature available in Paper 53.
The session was really interesting, but I don’t think my sketch note really does justice to the content of the session.
Kevin Costner has a lot to answer for and so do we. In ‘Field of Dreams’ he was told that “if you build it, they will come”. This parallels the approach to innovation in educational technology, “if we install it, they will use it”. Given ‘At the forefront of innovation’ is one of this year’s themes it is the right time to ask whether limited innovation, impact and staff engagement is our fault?
The main focus for me was about “who is to blame” for the lack of use of learning technologies, something I might come back and explore in a future blog post.
I was looking forward to Peter Goodyear’s keynote on learning spaces, entitled Shaping Spaces.
This talk is about new learning spaces in universities and the scope for learning technologists to help shape better learning spaces. I will focus on design knowledge: knowledge that is useful in (educational) design work. Two ideas are core to my argument. The first is that the analysis and design of complex learning spaces – and learning situations more generally – must pay close attention to students’ activity: what it is they are actually doing. The second is that we need a shared set of actionable concepts that can connect human activity to the physical world (material/digital/hybrid), recognising that activity can be influenced, but is rarely determined, by features of its setting.
I found this quite a challenging keynote to sketch. Often when sketching, key ideas and concepts make the whole process just work; with Peter’s keynote I struggled to create a coherent sketch note and capture the keynote.
Overall I was pleased with my sketch notes; I think they were much better than last year’s efforts. So, did you do any sketching at this year’s conference?