Evaluating your readiness for electronic management of assessment: using our self-assessment tool

Following pilot testing by 12 institutions, we are offering a wider range of people the opportunity to try out our self-assessment tool. The tool is still in beta, so we know there will be functionality you would like to see added. So far it has been tested with universities, so we are particularly interested in feedback from the FE and Skills sector as to whether adaptations are necessary to meet your needs.

The tool can be found at http://ji.sc/emaready. Please read these guidance notes before using it.

You can give feedback by using the comment functionality on this blog or by contacting lisa.gray@jisc.ac.uk.

About the tool

The self-assessment is designed to support institution-wide development of electronic management of assessment (EMA). It assumes that any college or university will benefit from a strategic approach to managing assessment and feedback (and hence EMA).  Such an approach requires a good overview of practice and a clear set of policies, regardless of the extent to which institutional policy leans towards promoting conformity or supporting diversity of practice.

At the end of the assessment you will be able to estimate your current maturity and develop an action plan. Our accompanying guide, Transforming assessment and feedback with technology, provides ideas and resources to help you enhance the entire assessment and feedback lifecycle.

There are five stages:

  • Researching: You are at an early stage of EMA. You do not seem to have a comprehensive view of organisational activity overall; policy, process and systems seem fragmented. Ensure you have senior management support to undertake further investigation. Start by defining the principles that underpin assessment and feedback in your organisation and find the areas of good practice you can build on.
  • Exploring: You are probably aware of pockets of good practice but have not really begun to try to scale this up. You will need to be clear about the expected benefits in order to effect the cultural change needed.
  • Embedding: You are at a tipping point where fairly widespread experimentation is close to becoming mainstream practice. A key issue will be ensuring that business processes are sufficiently consistent to support a more holistic approach.
  • Enhancing: You are probably already supporting the core of the assessment and feedback life cycle with technology. You are looking to fill gaps and find more elegant solutions to existing workarounds.
  • Pioneering: You are looking to go beyond automation, standardisation and efficiency gains to ensure that EMA has a truly transformative impact on learning and teaching in your organisation. Your organisation probably provides many of the resources in our accompanying guide, but we can still provide some inspiration and support.
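
If it helps to see the scale concretely, here is a minimal sketch in Python (our own illustration; the class and names are not part of the tool) encoding the five stages as an ordered enumeration:

    from enum import IntEnum

    class EMAStage(IntEnum):
        """Illustrative encoding of the five EMA maturity stages (1 = earliest)."""
        RESEARCHING = 1  # fragmented policy, process and systems
        EXPLORING = 2    # pockets of good practice, not yet scaled up
        EMBEDDING = 3    # widespread experimentation close to mainstream
        ENHANCING = 4    # core life cycle supported; filling gaps
        PIONEERING = 5   # beyond efficiency gains, towards transformation

    # Because the stages are ordered, they can be compared directly:
    assert EMAStage.EMBEDDING > EMAStage.EXPLORING

The ordering matters more than the numbers themselves: the scale is ordinal and, as discussed below, the effort needed to move between levels is not evenly distributed.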

Why do the self-assessment?

If you are in the early stages of EMA adoption, the self-assessment will serve as a development tool and point you to resources that may be useful. If you are further down the line, you may find that your experience to date lies in particular aspects of EMA and that the self-assessment gives a more rounded view of where you might benefit.

If you are at a more advanced stage, it may serve as reassurance that what you are doing is regarded as good practice, though there may still be some areas where there are opportunities to do things differently.

Who should do the self-assessment?

The self-assessment aims to support whole institution development while recognising that different parts of the institution may be at different starting points and working towards different goals. It can therefore be completed from different perspectives:

  • A central support team may complete the evaluation from a whole institution perspective
  • Individual departments, schools or faculties may respond in the way that reflects their own practice
  • Course or programme teams may use the tool to give an even more localised view of practice

Completing the questions

Because the self-assessment is designed to work at different levels, many of the questions ask you to define what happens in your ‘organisation’. Organisation here refers to the coherent entity on behalf of which you are responding, so it may be your programme team, your department or your whole institution. We also ask about consistency across your ‘component areas’ and practices at ‘local level’. At institutional level, component areas/local level will generally be schools or faculties; at department level they may be programmes of study; and at programme/course level they may be individual modules.
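
As a quick illustration, the mapping just described could be sketched like this (Python; the dictionary and wording are purely hypothetical, not part of the tool):

    # Hypothetical sketch of how the terminology shifts with your perspective.
    TERMINOLOGY = {
        # responding as: ('organisation' means, 'component areas'/'local level' means)
        "whole institution": ("the institution", "schools or faculties"),
        "department": ("your department or school", "programmes of study"),
        "programme team": ("your programme or course", "individual modules"),
    }

    for perspective, (organisation, components) in TERMINOLOGY.items():
        print(f"Responding as a {perspective}: 'organisation' = {organisation}; "
              f"'component areas' = {components}")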

In many cases it may not be possible for an individual to answer all the questions. Indeed, we suggest that the self-assessment should be done as a group exercise because the dialogue that ensues is the first stage in the change process.

What will the self-assessment tell you?

The self-assessment tool will give you a report rating you at one of five levels against six headings:

  • Strategy/policy and quality assurance
  • Curriculum data
  • Processes and working practices
  • Technology
  • Culture
  • Student experience

A diagrammatic representation will serve as a quick overview of strengths and weaknesses.
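
To give a feel for the output, here is a small sketch (Python; only the six headings come from the tool, the level values are invented for illustration) of how such ratings might be summarised as a quick textual overview:

    # Sample ratings (1-5) against the six report headings; the values are invented.
    report = {
        "Strategy/policy and quality assurance": 3,
        "Curriculum data": 2,
        "Processes and working practices": 3,
        "Technology": 4,
        "Culture": 2,
        "Student experience": 3,
    }

    # A crude text 'diagram' highlighting strengths and weaknesses at a glance.
    for heading, level in report.items():
        bar = "#" * level + "." * (5 - level)
        print(f"{heading:<42} {bar} {level}/5")

The tool presents this as a diagram onscreen; the sketch simply shows the underlying shape of the report: one level per heading.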

An onscreen report will give you a set of suggested actions intended to help you build on your strengths and address your limitations, along with links to resources that might help you carry out those actions. The resources may be in the form of practical guidance, checklists and templates from other institutions, or case studies and examples of practice.

To obtain an email copy of the report scroll to the bottom of the screen and enter your email address.

How should you use the self-assessment outcomes?

The tool has been tested with a range of experts from different institutions. They reported that the outcomes matched their own understanding of their strengths and weaknesses and that the suggested actions fitted with their experience of how to make progress in each of these areas.

Every institution is of course unique and your own local knowledge will be needed to put our generic guidance into context.

Suggested actions

Usually you will be offered a range of suggested actions and resources for each development area. You can have confidence that all of these approaches have worked for others, but it will be up to you to decide which are best suited to your particular circumstances. As well as using the specific resources suggested, you will find additional support in our guide, Transforming assessment and feedback with technology.

Depending on where you sit within the institution, not all of the suggested actions will be within your remit to implement. Some actions, such as those requiring clearer definition of policy or improvements to business processes, may need support from a higher level of authority. Nevertheless, the self-assessment outcomes will provide useful evidence of need with which to begin the necessary conversations.

Data validity

There will always be some concerns about the validity of data from a self-reporting exercise such as this. Different people may interpret some of the questions and responses differently, and it is only through dialogue that such differences can be explored and the institutional knowledge base enhanced. Most of those involved in the collaborative development of this tool found the dialogue instigated by the self-assessment process to be the most valuable aspect of the activity.

We suggest the tool is best suited to supporting continuous improvement rather than any kind of benchmarking.

Whether your approach to developing EMA capability is top-down or bottom-up and whether you are a policy maker or a practitioner, you will probably find that you want to compare results from different parts of your institution.  This will help you target staff development, select areas to pilot new approaches and identify good practice that can be built upon.

What should you be aiming for?

Our five-level scale reflects increasing use of EMA and the further benefits this brings for students and institutions. It should, however, be viewed with a number of caveats.

Non-linear scale

The scale is not a simple linear one. The first two levels are quite similar in terms of the user experience. You may sit at the Researching level because your institutional practice is disjointed and people do not have a clear idea of what others are doing; however, the overall user experience may not be significantly different from that of institutions at the Exploring level.

Progress through levels

Institutions have also reported that the effort needed to move between levels is not evenly distributed. The most significant effort is needed to get from the early stages to the Embedding and Enhancing levels. Once there, further progress is proportionately easier to achieve.

Progress through the levels is associated with evidence of greater benefits, but that is not to say that every institution will necessarily be aiming to reach the highest level. In some cases institutions may provide an excellent student assessment experience in spite of constraints on how much they can invest in supporting information systems.

A co-design initiative

The self-assessment tool was developed using our co-design approach and we are particularly grateful for participation from the following institutions:

Anglia Ruskin University
Aston University
Birmingham City University
Manchester Metropolitan University
Plymouth University
University of Bradford
University of Edinburgh
University of Hull
University of Nottingham
University of Sheffield
University of Southampton
University of York