This page links to resources that may be useful when planning and carrying out the evaluation of your OER project. We expect that the latest version of the Synthesis and Evaluation Framework, together with the advice of individual evaluators, will be the main resources for most teams.
Evaluating OERs
These resources may be helpful when evaluating the processes of open release (though see our evaluation framework) and/or the quality of OERs.
- Li Yuan at CETIS has produced a presentation on Understanding and Evaluating the Impact of OERs http://www.slideshare.net/cetisli/cal09-presentation as well as a blog post on the same topic http://blogs.cetis.ac.uk/cetisli/2009/03/30/developing-a-framework-for-understanding-and-evaluating-the-impact-of-open-educational-resources/
- The Open University’s Open Learning Network (OLNet) is carrying out several research projects which involve evaluating OERs in different settings. Learning Design for Open Educational Resources (http://olnet.org/node/117) and the OER Effectiveness Cycle (http://olnet.org/node/135) are two. There is also an OLNet cloudscape (http://cloudworks.ac.uk/index.php/cloudscape/view/562.html) on Cloudworks which hosts ongoing discussions and resources.
- The JISC Reproduce programme funded around 20 projects to repurpose educational content and report on the barriers and enablers they found. Many of the findings will also be relevant to OER projects, particularly if they are considering reusability as a quality of openness. http://www.jisc.ac.uk/whatwedo/programmes/elearningcapital/reproduce.aspx has links to the programme’s evaluation plan and to some quality assurance measures that were developed as a result (scroll to the bottom of that page).
- Becta’s ‘quality principles for digital learning resources’ http://partners.becta.org.uk/index.php?section=sa&catcode=_sa_cs_cf_03 make an interesting but perhaps controversial distinction between design principles and pedagogic principles. They do not deal specifically with openness or re-usability.
- Helen Beetham has drafted a paper on quality issues arising from the Reproduce programme – dealing specifically with re-usability – which can be found here: approachestoquality.doc
- Surveys have been developed by subject strand projects and are available for all to use. Results from the surveys will be collated as part of the overall programme evaluation effort. Projects have adapted this one https://surveys.heacademy.ac.uk/oerbase, developed by Helen Beetham, for their own use.
- The Organising Open Educational Resources (OOER) project http://www.medev.ac.uk/oer/ created a survey asking how people search for learning resources online, which sources they use, and how they evaluate the results they find. http://www.surveymonkey.com/s/D3TPRVK
- Phil Barker at JISC CETIS has drawn together information on resource tracking http://wiki.cetis.ac.uk/Resource_Tracking_for_UKOER (a minimal illustrative sketch follows this list).
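To give a concrete sense of the simplest kind of resource-tracking data, the sketch below counts successful requests per resource in an ordinary web-server access log. This is a hedged, generic illustration only, not a method taken from the CETIS wiki: the log file name (access.log) and the /oer/ URL prefix are assumptions you would replace with your own repository’s details.

```python
import re
from collections import Counter

# Match the request and status fields of an Apache/nginx "combined" log line,
# e.g.: ... "GET /oer/unit1/slides.ppt HTTP/1.1" 200 48213 ...
LOG_LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_downloads(log_path, prefix="/oer/"):
    """Count successful (HTTP 200) requests per resource path.

    `prefix` is an assumed URL prefix for OER content; adjust as needed.
    """
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            match = LOG_LINE.search(line)
            if match and match.group("status") == "200":
                path = match.group("path").split("?")[0]  # drop query strings
                if path.startswith(prefix):
                    counts[path] += 1
    return counts

if __name__ == "__main__":
    # Print the ten most-requested resources.
    for path, hits in count_downloads("access.log").most_common(10):
        print(f"{hits:6d}  {path}")
```

Raw hit counts like these say nothing about who is using a resource or how; in practice they would be triangulated with other evidence of use, as discussed in the Learner-centred evaluation section below.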
Learner-centred evaluation
These resources may be helpful when evaluating OERs in use by learners.
- Phil Barker of CETIS has produced a presentation and facilitated an Elluminate session on the use of tracking data to monitor resource usage http://www.slideshare.net/philb/resource-tracking-for-ukoer. This kind of data can be triangulated with (e.g.) surveys, interviews, focus groups, guided elicitation or observation of learners to explore their use of OERs in more detail.
- The JISC Learners’ Experience of e-Learning wiki has a range of resources for researchers and evaluators at https://mw.brookes.ac.uk/display/JISCle2g/For+researchers. These include a section on methods which has downloadable ‘recipe cards’ for conducting specific data collection or analysis techniques.
- Helen Beetham produced a guideline on learner-centred evaluation in 2007 which includes some tables and checklists that might be relevant to projects focusing on learners as stakeholders in OER release and use: learner_centrd_evaluation_guide.doc
Note that these resources assume a fairly strong focus on learners as stakeholders, and sufficient resources to engage learners meaningfully in the evaluation process.
Institutional transformation
An important outcome of the OER pilot programme is an analysis of the features of institutions and other communities that support OER release. These features can then be used to assess, for example, progress towards more open content management, more OER-friendly legal and technical processes, or more open pedagogies. Because this analysis is itself an outcome of the programme, we should not expect to find off-the-peg evaluation tools already in existence. The generic framework records the issues that are emerging as crucial in evaluating openness at institutional level.
- The Cape Town Open Education Declaration http://www.capetowndeclaration.org/read-the-declaration provides the broad objectives for open education, within which framework we are developing our own indicators for assessing openness at institutional and community level.
However, the following resources may be useful when evaluating institutional change associated with project activities in a more general sense (NB they will all need to be adapted to purpose).
- Some of the tools and techniques http://www.jiscinfonet.ac.uk/tools described on the JISC infoNet site are relevant to baselining and assessing institutional issues.
- The e-learning benchmarking programme run by the HE Academy used a range of tools to benchmark institutional approaches to e-learning, some of which included indicators relating to content management. http://elearning.heacademy.ac.uk/weblogs/benchmarking/ (scroll through the right-hand column to find the methodologies)
- The SHEFC Transformation Programme evaluated the impact of specific e-learning initiatives on a range of Scottish HE and FE institutions: some of the methods used are documented on the TESEP web site http://www2.napier.ac.uk/transform/ and in the methods section of the Glenaffric evaluation report http://www.sfc.ac.uk/nmsruntime/saveasdialog.aspx?lID=1036&sID=298
- Becta has recently launched a ‘scaling framework’ to support the scaling up of innovations. The main target is the schools sector and it is a ‘how to’ rather than an evaluation framework, but the dimensions of depth, sustainability, spread, shift, and evolution could all be applied to evaluating changed processes around OERs. http://www.microsoft.com/education/demos/scale/ – and don’t be put off by the Flash intro.
General evaluation resources
- JISC elearning programme evaluation framework
- Six steps to effective evaluation – Glenaffric
- Measuring Benefits (from JISC infoNet Project Management infoKit)
- Synthesis of approaches to evaluation from the JISC Curriculum Delivery projects: Synthesis of Evaluation Activities_Curriculum Delivery Programme_Inspire Research Ltd.pdf
- ‘Maximising impact: assembling and leveraging the evidence base’ – presentation: Maximising impact_Evaluation Assembly_Inspire Research.ppt
- Evaluating practice (JISC infoNet)
- Tools for evaluating complex change (Saunders, Charlier, Bonamy) – Higher Education Academy
- LTDI Evaluation Cookbook – a practical guide to evaluation methods for lecturers
- Assessment & Feedback programme start-up meeting – Inspire Research presentation for Strand B including big picture evaluation questions & evaluation standards
Using video for evaluation
- Using video to capture evidence and reflections (online seminar recording)
- Blog conversation with Rebecca Galley on the use of video in the evaluation of the OULDI project, including tips and tools.
Tools and tips
- JISC infoNet Impact Calculator
- Tips for evaluating JISC projects
- Measurement Tools wiki (JISC infoNet)
- EvalKit resources (JISC infoNet)