Synthesis and Evaluation team approach to Phase 2
In summary, our approach to evaluation and synthesis included:
- Revision of the Pilot Programme Synthesis Framework, which provided a strong foundation and common language for collating evidence
- Working closely with the programme management team, project teams and their evaluators to ensure all projects arrived at a coherent, feasible evaluation plan that met the needs of the programme
- Working closely with projects to share methods for evaluating OER uptake and use as soon as they become available
- Introducing projects to other evaluation methods and resources, and emerging evidence from other initiatives
- Contributing to the design of programme events and documentation (e.g. reporting templates) to ensure evaluation and synthesis issues remain a priority
- Responding to support requirements expressed by projects (as appropriate, and in collaboration with other programme management and support teams)
- Iteratively mapping project outputs and lessons learned to the synthesis framework, and offering timely feedback to projects on expectations
- Developing the resources available to projects, including advice provided via social media to maximise collective learning
- Further developing measures of openness and encouraging projects to apply them to their outcomes
- Setting up and supporting thematic peer review clusters among the projects to make our interventions with projects more efficient, and ensure conversations about evaluation are ongoing
Evaluation and synthesis has continued to be an iterative, two-way process. Projects were offered useful models for progression, and were encouraged to contribute to the further development of the synthesis and evaluation framework. Collation of findings and evidence builds on the outcomes of Evaluation and Synthesis in the Pilot phase, but takes into account emerging lessons from the new programme and from other global OER initiatives, including the UK Open University experience. We have worked with all participants in the programme to identify emerging approaches to OER release which are valuable and relevant to the UK HE and FE context, and to consider wider issues around open practice.
Working with the different strands of this phase has allowed us to see how different evaluation questions could be asked, and how different barriers and enablers were at work, in the very different contexts. The following table shows our starting point for thinking about strand-specific evaluation questions.
| Strand | Description | Evaluation questions |
| --- | --- | --- |
| A(i): OER release meeting sector needs | Projects committing to release a substantial amount of content in the indicated areas. | How do projects claim they will support the relevant sector priority? How are OERs specifically of value in this context? |
| A(iii): Cascade support in the release of OER | Teams already releasing OER sustainably working with one or more partners to support those new to OER release. | How effective is cascade support in enabling release of OERs? |
| C(i): Collections of OER based around a thematic area | Projects identifying, collecting and promoting collections of OER and other material around a common theme. | How does thematic presentation of materials support open access and reuse? |
We decided to group the 20-25 projects into congruent pairs/groups within their strands. Each project was encouraged to give feedback on its partner’s evaluation plan – along with a nominated member of the Evaluation and Synthesis team – and to provide external peer review of a selection of outputs at the end of the process. Each strand had a dedicated member of the synthesis and evaluation team to support the pairings and project evaluation issues. In addition, specific evaluation support was provided to any project that requested it.
Comments from projects on evaluation pairings (peer review process)
Only 4 of the 11 projects in this strand mentioned their evaluation partners in their final report: EDOR, DELILA, OPENSTEM and CPD4HE.
“Regular engagement with the OMAC Strand Management Group and other OMAC projects (in particular, our ‘evaluation partners’ from the HEA Subject Centre for Business, Management, Accountancy and Finance (BMAF) and the HEA Subject Centre for Medicine, Dentistry and Veterinary Sciences (MEDEV)) proved useful, allowing discussion on project process and evaluation and sharing of resources. Some discussions were also had with the Synthesis and Evaluation team, feeding into the ongoing evaluation process and resulting in the ‘pilot’ HE in FE CPD OER survey.” – EDOR final report
“Our evaluation pairing with Falmouth drew on existing ties between our two institutions and hence felt very natural and complementary.” – OPENSTEM final report
“We were able to compare notes with the DELILA project, which was useful. We worked with our partner project: Jane Secker, the DELILA project director, participated in our webinar series and the CPD4HE project director participated in the DELILA dissemination event.” – CPD4HE final report
DELILA acknowledged its partner and also highlighted other projects: “Links were also made with projects such as OSTRICH (OER Sustainability Through Teaching and Research Innovation: Cascading across HEIs) and DORRE (which provides guidance for creating OERs from scratch).” – DELILA final report
Only 5 of the 12 projects in the Release strand (SPACE, DHOER, PORSCHE, ALTO, Learning from WOeRK) mentioned their evaluation partners in their final report. ALTO and Learning from WOeRK developed their partnership particularly fully and produced video interviews about it.
Learning from WOeRK – http://cpdoer.net/2011/05/evaluation-buddy-interviews/
“The idea was that we would both ask each other a set of simple questions about the progress and experience of our projects in order to share our learning and reflections with the rest of the UK OER programme, and indeed others interested in OERs. I have found it really helpful meeting with John, who is very experienced in this field. The evaluation buddy system has really worked well for us.” – Learning from WOeRK
Evaluation and synthesis framework
Several projects referred to the framework in their final reports, and some used it to structure their evaluation and final reports.
Release strand – SPACE, Learning from WOeRK, PORSCHE, OSIER
OMAC strand – EDOR, DELILA, CPD4HE
Synthesis was carried out by the evaluation support person for each strand and collated into strand synthesis pages.
Phase 2 Cascade strand synthesis (Helen Beetham)
Phase 2 Collections strand synthesis (Isobel Falconer)
Phase 2 Release strand synthesis (Lou McGill)
Phase 2 OMAC strand synthesis (Lou McGill)
These were then drawn into the programme-level synthesis pages which make up this report. A new Phase 2 synthesis framework has emerged from this activity and will be taken forward into Phase 3 of the programme.