Pilot phase methods and approaches
The team developed a range of mechanisms to support the mapping of project activities, issues, messages, outcomes and outputs. These were developed and tested using input from project bids, project plans, blogs and websites, the start-up programme meeting, the Elluminate sessions and individual contact with project teams.
Our approach to synthesis aimed to support the following functions:
- Providing mechanisms for projects to identify key evaluation questions and share methods used across the programme
- Enabling the projects to feed their findings into a framework which reflects strand and programme wide issues
- Supporting synthesis of key messages and issues, and highlighting areas of concern or areas that may require action
- Providing feedback to the Programme Team on issues, challenges, successes and likely outcomes and outputs
- Reflecting the stages of the projects throughout the programme and ultimately providing a series of mechanisms to disseminate to the wider communities
- Providing a series of mapping approaches that will be valid and appropriate for future iterations of the UKOER Programme
Synthesis and Evaluation Framework and mapping tools
Generic Synthesis and Evaluation Framework
The Generic Framework was essentially a working document presenting a structure intended to capture and reflect key dimensions of the programme. In particular it aimed to:
- provide a foundation and common language for collating data
- offer a range of questions/issues to support evaluation and review
- support the collation of key messages, challenges, solutions and outputs
- enable the identification of key areas of interest and highlight useful approaches
The framework was initially created from several sources:
- prior experience and knowledge from within the team
- issues identified as important within the original ITT
- information gleaned from project documents, programme activities and programme support teams
This framework was augmented throughout the programme. Various iterations of the framework have been preserved to reflect the stages when issues were added to provide a history/timeline of its development.
Strand specific frameworks
Three separate strand-specific versions of the framework were shared with project teams, and various approaches were used to encourage project engagement. These included email circulation, input to individual programme manager telephone calls, direct contact with project teams, an Elluminate session, and presentations and workshops at programme and other meetings. Some projects engaged well with the framework and used it, as intended, to support evaluation activities. Some included their own version of the framework as part of their final reporting mechanism. For the institution strand, a mindmap was created to provide a visual representation of key issues. The small number of projects in that strand made this approach work well, and it suited the team member responsible for synthesising its outcomes.
Individual support to projects
The framework mappings were used to initiate contact with projects, initially around their evaluation plans and activities. A focus on proposed outputs allowed us to negotiate some aspects of project activity and evidence gathering: for example, we encouraged projects to focus their evaluation efforts on the unique aspects of their community/institution/individual processes and experiences, rather than gathering multiple sets of similar data. We also adapted our evaluation framework to acknowledge that projects were more focused on quality issues, and on user evaluation, than we had anticipated. This dialogue also allowed us to offer evaluation expertise where appropriate, though most projects had access to such expertise either within the team or by recruiting external consultants for this role.
The framework enabled us to structure our interventions with projects, and was used as a means of evaluating the ‘openness’ of their outcomes. The framework was developed in collaboration with the support team so that support could be focused on those areas identified as most challenging or interesting, and so that evidence about the effectiveness of support in different areas could also be evaluated.
The framework proved to be a successful means of gathering information from the projects. Many projects used the strand frameworks to map their activities, final outcomes and outputs. These strand frameworks fed back into the generic framework and into the final Pilot Programme Synthesis Framework.
Support and synthesis remained intimately connected through our ongoing dialogue with project teams, and the shared understanding we established around evaluation allowed us to interpret findings as they emerged, and suggest new angles on evidence gathering and analysis as the issues became clearer.
Following consultation with projects around evaluation outcomes, they identified an intention to focus on:
- project progress and activities: effectiveness of project interventions
- quantity and quality of resources released
- evidence of use and re-use, feedback from pilots and trials
- evaluation of educator/staff perspective (e.g. through surveys, interviews, peer review, observations, system data, critical friends)
- evaluation of learner perspective (e.g. through surveys, interviews, observations, system data)
- evidence of sustainability and institutional/community transformation in the direction of greater openness
Final Pilot Programme Synthesis Framework
The generic and strand frameworks served as working documents throughout the pilot programme and the resulting Pilot Programme Synthesis Framework presents a closing snapshot of the programme, highlights key outcomes and also includes links to specific relevant project outputs. This version of the framework could be taken forward into the next phase of the JISC/Academy OER Programme.