Many of the concerns raised by projects in relation to programme management and programme requirements have become key learning points for the OER community.
Overall structure of the programme
The decision to fund three separate strands of release has proved a valuable one, with different communities emerging and distinctive lessons learned in each. The decision to segregate release from re-use processes (the latter having been partially explored in the JISC Reproduce programme) was considered less successful overall, as projects felt they lacked knowledge of their potential users, and as a growing consensus emerged that open content requires a lifecycle approach.
Institutional and subject strand projects found it particularly frustrating to run out of project time and funding just as the bulk of their resources were being released. This ruled out meaningful user evaluation – at least with project funding – and made it difficult to sustain the communities and release processes which had been established. As PHORUS noted, the very perception of OER funding as short-lived can undermine some of the potential rewards for participating: ‘The need to demonstrate the benefit of OER to the discipline is problematic without examples, and without evidence of sustained investment’. The pilot phase has been successful at refining existing processes, enhancing expertise, and bringing OER champions together. It will be important now to evaluate whether and how these gains are sustained, and to find ways of supporting their continued development via ongoing resources and services.
The programme was timely in that it benefited from a sudden media and academic interest in open content during 2009/10, while also contributing significantly to that surge through its own communication activities. Its timeliness caused occasional problems in managing scope, for example where institutions saw an opportunity to build a local repository.
National support services
JISC CETIS, JorumOpen, and particularly JISC Legal were widely consulted by projects. The support of JISC Legal was seen as critical by many. Towards the end of the pilot programme, projects were also becoming aware of the role of SCORE as a national support service. Feedback and recommendations for improving these services have been collated in the case of JorumOpen. There was undoubtedly frustration among projects that the development pathway of JorumOpen ran parallel to their own, meaning that most resources were uploaded in the final month and through an interface that still needed refinement. At the same time, some projects appreciated the opportunity to influence the development of JorumOpen as a national resource.
In general there is a consensus among projects that national services are necessary to plug the gaps in expertise at institutional level, and to communicate the lessons learned through pilot projects.
CATS (credits) as a means of assessing the content released
Depending on the type and granularity of content being released, CATS points were often not regarded as a valid way of assessing the contribution that projects had made to the body of open learning material. In some cases the content was deliberately decontextualised from the programmes of study in which the values of ‘level’, ‘time taken’ or ‘learning outcome’ make sense. In others, the type of content itself (e.g. images, simulations, applications, questions) invited a wide range of different learning and teaching applications of indeterminate level and length.
Despite the relative privilege of project funding, the expectation of public sector cuts was already being felt at some institutions involved in the programme. This was felt, for example, in delays to implementing repository solutions or learning environments, and in cuts to staff who might otherwise have supported the process of open release.
In the subject strand particularly, the requirement to gain written consent for open release from up to 21 institutions caused major delays, protracted negotiations, and unforeseen hitches. In the end, though, this requirement forced the relevant institutions (all 77 of them!) to review their OER and legal arrangements. Project teams are confident that the benefits in terms of openness and clarification of policy will be felt in the coming 12 months.
Related to the previous point, all projects were asked to show evidence of institutional impact, but in the case of the subject strand, the greatest transformations took place at community level. It was often not possible, working at a local scale across so many institutions, to demonstrate impact on the larger institutional picture.
There were many opportunities for collaboration within strands, which led to the shared production of guidance materials, and to regular sharing of experiences online (e.g. in relation to user surveys, clearance letters, consortium agreements). Sharing across strands took place at the programme meetings but was less well sustained between them, partly because of the sheer number of projects involved.