EPSRC monitoring and evaluation framework for the portfolio of Centres for Doctoral Training (2011)
This is the January 2011 version.
This document sets out a basic framework for the monitoring and evaluation of all Centres for Doctoral Training throughout their lifetime, from the start of the training grants to the end point of EPSRC funding.
The framework indicates what EPSRC would expect the Centres to be able to measure and report on from year five onwards, because we believe it is important that all Centres know what information they need to start collecting now. Nevertheless, we recognise that at the mid-term review stage in year three (Summer 2011), some time-delineated information will not yet be available in Centres (for example, students' next destinations and key publications).
The current portfolio of CDTs includes 45 new Centres approved in 2008 with formal start dates of 1st October 2009, and 17 Centres - largely in the life sciences cross-disciplinary interfaces (LSI) area - which have been in place for several years. There are also three additional Energy Centres and three additional Mathematical Sciences Centres approved during 2009. Accordingly, different timescales will apply to Centres that have just recruited their first student cohorts compared with engineering doctorate, LSI and complexity sciences Centres, which are already running strongly on previous grants started up to nine years ago.
The intention is to establish an acceptable 'core plus' evaluation model. This current core document is intended to allow all Centres to be compared across the portfolio, while the plus component (not included in this document) will ensure that the varying aims and purposes of different types of Centres (such as Industrial Doctorate Centres (IDCs), LSI Doctoral Training Centres (DTCs) and the new 'mission' programmes) can be reflected in the evaluation outcome over the same timescales.
Many existing Centres have already been reviewed in recent years. We will incorporate such Centres in the overall evaluation framework, but there will need to be some adjustment of timescales and the incorporation of some specific issues. In the same way, the basic monitoring and evaluation framework proposed here may need to be augmented with a number of additional programme-specific questions for those Centres operating as part of the Research Councils UK strategic themes and for the IDCs. Finally, we expect to issue, well before the review in 2011, one or more template forms so that financial and numerical data can be collected in a standard format, to facilitate comparison between Centres during the review process.