Purpose
- This document sets out the Pilot framework for the assessment of People, Culture and Environment (PCE) in the REF exercise. This framework will be tested in the PCE Pilot Exercise. It is expected that the framework will be refined and evolved during the pilot exercise to produce recommendations on the assessment of PCE in the REF 2029 exercise. The guidance for the PCE assessment in REF 2029 will likely vary significantly from the Pilot guidance presented here.
- The pilot exercise will be focussed on developing the assessment process for PCE; any assessment process developed through the indicators project and the pilot exercise will need to be situated within the wider REF assessment. The pilot exercise will not examine statements on Contribution to Knowledge and Understanding (CKU) or Engagement and Impact (E&I); however, outcomes of the pilot may be applied in the development of these elements. The Pilot will also focus on the newer elements of the PCE aspect of the REF; it does not include income, infrastructure and facilities (as previously assessed under REF environment), though these will continue to be an important part of the PCE element, and details will be provided as part of the wider REF 2029 guidance.
- The nature of the PCE Pilot Exercise means that there needs to be some flexibility in the guidance to allow for testing of different approaches. Participating HEIs will have flexibility in how they approach their submissions, and assessment panels will have flexibility in how they approach the assessment. It is valuable to test these different approaches in the Pilot exercise as it will help to develop an understanding of what is feasible and workable for assessment of PCE in REF 2029.
Introduction
- The Research Excellence Framework (REF) is the UK’s system for assessing the excellence of research in UK higher education institutions (HEIs). It first took place in 2014 and was conducted again in 2021. The next exercise is planned for 2029.
- The REF outcomes are used to inform the allocation of around £2 billion per year of public funding for universities’ research. The REF is a process of expert review, carried out by sub-panels focused on subject-based units of assessment (UoAs), under the guidance of overarching main panels and advisory panels.
- The purpose of the REF is to:
- Inform the allocation of block-grant research funding to HEIs based on research quality
- Provide accountability for public investment in research and produce evidence of the benefits of this investment
- Provide insights into the health of research in HEIs in the UK
- Development between exercises is necessary to reflect the changing research landscape. Therefore, the four UK higher education funding bodies launched the Future Research Assessment Programme (FRAP) to inform the development of the next REF. The funding bodies published initial decisions on high-level design of the next REF in June 2023.
- Redesigning the UK’s national research assessment exercise offers an opportunity to reshape the incentives within the research system and rethink what should be recognised and rewarded. Changes for REF 2029 include an expansion of the definition of research excellence to ensure appropriate recognition is given to the people, culture and environments that underpin a vibrant and sustainable UK research system.
- The three elements of the assessment have been renamed for REF 2029, and their content adjusted to reflect this expanded definition of research excellence. The weightings between the three elements will also be rebalanced.
- People, culture and environment (PCE)
- Contribution to knowledge and understanding (CKU)
- Engagement and impact (E&I)
- Changes to the three assessment elements will allow REF 2029 to recognise a broader range of research outputs, activities and impacts, and to reward those institutions that strive to create a positive research culture and nurture their research and research-enabling staff.
- The PCE element replaces the environment element of REF 2014 and 2021 and will be expanded to include an assessment of research culture. Evidence to inform assessment of this element will be collected at both institutional level and at the level of disciplinary submissions.
- The consultation on the Initial Decisions, published in the summer of 2023, probed the community’s thinking on most of the major decisions. In addition, feedback was specifically sought on the PCE element of the assessment.
- As part of REF 2029 development the assessment of PCE within the REF is being explored through two key projects:
- PCE Indicators Project: A project led by Technopolis and CRAC-Vitae, in collaboration with a number of sector organisations, which has developed indicators to be used for the assessment of PCE. The project team engaged extensively with the research community through workshops and a sector survey to co-develop a shortlist of indicators to be used to evidence and support institutions’ PCE submissions.
- PCE Pilot Exercise: The REF team is delivering a PCE Pilot Exercise to trial the assessment of PCE within REF. A selection of HEIs will prepare submissions in a sample of UoAs, and also at institution level. These submissions will draw on indicators developed by the PCE Indicators Project. The submissions will be assessed by a group of panels composed of members with expertise in research assessment, and in PCE generally within academia.
- Pilot UoAs were selected to enable the inclusion of a range of institution and submission types across the four Main Panels, to provide a general insight into the assessment of PCE, and to highlight particular issues or special considerations that may exist for the assessment of PCE; these considerations included specialist institutions, practice-based research, and the production of non-traditional research outputs.
- Similarly, HEIs participating in the pilot were selected from across the whole of the UK to give a broad range of different submission sizes, breadth of provision and experience of participation in previous REF exercises. You can view a list of pilot UoAs and participating HEIs here.
PCE pilot timetable
- A timetable of the REF PCE work is available on the REF website.
The assessment framework
- The proposed assessment framework has been structured around five factors which enable positive research culture. These are:
- Strategy: Having robust, effective and meaningful plans to manage and enhance the vitality and sustainability of the research culture and environment.
- Responsibility: Upholding the highest standards of research integrity and ethics, enabling transparency and accountability in all aspects of research.
- Connectivity: Enabling inter-disciplinary and cross-disciplinary approaches both within and between institutions, fostering co-creation and engagement with research users and society, and recognising and supporting open research practices.
- Inclusivity: Ensuring the research environment is accessible, inclusive, and collegial. Enabling equity for under-represented and minoritised groups.
- Development: Recognising and valuing the breadth of activities, practices and roles involved in research, building and sustaining relevant and accessible career pathways for all staff and research students, providing effective support and people-centred line management and supervision, supporting porosity and embedding professional and career development at all levels and across all roles.
- Within each of these enablers, a number of indicators have been identified, along with sources of quantitative and qualitative evidence that HEIs can use to illustrate performance against each indicator. HEIs will also be asked to provide narrative elements to support and contextualise their performance against the enablers.
- HEIs will make submissions at institution level and also at UoA level. For the purposes of the Pilot we are asking HEIs to return evidence and produce contextual elements supporting their performance across all indicators in the framework. In addition, if participating HEIs feel that their performance in PCE is better contextualised by other indicators, they are afforded the flexibility to include them. We recognise that this represents a significant effort in preparing submissions, but want the scope of the Pilot to be as wide as possible, incorporating as broad a range of indicators as possible. The Pilot exercise will be used as a vehicle to narrow down and focus the list of indicators and evidence sources.
- In the REF 2029 exercise we anticipate some indicators or evidence being mandatory requirements, while others will be optional. We will seek feedback from the participating HEIs and the assessment panels on which indicators and evidence should be considered in each category.
- Some indicators may lend themselves better to institution-level assessment and others more to subject-level assessment. In addition, some indicators and evidence may be more informative for some disciplines than others. These are areas which we would like to explore through the Pilot exercise and will be seeking feedback from participating HEIs and the assessment panels on the utility of the different indicators and evidence.
- You can download a copy of the assessment framework using the link below. The table gives an overview of the indicators of excellence covered by each of the factors enabling positive research culture. Panels will assess submissions against each of these enabling factors. They will consider the operating context of the institution, the distance travelled, and any reflection or learning that has been gathered for future development. In addition, examples of quantitative and qualitative evidence and contextual information are given, though HEIs may feel that other evidence better represents their performance in PCE and should not feel constrained by these examples.
PCE Pilot Exercise Assessment framework (PDF)
Process
Submissions
PCE pilot submission template (Microsoft Word document)
- HEIs participating in the pilot exercise will prepare their submissions between December 2024 and March 2025. Each HEI will prepare one or more subject-level submissions and an institution-level submission.
- You can download a copy of the submission template using the link above. A template should be completed for each subject-level submission and for the institution-level submission.
- The template provides space for HEIs to return quantitative and qualitative evidence and contextual information supporting the institution’s performance against the five enablers of research culture. For each enabler there are indicators and sources of evidence against which performance can be demonstrated. We are asking HEIs to provide as much evidence as possible to support all of the indicators and all of the enablers. HEIs will also provide contextual information to support their performance against the enablers.
- Word limits are indicated for each of the sections of the template to keep the burden of producing and assessing submissions manageable. The indicative word limit for each section of the template is 1,000 words. However, it is important that there is space in the templates for institutions to provide sufficient contextual information supporting their submissions; this may include charts or diagrams where appropriate. Therefore, if there are significant issues with the word limits indicated, or with other aspects of using the templates, participating HEIs should discuss these with the REF team.
- HEIs will be permitted to include additional evidence to support their submissions. Additional evidence should be considered as clarification or corroboration of the submission template. The template should be written to be assessed as a stand-alone piece; assessment panels will examine supporting evidence if they feel it is necessary, and it should not be assumed that the assessment panels will be able to read long reports or other supporting evidence.
- The PCE pilot is intended to include some flexibility, and institutions are likely to take different approaches to preparing their submissions; where institutions are making submissions to multiple UoAs within the Pilot, they may choose to take different approaches to these subject-level submissions. This flexibility is beneficial as it allows testing of a range of approaches and wider exploration of PCE assessment.
- A SharePoint site will be provided for HEIs to manage their submissions and for assessment panels to access the submissions for assessment. The completed templates should be uploaded to the SharePoint site; sub-folders have been provided for each subject-level submission and for the institution-level submission. Additional supporting evidence should be uploaded alongside the appropriate template. Participating HEIs will be contacted shortly with the appropriate URL. When uploading files to the SharePoint site the following naming convention should be used: “Institution name, Submission name (e.g. UoA and number or ‘institution level’), submission template/supporting evidence X.Y – evidence description”. For example (a short illustrative sketch follows these examples):
- “University of Poppleton, UoA 3, submission template” – to indicate a submission template for UoA 3: Allied Health Professions, Dentistry, Nursing and Pharmacy
- “University of Poppleton, institution level, supporting evidence 1.1 – institutional strategy document” – to indicate supporting evidence for an institution-level submission relating to section 1 of the template “STRATEGY”
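- As a purely illustrative sketch (not an official tool), the short Python snippet below shows one way the naming convention above could be applied programmatically. The helper function names and example values are hypothetical, with “University of Poppleton” used only as the customary fictional institution.

```python
# Illustrative only: hypothetical helpers for building file names that follow
# the pilot naming convention described above.

def template_name(institution: str, submission: str) -> str:
    """Name for a submission template file."""
    return f"{institution}, {submission}, submission template"

def evidence_name(institution: str, submission: str, ref: str, description: str) -> str:
    """Name for a supporting evidence file, with a template section reference (e.g. '1.1')."""
    return f"{institution}, {submission}, supporting evidence {ref} – {description}"

# Mirrors the examples given above (values are illustrative).
print(template_name("University of Poppleton", "UoA 3"))
print(evidence_name("University of Poppleton", "institution level",
                    "1.1", "institutional strategy document"))
```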
- Feedback will be sought from participating HEIs on the process of developing PCE submissions including the ease of collecting evidence to support the indicators, the robustness of the evidence, and the possibility of verifying the evidence through audit.
- Feedback is likely to be framed around the following example questions:
- How much effort/staff time went into the preparation of each submission?
- What worked well? What could be improved?
- Which indicators were straightforward to work with? Which indicators were problematic?
- What aspects of the template were positive? What aspects of the template were challenging?
- Which indicators are not robust or may be challenging to audit? Were there any concerns about data quality?
- Could collection of any of the indicators be automated?
- Are there other indicators not tested in the pilot that should also be considered?
- We plan to provide a Miro board or online collaborative space to facilitate discussions between participating HEIs and to capture initial feedback as HEIs are producing the PCE submissions. Details of this will be circulated to the participating institutions in January 2025.
Assessment
- The PCE Pilot Assessment will be conducted by eight subject-level assessment panels and an institution-level assessment panel. Panels will be asked to evaluate each of the submissions against the draft criteria outlined below. Criteria may be refined during or following the pilot, but our starting criteria are:
- Vitality: which will be understood as the extent to which the institution fosters a thriving and inclusive research culture for all staff and research students. This includes the presence of a clearly articulated strategy for empowering individuals to succeed and engage in the highest quality research outcomes.
- Sustainability: which will be understood as the extent to which the research environment ensures the health, diversity, wellbeing and wider contribution of the unit and the discipline(s), including investment in people and in infrastructure, effective and responsible use of resources, and the ability to adapt to evolving needs and challenges.
- Rigour: which will be understood as the extent to which the institution has robust, effective, and meaningful mechanisms and processes for supporting the highest quality research outcomes, and empowering all staff and research students. This includes the sharing of good practices and learning, embracing innovation, robust evaluation and honest reflection demonstrating a willingness to learn from experiences.
- Final criteria for REF 2029 will be developed by the REF sub-panels. These will incorporate the outcomes of the PCE pilot exercise and will also take into account other elements of the research environment (e.g. income, infrastructure and facilities).
- Panels will evaluate evidence and contextual information and will be asked to assign a score to each of the enabling factors of the assessment framework. Scores will be based on the working definitions of quality descriptors outlined below. These descriptors are likely to be refined during and following the pilot. Final descriptors will be developed by the REF sub-panels and will take into account other elements of the research environment (e.g. income, infrastructure and facilities):
- 4 star: Provides robust evidence of a culture and environment conducive to producing research of world-leading quality and enabling outstanding engagement and impact, in terms of their vitality, sustainability, and rigour. There is evidence that the policies and measures in place at the institution are having a positive impact on PCE within the institution, and furthermore collaboration and sharing of good practice and learning mean that there is also influence outside the institution.
- 3 star: Provides robust evidence of a culture and environment conducive to producing research of internationally excellent quality and enabling very considerable engagement and impact, in terms of their vitality, sustainability, and rigour. There is evidence that the policies and measures in place at the institution are having a positive impact on PCE within the institution.
- 2 star: Provides robust evidence of a culture and environment conducive to producing research of internationally recognised quality and enabling considerable engagement and impact, in terms of their vitality, sustainability, and rigour. There is evidence that the policies and measures in place to positively influence PCE at the institution are being adhered to.
- 1 star: Provides robust evidence of a culture and environment conducive to producing research of nationally recognised quality and enabling recognised but modest engagement and impact, in terms of their vitality, sustainability, and rigour. There is evidence that policies and measures are in place which are intended to have a positive impact on PCE at the institution.
- Unclassified: Evidence provided is not robust, or evidence suggests a culture and environment conducive to producing research falling below nationally recognised standards.
- Assessment panels will be provided with access to the SharePoint site to allow panel members to review the submissions relevant to their UoA. All submissions will be read by multiple members of the assessment panel and the panel will reach a collective decision on the score for each of the factors enabling excellent research culture. Spreadsheets will be provided for recording individual scores and for the agreed panel scores.
- The agreed panel scores for each enabling factor will be used to construct a quality profile for each PCE submission. For the purposes of the pilot, each enabling factor will be given equal weighting. Each participating institution will be provided with its own quality profiles; these will be for information only. Summary data is likely to be published in the final report, but individual scores for enabling factors, submissions, or participating institutions will not be published.
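- To make the equal-weighting arithmetic concrete, the sketch below shows one way a quality profile could be assembled from agreed panel scores. It assumes the usual REF convention that a profile is a percentage distribution across star levels, and the factor scores shown are invented purely for illustration.

```python
# Hypothetical agreed panel scores for the five enabling factors (values invented for illustration).
scores = {
    "Strategy": "4 star",
    "Responsibility": "3 star",
    "Connectivity": "3 star",
    "Inclusivity": "4 star",
    "Development": "2 star",
}

# Equal weighting: each of the five factors contributes 100 / 5 = 20 percentage points
# to the profile at its agreed star level.
weight = 100 / len(scores)
profile: dict[str, float] = {}
for star in scores.values():
    profile[star] = profile.get(star, 0.0) + weight

print(profile)  # {'4 star': 40.0, '3 star': 40.0, '2 star': 20.0}
```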
- Assessment panels will be given some flexibility in their approach to the assessment of PCE. The Pilot is intended to explore the possible differences between assessment of different subject areas and therefore different subject-level assessment panels may conduct their assessments in different ways. In addition, the assessment panels will be refining and developing the assessment process as the Pilot exercise progresses. For these reasons it would not be robust to make direct comparisons between submission scores from one UoA to another, or from one submission to another within the same UoA.
- Feedback will be sought from the Pilot panels on the process of assessing PCE submissions including the utility of each of the indicators, the robustness of the evidence provided, and the possibility of verifying the evidence through audit. Feedback will inform the development of the guidance for REF 2029. The guidance for the PCE assessment in REF 2029 will likely vary significantly from the assessment in the Pilot exercise.
- Feedback is likely to be framed around the following example questions:
- Which indicators were straightforward to work with? Which indicators were problematic?
- Which indicators could be considered mandatory? Do any indicators lend themselves to certain institutional types?
- What aspects of the template were positive? What aspects of the template were challenging?
- Which indicators are not robust or may be challenging to audit? Were there any concerns about data quality?
- Are there other indicators not tested in the pilot that should also be considered?
- What could be improved?
Conclusions
- The Pilot is expected to conclude in Autumn 2025. As noted above, feedback will be sought from the participating HEIs and from the assessment panels. Feedback will be gathered during the assessment (through Miro boards or online collaborative spaces) and will also include some reflective workshops and additional evidence gathering at the close of the exercise. A report will be published outlining the findings of the Pilot exercise. The approach to assessment of PCE in REF 2029 will need to be integrated into the broader REF framework and therefore final policy decisions will follow after publication of the PCE Pilot report. Final guidance for REF 2029 will be developed by the REF sub-panels taking account of the findings of the Pilot exercise.