
Web forum archive

Evidence check: Department's use of evidence

Education Committee

Thank you for all comments submitted to the ‘Evidence Check’ forum. The forum is now closed. The comments received will help the Committee evaluate the evidence submitted by the Department for Education.

The Committee will use the comments to select topics for one-off oral evidence sessions in early 2015.


154 Contributions (since 18 November 2014)
Closed for contributions


This web forum is displayed for archive purposes and is no longer accepting public contributions. For queries relating to the content of this web forum, please contact the Education Committee.

Total results 154 (page 2 of 16)

Ben Durbin (NFER)

11 December 2014 at 20:23

Areas for attention (2)

Further areas for attention include:
• Strategic research commissioning that tackles issues of longer-term interest (including issues that transcend – or at least anticipate – changes of personnel and political priorities), to increase the chances that suitable evidence is available at the point policy is being developed. The publication by the Department earlier this year of its research priorities in a range of policy areas was a welcome first step.
• Greater support to schools in their own use of evidence (analysis of pupil data, use of formative assessment, and use of external evidence on effective practice). We welcome the Department’s support for an independent College of Teaching with a role in professional development informed by evidence.
• A continuing commitment to methods such as longitudinal analysis (based on funding for datasets such as the Longitudinal Study of Young People in England (LSYPE)), randomised controlled trials that test proposed policy initiatives (building on the EEF’s work trialling in-school interventions), cost-effectiveness analysis, and appropriate use of international comparisons.

Ben Durbin (NFER)

11 December 2014 at 20:21

Areas for attention (1)

The Department’s use of evidence could be improved through:
• Greater transparency regarding the evidence base used to inform decision making, underpinned for example by systematic literature reviews. This Select Committee process is therefore a welcome measure in support of this goal.
• Evaluation being integral to the policymaking process, considered during policy design rather than implementation wherever possible. This would avoid the scenario whereby the rigour (and hence usefulness) of an evaluation is limited by its being considered too late.
• Greater systematic evaluation of policies’ impact on the wider education system, especially for those that are difficult to pilot (such as new curricula or accountability systems). Such evaluation should give due regard to a wide range of intended and unintended consequences, and should provide independent scrutiny of the underlying policy rationale and its real-world impact. In recent years there has been a distinct lack of government-funded evaluation of major initiatives such as changes to the curriculum, qualifications, assessment, and school structures.

Ben Durbin (NFER)

11 December 2014 at 20:20

DfE makes use of a range of evidence, but could do more

The Department already has extensive capabilities to support this process, including administrative datasets of pupils, teachers and institutions; access to live views from panels such as NFER’s Teacher Voice; and the ability to link many of these datasets together. Moves over the past few years to widen access to these datasets (especially the National Pupil Database) are welcome, as was the establishment of the EEF to deliver a supply of robust, independent evidence for schools. However, there are opportunities for further improvement to ensure suitable evidence is available to the right people at the time they need it. This is true both of policymaking within the Department and across the wider education sector (on which the Department also has an influence).

Ben Durbin (NFER)

11 December 2014 at 20:19

Evidence has an important role to play in policymaking

At its most effective, evidence should be used at multiple points in the policy shaping and implementation process. Evidence can provide:
• An understanding of the past: the impacts of past policy interventions and the lessons that can be learned;
• Insight into the present: what is the current situation, objectively (e.g. pupil numbers or results) and subjectively (e.g. the attitudes and experiences of teachers);
• Predictions of the future: what might occur in a range of policy scenarios, based on a well-informed understanding of past research and the current environment.

Ben Durbin (NFER)

11 December 2014 at 20:18

Evidence has an important role to play in policymaking. DfE currently makes use of a range of evidence, but could do more. In the following four posts, we expand on these points in more detail, suggesting some areas for further attention.

Nick Johnson, British Educational Research Association

11 December 2014 at 13:51

BERA supports the commitment from the Department to promote evidence-based practice. However, the key questions are: which and whose evidence? As a body with members representing the vast diversity of educational research in both subject and methodological approach, BERA would be wary of any type of evidence being prioritised ahead of another. For instance, Randomised Controlled Trials (RCTs) might be good at showing that something works, but not why, or under which conditions; answering that often requires qualitative research. There are substantial difficulties in moving from trial results to ‘evidence-based practice’. Research on educational change over many years is clear that it is often not knowledge that we lack; it is implementation.

BERA believes that we need an ongoing process and infrastructure for evaluating, synthesising and communicating existing and new knowledge, accumulated over decades of research. This needs to be protected against political interference. Secondly, we need to create the organisational cultures and conditions needed to roll out and scale up innovations in naturalistic settings. This will need as much research effort as the development of formal trials. Thirdly, our own 18-month Inquiry into the relationship between research and teacher education concluded that, internationally, enquiry-based (or ‘research-rich’) school and college environments are the hallmark of high-performing education systems. To be at their most effective, teachers and teacher educators need to engage with research and enquiry – this means keeping up to date with the latest developments in their academic subject or subjects and with developments in the discipline of education.

Therefore, publishing evidence is not enough – there needs to be a sustained focus on enquiry-based practice during initial teacher education programmes and throughout teachers’ professional careers, so that disciplined innovation and collaborative enquiry are embedded within the lives of schools and colleges and become the normal way of teaching and learning, rather than the exception. www.bera.ac.uk

Jonathan Breckon

11 December 2014 at 12:25

The Department for Education should be commended for its leadership and investment in using research evidence to inform policy and practice. It is, for instance, a co-funder of ‘what works centres’ such as the Education Endowment Foundation. The Department should also be applauded for publicising its research needs, so that we know its priorities and where there are gaps. Establishing a ‘college of teaching’ has also been given tacit support; the college will play a role in disseminating the evidence for best practice.

However, the evidence submitted to the select committee has important gaps. To take the systematic phonics instruction memorandum, some of the evidence is inconclusive and weak. The memorandum mentions reviews of RCTs that have created ‘sound evidence’, according to the Department, such as the review carried out for the Department by Professor Carole Torgerson. But Professor Torgerson herself has cast doubt on the research. Although there was some promising evidence from the US, and from a small-scale study in Clackmannanshire, Scotland, that this approach worked, the evidence was relatively weak. Her review found only a dozen small trials, the biggest of which involved 120 children. She has urged caution in making national policy based on such weak evidence. The memorandum also mentions 12 Ofsted reports on primary schools that showed strong performance related to phonics teaching. But there are around 16,000 state-funded primaries in England, so 12 is not representative. And these are not research studies, just Ofsted inspections.

Listing evidence is not enough. We have to dig deeper and make judgments on the relevance, replicability and rigour of the evidence. A key approach is to make judgments on the standard of evidence. One way of doing this is to use formal hierarchies of evidence. They are common practice in health and medicine, but are now increasingly used in social policy. For instance, evidence standards are used by bodies such as the Early Intervention Foundation (one of the What Works Centres), Project Oracle, the Social Research Unit, and the national innovation charity Nesta (see here for one example: http://www.nesta.org.uk/publications/standards-evidence-impact-investing ). Nesta, for instance, uses a 1-5 scale: 1 = a clear articulation of how something could have a positive impact; 5 = evidence that it can deliver impact at scale.

Another example of using only the most appropriate and highest-quality evidence comes from the DfE-backed Education Endowment Foundation. Its Teaching and Learning Toolkit (produced jointly with the Sutton Trust) is an accessible summary of educational research which provides guidance for teachers and schools on how to use their resources to improve the attainment of disadvantaged pupils. But the Toolkit doesn’t include just any research: it prioritises systematic reviews of research and quantitative syntheses of data such as meta-analyses of experimental studies.

There are also ways of checking the quality of qualitative research evidence. A successful model for quality-controlling evidence comes from international development: the UK network BOND’s ‘Evidence Principles’ checklist, used by many NGOs to help commission, design or review evidence, is particularly relevant to qualitative research. The HM Treasury Magenta Book also has a special supplement on checking the standards of qualitative evaluation.

The DfE and the select committee should consider using standards of evidence. Such standards are not new; they have been used for many years in the US (see, for instance, the GRADE system, the Maryland Scale of Scientific Methods, or the US Department of Education’s What Works Clearinghouse) and there is good practice in the UK and expertise within the Department to build on. We should applaud the Education Select Committee for taking this on, and we should do more of such evidence-checks. We hope that this becomes a regular exercise.

Showing the evidence behind policies should be normal practice for all departments. The Campaign for Science and Engineering has argued for all of government to open up the research behind policies: http://sciencecampaign.org.uk/CaSE2015ScienceinGovernmentBriefing.pdf Being transparent about evidence use is good. But we also need to make judgments on the credibility of the research that is revealed. Using standards of evidence is one way to help make such judgments.

Jonathan Haslam, Institute for Effective Education

11 December 2014 at 11:23

There have been positive changes over the last few years in the way that the Department for Education commissions and uses evidence. Its most significant step was the establishment of the Education Endowment Foundation (EEF). The EEF's substantial endowment, its mission to determine what can improve outcomes for disadvantaged children, and its commitment to robust research have challenged our understanding of "what works" in schools. Its findings have been at times surprising, and its commitment to robust methodology, including randomised controlled trials, means that this growing body of evidence cannot simply be dismissed or ignored. The establishment of the Children's Social Care Innovation Programme (CSCIP) evaluation framework, again with an emphasis on robust, adequately funded research, has also shown a commitment to good-quality evaluation of initiatives. Wherever possible, the Department should continue to evaluate new policies with robust, independent trials, and ensure that innovations are robustly evaluated before being rolled out nationally. The Department has also shown that it is supportive of embedding this evidence in practice. It sponsored an event for the Coalition for Evidence-based Education, looking at how the education "ecosystem" might be changed to promote the use of evidence, and it hosted an ESRC seminar on the difficulties of conducting large-scale evaluations in education. There is still a challenge in incorporating research evidence into the policy-making process. Some aspects of policy are a legitimate matter of democratic debate, in terms of the kind of society that we hope to achieve. Others, however, such as the best way to teach reading, should be decided on the basis of the current best research evidence, with a commitment to further research in areas of uncertainty or ambiguity.
The way to bring this about is not through any kind of mandatory approach but, naturally enough, through education. Wider understanding of what the research evidence tells us is vital for policy makers, practitioners, and the electorate. Put simply, it would mean that when new initiatives are proposed, the first question asked from all quarters would be "what is your evidence that this will work?" Good progress has already been made. The Education Endowment Foundation has been given the role of a What Works Centre for Education. It has begun to investigate the processes that are most successful in getting research evidence used, and has started turning its attention to disseminating the results of its own research. There is some confusion between the role of the EEF as an evaluator (looking at the impact on pupil premium pupils) and as a What Works Centre (looking at the impact on all pupils), and this should be clarified. The Education Media Centre, a project of the Coalition for Evidence-based Education recently established with the support of a variety of funders, is providing an important service in responding to high-profile education stories in the media with the research evidence behind the headlines. At the IEE we publish a magazine, Better: Evidence-based Education, and a fortnightly e-newsletter, Best Evidence in Brief, both of which aim to present research from around the world in an accessible format. These efforts need scaling up, and this should be done by an organisation that is independent of government. Presentation of information on research must not shirk issues of complexity or ambiguity; giving clear messages on "what works" when the evidence is not clear will only do harm. As more research is carried out, these gaps in knowledge will be filled.
For practitioners and policy makers, a more detailed understanding of the state of the research evidence is needed, and this may require face-to-face support, whether from researchers, intermediaries or peers, in understanding both what the research says and how it was created. Ultimately, the whole field – practitioners, parents, policy makers and researchers – must move forward together by sharing the lessons from research. This will support the Department in making the most of research in the improvement of policy and practice.

Keith Jones

11 December 2014 at 10:48

The Education Statistics branch of the DfE does good work. It is a positive step that the DfE has published “Building evidence into education”, and it is also good to see the DfE providing a founding grant of £125m for the Education Endowment Foundation (EEF). Yet both of these latter steps promote randomised controlled trials above other forms of evidence-gathering. In my field of research in mathematics education there is considerable high-quality research, not all of which depends on randomised trials. Such research is essential if there are to be well-designed randomised trials. The DfE could make better use of evidence generated by research in mathematics education, both in its policymaking and in its mechanisms for analysing evidence.

Keith Jones
University of Southampton Mathematics and Science Education Research Centre

Dame Julia Higgins FRS

10 December 2014 at 17:31

As a Fellow of the Royal Society, the UK’s National Academy of Science, I am particularly concerned that policy for education and policy for science are properly informed by robust evidence. This should be standard practice. It is essential, then, that the use of evidence by the Department for Education, and indeed by all Government departments, should be examined internally and by independent experts. In the Royal Society’s Vision for science and mathematics education, published earlier this year, we drew particular attention to concerns about the extent to which policy is evidence-informed and/or tested and suggested some approaches for improving this. Ensuring that the UK’s education systems meet the needs of future generations and our economy is crucially important and represents an enormous challenge, particularly for those in government who have principal responsibility for leading on this. Better education policy-making is likely to result from closer collaboration between policy-makers, the education research community, practitioners and the wider public. Government departments and their agencies need to ensure they are comprehensively informed by evidence, understanding that this may come from diverse sources, and demonstrate how they have considered this when determining new policy. From my perspective as Chair of the Royal Society’s Education Committee, it is apparent that this does not happen consistently – yet! The Society will be undertaking a programme of work aimed at ensuring education policy and practice are better informed by evidence, and is looking forward to working with others in the education community on this.
