
Web forum archive

Evidence check: Department's use of evidence

Education Committee

Thank you for all comments submitted to the ‘Evidence Check’ forum. The forum is now closed. Comments received will help the Committee evaluate the evidence received from the Department for Education.

The Committee will use the comments to select topics for one-off oral evidence sessions in early 2015.

Return to the Evidence Check web forum homepage

154 Contributions (since 18 November 2014)
Closed for contributions

This web forum is displayed for archive purposes and is no longer accepting public contributions. For queries relating to the content of this web forum, please contact the Education Committee.

Total results 154 (page 1 of 16)

Centre for the Use of Research and Evidence in Education

15 December 2014 at 12:36

(2/2) As noted earlier, active ‘pull through’ interest in research by classroom practitioners and school leaders has grown significantly in the recent past (though, let’s not get too carried away, from a low base), and they are frequently frustrated by:
a) the gaps in the evidence base, particularly studies which focus on effective pedagogies at subject level. There is a reasonable body of work about the ‘core’ of English/literacy and maths/numeracy but very little on much of the rest of the curriculum. This matters: research about effective CPD and improvement increasingly emphasises both the importance of subject-based/subject-rich professional learning and the lack of impact of generic pedagogic interventions;
b) the difficulty in tracking down valid and relevant material presented in forms which are useful to hard-pressed practitioners – a problem which has grown worse since the various arrangements in place to take this forward were dismantled in the early days of the new Administration.
So, in summary, we applaud the efforts by the Department to nudge the system in the direction of evidence-informed teaching, policy and practice, and we think it would be even more effective if:
• the limitations of randomised control trials were acknowledged and there was support for other equally valid research forms (including qualitative studies which would reveal HOW something worked and not just test IF it did);
• there was a substantial programme of subject-focused pedagogical studies;
• trials were conducted into current widespread practices (for instance setting in maths) rather than focusing on innovation;
• evaluations of programmes and projects commissioned directly or indirectly by the Department were properly specified from the outset and funded appropriately, so that the outcomes can be meaningful and relevant;
• the Department acknowledged the legitimate demand from practitioners for access to good-quality, relevant research presented in forms useful to them, and dedicated some resource to developing this with some urgency (this might become a promotional role for a College of Teachers, but only in the long term).

Centre for the Use of Research and Evidence in Education

15 December 2014 at 12:35

(1/2) The Centre for the Use of Research and Evidence in Education (CUREE) is an internationally recognised centre of excellence in the application of research and other sources of evidence to education policy, leadership and practice. Our interest is primarily in those policies which support the impact of evidence on practice, so our focus is less on what the Department does with evidence itself and more on how it sponsors and supports its use in the wider system. First, we want to recognise and salute the Department’s public commitment to raising both the quality and the use of robust evidence in the system. Some of the signs of this have been:
a) support of the Education Endowment Foundation and its programme of research, and support for a round of Research Informed Practice trials;
b) the commissioning, direct from the Department, of a number of randomised control trials;
c) the large-scale investigation of interventions designed to close the achievement gap (via the NCTL Closing the Gap: Test and Learn project);
d) the growth in interest at practitioner level in evidence-informed practice – most notably evidenced by the grass-roots-led ResearchED movement.
The other major contribution from Government (not just in education) is the commitment to making system performance data widely available to the research community and the general public – a strategy which, though this Administration did not invent it, has been given a great boost by it. Requiring schools and colleges to publish a range of ‘facts’ about themselves (including, for instance, Pupil Premium measures) and making data more accessible via the Ofsted Data Dashboard are all useful and practical measures in waking the system up to the importance of evidence to underpin policy and practice. Critics would argue that the willingness of the Government to tell others to be more evidence informed is not modelled by its own behaviour.
They cite the expansion of academisation, the free schools policy and the synthetic phonics programmes as examples of initiatives where the evidence was weak or cherry-picked in the first place and which have not been subject to rigorous evaluation. We recognise that in the real world of public policy:
• governments are elected on a values agenda, and WHAT they choose to implement is legitimately driven by those values too;
• research is often too equivocal, too hedged about with qualifications and, above all, too slow to be valuable to policy makers who are working within an electoral cycle. We note, though, that jurisdictions ranked high in PISA (Ontario, Hong Kong, Shanghai, Finland, Singapore) have managed to establish longer-term (10-year) policy cycles for education.
We would, however, agree that:
a) there is not enough evaluation, nor enough use of new initiatives as an opportunity for increasing evaluative capacity;
b) though the WHAT of public policy is the politicians’ legitimate domain, the HOW is a technical/professional question which is, and should be, entirely open to examination on an evidence basis;
c) the Department should, therefore, be open to piloting/trialling new initiatives on a smaller scale in the first place;
d) evaluations, when they are commissioned, are often underfunded, bolted on as an afterthought, and unrealistic about how meaningful the results can be given the stage of the programme, the quantity/quality of the evidence available and/or the costs of securing pupil/student impact data;
e) pushing the system in the direction of randomised control trials was entirely justified (whereas the sometimes emotional resistance to RCTs in education was not), but there is a risk of this approach being held up as the only legitimate one. RCTs have limitations, and we see signs of them being bolted on by programme designers who, frankly, don’t know what they are specifying.
There is an additional risk of RCT fatigue in the wider system, not helped by the small number of EEF-funded trials reporting positive outcomes. A scientist would argue that negative results are as significant and valuable as positive ones, but this is not how the practitioner on the street sees it. It would indeed be a valuable outcome if current widespread practices were subject to trials and shown to be ineffective. But the activities chosen for trialling have, up to now, mostly not been current practice but innovations (at least in the UK context), and a succession of failures in these trials risks encouraging cynicism amongst those whose behaviour we are trying to influence.

The Wellcome Trust

15 December 2014 at 08:58

This is an important area for the Wellcome Trust and we are pleased to comment. We use, wherever possible, an evidence-based approach towards improving science education, and are committed to increasing the quality, quantity and accessibility of the evidence that is available to teachers, technicians, school leaders and governors. DfE is increasingly acknowledging the importance of evidence in education, demonstrated, for example, by the review commissioned from Ben Goldacre in 2013 and its 15-year funding commitment to the Education Endowment Foundation (EEF) in 2011. We have been pleased to work with the EEF on our Education and Neuroscience initiative — this aims to build expertise at the interface between neuroscience and education, and ensure that educators can make informed choices. DfE also contributes to the research base through surveys such as the Longitudinal Study of Young People in England. However, DfE’s investment in research to inform policy, including monitoring the impact of new initiatives, must be adequately funded. The science, engineering and technology statistics show that DfE’s research and development budget halved in just one year, from £28 million in 2010/11 to £13 million in 2011/12. We are concerned that the Department will be unable to produce vital evidence on new initiatives and evaluate policy changes, and could become reliant on external organisations to fund necessary research. In fact, the Gatsby Charitable Foundation, the Nuffield Foundation and the Wellcome Trust will be carrying out a long-term monitoring programme to capture changes in the quality and quantity of practical science in UK schools in order to evaluate the impact of recent curriculum and assessment reform. Arguably, DfE should be responsible for leading such critical research, but with current research budgets this seems less and less likely.
DfE’s Chief Scientific Adviser should provide scientific advice to Ministers, and should ensure that they apply the ‘Principles of scientific advice to government’. These include “publicly explain[ing] the reasons for policy decisions, particularly when the decision is not consistent with scientific advice and in doing so, should accurately represent the evidence”. It is essential to be transparent when evidence is not available, or not available to an appropriate standard. DfE should also be more stringent in ensuring there is robust research into the necessity of new initiatives, including curriculum change: consulting the available evidence, and ensuring there is sufficient time for piloting and for making necessary adjustments if the testing indicates that wider implementation is appropriate. The teaching workforce should be actively encouraged to use evidence to improve practice. We support the proposed College of Teaching and its intention to increase the professional status of teaching and drive forward the culture change which is already starting to make this a more evidence-based profession. The College is a good opportunity to make evidence more accessible to teachers and create a structure where they are rewarded for using good evidence in practice, as well as for accessing high-quality continuing professional development (CPD). We continue to work with the National Science Learning Centre to test and improve its delivery of subject-specific CPD for science teachers and technicians, to which DfE also contributes through Project ENTHUSE. We would be happy to discuss any of these points in more detail with the committee, and appreciate the opportunity to input into this evidence check. We are pleased that evidence in education is high on the agenda for 2015.

Lesley Gannon (NAHT)

15 December 2014 at 08:54

NAHT is grateful that the DfE has been increasingly willing to invest in small-scale pilots, such as the NAHT Aspire project, and to take time to evaluate how emerging approaches to school improvement work in practice. We have also welcomed the increased number of opportunities schools have been given recently to share their good practice and provide the Department with fresh thinking from within the profession. NAHT is always keen to explore educational success stories from across the world and assess what can be learnt and applied within a British context; in doing so, however, we remain mindful of the significant contextual factors at play. Teaching and assessment strategies that have proved successful in one country cannot simply be imported wholesale and guarantee the same results. Too often it appears that the Department’s enthusiasm for international performance datasets results in an unhelpful over-emphasis on the work of other nations, and a reluctance to confront the specific social and economic challenges faced by our schools. An evidence-based approach to educational policy cannot simply begin and end with the use of evidence to justify the introduction of a new approach; it must extend to ongoing monitoring and evaluation of that approach. Such monitoring should include an assessment of unintended consequences for pupils, schools and the system as a whole. The breakneck speed at which some policy areas have been rolled out nationally, and the seeming unwillingness to move away from initiatives that have been found to have limited success, are causes for concern. NAHT believes that the Department’s definition of evidence perhaps needs to be broadened to place greater weight on the views of school leaders and those working within schools.
Too often the practical implementation guidance offered by those within the system, via consultation or other feedback mechanisms, appears to be ignored, only for the very problems that were highlighted to emerge and require remedial action. NAHT would welcome a further opportunity to discuss these and other issues with the committee in the New Year.

Lesley Gannon, Head of Research and Policy Development, NAHT

Professor Robin Alexander, Cambridge Primary Review Trust

14 December 2014 at 11:54

PART 1 OF 3 Several contributors to this section commend DfE for its commitment to evidence, but surely this is a minimum condition of good governance, not a cause for genuflection. More to the point are the concerns of Dame Julia Higgins that DfE’s use of evidence is inconsistent (or, as Janet Downs puts it, ‘slippery’) and those of many other contributors across the Committee’s nine themes who find DfE overly selective in the evidence on which it draws and the methodologies it prefers. The principal filters appear to be ideological (‘is this researcher one of us?’) and electoral (‘will the findings boost our poll ratings / damage those of the opposition?’), and such scientifically inadmissible criteria are compounded by DfE’s marked preference for research dealing in big numbers, little words and simple solutions. In the latter context, we should be wary of endorsing without qualification the view of several contributors that the randomised control trial (RCT) is the evidential ‘gold standard’, trumping all other attempts to get at the truth. Education is complex and contested, and its central questions are as much ethical as technical – a challenge which the fashionable but amoral mantra ‘what works’ conveniently ignores. The RCT language of ‘treatment’ and ‘dosage’ is fine for drug trials but is hardly appropriate to an activity which is more craft and art than science, and in untutored hands the effort to make teaching fit this paradigm may reduce to the point of risibility or destruction the very phenomena it claims to test. I should add that I make these observations not as a disappointed research grant applicant but as a recipient of substantial funding from the rightly esteemed Education Endowment Foundation for a ‘what works’ project involving an RCT. Of the nine ‘evidence check’ memoranda submitted to the Committee by DfE, those on phonics, the school starting age and the National College most conspicuously display some of the tendencies I’ve so far identified.
Thus the defence and citations in DfE’s phonics statement neatly sidestep the methodological controversies and evidential disputes surrounding what is now the government’s mandated approach to teaching reading, so the contributor who applauds DfE’s grossly biased bibliography as ‘accurate’ is plain wrong. DfE’s school starting age citations carelessly - or perhaps carefully - attribute a publication of the Cambridge Primary Review (Riggall and Sharp) to NFER, but again avoid any evidence running counter to the official view that children should be packed off to school as soon as possible; or the more nuanced finding of the Cambridge Primary Review that the real issue is not the starting age for formal schooling but the availability and quality of early years provision, wherever it takes place; or indeed the inconvenient truth that some of this country’s more successful PISA competitors start formal schooling one or even two years later than England. As for the National College of Teaching and Leadership (NCTL), no independent evidence is offered in support of DfE’s insistence that this agency, and the models of teacher training and school improvement it espouses, justify its consumption of public funds. Only two publications are cited in DfE’s ‘evidence check’. One is NCTL’s statement of accounts; the other a DfE press release which is neither evidence nor independent. Proper evaluation of NCTL became all the more essential when DfE abolished the relatively ‘arms length’ bodies that NCTL subsumed and charged it with ‘delivering’ approved policies. Of course NCTL can be shown to be effective in relation to the delivery of policies x and y. But what if those policies are wrong? The Committee has received many unhappy comments from parents about schools’ draconian responses to term-time absences. 
These highlight a further problem: there are important areas of educational policy, at both school and national level, where evidence is rarely or never on view and parents and the electorate are expected to comply with what may be little more than unsubstantiated claims. In the case of those blanket bans on term-time absence about which so many parents complain to the Committee, as with the tendency to fill more and more of children’s (and parents’) waking hours with homework (i.e. schoolwork done at home) of variable and in some cases little educational value, there appears to be a deep-seated assumption that schools have a monopoly of useful learning. The Cambridge Primary Review scotched this mistaken and indeed arrogant belief in the comprehensive research review on children’s lives outside school that it commissioned from Professor Berry Mayall. Except that the then government preferred summarily to reject the evidence and abuse the Review team rather than engage with the possibility that schools might do even better if more of them understood and built on what their pupils learn outside school.

Professor Robin Alexander, Cambridge Primary Review Trust

14 December 2014 at 11:52

PART 2 OF 3 So although the Education Committee has applied its ‘evidence check’ to nine areas of policy, it might also consider extending its enquiry in two further directions: first, by examining the evidential basis of policies and initiatives, such as those exemplified above, about which teachers, parents and indeed children themselves express concern; second by adding some of those frontline policies which DfE has justified by reference to evidence but which are conspicuously absent from the Committee’s list. Examples in the latter category might include: (i) the government’s 2011-13 review of England’s National Curriculum; (ii) the development of new requirements for assessment and accountability in primary schools; (iii) the rapid and comprehensive shift to school-led and school-based initial teacher education; (iv) the replacement of the old TDA teacher professional standards by the current set; (v) the strenuous advocacy and preferential treatment of academies and free schools. Each of these illustrates, sometimes in extreme form, my initial concerns about politico-evidential selectivity and methodological bias. Thus in the 2011-13 national curriculum review ministers deployed exceptionally reductionist and naive interpretations of the wealth of international evidence with which they were provided by DfE officials and others. They resisted until the last moment overwhelming evidence about the educational centrality of spoken language. They ignored Ofsted warnings, grounded in two decades of school inspection (and indeed evidence going back long before Ofsted) about the damage caused by a two-tier curriculum that elevates a narrow view of educational basics above all else – damage not just to the wider curriculum but also the ‘basics’ themselves. 
And they declined to publish or act on their own internal enquiry which confirmed the continuing seriousness of the challenge of curriculum expertise in primary schools, an enquiry which – and this much is to ministers’ credit – DfE undertook in response to, and in association with, the Cambridge Primary Review. The report of that enquiry, and the wider evidence that informed it, still awaits proper consideration. A job for the Education Committee perhaps? Similarly, DfE, like its predecessor DCSF, has stubbornly held to its view – challenged by the Education Committee as well as numerous research studies and the Bew enquiry – that written summative tests are the best way both to assess children’s progress and hold schools and teachers to account, and that they provide a valid proxy for children’s attainment across the full spectrum of their learning. Then, and in pursuit of what has sometimes looked suspiciously like a vendetta against those in universities who undertake the research that sometimes rocks the policy boat, DfE has ignored international evidence about the need for initial teacher education to be grounded in equal partnership between schools and higher education, preferring the palpable contradiction of locating an avowedly ‘world class’ teacher education system in schools that ministers tell us are failing to deliver ‘world class’ standards. Relatedly, DfE has accepted a report from its own enquiry into professional standards for teachers which showed even less respect for evidence than the earlier and much-criticised framework from TDA, coming up with ‘standards’ which manage to debase or exclude some of the very teacher attributes that research shows are most crucial to the standards of learning towards which these professional standards are supposedly directed. 
Finally, in pursuit of its academies drive government has ignored the growing body of evidence from the United States that far from delivering superior standards as claimed, charter schools, academies’ American inspiration, are undermining public provision and tainted by financial and managerial corruption. England may not have gone that far, but new inspection evidence on comparative standards in academies and maintained schools (in HMCI’s Annual Report for 2013-14) should give the Committee considerable pause for thought about the motivation and consequences of this initiative.

Professor Robin Alexander, Cambridge Primary Review Trust

14 December 2014 at 11:51

PART 3 OF 3 In relation to the Committee’s enquiry as a whole, the experience of the Cambridge Primary Review (2006-10) and its successor the Cambridge Primary Review Trust is salutary, depressing and (to others than hardened cynics) disturbing. Here we had the nation’s most comprehensive enquiry into English primary education for half a century, led by an expert team, advised and monitored by a distinguished group of the great and good, supported by consultants in over 20 universities as well as hundreds of professionals, and generating a vast array of data, 31 interim reports and a final report with far-sighted conclusions and recommendations, all of them firmly anchored in evidence, including over 4000 published sources. Far from welcoming the review as offering, at no cost to the taxpayer, an unrivalled contribution to evidence-based policy and practice in this vital phase of education, DCSF - DfE’s predecessor – systematically sought to traduce and discredit it by misrepresenting its findings in order to dismiss them, and by mounting ad personam attacks against the Review’s principals. Such behaviour in the face of authoritative and useful evidence was unworthy of holders of elected office and, for the teachers and children in our schools, deeply irresponsible. It is with some relief that we note that DfE’s stance towards the Review and its successor the Cambridge Primary Review Trust has been considerably more positive under the Coalition than under Labour, and we record our appreciation of the many constructive discussions we have had with ministers and officials since 2010. Nevertheless, when evidential push comes to political shove, evidence discussed and endorsed in such meetings capitulates, more often than not, to the overriding imperatives of ideology, expediency and media narrative. This, notwithstanding the enhanced research profile applauded by other contributors, remains the default.

David Gough

12 December 2014 at 11:10

I second the comments by Jonathan Breckon and others commending DfE on its use of evidence, its funding of research in education and, crucially, its funding of research on the use of research in policy and practice. They have even funded a project to assess their progress over time in increasing the use of research evidence in the education system. I also wish to commend the Select Committee for the use of this Evidence Check. It provides a powerful institutional driver to encourage ministries to consider how they have or have not used research as part of the policy-making process. This can have important knock-on effects in societal discussion of specific policy areas and, more broadly, on how research is funded and applied to help policy and practice decisions. I am not a specialist in the topic areas under discussion, but it is clear that there are some weaknesses in how research is reported to have been used in the policy-making process. For teaching assistants, for example, the listing of research does not seem to include some important research, even research funded by DfE itself. Also, the way that the research evidence relates to the policy and its implementation seems rather vague. The reasons for these weaknesses could include:
(i) the DfE accounts are retrospective explanations, and so may encourage a narrative justifying how research informed policy rather than a more comprehensive view of how research could have informed policy;
(ii) the DfE is a large, complex organisation, and the respondents at DfE may not be aware of all the relevant evidence;
(iii) a wish not to discuss all of the research, because it might lead to questions as to the extent to which the policy or its implementation was supported by research;
(iv) a wish not to discuss all of the research, as it might in some way restrict room for manoeuvre on future policy and its implementation.
The evidence is also often empirical, which is of course crucial, but there could be more use of theory on how issues are being understood and explained by research. This fits with Carol Weiss’s (1979) distinction between the symbolic (selective cherry-picking to support decisions made on factors other than research), instrumental (findings driving decisions) and enlightenment (concepts to help understand issues) functions of research. She suggested that the enlightenment function is more common than the instrumental. One way to enable instrumental and enlightenment use of research would be to structure the evidence checks so that they more clearly provided an account of:
1. a full list of the different types of relevant research evidence – preferably from systematic reviews, with primary research where not covered by such systematic reviews;
2. both conceptual and empirical research;
3. the strength and relevance of this research for addressing the policy questions (and how these quality and relevance appraisals have been conducted);
4. evidence gaps, to inform further research investment;
5. how the research evidence is combined with the many other factors that policy makers have to take into account in order to make policy.

Save Childhood Movement

12 December 2014 at 10:27

We support the DfE’s expressed desire to base its policies on evidence and to develop ‘a new vision for evidence-based practice in education and teaching’ with new ‘What works centres’ along the lines of NICE. We agree with Janet Grauberg that over the last twenty years there has been a worrying lack of clarity and consistency about the outcomes of government policy and state intervention, with different political parties and different Ministers giving priority to different issues and policymakers constantly having to make trade-offs that “confuse parents and providers, increase costs and jeopardise the achievement of the desired outcomes.” (ibid) According to Action for Children’s 2008 ‘As Long as it Takes’ report, there have been over 400 different initiatives, strategies, funding streams, legislative acts and structural changes to services affecting children and young people over the past 21 years. This is equivalent to over 20 different changes faced by children’s services for every year since 1987. What is more, “the ‘churn’ was increasing rapidly”, with half of the developments identified begun in the previous six years. Over the last fifteen years there has been enormous investment in the early years, and with this has come a confusing, and what we believe is an inappropriately-evidenced, succession of new policies together with accompanying systems of monitoring and accountability. There have also been a series of policy changes that have impacted on the nature of family life and the time that parents spend with their children. At the heart of all these changes sits the child, whose own needs have not changed, but who has been exposed to an increasingly complex range of politically led cultural pressures. With recent advances in brain development and the cognitive sciences we have never known more about how and why children learn – and we are also beginning to better understand what puts children off learning and makes them risk-averse.
We would question whether current policy documents reflect an appropriate recognition of such research. We join with others in expressing deep concern at the DfE’s recent lack of openness to challenge and its negation of academic and expert opinion. In our opinion Ministers have not looked at the veracity of the evidence presented but have, instead, responded only to the degree that it supported their own political narrative. When we launched our Open Letter in September 2012, the 127 eminent signatories (including 17 emeritus professors) were summarily dismissed as ‘The Blob’, with a subsequent denial of the legitimacy of the concerns raised and a denigration of the credibility of those involved. We fail to see how this is in any way democratic, or conducive to the open and balanced dialogue that has the best interests of the child at heart. We have similar concerns about the dismissal of consultation recommendations such as those of Cathy Nutbrown’s Review and, more recently, the Primary Assessment and Accountability Consultation. We also highlight the dismissal of the highly evidenced Cambridge Primary Review, and draw the committee’s attention to the comparison of the Rose and Cambridge Reviews, which revealed a profound difference in quality and approach. One of the key areas of concern has been the lack of transparency of process, with closed remits and unclear authorship and consultation processes. The movement believes that children are citizens with developmental rights that have to be protected and that governments have a duty of care to ensure that this is the case. In fact this is clearly stipulated in Article 29 of the United Nations Convention on the Rights of the Child (UNCRC), which states that “The education of the child shall be directed to … the development of the child’s personality, talents and mental and physical abilities to their fullest potential”.
It is vitally important that those in power engage in serious and informed debate about the quality of the evidence that is available to them and how they can avoid political prejudice in areas that are so fundamental to societal wellbeing. You can read our comment in full on our website

Hank Roberts, ATL Past President in a personal capacity

12 December 2014 at 08:54

I would like first to welcome the Select Committee's call for evidence and to make the following points. The DfE's evidence base for key policies has been lamentably inadequate, cherry-picked and even non-existent. Some of these have been covered in Mark Henderson's book 'The Geek Manifesto: why science matters'. Take academies, for example. Sir Mark Walport, the Government Chief Scientific Adviser, said: “It’s not unethical to do experiments in education. It’s unethical not to.” However, the academies programme was introduced without trials or pilots, without controls, and most certainly without evidence. It was action to fulfil an ideological agenda. The DfE's use of 'evidence' regarding academies has been a post facto attempt at justification. This is highlighted by the methods and difficulties surrounding implementation, which stem from the lack of educational justification. First, a choice: academy status was offered as a reward for outstanding status and/or with a substantial bribe. Second, a choice was still offered, but with no bribe and the threat of being forced if you didn’t jump. And finally, schools are simply being forced in ever-increasing numbers, having been 'failed' by Ofsted through the simple mechanism of repeatedly raising the bar. Ofsted itself has been thoroughly discredited as to whether its judgements have any sound reliability or validity, and its lack of any sound scientific base is evinced by its numerous changes in inspection methodology. Should we have a research council for educational research and practice, as medicine and engineering do? Yes. The government should assist with its establishment, but it should be independent of government (i.e. of party political interventions).
In a paper published by the NFER in 2010, 'A guide to running randomised control trials for educational researchers', Hutchinson and Styles point out not only that RCTs are seen as the gold standard for evidence-based educational practice, but that they are easier to run, in many cases and in many areas of education, than is commonly believed. They state that an RCT should be considered the first choice for establishing whether an intervention works, and they answer many common objections to RCTs. This whole area has been considered in the United States by the Committee on Scientific Principles for Education Research in its publication 'Scientific Research in Education' (2002). It states that, at its core, scientific inquiry is the same in all fields, and it lists six principles: 1) pose significant questions that can be investigated empirically; 2) link research to relevant theory; 3) use methods that permit direct investigation of the question; 4) provide a coherent and explicit chain of reasoning; 5) replicate and generalise across studies; and 6) disclose research to encourage professional scrutiny and critique. If the Select Committee ensures these principles are followed, it will be game-changing: no more new education policies introduced without a reliable, valid and statistically significant evidence base. What a turnaround that would be.
