This chapter considers the substantial progress in research on teaching, whilst noting the ongoing challenges that remain in terms of building a more cumulative body of evidence. It offers an indication of the strength of the best available evidence for each practice. It also highlights the importance of greater synergies between scientific and professional knowledge.
Unlocking High-Quality Teaching

7. Moving towards more evidence-informed practices
In Brief
Educational research has grown rapidly in recent decades, prompting questions about how this expanding evidence base can effectively support educational improvements.
A substantial body of research has examined the impact of the 20 practices of the Schools+ Taxonomy on students' cognitive and non-cognitive outcomes. A rating exercise, involving 26 leading academics and knowledge brokerage organisations, showed that the best available evidence is stronger for classroom interaction and formative assessment practices than for cognitive engagement, quality subject content, and social-emotional support – partly because these areas are harder to conceptualise and measure.
Further research is needed to understand what works, where, why, for whom, and under what conditions these practices are most impactful.
While schools seem interested in accessing and using research, barriers remain in interpreting it and in adjusting established practices or habits. Deeper forms of collaboration around evidence and inquiry into one's own practice, among teachers and school leaders as well as with researchers, remain limited and often overlooked.
Greater attention is needed not only on what has an impact but also on how, fostering a dynamic process where professional experience and scientific knowledge enrich one another.
In the last two to three decades, education research and knowledge production have gone through transformative changes. Rough estimates suggest that the production of educational research papers increased fivefold between 1996 and 2015 (Van Damme, 2022[1]). This trend is mirrored in the growing number of actors engaged in working with research evidence. OECD research suggests that the number of studies on knowledge mobilisation mentioning terms relating to “intermediaries” increased from fewer than 200 studies in 2000 to over 2500 studies in 2021 (Torres and Steponavičius, 2022[2]). While this growth has not matched the scale and efficiency of that in the health sector (Education.org, 2021[3]), there has still been a considerable shift towards building a larger evidence base in education.
This new abundance of evidence raises fundamental questions about how it can support education improvement. What counts as evidence? What are the general characteristics of the current evidence on teaching practices? How can understanding the wider interplay between different sources of knowledge influence teachers’ decision making and support more evidence-informed practices in the classroom?
Building scientific evidence on classroom teaching
What counts as evidence
The growth in research production raises the question of what counts as evidence. Drawing upon the example of the health sector, defining ‘quality evidence’ has often been viewed in a hierarchical way, with distinct standards based on methodological rigour in terms of identifying causal impact (Glover, 2006[4]; Nutley, Powell and Davies, 2013[5]). It is often presented in terms of a pyramid (Stimson Library Medical Center of Excellence, n.d.[6]; University of Canberra Library, n.d.[7]), with certain methodologies such as meta-analyses and systematic reviews at the top, followed by replications of experimental design studies and randomised control trials. Case studies and qualitative studies are often then viewed as lower quality evidence (Hoffmann, Bennett and Del Mar, 2017[8]). These can collectively be referred to as ‘scientific knowledge’, with pyramids of evidence often then including other bodies or forms of knowledge, such as ‘professional knowledge’ that draws upon sources like ‘expert opinion’ or ‘experiential knowledge’ (see Figure 7.1). The former may include information and evidence shared by colleagues during formative evaluation or observation, while the latter may include more anecdotal and less formal forms of evidence, such as salient student behaviours or direct student feedback, that arise during the day-to-day of lessons or across the wider school. Because methodologies evolve and change, these pyramids may not be exhaustive, and they have been criticised for presenting hard boundaries between certain methodologies (Murad et al., 2016[9]).
Figure 7.1. A pyramid of quality evidence based on methodological rigour
Source: Adapted from Van Damme (n.d.[10]), Center for Curriculum Redesign, The Challenges of Evidence-Informed Education, https://dirkvandammeedu.net/wp-content/uploads/2024/02/The-Challenges-of-Evidence-informed-Education-CEIPP.pdf, (accessed on 7 August 2024).
In practice, what counts as ‘quality evidence’ can be understood in different ways. For example, there is notable variation in how organisations that evaluate and broker education research gauge the quality of evidence for the same intervention programmes (Wadhwa, Zheng and Cook, 2023[11]). Moreover, the hierarchical perspective on evidence reflects the tension between establishing causal relationships and developing research that responds to the needs and diverse realities of schools. This tension has received increasing attention in recent years, with greater effort to encourage plurality in how the concept of evidence is approached, reflecting that there is no single best method or type of evidence. Rather, the most appropriate methodological approach depends on the question being investigated (Nutley, Powell and Davies, 2013[5]). Indeed, Nutley and colleagues (2013[5]) have explored this in more detail and developed an adapted typology of the relative contributions that different kinds of methods can make to different kinds of research questions (see Table 7.1). The Education Endowment Foundation (EEF) (2024[12]) has made similar efforts more recently, releasing a guide explaining the different types of evidence and their purposes. Such efforts point towards a greater acknowledgement in education research of the potential multiplicity of approaches to advancing the evidence base of education, from understanding whether a particular practice or intervention works, to unpacking how it actually works and how this varies across contexts.
Table 7.1. The appropriate methodologies depend on the research question
| Research question | Qualitative research | Survey | Case-control studies | Cohort studies | Randomised control trials | Quasi-experimental studies | Non-experimental studies | Systematic reviews |
|---|---|---|---|---|---|---|---|---|
| Does doing this work better than doing that? |  |  |  | + | ++ | + |  | +++ |
| How does it work? | ++ | + |  |  |  |  | + | +++ |
| Does it matter? | ++ | ++ |  |  |  |  |  | +++ |
| Will it do more harm than good? | + |  | + | + | ++ | + | + | +++ |
| Will students be more interested or engaged in learning? | ++ | + |  |  | + | + | + | +++ |
| Is it worth doing this? |  |  |  |  | ++ |  |  | +++ |
| Is it the appropriate learning opportunity for these students? | ++ | ++ |  |  |  |  |  | ++ |
| Are students, schools, other stakeholders satisfied with it? | ++ | ++ | + | + |  |  | + |  |

Note: The number of ‘+’ signs corresponds to the extent to which the type of methodological approach is suited to answering a research question.
Source: Adapted from Nutley, Powell and Davies (2013[5]), What counts as good evidence? Provocation paper for the Alliance for Useful Evidence.
Measuring teaching and learning is particularly challenging
Measuring teaching and learning is particularly complex. Teaching is never a linear process; instead, many teaching practices typically occur simultaneously, each of which is hard to disentangle and isolate individually (Pollard, 2010[13]; Leinhardt and Greeno, 1986[14]). The intrinsic complexity of teaching is further compounded by the context in which it takes place. Teaching is situated in a specific temporal, social and cultural context. For instance, in the classroom, the process and quality of instruction can vary tremendously from day-to-day (Praetorius, McIntyre and Klassen, 2017[15]; Rowan and Correnti, 2009[16]), and interactions with students can be highly variable too (Schweig, 2016[17]; Reinholz and Shah, 2018[18]).
Research efforts have primarily rested upon indirect measures, such as questionnaires where teachers and students report on the presence or frequency of different teaching practices (Goldhaber, Gratz and Theobald, 2017[19]; Hill, Kapitula and Umland, 2011[20]). These are practical in terms of implementation and cost-effectiveness. They have also been combined with analysis of the frequency and quality of learning experiences that students encounter, such as through analysis of learning resources (Stacey and Turner, 2015[21]). However, these indirect measures have a range of limitations, such as susceptibility to social desirability bias (Goe, Bell and Little, 2008[22]) or misinterpretation (Goe and Stickler, 2008[23]). One response has been to move towards more direct measures such as observation (OECD, 2020[24]). However, looking directly into classrooms remains methodologically challenging and very costly, as well as constrained by its own potential limitations, such as observer bias or the observation process distorting classroom behaviours (Praetorius, McIntyre and Klassen, 2017[15]; Ho and Kane, 2013[25]).
Meanwhile, as methodologies with rigorous control groups have become more accepted and common in education, efforts to build evidence on teaching and learning through them have encountered obstacles. One challenge is that research of this type struggles to build detailed evidence on the effectiveness of individual teaching practices. Partly this is because, as mentioned, numerous practices often occur simultaneously in classrooms, making it very challenging to isolate individual practices (Wrigley and McCusker, 2019[26]). A second, connected challenge is the considerable human agency in classrooms, which reduces the possibility that external factors such as an assigned treatment in an experimental evaluation can alone explain effects (Parra and Edwards, 2024[27]). Because human agency is at the heart of learning and of many intervention changes, unlike the external medicines of the health sector, the risks of bias are high, for example through the alteration of behaviour by teachers who are aware they are being studied (Thomas, 2016[28]). Finally, generalisability claims can be limited: classrooms can be far more variable in practice than the labels of characteristics that are typically controlled for, which might help explain the low rates of replicability (Pawson, 2006[29]).
Building a cumulative body of knowledge is even more challenging
These limitations sit within far broader challenges in research production and mediation which impede building a coherent, cumulative body of knowledge. An important concern is that the replicability of education research is very low (Perry, Morris and Lea, 2022[30]). Based on a sample of the top 100 education journals in 2014, researchers found that only 0.13% of the papers in the journals’ complete publication history were replications (Makel and Plucker, 2014[31]). Moreover, among the replication studies that did take place, when the replicating team did not overlap with the original authors, only 54% successfully replicated previous effects (Makel and Plucker, 2014[31]). Indeed, the EEF has encountered similar issues when replicating, at greater scale, studies that had shown positive effects in certain smaller settings (Edovald and Nevill, 2020[32]). Furthermore, a follow-up review inspired by Makel and Plucker but investigating replication studies in the years 2011-2022 found a small increase in the number of replication studies, though similar results and patterns in terms of success (Perry, Morris and Lea, 2022[30]).
The challenges around replication point to a potential misalignment between the inherent incentives of research production and what is needed to develop a reliable, cumulative body of knowledge for professionals to draw upon. One argument is that the incentives of research production are too heavily geared towards producing statistically significant results. This has fuelled wider criticisms that effect sizes, in education research and beyond, often present an exaggerated estimate of a programme’s effect (Button et al., 2013[33]; Vasishth et al., 2018[34]; Sims et al., 2022[35]), raising the risk of misinterpretation or misaligned expectations. Indeed, in retrospectively analysing 22 promising randomised control trials, Sims and colleagues (2022[35]) found that the estimated effect sizes were exaggerated by an average of 52% or more. This means that real effects may actually be smaller than those reported, particularly in the small-scale trials that are characteristic of much experimental research in education.
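The inflation mechanism described above (sometimes called the "winner's curse") can be illustrated with a small simulation: when only statistically significant results from small, underpowered trials are reported, the average reported effect overstates the true one. The sketch below is purely illustrative; the true effect size, sample sizes and the crude z-test are assumptions chosen for the demonstration, not parameters taken from the studies cited.

```python
import math
import random
import statistics

def cohens_d(control, treated):
    """Standardised mean difference using a pooled standard deviation."""
    pooled_var = (statistics.variance(control) + statistics.variance(treated)) / 2
    return (statistics.mean(treated) - statistics.mean(control)) / math.sqrt(pooled_var)

def simulate(true_effect=0.2, n_per_arm=40, n_trials=20000, seed=1):
    """Run many small two-arm trials; compare the mean effect estimate
    across ALL trials with the mean among 'significant' trials only."""
    rng = random.Random(seed)
    se = math.sqrt(2 / n_per_arm)  # approximate standard error of Cohen's d
    all_d, significant_d = [], []
    for _ in range(n_trials):
        control = [rng.gauss(0.0, 1.0) for _ in range(n_per_arm)]
        treated = [rng.gauss(true_effect, 1.0) for _ in range(n_per_arm)]
        d = cohens_d(control, treated)
        all_d.append(d)
        if abs(d) / se > 1.96:  # crude two-sided test at alpha = 0.05
            significant_d.append(d)
    return statistics.mean(all_d), statistics.mean(significant_d)

mean_all, mean_sig = simulate()
# mean_all stays close to the true effect; mean_sig is inflated well above it
```

With a true effect of 0.2 and 40 students per arm, most trials are underpowered, so the subset that clears the significance threshold necessarily over-estimates the effect, mirroring the pattern Sims and colleagues describe.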
Connected to these questions around the reliability of certain effect sizes is the issue of who is conducting the research. Researchers have found that even when controlling for different design features and other covariates, studies that were commissioned or conducted by the developers of a particular intervention had, on average, a larger effect size than those studies that were conducted by independent actors (Wolf et al., 2020[36]).
A second critique of the inherent incentives in research production has been its dependency on extensive referencing of already well-cited work (Chu and Evans, 2021[37]). It has been argued that this can lead to the ‘canonisation’ of certain ideas, with some arguing that these referencing patterns are coupled with a culture of rarely challenging or negatively reviewing previous work (Van Damme, 2022[1]). These are trends that affect not only educational research but social science research more generally (Catalini, Lacetera and Oettl, 2015[38]). Manifestations of this can be seen in research into teaching and learning and how certain ideas can become concentrated around well-repeated labels that are often ill-defined, such as the case of ‘active learning’ as mentioned in Chapter 1 (Hood Cattaneo, 2017[39]), or where there may be far greater focus on discussing the promise and potential of a concept rather than actual empirical work, as has been argued in the field of ‘cognitive science’ (Perry, Morris and Lea, 2022[30]).
Finally, efforts to synthesise existing evidence and develop more generalisable findings, such as through meta-analyses and systematic reviews that consolidate empirical findings, are still relatively nascent in education. Drawing upon the work of Education.org (2021[3]), Van Damme (2022[1]) recently argued that the health sector produces 26 syntheses for every one synthesis developed in education. Whilst some of this difference can be explained by the sectors’ expenditure on research, estimates of the magnitude of difference in spending (e.g. OECD, 2023[39]) suggest that factors other than funding are also at play. Nevertheless, recent years have also seen the development and establishment of “knowledge brokerage organisations” (also often referred to as “knowledge intermediaries”) that enable practitioners’ – as well as policy makers’ – engagement with research to support their practices and decision making by carrying out knowledge mobilisation activities (OECD, 2022[40]). In particular, many organisations engage in developing robust evidence syntheses. They are relatively new structural features in the education evidence landscape, serving as gatekeepers and pathways for rigorous education research. For instance, the What Works Clearinghouse in the United States, EEF in England, The Campbell Collaboration in Canada, and Leibniz Institute for Research and Information in Education (DIPF) in Germany have all become well-established in the last two decades (OECD, 2022[40]).
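To make concrete what an evidence synthesis adds over any single study, the sketch below shows the core arithmetic behind a simple fixed-effect meta-analysis: study effect sizes are pooled with inverse-variance weights, so more precise studies count for more, and the pooled estimate is more precise than any individual study. The effect sizes and standard errors are invented for illustration; real syntheses by the organisations named above use more elaborate models (e.g. random-effects models that allow for between-study variation).

```python
import math

def fixed_effect_pool(effects, standard_errors):
    """Inverse-variance (fixed-effect) pooling of study effect sizes:
    each study is weighted by 1 / SE^2, so larger, more precise studies
    dominate the pooled estimate."""
    weights = [1 / se ** 2 for se in standard_errors]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))  # pooled SE shrinks as studies accumulate
    return pooled, pooled_se

# Three hypothetical studies of the same intervention (invented numbers)
effects = [0.35, 0.10, 0.22]
ses = [0.20, 0.08, 0.12]
pooled, pooled_se = fixed_effect_pool(effects, ses)
```

Here the pooled estimate sits between the individual study results, pulled towards the most precise study, and its standard error is smaller than that of any single study.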
The scientific evidence on teaching practices
Despite the challenges of measuring teaching and learning, there is growing scientific knowledge on the impact of some teaching practices on students’ cognitive and non-cognitive outcomes. It can be too easy to dismiss research findings wholesale because of such limitations; valuable information can still be garnered from the sustained efforts to interrogate teaching and learning in recent decades.
Gauging the strength of the existing evidence
To understand the state of play of the existing evidence base on teaching, an Informal Expert Group (see Table 7.2) examined different pedagogical frameworks and their associated evidence bases to define clear conceptual descriptors for common practices and to draft background documents scoping the evidence on practices. This resulted in a first draft of a Taxonomy of Teaching, consisting of five broad dimensions (Classroom Interaction, Cognitive Engagement, Formative Assessment and Feedback, Quality Subject Content, Social-Emotional Support).
An expert review process was undertaken on the 20 practices included in the Schools+ Taxonomy to better understand the strength of the best evidence available and identify potential areas for further research. Evidence brokerage organisations and a selected group of leading academics were invited to participate in an expert rating exercise for each of these practices (see Annex A). In particular, they were asked to rate each of the 20 practices according to the following considerations:
the number of existing quality studies establishing a causal impact on student outcomes and building a cumulative body of research;
consistency of the direction of effects and predictive power over key student outcomes;
and coverage of a range of different contexts, subjects and ages.
A total of 26 leading academics and evidence brokerage organisations provided ratings. While pinpointing the current strength of evidence is challenging, Table 7.2 provides an initial indication according to the level of expert consensus in the ratings as well as the strengths and limitations that they shared.
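As a rough illustration of how a rating exercise like this can be summarised, the snippet below computes, for a single practice, the modal strength category across experts and the share of experts agreeing with it. The ratings shown are hypothetical; the actual aggregation procedure used for Table 7.2 is described in Annex A and is not reproduced here.

```python
from collections import Counter

def summarise_ratings(ratings):
    """Return the modal strength category and the share of experts
    choosing it, as a crude indicator of the level of consensus."""
    counts = Counter(ratings)
    category, n = counts.most_common(1)[0]
    return category, n / len(ratings)

# Hypothetical ratings from 26 experts for one practice
ratings = ["high"] * 17 + ["medium"] * 7 + ["low"] * 2
category, agreement = summarise_ratings(ratings)
```

A practice with 17 of 26 experts choosing "high" would show both a clear modal category and a measurable level of agreement; practices where ratings spread evenly across categories signal lower consensus.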
Table 7.2. Evidence on causal impact on student outcomes
Strength of the best available evidence according to expert ratings:

| Pedagogy dimension | Low | Medium | High |
|---|---|---|---|
| Classroom interaction |  |  | Student collaboration (e.g. Education Endowment Foundation (2021a[41]); Kyndt et al. (2013[42]); van Leeuwen and Janssen (2019[43])) |
|  |  | Whole-class discussion (e.g. Howe and Abedin (2013[44]); Alexander (2018[45])) |  |
|  |  | Questioning and responding (e.g. Alexander (2018[45]); Hennessy et al. (2021[46]); Sedova et al. (2019[47])) |  |
| Cognitive engagement | Ensuring appropriate levels of challenge (e.g. Wang and Eccles (2013[48])) | Metacognition* (e.g. Muijs and Bokhove (2020[49]); Perry, Lundie and Golder (2018[50])) |  |
|  | Working with multiple approaches & representations (e.g. Mayer (2002[51])) |  |  |
|  | Facilitating first-hand experiences (e.g. Kolb and Kolb (2009[52])) |  |  |
|  | Meaningful context and real-world connections (e.g. Education Endowment Foundation (2017[53]); Alfieri et al. (2011[54]); Furtak et al. (2012[55])) |  |  |
| Formative assessment and feedback |  | Diagnosing student learning (e.g. Elliot et al. (2020[56]); Chiu (2004[57])) | Feedback (e.g. Elliot et al. (2020[56]); Newman (2021[58])) |
|  | Adapting to student thinking (e.g. Smale-Jacobse et al. (2019[59]); Deunk et al. (2018[60]); van de Pol et al. (2015[61])) | Learning goals (e.g. Jussim and Harber (2005[62]); Sanchez et al. (2017[63])) |  |
| Quality subject content | Crafting explanations and expositions (e.g. Stockard et al. (2018[64])) | Clarity, accuracy and coherence (e.g. Stockard et al. (2018[64]); Coe et al. (2020[65])) |  |
|  | Making connections (e.g. Education Endowment Foundation (2017[53])) |  |  |
|  | Nature of the subject (e.g. Erduran and Dagher (2014[66])) |  |  |
| Social-emotional support | Relationship building (student-student) (e.g. Yibing Li et al. (2011[67])) | Nurturing a supportive classroom climate (e.g. Wang et al. (2020[68]); Khalfaoui, García-Carrión and Villardón-Gallego (2020[69])) |  |
|  | Explicitly teaching and actively practising social-emotional skills (e.g. Education Endowment Foundation (2021b[70]); Takacs and Kassai (2019[71])) | Relationship building (teacher-student) (e.g. Hamre and Pianta (2001[72]); Ansari, Hofkens and Pianta (2020[73])) |  |
Note: An independent literature review was also carried out which referenced 500 studies across the five teaching goals considered. The key references noted for each practice were suggested by the Informal Expert Group.
*Metacognition was added to the Taxonomy after the Consultation exercise with experts and knowledge brokerage agencies. It is positioned based on a review of the literature and some of the qualitative comments relating to metacognition from experts and knowledge brokerage agencies.
The level of scientific knowledge varies greatly across these practices according to experts’ judgements. Overall, there is more consensus around the causal impact of practices in the dimensions of classroom interaction and formative assessment and feedback than those of cognitive engagement, quality subject content, and social-emotional support.
Limitations and areas for further research
Table 7.3 suggests specific aspects requiring further investigation for each one of the practices in each of the five dimensions of the pedagogical taxonomy. Overall, the following limitations can be noted across the evidence base:
Number of studies and research designs. There is a limited number of research studies using experimental designs. For instance, in formative assessment and feedback, more experimental research is needed to isolate the effects of certain practices. Similarly, in social-emotional support, where correlational and non-experimental studies prevail, experimental research is required to understand the precise impact of teacher-student relationships and their direct contributions. There is also a wider question of the type of empirical research; more controlled out-of-classroom studies (e.g. laboratory studies) can play a role in building a picture of the effects of practices and how they may work, but their translation to authentic classrooms is not always direct and needs careful consideration (e.g. see Howe et al., 2019[73]).
Education levels. Research disproportionately comes from some age groups and education levels for some dimensions. Notably, in social-emotional support, there is a need to further investigate how findings on the effectiveness of concrete practices with younger students translate to older students when explicitly teaching skills. Conversely, in quality subject content, there is still a need to better understand how cognitive learning theories manifest particularly in younger students, in relation to the core practice of clarity and accuracy.
Subjects. Research tends to be carried out in mathematics, literacy and science, which limits the relevance of findings to other subjects. For instance, in quality subject content, the evidence base for the nature of the subject is heavily dominated by research in science and mathematics, and in cognitive engagement, the evidence on first-hand experiences comes largely from science.
Contexts. Research tends to overrepresent well-resourced and English-speaking educational contexts, compared to those serving disadvantaged or marginalised students. This is particularly notable in studies on core practices such as collaboration and whole-class discussion, where class composition plays a significant role in how these practices are implemented. More diverse research contexts are needed to better understand how findings apply across different classroom settings and to improve the generalisability of results.
Conceptual clarity. In some dimensions, there remain challenges in operationalising certain constructs. For instance, in the social-emotional support dimension there can be a high degree of conceptual variation in how aspects such as the classroom climate, or the specific skills that may be taught, are defined and used in research (Steponavičius, Gress-Wright and Linzarini, 2023[74]). For other practices, such as in cognitive engagement, there is a need for more clarity around what constitutes ‘challenge’. A lack of clarity on key constructs makes it challenging to build cumulative knowledge in the field.
More concrete limitations of current research studies are indicated in Annex 7.A to pinpoint existing gaps and opportunities for future research efforts.
Table 7.3. Potential future research directions
| Pedagogical dimension | Practice | Areas for further research |
|---|---|---|
| Classroom interaction | Questioning and responding | What types of questions may be most effective for promoting more advanced forms of reasoning, like critical thinking or problem-solving; the timing and, connected to this, sequencing of questions (Bishop, 2021[75]). |
|  |  | The specific emotional or affective aspects of the learning environment and how they interact with questioning. |
|  |  | Further understanding the cultural variation of certain questions and how they may be interpreted differently, including how certain cultures of responding may impact questioning and responding patterns (Xu and Clarke, 2019[76]). |
|  | Collaboration | Contextual differences in terms of class sizes and composition. |
|  |  | More independent evaluations of interventions (Education Endowment Foundation, 2021a[41]). |
|  | Whole-class discussion | Contextual differences in how discussion and dialogue emerge, and how these are related to cultural practices (Xu and Clarke, 2019[76]). |
| Cognitive engagement | Ensuring appropriate levels of challenge | Greater conceptual clarity and coherence around what constitutes a ‘challenge’. |
|  |  | Isolating the exact mechanisms – analysis, evaluation, problem-solving etc. – that make particular work/tasks challenging (Sweller et al., 2024[77]). This may include identifying the concrete features that make work/tasks challenging. |
|  | Facilitating first-hand experiences | Isolating particular effects of experiences; research interventions often consist of multiple programme elements, making it challenging to understand what features make them impactful (Sweller et al., 2024[77]). |
|  |  | Understanding the most effective use of first-hand experiences in relation to student prior knowledge, and when to progress to inquiry processes from certain levels of prior knowledge (de Jong et al., 2023[78]; Sweller et al., 2024[77]). |
|  | Working with multiple approaches and representations | What types of representations or different perspectives may be of value to different students in terms of deepening their understanding and building more flexible thinking. |
|  | Metacognition | Working with larger sample sizes to improve the generalisability of findings. This may be connected to understanding in more detail the particular mechanisms of metacognitive approaches that contribute to outcomes. |
|  |  | The relationship between in-the-moment metacognition and retrospective metacognition (often referred to as “online” or “offline” metacognition), as some studies have found a disconnect between these types of measurements and suggest there is more to understand about how these develop (Fleur, Bredeweg and van den Bos, 2021[79]). |
|  | Meaningful context and real-world connections | More clearly defining and understanding what is ‘meaningful’ and ‘relevant’ to students, to identify specific features for task design and teaching. This may include consideration across subjects. |
| Formative assessment and feedback | Learning goals | Empirical testing of different features of learning goals and their communication, to understand if certain approaches are more effective than others. |
|  | Diagnosing student learning | What specific tasks and questions can best diagnose student learning in real time, for different types of knowledge and for new, increasingly valued skills. |
|  |  | When it is most appropriate to elicit student thinking and act upon it (Ruiz-Primo, 2011[80]). |
|  | Feedback | The long-term impacts of different types of feedback (e.g. long-term memory retention). |
|  | Adapting to student thinking | Advancing approaches to measuring the degree to which teachers adapt to student thinking and align with student needs (Deunk et al., 2018[60]). |
| Quality subject content | Crafting explanations and expositions | Moving from a primarily theory-based approach towards more empirical testing of how explanations of particular content are best structured and sequenced for students’ long-term learning. |
|  | Clarity, accuracy and coherence | More classroom-based research on particular learning theories, such as retrieval of prior knowledge and its sequencing, as well as in-classroom cognitive load. |
|  | Making connections | Understanding more about the levels of prior knowledge sufficient to move towards making connections (Education Endowment Foundation, 2021a[41]). |
|  | Nature of the subject | How the ‘nature of the subject’ can be conceptualised and analysed beyond scientific subjects (Puttick and Cullinane, 2021[81]). |
| Social-emotional support | Nurturing a supportive classroom climate | Conceptual clarity around key concepts (e.g. grit and conscientiousness) to ensure a consistent application of terminologies (Gutman and Schoon, 2013[82]; Audley and Donaldson, 2022[83]; OECD, 2021[84]). This will also facilitate measurement efforts. |
|  | Relationship building (teacher-student) | How relationships are developed and maintained in different cultures and contexts, to facilitate comparisons. |
|  |  | More experimental research that goes beyond correlational, survey-based designs (Sabol and Pianta, 2012[85]). |
|  | Relationship building (student-student) | Further exploration of how to structure learning environments to facilitate conversation and dialogue (Alfieri et al., 2011[54]; Wentzel and Watkins, 2022[86]). |
|  |  | The role of group work in relationship building, and how it might contribute to students’ sense of belonging (Tolmie et al., 2010[87]). |
|  | Explicitly teaching and actively practising social-emotional skills | Understanding how to support social-emotional skill development with older populations of students, particularly adolescents (Yeager, Dahl and Dweck, 2017[88]). |
|  |  | Role of non-classroom spaces (e.g. corridors) in social-emotional skill development (Jones et al., 2021[89]). |
|  |  | Examination of specific mechanisms within programmes to isolate what contributes to student effects. |
Note: Suggestions were provided by the 26 leading academics and knowledge brokerage organisations participating in the expert review exercise. An additional 17 academics and organisations provided qualitative input on the conceptualisation of practices and the scoping of their evidence (see Annex A: Methodology).
Moving towards more evidence-informed practices
As research seeks to move the frontiers of knowledge in education forward, the question arises of how this body of scientific knowledge can best inform what happens in the classroom. For scientific knowledge to have an impact, teachers need to be able not only to access and interpret it, but also to unlearn established habits and draw upon new evidence in their decision-making processes (Cain et al., 2019[90]). This is a dynamic and complex process.
Accessing scientific evidence
Efforts to translate research into more accessible forms for schools have increased in recent years. It has become clear that simply providing access to ‘raw research’ is generally not an effective way to promote its use (Gorard, See and Siddiqui, 2020[91]). The aforementioned growth in knowledge brokerage organisations reflects this, as does the increasing attention to understanding their role and activities as intermediaries (OECD, 2022[40]; OECD, 2023[92]).
A recent OECD survey of knowledge brokerage organisations from 34 countries looked at the types of support these organisations provide to practitioners. The survey included formal knowledge brokerage organisations as well as other actors such as research institutions, initial teacher education institutions, inspectorates and quality assurance services. Only about half (49%) of the organisations that said their knowledge mobilisation work in education focuses on facilitating research use in practice reported that ‘self-evaluation or reflective tools about the implementation of interventions/strategies’ were already in place for practitioners. As Figure 7.2 shows, a further 21% intended to make this type of support available. First, this suggests that supports for teachers’ critical engagement with interventions and strategies are not yet fully established; the dialogue that is necessary around research evidence use and professional reflection could be further supported. Second, it suggests that the perspective of practitioners is still not systematically sought or considered: how teachers would evaluate the effectiveness of interventions or strategies is not consistently an area of consideration.
Figure 7.2. Types of support organisations enabling research engagement offer to practitioners
Note: The number of respondents to each item shown in the Figure varied: item 1 had 225 respondents, item 2 had 223, item 3 had 222, item 4 had 225 and item 5 (‘other’) had 30 respondents.
Source: OECD Survey of Knowledge Mobilisation in Education data, 2023.
There has also been a change in teachers’ and school leaders’ attitudes towards, and consumption of, scientific evidence. In the Teaching and Learning International Survey (TALIS) 2018, 76% of teachers reported attending “education conferences where teachers, principals and/or researchers present their research or discuss educational issues” (OECD, 2019[93]). In contrast, in the first cycle of TALIS (2008), less than half of teachers reported attending a similar form of professional development (“education conferences and seminars”).
This echoes wider findings on evidence engagement among the profession, with research in several countries confirming some promising patterns around teachers’ and schools’ access to scientific evidence (Brown and Malin, 2022[94]). A 2017 survey of 1 670 teachers in England found that the majority of respondents had a positive disposition towards academic research, even if its actual impact on their decision making was still small relative to other sources of knowledge (Walker, Nelson and Bradshaw, 2019[95]). Meanwhile, two-thirds (65%, n=318) of Australian educators in the Q Project’s first survey indicated that, ‘when confronted with a new problem or decision, they look for research that may be relevant’. The same proportion (65%, n=318) indicated they ‘know where to find relevant research that may help to inform their teaching practices’ (Australian Institute for Teaching and School Leadership, 2021[96]). Interestingly, this blend of positive disposition but more limited direct application of research is also echoed in research in Spain: in one study, 68% of teachers declared that they frequently or always use research to inform their practices, but its actual use to inform innovations was lower, with experiential or peer knowledge preferred (Ion, Diaz and Gairin, 2019[97]), cited in Malin et al. (2020[98]).
Across these different geographic areas of research, a notable feature is the role of colleagues and community. Indeed, the role of colleagues in sharing evidence is increasingly seen as important not just to enabling access to evidence but also to influencing and facilitating its use (Cain, 2015[99]). For instance, research has found that teachers often prefer to be recommended evidence by their colleagues, on the understanding that colleagues would only recommend evidence they had themselves found useful or relevant to practice (Williams and Coles, 2007[100]).
Box 7.1. Towards more structures and cultures of high-quality research use among schools
Growth in research production in recent decades has also seen growth in the types of initiatives that seek to support the effective use of this research by teachers and school leaders. These efforts range from building specific structures, such as processes and mechanisms for research use, to developing wider cultures of habitual interrogation of research evidence to inform decision making.
Across the Schools+ Network, participants grapple with this challenge of promoting effective evidence use. Some notable approaches include:
The main brokerage organisation in England (United Kingdom), the Education Endowment Foundation (EEF), has undergone considerable growth since its foundation in 2011. It has grown from its original focus on randomised controlled trials and meta-analyses for its Teaching and Learning Toolkit to also become the largest funder of qualitative education research in the country. To better support schools to access, understand and use evidence, the EEF set up a partner network, the Research Schools Network (RSN), in 2016. The RSN has grown to a collaboration of 33 schools across seven regions in England. Research schools serve as evidence advocates in their local and regional networks and develop strategic partnerships through a blend of training, exemplification and school-to-school support. The RSN has collectively engaged with more than 40% of schools across England and provided training to over 6 000 schools. Its role in fostering stronger cultures of evidence use is reflected in the fact that 70% of senior leaders in England cite use of the EEF’s Toolkit when making decisions about school spending.
SUMMA is the Laboratory of Education Research and Innovation for Latin America and the Caribbean, created in 2016 by the Inter-American Development Bank. SUMMA aims to enhance the quality, equity and inclusion of educational systems by generating comparative research, synthesising contextualised evidence, designing and evaluating innovations, and fostering effective structural reforms through long-term policy and practice partnerships. One manifestation of this is its development, in alliance with the aforementioned Education Endowment Foundation, of a Platform of Effective Pedagogical Practices to support both evidence-informed and context-sensitive decisions. The platform complements global evidence with regional research from Latin America and the Caribbean to ensure both consistency in what the evidence states and pertinence in its recommendations for practical actions. A further important strand of SUMMA’s work relates to its support of policy reforms and the building of shared policy and research agendas; it supports and collaborates with more than 20 Ministries of Education in the region. For example, at the regional level it has collaborated with the University of the West Indies to reform initial teacher training programmes in the Caribbean to support new teachers’ research literacy and their understanding of the latest evidence base on practices.
Leerpunt is a knowledge brokerage organisation in Flanders (Belgium) that aims to strengthen educational practice through scientific insights. Founded in 2022, it has been able to build on the work of other brokerage organisations – such as the two toolkits of the EEF – as well as key lessons learnt. For instance, one notable strategy has been its development of a clear knowledge agenda in collaboration with schools, which outlines the themes where teachers and education professionals indicate a need for more knowledge, with a view to keeping research highly relevant to schools. Another has been its development of a range of partnerships to foster a network of support around schools that can better enable effective evidence use. Leerpunt collaborates with other organisations, such as teacher training programmes and pedagogical support services, as well as partnering with the Flemish Education Agency. This more ‘system’ approach to knowledge mobilisation reflects the growing knowledge base not just on evidence itself, but also on how to support evidence use.
Edutopia.org, an initiative of the George Lucas Educational Foundation, serves as a free source of information on evidence-informed learning and teaching practices and connects a community of stakeholders committed to improving education. It aims to advance a vision where students become lifelong learners and develop fundamental skills for today's and tomorrow's challenges. Its content – a blend of multimedia stories, videos, and articles written by practitioners – features inspiring examples from real classrooms. Through vehicles like its "The Research Is In" newsletter, Edutopia helps translate educational research for practical application by educators. The platform has developed considerable reach, particularly in North America and increasingly globally, with an average of 12 million people reached each month across Edutopia.org and social platforms. Key to its success has been highlighting promising practices in clear, accessible formats, many of which are tied closely to the realities of the classroom and highly relatable.
Note: Input was provided directly from Schools+ participants.
Sources: EEF (2022[101]), Teaching and Learning Toolkit, https://educationendowmentfoundation.org.uk/education-evidence/teaching-learning-toolkit; Gu, Q. et al. (2020[102]), https://educationendowmentfoundation.org.uk/public/files/RS_Evaluation.pdf; SUMMA (2023[103]), Effective Education Practices Platform, https://practicas.summaedu.org/en/what-is-it-platform/what-is-it-main-objectives; Sutton Trust (2024[104]), News and Findings - NFER; School Funding and Pupil Premium 2024, https://www.suttontrust.com/our-research/school-funding-and-pupil-premium-2024.
Interpreting and assessing scientific evidence
Schools have a growing interest in accessing and using research. However, do they have the skills and capacity to effectively engage with it, and even contribute to it? In 2021, the OECD Strengthening the Impact of Education Research policy survey found that 62% of respondent education systems reported a “lack of time to access and engage with research” as a barrier to the use of research in school practice, while just over half (53%) reported “low levels of skills and capacity to use research” as a barrier (OECD, 2022[40]). This is particularly significant given how rapidly fields of research on teaching can evolve; consider, for instance, the wealth of attention on metacognition (Muijs and Bokhove, 2020[49]) or on practices relating to social-emotional support (Yeager, Dahl and Dweck, 2017[88]; Yeager et al., 2021[105]) in recent years, which can shift the knowledge base teachers must draw upon.
Figure 7.3. Practitioners’ research engagement skills
Percentage of systems agreeing or strongly agreeing that "Practitioners have the skills and capacity to…"
Note: The OECD’s Strengthening the Impact of Education Research project surveyed 37 education systems from 29 countries in 2021. 20 systems responded to this question on practitioners.
Source: Adapted from OECD (2022[40]), Who Really Cares about Using Education Research in Policy and Practice?: Developing a Culture of Research Engagement, Educational Research and Innovation, https://doi.org/10.1787/bc641427-en.
Initial teacher education programmes and professional development opportunities are two important avenues to develop a teaching and leadership workforce that can effectively engage with education research. The aforementioned OECD survey found that only around one-third of the ministries reported that training future teachers to understand and interpret research findings is required in all Initial Teacher Education programmes, and less so in Continuous Professional Development (OECD, 2022[40]).
Figure 7.4. Skills taught in initial teacher education and continuing professional development
Notes: Data show the percentage of respondent systems that reported the given skills as “required” or “mostly covered” by Initial Teacher Education (ITE) and Continuous Professional Development (CPD). N = 34 for ITE, 33 for CPD. See Who Really Cares about Using Education Research in Policy and Practice?: Developing a Culture of Research Engagement (OECD, 2023[92]) for more details. Skills are ranked in descending order of the percentage of systems reporting them as “Required” in ITE.
Source: OECD (2022[40]), Strengthening the Impact of Education Research policy survey, https://doi.org/10.1787/bc641427-en
Unlearning and relearning
The challenges of the process of changing practice cannot be overlooked. It can demand unlearning and relearning what may be deep-seated habits or beliefs. For evidence to have any impact, it must be integrated into teachers’ internal knowledge bases, enabling them to draw upon it in decision making (Cain et al., 2019[90]). Teaching is informed by a range of knowledge sources, and when new evidence is encountered, teachers need to engage in a process of unlearning and relearning. When scientific evidence is of high quality and high relevance to teachers, it can be an invaluable source of information (Van Damme, 2022[1]; OECD, 2022[40]). But it is one source of knowledge that interacts with others (Figure 7.5) (Cain, 2015[99]; Sharples, 2013[106]).
Figure 7.5. The different sources of knowledge that may inform teachers’ decision making and their teaching
Source: Adapted from Van Damme (2022[1]), The Power of Proofs (Much) Beyond RCTs, www.curriculumredesign.org, and Guerriero (2017[107]), Pedagogical Knowledge and the Changing Nature of the Teaching Profession, https://doi.org/10.1787/9789264270695-en.
This interplay needs to be further understood; neglecting it makes system-level efforts to move towards more evidence-informed practices all the harder. Research has not shed enough light on how scientific evidence actually takes hold and changes practice. It has been proposed that when evidence is sought to solve a problem or inform deliberation, it is more likely to result in changes in practice than when it is merely consulted out of curiosity (Farley-Ripple et al., 2018[108]). This emphasises that evidence is combined with context-bound tacit knowledge or practice-based research to address specific problems (Greany and Maxwell, 2017[109]; Earl and Timperley, 2015[110]). After all, consider how research on providing students with opportunities to revisit previous learning must be combined with a teacher’s specific knowledge of how previous lessons have progressed, as the teacher organises the clarity, accuracy and coherence of a lesson and decides how to use specific summaries or plenaries.
Because of the difficulty of the change process, researchers argue that evidence use needs to be understood as a social process, with interaction and relationships playing key roles in determining how evidence is applied in practical settings. On the one hand, Sharples (2013[106]) and Brown et al. (2021[111]) highlight that the opportunity to discuss research and evidence allows practitioners to gain a deeper understanding and sense of ownership over the findings. Such discussion enables the more relevant and sensitive integration of evidence into professional settings.
On the other hand, a collaborative approach to evidence use, embedded into organisational procedures and culture, can also facilitate the actual change of practices, which is inherently challenging and needs to be sustained over time (Cain et al., 2019[90]; Sharples, 2013[106]; Levine and Marcus, 2010[112]). Effective evidence-informed practice in schools depends on collaborative processes and school-wide structured approaches, which can help ensure that evidence use becomes an ongoing practice (Godfrey and Brown, 2018[113]). Moreover, the structures and cultures that can be created at the school level are also informed by the wider system context. Rather than relying on schools to pioneer impactful evidence use on their own, they can also be supported by the system actors around them, who can both enable and enhance their approaches to evidence (OECD, 2022[40]). Evidence-informed teaching, therefore, requires a clear commitment to collaboration and shared learning, rather than being an isolated endeavour (Darling-Hammond, Hyler and Gardner, 2017[114]).
The importance of a school culture that promotes the use of evidence was underlined by an impact evaluation in England by Coldwell and colleagues (2017[115]), which revealed that sustained change occurs when teachers are given time for informed debate and opportunities to see the impact of evidence in practice. This finding underscores the argument for considering evidence use as an iterative process that unfolds in stages, involving the implementation of new practices or changes to existing ones, followed by impact assessment (Harn, Parisi and Stoolmiller, 2013[116]). This is reinforced when considering the very nature of practices: the effective implementation of practices such as Student collaboration and whole-class discussion can be aided by the presence of strong norms and routines, which take time to build.
The iterative, ongoing approach to evidence use that research describes means that teachers must be attentive to the different salient outcomes that this evidence use yields, and reflective about what they mean. Increasingly, this implies moving towards a position where teachers are viewed as, essentially, researchers of their own practice. In particular, considering the power of school cultures and collaboration, it may be more appropriate to talk about teachers in the collective sense: communities of teachers that research their practice, drawing critically upon evidence as well as upon their wider individual and collective professional expertise.
Box 7.2. Understanding and recognising the professional knowledge among the profession
Some pioneering efforts have arisen to highlight the initiatives that schools have developed to address specific challenges. In general, these aim to surface and celebrate achievements, share effective strategies and foster collaboration, with a view to building a culture of excellence and ongoing learning.
One manifestation of this has been the development of repositories of practices and initiatives that aim to speak directly to schools. Some examples operate at the national level. The Ministry of Education of the Slovak Republic has developed a Catalogue of Innovations in Education, which aims to inspire schools and educational institutions towards further growth and development. Priority is given to initiatives that have robust experimental results and that have already been tested in Slovak schools. The catalogue also, however, includes initiatives that have been successfully used abroad, and some completely new ones that are still waiting to be evaluated. Initiatives are categorised by type of practice (e.g. assessment methods, management) and by target group or type of school. An overview of each initiative’s content focus, methods and conditions for use is also provided, as well as the necessary steps for introducing it somewhere new and the findings from evaluations.
There have also been initiatives to develop repositories of schools’ initiatives at the international level. HundrED, a global non-profit organisation, searches for and shares inspiring ‘innovations’ in K-12 education. Each year, HundrED selects 100 ‘impactful and scalable’ education innovations from around the world and supports their spread to new contexts through an online platform, which features about 700 innovations selected from thousands. A clear selection process underpins this work, including shared criteria used by a range of selected reviewers. To capture the context-informed design thinking and ongoing refinement behind these initiatives, the platform shares the step-by-step implementation process behind each innovation.
Another approach that has grown in recent years is the use of awards to shine a light on the work of the teaching profession. At the national level, many countries now have annual national or sub-national teacher or school prizes organised by public authorities or private entities. There are around 40 national teacher prizes affiliated with the Varkey Foundation’s Global Teacher Prize, organised by a range of public and private actors. Some prizes also operate at the international level, with large monetary awards: the T4 organisation’s ‘World’s Best Schools’ prizes identify schools implementing projects in five thematic areas, each with a prize of USD 50 000.
Similar efforts to recognise teachers’ work have been spearheaded by a range of other actors: government ministries (e.g. France’s Ordre des Palmes Académiques, awarded by the Ministry of National Education, or Canada’s Prime Minister’s Award for Teaching Excellence), non-profit organisations (e.g. AdvanceHE in the United Kingdom), teachers’ associations (e.g. the National Science Teachers Association in the United States), and universities and higher education institutions, among others. An ongoing challenge across prizes is creating mechanisms through which the initiatives or approaches recognised can be systematically shared with a wider number of teachers and schools.
Note: Input was provided directly from the Ministry of Education of the Slovak Republic.
Sources: Mackenzie, N. (2007[117]), Teaching Excellence Awards: An Apple for the Teacher?, https://doi.org/10.1177/000494410705100207; Seppala and Smith (2019[118]), Teaching awards in higher education: a qualitative study of motivation and outcomes, https://doi.org/10.1080/03075079.2019.1593349.
Opportunities to research their own practice and contribute to further research
An inquiry stance towards one’s own teaching can be significant for a teacher’s practice and growth. It has been argued that this type of constant self-inquiry is essential to long-term refinement of something as complex as teaching (Hiebert et al., 2007[119]). There are a range of methodological approaches to practice-based research, including action or participatory research or research partnerships (Maxwell and Greany, 2017[120]), as well as professional learning communities (PLCs) (OECD, 2022[40]; Stoll, 2015[121]).
A challenge, however, remains in ensuring that deeper and more meaningful forms of collaboration become the norm in schools and systems, as these may serve as helpful foundations for rich professional inquiry into practice among teachers. The OECD’s TALIS survey (OECD, 2019[93]) suggests that while collaboration is common, deeper forms of collaborative practice remain limited in many schools (Figure 7.6).
Figure 7.6. Teachers’ collaboration with colleagues
Percentage of lower secondary teachers who report engaging in the following collaborative activities in their school with the following frequency (OECD average-31):
Notes: “At least once a month” covers the following response options: “1-3 times a month” and “Once a week or more”. “Less than once a month” covers the following response options: “Once a year or less”, “2-4 times a year” and “5-10 times a year”. Values are grouped by type of collaborative activity and, within each group, ranked in descending order of the collaborative activities in which lower secondary teachers report engaging at least once a month.
Source: OECD (2019[93]), TALIS 2018 Database, Table II.4.1, https://doi.org/10.1787/19cf08df-en.
A more inquiry-orientated, self-reflective approach to teaching appears particularly relevant for strengthening skills related to the 20 fundamental practices considered in this report. As Chapter 1 outlined, it is not so much a case of teachers overhauling what they are doing in their classrooms, but rather of building effectively on the existing foundations of the fundamental practices (OECD, 2020[24]). It is more about interrogating one’s practice to understand what can be improved, and how, in a given context.
Beyond the immediate benefits to the teacher, there are also two notable ways in which this could strengthen the larger, collective knowledge base of education. First, it could strengthen that knowledge base by codifying professional knowledge (see Professional Knowledge in Figure 7.5).
Professional knowledge is often seen as difficult to generalise, and there has been little effort to codify and synthesise this type of expertise. It has always been, and will remain, somewhat intangible, due to its localised nature and dependency on experience (Ulferts, 2019[122]; Guerriero, 2017[107]), but recent years have also seen the emergence of tools and means for examining the commonalities of this knowledge (Mulgan, 2024[123]).
Moreover, practice-based research could also help provide greater clarity on the needs for, and relevance of, more rigorous research methodologies. A recent OECD policy survey found that practitioners’ involvement in research production was primarily a passive one, with practitioners remaining the archetypal ‘object’ of research (OECD, 2022[40]). On the research side, there has been more attention to teachers’ experience and perspective during the implementation of interventions and changes, such as through ‘process guidance’, which aims to translate research findings for a school audience in an actionable way (Cartwright, 2013[124]). Yet, at times of experimentation and change (such as during the COVID-19 pandemic or the emergence of generative AI), rapid research from schools could feed into more rigorous evaluations to inform policy responses. This could build greater efficiency and responsiveness into the system and, significantly, support practitioners in periods of change.
There is a paradigm shift in research away from a ‘one-way’ model of research findings being ‘pushed’ onto schools and teachers, towards one that considers ‘two-way’ exchange in a far more complex and sustained fashion (OECD, 2022[40]; Sharples, 2013[106]). Imagine that schools are consulted on the future research agenda to ensure that scientific knowledge is relevant and aligned to their immediate needs. Similarly, making the professional knowledge of implementation explicit may provide inspiring examples of how to balance fidelity and contextual adjustment during implementation for other schools to consider.
Moreover, certain patterns may exist across this body of professional knowledge that could help inform further research on the critical components of implementation, helping to build a more granular evidence base. Together, this can mean evidence that is more likely to be adopted, implemented well and, thus, impactful at greater scale. The complex interplay of practitioners and researchers and their respective knowledge can be seen as a feedback loop that heightens the effectiveness of both scientific and professional knowledge, and of how they work together to ensure high-quality teaching.
References
[45] Alexander, R. (2018), “Developing dialogic teaching: genesis, process, trial”, Research Papers in Education, Vol. 33/5, pp. 561-598, https://doi.org/10.1080/02671522.2018.1481140.
[54] Alfieri, L. et al. (2011), “Does discovery-based instruction enhance learning?”, Journal of Educational Psychology, Vol. 103/1, pp. 1-18, https://doi.org/10.1037/a0021017.
[73] Ansari, A., T. Hofkens and R. Pianta (2020), “Teacher-student relationships across the first seven years of education and adolescent outcomes”, Journal of Applied Developmental Psychology, Vol. 71, p. 101200, https://doi.org/10.1016/j.appdev.2020.101200.
[83] Audley, S. and M. Donaldson (2022), “When to grit and when to quit:(How) should grit be taught in K-12 classrooms?”, Theory Into Practice, Vol. 61/3, pp. 265-276.
[96] Australian Institute for Teaching and School Leadership (2021), Collaborative Teaching: Sharing Best Practice, https://www.aitsl.edu.au/research/collaborate/collaborative-teaching-sharing-best-practice (accessed on 13 August 2023).
[75] Bishop, J. (2021), “Responsiveness and intellectual work: Features of mathematics classroom discourse related to student achievement”, Journal of the Learning Sciences, Vol. 30/3, pp. 466-508, https://doi.org/10.1080/10508406.2021.1922413.
[121] Brown, C. (ed.) (2015), Using evidence, learning and the role of professional learning communities, UCL IOE Press.
[94] Brown, C. and J. Malin (eds.) (2022), The Emerald Handbook of Evidence-Informed Practice in Education, Emerald Publishing Limited.
[111] Brown, C. et al. (2021), “Facilitating collaborative reflective inquiry amongst teachers: What do we currently know?”, International Journal of Educational Research, Vol. 105, p. 101695, https://doi.org/10.1016/j.ijer.2020.101695.
[33] Button, K. et al. (2013), “Power failure: why small sample size undermines the reliability of neuroscience”, Nature Reviews Neuroscience, Vol. 14/5, pp. 365-376, https://doi.org/10.1038/nrn3475.
[99] Cain, T. (2015), “Teachers’ engagement with research texts: beyond instrumental, conceptual or strategic use”, Journal of Education for Teaching, Vol. 41/5, pp. 478-492, https://doi.org/10.1080/02607476.2015.1105536.
[90] Cain, T. et al. (2019), “Bounded decision‐making, teachers’ reflection and organisational learning: How research can inform teachers and teaching”, British Educational Research Journal, Vol. 45/5, pp. 1072-1087, https://doi.org/10.1002/berj.3551.
[124] Cartwright, N. (2013), “Knowing what we are talking about: why evidence doesn’t always travel”, Evidence and Policy, Vol. 9/1, pp. 97-112, https://doi.org/10.1332/174426413x662581.
[38] Catalini, C., N. Lacetera and A. Oettl (2015), “The incidence and role of negative citations in science”, Proceedings of the National Academy of Sciences, Vol. 112/45, pp. 13823-13826, https://doi.org/10.1073/pnas.1502280112.
[57] Chiu, M. (2004), “Adapting Teacher Interventions to Student Needs During Cooperative Learning: How to Improve Student Problem Solving and Time On-Task”, American Educational Research Journal, Vol. 41/2, pp. 365-399, https://doi.org/10.3102/00028312041002365.
[37] Chu, J. and J. Evans (2021), “Slowed canonical progress in large fields of science”, Proceedings of the National Academy of Sciences, Vol. 118/41, https://doi.org/10.1073/pnas.2021636118.
[65] Coe, R. et al. (2020), Great Teaching Toolkit: Evidence Review.
[115] Coldwell, M. et al. (2017), Evidence-informed teaching: an evaluation of progress in England.
[10] Center for Curriculum Redesign (n.d.), The Challenges of Evidence-Informed Education, https://dirkvandammeedu.net/wp-content/uploads/2024/02/The-Challenges-of-Evidence-informed-Education-CEIPP.pdf (accessed on 7 August 2024).
[114] Darling-Hammond, L., M. Hyler and M. Gardner (2017), Effective Teacher Professional Development.
[78] de Jong, T. et al. (2023), “Let’s talk evidence – The case for combining inquiry-based and direct instruction”, Educational Research Review, Vol. 39, p. 100536, https://doi.org/10.1016/j.edurev.2023.100536.
[60] Deunk, M. et al. (2018), “Effective differentiation practices: A systematic review and meta-analysis of studies on the cognitive effects of differentiation practices in primary education”, Educational Research Review, Vol. 24, pp. 31-54, https://doi.org/10.1016/j.edurev.2018.02.002.
[110] Earl, L. and H. Timperley (2015), “Evaluative thinking for successful educational innovation”, OECD Education Working Papers, No. 122, OECD Publishing, Paris, https://doi.org/10.1787/5jrxtk1jtdwf-en.
[32] Edovald, T. and C. Nevill (2020), “Working Out What Works: The Case of the Education Endowment Foundation in England”, ECNU Review of Education, Vol. 4/1, pp. 46-64, https://doi.org/10.1177/2096531120913039.
[12] Education Endowment Foundation (2024), Using research evidence - a concise guide, https://educationendowmentfoundation.org.uk/support-for-schools/using-research-evidence (accessed on 7 August 2024).
[101] Education Endowment Foundation (2022), Teaching and Learning Toolkit, https://educationendowmentfoundation.org.uk/education-evidence/teaching-learning-toolkit (accessed on 28 March 2022).
[53] Education Endowment Foundation (2017), Review of SES and Science Learning in Formal Education Settings.
[41] Education Endowment Foundation (2021a), Collaborative Learning Approaches, https://educationendowmentfoundation.org.uk/education-evidence/teaching-learning-toolkit/collaborative-learning-approaches#nav-what-is-it (accessed on 19 March 2024).
[70] Education Endowment Foundation (2021b), Evidence Review: Social and emotional learning.
[3] Education.org (2021), Calling for an Education Knowledge Bridge: A White Paper to Advance Evidence Use in Education, https://whitepaper.education.org/download/white_paper.pdf (accessed on 19 March 2024).
[56] Elliot, V. et al. (2020), Feedback: Practice Review, https://educationendowmentfoundation.org.uk/education-evidence/evidence-reviews/feedback-approaches (accessed on 19 March 2024).
[66] Erduran, S. and Z. Dagher (2014), Reconceptualizing the Nature of Science for Science Education: Scientific Knowledge, Practices and Other Family Categories, Springer.
[108] Farley-Ripple, E. et al. (2018), “Rethinking Connections Between Research and Practice in Education: A Conceptual Framework”, Educational Researcher, Vol. 47/4, pp. 235-245, https://doi.org/10.3102/0013189x18761042.
[79] Fleur, D., B. Bredeweg and W. van den Bos (2021), “Metacognition: ideas and insights from neuro- and educational sciences”, npj Science of Learning, Vol. 6/1, https://doi.org/10.1038/s41539-021-00089-5.
[55] Furtak, E. et al. (2012), “Experimental and Quasi-Experimental Studies of Inquiry-Based Science Teaching”, Review of Educational Research, Vol. 82/3, pp. 300-329, https://doi.org/10.3102/0034654312457206.
[4] Glover (2006), Evidence-based medicine pyramid.
[113] Godfrey, D. and C. Brown (2018), “How effective is the research and development ecosystem for England’s schools?”, London Review of Education, Vol. 16/1, https://doi.org/10.18546/lre.16.1.12.
[22] Goe, L., C. Bell and O. Little (2008), Approaches to Evaluating Teacher Effectiveness: A Research Synthesis, https://eric.ed.gov/?id=ED521228 (accessed on 19 March 2024).
[23] Goe, L. and L. Stickler (2008), Teacher quality and student achievement: Making the most of recent research, https://files.eric.ed.gov/fulltext/ED520769.pdf (accessed on 19 March 2024).
[19] Goldhaber, D., T. Gratz and R. Theobald (2017), “What’s in a teacher test? Assessing the relationship between teacher licensure test scores and student STEM achievement and course-taking”, Economics of Education Review, Vol. 61/C, pp. 112-122.
[91] Gorard, S., B. See and N. Siddiqui (2020), “What is the evidence on the best way to get evidence into use in education?”, Review of Education, Vol. 8/2, pp. 570-610, https://doi.org/10.1002/rev3.3200.
[109] Greany, T. and B. Maxwell (2017), “Evidence-informed innovation in schools: Aligning collaborative research and development with high quality professional learning for teachers”, International Journal of Innovation in Education, Vol. 4/2/3.
[107] Guerriero, S. (ed.) (2017), Pedagogical Knowledge and the Changing Nature of the Teaching Profession, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/9789264270695-en.
[102] Gu, Q. (2020), The Research Schools Network: Supporting Schools to Develop Evidence-Informed Practice, https://educationendowmentfoundation.org.uk/public/files/RS_Evaluation.pdf (accessed on 7 August 2024).
[82] Gutman, L. and I. Schoon (2013), The impact of non-cognitive skills on outcomes for young people, http://www.ioe.ac.uk (accessed on 7 August 2024).
[72] Hamre, B. and R. Pianta (2001), “Early Teacher–Child Relationships and the Trajectory of Children’s School Outcomes through Eighth Grade”, Child Development, Vol. 72/2, pp. 625-638, https://doi.org/10.1111/1467-8624.00301.
[116] Harn, B., D. Parisi and M. Stoolmiller (2013), “Balancing Fidelity with Flexibility and Fit: What Do We Really Know about Fidelity of Implementation in Schools?”, Exceptional Children, Vol. 79/3, pp. 181-193, https://doi.org/10.1177/001440291307900204.
[46] Hennessy, S. et al. (2021), “An analysis of the forms of teacher-student dialogue that are most productive for learning”, Language and Education, Vol. 37/2, pp. 186-211, https://doi.org/10.1080/09500782.2021.1956943.
[119] Hiebert, J. et al. (2007), “Preparing Teachers to Learn from Teaching”, Journal of Teacher Education, Vol. 58/1, pp. 47-61, https://doi.org/10.1177/0022487106295726.
[20] Hill, H., L. Kapitula and K. Umland (2011), “A Validity Argument Approach to Evaluating Teacher Value-Added Scores”, American Educational Research Journal, Vol. 48/3, pp. 794-831, https://doi.org/10.3102/0002831210387916.
[25] Ho, A. and T. Kane (2013), The Reliability of Classroom Observations by School Personnel, Measures of Effective Teaching Project.
[8] Hoffmann, T., S. Bennett and C. Del Mar (2017), Evidence-Based Practice Across the Health Professions, Elsevier.
[39] Hood Cattaneo, K. (2017), “Telling Active Learning Pedagogies Apart: from theory to practice”, Journal of New Approaches in Educational Research, Vol. 6/2, pp. 144-152, https://doi.org/10.7821/naer.2017.7.237.
[44] Howe, C. and M. Abedin (2013), “Classroom dialogue: a systematic review across four decades of research”, Cambridge Journal of Education, Vol. 43/3, pp. 325-356, https://doi.org/10.1080/0305764x.2013.786024.
[125] Howe, C. et al. (2019), “Teacher–Student Dialogue During Classroom Teaching: Does It Really Impact on Student Outcomes?”, Journal of the Learning Sciences, Vol. 28/4-5, pp. 462-512, https://doi.org/10.1080/10508406.2019.1573730.
[97] Ion, G., D. Diaz and J. Gairin (2019), Developing the teachers’ professional capital through research informed innovations.
[89] Jones, S. et al. (2021), Navigating SEL from the Inside Out: Looking Inside and Across 33 Leading SEL Programs: A Practical Resource for Schools and OST Providers; Preschool and Elementary Focus.
[62] Jussim, L. and K. Harber (2005), “Teacher Expectations and Self-Fulfilling Prophecies: Knowns and Unknowns, Resolved and Unresolved Controversies”, Personality and Social Psychology Review, Vol. 9/2, pp. 131-155, https://doi.org/10.1207/s15327957pspr0902_3.
[69] Khalfaoui, A., R. García-Carrión and L. Villardón-Gallego (2020), “A Systematic Review of the Literature on Aspects Affecting Positive Classroom Climate in Multicultural Early Childhood Education”, Early Childhood Education Journal, Vol. 49/1, pp. 71-81, https://doi.org/10.1007/s10643-020-01054-4.
[52] Kolb, A. and D. Kolb (2009), “Experiential Learning Theory: A Dynamic, Holistic Approach to Management Learning, Education and Development”, in The SAGE Handbook of Management Learning, Education and Development, SAGE Publications Ltd, London, https://doi.org/10.4135/9780857021038.n3.
[42] Kyndt, E. et al. (2013), “A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings?”, Educational Research Review, Vol. 10, pp. 133-149, https://doi.org/10.1016/j.edurev.2013.02.002.
[14] Leinhardt, G. and J. Greeno (1986), “The cognitive skill of teaching.”, Journal of Educational Psychology, Vol. 78/2, pp. 75-95, https://doi.org/10.1037/0022-0663.78.2.75.
[112] Levine, T. and A. Marcus (2010), “How the structure and focus of teachers’ collaborative activities facilitate and constrain teacher learning”, Teaching and Teacher Education, Vol. 26/3, pp. 389-398, https://doi.org/10.1016/j.tate.2009.03.001.
[117] Mackenzie, N. (2007), “Teaching Excellence Awards: An Apple for the Teacher?”, Australian Journal of Education, Vol. 51/2, pp. 190-204, https://doi.org/10.1177/000494410705100207.
[31] Makel, M. and J. Plucker (2014), “Facts Are More Important Than Novelty”, Educational Researcher, Vol. 43/6, pp. 304-316, https://doi.org/10.3102/0013189x14545513.
[98] Malin, J. et al. (2020), “World-wide barriers and enablers to achieving evidence-informed practice in education: what can be learnt from Spain, England, the United States, and Germany?”, Humanities and Social Sciences Communications, Vol. 7/1, https://doi.org/10.1057/s41599-020-00587-8.
[120] Maxwell, B. and T. Greany (2017), “Evidence-informed innovation in schools: aligning collaborative research and development with high quality professional learning for teachers”, International Journal of Innovation in Education, Vol. 4/2/3, p. 147, https://doi.org/10.1504/ijiie.2017.10009078.
[51] Mayer, R. (2002), “Multimedia learning”, in Psychology of Learning and Motivation, Elsevier, https://doi.org/10.1016/s0079-7421(02)80005-6.
[49] Muijs, D. and C. Bokhove (2020), Metacognition and Self-Regulation: Evidence Review.
[123] Mulgan, G. (2024), Generative Shared Intelligence (GSI), https://demoshelsinki.fi/wp-content/uploads/2024/07/Demos-Helsinki-Generative-shared-intelligence-and-the-future-shape-of-government-Geoff-Mulgan.pdf (accessed on 7 January 2025).
[9] Murad, M. et al. (2016), “New evidence pyramid”, Evidence Based Medicine, Vol. 21/4, pp. 125-127, https://doi.org/10.1136/ebmed-2016-110401.
[58] Newman, M. (2021), The impact of feedback on student attainment: A systematic review.
[5] Nutley, S., A. Powell and H. Davies (2013), What counts as good evidence? Provocation paper for the Alliance for Useful Evidence, http://www.alliance4usefulevidence.org (accessed on 19 March 2024).
[92] OECD (2023), Who Really Cares about Using Education Research in Policy and Practice?: Developing a Culture of Research Engagement, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/bc641427-en.
[40] OECD (2022), Who Cares about Using Education Research in Policy and Practice?: Strengthening Research Engagement, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/bc641427-en.
[84] OECD (2021), AI and the Future of Skills, Volume 1: Capabilities and Assessments, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/5ee71f34-en.
[24] OECD (2020), Global Teaching InSights: A Video Study of Teaching, OECD Publishing, Paris, https://doi.org/10.1787/20d6f36b-en.
[93] OECD (2019), TALIS 2018 Results (Volume I): Teachers and School Leaders as Lifelong Learners, TALIS, OECD Publishing, Paris, https://doi.org/10.1787/1d0bc92a-en.
[27] Parra, J. and D. Edwards (2024), “Challenging the gold standard consensus: Randomised controlled trials (RCTs) and their pitfalls in evidence-based education”, Critical Studies in Education, pp. 1-18, https://doi.org/10.1080/17508487.2024.2314118.
[29] Pawson, R. (2006), Evidence-based Policy, SAGE Publications Ltd, London, https://doi.org/10.4135/9781849209120.
[50] Perry, J., D. Lundie and G. Golder (2018), “Metacognition in schools: what does the literature suggest about the effectiveness of teaching metacognition in schools?”, Educational Review, Vol. 71/4, pp. 483-500, https://doi.org/10.1080/00131911.2018.1441127.
[30] Perry, T., R. Morris and R. Lea (2022), “A decade of replication study in education? A mapping review (2011–2020)”, Educational Research and Evaluation, Vol. 27/1-2, pp. 12-34, https://doi.org/10.1080/13803611.2021.2022315.
[13] Pollard, A. (2010), “Professionalism and Pedagogy: a contemporary opportunity”, Teaching and Learning Research Programme.
[15] Praetorius, A., N. McIntyre and R. Klassen (2017), “Reactivity effects in video-based classroom research: an investigation using teacher and student questionnaires as well as teacher eye-tracking”, in Videobasierte Unterrichtsforschung, Springer Fachmedien Wiesbaden, https://doi.org/10.1007/978-3-658-15739-5_3.
[81] Puttick, S. and A. Cullinane (2021), “Towards the Nature of Geography for geography education: an exploratory account, learning from work on the Nature of Science”, Journal of Geography in Higher Education, Vol. 46/3, pp. 343-359.
[18] Reinholz, D. and N. Shah (2018), “Equity analytics: A methodological approach for quantifying participation patterns in mathematics classroom discourse”, Journal for Research in Mathematics Education, Vol. 49/2.
[16] Rowan, B. and R. Correnti (2009), “Studying Reading Instruction With Teacher Logs: Lessons From the Study of Instructional Improvement”, Educational Researcher, Vol. 38/2, pp. 120-131, https://doi.org/10.3102/0013189x09332375.
[80] Ruiz-Primo, M. (2011), “Informal formative assessment: The role of instructional dialogues in assessing students’ learning”, Studies in Educational Evaluation, Vol. 37/1, pp. 15-24, https://doi.org/10.1016/j.stueduc.2011.04.003.
[85] Sabol, T. and R. Pianta (2012), “Recent trends in research on teacher–child relationships”, Attachment & Human Development, Vol. 14/3, pp. 213-231.
[63] Sanchez, C. et al. (2017), “Self-grading and peer-grading for formative and summative assessments in 3rd through 12th grade classrooms: A meta-analysis.”, Journal of Educational Psychology, Vol. 109/8, pp. 1049-1066, https://doi.org/10.1037/edu0000190.
[17] Schweig, J. (2016), “Moving beyond means: revealing features of the learning environment by investigating the consensus among student ratings”, Learning Environments Research, Vol. 19/3, pp. 441-462, https://doi.org/10.1007/s10984-016-9216-7.
[47] Sedova, K. et al. (2019), “Do those who talk more learn more? The relationship between student classroom talk and student achievement”, Learning and Instruction, Vol. 63, p. 101217, https://doi.org/10.1016/j.learninstruc.2019.101217.
[118] Seppala, N. and C. Smith (2019), “Teaching awards in higher education: a qualitative study of motivation and outcomes”, Studies in Higher Education, Vol. 45/7, pp. 1398-1412, https://doi.org/10.1080/03075079.2019.1593349.
[106] Sharples, J. (2013), Evidence for the Frontline: A Report for the Alliance for Useful Evidence, http://www.alliance4usefulevidence.org (accessed on 19 March 2024).
[35] Sims, S. et al. (2022), “Quantifying “Promising Trials Bias” in Randomized Controlled Trials in Education”, Journal of Research on Educational Effectiveness, Vol. 16/4, pp. 663-680, https://doi.org/10.1080/19345747.2022.2090470.
[59] Smale-Jacobse, A. et al. (2019), “Differentiated Instruction in Secondary Education: A Systematic Review of Research Evidence”, Frontiers in Psychology, Vol. 10, https://doi.org/10.3389/fpsyg.2019.02366.
[21] Stacey, K. and R. Turner (eds.) (2015), Assessing Mathematical Literacy, Springer International Publishing, Cham, https://doi.org/10.1007/978-3-319-10121-7.
[74] Steponavičius, M., C. Gress-Wright and A. Linzarini (2023), “Social and emotional skills: Latest evidence on teachability and impact on life outcomes”, OECD Education Working Papers, No. 304, OECD Publishing, Paris, https://doi.org/10.1787/ba34f086-en.
[6] Stimson Library Medical Center of Excellence (n.d.), Evidence-Based Practice: Evidence Pyramid, https://amedd.libguides.com/ebp (accessed on 19 March 2024).
[64] Stockard, J. et al. (2018), “The Effectiveness of Direct Instruction Curricula: A Meta-Analysis of a Half Century of Research”, Review of Educational Research, Vol. 88/4, pp. 479-507, https://doi.org/10.3102/0034654317751919.
[103] SUMMA (2023), Effective Education Practices Platform, https://practicas.summaedu.org/en/what-is-it-platform/what-is-it-main-objectives/ (accessed on 13 August 2024).
[104] Sutton Trust (2024), School Funding and Pupil Premium 2024, https://www.suttontrust.com/our-research/school-funding-and-pupil-premium-2024/ (accessed on 7 August 2024).
[77] Sweller, J. et al. (2024), “Response to De Jong et al.’s (2023) paper “Let’s talk evidence – The case for combining inquiry-based and direct instruction””, Educational Research Review, Vol. 42, p. 100584, https://doi.org/10.1016/j.edurev.2023.100584.
[71] Takacs, Z. and R. Kassai (2019), “The efficacy of different interventions to foster children’s executive function skills: A series of meta-analyses.”, Psychological Bulletin, Vol. 145/7, pp. 653-697, https://doi.org/10.1037/bul0000195.
[28] Thomas, G. (2016), “After the Gold Rush: Questioning the “Gold Standard” and Reappraising the Status of Experiment and Randomized Controlled Trials in Education”, Harvard Educational Review, Vol. 86/3, pp. 390-411, https://doi.org/10.17763/1943-5045-86.3.390.
[87] Tolmie, A. et al. (2010), “Social effects of collaborative learning in primary schools”, Learning and Instruction, Vol. 20/3, pp. 177-191, https://doi.org/10.1016/j.learninstruc.2009.01.005.
[2] Torres, J. and M. Steponavičius (2022), “More than just a go-between: The role of intermediaries in knowledge mobilisation”, OECD Education Working Papers, No. 285, OECD Publishing, Paris, https://doi.org/10.1787/aa29cfd3-en.
[122] Ulferts, H. (2019), “The relevance of general pedagogical knowledge for successful teaching: Systematic review and meta-analysis of the international evidence from primary to tertiary education”, OECD Education Working Papers, No. 212, OECD Publishing, Paris, https://doi.org/10.1787/ede8feb6-en.
[7] University of Canberra Library (n.d.), Evidence-based practice in health, https://canberra.libguides.com/c.php?g=599346&p=4149721 (accessed on 19 March 2024).
[1] Van Damme, D. (2022), The Power of Proofs (Much) Beyond RCTs, http://www.curriculumredesign.org (accessed on 19 March 2024).
[61] van de Pol, J. et al. (2015), “The effects of scaffolding in the classroom: support contingency and student independent working time in relation to student achievement, task effort and appreciation of support”, Instructional Science, Vol. 43/5, pp. 615-641, https://doi.org/10.1007/s11251-015-9351-z.
[43] van Leeuwen, A. and J. Janssen (2019), “A systematic review of teacher guidance during collaborative learning in primary and secondary education”, Educational Research Review, Vol. 27, pp. 71-89, https://doi.org/10.1016/j.edurev.2019.02.001.
[34] Vasishth, S. et al. (2018), “The statistical significance filter leads to overoptimistic expectations of replicability”, Journal of Memory and Language, Vol. 103, pp. 151-175, https://doi.org/10.1016/j.jml.2018.07.004.
[11] Wadhwa, M., J. Zheng and T. Cook (2023), “How Consistent Are Meanings of “Evidence-Based”? A Comparative Review of 12 Clearinghouses that Rate the Effectiveness of Educational Programs”, Review of Educational Research, Vol. 94/1, pp. 3-32, https://doi.org/10.3102/00346543231152262.
[95] Walker, M., J. Nelson and S. Bradshaw (2019), Teachers’ engagement with research: what do we know? A research briefing, http://www.educationendowmentfoundation.org.uk (accessed on 19 March 2024).
[48] Wang, M. and J. Eccles (2013), “School context, achievement motivation, and academic engagement: A longitudinal study of school engagement using a multidimensional perspective”, Learning and Instruction, Vol. 28, pp. 12-23, https://doi.org/10.1016/j.learninstruc.2013.04.002.
[68] Wang, M. et al. (2020), “Classroom climate and children’s academic and psychological wellbeing: A systematic review and meta-analysis”, Developmental Review, Vol. 57, p. 100912, https://doi.org/10.1016/j.dr.2020.100912.
[86] Wentzel, K. and D. Watkins (2022), “Peer relationships and collaborative learning as contexts for academic enablers”, School Psychology Review, Vol. 31/3, pp. 366-377.
[100] Williams, D. and L. Coles (2007), “Evidence‐based practice in teaching: an information perspective”, Journal of Documentation, Vol. 63/6, pp. 812-835, https://doi.org/10.1108/00220410710836376.
[36] Wolf, R. et al. (2020), “Average Effect Sizes in Developer-Commissioned and Independent Evaluations”, Journal of Research on Educational Effectiveness, Vol. 13/2, pp. 428-447, https://doi.org/10.1080/19345747.2020.1726537.
[26] Wrigley, T. and S. McCusker (2019), “Evidence-based teaching: a simple view of “science””, Educational Research and Evaluation, Vol. 25/1-2, pp. 110-126, https://doi.org/10.1080/13803611.2019.1617992.
[76] Xu, L. and D. Clarke (2019), “Speaking or not speaking as a cultural practice: analysis of mathematics classroom discourse in Shanghai, Seoul, and Melbourne”, Educational Studies in Mathematics, Vol. 102/1, pp. 127-146, https://doi.org/10.1007/s10649-019-09901-x.
[105] Yeager, D. et al. (2021), “Teacher Mindsets Help Explain Where a Growth-Mindset Intervention Does and Doesn’t Work”, Psychological Science, Vol. 33/1, pp. 18-32, https://doi.org/10.1177/09567976211028984.
[88] Yeager, D., R. Dahl and C. Dweck (2017), “Why Interventions to Influence Adolescent Behavior Often Fail but Could Succeed”, Perspectives on Psychological Science, Vol. 13/1, pp. 101-122, https://doi.org/10.1177/1745691617722620.
[67] Li, Y. et al. (2011), “Peer relationships as a context for the development of school engagement during early adolescence”, International Journal of Behavioral Development, Vol. 35/4, pp. 329-342, https://doi.org/10.1177/0165025411402578.
Annex 7.A. Overview of the features of the ‘Strengths and Limitations of the Evidence Base’
Annex Table 7.A.1. Summary of strengths and limitations of the core teaching practices of the Schools+ Taxonomy
Key to abbreviations: Education levels primarily considered — P: primary school contexts; S: secondary school contexts. Subjects primarily considered — M: maths; Lit: literacy; Sci: science. Contexts primarily considered — H: high-resource contexts; L: lab-based studies; R: range of contexts; O: older studies; S: small-scale studies; SP: school-based programmes.

| Dimension | Sub-dimension (Taxonomy 1.0) | Proposed level of evidence | Number of studies & research designs | Consistency of findings | Education levels | Subjects | Contexts | Priority questions for a future research agenda |
|---|---|---|---|---|---|---|---|---|
| Classroom interaction | Collaboration | Solid | Solid number of meta-analyses and other studies with a range of research designs. | | P | M Sci Lit | H | There remains a need for further independent evaluations. |
| | Whole-class discussion and dialogue | Solid | Solid number of large-scale correlational studies; recent large-scale RCT with positive impact. | Reasonably consistent findings. | P | M Sci Lit | H | How this practice works for diverse student backgrounds. |
| | Questioning | Strong | Large number of studies with a range of research designs. | Reasonably consistent findings across different student contexts. | | | O R | What are the effects of specific types of questions; how combinations of questions work together. |
| Cognitive engagement | Ensuring good levels of challenge | Solid | Some correlational observation studies (one limitation is the conceptual variation in how 'challenge' is defined in these studies); some evidence from experimental designs. | | | | L | Greater conceptual clarity; how to measure cognitive engagement; what are the exact mechanisms that drive cognitive engagement. |
| | Fluency and flexibility | N/A | As per Table 2, to be re-conceptualised with Quality of Subject Matter. | | | | | |
| | Metacognition | Solid / Strong | Solid number of meta-analyses and other studies with a range of research designs. | | | | | Greater understanding of the subject-specific nature of certain mechanisms. |
| | Working with multiple perspectives | Solid | Some robust empirical studies on dual-coding theory. Theoretically can be a high-risk practice if confounded with less evidence-informed approaches such as learning styles. | | | M Sci | L | Understanding the effects of the practice in a greater range of subjects. |
| | Facilitating first-hand experiences | Promising / Solid | Some robust empirical studies with young students showing positive outcomes. | Some mixed results with older students. | P | Sci | | Understanding the effects of the practice with older students, and across a range of outcomes (e.g. learning, motivation). |
| | Meaningful context and real-world connections | Promising | Some robust empirical studies. | Variation in the findings, with some showing the practice makes little difference; some variation by age group too. | | | | How to measure 'meaningful'. |
| Formative assessment and feedback | Learning goals | Strong | Large number of studies with a range of research designs, from a number of years. | | | | R | Empirical testing of different types of learning goals. |
| | Eliciting student thinking | Solid | Solid number of primarily correlational studies; some recent robust empirical studies. | | | | R | Understanding in more detail the role that different types of eliciting, including tasks, play in driving effects. |
| | Feedback | Strong | Large number of meta-analyses and other studies with a range of research designs, from a number of years. | | | | R | The long-term effects of different types of feedback on memory retention. |
| | Aligning to student thinking | Solid | Solid number of primarily correlational studies, from a number of years. 'Aligning' can be conceptualised and operationalised differently in studies. | | | | R | How to measure alignment of teaching for larger-scale studies. |
| Quality of subject matter | Explanations and making expositions | Promising | Some correlational studies, but difficult to isolate the exact role of explanations. Solid number of studies on using worked examples and variation theory. | | | | H S | More precisely defining what makes a high-quality explanation. |
| | Nature of the subject | Promising | Some small-scale empirical studies; limited number of robust empirical studies. | | | Sci M | H S | Larger-scale empirical work to understand the effects on different student outcomes. |
| | Making connections | Promising / Solid | Solid number of studies from psychology and theoretical studies. | | | M | L | Understanding exactly which connections in subject matter are of value. |
| | Exploring patterns and generalisations | Promising | Limited number of robust empirical studies. Merged as per Table 2. | | | | | |
| | Explicit procedures and methods | Promising | As per Table 2, to be re-conceptualised with Explanations and Making Expositions. | | | | | |
| | Clarity and accuracy | Solid | Solid number of correlational studies on the presentation of well-structured, coherent content. Solid number of studies on the sequencing of content and learning opportunities. | Reasonably consistent findings. | | M | L | Understanding the features of well-structured content in a greater range of subjects. |
| Social-emotional support | Creating a supportive classroom climate | Solid | Some large-scale correlational studies; some meta-analyses showing positive outcomes. One limitation is the variation in how climate is conceptualised. | Mixed findings around certain constructs (e.g. perseverance and growth mindset). | | | | Greater conceptual clarity; how to measure common constructs such as respect and warmth; how climates may vary by different subjects. |
| | Relationship building (student–student) | Promising / Solid | Some robust empirical studies on cooperation, supported by some qualitative studies. | Some variation in the findings on cooperation, but consistency in qualitative studies. | | | R | How to measure relationships. |
| | Relationship building (teacher–student) | Solid | Some large-scale correlational studies, with both student and teacher perceptions of positive relationships predicting positive outcomes. | | | | R | How to measure relationships. |
| | Explicitly teaching and actively practising social-emotional skills | Promising / Solid | Some robust empirical studies and some meta-analyses. | Mixed findings with older students. | P | | SP | How generalisable certain findings are to older students. |
Note: Ratings on the levels of evidence were provided by the 26 leading academics and knowledge brokerage organisations participating in the expert review exercise. Participants were also invited to share qualitative input on the rationale behind their ratings. An additional 17 academics and organisations provided qualitative input on the conceptualisation of practices and the scoping of their evidence (see Annex A: Methodology).
1. Ratings were defined as follows: (i) Emerging: The evidence is primarily theoretical and there is limited robust empirical evidence, or the evidence is limited to specific contexts and/or students. (ii) Promising: The research base is developing and showing promise, but there may still be a greater reliance on theoretical rather than robust empirical studies (including experimental studies), and/or a high degree of variation in studies. There may only be a limited number of contexts represented in studies. (iii) Solid: The research base is solid, with a good number of robust empirical studies (including experimental studies) and a solid understanding of how effects may vary across different contexts. (iv) Strong: The research base is strong, with a large number of robust empirical studies (including experimental studies) and a high degree of consensus around the mechanisms that drive outcomes and how these vary in different contexts. Observational and cross-sectional studies also feed into the evidence base.