What happens when teachers start using the science of learning? A new study suggests some very powerful outcomes.
A group of researchers has been exploring what happens when teachers get professional development based on the science of learning.
Their latest research article explores the effectiveness of two approaches to teacher professional development at improving middle school students’ science achievement. It’s also one of the largest, most rigorous tests of the idea of applying the science of learning to teacher professional development, involving 90 schools, 267 teachers, and nearly 12,000 students.
The result? Students learn more when teachers apply science of learning principles in the classroom. A science of learning professional development program produced better student outcomes than the existing professional development program and better student outcomes than a program focused on giving teachers greater content knowledge on what they teach.
This is promising news to those of us arguing for wider application of these research-based methods. But the results also suggest that increasing teacher professional development time, lowering administrative hurdles to reform, and reducing teacher turnover are also keys to effective professional development.
This study is unprecedented for its scope, rigor, and realistic research setting. Researchers randomly assigned 90 schools to one of three conditions: professional development based on the science of learning, professional development focused on content knowledge, or the district's existing professional development program (the control).
The professional development programs took place over the course of two years and involved two separate cohorts of teachers. The study also followed current best practices for teacher professional development: subject-area focused content, collaborative and active learning opportunities, alignment with existing curricula, and professional development that took place over a long period of time.
That said, teachers received only 34 total hours of professional development across the two years (four two-hour sessions during the school year, plus a 2.5-day summer session).
The research took place in a large urban district, where over 90% of students are eligible for free or reduced-price lunch. For reference, the National Center for Education Statistics considers schools where over 75% of students are eligible for free or reduced-price lunch to be high-poverty schools. This district was among the lowest-performing districts in the state, and the majority of students scored well below state averages.
It’s also a setting with high teacher turnover. About 40% of the teachers involved with the study were no longer teaching the same subject or the same grade after the first year of the intervention. And the study authors suggest that administrative hurdles, such as school leaders with different priorities, may have impacted the effectiveness of the professional development programs.
In addition to evaluating student outcomes between the conditions, the authors also had teachers fill out reports on their own teaching practices, to verify whether the professional development actually changed teacher practice.
Importantly, the teachers in the cognitive science group had less teaching experience. In spite of randomization, teachers in the cognitive science group had about half the teaching experience of teachers in the comparison groups — 7.5 years compared to around 15 — a statistically significant difference given the groups' respective standard deviations. This sometimes happens with randomized designs.
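As a rough illustration of how a baseline difference like this gets checked, here is a Welch's t-test computed from summary statistics. The standard deviations and group sizes below are invented for illustration only; the study's actual values are not reported here.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic and approximate degrees of freedom
    computed from group summary statistics."""
    se1, se2 = sd1**2 / n1, sd2**2 / n2
    t = (mean1 - mean2) / math.sqrt(se1 + se2)
    df = (se1 + se2) ** 2 / (se1**2 / (n1 - 1) + se2**2 / (n2 - 1))
    return t, df

# Hypothetical inputs: 7.5 vs. 15 years of experience; the SDs (6 and 8)
# and group sizes (90 each) are made up, not taken from the study.
t, df = welch_t(7.5, 6.0, 90, 15.0, 8.0, 90)
print(round(t, 2), round(df))  # a |t| this large is significant at any conventional threshold
```

With any plausible inputs, a gap of 7.5 years against SDs of this size produces a t statistic far beyond the usual cutoff of about 2, which is why the difference survives a significance test despite random assignment.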
The teaching practice reports reveal that the professional development programs only affected teaching practice a little bit. For instance, across all three categories of cognitive science principles and all cohorts, only three out of fifteen groups posted significant differences in self-reported teacher practices as compared to the control group (six out of fifteen groups posted significant differences when compared to the content-only group). In most cases, however, the trends are in the expected direction.
Considering these are self-report measures on constructs that were specifically taught during professional development, it’s troubling that teaching practice seems to have changed only a little. Changes were particularly small for using spaced practice and visualization.
Something similar happened in the content knowledge group. The content knowledge teachers were only significantly better than the control in one out of the three subject areas on a test meant to measure their knowledge. The content knowledge group didn't perform significantly differently from the cognitive science group in any content knowledge area. The trends, however, are in the expected direction.
The key finding, though, is clear: Overall, student outcomes in the cognitive science intervention were better than student outcomes in both other groups. The article focused on the effect sizes of the intervention, so we report these here, too. Effect sizes are a standardized measure of how big an impact an intervention had. To give you some point of comparison: an effect size of zero means there was no average difference between the groups. An effect size of .3 would be pretty substantial. And an effect size of .5 or greater is rare in education research.
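For readers who want the mechanics: an effect size of this kind is typically Cohen's d, the difference between group means divided by the pooled standard deviation. A minimal sketch with made-up numbers (none of these figures come from the study):

```python
import math

def cohens_d(mean_treat, mean_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(
        ((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2)
        / (n_treat + n_ctrl - 2)
    )
    return (mean_treat - mean_ctrl) / pooled_sd

# Illustrative numbers: treatment students average 72 on a test vs. 68
# for control, with a common SD of 20 — an effect size of 0.2.
d = cohens_d(72, 68, 20, 20, 2000, 2000)
print(round(d, 2))  # 0.2
```

Dividing by the standard deviation is what makes effect sizes comparable across studies with different tests and scales, which is why the article reports them instead of raw score differences.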
Out of six comparisons between the control and the cognitive science groups across the two cohorts, three effect sizes are small to non-existent — 0.0, 0.06, 0.05 — and three are moderate — 0.17, 0.19, 0.20. Out of the six comparisons between the content knowledge and the cognitive science groups across the two cohorts, two are small — 0.06, 0.09 — three are moderate — 0.16, 0.17, 0.24 — and one is large — 0.36.
Maybe this doesn’t seem so impressive. But consider the context. In a challenging, realistic teaching setting, with high teacher turnover, the cognitive science intervention produced superior outcomes in spite of teachers in that group having half of the teaching experience, on average, of teachers in the comparison groups and in spite of minimal reported changes in teaching practice for all groups. This outcome is nothing to scoff at.
It also seems that teachers can transfer science of learning principles into other subject areas without explicit training in those areas. For example, if a teacher learned to use science of learning techniques when teaching students a unit about cells, this helped them teach a unit about matter more effectively, too. This implies that professional development based on the science of learning might have broader beneficial effects that traditional professional development programs and programs based on content knowledge alone can’t match.
Notably, student outcomes in the content knowledge condition were worse than those in the control condition. This suggests that learning content knowledge alone, without accompanying pedagogical knowledge, may even reduce teaching effectiveness.
Professional development based on the science of learning is promising, but the professional development model needs to be improved for such interventions to be more effective. This likely means more total time practicing and integrating these techniques in the classroom, sustained administrative support for the practice, and reduced teacher turnover.
The same research group is conducting a follow-up study in a different district with a different science program. We'll be awaiting the results.
The Learning Curve publishes articles about how people learn. Please reach out with any ideas for articles on the research on learning.