Research tells us that, overall, Head Start has positive effects on children’s health, education, and economic outcomes. But there is wide variability in quality from program to program — and, as a field, we don’t understand why.
Earlier this year, Sara Mead and I tried to figure that out. We published a three-year analysis of several of the highest-performing Head Start programs across the country, focusing on programs that produce significant learning gains for children. Our goal was to understand what made them so effective.
As part of this project, we provided detailed, tactical information about the exemplars' design and practices. We hope the analysis serves as a resource and starting point for other Head Start programs interested in experimenting with something new and, potentially, more effective.
Here are three action steps that Head Start programs can take right now to improve their practice:
- Regularly analyze multiple types of internal program data
All programs included in this study have evidence of impact on children's learning outcomes. But they (rightly) care about many more data points that contribute to or reflect child, family, and staff experience and outcomes. To that end, these programs regularly analyze child outcome data alongside and in the context of input data (e.g., teacher credentials) and process data (e.g., CLASS scores). These analyses let programs track performance on multiple indicators, which paints a more complete picture of program quality; surfaces trends, areas of strength, and opportunities for growth; and helps programs determine whether further action is warranted.
Utah Community Action, for example, looked at the relationship between child attendance and performance on assessments — in other words, they compared a process data point to an outcome data point — and found that lower attendance was correlated with lower performance. In response, they developed simple, visual tools to show parents the relationship between attendance and performance, specific to each classroom, which teachers posted outside their doors. They later found that child attendance increased after teachers shared the visuals.
Of course, a correlation between two data points doesn't mean one causes the other. But, absent the capacity to conduct more rigorous analyses, exemplar programs found that correlations were worth exploring and, in some cases, directly informed successful improvement efforts.
Conducting these analyses and comparing data points doesn't require a fancy data system or dedicated program staff. In fact, many teachers do so intuitively: they know, for example, that students often have lower assessment scores at the beginning of the year, when they are still adjusting to classroom routines. Programs interested in using data to drive improvement can leverage this existing skill and curiosity.
Action step: Hold monthly meetings where teachers have access to multiple types of data for their classroom and for each of their students. Ask them to identify patterns and, if appropriate, develop hypotheses about what led to those patterns.
- Pay attention to curriculum and even closer attention to fidelity
Exemplar programs found that instruction is one of the most important drivers of their success, if not the most important, and that curriculum is a key component of instruction. But none of these programs uses an off-the-shelf curriculum unaltered. Instead, they either create their own curricula or substantially supplement or tailor commercially available ones.
CAP Tulsa, for example, created an implementation guide that describes what high-quality instruction should look like in their classrooms. It includes a checklist outlining what a teacher’s activities and classroom environment should look like if they are implementing the curriculum with fidelity, so teachers can self-assess their practice and make adjustments. The document also gives teachers and coaches a shared language: Coaches use it as a road map for assessing and providing feedback on teacher practice at the beginning, middle, and end of the school year.
Action step: Review your curriculum's content and your supports for implementing it with fidelity, with an eye toward whether both align with your understanding of high-quality instruction.
- Intentionally connect family engagement to children’s learning
All exemplar programs have developed a tight connection between what happens in the classroom and how programs engage families to support children’s learning outside of the classroom. Each program has created structures to build relationships and facilitate information-sharing between classroom teachers and family support staff.
At Educare Miami-Dade, for example, family engagement staff visit classrooms to observe children at least once a day and share information from those visits with parents. Family engagement staff also participate in twice-monthly multidisciplinary meetings in which teachers and academic program staff discuss children's progress. During these meetings, academic staff provide performance and observation data, and family engagement staff connect those data to insights about what's happening with children at home and ensure that the family's goals are prioritized alongside the education goals for children. When information about a child's experience emerges — from teachers, classroom observation, or multidisciplinary meetings — family engagement staff communicate immediately with the family, leveraging their relationships with them.
Action step: Set up regular meetings between family engagement staff and teachers to share information about children and families.
We realize that implementing these practices isn't possible for all Head Start programs; several are difficult to replicate widely in the current policy environment, where many early childhood programs have limited resources and capacity-building supports. But absent systemic policy changes, other Head Start programs (and early childhood programs outside of Head Start) can still leverage what we learned from the exemplars. That's why "Leading by Exemplar" offers recommendations for system-level levers alongside more local ones. Ultimately, improving results for kids requires both types of changes.