Head Start
Performance Standards
As a federally funded program, Head Start is authorized by Congress and governed by federal regulations called the Head Start Program Performance Standards (HSPPS). These Standards guide funding decisions, federal monitoring, re-competition, and key management practices such as how programs should collect and use data.
Head Start Performance Standards: An Overview
The first Head Start Program Performance Standards were published in 1975. Since then, the standards have been revised and updated several times, most notably through the Improving Head Start for School Readiness Act of 2007 and the 2016 update to the HSPPS. The 2016 update was the first comprehensive revision of the standards since their original release in 1975.
Here’s an overview of areas covered in the Head Start Performance Standards:
- Eligibility, Recruitment, Selection, Enrollment, and Attendance
- Program Structure
- Education and Child Development Program Services
- Health Program Services
- Family and Community Engagement Program Services
- Additional Services for Children With Disabilities
- Transition Services
- Services to Enrolled Pregnant Women
- Human Resources Management
- Program Management and Quality Improvement
Requiring a Stronger Command of Program Data
In many ways, the 2016 changes simplified and streamlined the Head Start Performance Standards, reducing the overall number of federal requirements and compliance details by a third. These changes expanded the important role of parents and required Head Start and Early Head Start programs to offer longer service hours to meet children and families’ needs.
However, the biggest change to the Head Start Performance Standards came in a new expectation for ongoing monitoring, a stronger command of data, and the use of independent judgment. In the past, programs focused on complying with Head Start Performance Standards. Once they met those standards, they felt they had done their job.
How Head Start Programs Use Data Coaches
As Head Start programs seek to strengthen their data use, they are beginning to turn to a new kind of resource: coaches. Data coaches started appearing in K-12 school districts over a decade ago, helping faculty glean insights from the mass of information they collect. With new federal requirements for data, Head Start programs are now engaging coaches as well.
As programs adjust to the new Head Start performance standards, data coaching can help them adopt a continuous improvement mindset and comply with these requirements.
Early Learning Outcomes Framework
The Head Start Early Learning Outcomes Framework (ELOF) describes children's learning and development from birth to age five across five broad domains:
- Approaches to Learning
- Social and Emotional Development
- Language and Literacy
- Cognition
- Perceptual, Motor, and Physical Development
The ELOF has inspired many state level early childhood frameworks, and it informs various child assessments. The Head Start Performance Standards require grantees to implement program and teaching practices that are aligned with the ELOF.
Head Start Federal Reviews
Focus Area 1 and 2 reviews each happen once every five-year grant cycle for all Head Start and Early Head Start grantees. These reviews present a key opportunity for programs to come together as a team, reflect more deeply on data, and highlight promising practices to federal reviewers.
The goal is to demonstrate not only that they are meeting the Head Start Performance Standards but also that they are positively impacting children, families, and communities.
Focus Area 1
The Focus Area 1 (FA1) review is conducted virtually, generally during the first year of the grant cycle. In the past, it was primarily discussion-based. However, the Fiscal Year 2024 Monitoring process now requires documentation to back up the information programs share about improvements, outcomes, and how they use data to drive decision-making in all content areas.
During an FA1 review, grantees share their program design. They discuss the role of community assessment and explain how they’ve identified program goals that will positively impact their community. Grantees also show how their budget supports these program goals.
As part of the new monitoring protocol, FA1 reviews now also include data tours. During a data tour, programs should be prepared to present and discuss data from the previous two years. The goal is to show an understanding of the data’s context and display program-wide trends over time rather than day-to-day case management.
After the FA1 review, the grantee and the regional office receive a report with feedback for the program moving forward. This review sets the stage for Focus Area 2, when reviewers go beyond program design to understand how well grantees have actually implemented it.
Focus Area 2
During a Focus Area 2 (FA2) review, federal reviewers dig deeper than in an FA1 review. They visit on-site to interview staff and conduct a data review and analysis around program implementation. The data review takes place during the data tour, when a program, center, or site leader shares how they collect, analyze, aggregate, and use data to make program decisions and meet the Head Start Performance Standards.
Monitoring Protocol
The Office of Head Start releases an updated monitoring protocol each fall. The biggest shift came in 2016, with a stronger focus on continuous improvement and less emphasis on compliance with regulations. However, the 2023 revisions are also significant. During a program year with a scheduled FA1 or FA2 review, it's helpful for program leaders to review the new monitoring protocol as soon as it comes out. This can help prevent any surprises.
In keeping with the Head Start Performance Standards, federal reviewers ask to see up to two years' worth of data. They want to know that programs can articulate how they use their data across time. They also want to know that data use continues to support the program's ability to set and sustain short-term and long-term goals.
The Fiscal Year 2024 Monitoring process includes the following:
- Management Discussion: While other components may be scheduled based on a program's needs and staffing, the timing of this discussion is non-negotiable. The management discussion typically includes Head Start directors and content leads. It covers the who, how, and why, which will be tested in subsequent review activities.
- Governance and Parent Discussions: These discussions are attended by volunteer members who are selected by the program, at a time agreeable to attendees. They are designed to test areas including communication systems, data awareness, improvement planning, and relationship development systems.
- Content Area Data Tours: Content area teams in fiscal, health, human resources, ECD, ERSEA, and FCE lead reviewers through data tours. The goal is to demonstrate how that content area collects, aggregates, analyzes, and uses data.
- Classroom and Center Explorations: The purpose of classroom and center explorations is for reviewers to assess environments and safety practices. They are looking for effective teaching practices, curriculum fidelity, data-driven lesson planning and implementation, and observing classroom culture for nurturing and safe practices.
- ERSEA File Review: This file review is based on a predetermined sample size with no more than 93 files per grant. Files are randomly selected so that the reviewer can assess compliance and quality and test selection and enrollment practices.
- Fiscal Transaction Testing: The goal of fiscal transaction testing is to confirm that the grantee has checks and balances in place to prevent waste, fraud, and abuse. This component is conducted virtually.
Head Start Data Stories in a Review
Unfortunately, many programs conflate anecdotes with data stories. For instance, consider a Family Services team that discovers a family is unhoused and then connects them with a community partner to secure stable housing. This is a heartwarming story, but if it’s not backed by broader program data reflecting a larger pattern or system, it’s merely an anecdote, not a data story, and it won’t impress a federal reviewer.
This approach also creates a missed opportunity for the program to understand trends and risks impacting their community and take steps to fill that need. Staff might say “I think” rather than “I know,” because they don’t have the data to back up assumptions.
Programs that don’t take a data-informed approach may try to present anecdotes during an FA2 review. Often, they can’t explain the rationale behind what data is collected and how it’s used, because they don’t understand the bigger picture. More importantly, they can’t demonstrate that they’re meeting the Head Start Performance Standard to collect, aggregate, and analyze data to achieve program performance goals.
Contrast that to a program taking a more data-informed approach. During an FA2 review, a Head Start program leader might explain that as of last year, 80% of children were not up-to-date on their health screenings. Through data analysis, they discovered that the most commonly missing screening was dental. Based on that finding, they then partnered with a local organization to bring dentists into their centers for cleanings twice a year.
Thanks to this new partnership and an effort to educate parents and secure the necessary paperwork for their children to get their teeth cleaned on-site, the number of children without up-to-date health screenings dropped 20% compared to last year. Program leaders might add that they’re looking at other missing health screenings so they can bring that number closer to zero.
This example not only cites data but shows the team’s commitment to collecting the right data and taking positive action based on findings as outlined in the Head Start Performance Standards. Coaching can help teams develop data stories and articulate them with greater confidence and fluency.
Recompetition and the Designation Renewal System
In the past, Head Start and Early Head Start grants rolled over from year to year once the grantee submitted paperwork. The Designation Renewal System (DRS), created in 2011 and updated in 2020, was established to maintain quality standards and ensure that Head Start and Early Head Start programs meet program and financial requirements. This system created five-year grant periods for all agencies running Head Start and Early Head Start programs.
However, under the DRS, agencies meeting one or more of the following conditions must compete for Head Start or Early Head Start funds with other local agencies:
- Two or more deficiencies, defined as a systemic or substantial material failure of an agency in an area of performance
- Score below a competitive threshold in one or more Classroom Assessment Scoring System (CLASS®) domains. Additionally, grantees that fall below a quality threshold on any CLASS® domain receive support from the Office of Head Start (OHS) for the program to improve classroom learning environments.
- Two or more audit findings of material weakness or questioned costs associated with Head Start funds, or a going concern finding
- Failure to establish and take steps to achieve school readiness goals
- License revocation
- Suspension by OHS
- Debarment by another federal or state agency or disqualification from the Child and Adult Care Food Program (CACFP)
Agencies are typically aware that recompetition is coming, due to low CLASS® scores or a deficiency that came up during their FA2 visit. Typically, they will have a chance to address the issue first.
Once an agency goes into the recompete process, there is a Notice of Funding Opportunity (NOFO) and interested agencies have the chance to participate in an open competition for a grant to provide Head Start and Early Head Start services in that area. The agency meeting those conditions can still receive a grant through this re-competition process.
While the majority of programs that have gone through DRS have been successfully refunded, programs can sometimes lose a portion of their funding to other providers in the area. DRS competition can also vary by region, with more urban areas facing stiffer competition because more providers are available to compete. In cases where a grantee loses all of their funding and the grant is awarded to another agency in the area, the new grantee will sometimes take over the lease from the prior grantee and hire some of their staff.
Grantees preparing for the recompete process may benefit from quality assurance coaching or consulting support to help ensure that they are meeting the Head Start Performance Standards.