Summary: The scientific understanding of how a healthy environment and effective parenting shape children's developing brains over the long term has never been stronger. That evidence has fueled a national movement among states to institutionalize early childhood education programs. Despite this momentum, foundations seeking to advance the field find that measuring effectiveness is challenging.

No easy answers to questions of impact

The field of parenting education is not for funders looking for easy answers.

In 1994, one of the first assessments of parenting education, titled See How We Grow, noted that the social science evidence about the effectiveness of parenting education programs was mixed:

“This is the adolescence of an emerging discipline, filled with great promise but plagued by the tensions of maturity and immaturity. Our collective challenge is to understand what nurturing is necessary in order to deliver on the promise of this youth.”

GMA Foundations’ advisory work with supporters of parent education, such as the A.C. Ratshesky Foundation and the Linden Foundation, has forced us to adjust our sights so that we may measure what matters.

Unrealistic expectations about outcomes and evaluation?

We recently reached out to a group of Boston-area practitioners in the field of parent education to learn about the current state of the field. Much of the conversation focused on the challenges of program evaluation.

Everyone agreed that high-quality parenting education programs should have a strong outcomes orientation, but many were concerned that funders’ expectations about measuring outcomes are often unrealistic.

“Funders, legislators, and school administrators want a simple answer to a complex question. They want to know how kids do in school as a result of a program,” says Brita McNemar, Even Start Coordinator of the Waltham Family School.

“What they do not realize is that this work is messy and the academic research has yet to catch up to the practice.”

A thousand different outcomes for parents, children, families, and communities

To start with, the field lacks a common set of indicators. Parenting programs aim for a wide variety of outcomes, from early childhood literacy, parental bonding, abuse prevention, and health education to parent involvement in school. The many categories of programs do not fit into a tidy logic model.

The diverse settings and participants of parent education programs also make it difficult to gather comparable information. Programs take place in childcare centers, libraries, homes, prisons, and community centers. They may target teens, people with low-literacy, new parents, fathers, or grandparents. Each setting and target population presents a methodological puzzle for the best social science researchers.

Even the term “parent engagement” is seldom defined with the clarity or precision needed to capture the full story.

Gathering information that helps improve programs

The good news is that most provider organizations track indicators and actively wrestle with how to best collect information.

Practitioners want to be sure that evaluation is not only useful to funders in measuring successes but useful to their own staff. Collecting information that helps improve the program is a top priority.

Many organizations track a set of indicators that monitor the intensity and continuity of their relationships with parents. Simple measures, such as the number of parents served, attendance, and how often families access other supportive programs, serve as proxy indicators that the program is effective.

Monitoring and evaluation is labor-intensive

A few organizations that we spoke with have sophisticated program monitoring and evaluation programs, but the vast majority struggle to build this capacity.

The organizations that GMA Foundations has worked with often complain that while foundations demand rigorous evaluation systems that document impact, they do not always cover the costs of program evaluation.

Nonprofit leaders want foundations to better understand the staffing requirements and high degree of internal capacity required to track outcomes.

An end to silos?

The leaders we spoke with were enthusiastic about the degree of collaboration and new partnerships in the field. They described their programs as part of a larger network of supports in the community for parents and children.

There are now more provider networks that support parents, better relationships with schools, and dramatic changes in the way funding and resources are allocated to support parent engagement.

These collaborative partnerships have supported a vibrant conversation about measuring the impact of the collective work.

Acknowledging complexity and understanding community

This growing network of parent education programs and support services in a community is a complex system that defies a simple evaluation approach.

Sandy Sachs, Child Development Specialist at the Family Nurturing Center, puts it bluntly: “A family may participate in a well-baby program, enroll in the WIC feeding program, get kids books from the doctor, drop in to a library reading session, and take an English class. How would you attribute the reading score of their third grader to a single program? That’s crazy!”

The current push for better outcome measurement in human services has focused on individual outcomes. A growing movement of researchers and practitioners, however, is also urging organizations to tease out community-level outcomes that identify changes in organizations, increases in supports and services, and shifts in community attitudes.

This broader view of outcome measurement suggests that foundations may need to analyze their grantmaking in the field of parent education in terms of how it supports networks of organizations, rather than through the traditional grant-by-grant assessment of individual organizations.

It also reminds us that funders need to be more willing to fund the program evaluation that is so essential to nonprofit effectiveness.

As more foundations step up to the challenge of supporting program monitoring and evaluation, they will discover a rich and rewarding conversation that will sharpen their grantmaking.

We look forward to your comments and ideas about what can help move this conversation forward.

–Prentice Zinn

Prentice is a managing partner at GMA Foundations