
July/August 2019 | Vol. 20, No. 6

Using Data and Evaluation to Drive Decisions and Make Improvements During a Change and Implementation Process

Written by the Capacity Building Center for States

People use data every day: they use it to search for online reviews before selecting a restaurant; compare features and functionality when buying a car; and track steps, calories burned, and progress toward fitness goals.

Data—particularly reliable and accessible data—can contribute to informed decision-making. As with everyday activities, this is true in the course of implementing programs or practices to serve children, youth, and families. Data help implementation teams uncover the root causes of agency problems or needs so they can fully understand the issues. Data also help teams compare program options when selecting an appropriate solution to address the problem. Moreover, data can be invaluable during implementation as part of evaluation, helping teams understand whether their efforts are on track and guiding needed adjustments.

A new brief from the Capacity Building Center for States, Change and Implementation in Practice: Monitoring, Evaluating, and Applying Findings, offers step-by-step guidance for collecting, analyzing, and using data effectively during a change and implementation process.

Three Key Considerations

The brief underscores three themes related to using data and conducting evaluations of new programs and practices:

  • Continuous learning. Data analysis and evaluation can contribute to feedback loops that highlight lessons learned, which in turn can contribute to program improvements. Continuous learning requires an agency culture that is open to taking a close and transparent look at what's working and what's not.
  • Collaboration. Child welfare teams often need to partner with experienced evaluators or data analysts. Evaluation assistance may be available from agency continuous quality improvement or data leads, local universities, or the Center for States (see the Center's site for contact information for liaisons). In addition, teams benefit from considering multiple perspectives and working collaboratively with varied stakeholders—including system partners, community providers, and families and youth—to identify questions of interest and interpret findings and their implications.
  • Capacity. Agencies will typically need to build data and evaluation capacity over time. This may include increasing staff knowledge and skills on data and evaluation topics (find some resources below), forging partnerships, developing easy-to-use processes and tools, and fostering a culture that embraces data and learning.

Formative Evaluation

Formative evaluation uses data and feedback loops to refine an intervention and promote effective implementation of the program or practice. It applies systematic methods to collect, analyze, and use data for the purpose of guiding improvements. Formative evaluation can help program managers and implementation teams explore what is working well, what is not, and what needs to change. For example, if data indicate that program activities are not being implemented as intended, what added supports might help (e.g., practice guidelines, training, coaching)? If services are not reaching family members, how can identified barriers be addressed?

As part of a formative evaluation, teams can work with evaluators to identify which data sources can best answer their questions (e.g., administrative data, feedback from staff delivering services or family members receiving them), how to collect data (e.g., through case reviews, surveys, focus groups), and how to analyze and share data to provide credible and meaningful evidence. From there, teams apply findings, particularly in making decisions about what to continue doing, when to adapt, or when to stop.

While formative evaluation requires deliberate steps, it does not have to be a lengthy process. Some agencies are looking at faster ways to collect data and apply findings within rapid cycle learning approaches.

See the following for more information on using data and evaluation to inform improvements: