Logic Model Development as a Collaborative Process with the NACOG Head Start Program
By Lenay Dunn, REL West Deputy Director, and Tran Keys, Senior Research Associate at WestEd
This post first appeared on the REL West blog and is posted here with permission.
A logic model is a graphical representation of the relationships between the parts of a program (or intervention or strategy) and its expected outcomes. It can serve as a framework for program planning, implementation, and evaluation. A well-defined logic model makes the connections between program components explicit, creates a common language among team members and staff, and can be shared broadly to tell the story of your program.
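To make the idea concrete, a simplified, hypothetical early childhood example (not NACOG’s actual logic model) might include:
- Inputs: qualified teachers, curriculum materials, federal funding
- Activities: daily classroom instruction, family engagement events, health screenings
- Outputs: number of instructional hours delivered, number of families attending events, number of screenings completed
- Outcomes: gains in children’s school readiness, increased family engagement, improved child health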
This post provides an overview of REL West’s work with the Northern Arizona Council of Governments (NACOG) during the spring and summer of 2022 to support the program’s development of a logic model and associated measures.
Documenting Program Components and Assessing Goals
How can you represent the components of a comprehensive program that has multiple, ambitious goals? How can you assess whether the program is meeting those goals? Driven by these questions, the Northern Arizona Council of Governments (NACOG) Head Start engaged in a series of workshops with REL West to develop a logic model and associated measures.
NACOG Head Start is a federally funded preschool program serving children from infancy through age 5 across a 27,000-square-mile area of northern Arizona that is largely rural and highly diverse. Head Start is governed by federal performance standards. Within those standards, programs must manage program data and support their availability, usability, integrity, and security. Head Start programs must establish data management procedures and, through ongoing assessment, use data to monitor progress toward addressing program needs, evaluate compliance, and achieve program goals.
NACOG Head Start staff are not new to collecting or using data, but they felt they needed support, and a thought partner, to learn how their data support their mission, vision, and values.
Jennifer Brown, Director of NACOG Head Start, shares: “We wanted to take our data-driven decision-making to the next level. We wanted to ensure that our team had a common understanding of programmatic activities and desired outcomes. This was our time to stop the day-to-day grind of the work we do and engage a broader leadership team to support their work with programmatic aspects. Time for us to inspect what we expect of our staff. Inspect what we expect. We’re firm believers that being able to bring outside expertise increases our team capacity to develop logic models, goals, and objectives, therefore helping us to meet the federal Head Start performance standards.”
With these thoughts top of mind, NACOG Head Start leadership reached out to REL West for logic model support, because they recognized the need for a succinct document that tells the story of their work.
REL West Logic Model Development Collaboration with NACOG Head Start
Logic model development is a team sport (or should be). The clearest and most comprehensive logic models are created by a team, not by one or two individuals. At our initial meeting, where we listened to NACOG Head Start’s request for support in creating a logic model, we suggested that the experience be collaborative: not just with the three NACOG leaders on the call and REL West, but also with their broader leadership team. NACOG agreed to include all 10 managers and administrators, who worked with REL West staff to collectively develop their logic model. The commitment: actively participate in a virtual logic model development workshop series of five 90-minute sessions facilitated by REL West over 6 months. The series was a mixture of content delivery, pre-work, and working sessions. REL West staff recognized that participating in the workshop series was an add-on to already very busy calendars; however, we know that logic models are most meaningful when the right folks are in the room, working together.
The 10 workshop participants oversaw different areas of Head Start, including operations; human resources; health, nutrition, and safety; and program components (as component managers and directors). Individually, these leaders had plenty of insight to share about their respective areas of responsibility; collectively, they developed a logic model for NACOG Head Start. At the end of our fifth and final virtual logic model development session, when asked what aspects of the workshop were most helpful, the following responses from NACOG managers were representative of the feedback from the group:
- “Being able to work as a team to bring all of our thoughts and ideas together.”
- “I liked how we were given the framework of the logic model and then were asked questions to make it our own.”
- “Working through the varying components of the program and being able to identify what was important from each component manager. Helps me to focus my work and support field staff in understanding this as well.”
- “Adjusting our thinking of data collection and why we collect what we do. Reminding us of the WHY brings us together as a more solid team.”
New REL West Resources Developed
REL West’s productive collaboration with NACOG Head Start resulted in two resources that will help other educators who would like to develop (or refine) a logic model and associated measures for their program. First, REL West hosted a webinar, Defining and Measuring Progress: Aligning Data and Measures to Outputs and Outcomes of Logic Models, co-presented with our partner Jennifer Brown, Director of NACOG Head Start, who provided the often-requested practitioner’s perspective. The webinar offered guidance on connecting data and measures to the outputs and outcomes of a logic model to determine whether an implemented program, intervention, or strategy is having its intended effect. Aligning data and measures to a program’s logic model components can help with tracking progress and measuring impact. As was evident in our collaboration with NACOG Head Start, ensuring this alignment is an important next step in logic model development, and it supports continuous improvement toward program goals.
The second resource from our collaboration with NACOG Head Start is the Aligning Data and Measures to Outputs and Outcomes of the Logic Model infographic. The infographic is designed for a broad audience of educators looking for a quick resource on why and how to align program data and measures with logic model components. It addresses how to connect data and measures to know whether the program being implemented is having its intended effect, and it outlines various types of data, data sources, and data collection methods.
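As a hypothetical illustration (not drawn from NACOG’s materials), an output such as “number of family engagement events held” might be tracked through attendance logs and event records, while a related outcome such as “increased family engagement” might be measured through an annual family survey or participation rates over time.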
NACOG Head Start Next Steps
After finalizing their logic model, NACOG Head Start will use it to support continuous improvement toward Head Start program goals. They plan to share the logic model with all staff, their governing body, consultants, and the families they serve, and they will include it in future grant applications. Final advice from Jennifer to webinar attendees:
We would also say don’t leave your logic model on the shelf… Talk about it. It’s data. It’s important. It’s living, breathing. Make it a part of your vision and mission. Share it with everyone. Make it public. Revisit it regularly. We’re doing it monthly. Data and measurement alignment is so critical, and the logic model allows us to show the hard work we do. — Jennifer Brown, Director of Head Start, NACOG
Helpful Resources
The following selected resources can help state-, district-, and school-level educators develop and use logic models and guide data collection efforts to measure progress on outputs and outcomes.
Kekahio, W., Cicchinelli, L., Lawton, B., & Brandon, P. R. (2014). Logic models: A tool for effective program planning, collaboration, and monitoring (REL 2014–025). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Pacific. Retrieved from https://ies.ed.gov/ncee/
Lawton, B., Brandon, P. R., Cicchinelli, L., & Kekahio, W. (2014). Logic models: A tool for designing and monitoring program evaluations (REL 2014–007). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Pacific. Retrieved from https://ies.ed.gov/ncee/
Keys, T., & Dunn, L. (2023). Aligning data and measures to outputs and outcomes of the logic model [Infographic]. REL West. Retrieved from https://ies.ed.gov/ncee/
Shakman, K., & Rodriguez, S. M. (2015). Logic models for program design, implementation, and evaluation: Workshop toolkit (REL 2015–057). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northeast & Islands. Retrieved February 7, 2022, from https://ies.ed.gov/ncee/
Stewart, J., Joyce, J., Haines, M., Yanoski, D., Gagnon, D., Luke, K., Rhoads, C., & Germeroth, C. (2021). Program evaluation toolkit: Quick start guide (REL 2022–112). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Central. Retrieved February 7, 2022, from https://ies.ed.gov/ncee/