Promise Program Measurement Plan Resources

Articulating the program theory of change

As place-based initiatives, Promise programs are intended to be more than simple scholarships. The goals of most Promise programs generally fall into three categories: 1) creating or supporting a college-going culture in the pre-K-12th grade setting, 2) increasing access to and success in post-secondary education, and 3) bringing about community-level change, whether that is defined as the development of a more educated workforce, an improved quality of life, or other forms of economic development. But how do Promise programs bring about these desired outcomes? What needs to happen in order for the financial award and other components of the program to actually affect these goals? A Theory of Change can provide insight into these questions by codifying the expected early, intermediate, and long-term changes sought as part of the Promise effort.

In Pittsburgh, for example, a simplified theory of change mapped the pathways through which the Promise directly affects students' college going, as well as the indirect effects of the place-based initiative operating through broader changes to the K-12 and post-secondary ecosystem. Promise program leaders and researchers may find it valuable to use readily available tools and processes to develop a theory of change specific to the community in which the program is being implemented.
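One lightweight way to codify a theory of change is as a directed graph linking activities to early, intermediate, and long-term outcomes. The sketch below is a minimal illustration in Python; the node names and pathways are invented for illustration and do not represent any actual program's theory of change.

```python
# Minimal sketch: a theory of change encoded as a directed graph.
# Node names and pathways here are illustrative assumptions only.
theory_of_change = {
    # each key is expected to influence the outcomes it points to
    "scholarship announced": ["college-going culture", "student aspirations"],
    "college-going culture": ["K-12 engagement"],
    "student aspirations": ["college enrollment"],
    "K-12 engagement": ["college enrollment"],
    "college enrollment": ["degree completion"],
    "degree completion": ["educated workforce"],
}

def downstream(node: str, graph: dict) -> set:
    """Return every outcome reachable from a node, i.e., everything
    the theory says this change should eventually influence."""
    seen, stack = set(), [node]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(downstream("scholarship announced", theory_of_change))
```

Even this toy encoding makes it easy to ask which long-term outcomes depend on a given early change, which is essentially what a measurement plan needs from a theory of change.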

Here are some well-recognized resources for developing a theory of change:

Understanding Program Implementation Timeline

Which outcomes can be fairly measured, and when, will depend on how the Promise program is intended to work and how any program eligibility criteria are enacted. Developing a Theory of Change, as described in the prior section, is an important first step in thinking about what needs to happen in order for a particular Promise program to realize its intended outcomes. Since Promise programs represent long-term efforts at systemic change, careful consideration should be given to when various targeted outcomes are even possible to detect. That requires a clear understanding of when the key components of the Promise intervention were fully implemented and have had sufficient time to be experienced by students.

The Pittsburgh Promise, for example, made scholarship dollars available beginning with the high school graduating class of 2008. This program has grade point average, attendance, and residency requirements that were phased in over three cohorts of students (a programmatic choice that gave students graduating soon after the announcement more of an opportunity to access the scholarship). In addition, it took time for relevant supports to be developed and implemented by the Pittsburgh Promise, the Pittsburgh Public Schools, families, and community partners. Below, we show an example table that was used to help think about the Pittsburgh Promise implementation timeline. Each Promise program will have its own constraints and conditions to consider, so please feel free to use this matrix as a starting point and adapt it to your particular context.

[Table: Pittsburgh Promise implementation timeline by graduating class]

This table shows quite clearly that the 2008 and 2009 graduating classes were not recipients of the full Promise model (as articulated in the Pittsburgh Promise Theory of Change), but did benefit from up to $5,000 a year toward post-secondary education (the full award of $10,000 annually became available to the 2012 class).

If we look at college enrollment and degree attainment rates for the 2008 cohort, we are measuring the impact of the financial award exclusively, since there was not sufficient time between the announcement of the Promise and that cohort's matriculation for the school system, parents, or students to respond attitudinally or behaviorally. Students who graduated from high school in 2014, however, experienced all of middle and high school with knowledge of the Pittsburgh Promise and with the corresponding supports that were rolled out in the several years after the program's start.
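Analysts could encode this kind of implementation timeline as data so that each cohort's exposure can be computed rather than eyeballed. The sketch below is a minimal Python illustration: the partial (2008) and full (2012) award years come from the discussion above, while the support-component names and start years are hypothetical placeholders for a program's actual rollout.

```python
# Minimal sketch: encode a Promise rollout as a cohort-by-component
# exposure matrix. Award years come from the text above; the support
# components and their start years are hypothetical placeholders.

component_start = {
    "partial_award": 2008,       # $5,000/year available (from the text)
    "full_award": 2012,          # $10,000/year available (from the text)
    "college_counseling": 2010,  # hypothetical support rollout year
    "family_outreach": 2011,     # hypothetical support rollout year
}

def years_of_exposure(grad_year: int, start_year: int) -> int:
    """Years of high school (capped at 4) during which a component
    was in place for a given graduating class."""
    return max(0, min(4, grad_year - start_year + 1))

print("cohort\t" + "\t".join(component_start))
for grad_year in range(2008, 2016):
    cells = [str(years_of_exposure(grad_year, s)) for s in component_start.values()]
    print(f"{grad_year}\t" + "\t".join(cells))
```

A matrix like this makes it immediately visible that, under these assumptions, the 2008 cohort had exposure only to the partial award, echoing the point above about which cohorts experienced the full model.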

Other programs with simpler requirements, such as the Kalamazoo Promise, made the full amount of benefits available to the first eligible graduating class (Class of 2006), but students who were seniors when the program was announced in November 2005 had very little time to adapt their high-school experience in light of the new financial aid resources now available to them. Successive graduating classes have had more years of exposure to the Kalamazoo Promise, so one would expect to see greater impacts in both the K-12 setting and post-secondary outcomes. Not until 2018 will students who spent their entire K-12 years knowing about the availability of the Kalamazoo Promise graduate from high school. This cohort will potentially represent the full "Kalamazoo Promise effect."

Some communities may announce the creation of a Promise program to take effect several years down the road, giving students time to adjust their expectations in light of future benefits. The roll-out of benefits in these cases may lead to different timing in terms of when results might be captured. Similarly, early commitment programs that ask students to enroll in 6th or 7th grade provide several years in which both students and systems can adjust to an impending scholarship opportunity.

Completing a table such as this example may help determine when it is appropriate to measure outcomes, and for which groups of students.

Documenting programmatic interventions

Upon announcement of a Promise program a range of actors and stakeholders may respond in many ways, such as by implementing new programs, changing resource allocation or intensity, forming new partnerships, or shifting focus and attention to different issues. It may be important to document these ecosystem shifts in order to later understand outcomes data.

Key areas to attend to:

  • Availability of high school to college bridge programs/school year transition programs/senior year transition courses
  • Early assessment and intervention programs that are developed
  • Programming aimed at "college knowledge," including college visits and summer outreach programs
  • Development of programming around career interests and links between careers and educational pathways (internships, partnerships with employers)
  • College readiness programs
  • Embedded college and career counseling
  • College assessment (SAT/ACT) programs: test preparation, financial aid for fees, increased access through "SAT Days"
  • FAFSA completion and support system
  • College application process supports
  • Presence of summer bridge programs

Documentation of the nature of the programming or resource shifts, including when each change was made and for whom, can strengthen the interpretation of outcomes data obtained at a later date.
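One lightweight way to capture such ecosystem shifts is a structured intervention log. The sketch below is a minimal Python illustration; the field names and the sample entry are assumptions, not a prescribed schema, but a log kept in roughly this form can later be joined against outcomes data by cohort.

```python
# Minimal sketch of an intervention log. Field names and the sample
# entry are illustrative assumptions, not a required schema.
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class Intervention:
    name: str           # e.g., "Summer bridge program"
    category: str       # e.g., "college knowledge", "FAFSA support"
    start_date: str     # ISO date the change took effect
    target_grades: str  # e.g., "11-12"
    provider: str       # who implemented the change
    notes: str          # intensity, funding, partners, etc.

log = [
    Intervention(
        name="Embedded college counseling",
        category="counseling",
        start_date="2011-09-01",
        target_grades="9-12",
        provider="District and Promise staff",
        notes="One counselor per high school (hypothetical example)",
    ),
]

# Persist as CSV so the log can be merged with outcomes data later.
with open("intervention_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(Intervention)])
    writer.writeheader()
    writer.writerows(asdict(item) for item in log)
```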

Identifying Appropriate Indicators

The following indicators are organized into three broad outcome areas (the K-12 system, post-secondary outcomes, and community development/economic revitalization) to align with the goals of most current Promise programs. We offer a fairly comprehensive list of potential indicators in each outcome area, along with a rationale for each indicator and some possible data sources. Some of these indicators are strongly predictive of future success in a post-secondary setting and are thus important for school districts to track; others are tightly tied to, and would be directly influenced by, the implementation of a Promise program (and some indicators do both). Of course, whether, how, and where these data might exist, and in what form, will vary across systems, and it will take some exploration to determine the specific data sources that might yield the indicators described.

A note on collecting baseline data. The task of measuring the impact of a Promise program, especially in the area of post-secondary access and attainment, will be easier if researchers capture some baseline data before a Promise program is announced or implemented. Useful information includes the college-going patterns of a district’s graduates (available through the National Student Clearinghouse); student attitudes, expectations, and aspirations regarding plans after high-school graduation; and “college knowledge” – i.e., awareness of college costs, the application process, and so on. These can be obtained through surveys of high-school students.
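As a concrete illustration of baseline measurement, the sketch below computes pre-announcement college-going rates from a student-level table. It is a minimal Python example: the column names, sample values, and announcement year are assumptions, and in practice the enrollment flags would come from a source such as the National Student Clearinghouse.

```python
# Minimal sketch: baseline college-going rates before announcement.
# Column names, sample values, and the announcement year are
# hypothetical placeholders for a district's actual data.
import pandas as pd

ANNOUNCEMENT_YEAR = 2008  # assumption for illustration

graduates = pd.DataFrame({
    "grad_year": [2005, 2005, 2006, 2006, 2007, 2007],
    "enrolled_college": [True, False, True, True, False, True],
})

# Restrict to cohorts that graduated before the Promise existed.
baseline = (
    graduates[graduates["grad_year"] < ANNOUNCEMENT_YEAR]
    .groupby("grad_year")["enrolled_college"]
    .mean()
    .rename("college_going_rate")
)
print(baseline)
```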

In addition to the indicators listed here, Promise programs will also want to collect some basic information about usage of the scholarship. These indicators could include annual metrics for the following (a brief computational sketch follows the list):

  • Rate of student eligibility (if eligibility criteria exist)
  • Rate of student scholarship use
  • Post-secondary institutions attended by scholarship recipients
  • Amount of money spent by scholarship program
  • Additional scholarship dollars accessed by Promise recipients
  • Academic performance of Promise scholars in post-secondary institutions
  • Post-secondary retention, progression, and completion
  • Degrees or credentials received by Promise recipients


Reviewing Example Data Dashboards

"A dashboard is a visual display of the most important information needed to achieve one or more objectives, consolidated and arranged on a single screen so the information can be monitored at a glance" (Few, 2013).

Promise programs can use a dashboard approach to provide clear and concise information about the status or impact of the program to various audiences. The Pittsburgh Promise has developed an impact dashboard that presents metrics on fundraising, scholarship totals and amounts, Scholar college graduates, and impact on high school graduation rate, post-secondary education enrollment, and college retention rates.

Say Yes Syracuse, a Promise-like initiative, also has a one-page results summary that reports on economic development, academic growth, and community engagement metrics. Although not quite a dashboard, it has some features that could be considered foundational to developing a dashboard.

Dashboard expert Stephen Few offers these key principles for developing dashboards (a small prototyping sketch follows the list):

  • Dashboards are visual displays that usually employ a mix of text and graphics.
  • They display the information needed to achieve specific objectives; it is therefore important to be clear about those objectives.
  • They fit onto a single computer screen, so the needed information is available without scrolling or switching views.
  • Users can monitor the relevant information at a single glance.
  • Dashboards present information using small, concise, direct, and clear display media.
  • In order to be effective, dashboards should be customized to the function, audience, and content.
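Few's principles can be prototyped with ordinary plotting tools before investing in dashboard software. The sketch below is a minimal Python illustration that arranges a handful of Promise metrics on a single matplotlib figure, one panel per metric; all numbers shown are invented placeholder values.

```python
# Minimal single-screen dashboard sketch with matplotlib.
# All values below are invented placeholders, not real program data.
import matplotlib.pyplot as plt

years = [2010, 2011, 2012, 2013, 2014]
enrollment_rate = [0.58, 0.61, 0.63, 0.66, 0.68]  # placeholder data
retention_rate = [0.70, 0.72, 0.71, 0.75, 0.77]   # placeholder data
dollars_m = [3.1, 4.0, 5.2, 6.0, 6.8]             # placeholder data ($M)

fig, axes = plt.subplots(1, 3, figsize=(12, 3.5))
fig.suptitle("Promise Program Dashboard (illustrative data)")

axes[0].plot(years, enrollment_rate, marker="o")
axes[0].set_title("College enrollment rate")
axes[1].plot(years, retention_rate, marker="o")
axes[1].set_title("First-year retention rate")
axes[2].bar(years, dollars_m)
axes[2].set_title("Scholarship dollars ($M)")

for ax in axes:
    ax.set_xlabel("Graduating class")
fig.tight_layout()
plt.show()
```

Keeping everything on one figure, with direct titles and no decoration, follows the single-screen, at-a-glance principles listed above.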

Planning for When to Measure

The Theory of Change allows one to clearly articulate and understand how the program is supposed to bring about the targeted outcomes, the implementation timeline makes clear which program recipients experienced the entirety of the intended intervention, and the indicators framework offers some possible metrics for ongoing monitoring to gauge progress. As Promise programs embark on the measurement adventure, it is critical to understand (and communicate to stakeholders) exactly when various outcomes are likely to be observed.

The table below is an example of the type of thinking that is necessary to gain clarity about when outcomes can and should be measured to determine the impact of a Promise program. We have included just a few possible categories of outcomes and show that there is a significant range in when change might be observed. School attendance rates may change almost immediately (especially if Promise eligibility includes attendance requirements), while upticks in the educational attainment of the population may appear quickly if the Promise attracts residents with higher education levels, or may require many successive cohorts of newly minted college graduates remaining in the region.

[Table: example timing of outcome measurement by outcome category]
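This timing logic can also be written down explicitly. The sketch below is a minimal Python illustration; the outcome list and lag assumptions are hypothetical, not a validated model, but the exercise of assigning each outcome an assumed exposure lag is exactly what the table above is meant to prompt.

```python
# Minimal sketch: earliest plausible measurement year per outcome.
# The outcomes and lag assumptions are hypothetical placeholders.
PROGRAM_START = 2008  # first covered graduating class (assumption)

# Assumed years of exposure needed before an effect could appear.
required_exposure = {
    "school attendance": 0,        # can respond almost immediately
    "college enrollment rate": 1,
    "college completion rate": 5,  # enrollment plus time-to-degree
    "regional educational attainment": 10,
}

for outcome, lag in sorted(required_exposure.items(), key=lambda kv: kv[1]):
    print(f"{outcome}: earliest fair measurement ~{PROGRAM_START + lag}")
```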

References

Alderman, M. K. (2013). Motivation for achievement: Possibilities for teaching and learning (3rd ed.). New York, NY: Routledge.

Allensworth, E., & Easton, J. (2007). What matters for staying on-track and graduating in Chicago public high schools. Chicago, IL: Consortium on Chicago School Research.

Aud, S., Kewal-Ramani, A., & Frohlich, L. (2011). America's youth: Transitions to adulthood (NCES 2012-026). U.S. Department of Education, National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubs2012/2012026.pdf

Balfanz, R., & Byrnes, V. (2012). The importance of being in school: A report on absenteeism in the nation's public schools. Education Digest, 78(May), 1–46.

Balfanz, R., & Chang, H. N.-L. (2013). A focus on attendance is key to success.

Bartik, T. J., & Lachowska, M. (2012). The short-term effects of the Kalamazoo Promise scholarship on student outcomes (Upjohn Institute Working Paper No. 12-186). Kalamazoo, MI: W.E. Upjohn Institute for Employment Research. http://doi.org/10.17848/wp12-186

Benner, A. D., & Mistry, R. S. (2007). Congruence of mother and teacher educational expectations and low-income youth's academic competence. Journal of Educational Psychology, 99(1), 140–153. http://doi.org/10.1037/0022-0663.99.1.140

Bettinger, E. P., & Long, B. T. (2013). The FAFSA project: Results from the H&R Block FAFSA experiment and next steps. Retrieved from http://isites.harvard.edu/fs/docs/icb.topic1232998.files/Bettinger Long Oreopoulos - The FAFSA Projects - description 7-25-13.pdf

Braxton, J. M., Vesper, N., & Hossler, D. (1995). Expectations for college and student persistence. Research in Higher Education, 36(5), 595–611. http://doi.org/10.1007/BF02208833

Chang, H. N., & Romero, M. (2008). Present, engaged, and accounted for: The critical importance of addressing chronic absence in the early grades. New York, NY: National Center for Children in Poverty.

CRIS Annenberg Institute for School Reform (2010). Leading indicator spotlight series. Webinar.

Cumpton, G., Schexnayder, D., & King, D. T. (2012). Factors associated with education and work after high school for the classes of 2008 and 2009. Austin, TX: Central Texas Student Future Project.

Few, S. C. (2013). Information dashboard design: Displaying data for at-a-glance monitoring (2nd ed.). Burlingame, CA: Analytics Press.

Fiester, L. (2010). Early warning! Why reading by the end of the third grade matters (KIDS COUNT Special Report). Baltimore, MD: Annie E. Casey Foundation. Retrieved from http://www.aecf.org

Glaeser, E. L., & Saiz, A. (2003). The rise of the skilled city. Philadelphia, PA: Federal Reserve Bank of Philadelphia.

Glaeser, E. L., & Berry, C. R. (2006). Why are smart places getting smarter? (Taubman Center policy brief). Cambridge, MA: Harvard University. Retrieved from http://www.hks.harvard.edu/var/ezp_site/storage/fckeditor/file/pdfs/centers-programs/centers/taubman/brief_divergence.pdf

Hernandez, D. J. (2012). Double jeopardy: How third-grade reading skills and poverty influence high school graduation. Baltimore, MD: Annie E. Casey Foundation.

Hershbein, B. J. (2013). A second look at enrollment changes after the Kalamazoo Promise (Upjohn Institute Working Paper No. 13-200). Kalamazoo, MI: W.E. Upjohn Institute for Employment Research. http://dx.doi.org/10.17848/wp13-200

Hossler, D., Schmit, J., & Vesper, N. (1999). Going to college: How social, economic, and educational factors influence the decisions students make. Baltimore, MD: Johns Hopkins University Press.

Hossler, D., & Stage, F. K. (1992). Family and high school experience influences on the postsecondary educational plans of ninth-grade students. American Educational Research Journal, 29(2), 425–452.

Iriti, J., Bickel, W. E., & Kaufman, J. (2012). Realizing "The Promise": Scholar retention and persistence in post-secondary education. Pittsburgh, PA: University of Pittsburgh, Learning Research and Development Center.

Jones, J. N., Miron, G., & Young, A. J. (2012). The Kalamazoo Promise and perceived changes in teacher beliefs, expectations, and behaviors. The Journal of Educational Research, 105(1), 36–51.

Kahlenberg, R. D. (2012). The future of school integration: Socioeconomic diversity as an education reform strategy. Century Foundation Press.

LeGower, M., & Walsh, R. (2014). Promise scholarship programs as place-making policy: Evidence from school enrollment and housing prices.

Miller-Adams, M., & Timmeney, B. (2013). The impact of the Kalamazoo Promise on college choice: An analysis of Kalamazoo Area Math and Science Center graduates (Policy Paper No. 2013-014). Kalamazoo, MI: W.E. Upjohn Institute for Employment Research. http://doi.org/10.17848/pol2015-014

Miron, G., & Evergreen, S. (2008). The Kalamazoo Promise as a catalyst for change in an urban school district: A theoretical framework. Working Paper for the Kalamazoo Promise Evaluation.

Miron, G., Jones, J. N., & Kelaher-Young, A. J. (2011). The Kalamazoo Promise and perceived changes in school climate. Education Policy Analysis Archives, 19(17), 1–25.

Moore, C., & Shulock, N. (2009). Student progress toward degree completion: Lessons from the research literature. Sacramento, CA: Institute for Higher Education Leadership & Policy, California State University.

Trostel, P. A. (2010). The impact of new college graduates on intrastate labor markets. Journal of Education Finance, 36(2), 186–213. http://doi.org/10.1353/jef.2010.0003
