How do you know if your program achieves what it sets out to achieve? How can you prove it? This might seem like a straightforward endeavor; in reality, it is statistically complex, time-consuming, and usually cost-prohibitive.
Proving that a program or intervention caused an intended result is tricky business. It requires you to show that your result was not caused by the myriad other factors in people's lives and environments, but by your intervention alone. This type of analysis is called an 'impact evaluation'. There are different approaches one can take when conducting an impact evaluation, all with varying degrees of sophistication and rigor. The gold standard is the Randomized Controlled Trial (RCT), the kind often used for medical studies. This is almost never a feasible approach for social programs, so other methods are more commonly used. A 'quasi-experimental' design is the next best thing, typically comparing a program's participants against people who, apart from participating in your program, have extremely similar characteristics.
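For readers curious about the mechanics, here is a minimal, purely illustrative sketch of one common quasi-experimental technique, propensity score matching, in Python. The file name, column names, and matching approach below are hypothetical assumptions chosen for illustration; they do not represent the actual design, data, or methods used in the evaluation described in this post.

```python
# Illustrative sketch only: a simple propensity-score-matched comparison.
# The data file and column names are hypothetical, not the study's actual data.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# One row per veteran: 'participant' is 1 for program participants and 0 for
# the comparison pool; the covariates describe pre-program characteristics.
df = pd.read_csv("veterans.csv")
covariates = ["age", "years_of_service", "paygrade", "has_degree"]

# 1. Estimate each person's probability of participating (the propensity score)
#    from their observed characteristics.
model = LogisticRegression(max_iter=1000).fit(df[covariates], df["participant"])
df["pscore"] = model.predict_proba(df[covariates])[:, 1]

# 2. Match each participant to the non-participant with the closest score,
#    building a comparison group that "looks like" the participant group.
treated = df[df["participant"] == 1]
control = df[df["participant"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.flatten()]

# 3. Compare an outcome of interest (here, a hypothetical salary column)
#    between the matched groups.
difference = treated["salary"].mean() - matched_control["salary"].mean()
print(f"Estimated difference in mean salary: {difference:,.0f}")
```

A real evaluation goes well beyond a sketch like this, checking that the matched groups are truly comparable, testing whether any difference is statistically significant, and selecting matching or weighting methods suited to the data.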
Conducting an impact evaluation requires thoughtful and diligent data collection, expertise (both within your organization and from external partners), long-range planning, and, of course, funding. Lining up all these requirements is extremely difficult, making evaluations of this caliber quite rare in the nonprofit space. Even with these pieces in place, your organization must recognize the importance of evaluation: a commitment to evidence-based programming that prioritizes both program participants and good stewardship of philanthropic dollars.
In 2021, the D’Aniello Institute for Veterans and Military Families (IVMF) was thrilled to commission an impact evaluation of its Onward to Opportunity (O2O) program with The Clearinghouse for Military Family Readiness at Penn State, a landmark evaluation in the VSO space. Onward to Opportunity is now the only veteran career training program that can demonstrate third-party-validated efficacy for program participants with this level of evaluation rigor (quasi-experimental design).
IVMF and Clearinghouse teams met regularly for over a year to establish the study design, clean and align datasets, and perform the statistical analysis. The evaluation would not have been possible without an earlier longitudinal study (i.e., one that follows participants’ outcomes over time) of veteran well-being, launched in 2016 by the Henry M. Jackson Foundation with Clearinghouse, VA, and DoD researchers, called The Veterans Metrics Initiative (TVMI). It was against this group of veterans that evaluators compared a cohort of Onward to Opportunity participants. Simplified, the evaluation posed the question: Did Onward to Opportunity participants fare better in certain employment outcomes than other veterans who transitioned out of the military at the same time, had similar characteristics, and did not take the program?
This impact evaluation is significant for several key reasons:
- It demonstrates IVMF’s ongoing commitment to building and delivering evidence-based programs, ensuring that we do our best to validate our approach using available program data. We intuitively feel that we deliver highly valuable programming to our participants, but we and our philanthropic partners want to know for certain.
- It informs future data collection efforts for both O2O and other areas of the Institute. Evaluations of this scale shed light on how we think about program outcomes and on how we collect and store outcome data.
- It is virtually unheard of in the nonprofit space, making it a true differentiating factor when funders and prospective participants are searching for high-quality programs to invest in or join.
- It proves, statistically, that participation in Onward to Opportunity leads to higher starting salaries and an increased likelihood of leaving a job for a better opportunity, with an especially strong benefit for the junior enlisted population.
The purpose of Onward to Opportunity is to improve service members’ (and military spouses’) career prospects post-service. Proving that program participation leads to higher starting salaries validates a major component of this purpose. The evaluation also showed that O2O participants who left their post-program jobs did so for positive reasons (i.e., a better job opportunity), indicating that they are well-equipped to move up their career ladders and have gained employability through the program, rather than just entry into a first job. It was even more significant that we saw increased benefits for junior enlisted participants, who tend to experience more employment difficulties post-transition.
There are, of course, other valuable program outcomes, such as employability, job attainment and retention, and training and certification, but we were not able to include all of those in this evaluation. We were limited in certain areas by historical data collection methods and by how outcome data aligned between the TVMI dataset and Onward to Opportunity. This highlights the importance of thoughtful and consistent collection of program data for performance and impact evaluation efforts. Additionally, since 2019, O2O has made several key program improvements that enhance the benefits demonstrated in this evaluation:
- Fully digitized Onward to Your Career content, now available to all participants (in-classroom and online)
- Expanded recruitment efforts and tailored program offerings for junior enlisted populations
- Virtual and extended reality tools for participants to simulate civilian job interviews
- Added 20 new learning pathways (50+ total)
Evaluations of this caliber are not possible without generous funding and strong partnerships. IVMF is grateful to the Schultz Family Foundation for providing the financial support that enabled this evaluation to move forward, and to Penn State’s Clearinghouse for Military Family Readiness, which has been a ready research and evaluation partner to the IVMF for many years. Without their considerable expertise and comparison-group dataset, we would not have been able to achieve a study design of this level of statistical rigor. We aim to build upon this evidence base in the future for continuous quality improvement through expanded program offerings, an enhanced learner experience, increased completion rates, and improved employment outcomes.