One of the few things both sides of the political aisle can agree on is getting better performance out of government programs, as seen in the move toward evidence-based policymaking under prior administrations. The federal government could lead by example, suggests Brookings Fellow Andrew Feldman, by creating a bipartisan team focused on improving the performance of federally funded programs, giving states the freedom to implement innovative programming while shouldering more accountability for results, and reducing hurdles to program evaluation while encouraging the incorporation of data analytics into regular reporting. In a time of new federal spending priorities, budget shortfalls, increased need, and much uncertainty, states must get serious about investing in programs that work by actively incorporating outcomes research into their policymaking.
A report from the Pew-MacArthur Results First Initiative, How States Engage in Evidence-Based Policymaking: A National Assessment, assessed states' levels of commitment and action in using research to guide decision-making related to behavioral health, criminal justice, juvenile justice, and child welfare policy. The study scored each state and the District of Columbia on the extent to which they incorporated research findings into policy, including defining categories of evidence, conducting program cost-benefit analyses, and identifying specific funding for evidence-based programming. According to the brief, 50 states have taken some sort of action by allocating funding for programs supported by research findings, while 42 states report outcomes in the budget annually. Just 17 states compare program outcomes and costs.
While Washington, Utah, Minnesota, Connecticut, and Oregon lead the nation in evidence-based policymaking, Pennsylvania is one of 11 “established” states, with 13 evidence-based policymaking actions (three advanced and ten minimum) across the four policy areas studied. According to the assessment scorecard, Pennsylvania uses advanced research-driven policy actions most often in the juvenile justice sector.
Read more about the levels of evidence-based policymaking and individual state scorecards in the report, available on the Pew Charitable Trusts website. Case studies are also available on how to design contracts and grants that require outcomes reporting tied to program performance.
This is an unexpected follow-up to my last post. I just heard about a nonprofit losing a hefty grant at renewal time, due primarily to a lack of reported outcomes. There was measurement – lots of data on process and organizational performance metrics – but not much to demonstrate the difference the program made in the lives of participants. This kind of news is disheartening.
My first thought is – how did it get to that point? Were the grant terms a surprise sprung on an unsuspecting organization at the last moment? Was any mention of measuring program outcomes waved off by executives who preferred to discuss ways to scale up at the next funding cycle?
That said, I am pretty certain that…
- the funder/s made their reporting criteria and protocol clear;
- the program administration and staff were dedicated to their mission and conducted outreach and activities according to their model;
- people who experienced the program gained something from it;
- the nonprofit thought that they were collecting data that showed the impact they made on participants and in the community.
So what went wrong in that story I heard? I’ll never know. No one accepts a grant award with the expectation of a giant hole in their final report, but if there are questions about program application, geographic distance between sites, and/or irregular communication, measurement can and will get lost in the shuffle. Here are some steps you can take to prevent a similar situation from happening to your organization.
- Update your data collection plan. The outcomes listed in a column on a chart in your program materials will not measure themselves. What are you currently collecting that may also fit as an indicator of your expected results? Can you create a measure to better capture a specific change expected in the program participants?
- Make expectations clear and follow up regularly. Keep staff up-to-date on data collection with a matrix that lays out indicators, data sources, person(s) responsible, and timeline. Have a monthly check-in call to report on progress and address questions and other issues around data collection.
- Have patience. It will take a while to get used to a shift from collecting process metrics (still important – don’t stop doing that) to outcomes data. But if you have a plan ready to go, you can work out any knots early in the funding period rather than panicking at report time.
A study from the U.S. Department of Agriculture Economic Research Service suggests that grants to rural-based organizations are on the decline. The report, Foundation Grants to Rural Areas from 2005 to 2010: Trends and Patterns by John Pender, examined data on grants from the Foundation Center (of at least $10,000, awarded by the largest private and community U.S. foundations between 2005 and 2010), the National Center for Charitable Statistics, the Census Bureau, and USDA’s Economic Research Service to identify patterns in grant distribution to rural communities in the United States.
Although 19 percent of the country’s population is located in rural areas, Pender concludes that grant funding “to rural-based organizations accounted for 5.5 percent of the real value of domestic grants by large foundations during 2005 to 2010, with a slight downward trend (based on Foundation Center data on grants by the largest 1,200 to 1,400 foundations).” A random sample of large foundations found that 6.3 percent of the total value of grants awarded in 2010 went to organizations in rural areas. Analysis using a sample of small foundations found the rural share of total grant value went from 7.5 percent in 2005 to 7 percent in 2010. During this time period the majority of grants to rural communities came from independent foundations.
Other findings from the study:
- The average dollar value per person of grants from large foundations to rural organizations was $88, versus $192 per person in metro counties.
- Counties with more college-educated residents (even when grants to universities and students were removed from the sample) received more grants per person.
- Rural organizations received more grants related to higher education, environment, and recreation/leisure than their urban counterparts.
Report Citation: Pender, John L. Foundation Grants to Rural Areas From 2005 to 2010: Trends and Patterns, EIB-141, U.S. Department of Agriculture, Economic Research Service, June 2015.
Recent high-profile hirings and movements to tweak what philanthropy “looks like” aside, new data indicate an emerging trend in grantmaking: the decline of Black professionals within the field. The Association of Black Foundation Executives (ABFE) and members of the Black Philanthropic Network teamed up to take a deeper look at why Black professionals were leaving the philanthropic arena, where they ended up, and what organizations could do to address this pattern.
Main findings from the report, The Exit Interview: Perceptions on Why Black Professionals Leave Grantmaking Institutions:
- 72 percent of respondents (the majority of whom had been or currently were in a leadership position at a grantmaking organization) believed that leadership roles for Black professionals were not substantial within philanthropy
- 22 percent stated they were “pushed out” of their recent position in philanthropy
- 48 percent agreed or strongly agreed that employment outside of a philanthropic institution allowed for more on-the-ground work and contact with the community; another 32 percent agreed somewhat
- Over 60 percent of respondents left philanthropy for employment with a nonprofit organization
Additional study findings, perspectives from former foundation professionals, a look at regional differences in urban philanthropy (including Pittsburgh), and recommendations regarding organizational leadership, accountability, and professional growth are available in the complete report at the ABFE website.