Making the Most of Each Budget Dollar

One of the few things both sides of the political aisle can agree on is getting better performance out of government programs, as seen in the move toward evidence-based policymaking under prior administrations. The federal government could lead by example, suggests Brookings Fellow Andrew Feldman, by creating a bipartisan team focused on improving the performance of federally funded programs, giving states the freedom to implement innovative programming and shoulder more accountability for results, and reducing hurdles to program evaluation while encouraging the incorporation of data analytics into regular reporting. In a time of new federal spending priorities, budget shortfalls, increased need, and much uncertainty, states must get serious about investing in programs that work by actively incorporating outcomes research into their policymaking.

A report from the Pew-MacArthur Results First Initiative, How States Engage in Evidence-Based Policymaking: A National Assessment, assessed states' levels of commitment and action in using research to guide decision-making related to behavioral health, criminal justice, juvenile justice, and child welfare policy. The study scored each state and the District of Columbia on the extent to which they incorporated research findings into policy, including defining categories of evidence, conducting program cost-benefit analyses, and identifying specific funding for evidence-based programming. According to the brief, 50 states have taken some sort of action through the allocation of funding for programs supported by research findings, while 42 states report outcomes in the budget annually. Just 17 states compare program outcomes and costs.

While Washington, Utah, Minnesota, Connecticut, and Oregon lead the nation in evidence-based policymaking, Pennsylvania is one of 11 “established” states, with 13 evidence-based policymaking actions (three advanced and ten minimum) across the four policy areas studied. According to the assessment scorecard, Pennsylvania uses advanced research-driven policy actions most often in the juvenile justice sector.

Read more about the levels of evidence-based policymaking and individual state scorecards in the report, available at the Pew Charitable Trusts website. Case studies are also available on how to design contracts and grants to require outcomes reporting tied to program performance.

Plotting a Course for 2017

2016 was a year of flipping the script and changing up the status quo. Come tomorrow, it is time to push through our anxiety about what may lie ahead and plot a course to best navigate the unknown terrain of 2017.

But where to start? Some thoughts…

Where will the road take you in 2017?

In December, I always look forward to Lucy Bernholz’s data and philanthropy forecast for the upcoming year, and the insights in Blueprint 2017 are as thought-provoking as those of its predecessors. It is available for download at the Foundation Center’s GrantCraft website.

Diversification of revenue is more important than ever, both across funding sources and within your donor base.

Show the impact of the work you do – the very change your program facilitates at both the client and community levels. It seems to be in fashion to downplay all measurement because quantifying impact can be challenging, what with small samples and scattered cohorts and bias (oh my!). Yes, it is. But demonstrating how a program meets expected and desired goals – the outcomes – is not a clinical trial; it is just good practice. As is using those data to inform and improve services.

In Pennsylvania, as this fiscal year’s budget shortfall grows, all signs point to a doozy of a 2017-18 negotiation process. Structural changes to the current human services system are also on the table, which may signal new opportunities for nonprofits. How can you best advocate for the sector and your organization?

Moving purposefully into the unknown may be less intimidating for a nonprofit when there is a verbal AND a financial commitment to cultivate leadership within the ranks.

On the topic of developing leaders, this is a perfect time to engage in some formative assessment of a more personal nature. As an established or up-and-coming nonprofit leader, how will you look back on 2016 and plan for 2017?

  • Set aside some time to conduct your own career-centered end of year review.
  • Use/create a rubric to determine where you are now and what you should focus on, add, or set aside in 2017. Rubrics consist of a descriptive set of items or elements and a related performance scale. List your goals or expectations for 2016, then rate each one on a numerical scale where each point is defined along a continuum of progress, for example, 0 = “No progress made” while 4 = “Achieved 100%.” Add as much or as little detail to each rating point as needed to accurately capture the situation (a simple sketch of this kind of rubric follows this list).
  • Last January, I worked with Emily Marco on a year-in-review that included a look back at professional and personal events and milestones of 2015 and planning for 2016. She also helped me clarify my goals and identify “action steps” to begin working toward them immediately. Emily is a visual problem solver who excels at helping people organize their thoughts and build a plan of action to achieve their goals. If you are interested in exploring a new way to digest the old and plan for the new, you can learn more about her new online learning experience Relaunch 2017 or contact her for a goal-setting session at Emilymarco.com.
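If a spreadsheet feels like overkill, even a few lines of code can structure the exercise. Here is a minimal sketch of that kind of rubric in Python; the goals and the intermediate rating labels are hypothetical placeholders you would swap for your own.

```python
# A minimal year-in-review rubric. The 0 and 4 labels come from the example
# above; the middle labels and the goals themselves are placeholders.

SCALE = {
    0: "No progress made",
    1: "Initial steps taken",
    2: "About halfway there",
    3: "Substantial progress",
    4: "Achieved 100%",
}

# Replace with your own 2016 goals and honest self-ratings (0-4).
goals = {
    "Complete a professional certification": 4,
    "Present at a regional conference": 2,
    "Draft an evaluation plan for one program": 1,
}

for goal, rating in goals.items():
    print(f"{goal}: {rating} - {SCALE[rating]}")

# Anything rated 2 or below is a candidate to refocus on, rework, or
# set aside in 2017.
carry_forward = [goal for goal, rating in goals.items() if rating <= 2]
print("Revisit in 2017:", "; ".join(carry_forward))
```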

Note:  This post is not sponsored.  I do not receive any compensation or services for mentions or links included in the post.

PA Nonprofits May Want to Prepare for a Rough 2017-18 Budget Process

A few years ago, a study from the Forbes Funds, The Pittsburgh Foundation and the United Way of Allegheny County examined the impact of nonprofits (minus the health care systems and institutions of higher education) on Pittsburgh’s economy. It found that nonprofits, ranging from human service agencies to animal rescue organizations, provided over 75,000 jobs for local residents and spent $4.4 billion in the local economy – supporting over 31,000 jobs in other industries. The preventive effects of such community-focused programming saved both lives and tax dollars. Statewide data for nonprofits (including healthcare organizations) confirm the strength of the sector. In 2015, Pennsylvania nonprofits employed over 15% of the state workforce and generated $132 billion in annual revenue.

Unfortunately, even with this proven social and economic impact, there is concern that Pennsylvania nonprofits may once again face a serious threat to their operations during the upcoming budget process. First, the 2016-17 budget was never really balanced, and the expected revenue shortfalls are a reality (first-quarter revenue collections were $200 million short). Second, a pension reform bill supported by Governor Wolf failed to pass in the PA House earlier this week after opposition from unions, including the Pennsylvania State Troopers Association, pushing that debt issue further down the road. Third, the state’s decaying infrastructure, already underfunded, is not going away in 2017-18, and the gas tax residents pay to fund repairs and improvements is being spent on state police instead. Also, 2018 is a gubernatorial election year in Pennsylvania. Could we see the sequel to the 2015 impasse?

You can stay up to date on nonprofit-focused policy and budget news at the Greater Pittsburgh Nonprofit Partnership (GPNP) website’s weekly summary page. The Pennsylvania Association of Nonprofit Organizations (PANO) is another resource for news out of the General Assembly.

Process Metrics are not Outcomes

This is an unexpected follow-up to my last post.  I just heard about a nonprofit losing a hefty grant at renewal time due primarily to a lack of reported outcomes.  There was measurement – lots of data on process and organizational performance metrics – but not much to demonstrate the difference the program made in the lives of participants.  This kind of news is disheartening.

My first thought is – how did it get to that point?  Were the grant terms a surprise sprung on an unsuspecting organization at the last moment?  Was any mention of measuring program outcomes waved off by executives who preferred to discuss ways to scale up at the next funding cycle?

Probably not.

That said, I am pretty certain that…

  • the funder/s made their reporting criteria and protocol clear;
  • the program administration and staff were dedicated to their mission and conducted outreach and activities according to their model;
  • people who experienced the program gained something from it;
  • the nonprofit thought that they were collecting data that showed the impact they made on participants and in the community.

So what went wrong in that story I heard? I’ll never know. No one accepts a grant award expecting a giant hole in their final report, but when there are open questions about how the program model is being applied, geographic distance between sites, and/or irregular communication, measurement can and will get lost in the shuffle. Here are some steps you can take to prevent a similar situation from happening to your organization.

  1. Update your data collection plan. The outcomes listed in a column on a chart in your program materials will not measure themselves.  What are you currently collecting that may also fit as an indicator of your expected results?  Can you create a measure to better capture a specific change expected in the program participants?
  2. Make expectations clear and follow up regularly. Keep staff up to date on data collection with a matrix that lays out indicators, data sources, person(s) responsible, and timeline (see the sketch after this list). Hold a monthly check-in call to report on progress and address questions and other issues around data collection.
  3. Have patience. It will take a while to get used to the shift from collecting process metrics (still important – don’t stop doing that) to outcomes data. But if you have a plan ready to go, you can work out any knots early in the funding period rather than panic at report time.
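To make the matrix in step 2 concrete, here is a minimal sketch in Python; the indicators, data sources, owners, and timelines are hypothetical placeholders, not a template any particular funder expects.

```python
# A minimal data collection matrix: each row pairs an outcome indicator with
# its data source, the person responsible, and the collection timeline.
# All entries below are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class IndicatorRow:
    indicator: str           # the expected change being measured
    data_source: str         # where the evidence comes from
    person_responsible: str  # who collects and reports it
    timeline: str            # when it is collected

matrix = [
    IndicatorRow("Participants report improved job readiness",
                 "Pre/post participant survey",
                 "Program coordinator",
                 "Intake and exit"),
    IndicatorRow("Participants placed in employment within 90 days",
                 "Case management records",
                 "Case managers",
                 "Quarterly"),
]

# Print a simple table to review on the monthly check-in call.
for row in matrix:
    print(f"{row.indicator} | {row.data_source} | "
          f"{row.person_responsible} | {row.timeline}")
```

A plain spreadsheet works just as well; the point is that every outcome in your program materials has a named owner, a source, and a date attached to it.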