
It's More Than Money - Impact Measurement

How do you know your nonprofit is making a positive social impact? What is the difference between outputs and outcomes? Why does measuring impact matter?

On August 21, 2020, we had the privilege of sitting down with Eric Barela and Andrew Means of the Salesforce.org Global Impact & Engagement team to discuss these questions and more. While we had prepared a full list of questions ahead of time, we were delighted to find they wanted to get down to what matters most right away. The conversation flowed as if we were old friends connecting over a shared interest in really doing #moregoodbetter.

Here are some key highlights:

"Every nonprofit exists because there's something about the world as it is today that they want to see changed or preserved for tomorrow. We want to change the natural state of the world - it's going in one direction, and we want to nudge it in a slightly different direction. We need to understand how well we're doing that - or if we’re doing it at all. . . Regardless of the size of your organization, are you creating the change that you set out to create? How can you communicate that change? How can you improve that change? How can you continue to ensure that you're creating the world you want to see created?" ~Andrew Means

"It’s important to make our work useful for the broadest number of organizations and that linkage is technology. As you start collecting more information and providing more data, the questions become more sophisticated. That’s where the technology piece comes in - by bringing these tools to the masses." ~Eric Barela

". . . We’re waking up as a world to the fact that not just money matters. To understand the performance of our companies, governments, and nonprofits, we need to go beyond finances and include things like environmental and social impacts. I think impact is going to be front and center as a society for the next number of years. There’s a tipping point that’s being reached where I think we’re going to see significant changes, where we see companies competing for talent and customers by using their impact in the world. I think we’re going to see funding move towards outcomes. Not every small NGO needs to have a data scientist on staff, but we can build tools that enable them to collect information and connect to communities and evaluate the impact they’re having." ~ Andrew Means

Read on for the full interview or download the PDF! 


Tim: One of my first questions is, what is it that your team does? What do you do on that team? And how does that fit within what Salesforce thinks is important?

Andrew: We're members of the Global Impact Engagement Team. Brian Komar is our Vice President of Impact. I'm the Senior Director of Global Impact Data Strategy. Eric's the Director of Measurement and Evaluation. As part of the Global Impact Engagement Team, we care about the impact of Salesforce.org and our customers. We want to understand the impact that we as an organization are having and support our customers to better understand the impact that they're having. My role is to figure out how our customers can do more with their impact data. How do we help customers collect information to build their capacity and their ability to use data and technology to drive their missions forward?

Eric: If you replace “customers” with “Salesforce.org,” that is basically my role. I'm focused on our own impact management capacity, making sure that we're able to measure our own impact and really drive improvement from the impact data.

Tim: What data are you looking at? What does that data tell you?

Eric: As more data comes along, people start asking more sophisticated, more difficult questions about what it means. We have over 40,000 customers - okay, what does that actually mean? I'm helping Salesforce.org understand its impact in terms of outcomes, as opposed to outputs. We want to understand how a customer uses Salesforce technology to increase efficiency and effectiveness.

Tim: Why does this matter for nonprofits of all sizes? It’s obvious for the big, huge humanitarian ones, but I don't think anybody on this call thinks that it's just the largest organizations for whom impact matters. . .

Andrew: Every nonprofit exists because there's something about the world as it is today that they want to see changed or preserved for tomorrow. We want to change the natural state of the world - it's going in one direction, and we want to nudge it in a slightly different direction. We need to understand how well we're doing that - or if we’re doing it at all. We have a limited number of resources with which to accomplish those changes. Because of the limited resources, we should flow resources to organizations that are doing the work well; oftentimes those include very small organizations. There are some very small organizations that have a tremendous impact on the people that they serve and the communities in which they're working. Regardless of the size of your organization, are you creating the change that you set out to create? How can you communicate that change? How can you improve that change? How can you continue to ensure that you're creating the world you want to see created?

Eric: I absolutely agree with Andrew. Prior to coming to Salesforce, I was working at a 27-person nonprofit. More than a few people said, "Eric, we have a purpose and we're trying to work toward that purpose. Isn't that enough?" No. You can say you're doing a whole bunch of different things, you can say you're changing the world, but unless you have evidence showing that you actually are, you can't really say that. Having evidence that you are doing what you say and being able to do something with that evidence is really important. Every organization, regardless of size, can (and hopefully does) make a positive impact. It might not be for an entire country, it might just be for a neighborhood. It might be changing a small group of people's lives. It's important to have impact evidence, regardless of size.

Angela: I really appreciate the distinction you made between an outcome and an output. We think about structuring Salesforce in a way that enables people to track outputs, and we want to learn more about how we can enable Salesforce to also track outcomes. Not just "How many people are trained or how much food was distributed?" but also "What's the change? What is this affecting in the real world?"
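To make that output/outcome distinction concrete, here is a minimal sketch in Python - invented names and numbers, not an actual Salesforce data model - showing how the two kinds of records differ: outputs count what a program delivered, while outcomes record a measured change in the people served.

```python
from dataclasses import dataclass

@dataclass
class Output:
    program: str
    metric: str      # e.g. "people trained", "meals distributed"
    count: int       # how much activity happened

@dataclass
class Outcome:
    program: str
    metric: str      # e.g. "food-secure households (%)"
    baseline: float  # value before the program
    current: float   # value observed after the program

    @property
    def change(self) -> float:
        # The outcome is the change in the world, not the activity count
        return self.current - self.baseline

meals = Output("Food Bank", "meals distributed", 12_000)
security = Outcome("Food Bank", "food-secure households (%)", baseline=54.0, current=63.0)

print(meals.count)      # an output: 12,000 meals went out the door
print(security.change)  # an outcome: +9.0 points of food security
```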

Andrew: As we understand impact, there's an element of trying to understand what would have happened without the intervention. If I'm running an after-school program, and I'm trying to help kids graduate from high school, some of those kids would have graduated even without my program. Looking at how many of them graduated high school isn't actually telling me my impact. My impact is how many of those graduates wouldn't have finished high school without the program. Impact measurement is a little bit of a misnomer because you can't actually measure impact. We can count things like our outputs. We can count and observe our outcomes. But measuring impact requires us to estimate and to bring our best methodological work to understanding what would have happened had we not run an intervention.
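Here is a minimal worked example of that counterfactual logic, using made-up numbers and a simple comparison group as a stand-in for the more rigorous methods Andrew alludes to:

```python
# Hypothetical after-school program data (all numbers invented for illustration)
participants = 200        # kids in the program
grad_participants = 170   # program kids who graduated

comparison = 200          # similar kids not in the program
grad_comparison = 140     # comparison kids who graduated

observed_rate = grad_participants / participants     # 0.85
counterfactual_rate = grad_comparison / comparison   # 0.70 - estimate of what would have happened anyway

# Estimated impact: graduates attributable to the program, not the raw count
additional_graduates = (observed_rate - counterfactual_rate) * participants
print(f"Estimated additional graduates: {additional_graduates:.0f}")  # ~30, not 170
```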

Eric: That’s where technology comes into play. You have to really be thinking about the differences between outputs and outcomes. Technology is meant to clarify, not confuse. If you go into a project thinking “Well, if we put everything into Salesforce, it’s going to tell us what our outputs and outcomes need to be. It’s going to tell us everything,” you’ll end up with very little. You can implement Salesforce well, but you’ll still have to interpret the data to get increased impact, outcomes, and outputs. Using outputs as a proxy for impact is not enough anymore. It really isn’t - and it shouldn’t be. You have to strive for more and strive to really know what it is you’re doing. Metrics that only tell you “we’re doing great” are vanity metrics.

Tim: What do you see as the trajectory for converting from vanity metrics to metrics with a deeper, better understanding, even if that understanding isn’t great marketing? I’d like to hear your input on the benefit of better metrics.

Eric: It’s important to be able to measure impact. That's tough, especially in a world where things are measured in terms of the financial bottom line. Resources need to be put into a shift from the financial bottom line to impact. You can't just put everything into a model or an algorithm and have it spit out all the answers. The work that we're doing is social science, which is grounded in the real world and requires more resources. The payoff, however, is changing impact measurement from being a marketing exercise to being able to say "Here’s what all our customers do, here’s who they were able to support, who they were able to change, and what that change looked like." Impact measurement is collecting that deeper information at a high level of rigor. Marketing can use it, but it’s not just marketing. Yes, it does take resources to track impact. It’s going to take funders who really stress the importance of impact, especially in the nonprofit sector. If funders are just calling for output numbers, then that is what the sector is going to get. If funders understand the importance of impact, they’re going to be calling for it and others will follow suit. There’s a little bit of peer pressure; if the cool kids are starting to do this, so should we.

Tim: Do you anticipate a tipping point? And would you put a date on it?

Eric: We’re going a bit into prediction as opposed to measurement. As an evaluator, I want to be very, very measured, but I do think we’re getting to a tipping point. I don’t know exactly when it’ll be. It’s going to be imperative to society for it to happen sooner rather than later. I’d like to think that, with the thinking around the SDGs, we will hit that tipping point and propel ourselves to meet those goals by 2030. I can’t give you an exact date, but I would hope to see something sooner rather than later.

Tim: For an impossible question, that's totally reasonable. Andrew, do you mind giving a 30-second resume? I’d like to provide some context for why you have such a multifaceted view of impact.

Andrew: Thank you. I’ve worked inside nonprofits, for example as the Director of Research Analytics at the YMCA, helping them think through using data to measure and improve their impact as well as their organizational effectiveness. I’ve worked with funders; I ran uptake.org. I’ve consulted with philanthropists. I started a data science consulting firm working with organizations all over the world, building algorithms and models to drive impact forward; helped co-found an organization that does data governance; and started Data Analysts for Social Good, Good Tech Fest, Do Good Data, and a bunch of convenings in the space.

I believe that there’s change going on. Eric spoke to this a little bit as well, but I think what’s exciting at this time is that we’re waking up as a world to the fact that not just money matters. To understand the performance of our companies, governments, and nonprofits, we need to go beyond finances and include things like environmental and social impacts. I think impact is going to be front and center as a society for the next number of years. There’s a tipping point that’s being reached where I think we’re going to see significant changes where we see companies competing for talent and customers by using their impact in the world. I think we’re going to see funding move towards outcomes. Not every small NGO needs to have a data scientist on staff, but we can build tools that enable them to collect information and connect to communities and evaluate the impact they’re having.

I think one of the big challenges in our sector is that we put all of the costs for social change on NGOs, whereas society as a whole is benefiting, and data is allowing us to see that and better align those costs. I like to think data collaboration is the key here and that we’ll see the emergence of new kinds of models. I hope to see intermediary organizations that help sectors or communities do their impact management and evaluation work. Here's an example: I remember years ago reading about Ronald McDonald Houses. Hospitals that are next to Ronald McDonald Houses actually have much better performance outcomes than those that don’t, and there's a financial incentive for hospitals to have these positive health outcomes. Because they see this benefit, the hospitals could be helping underwrite some of the cost. Or there’s Walmart, which is the largest recipient of food stamps in this country. We should be going to Walmart and telling them “It’s in your business interest to ensure that everyone eligible for food stamps is receiving them because you’ll see an increase in your business.” As more non-financial indicators are sought out, investors want to know the return on their investment, whether that be financial or not.

Tim: You’re hitting one of my recent points. I wrote a blog at four in the morning when it was bugging me and I couldn’t sleep. I pulled a Jerry Maguire like “this is what's wrong with the industry” and just put it out there. Now I wonder if I should’ve put it out there. But regardless, it’s about the cost of our language of "donor" instead of "investor." A donor expects nothing in return. That’s what’s beautiful about it. But as an economist, I don’t trust that and it makes me frustrated. I want people to give because they expect a return. Just to be clear, nonprofits are accountable and they’re some of the best people in the world. What I am saying is that we train our donors to think “I just want a thank you,” which means that very qualified experts are writing thank you notes with their years of experience and multiple degrees because that’s where the financial incentive from the donor lies - in the personal connection to the organization. Investors are investing to get a return. I’ll step off my soapbox for a minute and let you reply.

Andrew: Language matters so much. As a donor, once the money leaves my account, I’ve accomplished what I need to be a donor. As an investor, I care about what I’m getting back and what the world is getting back. I think that shift in language matters a lot. This is also where data and technology become really critical in helping to do that at scale. You can’t prove to every $50 donor exactly where their $50 went and what it accomplished. We have to be honest about the fact that impact is expensive. If we look at some of the organizations we hold up as the most impactful, they’re spending a lot of money to achieve that impact. As donors, we spread our money much too thinly. If you ever want to have any measurable impact as a donor, you need to concentrate your dollars. If you want to just give money and build relationships, you can do that. But if you want to have a measurable impact and act as an investor, then you need to concentrate your resources in a way where you can actually move the needle on that one thing. Social change is hard. Social change is expensive. We need to identify what those costs are, monitor who's doing the work efficiently and effectively, and then channel resources to them.

Tim: Delivering that last mile in extreme poverty is outrageously expensive, and that’s often ignored. There isn’t a quick differential on impact around delivery, so where infrastructure is lacking, delivery, whether it’s UPS or clean water, is going to be more difficult and expensive.

Andrew: Absolutely. We need to be realistic as investors about what it is we’re getting back. As an arts education funder, we sometimes send kids to the symphony. I don’t need to know that I’m producing the next Yo-Yo Ma because a kid went to the symphony one time when he was eight. We also need to understand that there are some things that are simply good for our society and should exist. You can’t necessarily measure that kind of good through change occurring in the world.

Tim: I’d like to talk about why this work is so resource-intensive and what the costs related to it are, which is rarely discussed.

Eric: There aren’t many organizations that can bring in someone like me and our team. There’s a lot of training and experience that goes into what we do. It's a big investment. Not even well-funded organizations necessarily have the wherewithal to bring in an in-house team to do this work. I understand the value of my work, but also understand there has to be a way to democratize it. It’s important to make our work useful for the broadest number of organizations and that linkage is technology. As you start collecting more information and providing more data, the questions become more sophisticated. That’s where the technology piece comes in - by bringing these tools to the masses.

Andrew: Technology is going to be how we lower the barrier to entry for doing this kind of work well. It’s almost taking the burden off the program providers and automating or sharing that burden across other similar organizations. That’s one of the nice things about data science: whether I’m looking at 1,000 rows or 100,000 or a million rows, as long as they’re structured in the same way, the marginal cost of adding information isn’t that huge. Sometimes we might be so focused on proving impact that we forget there’s a lot of evidence of impact we can gather before we ever get to proving. Focusing on randomized controlled trials and quantitative social science methodology can help us prove, within a reasonable amount of doubt, that impact is occurring. Some might say that’s the gold standard of proof that an organization is having an impact, whereas I like to think there are a whole lot of ways to think about the evidence that interventions will be effective and impactful. Evidence scales whether you’re a one-person or a million-person group. The quality and quantity might vary between those sizes, and the burden of proof may vary, but every organization can create a culture of evidence. Any size organization can think in terms of evidence. What is the evidence that you would present to show that your organization is effective?
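As a small illustration of that "evidence scales" point, here is a sketch with simulated data (not from any real program) where the exact same summary code runs unchanged on a thousand records or a million:

```python
import random
import statistics

def graduation_summary(outcomes):
    """outcomes: list of 0/1 flags, one per participant."""
    return {
        "participants": len(outcomes),
        "graduation_rate": round(statistics.mean(outcomes), 3),
    }

random.seed(42)
small_org = [int(random.random() < 0.8) for _ in range(1_000)]      # ~1K records
large_org = [int(random.random() < 0.8) for _ in range(1_000_000)]  # ~1M records

# Same structure, same code, same marginal effort to analyze
print(graduation_summary(small_org))
print(graduation_summary(large_org))
```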

Tim: If you’re going to have any evidence, and if you’re going to use information to make decisions, your data has to tell the truth. Just like you were saying about evidence - tech implementations are not enough. Implementing Salesforce is not a magic bullet. As a company, Now IT Matters spent last year evaluating our services: which ones are effective and which clients we can model services around. We found, over and over again, it’s behavior that trumps data every time. Unless you get at the behavior layer of humans, the culture can’t change. Without the correct culture and behavior, the data won’t be input correctly. Until the data tells the truth, you’re not going to access it for information, you’re not going to access that information for insight, and you’re not going to turn that insight into impact. Do you think there’s a shift in where that’s headed and more recognition around the behavioral aspects?

Andrew: The question of data truth is an important and interesting one. My perspective and experience is that we’re shifting to collect much more streaming information. We’re collecting data exhaust and data in seamless ways. Instead of having to go out and take surveys, now we’re collecting observational information that’s continuous. I’m seeing interesting uses of IoT technology to collect streaming information.

What do we mean by saying that the data is true? In Chicago, some people built a Twitter bot to find where there were restaurant violations for, let’s say, sanitation issues. They found all these violations in typically wealthier neighborhoods with younger people, because those were the people going on Twitter to complain about restaurants. It wasn’t an accurate representation of the world, but it was a representation of a part of the world. In the same way, data is always inaccurate in some way or another. The question is, is it telling us something useful? Many organizations don’t feel like they have an accurate representation of what’s going on.
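To illustrate the kind of skew Andrew is describing, here is a toy simulation with invented rates: the true violation rate is identical in both neighborhoods, but the Twitter "data" only surfaces violations where people are likely to tweet.

```python
import random

random.seed(0)

true_violation_rate = 0.10  # identical everywhere in this simulation

neighborhoods = {
    # name: (number of restaurants, chance a violation gets tweeted about)
    "Wealthier, younger neighborhood": (300, 0.60),
    "Lower-income neighborhood":       (300, 0.05),
}

for name, (restaurants, tweet_rate) in neighborhoods.items():
    actual = sum(random.random() < true_violation_rate for _ in range(restaurants))
    tweeted = sum(random.random() < tweet_rate for _ in range(actual))
    print(f"{name}: {actual} actual violations, {tweeted} visible on Twitter")
```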

Tim: There are lots of questions around accuracy. We haven't even started on implicit bias or on how first-world M&E work can be. What I'm talking about at this point is more basic: in the field that says "Birthdate," is there a date, and do you trust it? In a "Phone Number" field, is there an email address instead? Until an organization can cross that bar, the rest just never lands. Many organizations have been told that an implementation will solve that, and then two years later they find out it wasn't exactly true. In our experience, we have to be the ones saying, "In the end, this is what we’re shooting for." It starts with behavior, and we lay out a plan.
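For readers wondering what those field-level checks might look like in practice, here is a minimal sketch with hypothetical field names and rules (not a Now IT Matters or Salesforce tool) that flags exactly the problems Tim mentions:

```python
import re
from datetime import date

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
PHONE_RE = re.compile(r"^[\d\s()+.-]{7,}$")

def check_contact(record):
    """Return a list of data-quality problems for one contact record."""
    problems = []

    birthdate = record.get("Birthdate")
    if not isinstance(birthdate, date):
        problems.append("Birthdate is missing or not a real date")
    elif birthdate > date.today():
        problems.append("Birthdate is in the future")

    phone = (record.get("Phone") or "").strip()
    if EMAIL_RE.match(phone):
        problems.append("Phone field contains an email address")
    elif phone and not PHONE_RE.match(phone):
        problems.append("Phone field doesn't look like a phone number")

    return problems

# A record with exactly the issues described above
print(check_contact({"Birthdate": "1985-02-30", "Phone": "maria@example.org"}))
```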

Andrew: What is so exciting about the direction that you’re taking your organization is that you’re thinking beyond the technology to the behaviors and culture underneath. That’s what’s needed. To really create change and value from information, you need to look at behavior and you need to be able to trust the data. I think that the innovative work you are doing at Now IT Matters is huge in helping that shift for organizations.

Eric: Ultimately, it’s the people of the organization who are making these decisions, pushing the work forward, and getting the work done. It’s incredibly important, therefore, to not only have the data tell the truth, but also for your organization to actually be able to tell the truth from your data. For this to be done well, it comes down to the culture and the mindset. I will have failed in my work if Salesforce.org sees our impact data as Eric’s data, because it’s not. It’s the organization’s data and there is a lot of cultural work that needs to be done on a daily basis to help people see the value of the data. It’s another failure of mine if I come to the end of my work and show all these findings and people are like “What? Huh? I never knew that before.” If it’s a complete surprise to people, then I haven’t been doing my work in terms of transparency and bringing people along. There’s space for partners and for technology to bring people along in that way as well.

Tim: Thanks for the vote of confidence! To wrap things up, I’ve got a "hit list" of things you won’t know until you’ve done this with hundreds of clients over a decade. So from each of you, can you take one last minute and give us one of your “We Don’t Know What We Don’t Know” items?

Andrew: I think the biggest thing I know now is the importance of change management and behavior change alongside technology and data. I come from a quantitative social science background. I tend to want everything to be a social science experiment where I can control all the variables. Recognizing that investments in culture and behavior are important has been one of my biggest learnings.

Eric: Starting my career in a school district, that was something I learned really early. For me, it’s not just about using technology; it's about when you bring in the technology. There needs to be a deep understanding of what an organization is trying to do, and that model needs to be established first. It’s important to put in the work to build a conceptual framework.

Tim: Thank you both so much for your time, your thoughts, and for dedicating your careers to impact. It's so important and I'm glad to see it emerging as a theme around the ecosystem.