Heidi McAnnally-Linz: Using evidence to inform policy

By EA Global @ 2017-06-02T08:48 (+6)

This is a linkpost to https://www.youtube.com/watch?v=PAYVIXlk4rw&list=PLwp9xeoX5p8Pi7rm-vJnaJ4AQdkYJOfYL&index=6


In this 2017 talk, Innovations for Poverty Action's Heidi McAnnally-Linz argues that the evidence movement has been successful at designing and evaluating effective solutions, but that it should now move to mobilizing policy-makers to use those solutions. She also gives examples of IPA's work to influence policy-makers in Zambia, Peru, and Ghana.

The below transcript is lightly edited for readability.

The Talk

Hi, everyone. My name is Heidi, and I work for an organization called Innovations for Poverty Action. We run randomized evaluations in the field, and we've done about 650 of them in over 50 countries around the world over the last 15 years.

So I'm going to start the way I normally introduce IPA, and I'm going to ask you guys what might seem like a pretty silly question. If you had a dollar, and you wanted to buy oranges, you wanted some tasty, good-for-you oranges, which one would you buy? It's not supposed to be a tricky question. It's supposed to be very straightforward, because it is.

So now I'm going to ask something that might seem even simpler to you in this room, but I always get varied answers to it. Let's say you wanted to increase attendance at schools in Kenya. Would you buy kids school uniforms? Maybe the reason kids don't go to school is that they don't have the right clothes or shoes, uniforms are required, they don't have them, so they don't show up. Or would you give them deworming pills, so that they don't feel bad, and they come to school?

Okay, in this room, everybody gets the right answer, right? But you'd be surprised: around the world, whether it's in the Ministry of Education in Ghana or in a room with non-EA philanthropists in San Francisco, the message still isn't quite out there. In this room, though, it's pretty obvious that the difference in additional years of schooling you get per $100 from school-based deworming versus from uniforms is quite significant. I won't go into more detail on that.

But over the last 15 years or so, this movement, all of us together, has really changed the standards for evidence. It's no longer acceptable for charities just to say that their outputs are impactful. This is becoming much more mainstream: all kinds of donors, not just those involved in the EA movement, are now asking charities, and even governments, to say what the impact of their programs is. And increasingly, people have answers.

So this is something you've all been involved in over the years, and a lot of the research behind some of these cost-effective, fundable programs was carried out by IPA. This is just to say: look, we've been a part of this, together with all of you, for a very long time, and we've seen these really pill-like interventions have lots of success, particularly at scale, because they're really cost-effective, they're fundable, and all the analytical brains can get behind them and say, "Yes, we're going to do this."

So we've seen it scale. We're seeing it with Deworm the World, with the Against Malaria Foundation, with GiveDirectly, a lot of this thanks to GiveWell and many others. And that's great, and there's been a lot of impact of this, and I'm in no way arguing that this should change. This is certainly a part of our legacy, and something we're all really proud of.

But what happens when the questions get a little bit more complicated? Let's say instead of attendance, you actually want to increase learning levels. A study that we did in Ghana showed that, when we did the baseline, only 6% of kids in third grade could read a really basic sentence. Imagine your kid being in school for three years, or four years in some cases, and not learning anything.

So learning levels are a real problem. What would you do? Which one would you buy to solve this learning level problem? Would you spend money on merit-based scholarships for girls? Or would you want to re-assign students based on learning level?

So it wasn't obvious to us, when we did these original studies. And by we, I mean that IPA did the implementation of a number of these studies. We also work closely with J-PAL, so you've probably seen some of this from them as well.

But when we think about how we then put this into practice at scale, it becomes a lot more complicated, because it's not just a matter of going a couple times a year into the schools, and giving out deworming pills. If you really want to re-assign students to classes based on learning level, and teach to their level, you’ve got to change the way classrooms are structured, you have to change the way teachers teach, and particularly if you want to do this at scale in Africa, you have to do this at the government level. And that becomes a lot more complicated.

So to take a step back, as I said, we have over 650 studies, randomized trials with a few exceptions, things that complement randomized trials, in over 50 countries and across sectors. The number of these studies that you've all heard of may be slightly higher than for the average person, but when I talk to folks, even here at EA Global, people are only aware of somewhere around five studies beyond the ones I've already mentioned: the deworming one, GiveDirectly's, and so on.

Is that because we didn't find positive results? Or is it because applying those results is a little bit more complicated? I argue that it's the latter. Certainly, not all 650 have positive results, and we've talked about the negative ones a lot; I'm not going to get into that today. But there are a lot more positive results than are being used.

So what are we doing about all of this? How do we actually get these more complicated kinds of programs implemented at scale, so that we're leveraging all of the money and effort that has gone into these 650 studies to get the most cost-effective outcomes at scale? Not just the best pill-like interventions, but things that might change systems and leverage existing aid money, existing donor money, to make those interventions more effective.

So for IPA, for the last 15 years, this has been what we call our theory of change. We see the basic problem in the world as this: there's limited evidence on what works best, and there's limited use of the evidence that is available, and therefore you get ineffective programs and policies. Our solution is twofold as well: we design and evaluate effective solutions, and then we mobilize and support decision makers to use evidence, and bippity boppity boo, better programs and policies.

The first part of our theory of change is really scientific. We've got this down; we've been doing it for 15 years: design and evaluate potential solutions to poverty problems. We know how to run a good randomized evaluation, and we have over a thousand people in the field across 20 countries right now who know how to do that really well.

But we don't really know how to do the second part yet, mobilizing and supporting decision makers to use evidence. My hypothesis is that this is true in EA as well, and I'm looking forward to hearing others speak on how they're influencing policy. At IPA, we've only really started to focus on this, in spite of it being a part of our mission, a part of what we do. We've only been putting actual staff time into it over the last three or four years, and even now that investment is maybe 3% of what our staff are doing.

I think this is the next frontier for the evidence movement, and even what little we have invested is paying off. We're working with governments in all of these countries, investing more in the ones in orange, and we're starting to build relationships, to help think through how to scale interventions with evidence that already exists, and also how to co-create evidence together with these policy makers, answer the questions that they actually have, and create more effective solutions for them in their context.

I'm going to tell you about three different successes in this area, though even in some of these cases I'd say we're only part of the way there. We still haven't figured out how to do this well. But I think there's an opportunity for us, in the EA community, to think more about how to actually use evidence at real scale, and create real change.

So the first example is in Zambia, where we did a really basic study, together with the Ministry of Health. The second is from Peru, where we created kind of a nudge unit. And then the third is from Ghana, where we've been working for almost a decade with the Ministry of Education there.

The first example came about because we've been working in Zambia for quite a few years with a researcher, Nava Ashraf. The Ministry of Health came to her, and to us, because we've been helpful to them before. And they said, "Look, we're going to scale up our program for community health workers. We don't have enough doctors in this country, we don't have enough nurses. We've got to recruit more community health workers. We want to do this in the most effective way possible, but we basically have no budget. What do you think we should do?"

Well, this is a very common problem around the developing world; these workers get very little training. So we said, "Okay, what if we just varied the way you recruit them?" There's a big debate about whether you should recruit people who are more motivated to serve their community, or people who are more motivated to advance their careers. This was a debate the Ministry wanted to settle, so we said, "Well, this is something we can randomize."

So all we did was randomize the recruitment flyer. It was very low-cost in terms of what the government had to do. And it turns out that just switching to the career-focused flyer meant that the community health workers recruited that way were 29% more effective at reaching households, and results are starting to come out showing that it actually leads to better health outcomes for those households as well. As they use this tactic to scale to 5,000 community health workers, we estimate they'll reach an additional 315,000 households, just from this small tweak.

What's really exciting about this story is that not only are they using the results, but now they're saying, "Okay, now we've made this career promise. How do we deliver on this career promise? How do we create effective careers for these folks? Can you help us with evidence for that?"

So this one study is leading to what we're starting to call, at IPA, a culture of evidence-based decision making, and we think that's an exciting success. I'll tell you more when I come back, as I'm going to be in Zambia next month, learning a little bit more about this.

My next example, which we did together with our partners at J-PAL, is working with the Ministry of Education in Peru. The Ministry of Education in Peru has, or at least had, a lot of very sophisticated thinkers, and they wanted to think about low-cost things they could do to improve their system. So we put in some effort, embedded a staff member in the Ministry, and together with researchers, Ministry officials, and folks from IPA and J-PAL, they created this lab. Their goal was really just to ask, "What can we do with the administrative data we have to test small things that we could then use to improve learning outcomes?" Or attendance, or whatever outcome they wanted to improve.

And so we've done quite a few small-scale studies with them in the past few years, and the results are just starting to come out. There are a couple of things that are really exciting about this. The first is that it's institutionalized: we created an institution, a nudge unit, within the Ministry of Education in Peru. So when government changes happen, this is already part of the system. Of course, there's always a risk that these kinds of things get cut, but at least it's there and institutionalized.

Another really exciting thing that's happening from that, is that people who work in the Ministry of Education, who worked with us on this, have now moved to the Ministry of Health, and also to the Ministry of Social Protection, and we're exploring doing labs with those ministries as well.

And then finally, the first results are out. The study was essentially what we call a field replication plus: a replication of a study in the field that tweaked a couple of things. The original study looked at what happens when you give students information about the returns to education: are they more likely to invest in their education? There were big impacts in the Dominican Republic, and we're seeing similar results in Peru, where we also tried a couple of different technology tweaks. So the Ministry is excited to start working on scaling that up. We still don't know exactly how that's going to play out, but it's in process.

And then the third example is in Ghana. Ghana is the place where we've invested the most in policy staff. Like I said, we haven't quite figured out what the right mix of research and policy staff is, and quite frankly, there hasn't been a lot of funding for the policy side, because we've all been so excited about generating the research, which we've done a really great job of.

We started by doing another field replication of a successful program run by an NGO in India, which targets teaching at the level of the child and separates kids based on their learning levels. We did this at a very large scale with the government, with a nationally representative sample, and we saw quite strong results, even though implementation varied and the program was being implemented by the government. It's very impressive.

And we thought "Great. Done deal. We're going to scale." But it's more complicated than that. We were using teaching assistants that were being paid for by the youth employment program, and now that youth employment program is defunct. So where are we going to get these teaching assistants? It's very complicated.

And they said "Look, what we really want to do is what we told you at the beginning. We want to do this model, but we want our teachers to do it."

We said "Okay, well that's why one of the arms of the study was the teacher-led intervention." It wasn't as strong as when the teaching assistants did it, but it still has a positive impact, and it still was cost-effective.

So they said, "Well look, what we want to do is we want to make this even more effective. We want our teachers to lead it, and we want it to be more effective. What can we do?"

So there are even more questions. They are still planning to grow this at scale, but we're now testing how we can use people in the existing system, like district monitors and head teachers, to help improve implementation of the program so that the results are even stronger. We're working together with them on this.

What that led to is all kinds of other questions. It led to, again, what we've been calling a culture of evidence-based decision making. Today we have five ongoing studies with the Ministry, and all of them are influencing what they're doing in their curriculum, in their teacher training, even in their biggest policy decisions about what to fund. Across all five, from pre-kindergarten all the way through secondary, they're calling us with questions, and they're working with us on answering them. We're really in a position where we can help answer these questions and help improve the overall system.

So how do we do this, as IPA and as a community? I think there are a couple of important things to remember about the conditions under which evidence actually gets used. These are based not only on the examples I've just given you, but also on what it took, looking back, to get deworming taken up at such a large scale.

So first, the evidence has to exist and it needs to be credible, which means we want it to be rigorous. We're doing really well on this; we've got the 650 randomized trials. It's everything else that we're struggling with.

It also has to be relevant, and sometimes that takes time. Studies take time, and by the time we get to the end of them, the decision has sometimes already been made. So we need to be thinking together from the start; otherwise we share the results with the policy makers and they go, "I don't really care about that. Thanks for telling me, but this doesn't really help." And so we've learned that co-creation of evidence from the beginning is really important.

And the evidence has to be accessible. Right now, with a lot of these 650 studies, if we're not out there talking about them and writing clear summaries of them, people don't get access to them. And they need access at the right time. There are policy windows: sometimes they open right after a study, and sometimes they come ten years later, and people need the right information at the right time.

You have to get buy-in, from the beginning, ideally, and this is what we've learned the hard way. You have to get it from users and influencers, people who are going to use this information, and critically, the major donors. It's one thing to get the Ministry involved, but if USAID is their main donor, and they're not on board, we're in trouble. There also has to be funding for this. So the fact that the youth employment program got cut, well that really messed up that plan.

And users have to know how to apply it. So I like to think of building a culture of evidence-based decision making as the middle piece between evidence creation and evidence use at scale. It's not so easy, it's more of an art than a science, and we don't like it as much. But it is the critical piece that's going to help us get the ultimate outcomes that we want.

And so at IPA, we're doing this by focusing on the local. We're sharing the solutions, providing technical assistance, doing advocacy, and critically, as you've seen, engaging very deeply, particularly with our government partners, and helping them be able to use and understand evidence. But we've learned that to do this, we have to be on the ground, and we have to be there long-term. That's why IPA, in the past couple of years, has decided not to try to work in all 50-plus countries where we've run studies; instead we have a presence in 20 countries, and offices in 14, and we're really focusing on how we can help with this iterative, collaborative process and help answer next-phase questions, so that the evidence actually gets used along the way.

So to conclude, I don't want any of this to suggest that those of you who were here for the GiveWell talk shouldn't continue giving your money to GiveWell. I'm our policy and communications person, and I think that's the best kind of broad message for EA to recruit more people: get the best bang for your buck, invest in the most effective things, here are the most effective charities, and if you're giving $100, this is where you want to give it.

But I think we, as a community, also have to think about creating tomorrow's impact. We're doing this already by continuing to support really good research and thinking about what the next deworming is; Open Phil is doing a really great job of this. And you're all doing this too, but I think we can do more to help people understand how evidence can make their lives better, how creating evidence can actually be a negative cost to them: if creating evidence helps them save costs, then whatever it is they're doing becomes more cost-effective, wherever you are.

And then finally, I think this is something we're not yet doing. There's serious leverage that can happen if we think about how to actually create policy influence from all of the research that we already have. Small investments could leverage millions of dollars that have already been spent on research, to actually create sustained action, at the government level, in the countries where we all care about people's futures.

So I'm happy to take questions.

Q&A

Question: We had questions along a few different lines, but I'm going to start with some that are a little bit more about the structure of IPA: who exactly works with whom, how you establish your partnerships, that sort of thing. One of the questions asks what your relationship is to developed-world funders. You mentioned that you're funded by USAID in some respects, but you're also working with developing countries, and yet you're a non-profit; you're not technically part of the US government. How does that work out?

Heidi McAnnally-Linz: So the question really is how IPA gets funded, and how we collaborate with funders. I'll divide it into two parts. The answer to the first question, how IPA gets funded, is that we have funding from all of the large organizations that fund research, from the Gates Foundation to USAID to the World Bank, 3ie, and so on, as well as a number of major individual donors.

How do we work with funders? It's different in every country. And one of the things that we have not invested in is working on influencing funders at a global level. I don't know if anybody from J-PAL is here, but I think in terms of the family, they have the comparative advantage to do that a little bit more than we do, because they have a much larger global policy team. Our comparative advantage as an organization is really our presence on the ground, because we're running all these trials on the ground, and we have most of our staff on the ground. We have about 60 in the US, but a thousand people around the world. So we’re really leveraging that local presence.

So for example, I mentioned the Ghana example because we've been working closely there with USAID and a couple of other funders, with varying degrees of success. What we've found is that the larger the funder, the harder it is to get in. Some of the funders that have been really supportive of us are your business networks, your UBS, people from the business world who are thinking about how to create leverage. When you start talking about the larger aid agencies, it gets much harder and more complicated, and requires more sustained investment of time and resources.

Question: I'm particularly curious, and I think a couple of people asked something similar, about the unique spread you have. You said you're not going into literally every country, but you're still spread across quite a few, and that sort of network development seems difficult. At least, I can't imagine dropping into a random country and getting close to the people who are making policies there. That seems like both a sensitive thing and a really difficult place to get your foot in the door. How do you build those initial contacts?

Heidi McAnnally-Linz: So when I talk about the people who are doing this, I'm mostly talking about local staff. Our education policy person in Ghana is a woman who was in the education system before. She's incredibly well-networked, she's personable, and she just bought into why evidence matters. She actually applied for a different job with us, and we were like, "No, no, no. Come over here and do this." And this is what I mean by building an evidence-based culture.

Same kind of thing in Zambia, and in Peru. So that's really what I mean about building the long-term presence. We've only been able to do that because we've invested heavily in capacity-building of our staff, and recruiting people from those countries, and having a long-term presence locally.

Question: I had a question or two also about how you choose which projects to focus on. One person said that they really like the cost-effectiveness work you've done in education, but they haven't found other attempts at cost-effectiveness calculations. They wonder, how do you choose education out of any number of things you could be doing?

Heidi McAnnally-Linz: I think that for a lot of the cost-effectiveness analysis, we can thank our friends at J-PAL. But one of the things that has happened, because IPA is so decentralized and we've worked with over 400 researchers at all kinds of universities, is that collecting cost data has strangely been an afterthought. And so my team said, "Wait a minute, what do you mean you don't have accurate cost data?" We've got impact data; we just don't have cost data.

So we are making a particularly concerted effort now to collect cost data, because doing cost-effectiveness analysis when you have cost data and you have an RCT is pretty simple, all things considered. But good cost data is difficult to collect after the fact, and we usually don't think about it until a study is over.
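
To spell out the arithmetic behind "pretty simple": below is a minimal sketch of a cost-effectiveness calculation, assuming you already have an impact estimate from an RCT and a total program cost. The function and every number in it are hypothetical illustrations, not figures from any actual IPA study.

```python
# Minimal sketch of a cost-effectiveness calculation, assuming you already have
# (a) an impact estimate per participant from an RCT and (b) the total program
# cost. All numbers here are hypothetical illustrations, not IPA results.

def cost_effectiveness(effect_per_participant: float,
                       num_participants: int,
                       total_cost_usd: float) -> float:
    """Return units of impact produced per dollar spent."""
    total_impact = effect_per_participant * num_participants
    return total_impact / total_cost_usd

# Example: an intervention adding 0.1 extra years of schooling per child,
# delivered to 10,000 children at a total cost of $50,000.
years_per_dollar = cost_effectiveness(0.1, 10_000, 50_000)
print(f"{years_per_dollar * 100:.1f} additional years of schooling per $100")
# Prints: 2.0 additional years of schooling per $100
```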

So we have collected more cost data in education, partly because of the excitement around cost-effectiveness in education, and partly because the researchers we've been working with there are already thinking about that. One of the things we're hoping to do with some of the strategic funding we've been raising as an organization is more cost-effectiveness analysis in other sectors.

Question: Might you be able to shed light on what those other sectors might be?

Heidi McAnnally-Linz: Not yet. I mean, health is an easy and obvious one. I think we've been doing some in social protection, particularly with the older ultra-poor graduation studies, and we're doing a lot of replication-plus work on those. So there will definitely be a cost-effectiveness analysis there.

Question: And another thing about choosing projects: do you factor in the difficulty of implementation when you're choosing what to do, rather than just what seems like it would be a good intervention?

Heidi McAnnally-Linz: That makes it sound like IPA has agency in deciding a lot of things! So IPA is in the middle, and we've got three groups to deal with. You've got the donors, the researchers, and the implementing partners. Our job is to bring all of them together to do a study.

Now sometimes the donor pushes us to do something, and we go out and we get the researcher and the implementing partner. Sometimes the researcher is very interested in a particular question, and so he says, "We want to do this. Are there any partners?" So we find the partners. And typically, the researcher can also pretty easily find the money.

So a lot of it is driven by donors. But increasingly, what we want to be doing more of is where the policy maker comes to us and says "This is a question we have" and we then help find the funding, and in a lot of cases, do the research ourselves, or bring our network along.

And in terms of what IPA invests in strategically, a lot of it has been in education, but I think that has to do with our leadership, and what we think the best opportunities have been, historically, not because of any particular other strategy reason.

Question: If you personally, or the people that you work with pretty closely, are trying to make something happen, are you now trying to create new policy partners? If you're saying, "We should do a new project," what's the first action that you would decide to take?

Heidi McAnnally-Linz: For the last year, I had to play an interim development director role for IPA, which was a lot of work. Now that I'm not doing that anymore, my personal work has been building our policy team on the ground. Just in the last year, we've recruited people in Kenya and Zambia to do this work, and we're hoping to recruit more in Burkina Faso and the Philippines, particularly to pursue this strategy. We're working to identify the opportunities where we can have the most impact, and in those countries there are particular projects around which I think we can follow a model similar to Ghana's. So we're focusing on that, and raising money to do more of it. That's my personal work.

Question: Might you be able to talk a little bit more about what it's like on the ground, insofar as you've been involved, or you know other people who have done on-the-ground work? It feels easier to talk in abstract, of these policy partners in Peru or Ghana or somewhere else, but what's the messy actual situation like?

Heidi McAnnally-Linz: I'm just trying to think of the right story. I'm better at the marketing stories, so the more nitty-gritty ones aren't coming to the top of my head right now.

But I hadn't spent time in our Ghana office; other folks had led that work. I was just there in March, and I showed up to what I already knew was a really deep relationship, but I really had no idea how deep until I was actually there. We were co-hosting a conference about evidence together with the Ministry, and they really felt like it was their conference. We asked the director of the Ghana Education Service, which is the implementing arm of the Ministry, "So how did you think our conference went? What did you think?" She said, "Well, you tell me. It was our conference." I just hadn't understood the depth of what was happening.

There's a USAID story along with that. They were also incredibly skeptical that Ghana was even remotely ready for something as advanced as teaching at the right level. They even said to me, "Look, they need to get their teacher attendance right." And I said, "Well, but I'm pretty sure...", and I had to go look at the study to be sure. I was pretty sure that teaching at the right level, even the teacher-led version, increased teacher attendance. And I looked, and sure enough, I was right.

So I went back to them, and I said, "Look, I understand you want to improve teacher attendance. Quite frankly, that's part of what we're trying to do by improving implementation. That means getting teachers to do what they're supposed to be doing, which in a lot of cases means getting them to show up. And we saw that even when the program was poorly implemented, it was doing that, and that was one of the reasons we got the outcomes we did. So let's work together and think about this."

I don't know if he completely came around, but I know that since then, we have had a lot more interaction with him, and with his team, and they're sounding a lot more excited about what we're doing.

And on the flip side, my team on the ground spends a lot of time sitting on the bench outside the Ministry, waiting for somebody to help them get a meeting. In the places where we don't already have this kind of relationship, getting in the door has been incredibly challenging. Once we're in the door, like in Ghana, it's much easier. But there's a lot of time spent waiting for the right person's assistant and asking, "Can you help me out?" So it's not all glamorous.

Question: I imagine we have a couple of cynics in the audience, but I'm just projecting on them that they would say "Well, there's no reason that these governments should care in the first place. Especially if it's not a particularly democratic government, why should they even care about the efficacy of these programs?" Do you find some sort of golden ticket in the midst of a corrupt government? How do you find your way in? Are we just too cynical?

Heidi McAnnally-Linz: There are lots of ways in. Donors are one way, and that's definitely been a way in for us. But you do find your gems, you find your champions, and those champions tend to grow and move around. That's the long-term nature of it, and it's why we don't think of this as a three-year project that's going to change everything.

That said, your skepticism is well-founded. It's a difficult thing, we don't know how to do it well, and it'll fail in a lot of places. But where it doesn't, it'll be high impact. The last thing I'll say is that we did one study which found that health workers did not actually want to take bribes, which surprised us. And I think that speaks to the fact that when we're implementing policy, we're not necessarily working at a very high level. We've had a couple of cases where we've engaged at an incredibly high level, with the President and the Minister himself and all that, but really what we're talking about is changing the way things operate. And more often than not, the people at those levels are bureaucrats who want their jobs to actually accomplish something.

Question: I'm going to finish with a last question, which is presumably from some people in the audience who would be interested in actually helping out this work, not just observing from the sidelines. If you could just drag and drop them into the positions you'd really like to see filled, what would they be doing? Should they be researchers? Should they try to move to one of your on-the-ground offices somewhere? Or should they just donate money?

Heidi McAnnally-Linz: At IPA, we've never struggled to get really good entry-level researchers, RAs who are strong and go on to do PhDs. Certainly, if that's something you want to do, we're a great place to come. Where we've really struggled is getting folks at the management level, people with a couple of years of experience on the ground.

I've been pretty successful at growing the policy and communications team, but people don't really think of us for that. I saw a slide presented here earlier saying that skills are lacking both in management and operations and in policy outreach and marketing, and I think that's true. So if you have any inclination toward those kinds of roles, I'd love to talk to you.