Our research process: an overview from Rethink Priorities’ Global Health and Development team

By Rethink Priorities, Melanie Basnak🔸, Greer Gosnell, Ruby Dickson, jenny_kudymowa, Tom Hird, bruce, JamesHu, Erin Braid @ 2023-03-20T17:02 (+63)

Summary

Rethink Priorities’ Global Health and Development team is a multidisciplinary ten-person team conducting research on various global health, international development, and climate change topics. So far we have mostly produced “shallow”-style reports for Open Philanthropy, though we have also worked for other organizations and conducted some self-driven research. This post shares our current research process, in the hope of making our research as transparent as possible.

About the team

The Global Health and Development (GHD) team is one of the newer departments at Rethink Priorities (RP). It officially formed in Q3 2021, and throughout 2022 the team grew from the initial four hires to its current ten members. Our team consists of two senior research managers (Tom Hird and Melanie Basnak) overseeing eight researchers of different seniority (Greer Gosnell, Aisling Leow, Jenny Kudymowa, Ruby Dickson, Bruce Tsai, Carmen van Schoubroeck, James Hu, and Erin Braid). GHD team members have expertise in economics, health, science, and policy, and bring experience from academia, consultancy, medicine, and nonprofit work.

Our past research reports

Rethink Priorities is a research organization that strives to generate impact by providing relevant stakeholders with tools to make more informed decisions. The GHD team’s work to date has mainly been commissioned by donors looking to have a positive impact. Since its inception, the team has completed 23 reports for five different organizations/individuals, as well as two self-driven reports. We have publicly published four of these reports: 

  1. How effective are prizes at spurring innovation?
  2. Livelihood interventions: overview, evaluation, and cost-effectiveness
  3. The REDD+ framework for reducing deforestation and mitigating climate change: overview, evaluation, and cost-effectiveness
  4. Exposure to Lead Paint in Low- and Middle-Income Countries

Whenever possible, we want to disseminate our findings to maximize our impact. We intend to publish 13 of the remaining 19 reports that we have completed but not yet shared publicly.[1] Going forward, we hope to publish reports within three months of their completion.[2]

Most of our past reports (78%) have been commissioned by Open Philanthropy (OP). The projects we typically do for OP are “shallow” investigations looking into specific cause areas (e.g., hypertension, substandard and falsified drugs). These reports usually contain the following:

We have also done other types of work (for OP and others), including red-teaming (providing an outside skeptical challenge to existing work or ideas), investigating specific uncertainties that remained after a previous report on a topic, and writing exploratory/strategy reports on relevant research within the effective altruism (EA) space.

Our research process

Our workflow

Most of our projects involve collaboration among two to three researchers of different seniority. We typically ensure that there is one senior researcher per project to act as “project lead,” handling most of the coordination and ensuring, along with the manager, that the project is on track.[3]

Our commissioned projects usually kick off with a brief from the client that contains research questions that guide and structure our research. For internal projects (and some commissioned projects), the managers put together the briefs.

Most of our research projects, regardless of their nature or topic, involve the following components:

The amount of time spent on a given project depends on factors like its scope and the number of researchers involved. The average project has involved about 60% of two full-time researchers’ time over the course of five weeks (roughly six researcher-weeks), though some projects have taken just one to two weeks.

Our reports undergo several rounds of internal review. During these reviews (often in the middle and at the end of each project), the manager overseeing the project thoroughly reviews drafts. Often, the other manager (and sometimes a researcher not involved in the project) also acts as a reviewer. Reviews have usually taken place ~two days before the draft or final report was due, allowing some time for the researchers to address outstanding comments, doubts, or concerns. In the context of commissioned research, we send this version of the report to the client.

We then spend some extra time finalizing and polishing the report for publication. This step involves checking for consistent formatting, reaching out to experts to ensure their views are represented accurately and securing permission to quote them publicly, adding an editorial note and an acknowledgments section, and conducting a final (and particularly thorough) round of internal review.

The timeline of a typical project

Below is an example timeline for a typical project to date:

Throughout the course of the project, we have recurring team meetings to discuss progress, and we may reach out to the client via email or have weekly check-in calls with them to ensure short feedback loops.

Some general principles

Across topics and project types, there are some underlying principles that remain constant:

Future developments

Our research process has been evolving and will continue to do so. To ensure our research continually improves in rigor and thoroughness, we periodically revisit our processes. As our emphasis shifts toward internally driven research, the features and format of our reports, as well as our methodological approaches, could also change.

We aim to incorporate relevant aspects (e.g., assumptions, moral weights) of research outputs from other organizations if we think they are well supported and will improve the conclusions of our reports.

We have begun to assemble guides related to some of our primary research components. For example, we are currently working on a cost-effectiveness analysis guide to converge on a more unified and replicable framework. In the spirit of transparency and collaboration, we hope to eventually make our internal guide publicly available.

We mentioned above that our reports go through several rounds of internal review. We would like to encourage and participate in external review processes in the future, for instance with researchers at other global health, development, and climate organizations and/or with academics who have relevant expertise. We imagine this being a collaborative endeavor, where other researchers review some of our work and we review some of theirs.

Contributions and acknowledgments

This post was written by Melanie Basnak with feedback from the full GHD team. We would like to thank Adam Papineau for copyediting and Rachel Norman for reviewing the post and providing useful suggestions. If you are interested in Rethink Priorities’ work, you can sign up for our newsletter. We use it to keep our readers updated about new research posts and other resources.


 

  1. ^

     Some of our reports cannot be published because we have not secured our clients’ permission to do so, and there are good reasons to withhold some of them. Other reports are very niche, and we do not think readers would get enough value out of them to justify the time we would invest in preparing them for publication.

  2. ^

     Our publication process has been delayed in the past due to the limited size of our team, with researchers spending most of their time tackling new projects as soon as previous projects were completed. With more staff, we are now making progress toward shortening the window between project completion and publication.

  3. ^

     This is not always the case. Three projects to date have been carried out by a single researcher, and four were completed without a senior researcher on board.

  4. ^

     For more on reasoning transparency, see this research report by Luke Muehlhauser of OP.


justaperson @ 2023-03-20T23:56 (+8)

Thanks for sharing. I'm not a professional researcher, but I spend a fair bit of time researching personal projects, areas of interest, etc., and enjoy learning about different exploration frameworks and processes. As a generalist myself, it can sometimes be difficult to know if you're adding signal or noise to a picture you've yet to fully envisage -- particularly where a high level of outside domain or technical knowledge is necessary.

In my experience, beneficial answers are often the result of pinging the right sources with the right queries. This alone can be a difficult chain to establish, but there's a deeper layer that strikes me as paradoxical: in most cases, the person/team/org seeking knowledge is also the arbiter of information. So...

bruce @ 2023-03-21T09:37 (+13)

Thanks for engaging! I'll speak for myself here, though others might chime in or have different thoughts.

  • How do you determine if you're asking the right questions?
    • Generally we ask our clients at the start something along the lines of "what question is this report trying to help answer for you?". Often this is fairly straightforward, like "is this worth funding" or "is this worth spending more researcher hours exploring". We will often push back or add things to the brief to make sure we cover what is most decision-relevant within the timeframe we are allocated. An example of this is when we were asked to look into the landscape of philanthropic spending for cause area X, but it turned out that non-philanthropic spending might also be pretty decision-relevant, so we suggested incorporating it into the report.
    • We have multiple check-ins with our client to make sure the information we get is the kind of information they want, and to give us opportunities to pivot if what we find raises new questions that might be more decision-relevant.
  • What is your process for judging information quality?
    • I don't think we have a formalised organisational-level process around this; I think this is just fairly general research appraisal stuff that we do independently. There's a tradeoff between following a thorough process and speed; it might be clear on skimming that a study should update us much less because of its recruitment or allocation methods, etc., but if we needed to, e.g., run an MMAT appraisal on every study we read, this would be pretty time consuming. In general we try to transparently communicate what we've done in check-ins with each other, with our client, and in our reports, so they're aware of limitations in the search and in our conclusions.
  • Do you employ any audits or tools to identify/correct biases (e.g. what studies you select, whom you decide to interview, etc.)? 
    • Can you give me an example of a tool to identify biases in the above? I assume you aren't referring to tools that we can use to appraise individual studies/reviews, but to something one level above that?
    • RE: interviews, one approach we frequently take is to look for key papers or reports in the field that are most likely to be decision-relevant and reach out to their authors. Sometimes we will intentionally aim to find views that push us toward opposing sides of the potential decision. Other times we just need technical expertise in an area that our team doesn't have. Generally we will share the list with the client to make sure they're happy with the choices we've made, which is intended to reduce doubling up on the same expert, but it also serves as a checkpoint, I guess.
    • We don't have audits, but we do have internal reviews, though admittedly I think our current process is unlikely to pick up issues around interviewee selection unless the reviewer is well connected in this space, and it will similarly only pick up issues in study selection if the reviewer knows specific papers or has strong priors about the existence of stronger evidence on the topic. My guess is that the likelihood of such audits making meaningful changes to our reports is sufficiently low that if one took more than a few days it just wouldn't be worth the time for most of the reports we are doing. That being said, it might be a reasonable thing to consider as part of a separate retrospective review of previous reports! Do you have any suggestions here, or are there good approaches you know about / have seen?
justaperson @ 2023-03-23T06:21 (+1)

Thanks for your explanations!

Re: Questions

Apologies… I meant the questions your team decides upon during your research and interview processes (not the initial prompt/project question). As generalists, do you ever work with domain experts to help frame the questions (not just get answers)?

Re: Audit tools

I realize that tools might have sounded like software or something, but I’m thinking more of frameworks that can help weed out potential biases in data sets (e.g., algorithmic bias, clustering illusion), studies (e.g., publication bias, parachute science), and individuals (e.g., cognitive biases, appeal to authority). I’m not suggesting you encounter these specific biases in your research, but I imagine there are known (and unknown) biases you have to check for and assess.

Re: Possible approach for less bias

Again, I’m not a professional researcher, so I don’t want to assume I have anything novel to add here. That said, when I read about research and/or macro analysis, I see a lot of emphasis on things like selection and study design, but not as much on the curation or review teams, i.e., who decides?

My intuition tells me that — along with study designs — curation and review are particularly important to weeding out bias. (The merry-go-round water pump story in Doing Good Better comes to mind.) You mentioned sometimes interviewing differing or opposing views, but I imagine these are inside the research itself and are usually with other academics or recognized domain experts (please correct me if I'm wrong). 

So, in the case of say, a project by an org from the Global North that would lead to action/policy/capital allocation in/for the Global South, it would seem that local experts should also have a “seat at the table” — not just in providing data — but in curating/reviewing/concluding as well.

Oscar Delaney @ 2023-03-21T09:49 (+5)

Great! I am curious why publishing has been so slow - I would have assumed it is easiest to put it up roughly immediately, while the project is fresh in your mind and before the research is out of date. Also, I was pleased to see that the time estimates stack up pretty well in my ballpark calculation:

Research supply = 1.5 years * 48 work weeks/year * 7 researchers = 504 researcher-weeks
Research use = 6 weeks/report * 3 researchers * 23 reports = 414 researcher-weeks

Which is pretty close for a calculation like this, I reckon :)

bruce @ 2023-03-21T10:46 (+5)

Thanks for this! Yeah, the research going out of date is definitely a relevant concern in some faster-moving areas. RE: easiest to put it up ~immediately - I think this would be true if our reports for clients could just be copy-pasted into a public-facing version for a general audience, but in practice this is often not the case, e.g. because the client has underlying background knowledge that would be unreasonable to expect the public to have, or because we need to run quotes by interviewees to see if they're happy with being quoted publicly, etc.

There's a direct tradeoff here between spending time turning a client-facing report into a public-facing version and just starting the next client-facing report. In most cases we've prioritised the next client-facing report, but it is definitely something we want to think more about going forward, and I think our most recent round of hires has helped with this.

In an ideal world, the global health team would just have a lot of unrestricted funding so we could push these things out in parallel, in part because it is one way (among many others we'd like to explore) of increasing the impact of research we've already done, and also because it would provide extra feedback loops that can improve our own process and work.

Oscar Delaney @ 2023-03-21T23:18 (+4)

Thanks, makes sense re funding and tradeoffs. I think it would be understandable if you decided for some fraction of your research projects that it would be too much work to write up for a public audience; my guess is that there is something of a bimodal distribution where writing it up immediately or never are the best options, and writing it up later is dominated by writing it up immediately. Also, there may already be something like this somewhere that I have missed, but (except of course for any secret/extra-sensitive projects) it seems low cost and potentially quite valuable to put up a title and perhaps just a one-para abstract of all the projects you have done/are doing, so that anyone else researching a similar topic can reach out, or even deprioritise researching it if they know you already have and are just yet to publish.

bruce @ 2023-03-22T07:23 (+5)

it seems low cost and potentially quite valuable to put up a title and perhaps just a one-para abstract of all the projects you have done/are doing

This is a great suggestion, thanks!

Vasco Grilo @ 2023-03-24T06:38 (+3)

Thanks for sharing your process!

We aim to incorporate relevant aspects (e.g., assumptions, moral weights) of research outputs from other organizations if we think they are well supported and will improve the conclusions of our reports.

Since you mention moral weights, are you considering addressing the effects on animals? I think it would be quite important. I estimated:

Melanie Basnak @ 2023-05-30T14:47 (+5)

Hi Vasco, I apologize for the delayed response. Because of capacity constraints, we can’t always address all comments, so we prioritize them based on relevance/importance and upvotes.

To answer your question, we don’t currently address the effects of the interventions on animals. As we mention in the post, most of our work to date has been commissioned. Because of this, the questions we seek to answer and the scope associated with those questions are often decided by the client (though we only work with value-aligned clients, on topics we think are relevant and could be impactful). So far, our clients haven’t wanted us to assess the effect of potential interventions on animals, and we haven’t done so.

If we encountered prospective clients interested in this topic, or if RP was interested in conducting internal research on it, this would likely fall under the new Worldview Investigations Team, since their mission and expertise make them better positioned to tackle this question. I encourage you to stay tuned to their future research and invite you to join our newsletter in case WIT publishes on topics of interest to you! Thank you for your interest and for sharing your estimates.

Vasco Grilo @ 2023-05-30T16:21 (+2)

Thanks for the update, Melanie!

Carl Otto Schell @ 2023-03-23T11:01 (+1)

This looks great!

How do you select projects and how are you funded?
Do you do commissioned work or pro bono work, or both?
Are you a business or an NGO?

What would be the (ballpark) cost of a 6-week project?

Melanie Basnak @ 2023-03-23T13:57 (+1)

Hi Carl, thank you!

How do you select projects and how are you funded? 

We work a lot with Open Philanthropy. We believe in their mission and see a clear path to impact through them. We are value-aligned, and are usually also aligned in terms of topics of interest/topics we think could be impactful to look into. We have a long-term arrangement with them, and they commission projects from us. We also work with other clients, usually on a project basis. For these other clients, projects have come about through a mix of them asking for a specific project, us pitching a specific project, or a combination (e.g. each party shares a list of projects). We decide to move forward with a commissioned project if we think it could be impactful (either because we are aligned with the funder and see the path to impact through their decision-making, because we think the topic is important and publishing research on it could be impactful, or, usually, both).

Do you do commissioned work or pro bono work, or both?

Our team mostly does commissioned work, though we have started doing some internal research, which is self-driven but which we hope will be helpful to the community. We would like to do more of it, but we need more unrestricted funds to do so.

Are you a business or an NGO? 

We are an NGO.


What would be the (ballpark) cost of a 6-week project? 

This depends on the size of the organization commissioning the project, and on whether it's a standalone project or part of a longer-term contract with them.