"Slower tech development" can be about ordering, gradualness, or distance from now

By MichaelA🔸 @ 2021-11-14T20:58 (+47)

Epistemic status: Shower thought quickly written (once dry). Partly just a reframing of existing ideas. See also other work on differential progress. My examples are oversimplified, and I don’t necessarily endorse the intervention ideas mentioned.

Summary

Let’s say, for example, that we want certain major advances in AI or synthetic biology to occur 50 rather than 5 years from now. I think there are three major reasons why we might want that - three different “active ingredients” that could account for the potential benefits of slower technology development:

1. Ordering: the technology is developed after, rather than before, specific risk-reducing developments.
2. Gradualness: the technology is developed in many small steps rather than a few big jumps.
3. Distance from now: the technology is simply developed a longer time from now, so that more of the risk-reducing work happening by default occurs before the “deadline”.

I think it would often be useful to explicitly distinguish between these reasons and consider how much we care about each in a given case, because they suggest different interventions and different factors to consider. I give some examples below.

Ordering

We might think it would be good if a given technology (let’s give it the imaginative name “technology X”) were developed after some other specific development(s), because those other developments reduce the risks from technology X. For example, we might want various developments in AI capability to occur after rather than before various developments in AI safety, alignment, policy, or governance.

For this, absolute distance from now doesn’t matter in itself, but rather serves as a useful proxy - since, generally speaking, the longer it will be until technology X is developed, the more likely it is that the risk-reducing development(s) occur first. For example, we might not mind whether technology X is developed 5 or 50 years from now if we believed that, either way, the risk-reducing developments were equally likely to occur first.

The more we care about ordering, the more we might be interested in:

Gradualness

We might think it would be good if technology X were developed gradually rather than suddenly/“discontinuously”. That is, roughly speaking, we might want it to be the case that the development proceeds in many small steps rather than a smaller number of big jumps (separately from how long from now those steps/jumps occur). See also AI Takeoff and Strategic considerations about different speeds of AI takeoff.

I think there are basically four key reasons why more gradual development could be good:

For gradualness, as with ordering, absolute distance from now doesn’t matter in itself, but rather could serve as a somewhat useful proxy - that is, a technology being developed further in the future could serve as some evidence that it will be developed more gradually. (Though the opposite can also be true - e.g. if AI software improvements happen further in the future, that could increase the chance that there’s a large hardware overhang at that point, which could increase takeoff speeds.)

The more we care about gradualness, the more we might be interested in:[2]

Distance from now

Finally, we might think it would be better if the technology were developed a longer time from now, for reasons unrelated to how gradually it’s developed or whether it’s developed before/after specific risk-reducing developments occur. I think the main reason for this is if various risk-reducing efforts are already occurring or are expected to occur in future by default (not just in response to initial steps of a gradual development process), such that extending timelines would “buy more time” and mean that more such work occurs by the “deadline”.[3]

There are two ways this is distinct from “ordering”:

The more we care about distance from now, the more we might be interested in:

My thanks to Neil Dullaghan, Ben Snodin, and James Wagstaff for helpful comments on an earlier draft.


  1. See also the idea of “nearsightedness” in The timing of labour aimed at reducing existential risk. ↩︎

  2. We could also in theory try to accelerate some initial steps, if we were somehow justifiably convinced we could do this without thereby also accelerating later steps to a similar extent. One reason that condition could hold is if we think a certain risky technology (a) would be rapidly developed by or with the assistance of sufficiently advanced AI systems but (b) is very unlikely to be developed before then. If so, then nearer-term steps towards that technology have little effect on when the technology will reach a particularly risky stage of maturity, but could still inspire and inform risk-reduction efforts. But I expect that sort of scenario to be relatively rare, and I think anyone considering accelerating initial steps based on that sort of logic should seriously consider downside risks related to possibly worsening the order of developments and reducing the distance between now and the technology’s development. ↩︎

  3. Another possible reason it could be better if a technology were developed a longer time from now is if we think the world is simply becoming safer, more cooperative, more stable, or similar over time for reasons other than “risk-reducing efforts”. For example, we might think international relations, cooperation, and governance will gradually improve for reasons related to things like a desire to facilitate profitable trade and improve near-term health outcomes, and this will happen to also mean that technological developments will be less risky if they happen later, once such processes have had longer to play out. ↩︎


tessa @ 2021-11-15T17:43 (+5)

I like this distinction! Trying to find examples from biotechnology:

Ordering: you'd prefer cheap benchtop DNA printers to be developed after decent screening mechanisms, Ă  la SecureDNA or Common Mechanism

Gradualness: environmental deployment of gene drives, maybe? (mostly for the "more time with more clarity" reasons of wanting a fair bit of time to observe how these work in practice)

Distance from now: germline gene editing of humans (people like Doudna have often called for a "society-wide conversation" + more time to develop norms for this before we deploy it)

willbradshaw @ 2021-11-14T21:20 (+3)

Thanks, I find this distinction helpful!

Minor: my brain really wants to interpret "order" here in the sense of "law and order" rather than "order of operations". I interpreted the title in this sense. Maybe try for a synonym? ("ordering" springs to mind)

MichaelA @ 2021-11-15T08:36 (+3)

Yeah, good point + suggestion, thanks! I've now switched to "ordering". "Sequence" could also perhaps work.