Can you have an egoistic preference about your own birth?

By Mati_Roy @ 2020-07-16T03:14 (+5)

Ex.: a preference for personally being born or not being born.

For simplification, I propose we first assume the instant birth of an adult human.


Mati_Roy @ 2020-07-16T03:18 (+7)

It seems to me like you can't.

I think I can imagine someone who doesn't want to live, and so it might end up equivalent to wanting to die as soon as they are born. But in that case, living 2 minutes would be twice as bad as living 1 minute. I don't see the "first minute" / the birth as having a qualitative difference. I think it would be possible in principle to create a mind that cares more about the first minute, but that still wouldn't literally be a preference about the birth itself. And in any case, I doubt humans have such preferences.

Preferences seem to be about how you want the/your future to be (or how your past self wished its future would have been). But being born isn't something that happens *to* you. It happens, and *then* things start happening to you.

You could have an altruistic preference against creating other minds, but it wouldn't be an egoistic preference / it wouldn't directly affect you personally.

Related thought experiment

I create a mind that is (otherwise) causally disconnected from me (and no other minds exist). That mind wants to create a flower, but won't be able to. It's their only preference. They don't have a preference about their existence.

Is it bad to have created that mind?

It doesn't personally affect anyone. And they personally don't care about having been created (again: they don't have any preference about their existence). So is it bad to have created them?

See related thread on the Effective Altruism Polls Facebook group.

Mati_Roy @ 2020-07-17T19:29 (+1)

follow-up question

imagine creating an image of a mind without running it (so it has 0 minutes of experience, but is still there; you could imagine creating a mind in biostasis, or a digital mind on pause)

would most self-labelled preference utilitarians care about the preferences of that mind?

if the mind wants and does stay on pause, but also has preferences about the outside world, do those preferences have moral weight? to the same extent as the preferences of dead people?

MichaelStJules @ 2020-07-17T23:07 (+2)

What does it mean for it to have a preference if it's never been run/conscious? Is it a matter of functionality/potential, such that if it were run in a certain way, that preference would become conscious? In what ways are we allowed to run it for this? I'd imagine you'd want to exclude destroying or changing connections before it runs, but how do we draw lines non-arbitrarily? Do drugs, brain stimulation, dreams or hallucinations count?

It seems that we'd all have many preferences we've never been conscious of, because our brains haven't been run in the right ways to make them conscious.

I wouldn't care about the preferences that won't become conscious, so if the mind is never run, nothing will matter to them. If the mind is run, then some things might matter, but not every preference it could have experienced but won't.

I think there are some similarities with the ethics of abortion. I think there's no harm to a fetus if aborted before consciousness, but, conditional on becoming conscious, there are ways to harm the future person the fetus is expected to become, e.g. drinking during pregnancy.

Mati_Roy @ 2020-07-17T19:29 (+1)

I do think it's possible for a mind to not want things to happen to it. I guess it could also have a lexical preference to avoid the first experience more than the subsequent ones, which I guess would be practically equivalent to not wanting to be born (except for the edge case of being okay with creating an initial image of the mind if it's not computed further).

Mati_Roy @ 2020-07-17T19:28 (+1)

Seth Nicholson wrote as a comment on Facebook:

I don't think this argument works. "I have a preference for someone to travel back in time and retgone me" is perfectly coherent. It is, as far as we know, not physically possible, but why should that matter? People have preferences for lots of things that they can't possibly achieve. Immortality is a classic.

I responded:

I don't think "time" is fundamental to the Universe, but let's say it is. By some "meta-time", you will (in the future) go back to the past. You would still have existed before you went back in time.