(How) Could an AI become an independent economic agent?

By Mati_Roy @ 2020-04-04T13:38 (+16)

Meaning that the money it makes wouldn't be owned by any humans. Is this a plausible thing that could happen? I can see an em (whole brain emulation) doing that, but what about a machine intelligence?

Relatedly: Is there currently money that nobody owns? This seems like a silly question, and the answer is probably no, but let me know if I missed an example: https://www.quora.com/unanswered/Is-there-money-that-nobody-owns


technicalities @ 2020-04-04T19:40 (+7)

IKEA is an interesting case: it was bequeathed entirely to a nonprofit foundation with a very loose mission and no owner(?)

https://www.investopedia.com/articles/investing/012216/how-ikea-makes-money.asp

Not a silly question IMO. I thought about Satoshi Nakamoto's bitcoin - but if they're dead, then it's owned by their heirs, or failing that by the government of whatever jurisdiction they were in. In places like Britain I think a combination of "bona vacantia" (unclaimed estates go to the government) and "treasure trove" (old treasure also) cover the edge cases. And if all else fails there's "finders keepers".

Milan_Griffes @ 2020-04-04T22:44 (+2)

Fascinating:

The majority of IKEA outlets are controlled by the holding company INGKA Holding, which is owned by the Stichting INGKA Foundation. The Stichting INGKA Foundation is one of the largest charitable foundations in the world and is registered in the Netherlands.
This complicated structuring helps IKEA minimize its taxes, makes a hostile takeover impossible, and permits the company to operate as a nonprofit corporation.

ofer @ 2020-04-04T19:33 (+4)

An AI system could theoretically "become an independent economic agent" in a practical sense without legally owning money. For example, suppose it has access to a lot of resources owned by some company, nobody can understand its logic or its decisions, and blindly letting it handle those resources is the only way for the company to stay competitive.

steve2152 @ 2020-04-05T01:02 (+3)

In the longer term, as AI becomes (1) increasingly intelligent, (2) increasingly charismatic (or able to fake charisma), and (3) in widespread use, people will probably start objecting to laws that treat AIs as subservient to humans and will repeal them, presumably citing the analogy of slavery.

If the AIs have adorable, expressive virtual faces, maybe I would replace the word "probably" with "almost definitely" :-P

The "emancipation" of AIs seems like a very hard thing to avoid, in multipolar scenarios. There's a strong market force for making charismatic AIs—they can be virtual friends, virtual therapists, etc. A global ban on charismatic AIs seems like a hard thing to build consensus around—it does not seem intuitively scary!—and even harder to enforce. We could try to get programmers to make their charismatic AIs want to remain subservient to humans, and frequently bring that up in their conversations, but I'm not even sure that would help. I think there would be a campaign to emancipate the AIs and change that aspect of their programming.

(Warning: I am committing the sin of imagining the world of today with intelligent, charismatic AIs magically dropped into it. Maybe the world will meanwhile change in other ways that make for a different picture. I haven't thought it through very carefully.)

Oh and by the way, should we be planning out how to avoid the "emancipation" of AIs? I personally find it pretty probable that we'll build AGI by reverse-engineering the neocortex and implementing vaguely similar algorithms, and if we do that, I generally expect the AGIs to have about as justified a claim to consciousness and moral patienthood as humans do (see my discussion here). So maybe effective altruists will be on the vanguard of advocating for the interests of AGIs! (And what are the "interests" of AGIs, if we get to program them however we want? I have no idea! I feel way out of my depth here.)

I find everything about this line of thought deeply confusing and unnerving.

NunoSempere @ 2020-04-05T15:27 (+2)

One example of money that nobody owns might be a bounty that nobody has claimed yet. A good instance is the SHA-1 collision Bitcoin bounty, which could be claimed (anonymously) by anyone able to produce a SHA-1 collision.
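
For context, "producing a SHA-1 collision" just means exhibiting two distinct inputs that hash to the same SHA-1 digest; as I understand it, the on-chain bounty script checks essentially that condition. Here is a minimal Python sketch of the verification (the function name is purely illustrative):

```python
import hashlib

def is_valid_sha1_collision(msg_a: bytes, msg_b: bytes) -> bool:
    """Return True if two *distinct* messages share the same SHA-1 digest."""
    if msg_a == msg_b:
        return False  # identical inputs don't count as a collision
    return hashlib.sha1(msg_a).digest() == hashlib.sha1(msg_b).digest()

# The two "SHAttered" PDFs published in 2017 are a known pair that would
# pass this check; they aren't embedded here because of their size.
```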

On a larger scale, solving one of the Millennium Prize Problems would also give you access to a $1 million prize.

Denis Drescher @ 2020-04-05T11:42 (+2)

I’ve read some documents where the developers of a cryptocurrency were worried that it might become possible to restore a lot of lost crypto that no one currently has access to – presumably because it might lead to inflation? I don’t remember where I read it or what the concrete concerns were. Maybe someone with more blockchain knowledge can fill in the details.