Philosophy hypotheticals

superrific

Master of the ZZLverse
Messages
12,540
Let's do some hypotheticals here.

1. Suppose AI advances to the point where it can run a brain scan on you and reproduce your neural circuits perfectly in a neural network. Talking to the neural network is exactly like talking to you. Now someone gets out a gun to shoot you. The argument is that he won't kill you, because you're still alive in the neural network. Should that be a valid legal argument, or should he be charged with murder? Note that you willingly participated in that digital self-upload. Should you fear death?

2. Suppose now that AI advances even further to where it can start embedding that network into your mind. Two sub-hypotheticals here. First, suppose it would require many chips in parallel. They are inserted one by one, and each time some of your natural brain circuitry is removed and the neuronal connections are connected to the chip. After a year, your brain has been fully replaced. Are you still you? If not, when did you stop being you? And should you fear being shot, assuming that the neural network can be re-implanted in another body (human or otherwise)?

Does it change if the operation is completed all at once -- like, you go in for surgery with a brain, and you come back with a chipset that is identical in function?

3. Suppose, once that happens, you can also set up digital interfaces with other people. In particular, you can link your brain to that of your spouse, allowing you to share any thoughts and experiences as if the spouse lived them, and vice versa. Let's assume for now that you would only share the memories you want to share, but present sense impressions can be accessed at will. The downside is that it's irreversible; once you're linked to the spouse, that link will always exist and can't be removed. Would you do it?

Questions 1 and 2 would be familiar to anyone who has studied the history of philosophy (or, depending on the syllabus, taken a philosophy course itself), but this is a new twist.
 
I think at some level most believe that humanity depends somewhat on body-first reactions. We would be different without fight-or-flight fear or sexual attraction. I guess I'm generalizing to everything that might be endocrine-based and is shaped by our fragility and ultimately our mortality. Is my chip brain ever going to be hangry or hungover?

For Q3 - Seems like a set up for constant expectation of empathy and thus exhausting :)
 
Oh, the endocrine bit messes up the hypo. LOL. Let's assume that it can replicate everything that constitutes "mind" or "mental states."

Constant expectation of empathy, sure. But also empathy would be much much easier.
 
Hypothetically speaking, if we advance to the point where we can create a digital version of ourselves, we would never be able to upload our own consciousness. So we wouldn't be aware of our existence beyond our own death. It'd be cool for our loved ones, or creepy.
 
Expand on this point. I think our intuitions align here but I'm not sure
 
In the hypothetical described, the digital version created would be cognisant of all memories made prior to its creation. Thus, it could interact with your family and friends just as you would, essentially allowing you to live beyond your death. However, your human self would not be cognisant of any memories or events occurring after your death. So the only person who wouldn't be aware of your continued existence is you. Your death would be no different from your perspective.
 
So, suppose the person who was copied gets shot. Who cares? The person is dead so s/he doesn't care. Everyone else sees the person as alive. No harm, no foul?
 
Seriously? No one misses that person's touch or smell or taste when they kiss? Have I missed that we're talking about some sort of cyborg? You don't think people would be creeped out by a faux dead friend? It's their visceral reactions you can't digitize.
 
Haha. You're exposing the law professor in me. Everything gets abstracted. Fair point. I have no answer to it. So score one for you, and now let's put these factors aside for a moment and explore the "mind" issues.
 
Still stinks to high heaven for me. Seems entirely too much like enshrining the memories and not learning the lessons of living and losing.

It's hard to answer this without addressing the concept of a soul. If you do, what are the implications of a duplicate human without a soul? What happens to further relationships with family and friends as they die? Isn't the whole thing futile? If there is a soul, isn't hanging on to a physical manifestation of the person on earth like trying to keep a butterfly's cocoon viable?
 
I wish that I had proper time right now to write some stuff on this. I just glanced at this board, which I'm not on as much because of being tired of the Zen troll thing, but there is no more interesting topic realm than the one you raise here. Possibly I will come back to this Saturday with a few more thoughts, but for now...

Cutting edge research is beginning to work towards the rudiments of this sort of tech:



I'll take your thought experiments in reverse order, which ranks my interest from least to most. On (3), I would not engage with that without knowing far more about how it works and has worked for others. It seems likely to cause immense confusion, say over the needed separation of minds, perhaps in ways we can't guess ahead of time. On initial thought, there are damaging possibilities of a kind of dissolution of the personal natures of both singular minds.

On (2), Neuralink claims steps in this direction, but questions and controversies surround their work.


China is working intently in this field. Given the Trump catastrophe, America is being left in its tech dust in dozens of fields now.

On (1), to answer your thought experiment first: Yes, it's murder, like killing a twin while the other lives on.

(2) above; wow, this is deep diving in the thought experiment pool.

None of the above gets to your interesting notions though, and the philosophical question in (2) is the ancient Ship of Theseus paradox. For those who don't know: if you replace all the lumber and parts of the ship a bit at a time, is it the same ship when nothing original remains? I have no reason to suspect that, if the turnover of chips (or chip sets) for neuron function works, the result would not be "me" in any noticeable or truly important way. I can give perhaps a real sense of disturbance to complacency if I explain that it's never actually the same "you" that results each time you wake from sleep, and worse than that... but I have zero time for that digression now (preview of coming attractions, possibly).

However, (!) this begins to get into what I take to be about the deepest philosophical conundrum that exists, or at least that I know of. It has an origin of sorts in Descartes' "evil demon" notion, that the world is possibly a ruse, and we can't trust "reality," but a better modern version examines the perception of one's own conscious existence through things like the Star Trek transporter question.

This is not just a question of whether what we experience is real; it transforms into the far larger implications of teleporting a consciousness, and that deals with the philosophical conundrum of the role and value of the "where" of the location of the information stream* that is conscious thought. Imagine your physically instantiated consciousness (neuron arrangement in a fraction of a second) is copied with perfect (or perfectly adequate) fidelity, and then it arrives in an environment, like a simple room, that is perfectly identical to the one the original is in. Are there two conscious minds in this scenario... or just one? There is no divergence of thought or experience; both streams have zero difference from external or internal input and influence at this point. If there is one mind in two locations, then mind only has a location at the point of divergence of thought (from randomness, at some level, percolating upward to, say, a single neuron firing differently). Or is it the case that multiple tracks of identical synchronous thought mean multiple fully identical minds?

I don't know of a more difficult philosophical problem than this one.

* It's not a "stream" (something else I will possibly come back to), but go with that as a model for now.
 
The country boy's version of the Ship of Theseus.

That's my granddaddy's axe. My daddy replaced the handle and I replaced the head but it's my granddaddy's axe.

Heard that one twenty years earlier than the first one.
 
Exactly. Gets to the idea that the naming of things is the bestowed identity of things.

I glossed over the Star Trek transporter question: would you be afraid to get into it? The person who comes out the other side thinks everything is fine and it worked, but did the person who stepped into it die? The person who stepped out has named themselves the same "me," and the trip a success in transporting one person, just as the axe is named.
 