A few weeks ago, the Emmy Award nominations were announced. Normally, this is an event that I ignore with great abandon. However, it appears that this will be a banner year for man-machine interaction. Two futuristic properties got nods: Westworld and the Black Mirror episode 'San Junipero' each picked up multiple nominations.
I've watched all of Westworld Season 1 and thought it offered some of the best discussion of the nature of consciousness and what it means to have self-determination. (Did Maeve get back off the train to find her daughter because she wanted to, or because she was programmed to? Is there a difference?) However, I had not watched San Junipero, though I have seen a few other Black Mirror episodes. So I thought I'd give it a watch.
[Spoilers after the break]
Should We Die?
First off, I did enjoy San Junipero. It tells a nice story and doesn't slap you in the face with alternative lifestyles the way the Wachowski siblings do. (For what it's worth, enjoy your life and love however you want, and I'll do the same.) The 'twist' is handled with grace instead of as some third-act surprise: there are plenty of clues, and the writing never felt like it was hiding anything, but it also never hands the answer to the viewer on a plate.
Where I do take a bit of exception to the story is the ending. Should Kelly have joined Yorkie in San Junipero? There is some thought that, despite the scene interlaced with the credits, she may not have, but for the purposes of this article I'm going to assume that Kelly is there in San Junipero. My problem is that old Kelly (almost) does a good job building a case for actually dying and not [spoiler] uploading. If the person (man or woman) that you loved for forty years, that you built your life around, is not going to be there, can it really be an afterlife worth living? Even if there may be new love? Then she shows up anyway and gives us a quick answer: yes, it is.
I Would Pass Over
Not that this is a personal conflict for me. I would upload myself into this other reality in a heartbeat. After I finish raising my daughter. After I explore more of my current reality. But when my quality of life in this world gets low enough, I would cross over without a second thought.
But that's me. Is there a case for someone to choose to fully, completely, no-safety-net die? Or if the option exists to extend your life and you do not take it, are you committing suicide? Heck, is suicide even a sin in an overpopulated world?
Of course, I don't have an answer to these moral dilemmas. I'm just some unemployed marketing guy with a keyboard. My sophomore philosophy class tells me that this is most likely an individual choice that is highly dependent on each person's life situation.
What I can say with certainty is that these kinds of questions interest me: what are the consequences of technology on the human condition? Black Mirror, as a series, usually does a great job of highlighting these issues and exploring them. I do not feel this episode examined the ramifications as thoroughly as it could have; instead, it chose a crowd-pleasing ending, something the series has avoided in most of its other episodes.
Can It Be Done?
For all of the moral ambiguity that might exist in this episode, none of it really matters right now because we do not have this choice. But will we? Is this even possible?
There are plenty of people working on the problem, including AI-doom speaker Elon Musk with his Neuralink project. Most of these efforts take the approach that if we can map the firing of the neurons in an individual brain, watching over time how that firing changes with different stimuli, then we can start to predict how that brain behaves. Once the model can accurately predict that brain's behavior, the model is thinking like that brain. Read out the memories, dump the whole mess onto a server, apply processing power and, voilà, consciousness recorded.
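If you want a feel for the shape of that argument, here is a toy sketch in Python. To be clear, everything in it is made up for illustration: the 'brain' is a two-neuron stand-in function, and real projects like Neuralink operate at a wildly different scale and in nothing like this way.

```python
# Toy sketch of the "predict the brain" idea: treat the brain as an
# unknown function from stimulus to response, record lots of pairs,
# then fit a stand-in model to those recordings.
import numpy as np

rng = np.random.default_rng(0)

def brain(stimulus):
    """Stand-in for the real thing: a fixed but unknown mapping."""
    hidden_wiring = np.array([[0.8, -0.3], [0.1, 0.6]])
    return np.tanh(hidden_wiring @ stimulus)

# Step 1: record how the "brain" fires under many different stimuli.
stimuli = rng.normal(size=(1000, 2))
responses = np.array([brain(s) for s in stimuli])

# Step 2: fit a model to the recordings (here, plain least squares).
model, *_ = np.linalg.lstsq(stimuli, responses, rcond=None)

# Step 3: if the model predicts the brain's behavior on new stimuli,
# the argument goes, the model is "thinking like" that brain.
test = rng.normal(size=2)
print("brain says:", brain(test))
print("model says:", test @ model)  # close, but not exact: the real
                                    # mapping is nonlinear and the
                                    # model is only an approximation
```

The honest part of the toy is the mismatch in that last line: a model fit to recordings only approximates the brain it watched, and whether a good-enough approximation counts as the same mind is exactly where the hard questions start.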
Only, it has not proven to be quite that easy. Part of the problem is figuring out how to map the brain in the first place. Tools like fMRI exist, but they are only just beginning to probe how the brain responds when we think. Other projects are focused on getting commands out of the brain without first having to go through our physical meat bags.
Maybe the biggest challenge is our memories. An article in Scientific American questions whether we can get those memories out and, even if we did, whether they would be 'you'. The basis of this thought is that we humans are constantly editing our memories. Every time we remember something, we reinforce it in favor of things that we choose not to remember. Furthermore, when we remember something, any inaccuracies in the recall become part of the memory. The next time we remember it, we are not remembering the original event, but the memory of the last time we remembered it. As a result, we are not the sum of our memories, but the sum of the mistakes we have added. Computerizing our memories will turn the silk of our lives into coarser nylon.
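That re-remembering loop is easy enough to simulate. Here's a tiny Python toy with entirely arbitrary numbers (this is an illustration of the argument, not a model of any actual neuroscience), where each recall overwrites the memory with a slightly noisy copy of itself:

```python
# Toy illustration of reconsolidation: each recall re-stores the
# *recalled* version, noise and all, so small errors compound.
import random

random.seed(42)

memory = 100.0  # the original event, as first encoded
for year in range(40):
    recalled = memory + random.gauss(0, 2.0)  # each recall is slightly off
    memory = recalled  # ...and the off version overwrites the original

print(f"after 40 recalls: {memory:.1f} (started at 100.0)")
```

Run it with different seeds and you end up somewhere different each time: a random walk away from the original event. We end up remembering our remembering, not the thing itself.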
And but so
San Junipero sidesteps all of the 'can it' and focuses on the 'should it', landing on a 'yes'. I do not believe it is that cut and dried, but none of it matters until the capability exists. Then we can really start debating the need for real death. Either way, it will be really nice to have the choice.