Ethics in the age of androids and cyborgs

So far this week I’ve been fairly focused on the ethical implications of technological advancement. I posted previously about the 1995 anime Ghost in the Shell, but spoke in quite general terms about its themes. Today, let’s take a closer look at a couple of scenes:

    Motoko & Batou pass judgement on the Garbage Collector (Ghost in the Shell, 1995)

This clip [1m12, from 23 minutes into the film] shows the capture of the Garbage Collector, and the reactions of Motoko and her second in command, Batou. Motoko seems to almost spit out her questions: “Can you remember your mother’s name or what she looks like? Or how about where you were born? Don’t you have any happy childhood memories? Do you even know who you are?” Her reference to memory as indicative of identity parallels Blade Runner – but it is the viciousness of her questioning which intrigues me from an ethics perspective. The Garbage Collector is human, but has had his ‘ghost’ (consciousness or soul) hacked. Yet somehow it is treated as his fault: “Ghost hacked humans are so pathetic,” says Batou. It reads like an attack on a victim (What were you thinking, wearing that skirt? Drinking? Were you asking for it?).

Screenshot from the 1995 anime Ghost in the Shell

In this clip [2m30, from 47m25 into the film] the Puppet Master, a non-human entity that has hacked a cyborg body, asks Section 9 for asylum, raising questions about what differentiates man from machine:

Puppet Master: What you are now witnessing is an action of my own free will. As a sentient life form, I hereby demand political asylum.

Chief Aramaki: Is this a joke?

Nakamura: Ridiculous! It’s programmed for self-preservation!

Puppet Master: It can also be argued that DNA is nothing more than a program designed to preserve itself. Life has become more complex in the overwhelming sea of information. And life, when organized into species, relies upon genes to be its memory system. So man is an individual only because of his intangible memory. But memory cannot be defined, yet it defines mankind. The advent of computers and the subsequent accumulation of incalculable data has given rise to a new system of memory and thought, parallel to your own. Humanity has underestimated the consequences of computerization.

Nakamura: Nonsense! This babble is no proof at all that you’re a living, thinking life form!

Puppet Master: And can you offer me proof of your existence? How can you, when neither modern science nor philosophy can explain what life is?

Chief Aramaki: Who the hell is this?

Nakamura: Even if you do have a Ghost, we don’t offer freedom to criminals! It’s the wrong place and time to defect.

Puppet Master: Time has been on my side, but by acquiring a body, I am now subject to the possibility of dying. Fortunately, there is no death sentence in this country.

[quotes from Wikiquote]

Of course, Ghost in the Shell is fictional, and the notion that we could reach a state of technological advancement in which it was possible to upload human consciousness to a machine, or for a machine to develop consciousness of its own (the singularity), remains questionable, even amid claims that the first head transplant could occur in the UK in 2017. Yet we are already at a stage where we need to think about the ethics involved in the decisions that machines make, as previous posts and the feed on self-driving cars have indicated. Beyond that, what of the rights of machines, should they gain consciousness? On our screens we see their fates played out desperately (Westworld, for example), or, alternatively, our own fate portrayed as under threat (as Stephen Hawking has warned).

If humans are to be accorded different, more privileged rights than machines, they need to actually behave differently from them. Perhaps the increasingly human-like appearance of some machines is a call to bolster our own humanity, through empathy and the like, so as to truly differentiate ourselves from machines. Thoughts?