In many ways this news article demonstrates the gap between sci-fi and reality. Cyberpunk imagined a world in which human minds could be uploaded to machines, and humans, as immortal cyborgs, could live alongside androids who performed ‘the work’. Medical science has made incredible leaps – with his titanium chest Edward Evans (further) illustrates Williams and Bendelow’s (1998, pp. 79-85) claim that humans have become increasingly plastic and bionic. However, through scientific enhancement, Evans gains a ‘normal’ life rather than superhuman/transhuman superiority. The ethical complexity is therefore diminished; technologies which ‘equalize’ opportunity are accompanied by far less fear than technologies which promise superhuman performance for the few who can afford them (if/when such technologies are developed).
The same tension exists within digital technologies and education. If the market alone is allowed to determine what technologies are developed, there’s every chance that the privileged few will be given access to more tools to increase their opportunities, rather than tools which increase opportunity for all. Of course, this is a simplistic reading, in which technology is still viewed as a tool rather than part of the sociocultural fabric of life.
The immortalist: Uploading the mind to a computer – BBC News
Tags: edc17, immortality, cyborg, neuroscience, scifi, mind uploading
February 06, 2017 at 03:41AM
This BBC article from 2016 provides interesting perspectives on the possibility of cyborg futures:
The millionaire (Dmitry Itskov) funding research into uploading human minds to computers: “For the next few centuries I envision having multiple bodies, one somewhere in space, another hologram-like, my consciousness just moving from one to another.”
The neuroscientist (Dr Randal Koene) working with/for the millionaire: “All of the evidence seems to say in theory it’s possible – it’s extremely difficult, but it’s possible.”
A neuroscientist (Dr Ken Hayworth) involved in mapping tiny parts of mouse brains, and who has no ethical objections to the notion of mind-uploading: “The idea of mapping a whole human brain with the existing technology that we have today is simply impossible.”
A neurobiologist (Prof Rafael Yuste) who has ethical concerns: “The pathway that leads with the new neural technologies to our understanding of the brain is the same pathway that could lead, theoretically, to the possibility of mind uploading. Scientists that are involved in these methods have the responsibility to think ahead. If you could replicate the mind and upload it into a different material, you can in principle clone minds. These are complicated issues because they deal with the core of defining what is a person.”
Visionaries or mad people? The quest for immortality through cyborg lives persists…
Bowie’s Cyber song contest involved a level of fan participation previously unheard of. Participants could collaborate by suggesting lyrics and being taken through rehearsals, as well as view the recording in real-time (webcast) and comment/chat. Today, with technologies such as Skype, Google Hangouts, Twitter and Facebook Live, this does not seem terribly progressive – but at the time it was groundbreaking, and demonstrative of the possibilities of digital technology-enabled participatory cultures.
This week I’ve been reliant on free wifi, or wifi in the houses of friends. While Hong Kong is generous in its ‘Internet provision’, not being constantly connected has been challenging given the nature of the course. Beyond connection itself, I’ve had far more face-to-face interactions – which made it difficult to simultaneously engage in the online world. Sherry Turkle would be ‘proud’: I was not connected, not alone… The experience highlights the tension of being present both online and IRL – perhaps more relevant to our second block, ‘community cultures’.
The short stories in the MIT Technology Review Annual offer renewed sci-fi glimpses, each with their own promise and warning. Lots of discomfort and wonder at what we could have to grapple with in the future – but is it just a distracting narrative? What do we need to grapple with now, ethically?
More evidence of our increased plasticity/bionic-capacity (Williams & Bendelow, 1998). Here an amputee’s artificial limb goes beyond regaining human capability/function to exceeding it, with help from Gil Weinberg and team at the Georgia Institute of Technology.
It’s a common theme that runs through most of Weinberg’s work – the idea that robots can help us make music that wouldn’t be possible by humans alone. For example, software can crunch data much more quickly than we can, he says. It can also combine different musical styles in unexpected ways.
In sport, it has been asked whether certain modern artificial limbs give users an advantage, and should therefore not be permitted in competition (blogged about here). Music, however, has less clear lines about what the competition is, or whether indeed it is a competition. If we can produce ‘better’ music with such artificial limbs, what’s to stop so-called able-bodied musicians from obtaining them, beyond finance? Does knowing that music is machine-made affect enjoyment?
Jonathan Sterne argues that ‘we need to be more careful in our object construction’ (2006, p. 18), and not just assume that we know what cyberculture is. He suggests that the danger in doing so is that significant parts of cyberculture, such as sound/audio, are overlooked.
Sterne also highlights the role of periodization in deciding what is or is not included in our construction of (historical) concepts. In cyberculture studies the standards of periodization include (Sterne, 2006, p. 23):
- by technology: computers – personal computers – Internet
- by art: avant-garde art – cyberpunk – cyberculture
- by economics: fordism – postfordism
Sterne calls for treating historical periods less as ‘self-evident categories in our data, and more like problems to be considered and debated’ (p. 24). This prompted me to interrogate my own ideas of when cyberculture ‘started’. Can we include arcade culture, or are the separation between human and machine, and the limited connectedness* within it, too far removed conceptually from the connected worlds of the Internet, and from the socially/bodily integrated nature of digital technologies assumed/witnessed in more ‘traditional’ notions of cyberculture?
*for me, arcade ‘culture’ provided the first opportunity to play against unknown others – but the experience was tied to the physical machine the game was played on, and to examining the ‘top scores’ on return to the arcade/pub housing it. I still don’t know who BK_STONE was/is… but they were very good at Pac-Man…
Do parents still model news consumption? Socializing news use among adolescents in a multi-device world – Jan 23, 2017
February 01, 2017 at 04:49PM
I previously referred to Bourdieu’s (1984) work, with reference to his suggestion that the human body has always been a site where social distinction is sought and that, by extension, using technology to enhance human appearance and capacity is part of a continuing tradition. The linked article (above) could be said to bring into play another of Bourdieu’s concepts: cultural capital.
The study in the article examined the impact of socialisation on adolescents’ news consumption:
Results indicate that parental modeling remains an important factor in socializing news consumption, even when modeling takes place via mobile devices. Additionally, we find consistent evidence for “matched modeling” between the devices parents use for news and those used by youth.
My interest was spurred by Bayne’s (2015) assertion that digital technologies cannot be separated from social practice. The use of hand-held devices and the prevalence of screens in homes have seen media consumption become individualised, changing ‘social practice’:
Media use inside the home is increasingly individualized as parents and children adopt mobile devices and often use them behind closed doors, in a shift toward what Livingstone (2007) has called the privatized “bedroom culture.”
Connecting the results of the study to cyberculture, many of the fears expressed in cyberpunk relate to the increasing gaps between rich and poor, or able and not able – for example, in Blade Runner anyone who can afford it, and is declared fit for it, has left for the Off-World colonies. The study shows that access to digital technologies alone is not an enabler or equalizer – how technologies are used, and how they are talked about, have a significant impact; technologies are not adopted universally in the same ways. Yes, digital technologies impact on social practice, but culture also impacts on how digital technologies are used.