In many ways this news article demonstrates the gap between sci-fi and reality. Cyberpunk imagined a world in which human minds could be uploaded to machines, and humans, as immortal cyborgs, could live alongside androids who performed ‘the work’. Medical science has made incredible leaps – with his titanium chest, Edward Evans (further) illustrates Williams and Bendelow’s (1998, pp. 79-85) claim that humans have become increasingly plastic and bionic. However, through scientific enhancement, Evans gains a ‘normal’ life rather than superhuman/transhuman superiority. The ethical complexity is therefore diminished; technologies which ‘equalize’ opportunity are accompanied by far less fear than technologies which promise superhuman performance for the few who can afford them (if/when such technologies are developed).
The same tension exists within digital technologies and education. If the market alone is allowed to determine which technologies are developed, there’s every chance that the privileged few will be given access to more tools to increase their own opportunities, rather than tools which increase opportunity for all. Of course, this is a simplistic reading, in which technology is still viewed as a tool rather than as part of the sociocultural fabric of life.
Tags: edc17, immortality, cyborg, neuroscience, scifi, mind uploading
February 06, 2017 at 03:41AM
Commentary:
This BBC article from 2016 provides interesting perspectives on the possibility of cyborg futures:
The millionaire (Dmitry Itskov) funding research into uploading human minds to computers: “For the next few centuries I envision having multiple bodies, one somewhere in space, another hologram-like, my consciousness just moving from one to another.”
The neuroscientist (Dr Randal Koene) working with/for the millionaire: “All of the evidence seems to say in theory it’s possible – it’s extremely difficult, but it’s possible.”
A neuroscientist (Dr Ken Hayworth) involved in mapping tiny parts of mice brains, and who has no ethical objections to the notion of mind-uploading: “The idea of mapping a whole human brain with the existing technology that we have today is simply impossible.”
A neurobiologist (Prof Rafael Yuste) who has ethical concerns: “The pathway that leads with the new neural technologies to our understanding of the brain is the same pathway that could lead, theoretically, to the possibility of mind uploading. Scientists that are involved in these methods have the responsibility to think ahead. If you could replicate the mind and upload it into a different material, you can in principle clone minds. These are complicated issues because they deal with the core of defining what is a person.”
Visionaries or mad people? The quest for immortality through cyborg lives persists..
Bowie’s Cyber song contest involved a previously unheard-of level of fan participation. Participants could collaborate by suggesting lyrics, be taken through rehearsals, watch the recording in real time via webcast, and comment/chat. Today, with technologies such as Skype, Google Hangouts, Twitter and Facebook Live, this does not seem terribly progressive – but at the time it was groundbreaking, and demonstrated the possibilities of participatory cultures enabled by digital technology.
This week I’ve been reliant on free wifi, or wifi in the houses of friends. While Hong Kong is generous in its ‘Internet provision’, not being constantly connected has been challenging given the nature of the course. Beyond connection itself, I’ve had a lot more face-to-face interaction, which made it difficult to simultaneously engage in the online world. Sherry Turkle would be ‘proud’: I was not connected, not alone.. The experience highlights the tension of being present both online and IRL – perhaps more relevant to our second block, ‘community cultures’.
The short stories in the MIT Technology Review Annual offer renewed sci-fi glimpses of possible futures, each with its own promise and warning. There is plenty of discomfort and wonder at what we might have to grapple with in the future – but is it just a distracting narrative? What do we need to grapple with now, ethically?
More evidence of our increased plasticity/bionic capacity (Williams & Bendelow, 1998). Here an amputee’s artificial limb goes beyond regaining human capability/function to exceeding it, with help from Gil Weinberg and his team at the Georgia Institute of Technology.
It’s a common theme that runs through most of Weinberg’s work – the idea that robots can help us make music that wouldn’t be possible by humans alone. For example, software can crunch data much more quickly than we can, he says. It can also combine different musical styles in unexpected ways.
In sport, it has been asked whether certain modern artificial limbs give their users an advantage and should therefore not be permitted in competition (blogged about here). In music, however, the lines are less clear about what the competition is, or whether it is a competition at all. If we can produce ‘better’ music with such artificial limbs, what is to stop so-called able-bodied musicians from obtaining them, beyond finance? And does knowing that music is machine-made affect our enjoyment of it?
Jonathan Sterne argues that ‘we need to be more careful in our object construction’ (2006, p. 18), and not just assume that we know what cyberculture is. He suggests that the danger in doing so is that significant parts of cyberculture, such as sound/audio, are overlooked.
Sterne also highlights the role of periodization in deciding what is or is not included in our construction of (historical) concepts. In cyberculture studies the standards of periodization include (Sterne, 2006, p. 23):
by technology: computers – personal computers – Internet
by art: avant-garde art – cyberpunk – cyberculture
by economics: Fordism – post-Fordism
Sterne calls for historical periods to be treated less as ‘self-evident categories in our data, and more like problems to be considered and debated’ (p. 24). This prompted me to interrogate my own ideas of when cyberculture ‘started’. Can we include arcade culture, or are the separation between human and machine, and the limited connectedness* within it, conceptually too far removed from the connected worlds of the Internet and the socially/bodily integrated nature of digital technologies assumed/witnessed in more ‘traditional’ notions of cyberculture?
*for me, arcade ‘culture’ provided the first opportunity to play against unknown others – but the experience was tied to the physical location of the machine the game was played on, and to checking the ‘top scores’ on returning to the arcade/pub that housed it. I still don’t know who BK_STONE was/is.. but they were very good at Pac-Man..
I previously referred to Bourdieu’s (1984) suggestion that the human body has always been a site where social distinction is sought, and that, by extension, using technology to enhance human appearance and capacity is part of a continuing tradition. The linked article (above) could be said to bring into play another of Bourdieu’s concepts: cultural capital.
The study in the article examined the impact of socialisation on adolescents’ news consumption:
Results indicate that parental modeling remains an important factor in socializing news consumption, even when modeling takes place via mobile devices. Additionally, we find consistent evidence for “matched modeling” between the devices parents use for news and those used by youth.
My interest was spurred by Bayne’s (2015) assertion that digital technologies cannot be separated from social practice. The use of hand-held devices and the prevalence of screens in homes have seen media consumption become individualised, changing ‘social practice’:
Media use inside the home is increasingly individualized as parents and children adopt mobile devices and often use them behind closed doors, in a shift toward what Livingstone (2007) has called the privatized “bedroom culture.”
Connecting the results of the study to cyberculture: many of the fears expressed in cyberpunk relate to widening gaps between rich and poor, or able and not able – in Blade Runner, for example, anyone who can afford it and is declared fit for it has left for the Off-World colonies. The study shows that access to digital technologies alone is not an enabler or equalizer – how a technology is used, and how it is talked about, have significant impact; technologies are not adopted universally in the same ways. Yes, digital technologies impact on social practice, but culture also impacts on how digital technologies are used.
This article caught my attention because of its focus on how the use of digital technologies changes social practices, in this case by normalising a culture of surveillance:
These technologies also normalise surveillance as a cultural and social practice, in the context of the parent–child relationship or children’s relationship with institutional and commercial actors. Children are monitored or encouraged to monitor their own activities (be it health, school performance and/or play practices).
Growing up in a culture where (self-)surveillance is normalised is likely to shape children’s future lives in ways that it is hard to predict.
The article’s focus on digital technologies’ influence on culture echoes Bayne’s (2015) challenge to the notion that technology is neutral and separate from social practices. Bayne, citing Hamilton and Friesen (2013), uses the perspective of science and technology studies to suggest that the narrative surrounding the use of digital technologies, and particularly technology within education, is overly simplistic and reductionist, framing technology in either essentialist or instrumentalist terms:
Where essentialism attributes to technology a set of ‘inalienable qualities’ immanent to the technological artefact, instrumentalism constructs technology as a set of neutral entities by which pre-existing goals (for example, ‘better’ learning) can be achieved.
The datafication of childhood experience that the article argues normalises a culture of surveillance isn’t only implicit in the SmartToys it discusses, however. Digital technologies have enabled schools to keep more in-depth records of student behaviour and performance, and this record keeping has a significant impact on the lives of both teachers and students. Supposedly, through better record keeping with tools such as ‘Target Tracker’, schools can improve student progress. However, such tools simultaneously work to normalise notions of progression and learning as strictly linear and uniform across students. Similarly, persistent behavioural records, transferred between schools and across levels of schooling, frequently establish behaviour as something in need of modification without addressing underlying social circumstances. One example of the latter is ClassDojo. According to Williamson and Rutherford (2017):
ClassDojo reinforces the idea that it is the behavioural mindset of the individual that needs to be addressed. Many ClassDojo resources refer to ideas such as ‘character development’ and ‘growth mindset’ that emphasise developing individuals’ resilience in the face of difficulties, but this doesn’t address the social causes of many difficulties children encounter, such as stress about tests.
It is clear in these examples that technology is not neutral, but rather influences the culture and practice of both learning and teaching and, more generally, of childhood. Yet we are also not merely at the whim of technology – it is school (and at times government) policy, for example, which dictates whether applications such as Target Tracker and ClassDojo are used. The relatively widespread acceptance of their use (in my experience, across multiple schools) suggests that, rightly or wrongly, we already accept surveillance as a society. Of course, tracking micro-levels of learning (though not terribly effective, to my mind) is slightly less creepy than a doll asking a child where he/she lives and what mummy and daddy do.. I’m interested in others’ thoughts – is there actually a difference between surveillance by SmartToys and surveillance by schools, or do we, as a society, need to reconsider the latter as well?
Williamson, B. & Rutherford, A. (2017). ClassDojo poses data protection concerns for parents [blog post]. Retrieved from http://blogs.lse.ac.uk/parenting4digitalfuture/2017/01/04/classdojo-poses-data-protection-concerns-for-parents/