Lifestream, Tweets


This article caught my attention because of its focus on how the use of digital technologies changes social practices, in this case by normalising a culture of surveillance:

These technologies also normalise surveillance as a cultural and social practice, in the context of the parent–child relationship or children’s relationship with institutional and commercial actors. Children are monitored or encouraged to monitor their own activities (be it health, school performance and/or play practices).

Growing up in a culture where (self-)surveillance is normalised is likely to shape children’s future lives in ways that it is hard to predict.

The article’s focus on digital technologies’ influence on culture echoes Bayne’s (2015) challenge to the notion that technology is neutral and separate from social practices. Bayne, citing Hamilton & Friesen (2013), uses the perspective of science and technology studies to suggest that the narrative surrounding the use of digital technologies, particularly within education, is overly simplistic and reductionist, framing technology as either essentialist or instrumentalist:

Where essentialism attributes to technology a set of ‘inalienable qualities’ immanent to the technological artefact, instrumentalism constructs technology as a set of neutral entities by which pre-existing goals (for example, ‘better’ learning) can be achieved.

The datafication of childhood experience that the article argues normalises a culture of surveillance isn’t implicit only in the SmartToys it discusses, however. Digital technologies have enabled schools to keep more in-depth records of student behaviour and performance, and this record keeping has a significant impact on the lives of both teachers and students. Supposedly, through better record keeping with tools such as ‘Target Tracker’, schools can improve student progress. However, such tools simultaneously work to normalise notions of progression and learning as definitively linear and uniform across students. Similarly, persistent behavioural records, which are transferred between schools and across levels of schooling, frequently establish behaviour as something in need of modification, without addressing underlying social circumstances. One example of the latter is ClassDojo. According to Williamson and Rutherford (2017):

ClassDojo reinforces the idea that it is the behavioural mindset of the individual that needs to be addressed. Many ClassDojo resources refer to ideas such as ‘character development’ and ‘growth mindset’ that emphasise developing individuals’ resilience in the face of difficulties, but this doesn’t address the social causes of many difficulties children encounter, such as stress about tests.

It is clear in these examples that technology is not neutral, but rather influences the culture and practice of both learning and teaching and, more generally, of childhood. Yet we are also not merely at the whim of technology – it is school (and at times government) policy, for example, which dictates whether applications such as Target Tracker and ClassDojo are used. The relatively widespread acceptance of their use (in my experience within multiple schools) suggests that, rightly or wrongly, we as a society already accept surveillance. Of course, tracking micro-levels of learning (though not terribly effective in my mind) is slightly less creepy than a doll asking a child where he or she lives and what mummy and daddy do. I’m interested in others’ thoughts – is there actually a difference between surveillance by SmartToys and surveillance by schools, or do we, as a society, need to reconsider the latter as well?


Bayne, S. (2015). What’s the matter with ‘Technology Enhanced Learning’? Learning, Media and Technology, 40(1), 5–20. DOI: 10.1080/17439884.2014.915851

Williamson, B. & Rutherford, A. (2017). ClassDojo poses data protection concerns for parents [blog post]. Retrieved from http://blogs.lse.ac.uk/parenting4digitalfuture/2017/01/04/classdojo-poses-data-protection-concerns-for-parents/



Week 2 Summary

Week two has primarily been focused on the ethical concerns of new technologies. A paper by Amy DeBaets (2011) led me to a greater understanding of how transhumanist perspectives sit across the political spectrum. It was interesting to learn that it is quite possible to be technologically progressive but politically (economically) conservative. Introspection on moral imperatives continued through analysis of Ghost in the Shell, a review of the discussion between Joi Ito, Scott Dadich and Barack Obama on the moral programming decisions of self-driving cars, and the cultural implications of ‘perfect’ female robots for human female body image. I explored the ethical discomfort further through an examination of robot use in Japan, and my subsequent reading of Jennifer Robertson’s article on human rights vs robot rights.

Lifestream feeds this week were primarily concentrated on building community. I’ve been there for peers, offering to test IFTTT streams – which is strange, because generally I see myself as less technologically able. I do seem to be able to troubleshoot, mind – troubleshooting being a core educational area in the press this week.

What is clear this week is that technology is not separate from culture. The influence runs both ways, and we need to be proactive in the decisions we make about which technologies to use in education. Always, we need to ask: is there a purpose? What are the consequences? No technology for technology’s sake.

Ethics in the age of androids and cyborgs

This week I’ve thus far been fairly focused on the ethical implications of technological advancement. I posted previously about the 1995 anime, Ghost in the Shell, but I spoke in quite general terms about the themes. Today, let’s take a closer look at a couple of scenes:

    Motoko & Batou pass judgement on the Garbage Collector (Ghost in the Shell, 1995)

This clip [1m12, from 23 minutes into the film] shows the capture of the Garbage Collector, and the reactions of Motoko and her second in command, Batou. Motoko seems to almost spit out her questions, “Can you remember your mother’s name or what she looks like? Or how about where you were born? Don’t you have any happy childhood memories? Do you even know who you are?” Her reference to memory as indicative of identity parallels Blade Runner – but it is the viciousness of her questioning which intrigues me from an ethics perspective. The Garbage Collector was human, but has had his ‘ghost’ (consciousness or soul) hacked. Yet somehow it is his fault: “Ghost hacked humans are so pathetic,” says Batou. It seems like an attack on a victim (What were you thinking wearing that skirt? Drinking? Were you asking for it?).

Screenshot from the 1995 anime Ghost in the Shell

In this clip [2m30s, from 47m25s into the film] the Puppet Master, a non-human entity who has hacked a cyborg, asks for asylum in Section 9, raising questions about what it is that differentiates man and machine:

Puppet Master: What you are now witnessing is an action of my own free will. As a sentient life form, I hereby demand political asylum.

Chief Aramaki: Is this a joke?

Nakamura: Ridiculous! It’s programmed for self-preservation!

Puppet Master: It can also be argued that DNA is nothing more than a program designed to preserve itself. Life has become more complex in the overwhelming sea of information. And life, when organized into species, relies upon genes to be its memory system. So man is an individual only because of his intangible memory. But memory cannot be defined, yet it defines mankind. The advent of computers and the subsequent accumulation of incalculable data has given rise to a new system of memory and thought, parallel to your own. Humanity has underestimated the consequences of computerization.

Nakamura: Nonsense! This babble is no proof at all that you’re a living, thinking life form!

Puppet Master: And can you offer me proof of your existence? How can you, when neither modern science nor philosophy can explain what life is?

Chief Aramaki: Who the hell is this?

Nakamura: Even if you do have a Ghost, we don’t offer freedom to criminals! It’s the wrong place and time to defect.

Puppet Master: Time has been on my side, but by acquiring a body, I am now subject to the possibility of dying. Fortunately, there is no death sentence in this country.

[quotes from wikiquote]

Of course, Ghost in the Shell is fictional, and the notion that we could reach a state of technological advancement in which it was possible to upload human consciousness to a machine, or for a machine to develop consciousness (the singularity), remains questionable, even amid claims that the first head transplant could occur in the UK in 2017. Yet we are already at a stage where we need to think about the ethics involved in the decisions that machines make, as the previous posts/feed on self-driving cars have indicated. Beyond that, what of the rights of machines, should they gain consciousness? On our screens we see their fates played out desperately (Westworld, for example), or alternatively, our own fate is portrayed as under threat (by Stephen Hawking, for example).

If humans are to be accorded different, more privileged rights than machines, they need to actually behave differently from them. Perhaps the increasingly human likeness of some machines is a call to bolster our own humanity, through empathy and the like, so as to truly differentiate ourselves from machines. Thoughts?