Week 3 summary

My summary this week leads directly on from my last post, looking at the emotional aspect of humans and how that is intangibly different from computer coding. Many of my tweets and conversations with fellow students circled around the dystopian aspect of this and the worry about AI replacing us, both in caring roles and the workplace (such as in call centres or social care settings). One author, a new father, actually took solace in his wife’s exhaustion from night feeds, as he couldn’t foresee any robot being able to emulate this human trait and all that it entailed.

However, I came across the quotation below in a paper from the Understanding Learners module, and even though it is from 1993 I felt heartened by its premise: the author was rejecting the relentless binary by urging us to take the best from both the human and technological fields to create something complementary.

Machines tend to operate by quite different principles than the human brain, so the powers and weaknesses of machines are very different from those of people. As a result, the two together – the powers of the machine and the powers of the person – complement one another, leading to the possibility that the combination will be more fruitful and powerful than either alone. If we design things properly, that is. (Norman, 1993)

This makes me feel less uneasy than the dystopian futures of Hollywood, none more unsettling for me than A.I., the trailer for which is in my stream, and I too hope that our human memory remains different from that of the machine. Our memory, as Roszak sees it, will keep us unique, impossible to replicate, as we ourselves do not know the secret of how it works:

Roszak (1994)

My artefact hopefully sums the above up in a way that isn’t negative. Robots are part of our lives in many, sometimes hidden, ways, but hopefully they will never be able to write/create our poetry, the ‘inexplicable’ us. We need to ensure our time and effort on technological advancement goes into worthy projects.

This then takes me to the more cyborg aspect of my stream for the week. Coming after Miller’s introduction, Haraway’s manifesto was much heavier going. However, the hangout and some videos, such as those I tweeted, did start to put it in context, and I suspect I will come back to her work again in the future. Medical applications continue to astound, and I do see that with technological advances we will each become more and more ‘cyborg’. To answer Jeremy’s question on my week 2 summary: yes, I think this will divide the rich and the poor in the same way as the digital divide (which still exists). This will be a huge part of society’s future healthcare, bringing burdens as well as life extension. How far can it be pushed? Each week I end up with more questions.

I also realise that I haven’t spent much time relating all this to education – I could do with another week!

All in all, I started this block feeling very ill-equipped, with no background or experience in science fiction or cyberculture, but I have been surprised by how enthusiastically I embraced it. I am going to miss my robot musings as we move on.


Norman, D. A. (1993). Things that make us smart: defending human attributes in the age of the machine. Reading, MA: Addison-Wesley. Chapter 5: The Human Mind (pp. 115–138).

Roszak, T. (1994). ‘Of Ideas and Data’, in The cult of information: a neo-Luddite treatise on high tech, artificial intelligence, and the true art of thinking (pp. 87–107). Berkeley, CA; London: University of California Press.

One Reply to “Week 3 summary”

  1. Great week 3 summary, and reflection on the end of block 1!

    Do try to stay close to the recommended 250 words, though. It is tough when there is a lot to reflect upon, but it is also important to work within limits.

    That said, you’ve offered a really excellent critical summary here – the distinction between emotion and computer code is contextualised well, and the quote from Norman (1993) provides a promising way of approaching this in non-dualist or non-oppositional ways. I do think we need to move beyond utopian/dystopian binaries, particularly in the education technology field, otherwise we miss all sorts of nuance in the ways our relationships with technology unfold.

    Fantastic quote from Roszak too – I must look this up. Embodiment is certainly one way we can critique simplistic notions of A.I. as pure ‘information’.

    Ending this block with more questions is no bad thing! Perhaps with some time away from these ideas – as we discuss ‘community’ and ‘algorithmic’ cultures – you’ll find a way to connect with education.