Tag Archives: Human Post

Final blog summary

I believe my lifestream is a useful representation of much of my EDC experience as it logs a lot of my reading, my participation within the community and many of my thoughts about our studies. These have been expressed in post titles, in brief comments on items I’ve considered relevant to pull in (illuminating in itself) and in more considered reflections. The provision of these summaries and comments has been a useful discipline, tracing my preoccupations and thought-trains and enabling meaningful review.

The blog has really worked for me as a central focusing ‘place to put everything’. Early on I resisted the impulse to organise with pages so the stream remained a better representation of what I understood it to be – a chronological series of thoughts, ideas and finds mashed up in a variety of modalities to chart my progress through EDC, more or less governed by myself. I decided to rely on tagging and categorising to locate posts or identify emerging themes or events, enjoying the economy of the WordPress tag-cloud which enables a one-click surfacing of themes or collections. This premeditated organisation is illustrative both of our human wish to create order and pin down meaning, and of the sense-making imperative in a scrolling blog. My tag-cloud contains nothing surprising, but would be a rich source of information had I chosen other folksonomies, or schemes of emphasis, using it to light up posts I’d considered important or those with unanswered questions.

In keeping with the quasi-confessional nature of a blog, and in common with all reflective diaries, I believe a distinct tone has emerged; the revisions I’ve made whilst composing longer posts attest to this performative aspect. The knowledge of its being public on the web has sometimes been inhibiting, but most often it’s a thought I have put aside or not had time to entertain.

I have enjoyed writing, particularly when I’ve been inspired by an idea or a reading, but I consider some of my posts to be too informal with ideas expressed in inappropriately flowery language. I believe I can write in an academic register, but the lifestream has somehow worked to release my inner sensationalist! It is interesting to consider that the blog form may have encoded within its literacy a human-designed essentialist ‘algorithm’ prompting me to write in a certain way. More likely it is my particular response to the affordance and has certainly been a natural and involuntary one. I’m not sure if that’s a good thing for academic study.

In a recent post I likened my lifestream to a river course. This natural-world analogy is distant from the algorithmic operations underpinning much of our real life-course, which subtly dictate our choices and organise our journey. There are hints of algorithmic agency co-constituting my lifestream, such as comments left by automated bots, auto-updating RSS feeds, the sudden appearance of comments I’ve written elsewhere and the surprise I register on finding posts I’d forgotten I’d invoked via IFTTT.

I believe the technologies used have co-created my lifestream, helping shape both its form and substance, a mix of the human and the human-designed non-human providing an experience I’m glad not to have missed.

How is my driving?

Public Domain image
http://maxpixel.freegreatpicture.com/Satisfaction-Customer-Review-Feedback-Opinion-1977986

Rating and quantifying the consumption of ‘experience’ is on the increase. Recently, in the course of my daily life, I have undertaken such diverse activities as contacting local government departments, calling in a plumber and doing some real life chocolate shopping 🙂

Soon after these encounters I have been invited to rate my experience via a phone call or by logging on to a website where I may select a number on a scale to record my level of satisfaction with the service I’ve received. In each case, the government official, the plumber and the shop assistant have all asked or alerted me to this with an unspoken understanding that they stand to gain or lose from the feedback they receive.

This is a demeaning experience both for me as consumer/customer and for them as service provider. The consumer is constructed as a potent arbiter able to award points with no other authority than the money in her pocket. The service provider is fashioned as a worker needing to amass tokens to attest to satisfactory service. Such a contrivance is part of the ontology of the computer harnessed by capitalism: it dehumanises the individual and reduces social contact to a mechanistic exchange conducted after the real one has taken place, thereby calling its authenticity into question. A similar construction of the individual was predicted by Hand (2008) as one of his ‘narratives of threat’,

The idea of a digitally mediated participatory citizenship disguises the ‘push-button’ nature of digitally mediated political life (Street 1997). That is, the Web is simply another media of simple polling of preferences and opinion. The figure of the consumer-citizen takes centre stage where the processes of political management and engagement are inseparable from mass-mediated and customized forms of consumption. Information, instead of being an empowering force for cultural democratization, operates as a substitute for authentic knowledge, particularly where institutional and organisational uses of information centre upon the construction of preference databases. The individual freedoms associated with digital-empowerment are illusory – these are simply methods of decentralizing and delegating responsibility for citizenship to the individual. Citizens are thus now expected to behave like the dominant images of private consumers in economic theory – autonomous, individualised decision-makers removed from the communitarian fabric.
(Hand, 2008, p.39)

Push-button voting is redolent of Social Media likes and the use of rating and gamification in learning environments. Rehearsed in our consumer experiences, they become more readily acceptable in our educational exchanges. The teacher as facilitator is construed as a service provider in a relationship with the student that can only be verified or valorised by digital computation.

Hand, M. (2008). Hardware to everyware: narratives of promise and threat. In Making Digital Cultures: Access, Interactivity, and Authenticity, pp.15-42. Aldershot: Ashgate.

Algorithms made manifest

Image from https://hellohart.com/2015/05/25/the-mathematics-of-crochet/

by crocheting computer-generated instructions of the Lorenz manifold: all crochet stitches together define the surface of initial conditions that under influence of the vector field generated by the Lorenz equations end up at the origin; all other initial conditions go to the butterfly attractor that has chaotic dynamics. The overall shape of the surface is created by little local changes: adding or removing points at each step

Art or craft can make complex mathematics ‘visible’ for the layperson revealing its beauty and intricacies and opening up ways of understanding what composes our black boxed technologies.

The mathematics of crochet

Analysing Analytics

During snatched moments this week I have been thinking about algorithms and learning analytics, but in an uninformed and distracted way, as work has been busy. Yet this time was spent in a world semi-constituted and organised by algorithms without my really taking note, as Nigel’s tweet about the way emails get placed into Clutter folders reminded me,

and as even my own lifestream should have underlined as it filled with tweets and posts left uncommented.

I expressed my default position on Learning Analytics early on, but I recognised the need to fight this instinct, or at least to examine it more carefully. Siemens’ suggestion that

For some students, the sharing of personal data with an institution in exchange for better support and personalised learning will be seen as a fair value exchange.
(Siemens, 2013, p.1394)

had compounded my involuntary rejection of LA as it packed so many contentious statements in one short sentence.

I took issue with the bargaining trope of data exchange for assured personal gain. I questioned who decides what ‘better support’ is and whether such a promise would hold out after the relinquishing of data. I remained suspicious of the student and institution arriving at a fair outcome when the power balance of that relationship is characterised by inequality. I was wary of ‘personalised learning’ and wondered what it really means and whether it would divest the learner of any of their own thinking skills.

At the week’s end, when I could read more, I discovered Jisc’s counter to my worry,

Students maintain appropriate levels of autonomy in decision making relating to their learning, using learning analytics where appropriate to help inform their decisions.

I remained sceptical, however, because for some students reflection and meta-cognition are not easily achieved (nor always introduced and encouraged) and an effort to develop them may more simply be contracted out to graphs and graphics, leading to a misunderstanding of what counts in learning.

After reading Siemens (2013) my head was full of buzzwords such as actionable insights. I consoled myself by deciding actionable is not a word, but when I looked it up, I found its definition to be rooted in law and, seemingly, marketing, which was indeed insightful.

I had to keep reminding myself (and being reminded) that politics and power struggles happen with or without algorithms, and not to fall into the trap of algorithms bad, no algorithms good. (What is the opposite of algorithm? Chaos? Proper choice? Manual?) I didn’t think their pervasive and deep penetration of our daily lives was a reason not to want to examine them and get a measure of their scope, dangers and failings, in accordance with Beer’s stated acknowledgement of

a sense that we need to understand what algorithms are and what they do in order to fully grasp their influence and consequences
(Beer, 2017, p.3)

Kitchin (2017) offers “six methodological approaches” (Abstract) to understanding them such as spending time with coders, conducting ethnographies, reverse engineering and witnessing others doing so.

Sociotechnical

I did, of course, get ensnared in thinking that algorithms are dissociable from the sociotechnical world they co-constitute. This was especially frustrating because I can see exactly how coded IF statements are firmly rooted in context: IF … THEN … ELSE …, where the ellipses stand in for prescriptive descriptions of the very detail of our lives, and can comprise, too, further nested IF statements, or containers into which variables are poured – by us, or by other algorithms. Such is their complexity, interrelation and recursiveness that these codes seem at once to be “neutral and trustworthy systems working beyond human capacity” (Beer, 2017, pp.9-10) and organic-seeming and mutable, requiring, from time to time, the hand of the putative “viewer of everything from nowhere” (the fictitious person alluded to in Ben Williamson’s lecture) to make the fine adjustments named tweaks. The hand that tweaks is firmly located, but hidden, often in financial, commercial, government or educational institutions, engaged in a secret and protected remit to organise and present the knowledge that ensures their continued power.
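To make the point to myself, the kind of nested, context-laden conditional I have in mind can be sketched in a few lines. This is a deliberately simplified, hypothetical sketch: the rule names, thresholds and variables are entirely my own invention, not any real platform’s.

```python
# A toy, hypothetical decision rule of the kind discussed above:
# nested IF ... THEN ... ELSE statements whose variables are poured in
# by us (logins) or by other algorithms (scores, flags).

def recommend_intervention(logins_per_week, engagement_score, flagged_by_model):
    """Return an action label for a (fictional) learning platform."""
    if flagged_by_model:                 # a value supplied by another algorithm
        if engagement_score < 0.4:       # a threshold chosen by a hidden hand
            return "email tutor"
        else:
            return "nudge dashboard"
    else:
        if logins_per_week < 2:          # what counts as 'too few'? someone decided
            return "send reminder"
        else:
            return "no action"
```

Each branch encodes a human judgement – what counts as ‘low’ engagement, who deserves a nudge – even though the output arrives looking neutral and automatic.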

As Beer, quoting Foucault, makes the point,

… the delicate mechanisms of power cannot function unless knowledge, or rather knowledge apparatuses, are formed, organised and put into circulation.
(Beer, 2017, p.10)

Manovich (1999, p.27) states that the point of the computer game is the gradual revealing of its hidden structure – the exact opposite of the algorithm, which operates under cover to confound our mapping of it. Algorithms all too easily offer themselves as inscrutable and indecipherable, attributes which supply their perfect camouflage of objectivity and neutrality, as mechanisms for avoiding the bias and prejudice of messy human judgement. Commenting on the twofold “translation of a task or problem” into code, Kitchin states

The processes of translation are often portrayed as technical, benign and commonsensical
(Kitchin 2017, p.17).

Information gathering

It is recognised that Learning Analytics needs to gather information from multiple data points across distributed systems to better map and model the learner in recursive processes. Inherent in this gathering are decisions about what to collect, from where and how, with each of these decisions dependent on the platforms and software that capture the information, which have encoded in them their own particular affordances, constraints and biases. Once the data are aggregated by another encoded fitment, decisions have to be made on how to interpret them, and comparisons drawn against typical and historical models, in order to arrive at what might be predicted or trigger action. Siemens (2013) himself outlines the problems of data interoperability,

distributed and fragmented data present a significant challenge for analytics researchers
(Siemens, 2013, p.1393)

This complex sociotechnical construction is not in any way an objective, systematised analysis of authentic behaviour, but a range of encoded choices, afforded by particular softwares and programming languages and made by living and breathing individuals acting on a range of motivations, to construct a supposedly more, but probably less, reliable image of the student. The construction of LA will favour some but perhaps inhibit, repel, harm or exclude others.
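As a thought experiment, that chain of encoded choices can be sketched as a toy aggregation. Every data source, weight and threshold below is an assumption of mine, chosen only to show how many decisions stand between raw data and a ‘prediction’ about a student.

```python
# A minimal sketch of the gathering-and-aggregation step described above.
# Which sources count, how they are weighted, and where the cut-off sits
# are all encoded choices, here invented for illustration.

WEIGHTS = {"vle_clicks": 0.5, "forum_posts": 0.3, "library_visits": 0.2}
HISTORICAL_MEAN = 40.0   # drawn from past cohorts: itself a modelling choice

def engagement_index(datapoints):
    """Aggregate distributed data points into a single score."""
    return sum(WEIGHTS[k] * datapoints.get(k, 0) for k in WEIGHTS)

def at_risk(datapoints, threshold_ratio=0.5):
    """Flag a student whose score falls below half the historical mean."""
    return engagement_index(datapoints) < HISTORICAL_MEAN * threshold_ratio
```

A student who reads quietly offline scores zero on every channel the fitment happens to capture, and the ‘prediction’ follows from the weights, not from the learning.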

In addition, learning analytics posits the educational project as reducible to numbers, as a discernible learning process which may be audited and in which

‘dataveillance’ functions to decrease the influence of ‘human’ experience and judgement, with it no longer seeming to matter what a teacher may personally know about a student in the face of his or her ‘dashboard’ profile and aggregated tally of positive and negative ‘events’
(Selwyn, 2014 p.59)

Patterns

Learning Analytics attempts to seek out patterns, which naturally raises the question: what about the data which falls away from the pattern cutter?

Another danger of pattern searching is voiced by boyd,

Big Data enables the practice of apophenia: seeing patterns where none actually exist
(boyd, 2012, p.668)

Patterns are concerned with data that recur, and they fail to take account of the myriad minute details in which crucial contextual information may lie,

Data are not generic. There is value to analysing data abstractions, yet retaining context remains critical, particularly for certain lines of inquiry. Context is hard to interpret at scale and even harder to maintain when data are reduced to fit a model.
(boyd, 2012, p.671)
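boyd’s apophenia is easy to demonstrate: generate purely random ‘student metrics’ and search them for correlations, and with enough variables strikingly strong correlations appear by chance alone. A small sketch (the data are seeded random numbers, standing in for nothing):

```python
# Apophenia in miniature: dredge random data for 'patterns'.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def strongest_spurious_correlation(n_vars=50, n_obs=20, seed=1):
    """Best pairwise correlation found in entirely random 'metrics'."""
    rng = random.Random(seed)
    cols = [[rng.random() for _ in range(n_obs)] for _ in range(n_vars)]
    return max(abs(pearson(a, b))
               for i, a in enumerate(cols) for b in cols[i + 1:])
```

With 50 random variables measured over 20 ‘students’, some pair will correlate strongly, and a dashboard would happily report it as insight.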

Siemens (2013) too, alludes to the difficulty in getting the measure of the individual,

recognizing unique traits, goals, and motivations of individuals remains an important activity in learning analytics
(Siemens, 2013, p.1383)

So much for my own objectivity and neutrality; I seem to have fallen back into that pit whose muddy walls are white and mostly black. Struggling back out, I voiced my concerns in the tweetorial, but attempted to remain open-minded,

If this state of affairs, which is learning analytics today, is surfaced and properly taken into account, the endeavour shouldn’t be rejected out of hand, but investigated, honed and trialled to see if it can usefully help us understand the conditions for learning as well as support learners. It should be done in full partnership with students, enabling a more equal and transparent participatory experience, as the University of Edinburgh’s LARC project demonstrates.

The significant barriers to LA, ethics and privacy, can be foregrounded and regarded as “enablers rather than barriers” (Gašević, Dawson and Jovanović, 2016) as the editors of the Journal of Learning Analytics encourage,

We would [also] like to posit that learning analytics can be only widely used once these critical factors are addressed, and thus, these are indeed enablers rather than barriers for adoption (p.2)

Jisc has drawn up a Code of Practice for learning analytics (2015) which does attempt to address issues of privacy, transparency and consent. For example,

Options for granting consent must be clear and meaningful, and any potential adverse consequences of opting out must be explained. Students should be able easily to amend their decisions subsequently.
(Jisc, 2015, p.2)

Pardo and Siemens (2014) identify a set of principles

to narrow the scope of the discussion and point to pragmatic approaches to help design and research learning experiences where important ethical and privacy issues are considered. (Abstract)

Yet even if the challenges of ethics and privacy are overcome, there remains the danger that learning analytics reveals only a very pixelated image of the student, one which might place her at a judged disadvantage: an indelible, skewed blueprint existing in perpetuity and following her to future destinations. That this should be the case is not surprising if we consider that a sociomaterial account of learning analytics foregrounds its complex mix of the human, the technical and the material, performing both an analysis and an analysand through a partial apparatus of incomplete measurement. The encoded institution’s audit, combined with the absence of student context or nuance, means that LA will struggle to give anything other than general actionable insights.

http://fiona-boyce.deviantart.com/art/Pixelated-ID-192825081

References

Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), pp.1-13.

boyd, d. and Crawford, K. (2012). Critical questions for Big Data. Information, Communication & Society, 15(5), pp.662-679.

Gašević, D., Dawson, S., Jovanović, J. (2016). Ethics and privacy as enablers of Learning Analytics. Journal of Learning Analytics, 3(1), pp.1-4.

Jisc, (2015). Code of practice for learning analytics. Available at: https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics

Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), pp.14-29.

Manovich, L. (1999). Database as a symbolic form. Millennium Film Journal, 34, pp.24-43.

Pardo, A., Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), pp.438-450.

Selwyn, N. (2014). Distrusting Educational Technology. New York: Routledge.

Siemens, G. (2013). Learning Analytics: the emergence of a discipline. American Behavioral Scientist, 57(10), pp.1380-1400.

Williamson, B. (2017). Computing brains: learning algorithms and neurocomputation in the smart city. Information, Communication & Society, 20(1), pp.81-99.

Week 8 Weekly thoughts

SQL Syntax from https://www.w3schools.com/

qryShow_Paid_Posts

SELECT Lifestream_CH_Posts.PostTitle + ', ' + Lifestream_CH_Posts.PostSubject + ', Week ' + CAST(Lifestream_CH_Posts.intWeek AS nvarchar(10)) AS [Listing],
Lifestream_CH_Posts.PostTitle AS [Post Title],
Lifestream_CH_Posts.PostBody AS [Post],
COALESCE(CONVERT(nvarchar(12), Lifestream_CH_Posts.PostDate, 113), N'') AS [Date of Post],
Lifestream_CH_Viewers.ViewerType AS [Viewer Type],
Lifestream_CH_Payments.PaymentRcvd AS [Payment Received],
Lifestream_CH_Validations.ViewerValue AS [Viewer Validation]
FROM Lifestream_CH_Posts
INNER JOIN Lifestream_CH_Payments ON Lifestream_CH_Posts.PostID = Lifestream_CH_Payments.PostID
INNER JOIN Lifestream_CH_Viewers ON Lifestream_CH_Viewers.ViewerID = Lifestream_CH_Payments.ViewerID
INNER JOIN Lifestream_CH_Validations ON Lifestream_CH_Validations.ViewerCode = Lifestream_CH_Viewers.ViewerID
WHERE Lifestream_CH_Viewers.ViewerType IN ('A2', 'A3', 'B3', 'B7', 'D17', 'L23', 'L30', 'M25', 'S7', 'S14', 'T9')
AND Lifestream_CH_Posts.PostTitle IN ('On and Off', 'Privacy Paradox', 'Algo Chat', 'Acceptance Creep', 'Not so favourite', 'Start the week')
AND Lifestream_CH_Payments.PaymentRcvd = 1
AND Lifestream_CH_Validations.ViewerValue > 600

On and Off

What a week, I couldn’t seem to fire a single algorithm. Early on I enthusiastically toggled Show me the best tweets first on different devices on my Twitter account but with no real discernible difference [1] [2] [3] [4].  I hardly ever use my sparse and locked down Facebook account, but I wandered around in the Settings basement and hauled some levers to ON. Still nothing personalised except a lonely effort by Alison Courses to get me to learn something. I could endorse it, inflicting it on my friends and spawning a million more of the same and similar for me.

It seemed that not only had I somehow gained the right to be forgotten, I had been. What was going on? Normally I only have to think the word hotel for my IP address to be swiped and the price hiked.

Clearly, algorithmically speaking, I should get out more. I started frantically browsing holiday cottages and choosing stuff in online swim shops to provoke a stream of targeted ads. Nothing. How long should it take? Where were the mono fin recommendations? These are algorithms, they shouldn’t show signs of pique. I considered asking a friend to experiment with his Fb timeline settings, but using another person’s data for my own gain seemed, well, dirty. I distracted myself by typing rude words into Google and was blanked, instantly. Naughty me. I did discover that many of us must be contemplating marrying our cousin (is it legal to …).

I headed to YouTube and logged in and out of my Google account like a mad thing, turning the pop up Allow Notifications to Block reflexively. I was impressed by the extent I could analyse my videos – I could get watch time reports, audience retention, playback locations, devices, comments (none) … the list went on. Nothing for Demographics, but the heading was there.

I had wanted to demonstrate how the

arrangements of comments, and thus the spatial qualities of the YouTube page … come together through multiple and contingent relations between the human users … as well as the non-human algorithms which operate beneath the surface
(Knox 2014, p.49)

I wanted to investigate how

the spaces utilised for educational activity cannot be entirely controlled by teachers, students, or the authors of the software
(Knox, 2014, p.50)

but it seemed unlikely now.

So back in subterranean boiler rooms I wrenched rusted faucets to OPEN and tapped the barometers to DELUGE. Sprinting back upstairs, I Googled myself to check I was still alive. Phew, a few of my selves had faint pulses. From a Kafkaesque corridor I dragged down my Google archive to the desktop but found only slim pickings. Seemingly I hadn’t been anywhere on the map for years. I travelled as far as Amazon where, at last, I was greeted with a jaunty Hello C, and I burst into tears of relief at their intimate knowledge of my hoover bag preferences and proffered book recommendations. They were accurate, useful and interesting except for the History book suggestions which must, I dimly remember, be a result of ordering revision guides for my children some hundred years ago.

I never thought I would be so glad to chum up with people who bought this and also bought that. I was back in the human race.
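The machinery behind that chumminess need not be mysterious: a toy item-to-item recommender of the ‘bought this, also bought that’ kind can be sketched in a few lines. The purchase histories below are invented, echoes of my own basket rather than anyone’s real data.

```python
# A toy "customers who bought this also bought" recommender:
# rank items by how often they co-occur in baskets with the given item.
from collections import Counter

PURCHASES = {
    "ann":  {"hoover bags", "history revision guide", "swim fins"},
    "ben":  {"hoover bags", "swim fins"},
    "cora": {"hoover bags", "history revision guide"},
    "dev":  {"swim fins", "mono fin"},
}

def also_bought(item, purchases=PURCHASES):
    """Items co-purchased with `item`, most frequent first."""
    counts = Counter()
    for basket in purchases.values():
        if item in basket:
            counts.update(basket - {item})   # count everything else in the basket
    return [i for i, _ in counts.most_common()]
```

The mapping of my preferences against others is just co-occurrence counting, yet it is exactly this counting that decides which ‘forgotten bits of culture’ get surfaced for me.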

What had I been missing? What friendly, self-affirming world had I separated myself from by turning off tracking and not using Facebook? I’d denied myself even the decision to let Fb decide what I see. Am I doomed to be alone and un-liked with my own dull agency, forced to wander about to achieve serendipity myself instead of having it tastefully sprinkled on top of my carefully-aimed long tail niche cappuccino of recommendations?

“Recommendation algorithms map our preferences against others, suggesting new or forgotten bits of culture for us to encounter” (Gillespie, 2012).

Author of my own destiny? Perhaps not, thanks. I wouldn’t know which of the 52,000 Facebook categories were mine (Beyond Boring? Underactive Thyroid? Paranoid Meanie?). But then I wouldn’t know that anyway,

Categorization is a powerful semantic and political intervention
(Gillespie, 2012).

Best kept hidden.

Is it really consume like crazy, like and retweet in overdrive, complete complicated cameos, share lolcats and link this to that – or –  walk the wilderness? I suspect it’s a bit more nuanced.

I created and later updated a Storify to make sense of other people’s experiences. Perhaps I should keep my settings turned on and just frustrate the algos. I could have fun. I should have believed the boiler room posters (proclamations of “the legitimacy of these functioning mechanisms” (Gillespie, 2012), part of the “providers’ careful discursive efforts” (p.16)) which assured me that my experience would be improved.

This articulation of the algorithm is just as crucial to its social life as its material design and its economic obligations
(Gillespie, 2012)

I should have heeded the signs in the (lack of) Control Room which shouted Cookies are Vital and threatened politely to forget which pages I like in Cyrillic. Manovich states that computer games are the “projection of the ontology of the computer onto culture itself” (Manovich, 1999, p.28); shouldn’t I just start to play?

But what was I doing with Storify? Temporarily fixing a contingent assemblage of student and teacher tweets sourced from filtered searches within the affordances of a particular technology? Was this,

the pedagogy of networked learning in which knowledge construction is suggested to be ‘located in the connections and interactions between learners, teachers and resources, and seen as emerging from critical dialogues and enquiries’
(Knox, 2014, p.51, quoting Ryberg et al., 2012)?

Was it like EDMOOC News, with

a set of dependencies and relations that entwine participants and algorithms in the production of educational space
(Knox, 2014, p.51)?

Not really, but getting closer.

As someone who regularly gets lost rather than turn on their GPS, changing my preferences isn’t going to be easy. Yet if I really want to map how “Complex algorithms and codes of the web shape and influence educational space” (Knox, 2014, p.52), untangle, as far as I can, the sociomaterial “procedures irreducible to human intention or agency” (p.53) and discern the power structures encoded in the code, I might have to take the plunge. Lucky I’ve got ten new costumes.

I should augment the number of actors in the “recursive loop between the calculations of the algorithm and the “calculations” of people” (Gillespie, 2012), lifesaving idealistic hopes and avoiding my cousins.

 

Recommended for me

 

Gillespie, T. (2012). The Relevance of Algorithms. in Media Technologies, ed. Tarleton Gillespie, Pablo Boczkowski, and Kirsten Foot. Cambridge, MA: MIT Press.

Knox, J. K. (2014). Active algorithms: sociomaterial spaces in the E-learning and Digital Cultures MOOC. Campus Virtuales, 3(1): 42-55.

Manovich, L. (1999). Database as a Symbolic Form. Millennium Film Journal, 34, pp.24-43.

Week 7 Monsters, mammoths, mammals and mammon

Monsters, mammoths, mammals and mammon

Image: Flickr https://www.flickr.com/photos/usfwsmtnprairie/5244106245 J. Michael Lockhart, USFWS

Henry James (1921) called Tolstoy’s War and Peace a ‘loose baggy monster’, just the epithet I need to describe this week. It’s been a bit all over the place and I haven’t stuck to the point. To begin with, I finished my micro-netnography which, as I hinted to Eli, had me take a detour both from the need to focus narrowly on one aspect of the mooc, and from the role of documenter by getting embroiled in philosophical distinctions. To turn this to advantage, I learnt a little phenomenology, found a friendly connection with the course leader (limes!) and thought about the necessity, the value and the restriction of setting specific tasks, the first two seeming to outweigh the last.

I created a precarious soundtrack for my philosophical road clip which got mangled in the upload to YouTube, so, now short of time, I bolted on a happy tune and left it at that. This was, of course, a mistake, quickly picked up by Daniel, but which made another important lesson for me. In the same way that words matter (noun and verb), my dissonant soundtrack performed a different meaning to the one I intended to express (at least some of the time). This cinematic literacy, evident in all my coursemates’ netnographies, is something I’m not well versed in, leading to an imperfect understanding of digital (and analogue) cultures.

All over the place, too, because I picked up an interesting podcast and paper which probably look forward to the next block rather than summarising this, added some random infrastructure thoughts and threw in the conservation ecology of the black-footed ferret.

James, H. (1921). Preface to The Tragic Muse. London: Macmillan & Co.

Netty Mologies

Lister (2009) states,

to understand contemporary net based media one must spend time online, not reading books (p.164)

This could equally apply to online communities and we can begin to understand different online groups and their cultures by participating in moocs, spending time on Facebook, Instagram or Twitter, belonging to special interest groups or playing online games (in other words, just being online).

Another way of beginning to understand online communities is to examine the lexicon and neologisms they develop. In his book Netymology, Tom Chatfield (2013) explores internet etymology, offering fascinating insight into how language both reflects and creates digital cultures.

The words we use say more about us than we usually realize. In a sense, they also use us – and never more so than when we’re speaking about what it feels like to be us

(Chatfield, 2013, p.22).

Here Chatfield is talking about our memory and how past connotations of the word have been superseded by new, technological ones. He continues,

Just as steam-powered machinery left its metaphorical mark during the industrial revolution, the language we bring to bear on our own minds is increasingly shaped by computing: from talk of “processing” and “downloading” ideas to acts like “rebooting” our attitudes …

(p.23)

This seems to echo my thoughts on computing terminology and the performativity of language. Lister, talking about Wikipedia and Google, remarks how these names have become common parlance,

The enormous success of Wikipedia has prompted all kinds of other “Wiki” based knowledge generating and sharing processes such that wiki has become a noun referring to a shared knowledge site as Google has become a verb meaning to find information

(p.xxx)

Chatfield informs us that the word “wiki” comes from the Hawaiian word for “fast” and the word “paideia” is Greek for “education”. The computing connotations of “wiki”, explains Chatfield, go back to 1995, when the programmer Ward Cunningham used the word to describe his creation of a website that lots of people could edit quickly.

Etymologies give us an historical perspective as we note that even neologisms are built upon stems of words with long histories or are based on existing ideas. What we often describe as new technology is better understood when we know the past it drags behind.

To illustrate, I have taken just a few of the terms Chatfield presents and put them on a glog:

Etymology Glogblog

Chatfield, T. (2013). Netymology, Quercus, London.

Week 4 weekly thoughts

Image S B F Ryan, Flickr, https://www.flickr.com/photos/47572798@N00/8397808475

I liked this music on Soundcloud because, as a set of variations on a theme, it serves as a melodic link between Blocks 1 and 2. From early cybercultures and their playful interpretations of the net, EDC is turning to concentrate on network-enabled community cultures and their meaning for education.

Looking back, variations on a theme make me think of our burgeoning ability to create iterations of our human selves as cyborgs, each slightly different from the original, although whether each is an improvement is subjective and up for debate.

Looking ahead, variations herald the start of my chosen MOOC, A Philosophical Road Trip. I chose this MOOC for its experiential introduction to phenomenology and for the mise en abîme effect of making an ethnographic study of students of phenomenology. It is a philosophy which urges an active observation of the world and of ourselves. It encourages us to explore and exploit the double take so that we waken from perceiving the world as expected and view it anew and differently: epoché.

Following this philosophy, I might uncover some of the tensions and obscured constructions behind what it is to become part of an online learning community. I may observe “tensions between the creative, open source practices of web media and the economic and commercial forces with which they react” (Lister, 2009, p.205), tensions between a Socratic understanding of knowledge delivery and theories of connectivism and distributed expertise (Stewart, 2013), and tensions between a free community sharing a common interest and a forced, ersatz participation.


Lister, M. et al. (2009). ‘Networks, users and economics’, in New Media: A Critical Introduction. London: Routledge, pp. 163-236.

Stewart, B. (2013). Massiveness + Openness = New Literacies of Participation? MERLOT Journal of Online Learning and Teaching, 9(2), pp. 228-238.

This summary is related to several Lifestream posts this week. I saved a YouTube clip of Fred Turner on Pinterest and watched it whilst I was at the gym, hence the blurred photo post on Instagram. I was squeezing the most use I could from a period of time.

In the video clip Turner describes how members of the US counterculture of the 60s, ambivalent towards technology, moved back to “the land” to rediscover themselves and form new communities with alternative values. Supporting this movement, a publication entitled The Whole Earth Catalog, dreamed up by Stewart Brand, came to “establish a relationship between information technology, economic activity, and alternative forms of community that would outlast the counterculture itself and become a key feature of the digital world” (2005, p.488). A full circle was turned: technology came to underpin and facilitate a community turning away from a politics that had spawned such affordances from the “large scale weapons technologies of the cold war” (p.488), yet ironically the community ultimately failed to “escape the pull of America’s technological and economic centers of gravity” (p.512).

Turner relates how technology was co-constitutive, alongside political, economic and social elements, of a new sociability upholding alternative values in a digital age. By following his exposition, I was able to gain more perspective on how technology interrelates with particular people in particular places for particular purposes at particular times (students, online, learning, now), and how these interrelations become cultures in the sense described by Hand (2008, p.18):

“There are new forms of circulation emerging which override or replace older modern structures, where culture has in a sense replaced the social …”

(Article saved in Dropbox)

Half-listening to the radio in the night, I tuned into a BBC World Service broadcast which, coincidentally, described another digital culture full circle (saved on Pinterest). This one concerned the rise of fake news accounts on the internet and the creation of “click-worthy” stories, especially prevalent during the US Presidential campaign, which gained traction and prominence on Google and Facebook and made millions through advertising. One way Facebook is attempting to combat the proliferation of false news (according to the broadcast) is to develop algorithms that identify recently-created news sites and demote them in the rankings of social media feeds. The tech giants’ algorithms promoted these news accounts, and now the companies must marshal new code to help quash them.

Hand, M. (2008). Hardware to everywhere: narratives of promise and threat, chapter 1 of Making digital cultures: access, interactivity and authenticity. Aldershot: Ashgate. pp 15-42.

Turner, F. (2005). The WELL and the Origins of Virtual Community. Technology and Culture, 46(3), pp. 485-512.