Monthly Archives: March 2017

Tweet! Nudging, shoving and smacking

Predictive analytics – will it be used for nudging?

Bradbury et al declare the project of behavioural economics is

to model the essential irrationality of choosers, and in so doing to render the flaws in their choosing predictable … then be used to make claims as to how social and economic systems might be designed to counteract individuals’ tendencies to make ‘bad’ decisions and to stimulate ‘good’ decisions.
(Bradbury, McGimpsey and Santori, 2012, p.250)

The Educause article similarly describes the concept of the nudge as a

theory which centers on prompting individuals to modify their behavior in a predictable way (usually to make wiser decisions) without coercing them, forbidding actions, or changing consequences.

These descriptions point to how ‘irrational’ student behaviour may emerge from learning analytics data to be met with helpful and gentle attempts at ‘correction’ in the students’ best interests.

It sounds plausible and paternalistic, yet whilst making a point of neither forbidding nor coercing the individual, the ‘choice architect’ or ‘policy maker’ is concerned with constructing a situation in which the ‘correct’ course of action is not only implicit, but foundational and pervasive. It is a dynamic bias-in-action under the guise of neutrality and provision of choice. Disingenuous too, because it advertises human irrationality as undesirable whilst sloping the ground towards the one choice it deems appropriate.

Bradbury et al describe this ‘liberal paternalism’ as ‘the co-option of behavioural economics for the continuity of the neoliberal project’ (p.255), with the Educause article citing economic reasons for its adoption in education settings:

The combination of automation and nudges is alluring to higher education institutions because it requires minimal human intervention. This means that there are greater possibilities for more interventions and nudges, which are likely to be much more cost- and time-effective.

Nudging and its more coercive or punitive variations, ‘shoving’ and ‘smacking’, carry the risk of inappropriate application through, for example, misinterpreting data or disregarding contextual detail excluded from it. Worse, the attempt to correct or eliminate irrationality is dangerous when the long-term effects of doing so are unknown, when what counts as ‘irrational’ is open to question, and when a single alternative is substituted by a determinedly non-neutral party. An attempt to curb our freedom to choose what one political project regards as ‘incorrect’ is an infringement of human rights, and those rights, particularly as they belong to students already dominated by institutional or commercialised powers, should be protected. As the article concludes,

with new technologies, we need to know more about the intentions and remain vigilant so that the resulting practices don’t become abusive. The unintended consequences of automating, depersonalizing, and behavioral exploitation are real. We must think critically about what is most important: the means or the end.

Bradbury, A., McGimpsey, I., and Santori, D. (2012). Revising rationality: the use of ‘Nudge’ approaches in neoliberal education policy. Journal of Education Policy 28 (2), pp. 247-267.

Pinterest! Dispositif

Just Pinned to Education and Digital Cultures: Foucault’s dispositif http://ift.tt/2mQsRqI

What is the dispositif?

I pinned this because I found it serendipitously whilst looking for the word diapositif, or negative (an old-fashioned film slide), for a short post I was writing on Jeremy’s Abstracting Learning Analytics blog. Foucault’s dispositif seems to sum up the complex interrelation of elements making up the mechanism or apparatus behind Learning Analytics:

From Wikipedia https://en.wikipedia.org/wiki/Dispositif

This requires some reading before I can properly decide if it’s relevant.

Bookmark! A thousand hours

http://www.telegraph.co.uk/culture/art/9570595/Edmund-de-Waal-on-his-new-exhibition-A-Thousand-Hours.html

Thinking about quantifying the learning student made me reflect on time on task and whether, if an accurate measurement could be taken, more time on task would correlate with greater success (with the usual caveat about defining success). I don’t think it is always a foregone conclusion, although mastery of a subject or skill is often characterised by the amount of time spent engaged in it. Time, here, is the amassed hours, days or years needed to become a pianist, a professor or a potter. Is it possible to make creativity correlations? Pinheiro and Cruz (2014) itemise a series of tests to measure creativity but suggest

that the phenomenon of creativity cannot be described by any of these tests alone, but only through a battery of joint measures

Mapping Creativity: Creativity Measurements Network Analysis

http://dx.doi.org/10.1080/10400419.2014.929404

 

from Diigo http://ift.tt/TXCD9V
via IFTTT

Instagram! Google ads and extreme content

From BBC News at http://www.bbc.co.uk/news/business-39325916

From the Guardian at https://www.theguardian.com/technology/2017/mar/21/youtube-advertisers-censorship?CMP=twt_gu

via Instagram http://ift.tt/2nsWPoa

These news items emphasise what happens when algorithms step outside our control, or when they aren’t sophisticated enough to do what we want at scale and human input is required.

They’re also telling of the sort of responsibility we want to take for the tasks and code we create.

Evernote! Week 9 weekly thoughts

Audio Week 9 weekly thoughts

The school inspection has taken place. Some data amassing was required, but most of it was conducted by humans interacting with each other in the real world. How long this will remain the case is up for question if our study and discussions about learning analytics this week hail the beginning of an inevitable phenomenon. Inspections in the future might be done remotely, with officials tapping into the school’s metrics, viewing dashboards and delving into detailed individual student action plans, predictions and prescriptions carefully compiled by the code. Even the psychological temperature of the pupils will be available remotely in real time.

I have swithered all week between a reactionary distrust of learning analytics – a concept of learning by numbers, with its ambition to instantiate a quantified student measured against coherently mapped knowledge domains – and an acknowledgment of the importance of research, along with a creeping suspicion that some of it might actually be useful. I confess, too, that my happiness and motivation indicators do nudge up a little each time an automated comment on my lifestream applauds me for a great post.

I have bundled up all my LA thoughts into one post (not such a heavy call on the algorithmic burden), although I sprinkled a few little comments on infographics, LA reports and modelling the student elsewhere as well as starting to contribute to Dan’s Milanote. I started my tweetorial tweets a bit early with a question which, for me, still hangs in the air.

I feel MOOCs on behaviourism and neuroscience coming on 🙂

Analysing Analytics

During snatched moments this week I have been thinking about algorithms and learning analytics, but in an uninformed and distracted way, as work has been busy. Yet this time was spent in a world semi-constituted and organised by algorithms without my really taking note, as Nigel’s tweet about the way emails get placed into Clutter folders reminded me,

and as even my own lifestream should have underlined as it filled with tweets and posts left uncommented.

I expressed my default position on Learning Analytics early on, but recognised the need to fight this instinct, or at least to examine it more carefully. Siemens’ suggestion that

For some students, the sharing of personal data with an institution in exchange for better support and personalised learning will be seen as a fair value exchange.
(Siemens, 2013, p.1394)

had compounded my involuntary rejection of LA as it packed so many contentious statements in one short sentence.

I took issue with the bargaining trope of data exchange for assured personal gain. I questioned who decides what ‘better support’ is and whether such a promise would hold out after the relinquishing of data. I remained suspicious of the student and institution arriving at a fair outcome when the power balance of that relationship is characterised by inequality. I was wary of ‘personalised learning’ and wondered what it really means and whether it would divest the learner of any of their own thinking skills.

At the week’s end, when I could read more, I discovered Jisc’s counter to my worry,

Students maintain appropriate levels of autonomy in decision making relating to their learning, using learning analytics where appropriate to help inform their decisions.

I remained sceptical, however, because for some students reflection and meta-cognition are not easily achieved (nor always introduced and encouraged) and an effort to develop them may more simply be contracted out to graphs and graphics, leading to a misunderstanding of what counts in learning.

After reading Siemens (2013) my head was full of buzzwords such as actionable insights. I consoled myself by deciding actionable is not a word, but when I looked it up, I found its definition to be rooted in law and, seemingly, marketing, which was indeed insightful.

I had to keep reminding myself (and being reminded) that politics and power struggles happen with or without algorithms, and not to fall into the trap of ‘algorithms bad, no algorithms good’. (What is the opposite of an algorithm? Chaos? Proper choice? Manual?) I didn’t think their pervasive and deep penetration of our daily lives was a reason not to examine them and get a measure of their scope, dangers and failings, in accordance with Beer’s acknowledgement of

a sense that we need to understand what algorithms are and what they do in order to fully grasp their influence and consequences
(Beer, 2017, p.3)

Kitchin (2017) offers “six methodological approaches” (Abstract) to understanding them such as spending time with coders, conducting ethnographies, reverse engineering and witnessing others doing so.

Sociotechnical

I did, of course, get ensnared in thinking that algorithms are dissociable from the sociotechnical world they co-constitute – especially frustrating, as I see exactly how coded IF statements are firmly rooted in context: IF … THEN … ELSE …, where the ellipses stand in for prescriptive descriptions of the very detail of our lives and can comprise further nested IF statements, or containers into which variables are poured – by us, or by other algorithms. Such complexity, interrelation and recursiveness make these codes seem at once “neutral and trustworthy systems working beyond human capacity” (Beer, 2017, pp.9-10) and organic, mutable things, requiring from time to time the hand of the putative “viewer of everything from nowhere” (the fictitious person alluded to in Ben Williamson’s lecture) to make the fine adjustments called tweaks. The hand that tweaks is firmly located, but hidden – often in financial, commercial, government or educational institutions – engaged in a secret and protected remit to organise and present the knowledge that ensures their continued power.
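A minimal sketch of what I mean by prescriptive nested IF statements – all names, labels and thresholds here are invented, not taken from any real learning analytics system – showing how hand-‘tweaked’ numbers sit hidden inside the logic that classifies a student:

```python
# Invented thresholds: who decided three logins means "engaged",
# or that 40% marks a student "at risk"? The tweaking hand is hidden here.
LOGIN_THRESHOLD = 3
SCORE_THRESHOLD = 40.0

def classify_student(logins_per_week: int, avg_score: float) -> str:
    """Return an intervention label from two crude behavioural measures."""
    if logins_per_week < LOGIN_THRESHOLD:
        if avg_score < SCORE_THRESHOLD:
            return "at risk: trigger tutor contact"
        else:
            return "disengaged: send nudge email"
    else:
        if avg_score < SCORE_THRESHOLD:
            return "struggling: suggest support materials"
        else:
            return "on track: no action"

print(classify_student(1, 35.0))  # at risk: trigger tutor contact
print(classify_student(5, 72.0))  # on track: no action
```

Each branch is a prescriptive description of a life reduced to two numbers; adjust either constant and the same student falls into a different category.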

As Beer, quoting Foucault, makes the point,

… the delicate mechanisms of power cannot function unless knowledge, or rather knowledge apparatuses, are formed, organised and put into circulation.
(Beer, 2017, p.10)

Manovich (1999, p.27) states that the point of the computer game is the gradual revealing of its hidden structure – the exact opposite of the algorithm, which operates by stealth to confound our mapping of it. Algorithms all too easily offer themselves as inscrutable and indecipherable, attributes which supply their perfect camouflage of objectivity and neutrality, as mechanisms for avoiding the bias and prejudice of messy human judgement. Commenting on the twofold “translation of a task or problem” into code, Kitchin states

The processes of translation are often portrayed as technical, benign and commonsensical
(Kitchin 2017, p.17).

Information gathering

It is recognised that Learning Analytics needs to gather information from multiple data points across distributed systems to better map and model the learner in recursive processes. Inherent in this gathering are decisions about what to collect, from where and how, with each of these decisions dependent on the platforms and software that capture the information and which have encoded in them their own particular affordances, constraints and biases. Once the data is aggregated by another encoded fitment, decisions have to be made on how to interpret it, and comparisons drawn against typical and historical models, in order to arrive at what might be predicted or trigger action. Siemens (2013) himself outlines problems of data interoperability,

distributed and fragmented data present a significant challenge for analytics researchers
(Siemens, 2013, p.1393)

This complex sociotechnical construction is not in any way an objective, systematised analysis of authentic behaviour, but a range of encoded choices afforded by particular software and programming languages, made by living and breathing individuals acting on a range of motivations, to construct a supposedly more, but probably less, reliable image of the student. The construction of LA will favour some but perhaps inhibit, repel, harm or exclude others.
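As a hedged illustration of those encoded choices – the activity log, event names and collection rules below are entirely invented – two different decisions about what to count yield two quite different images of the same student:

```python
# One student's week, as a log of events (invented data).
activity_log = [
    {"event": "forum_post", "minutes": 25},
    {"event": "video_view", "minutes": 3},
    {"event": "quiz_attempt", "minutes": 10},
    {"event": "offline_reading", "minutes": 90},  # invisible to most platforms
]

def image_a(log):
    # Encoded choice A: count only platform-tracked "active" events.
    tracked = {"forum_post", "quiz_attempt"}
    return sum(e["minutes"] for e in log if e["event"] in tracked)

def image_b(log):
    # Encoded choice B: count everything reported, including offline study.
    return sum(e["minutes"] for e in log)

print(image_a(activity_log))  # 35 minutes: looks like a disengaged student
print(image_b(activity_log))  # 128 minutes: looks like a committed one
```

Neither image is the student; each is an artefact of what the collecting software could, and chose to, see.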

In addition, learning analytics posits the educational project as reducible to numbers, as a discernible learning process which may be audited and in which

‘dataveillance’ functions to decrease the influence of ‘human’ experience and judgement, with it no longer seeming to matter what a teacher may personally know about a student in the face of his or her ‘dashboard’ profile and aggregated tally of positive and negative ‘events’
(Selwyn, 2014 p.59)

Patterns

Learning Analytics attempts to seek out patterns, which naturally raises the question: what about the data which falls away from the pattern cutter?

Another danger of pattern searching is voiced by boyd,

Big Data enables the practice of apophenia: seeing patterns where none actually exist
(boyd, 2012, p.668)

Patterns are concerned with data that recurs, and they fail to take account of the myriad minute, varied details in which crucial contextual information may lie,

Data are not generic. There is value to analysing data abstractions, yet retaining context remains critical, particularly for certain lines of inquiry. Context is hard to interpret at scale and even harder to maintain when data are reduced to fit a model.
(boyd, 2012, p.671)
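A small invented example of what boyd describes – reducing rich records to fit a model’s fixed features silently discards the context in which the numbers arose:

```python
# Two students with identical scores but very different contexts (invented).
records = [
    {"student": "A", "score": 38, "note": "bereavement during exam week"},
    {"student": "B", "score": 38, "note": "did not open the course materials"},
]

# The model accepts only numeric features; the note never makes it in.
FEATURES = ["score"]
reduced = [[r[f] for f in FEATURES] for r in records]

print(reduced)  # [[38], [38]] - two different situations, now indistinguishable
```

Once reduced, no downstream analysis, however sophisticated, can recover the distinction between the two students.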

Siemens (2013) too, alludes to the difficulty in getting the measure of the individual,

recognizing unique traits, goals, and motivations of individuals remains an important activity in learning analytics
(Siemens, 2013, p.1383)

So much for my own objectivity and neutrality; I seem to have fallen back into that pit whose muddy walls are white and mostly black. Struggling back out, I voiced my concerns in the tweetorial, but attempted to remain open-minded,

If this state of affairs, which is learning analytics today, is surfaced and properly taken into account, the endeavour shouldn’t be rejected out of hand, but investigated, honed and trialled to see if it can usefully help us understand the conditions for learning as well as support learners. It should be done in full partnership with students, enabling a more equal and transparent participatory experience, as the University of Edinburgh’s LARC project demonstrates.

The significant barriers to LA, ethics and privacy, can be foregrounded and regarded as “enablers rather than barriers” (Gašević, Dawson and Jovanović, 2016) as the editors of the Journal of Learning Analytics encourage,

We would [also] like to posit that learning analytics can be only widely used once these critical factors are addressed, and thus, these are indeed enablers rather than barriers for adoption (p.2)

Jisc has drawn up a Code of Practice for learning analytics (2015) which does attempt to address issues of privacy, transparency and consent. For example,

Options for granting consent must be clear and meaningful, and any potential adverse consequences of opting out must be explained. Students should be able easily to amend their decisions subsequently.
(Jisc, 2015, p.2)

Pardo and Siemens (2014) identify a set of principles

to narrow the scope of the discussion and point to pragmatic approaches to help design and research learning experiences where important ethical and privacy issues are considered. (Abstract)

Yet even if the challenges of ethics and privacy are overcome, there remains the danger that learning analytics reveals only a very pixelated image of the student, one which might place her at a judged disadvantage – an indelible, skewed blueprint existing in perpetuity and following her to future destinations. That this should be the case is not surprising if we consider that a sociomaterial account of learning analytics foregrounds its complex mix of the human, the technical and the material, performing both an analysis and an analysand through a partial apparatus of incomplete measurement. The encoded institution’s audit, met with the absence of student context or nuance, means that LA will struggle to give anything other than general actionable insights.

http://fiona-boyce.deviantart.com/art/Pixelated-ID-192825081

References

Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), pp.1-13.

boyd, d. and Crawford, K. (2012). Critical questions for Big Data. Information, Communication & Society, 15(5), pp.662-679.

Gašević, D., Dawson, S., Jovanović, J. (2016). Ethics and privacy as enablers of Learning Analytics. Journal of Learning Analytics, 3(1), pp.1-4.

Jisc, (2015). Code of practice for learning analytics. Available at: https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics

Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), pp.14-29.

Manovich, L. (1999). Database as a symbolic form. Millennium Film Journal, 34, pp.24-43.

Pardo, A., Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), pp.438-450.

Selwyn, N. (2014). Distrusting Educational Technology. Routledge, New York.

Siemens, G. (2013). Learning Analytics: the emergence of a discipline. American Behavioral Scientist, 57(10), pp.1380-1400.

Williamson, B. (2017). Computing brains: learning algorithms and neurocomputation in the smart city. Information, Communication & Society, 20(1), pp.81-99.