This last week we looked at the summary of our Learning Analytics exercise and tried to decipher the meaning of the data collected. While I had the most Tweets, there is no discernible information indicating the quality of those Tweets in relation to the questions asked by Jeremy, James, and others in the class. The data was largely quantitative: it measured the number of Tweets, the words used most often, who made any comments at all, and so on. Little, if anything, that I saw reflected the quality or relevance of any comment to the stated discussion topics.
A simpler, more objective indication of the overall participation of class members would be the total number of Tweets by each member. This would indicate a willingness to engage with each other on the Twitter platform but, again, would not necessarily represent the quality of the exchanges. An example would be the number of posted cheese jokes (some of which I found very funny). The data mined from the exercise does not reflect the number of cheese jokes or the reactions to them, unless you count the number of times the words “cheese” or “cheesy” were mentioned. Even that number might be misleading if the words appeared in a post addressing a different issue. What was of interest, however, was the resulting discussion among classmates about the quality of the overall data collection in relation to its quantity, and how the two may be co-reflective. Some class members posted their own versions of the LA assessment and what the data meant to them.
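A crude version of the keyword-counting metric described above can be sketched as follows. The sample Tweets and the `keyword_mentions` helper are invented purely for illustration; the point is that a raw count cannot tell a cheese joke from a post that merely happens to mention cheese.

```python
from collections import Counter
import re

# Hypothetical sample of class Tweets (invented for illustration).
tweets = [
    "Why did the cheese fail its exam? It was too crumbly!",
    "Great point about Learning Analytics, Jeremy.",
    "This data feels a bit cheesy to me, honestly.",
    "The cheese market dataset has nothing to do with jokes.",
]

def keyword_mentions(posts, keywords):
    """Count how often each keyword stem appears across all posts."""
    counts = Counter()
    for post in posts:
        # Lowercase and split into alphabetic words.
        words = re.findall(r"[a-z]+", post.lower())
        for kw in keywords:
            # A stem match catches "cheese" and "cheesy" alike.
            counts[kw] += sum(1 for w in words if w.startswith(kw))
    return counts

# The stem "chees" matches three posts, but only two are jokes:
# the count alone cannot distinguish them.
print(keyword_mentions(tweets, ["chees"]))  # → Counter({'chees': 3})
```

The mismatch between the count (3) and the number of actual jokes (2) is exactly the ambiguity noted above: frequency data alone says nothing about the context or quality of a mention.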
The comments from my classmates did, however, bring to mind some interesting reading I had been doing on the ethics of technology, which I think fits into what we as a class have been discussing. I have been looking over a couple of pieces by Verbeek and Foucault that assert humans should not take an “outside position” when assessing technology but rather a “limit attitude” (Foucault 1997), whereby we do not focus on the ethics of having technology but rather on how the technology is designed and implemented. In other words, we as humans are involved in the design and implementation of the technologies that govern, or steer, our lives (Verbeek 2013). To clarify, an “outside” stance could be interpreted as oppositional to technology, whereas in the “limit attitude” the individual stands on the border of the technology’s application (but within its sphere of influence) and assesses its value from that point of view. Braidotti (2013) quotes Verbeek as stating, “. . . technologies contribute actively to how humans do ethics” (Verbeek 2011). This statement implies to me that technology, including data mining, or Learning Analytics, is meaningful only when humans use it as a means to revise their lives rather than relying on simple statistics that may not accurately portray real-life circumstances. This ties in very well with the assertions of Foucault and Verbeek that we should be active participants in the gathering, analysis, and application of data from the technologies we use.
This position may be wise in terms of our Learning Analytics exercise. Rather than asking only whether the collected data is valid, we need to understand its purpose and how that data is collected. We can then be in a better position to partake in the creation of the application and any revisions that may be necessary, and to make viable decisions about how the data is used in our lives, wherever that may be. The ultimate objective, or at least one of them, is to use that data to assess how successful our exercise or assignment was.
“In the context of technology this means that the frameworks from which one can criticize technology are technologically mediated themselves. We can never step out of these mediations. The most far we can get is: to the limits of the situation we are in. Standing at the borders, recognizing the technologically mediated character of our existence and our interpretations, we can investigate the nature and the quality of these mediations: where do they come from, what do they do, could they be different?” (Verbeek 2013).
Per Foucault and Verbeek, we need to look at the data from within, standing at the border of its application, and become a part of how that data is collected and eventually used. This puts a more human element into the ethics of the assignment, placing a value on it rather than seeing it as a collection of sterile numbers and charts. Verbeek (2013) asserts, along with Foucault (1997), that technology is a part of our lives. I interpret this to mean that I should accept the presence of technologies, embrace them, and work within their parameters, using them to enhance my life and work, rather than taking an outside stance and continuing to assess whether the technologies should be part of my life in the first place.
Finally, I will end this with a quote from Verbeek which, I believe, sums up what I am trying to express:
“While we cannot conceive of ourselves as autonomous beings anymore, because of the fundamentally mediated character of our lives, we can still develop a free relation to these mediations. Without being able to undo or ignore all of them, we can critically and creatively take up with them. Being a citizen in a technological society requires a form of ‘technological literacy’. Not in the sense that every citizen needs to understand all technical details of the devices around them, but in the sense that we develop a critical awareness of what technologies do in society” (Verbeek 2013).
Braidotti, R. (2013). The Posthuman. Cambridge, UK; Malden, MA: Polity Press.
Foucault, M. (1997). “What is Enlightenment?” In M. Foucault, Ethics: Subjectivity and Truth, edited by Paul Rabinow. New York: The New Press.
Verbeek, P.P. (2011). Moralizing Technology: Understanding and Designing the Morality of Things. Chicago, IL: University of Chicago Press.
Verbeek, P.P. (2013). “Technology Design as Experimental Ethics”. In S. van den Burg and Tsj. Swierstra (eds.), Ethics on the Laboratory Floor. Basingstoke: Palgrave Macmillan, pp. 83–100.