Andreas Schleicher talks about the PISA test. This is a global measurement that ranks countries against one another and uses the data to help schools improve.
“Measuring how much time people spend in school or what degree they’ve got is not always a good way of seeing what they can actually do.”
PISA tests whether students can extrapolate from what they’ve learned and apply their knowledge in novel situations. Apparently we’re only so-so in the rankings of the readiness of our young people for today’s economy.
Most relevant to this algorithmic cultures block: “Data can be more powerful than administrative control or financial subsidy through which we usually run education.”
It’s a shocking indictment of how ineffective those sidebar adverts can be that I have to force myself to look at them. In contrast, the ads that appear in the timeline are virtually impossible to miss, which is probably why people find them more annoying.
Possible learning here for the placement of information in e-learning design.
I’ve decided to use amazon.co.uk to experiment with its algorithms. I’ve noticed in the past how quickly searched-for items start to appear in other places, such as Google searches and Facebook, but this will be the first time I have attempted to document the speed with which advertising responds to my browsing habits.
My methodology will be to search repeatedly for something unusual, specific and commonly available that I have never searched for previously. I will then watch for advertising appearing within search results and other web pages, Facebook and any other applications that incorporate advertising.
I’m also interested to see whether adding an item to an Amazon ‘wish list’ increases the speed or likelihood of algorithm-driven advertising appearing.
One potential issue is that my son buys from Amazon via my Prime account to take advantage of the one-day delivery, and this sometimes skews recommendations towards products he has been searching for.
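For the diary itself, a simple logging script could keep the observations consistent. This is only an illustrative sketch, not part of the experiment as actually run; the file name and fields are my own invention:

```python
# Hypothetical observation logger for the ad-tracking diary (illustrative only).
import csv
from datetime import datetime
from pathlib import Path

DIARY = Path("ad_diary.csv")  # invented file name

def log_observation(platform: str, note: str) -> None:
    """Append one timestamped observation to the diary CSV."""
    new_file = not DIARY.exists()
    with DIARY.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["timestamp", "platform", "note"])
        writer.writerow([datetime.now().isoformat(timespec="minutes"),
                         platform, note])

log_observation("Amazon web", "Searched for 'wax melt warmer'")
log_observation("Facebook", "Timeline suggesting wax burners")
```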
Diary tracker:
Wednesday 08/03/2017
First search @ 22:30 for a wax melt warmer from Amazon’s web interface in the Chrome browser, which is signed into my Google+ account.
Followed several recommended product links @ 22:39
Searched for scented wax heater @ 22:51
Searched for wax melter @ 22:53
Searched for electric wax melter in Amazon Android app on phone @ 23:16
Amazon web home page ‘Related to items you’ve viewed’ and ‘Inspired by your browsing choices’ panels now include lots of wax burners, scented wax blocks etc. @ 22:30, but this might have happened sooner
Continued searching on the Amazon Android app, which is also now showing wax melts
Thursday 09/03/2017
20:11: the first opportunity I’ve had to look at other applications. My Facebook timeline is now showing ‘xxxx likes Amazon’ and links to suggested products, including lots of wax burners and wax melts.
Now going to try putting a high-ticket item into my public wish list to see if this appears more quickly than searched-for items.
Added a £2,000 flat-screen TV to my wish list; this instantly updated the ‘related to items you’ve searched’ section on the home page. Searching again for wax melters replaces the list of TVs in that section with wax melters, so it appears that the algorithms behind these information panels do not privilege high-ticket items over low-cost ones.
A new panel further down the home page is now showing ‘Popular affordable TVs’
It appears that high-ticket items appear on Facebook no more quickly than other items. Sponsored advertising seems to be stuck on the Canon printer I searched for at lunchtime. That search was via Google, so I’m now going to try searching for a TV through Google and then clicking through to Amazon to see whether that makes any difference to the speed of ad updates.
The author of the above article, Vinod Khosla, sets out a case for algorithms ‘diagnosing diseases and teaching high school’ in the future, with less able teachers ‘providing the human touch as mentors and coaches’.
I think there is some mileage in this argument, and some areas where it falls down. It’s certainly true that algorithms are already supporting the diagnosis of disease, such as those used in cervical cytology and the assessment of skin lesions, and one can imagine how the accuracy of disease identification could improve as more, and more varied, presentations of diseased and normal tissue are added to the networked algorithms’ data sets. Human doctors must rely on a similar process: recalling previous presentations with enough similarities to suggest a likely diagnosis, or consulting their more experienced colleagues and online resources, all of which is essentially algorithmic behaviour.
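As a crude illustration of that ‘recall the most similar previous cases’ behaviour, a nearest-neighbour classifier labels a new presentation with the majority diagnosis of the most similar examples it has already seen. The features and labels below are entirely invented; real diagnostic systems are far more sophisticated.

```python
# Toy nearest-neighbour "diagnosis": label a new case by majority vote of the
# most similar previously seen cases. Data is invented for illustration only.
from collections import Counter
import math

# (feature vector, diagnosis) pairs standing in for previously seen presentations
seen_cases = [
    ((0.9, 0.1, 0.8), "abnormal"),
    ((0.8, 0.2, 0.7), "abnormal"),
    ((0.1, 0.9, 0.2), "normal"),
    ((0.2, 0.8, 0.1), "normal"),
]

def diagnose(new_case, k=3):
    """Return the majority diagnosis among the k most similar stored cases."""
    by_similarity = sorted(seen_cases, key=lambda case: math.dist(case[0], new_case))
    nearest_labels = [label for _, label in by_similarity[:k]]
    return Counter(nearest_labels).most_common(1)[0][0]

print(diagnose((0.85, 0.15, 0.75)))  # -> "abnormal"
```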
One could imagine how similar algorithms might help someone learn and recall facts, or learn a language: in essence, anything that is binary, where there’s a right and a wrong answer. We’re already seeing tools such as Memrise, a language-learning app, being developed and made available on mobile devices.
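A very rough sketch of how an algorithm could drive this kind of right-or-wrong recall is the Leitner system, where facts move between review ‘boxes’ depending on whether the answer was correct. To be clear, this is not how Memrise actually works internally, just an illustration of the general idea.

```python
# Minimal Leitner-style scheduler: correct answers promote a fact to a less
# frequently reviewed box; wrong answers send it back to box 1.
facts = {"bonjour": "hello", "merci": "thank you", "chien": "dog"}
boxes = {fact: 1 for fact in facts}       # every fact starts in box 1
REVIEW_INTERVAL = {1: 1, 2: 3, 3: 7}      # box -> days between reviews

def review(fact: str, answer: str) -> int:
    """Update the fact's box based on the answer and return the new box."""
    if answer.strip().lower() == facts[fact]:
        boxes[fact] = min(boxes[fact] + 1, 3)   # correct: promote (max box 3)
    else:
        boxes[fact] = 1                          # wrong: back to daily review
    return boxes[fact]

review("bonjour", "hello")   # promoted to box 2, reviewed every 3 days
review("chien", "cat")       # wrong, stays in box 1, reviewed daily
```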
Where I think the argument for algorithm-driven teaching could fall down is captured by Siemens (2013): “The learning process is creative, requiring the generation of new ideas, approaches, and concepts”, whereas “analytics, in contrast, is about identifying and revealing what already exists”. Analytics provides the data that algorithms use to solve problems, so one can see how an algorithm would struggle to understand creativity, new ideas and new approaches, other than by classifying them as being like something identified correctly previously.
Earlier this week I was listening to a university professor explaining how self-driving cars will become safer because they will pass on data about any collisions they do have to all the other self-driving cars. This is probably true, but an ability to identify similar sets of circumstances could never make them completely safe without removing human creativity and unpredictability from the equation, by having no human drivers and no pedestrians in the vicinity of the vehicles. Similarly, computerised teachers could learn and share successful teaching methods with other computers on the network, but I’m not sure how they would teach a student to make new and unexpected connections. However, technology continues to advance, and this video would suggest that artificial intelligence could be given the capacity to reason, rather than merely complete a predetermined set of steps.
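The ‘share what one car learns with every other car’ idea can be sketched in a few lines: each vehicle writes what it has observed to a shared store, and every other vehicle can read it back immediately. This is purely a toy illustration of the networked-learning point, nothing to do with how real fleets exchange data.

```python
# Toy illustration of networked agents pooling what each one learns.
shared_incidents = []   # stands in for a fleet-wide store of collision reports

class Car:
    def __init__(self, name: str):
        self.name = name

    def report(self, circumstances: str) -> None:
        """Add this car's experience to the store shared by all cars."""
        shared_incidents.append((self.name, circumstances))

    def known_hazards(self) -> list:
        """Every car can see circumstances reported by any other car."""
        return [c for _, c in shared_incidents]

car_a, car_b = Car("car_a"), Car("car_b")
car_a.report("cyclist emerging from behind a parked van")
print(car_b.known_hazards())  # car_b now 'knows' about car_a's incident
```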
Another article I found online this week suggests a compromise might be the more immediate outcome. The article, titled “Could online tutors and artificial intelligence be the future of teaching?”, suggests that a new software platform “will become one of the first examples of artificial intelligence (AI) software being used to monitor, and ideally improve, teaching.” Tom Hooper, chief executive of the company that created the platform, said: “We’re looking to optimise lessons based on the knowledge we gain. We’ve recorded every lesson that we’ve ever done. By using the data, we’ve been trying to introduce AI to augment the teaching.”
“Initially, the company’s 300 tutors will receive real-time, automated interventions from the teaching software when it detects that a lesson may be veering off-course.” It will be interesting to see how these tutors respond to the software effectively monitoring and feeding back on their performance in real time. I also wonder whether there will be cultural differences in how teachers in other countries respond to the same type of input. The current tutors are based in India and Sri Lanka. The following table (Weil and Rosen 1995) suggests either that attitudes to technology in the region have progressed at pace over the past two decades, or that the human aspect of this venture may prove the more problematic.
References:
Khosla, V. (2012) Will we need teachers or algorithms? TechCrunch, 15 January 2012.
Siemens, G. (2013) Learning analytics: the emergence of a discipline. American Behavioral Scientist, 57(10): 1380-1400.
Devlin, H. (2016) Could online tutors and artificial intelligence be the future of teaching? The Guardian. https://www.theguardian.com
Weil, M.M. and Rosen, L.D. (1995) The psychological impact of technology from a global perspective: a study of technological sophistication and technophobia in university students from twenty-three countries. Computers in Human Behavior, 11(1): 95-133.
Apparently it’s either a setting we don’t have access to or one that needs a plugin. I’ve mentioned before that the idea of embedding content from lots of sources has been problematic for me. My recommendation for future iterations of this course would be to ensure that all the major embed formats are fully supported.
“Hamilton and Friesen suggest that educational research is dominated by instrumentalist or essentialist perspectives, the former viewing technology as the transparent means to accomplishing educational aims, and the latter assuming innate and absolute properties (2013). These determinist perspectives maintain a separation between human beings and technology that posit either as the driving force that regulates and controls the other. Drawing from Dahlberg (2004), Kanuka suggests that educationalists tend to adopt one of three positions: ‘uses determinism’ involving the view that technology is a transparent tool for the realisation of educational aims (aligning with instrumentalism); ‘technological determinism’ concerning the effects of technology on individuals and society (aligning with essentialism); and ‘social determinism’ which perceives societal contexts to drive changes and uses of technology (2008)”
“I suggest that both behaviourism and connectivism have tended to adopt determinist views: either perceiving technology to influence preferred conduct and supress undesired behaviour (Kanuka 2008), or to be the invisible means to achieving educational aims (Hamilton and Friesen 2013), in this case the formation of connections with other participants in the form of a Personal Learning Network (Siemens 2010, Kop et al. 2011).”
Good to see some appropriate and useful literature referenced here Nigel. I think you’re right to highlight and reflect upon the (often difficult and nuanced) aspects of ethnography, including the dilemmas of ‘insidership’, and we’ve only really scratched the surface with the block 2 task. The Research Methods course, later in the programme, will be a good opportunity to open up these issues again.
Week 7 ends up being rather busy, and I think many others have also focused on commenting on the micro-ethnographies – not a bad way to round up a block on community cultures!
It might be useful to continue some of your thinking about the observation of community as we enter block 3 and begin to uncover some of the algorithms at work on the web. While the researcher has ‘prejudices and biases’ (De Chesnay 2015) as you suggest, we might find that there are other ‘agencies’ having a say in what we are able to see.
from Comments for Nigel’s EDC blog http://ift.tt/2n9E4Cr
via IFTTT
Let the intellect alone, it has its usefulness in its proper sphere, but let it not interfere with the flowing of the life-stream. Daisetsu Teitaro Suzuki