Granieri’s article mentions that algorithmic culture can supply us with new information that matches our existing interests and preferences. Serving readers more of the content the algorithm believes they will want to read is what he describes as personalization. However, this also risks undermining other areas of interest, because the boundaries are set by our existing search preferences.
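To make that narrowing effect concrete, here is a minimal sketch of my own (not from Granieri or Knox) of a toy content-based recommender. The catalogue, tags, and scoring rule are all invented for illustration; the point is simply that once a single preference is recorded, the top suggestions quickly converge on the same kind of content.

```python
# Toy illustration only: a naive recommender that scores items by how many
# of their tags the user has already clicked on. One click on a cartoon is
# enough to push later suggestions towards "cartoon"/"kids" content.
from collections import Counter

CATALOGUE = [
    {"title": "Peppa Pig episode", "tags": {"cartoon", "kids"}},
    {"title": "Maths puzzles for kids", "tags": {"education", "kids"}},
    {"title": "Nature documentary", "tags": {"science", "documentary"}},
    {"title": "Another cartoon compilation", "tags": {"cartoon", "kids"}},
    {"title": "History podcast", "tags": {"history", "documentary"}},
]

def recommend(history, catalogue, k=2):
    """Rank unseen items by overlap with the tags the user has clicked before."""
    profile = Counter(tag for item in history for tag in item["tags"])
    unseen = [item for item in catalogue if item not in history]
    ranked = sorted(unseen,
                    key=lambda item: sum(profile[t] for t in item["tags"]),
                    reverse=True)
    return ranked[:k]

# Start from one click on a cartoon and follow the top suggestion each time.
history = [CATALOGUE[0]]
for _ in range(3):
    suggestions = recommend(history, CATALOGUE)
    print([s["title"] for s in suggestions])
    history.append(suggestions[0])
```

Even in this crude sketch, the documentary and history items never surface until the kids' content runs out, which is essentially the personalization trade-off described above.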
When I think critically about algorithmic culture, the BBC’s trending story on the “fake Peppa Pig” videos immediately comes to mind, together with my own daughter’s experience. As a four-year-old with very limited spelling and search skills, she could only type “peppa” or even just “pig”, yet she still managed to find the fake series on YouTube. I was surprised to see that “Toys and Funny Kids”, a channel that appears to be child-friendly, has accumulated over 5 billion views.
The third blog discussed some popular educational technologies that have emerged with the rise of algorithms. The majority of the tools, such as adaptive learning systems and process intelligence tools, are new to me. However, there is still room to validate whether they work well when applied to digital education in real cases.
Knox’s article reminds us to remain critical of algorithmic culture. Are the sources objective enough? Do we also examine the accuracy of the algorithmic results?
Reference:
Knox, J. (2015). Algorithmic cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (Ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1