I’m actually rather glad we have ‘come back’ to this topic at this stage in the course. It was in the very first week of our interaction that I relayed a story about my debate with colleagues, over a wine bottle or two, about the danger (in my opinion) of not fully evaluating the amount, kind and frequency of data collected through algorithms about users of the web on a daily, hourly and, probably, minute-by-minute basis. My position was that scheming and unscrupulous marketers and peddlers of various wares were potentially using this freely provided data to manipulate our behaviours and seduce us, unwittingly, into buying more stuff. And this doesn’t even begin to touch on the use of algorithms by governments and their affiliated agencies to potentially keep watch over us and society in general.
However, this aside, I have recently become quite intrigued by the process behind the now very commonplace GPS and its ability to indicate and present traffic flows for the road users who make use of it. The gargantuan flow of data, and the persistent analysis of that data using algorithms, must surely be fantastic to visualise, if that were possible. I wonder what manner of algorithm it could be, and just how quickly it sorts its way through such massive amounts of ever-changing bits and bytes.
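To make the idea a little more concrete, here is a minimal, purely illustrative sketch of one way such a system *might* work under the hood. The segment names, speeds and thresholds below are all invented for the example; real traffic services are vastly more sophisticated, but the core idea of folding a stream of GPS speed reports into a running average per road segment could look something like this:

```python
from collections import defaultdict

# Hypothetical sketch: assume each GPS ping reports (segment_id, speed_kmh).
# A real service aggregates millions of such pings; this toy version just
# keeps a running mean speed per road segment, updated in O(1) per ping.

def update(stats, segment_id, speed_kmh):
    """Fold one GPS ping into the running (count, mean) for its segment."""
    count, mean = stats[segment_id]
    count += 1
    mean += (speed_kmh - mean) / count  # incremental mean, no list of pings kept
    stats[segment_id] = (count, mean)

def traffic_level(mean_speed, free_flow=80.0):
    """Classify congestion by comparing mean speed to an assumed free-flow speed."""
    ratio = mean_speed / free_flow
    if ratio > 0.75:
        return "green"
    elif ratio > 0.4:
        return "amber"
    return "red"

stats = defaultdict(lambda: (0, 0.0))
# Invented sample pings for two imaginary road segments:
for seg, speed in [("A1", 72), ("A1", 68), ("B7", 25), ("B7", 30)]:
    update(stats, seg, speed)

for seg, (n, mean) in sorted(stats.items()):
    print(seg, round(mean, 1), traffic_level(mean))
# → A1 70.0 green
# → B7 27.5 red
```

The speed comes from processing each ping once and discarding it, rather than re-sorting the whole dataset; that, presumably, is part of how these systems keep up with data that changes minute by minute.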
Sounds like a question to be answered this week…