Cyberculture and privacy

Being actively engaged with a topic inevitably heightens one’s awareness of any mention of it in everyday life. One such example occurred this morning on my commute to work. While I was half listening to the BBC’s Today Programme, the subject turned to cyber security and I was immediately more attentive.

The item concerned the use of social media tools by terrorists, from both a propaganda and an organisational perspective, and the inability of security agencies to access encrypted messages sent and received during the recent attack in London.

The debate centred on the relative benefits and pitfalls of social media providers creating a key or ‘backdoor’ to enable the security services to access encrypted messaging. The opposing view presented was that the rest of the public would suffer a loss of privacy as a result, and that such a backdoor would create a vulnerability open to exploitation.

This feels like another example of what Sian Bayne refers to as ‘complex entanglements’ (Bayne 2014). None of us want terrorists to be provided with unhindered means of organising attacks, and many might consider a loss of privacy a price worth paying. But what if that loss of privacy allows state-sponsored meddling in our democratic processes? What if our own security services were to misuse their powers and routinely access our day-to-day communications (the ‘snoopers’ charter’ debate)?

This all felt very pertinent to the privacy aspect of this ‘Algorithmic Cultures’ block. In one context, collecting and presenting data in a particular way might appear entirely appropriate. Perhaps what we need to consider is how else the data we collect might be used, and by whom.

References

Bayne, S. (2014) ‘What’s the matter with “technology-enhanced learning”?’, Learning, Media and Technology, 40(1), pp. 5-20.
