Not yet released, but here’s a book I hope will do good: https://t.co/BRmNQYazs5 @ahc thank you for writing it. Proverbs 1:7; 24:3-4 #mscedc

I caught sight of this in a Tweet, and I look forward to reading the book in due course:

Getting to the formal end of the ‘Community Cultures’ part of the module, I’m reminded by this book that the family – noting that it comes in all shapes and sizes – is a foundational communal element within our lives. Like all other communities, it’s shaped by digital cultures. And, thinking about education, for many learners family is a key constitutive element within their wider context for learning.

This book is written from a world-affirming but wisdom-inviting Christian perspective, which is close to my heart. It’s hard to assess from the contents page (see below) and other ‘look inside’ glimpses, but it looks promising: it seems neither to demonise technology nor to assume a fatalistic view of it. I’m hoping for a thoughtful, brave, creative and integrated read, with practical, everyday payoffs and suggestive avenues for life. Andy Crouch, I’m happy to review it, if you want to send me a copy. Paper, please…

 

 

from http://twitter.com/Digeded
via IFTTT

Google plans to tell users when to wash their mouths out: moderating our toxic talk by giving us Perspective https://t.co/Ja0e9A4Ht2 #mscedc

The oracle at Delphi offered ambiguous data. But carved into the stones at Delphi were instructions to ‘know thyself’ and ‘nothing in excess’. Such was ancient moderation, a theme much played out in the tales associated with those who sought the wisdom of the oracle.

Google, it seems, is offering a tech-based equivalent, according to this article from BBC News:

 

Google’s algorithmically derived offering wants to teach us to know ourselves, and to moderate our excess, in the face of the ambiguities of digital life. It’s called ‘Perspective’, which should attract a Donna Haraway-related concern about the ‘god trick’ involved or invoked here. (As an aside, both Haraway and Perspective are a long way from how John Calvin opens his ‘Institutes of the Christian Religion’, where he notes that the sum of almost all knowledge is knowledge of God and knowledge of self.) As one of those behind Perspective explains, according to this BBC News article:

“Jared Cohen of Jigsaw explains three ways Perspective could be used: by websites to help moderate comments, by users wanting to choose the level of rudeness they see in the online conversations they take part in, and by people wanting to restrain their own behaviour.”
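To make the first of those uses a little more concrete, here is a minimal sketch of how a site might score a comment with Perspective. It assumes the publicly documented REST endpoint and TOXICITY attribute; the API key, example comment and threshold are illustrative placeholders of my own, not details taken from the BBC article.

```python
# A minimal sketch (not from the article) of scoring a comment with
# Google's Perspective API, assuming its publicly documented REST endpoint.
# The API key, threshold and example comment are illustrative placeholders.
import requests

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def toxicity_score(comment_text):
    """Return Perspective's summary TOXICITY score (0.0-1.0) for a comment."""
    payload = {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(URL, json=payload)
    response.raise_for_status()
    data = response.json()
    return data["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

if __name__ == "__main__":
    score = toxicity_score("You are a complete idiot.")
    # A site might hide or flag comments above a chosen threshold,
    # or simply show the score back to the writer ('know thyself').
    print(f"Toxicity: {score:.2f}", "- consider rephrasing?" if score > 0.7 else "")
```

Whether a score like this induces any deeper change in the writer, or merely out-sources the judgement, is exactly the open question that follows.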

In anticipation of ‘Algorithmic Cultures’ to come, here’s another locus where ‘Community Cultures’ bleed and blend in. For learning and education, there are significant questions to be asked, not least as to whether this will induce deep learning, deep change, within users – or whether it’s an out-sourcing of emotional and social intelligence beyond the individual user, and another excuse for such ‘toxic talk’ online. As for freedom of speech, it’s not mentioned in this BBC News article, but how far is it from this kind of moderation to surveillance, or even to speech requiring permission? Concerns regarding ‘pre-crime’ under the anti-radicalisation measures and ‘British values’ inherent in the UK Government’s ‘Prevent’ agenda raise similar questions.

 

It might be worth recalling the third aphorism inscribed at Delphi: ‘A pledge comes from madness’. Google, beware. You’ve been warned!

 

from http://twitter.com/Digeded
via IFTTT

@philip_downey Not been to law school, but https://t.co/cRnN0NUi6U suggests law as more pragmatic, more performed – thus ‘teched’ #mscedc

Glad to have had this Twitter exchange; it’s clarified something for me. Here’s the exchange:

What it clarified for me was an extension I’ve been making in this course of some previous reading of this book:

Legal geography is a disciplinary sub-field that has developed since my time within formal academic geography, but I found this book really stimulating, and it has informed a chapter I’ve written for this book, forthcoming from Eerdmans – a chapter looking at Acts 21, in the New Testament part of the Bible:

I really like inter-disciplinary work. For me, it’s the essence of geography, and of life. The clarifying moment in this Twitter exchange for me was that I’ve been using Delaney’s work as a lens for looking at digital cultures.

Once I realised it, it was obvious what I was doing. But, as is often the case across disciplines, it takes a while to realise it. Now that I know what I’m doing, hopefully I’ll do it better. Thank you, Philip!

 

 

from http://twitter.com/Digeded
via IFTTT

More wars, and rumours of wars. This time, the Russians are coming. https://t.co/5P9feTb2pI Cold war rebooted? #mscedc

This picks up on my earlier post, entitled ‘Wars, and rumours of wars’. It also connects with issues of periodisation, and the historiography of digital cultures. As mentioned in the Radio 4 ‘Thinking Allowed’ programme earlier this week, the Cold War and its aftermath were integral to the ways in which digital technologies – and their associated cultures – have developed.

The link is from the BBC News:

As another integrative connection across my Lifestream, this piece also points to issues of security being wider than just hackers. The reported focus on control of information also links in with issues of fake news, post-truth and democracy explored in other Lifestream postings. As a further connection, the Lifestream posting on bots battling it out on Wikipedia takes a more sinister turn with the possibility of this kind of information warfare. I anticipate that an algorithmic dimension makes this much harder to track, to anticipate, and to defend against.

Moving on from Cold War 1.0, this seems to be less a matter of clashing ideologies, and more an attack – via the social, via digital cultures – on the operative outcomes of rival nation-states. The final comments in the piece about bloodless but paralysing interventions typify this. If the development of aviation in World War One anticipated the mass civilian bombings of World War Two, then this kind of nation-state policy, enabled via digital connections, is both a fascinating and a disturbing possibility.

Did the Cold War end, pause, or simply reshape? As mentioned in the above Tweet, perhaps ‘reboot’ is a suitable metaphor covering all three options. Cold War 2.0 – perhaps we hardly see it as yet.

 

from http://twitter.com/Digeded
via IFTTT

Constantly amazed by how digital techs shift legal issues. Here’s another instance: https://t.co/963RB06B3y If law, so too educn. #mscedc

A number of Lifestream posts this week have looked at how digital cultures inform, infuse and reshape legal deliberations. Here’s my latest observation of this restructuring at work, in a piece from BBC News:

It’s not that the law per se has changed: alleged murder remains alleged murder. But the ability to pursue and prosecute – and, indeed, to police – varies with technical changes and diffusion.

This predates the digital: for instance, Crippen was arrested for murder only with the aid of the telegram. What’s new with the digital is the range of issues raised for legal practice. Here, for instance, the ownership of data (interesting that it lies with Amazon, not the customer), and the right to privacy.

The First Amendment was not drafted with Echo smart speakers directly in mind, but the courts will need to work out what to do with them. And, just as technology restructures legal practices, so too it restructures educational practices. The stream of such instances in our press is unlikely to dry up in the near future.

from http://twitter.com/Digeded
via IFTTT

@dabjacksonyang I share your first suspicion. It’s all too eye-catching. But I wonder if both parts are true in your second sentence #mscedc

A Twitter exchange, regarding my earlier Tweet about Wikipedia as the turf for bot wars:

 

My reply picks up on thoughts expressed in an earlier Lifestream post, ‘Wars and Rumours of Wars…’ The title of that post picked up on the phrase’s end-times connotations in Mark 13:7.

 

from http://twitter.com/Digeded
via IFTTT

The ‘gig economy’ is not inherent to digital business: https://t.co/NWsAXhkXVH – or is it? https://t.co/BSbleDj1e3 Tangled web(s)… #mscedc

Two pieces in view and in contrast here, both reporting on senior executives from Amazon, Deliveroo and Uber appearing before a Select Committee of MPs within the Westminster Parliament.

The first, from the Guardian:

The second, from BBC News:

 

Judging by these two media reports, this group of MPs has, at least for a day, had its say about the ‘gig economy’. Employment rights, rates of pay, and fiscal revenue all came under the Committee’s scrutiny, as the legislature seeks to keep up with nimble and innovative business models.

These pieces read informatively off the back of the Radio 4 ‘Thinking Allowed’ programme earlier in the week, reported in an earlier Lifestream post – especially given that programme’s suggestion that the gig economy will not exist, in its present form, in five years’ time. What next, in this shifting assemblage of digital cultures? And what holds true in these businesses finds many, varied and unpredictable parallels – and divergences – within educational practices too.

These two reports also work well together, to show the different ‘spins’ which news can receive. Never mind fake news; news itself is complex enough.

 

from http://twitter.com/Digeded
via IFTTT

Wars, and rumours of wars: bots battling it out for ‘truth’ on Wikipedia. Unpredictable algorithms at work? https://t.co/XSorIvN44v #mscedc

This is a cleverly crafted piece from the Guardian:

Clever, in that it personifies (and illustrates) in a shockingly anthropomorphic fashion. The language of fights, battles, wars, and conflicts feeds an epic quality. A techno equivalent to the Greek pantheon, slogging it out within the sphere of us mere mortals. “Humans usually cool down after a few days, but the bots might continue for years.” The article even reports two bots arguing about God.

Clever, too, in that “some conflicts mirrored those found in society”, whereas “others were more intriguing.” A great mix of the familiar and the unfamiliar, another reworking of Bayne’s notion of the ‘uncanny’.

Clever, yet again, in that the piece shows how Wikipedia, often touted as communally constructed, a knowledge base of a ‘community culture’, is also interwoven with ‘algorithmic culture’. In the words of the underlying research article, “Wikipedia is an ecosystem of bots” and automated services; the research reported here focuses on editor bots, just one part of that ecosystem. There are also differences between territories, among the different language editions of Wikipedia. This stuff has its own geography, with its own strata and underlying plate tectonics.

If there is a geography, then there is also a need for ecological nous, management, or at least respect. In the words of the Guardian report, “The findings show that even simple algorithms that are let loose on the internet can interact in unpredictable ways.” Having wiped out the Dodo, and introduced the rabbit to Australia, what on earth have we let loose in Wikipedia? We will see. If we can, indeed, see it. “We know very little about the life and evolution of our digital minions.”
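One small way to start looking is to inspect the edit histories themselves. Here is a toy sketch, using the public MediaWiki API, that pulls a page’s recent revisions and crudely flags edits that look like bot reverts. The heuristics (usernames ending in ‘bot’, edit summaries mentioning a revert) and the example page title are my own illustrative choices, not the researchers’ method.

```python
# A toy sketch (my own heuristic, not the researchers' method) of inspecting
# one corner of Wikipedia's 'ecosystem of bots': pull a page's recent revision
# history from the public MediaWiki API and flag edits that look like bot
# reverts. Username and edit-summary matching here is deliberately crude.
import requests

API = "https://en.wikipedia.org/w/api.php"

def recent_revisions(title, limit=100):
    """Return the most recent revisions (user, comment, timestamp) of a page."""
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvlimit": limit,
        "rvprop": "user|comment|timestamp",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

def looks_like_bot_revert(rev):
    """Crude heuristic: a 'bot'-named account whose edit summary mentions reverting."""
    user = rev.get("user", "").lower()
    comment = rev.get("comment", "").lower()
    return user.endswith("bot") and ("revert" in comment or comment.startswith("rv"))

if __name__ == "__main__":
    # Illustrative page title; the Guardian piece mentions bots arguing about God.
    revs = recent_revisions("God")
    bot_reverts = [r for r in revs if looks_like_bot_revert(r)]
    print(f"{len(bot_reverts)} of the last {len(revs)} edits look like bot reverts")
```

Even a crude count like this hints at how much automated activity sits beneath a ‘communally constructed’ page.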

The article ends with a great bridge between our ‘Community Cultures’ and ‘Algorithmic Cultures’ blocks:

“Often people are concerned about what AI systems will ultimately do to people,” he [Hod Lipson] said. “But what this and similar work suggests is that what AI systems might do to each other might be far more interesting. And this isn’t just limited to software systems – it’s true also for physically embodied AI. Imagine ACLU drones watching over police drones, and vice versa. It’s going to be interesting.”

 

from http://twitter.com/Digeded
via IFTTT

You can run, but you can’t hide https://t.co/LER6OA1OH5 Big(gish) data helps quash fake news. Or old-fashioned investigative journo? #mscedc

Meanwhile, in sports news:

This has several delicious turns, for the ‘fake news’ and news/entertainment angles underpinning some recent Lifestream posts.

First, this is no ordinary amateur athlete. It’s a “prominent food blogger”. Digital cultures create and recycle their own personalities. I don’t know how many non-prominent food bloggers I know, nor what bearing it has on the ‘story’ in question!

Second, I love the notion of a business analyst who ‘in his spare time’ runs (the verb is sweet irony) a marathon-investigation website. Digital cultures have so many fascinating nooks and crannies. Now, I like a clean race as much as the next person, but I’m curious as to how one decides to set up such a website, and then how one gathers data for it.

Looking at the website’s treatment of this particular instance, it’s an amazing presentation of data, and telling of a story – even down to the suggestion of a motive: to join a ‘subgroup’ of ‘The Dashing Whippets’ called ‘The Performance Team’. Is this the lure of ‘community cultures’? Or am I being sucked into an elaborate piece of writing, located somewhere between news and entertainment? The more I look at it, the more it grows in entertainment value – and I have to presume it’s true…

Third, the confession of guilt (or, at least, of shame) is then made via an Instagram account, which is now deleted. More irony, or just hard-to-verify information, in a piece seemingly rich with data, right down to the individual’s performance data, ‘shared’ on the Strava website? On that note, beware: your data stream and digital footprint will find you out – e.g. Strava‘s flyby screen (apparently).

Fourth, though, this is, in many ways, standard old-fashioned investigative journalism, albeit with a digitally infused twist – what the BBC call “digital detective work”. Bob Woodward, one of the journalists behind the Watergate investigation and credited with the Washington Post‘s slogan mentioned in a previous Lifestream post, would at least recognise the effort that’s gone in here.

 

from http://twitter.com/Digeded
via IFTTT

‘Democracy dies in darkness’ https://t.co/bH3iyBXXAA but can ‘the people’ decide? All in the moral maze https://t.co/4jAhnUTvYQ #mscedc

Today I listened to last night’s Radio 4 broadcast of The Moral Maze, examining the morality of ‘fake news’. Perhaps I’d hoped for too much from one programme, especially after various previous posts on my Lifestream about the topic, but I found it a bamboozling discussion in terms of trying to come to a clear(er) mind about the issues. Off the back of the programme, and spurred by its passing reference to the Washington Post’s new-but-not-so-new slogan:

and as ongoing reflections, here are some provisional theses regarding fake news:

1. Democracy, if it can, needs to see in the dark. If it can’t, it’s in trouble. We need a light that can shine in the darkness, which the darkness cannot put out.

2. We get the news we want. Or at least, we don’t get more than that. Sometimes, and in some places, we get a lot less than that.

3. Given the bewildering proliferation of sources for news, as well as the communicative complexity within ‘news’, we can’t verify everything. News was, is, and will be – to some degree – a matter of trusting both source and interpretation. News is socially constructed. We can all try harder in constructing it. Relating with one another is inherent to both the problems and the solutions cohering around terms like ‘part-truth’, ‘post-truth’ and ‘fake news’.

4. For mortal humans, truth-in-news is more than a relativistic mirage, and is less than an absolute certainty. We need wisdom, humility and tenacity to live with that. Digital cultures haven’t created this epistemic situation, but they might well clarify its contours for us, even while creating some treacherous cliffs along the way.

5. A spectrum between ‘entertainment’ and ‘news’ is not a neat one, but is probably analytically important. Its polarities might be easy-ish to identify, but don’t boil down to ‘news = true’, or ‘entertainment = false’. The real action is in the spectrum’s blurred middle, especially in light of inevitable and continual mediation via editorial selection and control. News media are social media; social media are news media. Both are interested in market share and, often, in profitability.

6. Digital cultures introduce new technologies, novel business models, experimental assemblages. They accentuate lots of uncertainty. But that was all there before, too. Propaganda is a pre-digital term.

7. It’s probably helpful to distinguish deliberate lying from mistakes made through negligence or weakness which are then corrected. It might not be easy to discern between the two, but motivation and consequence matter.

8. But don’t ever expect ‘correction’ to remove error; networked relations preclude such easily controlled binary options. No-one (person) is – fully – in control. But nor is ‘the system’, in some easily identifiable kind of way.

9. Don’t forget ‘the people’. And don’t think individuals can’t be lazy, mistaken, uncritically happy with what they’re told, or unbelieving of – and resistant to – the truth. It looks like it can happen.

10. Algorithms are an unknown. They can be a folk-devil. But what if they are genuinely dangerous for truth, and how can we tell? Therein lies one of my rolling questions, on the cusp of entering the ‘Algorithmic Cultures’ block of the course.

 

 

While The Moral Maze didn’t address digital platforms specifically, they were implicit in the discussion – not least in the choice of Tom Chatfield, author of ‘How to Thrive in the Digital Age’, as one of the witnesses interrogated by the panellists.

Issues of ‘fake news’ are ontologically driven, but they are also a genre issue, in an era and space where genre categories collapse and blur and reform in unexpected and unstable formations. Even YouTube videos are faked, it seems, and that can be news…

 

from http://twitter.com/Digeded
via IFTTT