From learning analytics to writing analytics: tracking writing a novel (or a dissertation, perhaps?) https://t.co/hG61XskPAs #mscedc

Learning analytics can also be deployed in other areas of education, such as writing. In a week when Stuart Elden has blogged about prioritising the writing of academic books over articles, here’s a piece from the Guardian applying some simple analytics to the book-writing enterprise:

More complex analytics are imaginable. How they might change the writing process raises a complex of issues akin to those in the recent Tweetorial discussions.

 

from http://twitter.com/Digeded
via IFTTT

Arcadia, a post-genre app-novel: “three publishers, two designers, four sets of coders and a lot of anguish” https://t.co/w53kfYT4Cd #mscedc

In a digital extension of the ‘choose-your-own-adventure’ books of last century, Iain Pears wants to bypass “the limitations of the classic linear structure” while “making the technology the servant of the story rather than its master”.

Does he achieve it? I don’t know. Novel reading will have to wait until after this course. What interests me here is the explicit engagement with what we might call the poetics of analytics (my term, although others may have used it).

For instance, regarding his use of visualisation to track evolving plot, Pears comments: “On every occasion, the more satisfactory the appearance, the better the story read, and I still haven’t quite figured out how that works.” This emergent quality recurs in the recursive evolution of story and app, and in Pears’ claim to be loosening, even liberating, himself “from those shackles known as genres”. It’s quite likely he overstates the case at this juncture, but here is one instance of wider poetic strands within digital storytelling.

One paragraph of this article struck me as particularly suggestive in this regard:

“Ebooks are now quite venerable in computing terms, but it is striking how small an impact they have had on narrative structure; for the most part, they are still just ordinary books in a cheap format. An analogy is the early days of cinema, when film-makers did little more than plonk cameras in front of a stage and film a play. It took some time before they realised that by exploiting the new possibilities the technology offered – cutting, editing, closeups, lighting and so on – they could create a new art form that did not replace theatre, but did things theatre could not. Computing power properly understood and used can perhaps eventually do something of the same; not supplant orthodox books – which are perfectly good in most cases – but come into play when they are insufficient.”

At the same time, the hyperlink in that paragraph, on ebooks and the consistency of physical-book sales, suggests no easy replacement but complex, multiple forms – a healthy context for poetic innovation, with analytics and with more than analytics alone.

 

from http://twitter.com/Digeded
via IFTTT

What does your media feed you regarding algorithms? Ben Williamson casts some light: https://t.co/UhpAtlMXN4 Diverse everyday access #mscedc

I found this piece via the ‘recent posts’ on the ‘Code Acts in Education’ blog site hosting the Jeremy Knox piece on Abstracting Learning Analytics, set for this week’s readings.

I think it’s one of the most useful meta-interpretative pieces I’ve come across for understanding my own Lifestream. It doesn’t take long, on my Lifestream, to see that I read the Guardian. What Williamson does here is help position that perspective / bias / source, and help me see my own eyeball a little more, so to speak. While not strictly or only a digital algorithm, it is my own entanglement, and worth seeing as well as it can be seen, for what it is.

Williamson uses Google research results to characterise the editorial line taken by several UK newspapers in their reporting (or not) of algorithms. Thus:

  • The Guardian: ‘the algorithm as governor’.
  • The Telegraph: ‘the algorithm as a useful scientist’.
  • The Sun: “largely disinterested in algorithms in terms of newsworthiness”.
  • The Mirror: “treat[s] algorithms in terms of brainy expertise.”
  • The Daily Mail: ‘algorithms as problem-solvers’.

For my Lifestream, and from it, I’m unsurprised that the Guardian “appears to take the most critical editorial line”, and is the most explicit in emphasising “the connections between algorithms and politics”. This is my feed, most certainly.

But critical reflection about the spread of options is fascinating and vital, especially if, as Williamson projects, “new digital media literacy approaches to news consumption and information access are going to be crucial in coming years.” We talk a lot about entanglements on this course; Williamson points to the uneven and variegated nature of entanglements, even within old-school, fairly monolithic editorially-curated discourses. Within and beyond that:

“if we genuinely are concerned that algorithms are involved in political life by filtering and curating how we access information, then it’s perhaps concerning that these issues are much less well covered in papers from alternative political perspectives. Even what we know about filter bubbles and algorithmic curation is itself filtered and curated.”

Education is drawn into this morass, in that people learn their everyday knowledge from such divergent and relatively narrow perspectives. As Williamson concludes: “Developing forms of digital media literacy that attend to the role and power of algorithms in political and cultural life now appears to be real priority that will require dedicated attention in 2017.” Cue EDC…

 

from http://twitter.com/Digeded
via IFTTT

Lifestream summary, week nine

This week’s Lifestream feels like a multi-stepped journey, without a panoptic view of beginning and end. In part this reflects the Tweetorial’s velocity and scale; in part, the topics in view.

https://upload.wikimedia.org/wikipedia/commons/9/98/Algorithms-NetFlow1.png

To pick out some themes:

(a) Education and learning analytics have to inform each other: there is a two-way relationship between them – neither is a ‘thing in itself’;

(b) Learning analytics do not escape the influence of context. Thus, learning analytics need to be understood relationally and in a context-critical way – this requires both localised particularity and an eye towards globalisation;

(c) statistics don’t bleed, but learners still do – there are ethical entanglements with learning analytics which are inescapable;

(d) learning analytics are emblematic of wider algorithmic cultures, and need to be – and will be – understood within that wider cultural matrix;

(e) instant answers such as ‘learning to code’ are perhaps more instant than answers – and will generate new issues, new questions. At the very least, coding needs to go hand in hand with social theorising, even if the money follows the former not the latter. Particularly if the money follows the former, not the latter.

(f) as the final post for the week suggests, I’m surprised to think that learning analytics might even humanise learning, in some situations – but then I think this unexpected ‘turn’ reflects the breadth of possible contexts and entanglements. A particular assemblage can surprise, and be unexpected.

(g) this block of the course is by far the least stable in my thinking and experience – befitting its emergent qualities, but also its ‘big’ scale.


Follow the money: datafication and learning analytics with policy leads to reducing learners to, well, data. https://t.co/unpYaEqEbH #mscedc

This piece from the Guardian isn’t a simple diatribe against learning analytics. Indeed, it could be a plea for them. It’s an example of the ‘entanglements’ (Bayne’s term, if I remember right) in which learning analytics can be bound up. Here, the context clearly includes policies but also (and this I couldn’t fit in the Tweet) personnel, management, and pressures from funding bodies – and other stakeholders such as parents looking at league tables for ‘success’.

I think in the initial Tweet I was too negative about learning analytics. Rereading the article, and seeking to inhabit the institutional mind-set it communicates, perhaps learning analytics are the very thing that might help highlight more complex issues than mere completion, shine light on shady practices, and even help humanise learners reduced to pass rates.

I surprise myself, but learning analytics might be anti-reductionist in this and similar circumstances. No guarantee of this, nor a monopoly on it. But this was worth the initial Tweet and the second thought.

But I still think one needs to follow the money. Learning analytics in isolation are not the cure.

 

from http://twitter.com/Digeded
via IFTTT

Google ad controversy: what the row is all about

This follows on from the last post, the two of them working together.

I’m reminded, reading this piece, of two other aspects of globalisation. First, the division of sub-prime loans among banks around the world, such that no one bank knew its risk exposure prior to the global financial crisis of nearly a decade ago. Second, the inability to trace infected meat through food-production processes, since it is all, literally, minced together.

The Google ad controversy, among other things, is a product of globalised markets. This is best shown in the graphic within the article (below). Even Google’s algorithms struggle with this. But, tellingly, Google is the market, called upon to fix itself.

 

Google ad controversy: what the row is all about

What is programmatic advertising, how does it work and why did big brands appear next to inappropriate material? We explain

Google and Facebook control almost 60% of the £11bn UK digital ad market. Photograph: Josh Edelson/AFP/Getty Images

Friday 17 March 2017 15.24 GMT First published on Friday 17 March 2017 13.30 GMT

Why is advertising by big brands appearing alongside inappropriate content such as extremist videos?

As odd as it may sound, in the digital age many brands do not know exactly where their online advertising is running. The computerisation of digital advertising, where machines are largely responsible for choosing where ads run, has taken over much of the job of deciding where they should appear on the internet. This process is called programmatic advertising.

So what is programmatic advertising?

Think eBay, but quicker and more advanced. Until relatively recently, for an ad campaign to appear – on TV, radio or in print, for example – it would be booked through sales teams and ad agencies picking up a telephone and striking a deal for where it would go, when it would run and how much it would cost. The rise of digital media means this is being rapidly replaced by computerised, or programmatic, advertising systems, where the parties transact digitally in a similar way to buyers and sellers on auction site eBay.

How does it work?

Media owners, such as YouTube and many thousands of other publishers, make their advertising slots available within the programmatic system for advertisers to bid on. This process is handled through digital trading desks used by media agencies, which plan, book and execute campaigns on behalf of their clients. These connect with exchanges such as AdX, which is owned by Google, to then run ads around media such as videos on YouTube. Google also delivers ads to many other third-party sites.

What is going wrong?

The key here is a fundamental shift in how digital ad systems have transformed the targeting of ads to reach audiences with their messages, whether it be buying a car, a holiday deal or a charity appeal. Previously, advertisers would know the environment where they were running their ads online – for example, targeting readers of the Guardian website because they fit a particular demographic. With programmatic buying, there is a wealth of data on audiences but not the specific website or content that audience might be visiting. So a jihadi video might provide what looks like a valuable audience based on, for example, age data, but in reality is nothing of the sort.
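
(A brief aside from me, not the article: to make the targeting problem concrete, here is a minimal Python sketch of a bidder that values an impression purely on audience data, never on the content it will sit beside. Every name and number in it is hypothetical, and no real exchange works this simply.)

```python
# Hypothetical sketch only: a bidder that prices an ad slot from audience data alone.
from dataclasses import dataclass

@dataclass
class Impression:
    audience: dict                    # e.g. {"age_band": "18-24", "interests": [...]}
    content_label: str = "unknown"    # the buyer typically never sees this field

def decide_bid(imp: Impression, campaign_interests: set, base_bid: float) -> float:
    """Bid purely on audience attributes; the content label is never consulted."""
    overlap = campaign_interests & set(imp.audience.get("interests", []))
    return base_bid * (1 + len(overlap))

# The same audience profile earns the same bid whether the slot sits beside a
# cookery vlog or an extremist video, which is exactly the problem described above.
imp = Impression(audience={"age_band": "18-24", "interests": ["cars", "travel"]},
                 content_label="extremist video")
print(decide_bid(imp, {"cars"}, base_bid=1.0))   # 2.0, blind to the content
```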

How programmatic advertising works

 
Why is Google the villain here?

With great power comes great responsibility. Or, as Google’s critics say is proven by issues such as this, a total lack of the latter. Google – and Facebook, of which more later – have a near duopoly of control of the entire digital advertising market. The two Silicon Valley giants control almost 60% of the £11bn UK digital ad market, according to eMarketer. And almost 90p of every new £1 of digital ad spend is going to these two players. Programmatic advertising has gone from zero to accounting for almost 80% of the £3.3bn spent on the display advertising part of the market. As well as raking in the cash, Google is responsible for much of the infrastructure that delivers digital advertising. “Google provides most of the plumbing that enables programmatic,” said Scott Moorhead, founder of media consultancy Aperto One. “Google is not controlling the inventory coming in well enough in advance of making it available. And buyers of ads for clients, the media agencies, are not vetting it themselves.”

What was that about more on Facebook?

The programmatic advertising furore is the latest in a string of issues that have put the spotlight on the digital advertising market, which has hitherto been viewed as providing brands with the most accurate and measurable means of reaching consumers. Last year, a damning study found that potentially vast amounts of ads that brands were being charged for were viewed by “bots”, computer programs that mimic the behaviour of internet users. This was followed by Facebook admitting to a string of measurement errors, such as how many people are watching videos. Sir Martin Sorrell, chief of the world’s largest marketing group, WPP, said the issue was akin to Facebook “marking its own homework”. Keith Weed, marketing chief at Unilever, which owns brands including Dove and Lynx, said the lack of transparency around the efficacy of digital ads was akin to having “billboards underwater”. More recently, Facebook and Google have been taken to task for not cracking down on fake news, which came to prominence during the US election.

What are YouTube’s policies on advertising and controversial material?

Google knows that advertisers don’t like their brands appearing next to a whole host of controversial topics and tries to head off problems like this before they occur with its “advertiser-friendly content guidelines”.

“Content that may be acceptable for YouTube under YouTube policies may not be appropriate for Google advertising,” the site warns film-makers, before reeling off a long list of content which it would consider inappropriate, including (but not limited to): sexually suggestive content; violence; inappropriate language; promotion of drugs; and “controversial or sensitive subjects and events”, including “war, political conflicts, natural disasters and tragedies”.

How does YouTube enforce those policies?

The video platform says it uses “technology and policy enforcement processes” to determine whether a video is suitable for advertising. A substantial portion of the work is done automatically, by scanning the video title, metadata and imagery to try to get a sense of how appropriate the video is.

As well as the automatic tools, Google also relies on a crowdsourced approach, asking its users and advertisers to flag up content they consider inappropriate. That then undergoes manual review, which can result in the advertising being pulled from the video, or the video being removed. But controversial videos with narrow audiences – such as a piece of Britain First propaganda with fewer than 20,000 views – often will never reach users who consider the content controversial, limiting the usefulness of such an approach.

What Google doesn’t do is manually check every video for controversial content. To do so would be a mammoth task: 300 hours of video are uploaded to the site every minute, which would require more than 50,000 full-time staff doing nothing but watching videos for eight hours a day.
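
(Again an aside from me rather than the article: the scale claim is easy to sanity-check with a back-of-envelope calculation, using only the two figures quoted above.)

```python
# Back-of-envelope check of the article's figures.
hours_uploaded_per_minute = 300
hours_uploaded_per_day = hours_uploaded_per_minute * 60 * 24      # 432,000 hours/day

viewing_hours_per_person_per_day = 8
reviewers_needed = hours_uploaded_per_day / viewing_hours_per_person_per_day
print(reviewers_needed)   # 54,000.0 – comfortably "more than 50,000 full-time staff"
```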

What happens to film-makers who break the rules?

The current system was introduced in September 2016 and rapidly attracted controversy from YouTubers, many of whom rely on the site as their sole source of income. The automated system YouTube applies errs on the side of caution, with film-makers often forced to appeal against false positives, and it has also served to cut funding from film-makers working on subjects such as LGBT history and even skincare for acne sufferers.

Hank Green, one of the site’s biggest stars, lost advertising on two videos at once: Zaatari: Thoughts from a Refugee Camp, and Vegetables that look like Penises.

Tags: mscedc
March 17, 2017 at 10:01PM

Lungworm and bike helmets: why does Google show certain ads?

This piece highlighted another of my experiences with algorithms: the haunting of my advert spaces by the hesitations and search results within my Amazon account. I’m pleased to know I’m not alone. And, reading to the bottom of the piece, I’m glad it’s not worse…

 

Lungworm and bike helmets: why does Google show certain ads?

Anyone looking at the adverts companies think I may be interested in will conclude I lead a pretty dull life

After looking at bike websites, I was shown offers for a helmet and a lock. Photograph: Alamy
@eminesaner

Friday 17 March 2017 18.35 GMT Last modified on Friday 17 March 2017 19.16 GMT

Show me a person’s targeted adverts, goes no proverb (yet), and I’ll show you what they put in their online shopping basket but decided against buying at the last minute. Most internet users will be very familiar with the feeling that your computer is spying on you, with adverts trying to get your attention and reminding you what you’re missing out on.

“The goal is to personalise those ads,” says Sean Donnelly, an analyst at Econsultancy, the digital research company. “The rationale is that the exposure to that ad will remind you that you looked at [something on that site], and at some point you will click on that ad and purchase it. The reason they do that is it works.”
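
(An aside from me: the retargeting logic Donnelly describes can be caricatured in a few lines of Python. This is a toy sketch – real systems work through cookies, tracking pixels and ad exchanges – and all the names in it are made up.)

```python
# Toy retargeting sketch: remember what was viewed but not bought, and show it back.
from collections import deque

browsing_history = deque(maxlen=50)   # products viewed recently but not purchased

def record_view(product: str) -> None:
    browsing_history.append(product)

def record_purchase(product: str) -> None:
    if product in browsing_history:
        browsing_history.remove(product)   # bought items stop being retargeted

def pick_ad(default_ad: str = "generic brand ad") -> str:
    """Show the most recently abandoned product; otherwise fall back to a generic ad."""
    return browsing_history[-1] if browsing_history else default_ad

record_view("bike helmet")
record_view("bike lock")
record_purchase("bike lock")
print(pick_ad())   # "bike helmet" – the thing you looked at but didn't buy
```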

Obviously companies don’t want those ads to appear on certain websites (extremist or porn sites, for instance) but it can be hard to control. “There are lots of different vendors out there, and if you think of the supply chain – sites that allow ads, the companies that sell space on those websites, the media companies that buy spaces – it gets quite fragmented and it can be difficult to see where your ad is going to appear.”

On a site such as YouTube, where millions of videos are uploaded, including extremist ones, brands can find themselves advertising on videos they would be horrified to be associated with – and unwittingly funding the uploader. The Times found that companies and organisations including this newspaper, Transport for London and Sainsbury’s had all had adverts on hate preacher Steven Anderson’s videos.

This afternoon, I spent 10 minutes looking at a couple of bike websites. A few minutes later, getting back to work and reading a news website, there was a big advert for a cycling chain, complete with a picture of the bike I’d just looked at, as well as offers on a helmet and lock. It’s the same on a US news site, though it doesn’t work everywhere: a TV site seems to think I should buy a car instead, while YouTube shows me an advert raising awareness about the danger of lungworm in dogs even though I’m pretty sure I’ve never stayed up late into the night surfing canine parasite pages. The other adverts, on Facebook and Instagram, are a trace of my browsing history over the last few weeks. Anyone looking at the adverts companies think I may be interested in will conclude I lead a pretty dull life.

Others should be more careful – in 2013, Conservative MP Gavin Barwell triumphantly criticised Labour on Twitter after clicking on a link in a press release, and seeing an ad for a dating site, reading “date Arab girls”. He’d learned nothing from John Prescott who, the year before, had criticised MP Grant Shapps for running ads for “Thai brides” on his website. Neither man realised the only thing they were exposing was their own browsing histories – or at least what Google had concluded would interest them.

Tags: mscedc
March 17, 2017 at 09:53PM

@james858499 Also need to ask how being educators reframes our perception of learning analytics. It’s a 2-way street, albeit tilted. #mscedc

Keeping this discursive channel as a two-way process seems vital for critical thinking. Without it, the risk is that we become only reactive. Given the inherently interdisciplinary nature of many of this block’s issues, two-way thinking is going to generate better connections, hypotheses, and analysis.

 

from http://twitter.com/Digeded
via IFTTT