Knox (2015) writes:
“It is notable that algorithms, assumed to provide objectivity and exactitude, are frequently used in areas of high risk and security, and this is precisely where the most prominent example can be found in education: the use of the Turnitin plagiarism detection service at the point of assessment.”
This is at odds with my own experience of using Turnitin. It is not a “plagiarism detection service”. At best it can suggest where plagiarism may have occurred, via its similarity index, but the ultimate call as to whether or not plagiarism has occurred is (still) made by humans. Turnitin’s similarity score is used as part of the evidence gathered in suspected academic misconduct cases. I have never heard of a student being penalised automatically. Perhaps it happens elsewhere.
Moreover, there should be a push to flip the focus of Turnitin’s reporting, so that it helps students improve their scholarship rather than simply flagging them for suspicion.
ref: Knox, J. (2015). Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (Ed.), Encyclopedia of Educational Philosophy and Theory. doi:10.1007/978-981-287-532-7_124-1
@james858499 a focus on detection needs to be switched around to focus on good scholarship if we are work in tandem with the tools provided
— C (@c4miller) March 13, 2017