Tuesday, December 4, 2007


I was at Online Educa this past week and spent 2 hours at my school's stand to talk with people about plagiarism detection. The only one to come by was a man from the Polish company StrikePlagiarism, who wanted to assure me that after my test they had immediately sat down, the programmers had tightened some parameters, and now they could handle all of my plagiarisms.

I don't really believe that, but this is what all the companies say: they are new and improved.

My point still is that these are not plagiarism detection systems, but similarity detection systems. The determination of whether or not a text is a plagiarism must be the sole responsibility of the teacher and/or the school administration (for determining the consequences). You cannot delegate responsibility for something this delicate to a machine.

He asked to be included in "next year's test". Ah, how I wish the funding fairy would stop by with a little pot of gold so I could actually do this - make 10 more plagiarisms and redo the tests. I don't need enough money for a major grant, just more than my yearly allotment. One must do expensive research, it seems, but that is another question.

1 comment:

  1. Yes--we have been testing TurnItIn and SafeAssign, and every time we find an obvious source that the service fails to detect (like eCheat.com), or some other problem, we are told "We'll fix it right away!"

    How long have these products been on the market, taking money from schools and purporting to solve their plagiarism problems? Shouldn't the obvious flaws have been worked out of the systems before they ever went on the market? In what other industry would paying customers be expected to provide product testing and improvement advice?


Please note that I moderate comments. Any comments that I consider unscientific will not be published.