Time and again I get exasperated with journalists who are only out for a quick sensation instead of actually researching a question. Last Friday a journalist wrote me an email at 3 in the morning asking me to help him answer a question from a student. They are starting a new "service", answering real questions. The question was: How can I find out if I have plagiarized in my term paper? He wanted a very short answer, 4-5 lines, and it could be as "perky" (flott) as possible.
I was traveling, giving a workshop on plagiarism detection for professors and leaving for a week of vacation in the morning. Since this is not a question that can be answered in 4-5 lines, I just didn't reply.
So what was the result? I was contacted today by a friend with a question about a software system that I had "recommended" in some article. Checking my alert service, I found the article and was aghast -- the software in question was one that came in last in my 2010 test of so-called plagiarism detection software. The entire article turns the results of my tests upside down.
I suppose it is because I am a computer scientist and I test software that journalists expect that I "recommend" the top-rated software, tell people what to buy, and show them where the magic button is for quickly and painlessly finding plagiarism. In fact, there is no recommended software; the best category in my test is only "partially useful". It says at the top of the results page:
Our recommendation: Only use these systems when suspicions of plagiarism arise that cannot be confirmed with 3-5 words in a search engine. The focus should be on teaching students about plagiarism and how to avoid it instead of investing time in using software. Most of the work later on is in preparing a plagiarism case and dealing with the plagiarist, and for this good documentation is needed. Very few systems provide good documentation.
The journalist cobbled together three statements, the first and the last attributed to me; the second does not directly include my name, but people have read it as coming from me as well. He writes (numbering by me):
(1) The Berlin media informatics specialist Debora Weber-Wulff recently ("vor Kurzem") tested various software systems. The winner ("Sieger") was the paid program turnitin, which is now used at over 80 German universities and research institutions. You may be able to use it for free through your university.
(2) A usable ("brauchbare") free alternative is Viper.
(3) You can also have short texts checked for free with PlagAware, which likewise performed solidly ("solide") in Weber-Wulff's analysis.
ad (1): "vor Kurzem" means recently. Um, my last test was in 2010, published in 2011. We are currently retesting the software and hope to publish the results in September; these tests take a lot of time to conduct. "Sieger" means the winner. There were no winners. Five systems were given the grade of C-, as they did not even manage to reach 70% of the points in the test. None of these can be considered winners, but they can be used as tools if one already has a suspicion of plagiarism. They have far too many false negatives and false positives for general use, and all five are on about the same level. And the idea of letting students use them (for free, even!) to run their papers through until they come out clean sends shudders down my spine. Students will resort to swapping word order and using ill-fitting synonyms until the numbers (which differ for the same text from system to system, and in some cases even from test to test) reach an acceptable level. Someone with experience in reading and interpreting the reports must go over them together with the students if this is to be a meaningful learning experience.
ad (2): Oh dear. I suppose the journalist was looking for a free system as opposed to the expensive ones. I have no idea what terms he googled to find this system, but he didn't read this on my results page. Technically, he doesn't say that I find the system useful ("brauchbar"); this is his own judgement. But sandwiched between two references to my work, it is misread as my recommending it. Nothing could be further from the truth. The street address of the company offering the plagiarism detection service is the same as that of a paper mill / essay-writing service, and their respective telephone numbers differ only in the last digit. And actually reading the "Terms and conditions" should scare the daylights out of anyone: if you submit a paper to Viper, you give the company "All Answers Limited" the irrevocable right to keep a copy of the paper forever and to use it for marketing purposes, either for Viper or for any "associated website". The email account I used for the test (we use a different email address for every test to see what kind of spam shows up later) now gets fun letters with subject lines such as "Enjoy the sun and get your dissertation done on time". Oh, and Viper has the distinction of being the worst system for detecting plagiarism in the test, coming in last with just 24% of the effectiveness points on all of our test cases in 2010.
ad (3): A third system is mentioned as having performed "solide" (solidly, respectably) in my analysis. Again, "partially useful" is not even close to respectable.
I wonder why the journalist is in effect advertising the services of three companies, one of which is extremely questionable, instead of really trying to answer the question. At least the comments on the article are good: most of them note that one does not actually plagiarize by mistake. If you take careful notes and make clear the difference between your own words and ideas and those taken from someone else, you are fine.
Now, how do I get journalists who breeze through writing something like this, without actually reading and understanding what has been written on the topic, to start doing real research themselves instead of just dashing off an email to me? I suppose I need to answer all journalists, as soon as possible, with short and snappy sayings. I do wish, however, that they would at least read the summaries before they ask.