Saturday, December 20, 2014

Christmas Links

I seem to be getting more and more links I can't adequately deal with, but which I don't want to withhold from readers. So here is some Christmas reading:
  • The "Neuroskeptic" blog at Discover Magazine has a piece about The Strange Case of “Publication Integrity and Ethics” which details a number of integrity and ethics questions around the supposed new journal.
  • The Times Higher Education has a piece on post-publication peer review that describes more of the chilling consequences that occur when lawyers meddle with scientific inquiry. Physics professor Philip Moriarty is quoted as saying: “If you are publicly funded and you put your research into the public domain but no one can criticise you for it without facing legal proceedings, that seems to me to be a very badly damaged system.” Exactly.
  • Retraction Watch obtained a $400,000 grant to set up a retractions database! This is great news. I hope that the database can be used to calculate a Retraction Index, that is, how many retractions per published article a journal has, and perhaps how long each retraction took after the journal was first informed of a problem.
  • Bernd Kramer recently published a book in German about obtaining a doctorate in Germany without doing the work ("Der schnellste Weg zum Doktortitel. Warum selbst recherchieren, warum selbst schreiben, wenn's auch anders geht?" – roughly, "The fastest way to a doctorate. Why do your own research, why write it yourself, when there are other ways?"). The cover is a horrible stock photo, but the book makes quite interesting reading. Kramer gave an interview about it on Deutschlandradio in November 2014.
  • Reports of fake peer reviews are increasing. Vox has an article about 110 papers retracted in the past two years on account of faked peer reviews. Retraction Watch reported on SAGE retracting 60 papers from just one journal for this reason. The Minister of Education in Taiwan, Wei-ling Chiang, had been added to some of these papers as a co-author (he says without his knowledge). He stepped down because of the scandal in July 2014, according to IEEE Spectrum.
  • The Taipei Times reported in August 2013 that Andrew Yang, the former Taiwanese Minister of National Defense, was forced to resign in a plagiarism scandal a few days after taking office. He had published a book in 2007 that friends had ghostwritten for him. They had, however, plagiarized large parts of the book.
  • The University of Nevada, Las Vegas fired an English professor for "serial plagiarism." The student newspaper, The Rebel Yell, also reports on the case.
  • At the end of November 2014, the Vice Chancellor of Delhi University in India was jailed, and then released, on account of plagiarism.
  • There is a nasty case of plagiarism, reported in early 2014, at Chicago State University. The dissertation of the university's Senior Vice President and Provost was being investigated, and the university confirmed to the press that it was doing so. She sued the university for violating privacy laws, stating that she did not plagiarize [1]. Plagiarism in her dissertation has been documented in a blog ([2] - [3] - [4] - [5] - [6]). Despite the documentation, the University of Illinois at Chicago has ruled that her dissertation is not a plagiarism ([7]). The Chicago Tribune had three plagiarism experts (Tricia Bertram Gallant, Teddi Fishman, and Daniel Wueste) look at the thesis ([8]). All three found the thesis problematic. The question is: are students to be held to a different standard than the person who is enforcing that academic standard? A thorny question.


  1. Not sure a "retraction index" is such a good idea, as I am also unsure whether a journal should be applauded or criticized for many retractions. Both, in a way: criticized for having accepted the papers in the first place, and applauded for having retracted them afterwards.

    More meaningful in my eyes would be the responsiveness to allegations of scientific misconduct and the degree of transparency of the process followed. Together with the quality of the retraction notice itself, those dimensions really would show the quality of a publication.

  2. Indeed, a responsiveness index would be wonderful, but for that one would need data that is generally not available. What is the MTTR (Mean Time To Retraction) for retractable items? What is the ratio of retraction requests to articles? How long after publication is the duplicate publication, plagiarism, or data misuse detected? There are lots of interesting aspects. Perhaps it would be useful to set up a database where one could (anonymously?) register that one has informed a particular journal editor of a concern with a publication. The journal editors could also note what they are doing, so that the process could indeed be made more transparent. Then an author wanting to publish could look up how many retractions a journal has had to issue in the past, and for what reasons.
    There are no quick solutions, though, I'm afraid.
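To make the two metrics discussed in the comments concrete, here is a minimal sketch of how a "retraction index" and MTTR could be computed, assuming one had per-article publication and retraction dates. All records and figures below are invented purely for illustration; no real journal data is used.

```python
from datetime import date

# Hypothetical records: (publication date, retraction date) for a
# journal's retracted articles. These dates are invented examples.
retractions = [
    (date(2011, 3, 1), date(2013, 6, 15)),
    (date(2012, 1, 10), date(2014, 2, 20)),
]
articles_published = 500  # assumed total output of the journal

# "Retraction index" as described above: retractions per published article.
retraction_index = len(retractions) / articles_published

# MTTR (Mean Time To Retraction): average days from publication to retraction.
mttr_days = sum((retracted - published).days
                for published, retracted in retractions) / len(retractions)

print(f"Retraction index: {retraction_index:.4f}")
print(f"MTTR: {mttr_days:.1f} days")
```

The arithmetic is trivial; as the comment notes, the hard part is that the underlying data (when an editor was first informed, how the case was handled) is generally not published at all.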