
Friday, June 7, 2019

WCRI 2019 - Day 3

https://wcri2019.org

Day 0 - Day 1a - Day 1b - Day 2 - Day 3


My brain is already exploding, and there is one more day ahead. I decided to skip the first plenary about fostering research integrity in Malaysia, Korea and China.

Session: Publishing 1

Ana Jeroncic, University of Split School of Medicine, Split
"History of scientific publishing requirements: a systematic review and meta-analyses of studies analysing instructions to authors"

It is interesting to see all of the things that can be investigated. This one was looking at Instructions to Authors (ItAs) that describe manuscript submission procedures and journal policies. In particular, they conducted a systematic review of papers about ItAs. They found 153, the number increasing as digital publishing takes over. The topics slide was only up for a few seconds, but ItAs address issues beyond manuscript formatting such as publication ethics, clinical trial registration, authorship, conflicts of interest....
I asked about plagiarism of ItAs, that is, non-affiliated journals just copying ItAs from other journals, but they didn't look at that.

Michael Khor, Professor at Nanyang Technological University, Singapore, managed to fit something like 40 slides on "Global trends in research integrity and research ethics analysed through bibliometrics analysis of publications" into his allotted 10 minutes. It was quite entertaining, but one could barely take notes, as looking down momentarily meant missing a slide or two. It seems he looked at over 25 000 publications on research integrity and research ethics, using a graph representation tool to visualize relationships. He was showing topic maps, selecting by country to show how the topics are quite different from country to country and how the topics have changed over time. I would love to see this in print, as I need time to look over the graphs and take in what exactly has changed (and what disappears).

It was noted in the discussion that Scottish authors self-identify as Scottish and not as UK :)

The talk I was waiting for was Harold "Skip" Garner, VCOM (Via College of Osteopathic Medicine), Blacksburg, speaking about "Identifying and quantifying the level of questionable abstract publications at scientific meetings." Skip is the driving force behind ETblast and Déjà vu, a technique that uncovered many duplicate publications and plagiarisms in biomedical publications. He currently runs HelioBLAST, a text similarity engine that finds text records in Medline/PubMed that are similar to the submitted query. You plug in up to 1,000 words and look at what bubbles up.

He collected conference abstracts found on the open web and has set up an Ethics DB that lets one browse through or do some text mining on the data. There are a lot of false positives, such as people submitting five versions of their manuscript and the conference having all of them available web-facing. But there were questionable things that turned up, such as the same abstract at different conferences with different author orders. Interestingly, he was able to find some instances of salami slicing using this method. He then compared the abstracts of 2018 to Medline. Here he turned up things such as previously published material being submitted to a conference two years later. He has classified these as "old findings." It seems that since there is such a time lag between abstract submission and acceptance or rejection, people submit their work to multiple conferences.

As a side-effect of his similarity investigations he can take the accepted papers for a conference and let the computer organize them into tracks of similar papers.
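
I don't have the code for HelioBLAST or for this track-building trick, but the underlying idea is easy to sketch: turn each abstract into a term vector, flag pairs whose similarity is suspiciously high, and cluster the rest into groups of related papers. A purely illustrative Python sketch (the abstracts, the 0.8 threshold and the number of tracks are all invented; this is not Garner's actual system):

    # Illustrative sketch only -- not HelioBLAST and not Garner's pipeline.
    # Vectorize abstracts, flag near-duplicate pairs, then group the rest
    # into "tracks" of similar papers.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity
    from sklearn.cluster import KMeans

    abstracts = [
        "We study duplicate publication in biomedical conference abstracts.",
        "We study duplicate publication in biomedical conference abstracts and meetings.",
        "A convolutional network for medical image segmentation is proposed.",
        "We propose a convolutional network for brain image segmentation.",
    ]

    tfidf = TfidfVectorizer(stop_words="english")
    X = tfidf.fit_transform(abstracts)
    sim = cosine_similarity(X)

    # Flag suspiciously similar pairs (the 0.8 cutoff is arbitrary).
    for i in range(len(abstracts)):
        for j in range(i + 1, len(abstracts)):
            if sim[i, j] > 0.8:
                print(f"Possible duplicate: abstract {i} vs {j} ({sim[i, j]:.2f})")

    # Group the abstracts into two "tracks" of related papers.
    tracks = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print("Track assignment per abstract:", tracks)

The same similarity scores serve both purposes: very high values point at possible duplicate submissions, moderate values are enough to sort accepted papers into coherent sessions.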


Catriona Fennell, Elsevier, Amsterdam

"Citation manipulation: endemic or exceptional?"

Estimated prevalence of citation manipulation by reviewers based on the citation patterns of 69,000 reviewers

She started off with a Dutch saying, "never let a good crisis go to waste". There was a scandal involving citation stacking in soil science that had affected Elsevier. They investigated the entire area of citation coercion by reviewers, citation pushing done by editors, and citation stacking done in journals.

She noted what a journal can do to fight this:
  • Make it clear that citation coercion is unacceptable
  • Educate editors
  • Remove reviewer privileges
  • Inform institutes and funding bodies
  • Create editorial systems to detect self-citations in reviews or revision letters?
  • Retract citations?
  • Black-list worst offenders?
  • Share information with other journals?
The last four are not really possible; in particular, citations cannot be retracted. There are COPE guidelines for reviewers, and Elsevier ethical guidelines. There is also an article by Christopher Tancock about the practice: "The ugly side of peer review".
Elsevier looked through 54 000 reviews stored in their systems and identified 49 persons to look more closely at.

In particular, there was "Dr. X" with an h-index of 90 and 20 000 citations in Scopus. They contacted him/her, but the person was entirely unrepentant, the institute was unresponsive, and there was no funding body for the research. The person is active as an author and even more so as a reviewer, but is now no longer a reviewer for Elsevier.
She also spoke about generic reviews that are so unspecific that they fit every paper. She called them "horoscope reviews". They saw some reviewers apparently copying & pasting these reviews into their responses.

The last speaker in the session (and rightly so the winner of one of the best speaker awards) was Alexander Panchin, Institute for Information Transmission Problems of the Russian Academy of Sciences, Moscow on "Concealed homeopathy: a natural test of peer-review quality".

A Russian pharmaceutical inventor (and holder of a patent on a homeopathic "remedy") has "discovered" that it cures pretty much all ailments. Alexander had pictures of it being sold in stores in Russia and heavily advertised. It is made up of "diluted" antibodies, supposedly 1:10^16. There are variations that "combine" dilutions of 1:10^24 and 1:10^30. There is essentially nothing in the pills except sugar, which is why it is a tad off to take these pills to "cure" diabetes.

In the patent application it is called a homeopathic drug, but it is now called "ultra-low dosage" or "release-active" drugs.

Alexander tracked down many papers published by this gentleman; he was even an editor of a special issue on SpringerLink that included 46 of his own papers! The papers do not disclose his conflict of interest and often have very flawed study designs, showing that peer review did not kick in.

Alexander wrote to the journals and has managed to get three retractions and two promises to retract, but the authors of a review article that includes many references to this stuff refuse to issue a correction until ALL of the flawed papers are retracted....

Even though the Minister of Science has named this manufacturer as the most damaging pseudoscience project, scientists and newspapers that have reported on this have been sued, so I am keeping the name off the blog.

After lunch we had the Plenary session on
Predatory publishing and other challenges of new models to share knowledge
I was really looking forward to this session and it didn't disappoint!

Deborah C. Poff, the new COPE chair and a philosopher from Ottawa titled her talk "Complexities and approaches to predatory publishing"

She spoke at lightning speed, getting faster as time began running out. It could have been at least a two-hour lecture, so jam-packed it was with really good stuff. I could barely keep up, so I hope I get the highlights right.

A definition for predatory publishing is problematic, as there is much overlap with legitimate but new or smallish publishers. She looked at necessary and sufficient conditions for a definition, but found that while deceit is necessary, sufficient conditions are vexing to try and capture.

Predatory publishers (PP) cheat and deceive some authors, charging publishing-related fees without providing services; PP deceive academics into serving on editorial boards; PP appoint editorial board members without their knowledge; there is no peer review; they refuse to retract or withdraw problematic papers; etc.

The list goes on: misleading reporting, language issues, lack of ethical oversight, lack of declarations of conflicts of interest, lack of corrections or retractions, lack of a qualified EiC (if any), made-up rejection rates, false impact factors, false claims of being indexed in legitimate indexes, and falsely claiming membership in publication ethics organizations, including forging and falsifying the logos of such organizations. COPE apparently had to fight a forged COPE logo.

What should we call them, anyway? Arguments against the term "predatory": it is not descriptive or instructive, so some suggest using fake, rogue, questionable, parasitic, deceptive, etc.; predatory suggests victims, powerless people who are acted upon without their full knowledge, while a number of studies have shown that some scholars knowingly publish in such journals; and calling the issue "predatory" obviates or mitigates the personal responsibility for choosing where to publish.

The best argument for using the term: Since Jeffrey Beall coined the term, why not use it?
COPE is undecided on what name is best.

I particularly liked Deborah's stakeholder analysis of who or what is harmed by these publishers:
  • The innocent author who is duped into paying for services without receiving them. They may lose status when peers discover that they have published in such a journal, and it can even lead to investigations. Since many such publishers refuse to retract, the damage done may be long-term. 
  • Legitimate Open Access Journals are easily confused with predatory Open Access Journals
  • Legitimate journals which are not top ranked or may not follow best practice are also easily confused with them.
  • Research and funding sources: This depends on whether the research published is legitimate or not. If the research is shoddy and gets published by a PP journal, it may be cited and thus pollutes the scholarly record. If a scandal arises, the scandal may tarnish publicly funded research.
  • Universities and their role in knowledge creation.
  • Citizens who pay taxes.
She pointed out that predatory publishers make a great business ethics case. In closing, she sees only two things that can be done:
  1. Caveat Emptor (let the buyer beware) - use Think / Check / Submit: do you read the journal yourself? Do you cite research published there? Do your colleagues? Who is the editor-in-chief?
  2. Addressing and pursuing predatory publishers as businesses committing criminal acts. The USA Federal Trade Commission won a court case against the owner of OMICS and the company itself. The courts fined OMICS $50.1 million.


Bhushan Patwardhan, Professor of medicine, Vice chairman, University Grants Commission, New Delhi, spoke on "Research integrity and publication ethics: Indian scenario". Bhushan first spoke about the University Grants Commission and gave an overview of the India Higher Education sector.

There are more than 900 universities and more than 10,000 other institutions, with 1.2 million teachers somehow coping with 36.6 million students. There are just shy of 150,000 publications produced in India per year, and unfortunately, many of these appear in problematic journals.

There is a paper about the situation in India: the authors selected 2,000 Indian authors of papers in journals on Beall's list and sent them a survey. 480 responded, and almost 60 % were unaware that they had been publishing in a predatory journal:
G. S. Seethapathy, J. U. Santhosh Kumar & A. S. Hareesha (2016, December 10). India's scientific publication in predatory journals, need for regulating quality of Indian science and education. Curr Sci, 111(11), pp. 1759-64

Bhushan was shocked to find out just how many Indian publications were in predatory journals. India has just set up the Consortium for Academic and Research Ethics (CARE) in 2019. The goals of the CARE project are to
  • create and maintain a CARE list of reputable journals
  • promote research publications in reputable journals
  • develop an approach and a methodology for identification of quality journals
  • discourage publications in dubious journals
  • avoid long-term damage due to academic misconduct
  • promote academic and research integrity and publication ethics
He put up the URL of the site for CARE: http://ugccare.unipune.ac.in/index.html, but the site was down for "maintenance," as it had not even been up for a day before the site was cloned and published on a similar URL by unknown persons.

Then Matt Hodgkinson, Head of Research Integrity @ Hindawi Ltd., London, took the stage to give "A view of predatory publishing from an open access publisher". He first gave a bit of a historical overview and told us a bit about Hindawi. It was founded in Cairo in 1997, publishing the first subscription journals in 1999. In 2007 all journals were flipped to Open Access. In 2016 they created their Research Integrity team that handles all issues that arise at their journals. The headquarters of Hindawi moved to London in 2017.

He spoke of the impact that predatory journals have on legitimate Open Access journals: they are tarred with the same brush. They also create false impressions for authors, who now expect undue speed from legitimate publishers and, out of impatience, submit to two journals at once (Matt called it "gazumping") to see which one publishes first. They have had so many instances of this, Matt told me over coffee, that they check for text similarity online twice: once at submission, and once more just before publication. Many times they have caught double dippers this way.

He expanded the concept of predatory publishers to what he called the "Cargo cult" publishers (ones who publish unedited theses or the Wikipedia as "books"), paper mills, the selling of authorship and faked peer-review. He also noted that the subscription model is not immune to fakery - there are subscription journals that closely mirror the titles of legitimate publishers, something called hijacking.

He closed with some scandals (publications about elephant autism or space octopi) and then listed some of the newest ideas, the various pre-print servers. The question arises, however, how sustainable such initiatives are.

Although I was planning on visiting another session, Jenny Byrne insisted that the session on checking data and images would be very interesting, and she was right. I had thought that Elisabeth Bik was the only person around perusing doctored images, but it turns out there are quite a number of initiatives.

First up was Jana Christopher from FEBS Press, Heidelberg, speaking about "Image Integrity in Scientific Publications."

She observed that the prevalence of image aberrations in publications is generally underestimated. Although there are ways to catch simple-minded manipulators, much like with plagiarism, people are getting more sophisticated in hiding their tracks. Her focus is on Western blots, micrographs, or photos, anything that can be overlaid in Photoshop. If they match identically, there's a problem. She showed in a quick demo how she loads suspected duplicates into different color channels and overlays them. The result is black for identical portions of the image.
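
The trick is easy to approximate with standard tools. Here is a minimal sketch of the general idea (my own illustration, not her actual Photoshop workflow; the filenames are placeholders):

    # Minimal sketch of the overlay idea described above -- not Jana
    # Christopher's actual workflow. Filenames are placeholders.
    import numpy as np
    from PIL import Image, ImageChops

    panel_a = Image.open("figure3_panel_a.png").convert("L")
    panel_b = Image.open("figure5_panel_c.png").convert("L").resize(panel_a.size)

    # Per-pixel absolute difference: black (0) wherever the panels are identical.
    diff = ImageChops.difference(panel_a, panel_b)
    diff.save("difference.png")

    # Colour-channel composite: panel A in the red channel, panel B in the green
    # channel. Shared content shows up in yellow/grey tones, content present in
    # only one panel appears as pure red or pure green.
    empty = Image.new("L", panel_a.size, 0)
    Image.merge("RGB", (panel_a, panel_b, empty)).save("overlay.png")

    # A rough number to report: what fraction of pixels differs noticeably?
    mismatch = np.asarray(diff) > 10
    print("Fraction of differing pixels:", mismatch.mean())

Two genuinely independent experiments photographed separately will never line up pixel for pixel, which is exactly why a perfectly black difference image is suspicious.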

She differentiated between manipulated images and wrong images being used to illustrate a finding. Why do people do this? Some apparently want a cleaner, more striking image. Others want to show a particular feature more clearly. Then there are those who wish to show a result that was not actually produced.

She showed some more examples of pictures that have crossed her desk: cut-outs betrayed by their transparent backgrounds, the clone tool used to overwrite undesirable portions of an image, or images that are supposed to show different plants but, because of the pattern of the soil, are clearly the same plant.

Renee Hoch, the Senior Manager and Team Manager of the Publication Ethics Team at PLOS One, San Francisco, sang the same song, second verse, with her talk on the "Impact of data availability on resolution of post-publication image concern cases."

She noted that image concerns make up 39 % of the concerns raised in her department, but 75 % of the retractions. She took 100 cases of post-publication image concerns from 2017-2019 and had a statistical look at them. The numbers flew by so fast, I was unable to keep up. 94 of the cases involved image duplication, the other 6 manipulation or fabrication. All fabrications have been retracted; for manipulations and duplications, about half have an Expression of Concern or a Correction.

Their big issue is that when a concern is raised, they request the original data, and none is forthcoming. The excuses are similar: can't find the files, hard-disk crash, person left the lab. Concerns are coming in up to 5 years after publication, but some countries only have a three-year retention policy. So that is clearly not sufficient. At times they wonder if the data ever existed at all, although there is a lot of honest error or poor practice.

What can a journal do? They can require submission of the raw image data, and have the peer-review done with the raw image data, as well as publishing that as supplementary material. This permits better assessment and the journal can make sure that the images are archived properly.

In the discussion it turned out that many journals, upon requesting original data, get sent PowerPoint slides with screenshot images - completely useless for the task.

Daniel Acuna, a computer scientist from Syracuse University in New York State, USA, provides tools to Research Integrity Officers (RIOs) to help investigate cases. His talk on "Helping research misconduct investigations: methods for statistical certainty reporting of inappropriate figure reuse" was about a statistical tool that helps evaluate whether the excuse of a scientist ("it just happened by chance") really makes sense.

Similar instruments might indeed generate similar artefacts, image processing software might generate similar noise, software reproducibility might generate similar results, and there is some reuse of images that is legitimate, for example, generic brains used as underlays for captions.

They scraped about a million images they could find on PubMed Central, having to extract them from PDFs, which does not actually make things better. They calculated a similarity index, setting a high likelihood threshold and then looking at the results. They managed a 63 % area under the ROC curve, which is not brilliant, but marginally better than flipping a coin (50 %). They need more images in order to refine their algorithm.
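
For those who, like me, need a refresher: the area under the ROC curve measures how well a similarity score separates genuine reuse from innocent matches across all possible decision thresholds, where 0.5 is chance and 1.0 is perfect. A toy version of such an evaluation might look like this (the labels and scores are invented for illustration; this is not Acuna's code):

    # Toy illustration of the reported metric -- not Acuna's system.
    # Each image pair gets a similarity score; reviewers label which pairs are
    # genuinely inappropriate reuse. ROC-AUC summarizes how well the score
    # separates the two classes over all thresholds (0.5 = coin flip).
    from sklearn.metrics import roc_auc_score, roc_curve

    labels = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]   # 1 = confirmed inappropriate reuse
    scores = [0.91, 0.35, 0.62, 0.48, 0.55, 0.20, 0.88, 0.60, 0.15, 0.42]

    print("AUC:", roc_auc_score(labels, scores))

    # The curve itself: true-positive vs. false-positive rate as the decision
    # threshold sweeps from strict to lenient.
    fpr, tpr, thresholds = roc_curve(labels, scores)
    for f, t, th in zip(fpr, tpr, thresholds):
        print(f"threshold {th:.2f}: TPR {t:.2f}, FPR {f:.2f}")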

Thorsten Beck from the HEADT Centre (funded by Elsevier) at the Humboldt University, Berlin, spoke about the image integrity database that they are putting together. Bik, Fang & Casadevall have shown in their 2016 and 2018 papers that about 4 % of published papers contain problematic images; a good 35 000 papers are in want of retraction for this reason.

They want to build a structured database with images from retracted publications, recording as much information as they can about the authors of the publications, their institutions, the reason for the retraction, etc. However, retraction notices are famous for being vague, on account of authors suing journals. They want to keep track of who manipulated the image and who detected it, but seeing as how institutions are highly reluctant to disclose the results of an investigation, good luck in trying to obtain that data. [Although Nature has a World View column this week by C. K. Gunsalus calling for institutions to be more transparent about their decisions]. And then there are copyright issues, so there are many challenges.

Jennifer Byrne, an oncologist from the Children's Hospital at Westmead, Australia, presented her work together with Cyril Labbé (University of Grenoble, France) on the Seek & Blastn tool.

She first gave us a two minute introduction into genetics, noting that the nucleotide sequences for certain genes are such long strings of letters that no human being can easily remember them. She does, however, remember the name given to some cell line, TPD52L2, that she had worked with ages ago. There had been a dozen and a half papers about this many years ago, and suddenly it was popping up all over the place in papers by various Chinese authors for a wide variety of cancers, which is impossible. The cells come from only one organ.

[Matt Hodgkinson has sent in a correction: "Small correction - TPD52L2 is a gene Jennifer cloned in 96. The authors of suspect papers often reported studying it in cell lines known by https://t.co/1Lci9g8bfb to be really HeLa & they often got primer sequences for detecting & knocking down the genes wrong." I can't pretend to understand that, but I'm thankful for the correction!]

As she began reading the papers, she realised that they didn't make sense at all, something about the targeting sequence being off. In speaking with Cyril about this issue, he immediately saw that nucleotide sequences are just one big word each, so it is simple to parse them out of papers. He went and did so, and was even able to identify the context in which these nucleotide sequences were used, so that impossible uses of them could also be identified.
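
Since a nucleotide sequence really is just one long "word" over the letters A, C, G and T, a first-pass extractor can be little more than a regular expression. A minimal sketch of the idea (my own illustration, not their actual tool; the example sentence and the 15-base cutoff are invented):

    # Minimal sketch: pull candidate nucleotide sequences out of paper text.
    # A real check would then verify each candidate (e.g. via BLAST) against
    # the gene the paper claims to be studying.
    import re

    NUCLEOTIDE = re.compile(r"\b[ACGTacgt]{15,}\b")  # 15+ bases, arbitrary cutoff

    def extract_sequences(text):
        """Return candidate nucleotide sequences found in a chunk of text."""
        return [match.group(0).upper() for match in NUCLEOTIDE.finditer(text)]

    # Invented sentence in the style of a methods section.
    paragraph = ("The targeting sequence 5'-GCAAGCTGACCCTGAAGTTC-3' was cloned "
                 "into the shRNA vector as described previously.")
    print(extract_sequences(paragraph))  # -> ['GCAAGCTGACCCTGAAGTTC']

The hard part, of course, is what the paragraph above describes: working out the context in which each sequence is used, so that impossible uses can be flagged.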

The system, like many software systems in this area, has large false-positive and false-negative rates. The positives must thus be manually examined before flagging a paper. They published a paper about it in Scientometrics, "Striking similarities between publications from China describing single gene knockdown experiments in human cancer cell lines," identifying the flagged papers. We had a look at the papers they identified with nucleotide sequence overlap and the ones I have been reporting on with text overlap, and found that the same journal was publishing these papers. They are having problems very similar to mine in getting the offending papers retracted.

The service is available online at http://scigendetection.imag.fr/TPD52/ for checking whether there are any publications with a particular sequence. They caution that a paper must be manually verified before taking any action such as commenting or contacting someone. This is not an automatic detector! Cyril will continue refining the algorithm used, he said after the presentation.

We were now down to the final session.

Maura Hiney and Daniel Barr reported on their results from the focus track on ensuring integrity in innovation and impact, and Klaas Sijtsma reported on the progress being made with the Registry for Research on the Responsible Conduct of Research. He now revealed what some of the seemingly odd data collected at submission time was for: they wanted to see how many of the accepted papers had been pre-registered. It wasn't many. I think that pre-registration is fine for clinical trials, but there are many other methods of doing research that do not fit the pre-registration mindset. In particular, when you observe something odd and end up chasing down a crooked alley and suddenly have a great big new field show up, you will hardly have pre-registered what you are writing up for other scientists.

David Moher reported on the Hong Kong Manifesto for Assessing Researchers, discovering that the principles need a good bit of re-drafting.

The best speaker awards for young researchers were announced, and then Lex Bouter extended an invitation to attend the 7th World Conference for Research Integrity, to be held in 2021 at the University of Cape Town, South Africa.

That's it for blogging, I'm writing this on the plane with a screaming baby in my aisle, I will now put some music on the noise-cancelling headphones, in the hopes of drowning out the piercing screams. We've still got 4 hours to go....
 
Update 8 June 2019:  Vasya Vlassov had a friend film his talk about Dissernet.

Tuesday, June 4, 2019

WCRI 2019 - Day 1b

https://wcri2019.org

Day 0 - Day 1a - Day 1b - Day 2 - Day 3


Quite refreshed from a long night's sleep and reluctant to venture out into the rain, here's the rest of Day 1 of the WCRI conference 2019!

Session: Principles and Codes 2

Daniel Barr, RMIT University, Melbourne
"Research integrity around the Pacific rim: developing the APEC guiding principles for research integrity"

They looked at integrity systems across the Asia-Pacific Economic Cooperation (APEC) area and collated a close consensus of guidelines:
  • Research integrity systems appear diverse, multi-faceted and dynamic
  • Principles-based policies appear common, but are not uniform
  • Limited coordination of institutions with some exceptions
  • Varied roles in leading or influencing research integrity systems
They did a survey with 499 responses, but 85 % of the respondents were from Mexico, so they had to split their analysis into Mexico and not-Mexico. They also conducted a workshop with various participants. In honor of the memory of Paul Taylor, they have developed the Draft APEC Taylor Guiding Principles for Research Integrity as a top priority for avoiding breaches:
Honesty, Responsibility, Rigour, Transparency, Respect, Fairness, & Diversity
The topic of Diversity was a principle that came out of the workshop.

Natalie Evans was supposed to speak about "Research integrity and research ethics experiences: a comparative study of Croatia, the Netherlands, and Spain" but there was some planning mix-up so Jillian Barr took over and spoke about research integrity in Australia, which was a shame for me, because that is what she talked about at the pre-conference workshop.

Dr. Sonja Ochsenfeld-Repp has been Deputy Head of the Division of Quality and Programme Management at the German Research Foundation since 2016. She spoke about the new Draft Code of Conduct, "Guidelines for safeguarding good scientific practice". The old white paper was first published in 1998 [as a reaction to a very large academic misconduct scandal, though this was not mentioned]; a revision has been underway since 2017.

I asked how many researchers in Germany actually know about and understand these guidelines; she assured me that everyone does. My own experience shows that this is not the case: there are quite a number of university heads who are unaware of the procedures and guidelines set out, even when they are published on their own web pages.

I spoke with another researcher afterwards who had conducted an actual study investigating how many people did, indeed, know about and understand the guidelines. The results appear to be sobering; I'll see if I can get hold of concrete results.

Sergey Konovalov, Russian Science Foundation, Moscow
Research ethics in Russia: challenges for the Russian Science Foundation

RSF has existed for 5 years now, but has less than 10% of the federal budget allocated for science. They audit the accounting of the grants: business class instead of coach, fancy furniture for the bosses' offices. They don't touch the scientific part; they only check whether the expenses are related to the research.

I asked about Dissernet and if that shows that they need to look beyond the economics to the science itself. He says that they have zero tolerance for plagiarism, but the researchers are themselves responsible for the scientific correctness of what they research. I'm afraid that he doesn't understand my question.

Update 2019-06-18: Sergey writes: "Regrettably, I did not manage to answer your question about Dissernet and if that shows that we need to look beyond the economics to the science itself. Frankly speaking, Dissernet has nothing to do with the Russian Science Foundation activities as they check the theses and dissertations and we deal with the project proposals, which is somewhat different.

Maybe, you missed the point that we do check not only financial part of the projects but also the scientific part (not by RSF staff but it is done by our expert council members), which is equally important to us.
We do not have much of plagiarism concerns but we strictly check the scientific acknowledgements (funding source should be properly indicated in the RSF-funded publications) and duplication of proposal contents submitted to RSF and other funding agencies [...]; these two scientific issues are in our view one of the most common examples of unethical behaviour of researchers in Russia. At least, in our experience (again, our programs cover only 10% of researchers and 15% of research organisations in Russia)."

Session: Predators

I am quite interested in the entire predatory publisher phenomenon, so I decided to attend this session, although there were at least two others in parallel with interesting talks. One was on Dissernet, the Russian plagiarism documenting group (but I know about them already) and the other one was a symposium on "Misdirected allegations of breaches of research integrity" with Ivan Oransky from RetractionWatch.

First up was Rick Anderson from the University of Utah, Salt Lake City with "Predators at the gates: citations to predatory journals in mainstream scientific literature". He identified some predatory journals that had published nonsense in sting operations and then took some papers from each of these journals. He then looked at citations to these papers in the Web of Science, ScienceDirect and PLOS. Yes, there were citations to some of these. I was a bit concerned that he didn't look at the papers themselves to see if they made sense, as misguided individuals will publish good science in bad places. 

Next was Donna Romyn, Athabasca University, St Albert, Canada (a virtual university) on "Confronting predatory publishers and conference organizers: a firsthand account".

She decided to attend a supposed predatory conference on purpose and to chart her journey. She submitted an abstract, "At risk of being lured by a predatory publisher? Not me!". The paper was accepted within 24 hours, so there must have been rigorous peer review done.... There was a bit of back and forth about her giving a keynote; she ended up with the exact same abstract but under a different title, "Safeguarding evidence-informed nursing practice from predatory publishers." She attended the conference and found about 60 people in attendance, many unaware of the nature of the conference. During the discussion the site thinkchecksubmit.org came up; it has a good checklist on what to look at before submitting a paper.

Miriam Urlings from Maastricht University, Maastricht, looked at "Selective citation in biomedical sciences: an overview of six research fields". She did a citation network analysis for papers in six focused research fields with around 100 relevant potential citations in order to see if there is citation bias. There were, however, only 1-2 citations for many of the publications and then a few highly cited ones in each area, so the results were not conclusive.

Eric Fong, University of Alabama, Huntsville (with Allen W. Wilhite), spoke on "The monetary returns of adding false investigators to grant proposals". He developed an interesting economic model for looking at whether adding false investigators (FI) to grant proposals increases the monetary value of total grants over a 5-year period. They then emailed 113,000 potential respondents and had a 9.5 % response rate. Their conclusion: if you add FI to your grants, you apply for more grants, but that does not lead to larger funding per grant application. However, adding FI significantly increases cumulative total grant dollars over a five-year period.

Vivienne C. Bachelet, Universidad de Santiago de Chile (USACH), Santiago, spoke about the interesting problem of academics putting institutional affiliations on their bylines without actually being employed at the institution. "Author misrepresentation of institutional affiliations: exploratory cross-sectional case study on secondary individual data".

They focused on researchers giving a Chilean university as an affiliation in 2016 and tried to verify whether each person was actually affiliated with the university. For around 40 % of the authors, it was not possible to verify their connection to the university. This private investigation, done with no funding, was commenced after it became known that one university in Chile was paying prolific, Spanish-speaking researchers to add an affiliation with their university, presumably to increase some metric the university is measured by.

After this I attended the COPE reception. There was a lot of very good discussion there, and some publishers I had mentioned in my talk were very interested to hear more about my cases.


A colleague (who wishes to remain unnamed) reported from a parallel session; here's her take on those presentations (edited to fit my format and fix typos):

Session: Prevention 1

Michael Kalichman, UC San Diego, San Diego
Research misconduct: disease or symptom?

He surveyed RIOs on their perceptions of cases, and got some data showing that research misconduct occurs in cases deficient in Good Research Practices (i.e. maybe what these courses really need to teach is record keeping).

He listed 10 GRPs (or the lack thereof) and collected from ~30 RIOs (out of 60 emailed) which practices were in place in cases they had personally investigated. The results were to be expected, but it was a very good talk.

Michael Reisig, Arizona State University, Phoenix
"The perceived prevalence, cause, and prevention of research misconduct: results from a survey of faculty at America’s top 100 universities."

He has a forensic background and corresponded with about 630 respondents about the prevalence, causes, and prevention of research misconduct, and found that ~50% of the people surveyed were very much in favor of formal sanctions to prevent future misconduct. 29 percent said that nothing works, and "30%" wanted an integrated approach. QRPs are pretty common. But the slides went too fast for numbers.

Sudarat Luepongpattana, National Science and Technology Development Agency, Thailand, Bangkok
"Research quality development of the Thailand National Science and Technology Development Agency"

Yet another survey, which found that researchers don't really know what authorship entails.

Ignacio Baleztena, European Commission, Brussels
"National practices in research integrity in EU member states & associated countries"

I left during this. My understanding was that representatives from 14 countries were going to have meetings, and then more meetings, and then follow a flow chart of meetings, and then produce a definition of research integrity. I was getting seriously jetlagged, but that's the memory. I just don't understand why we need YET ANOTHER document. Are there any rules of thumb for when these are useful? [This is an excellent observation. Everyone is producing their own documents (sometimes by gently plagiarizing other institutions' documents) on academic integrity. But how do we breathe life into them, change the culture? --dww]

[She missed the last talk and went to another session. She caught the tail end of a survey on Finnish attitudes toward QRPs; the respondents said that it was hard to find a definition of research integrity.]

Session: Attitudes 3

De-Ming Chau, Universiti Putra Malaysia/Young Scientists Network-Academy of Sciences, Malaysia, Serdang
"Effectiveness of active learning-based responsible conduct of research workshop in improving knowledge, attitude and behaviour among Malaysian researcher"

He did a survey (small sample size) and found that researchers with more experience say they are more likely to “do nothing” if a colleague is engaging in research misconduct

He’s pretty impressive; an NAS grant got him started designing RCR in Malaysia, and the programs are being designed by early career researchers


Sunday, January 1, 2017

Things leftover in tabs from 2016

Happy New Year!

I seem to have collected quite a number of interesting stories that are hanging around in my browser tabs. Let me just document some of them here.
  • Serays Maouche reports in December 2016 in Mediapart in France about a plagiarism case that involves a person who is a professor at the École Centrale Paris and a director at the Atomic Energy Commission. It involves plagiarism in a number of texts, among them a biography of Einstein. The institutions involved have nothing to say on the matter. Ms. Maouche closes with the question "Comment sanctionner des étudiants pour plagiat, si on accepte cette fraude académique pour des directeurs et des académiciens ?" (How can we sanction students for plagiarism if we accept this academic fraud from directors and academicians?)
  • It was reported by the Guardian in November that the results of one portion of the ACT exam, one used by US-American universities to determine admission for foreign students, have been invalidated for Asia-Pacific students. No details were available. 
  • In Spain, el diario reported on November 21 and  November 23 about a plagiarism case involving the rector of a Spanish university. The Google translate version is not very clear, so I don't want to try and summarize it here, just give the links. 
  • In October the Chinese Global Times wrote about a report in the "Southern Weekly" about Chinese scientists and medical practitioners paying journals to publish ghostwritten articles so that they can obtain promotions. Springer has since retracted 64 publications and BioMed Central 43 for faked peer reviews. 
  • Radio Free Asia reported on September 21, 2016 that students in Laos had to retake college entrance exams after more than 100 students obtained a perfect score on the social sciences part of the exam. Students are angry, as they will again have to incur traveling expenses in order to retake the exam.
  • Donald McCabe, a prolific researcher from Rutgers Business School who focused on determining how prevalent academic misconduct is amongst pupils and students worldwide and on the use of academic honor codes to prevent misconduct, passed away at age 72 on Sept. 17, 2016. I was lucky to get to meet Don in 2012 when he gave a talk at our university and we drove together down to Bielefeld for a conference. He will be sorely missed.
  • The Moscow Times reported on September 8, 2016 that Russian education officials  "have reportedly developed draft legislation that would make it possible to revoke a person's academic doctorate only after a copyright ruling by a court has come into effect. " Although copyright and plagiarism or other forms of academic misconduct have little to do with each other, this is apparently in response to the documentation work of Dissernet, who have documented plagiarism in hundreds of dissertations, among them many submitted by politicians to Russian universities. 
  • There was a flurry of publications about paper mills and the problem of contract cheating, that is, students paying someone else to do their work for them. In the UK the Quality Assurance Agency for Higher Education published a report on contract cheating in August. The chief operations officer at an essay mill then wrote a defense of his industry for the Times Higher Education which sparked quite a debate. Tricia Bertram Gallant, also writing in the THE, called on universities to fight contract cheating by openly discussing the topic with students. October 19 was declared the "International Day of Action Against Contract Cheating" and a number of institutions worldwide participated. 
  • The Age reported in October about an inside job at the University of Melbourne in Australia where grades on a manually graded exam were changed after grading, with a red pen, by someone who had access to the exam papers. The university was unable to determine who was responsible for the change.
  • Joanna Williams reported in June in the Times Higher Education about a survey on research misconduct in the UK.
  • In July 2016 the USA issued a patent (US9389852) to Indian researchers on a method for determining "plagiarism" in program code from Design Patterns. That Design Patterns were explicitly meant to be copied appears to have escaped the Patent Office. 
  • The blog iPensatori analyzed how Google Scholar gets filled up with junk.
  • The Office of Research Integrity has put up some infographics on their site about research integrity. They also have a guide on avoiding self-plagiarism.
  • And while I am on the subject, the 5th World Conference on Research Integrity will be held from May 28-31, 2017, in Amsterdam (I am on the program committee). The conference proceedings from the previous conference are available here. There will also be the 3rd International Conference Plagiarism In Europe and Beyond from May 24-25 in Brno, Czech Republic. And no, there are no direct flights Brno-Amsterdam.
  • On March 18, 2016 the German DFG announced sanctions against an unnamed researcher who will be barred from applying for financing for three years.

Tuesday, March 1, 2016

Fake Academic Degrees in Russia

This is a guest post by one of the members of the Russian plagiarism documentation group Dissernet


Fake Academic Degrees in Russia 

By Andrei Rostovtsev  
dissernet@gmail.com

The practice of awarding fake academic degrees to politicians, businessmen, doctors in clinics, professors in universities, and teachers in schools, that is, to all those who wish to use their new academic titles to step onto a faster career route, is widely accepted in Russia. The academic titles are awarded throughout the country. This business is based on the production of falsified dissertations. In early 2013 a group of five scientists and journalists established a social network called “Dissernet”. The Dissernet is a volunteer-effort free association aimed at making fraud and trickery in the awarding of academic titles transparent and well-known to the public. By 2016, Dissernet activists have identified more than 5000 plagiarized and falsified dissertations. In falsified dissertations not only is the text copied, but also the numerical data in it are assigned to a different year or region (in economics, law, and sociology), or to a different disease and treatment (in medicine), see discussion below. Over 1000 cases of such dissertations are documented on the website of the Dissernet (www.dissernet.org). Statistical data collected by the Dissernet yield a number of conclusions discussed below.

First of all, there is an important difference between the ways scientific writings are plagiarized in Russia and in Western countries. In the West, plagiarism is often associated with an intentional incorporation of other people's texts or ideas in one's own scientific research. That is probably why the 'western style' can involve many intricate small-scale mosaic plagiarisms intentionally placed in the original text. Yet in Russia, most often Dissernet deals with authors who have never done research and might have never even seen their dissertation texts at all. Such 'dissertations' are usually nothing else but a mere compilation of other people's texts glued one paragraph after another in a haphazard way, something Weber-Wulff calls "shake & paste" [1].

In extreme cases the new text is just an older dissertation with a title page changed to reflect the new candidate. Sometimes the new candidate changes the subject of his or her ‘research’ too—usually by contextually substituting some terms throughout the whole text. For example, one notorious ‘scholar’ transformed a dissertation about the confectionary industry into a dissertation about the beef-and-dairy industry by substituting ‘dark chocolate’ with ‘homegrown beef,’ ‘white chocolate’ with ‘imported beef,’ and ‘nut chocolate’ with ‘bone-in beef ’ (see http://www.dissernet.org/expertise/igoshin.htm and http://cook.livejournal.com/202638.html, in Russian). In the meantime, all the data, tables, pictures, and spelling remained unchanged. Sometimes such authors also ‘update’ the dating of the statistics they refer to, thus making their ‘research’ seem to have been done more recently.

Detection of thousands of fraudulent dissertations by the Dissernet is mainly the result of a unique technology used. In Russia, along with the dissertation a so-called avtoreferat must be made publicly available before the Ph.D. defense. The avtoreferat consists of a shortened dissertation content (usually 20–30 pages) and the main research results. Importantly, the texts of the avtoreferats are indexed by public search engines (such as Google or Yandex). The dissertation itself is not usually indexed, however. But if the dissertation contains large fragments of plagiarized text, as described above, its avtoreferat would also have text coinciding with earlier works. The specific Dissernet software is able to pick up the avtoreferats one by one and takes advantage of the search engines' indices to look for textual coincidences within the whole publicly available corpus of Russian digitized texts, including texts of other avtoreferats. This program runs 24 hours a day and 7 days a week. Thus a few hundred thousand dissertations have been automatically checked. Furthermore, Dissernet takes advantage of the common practice of chain-like fraudulent dissertation production. As soon as rampant plagiarism is detected in one dissertation, it is very likely to be detected as well in other dissertations defended by the same dissertation council or with the same supervisor. This happens because the producers of fake dissertations in Russia work in a conveyor-belt mode using very limited sets of scientific texts as sources. By focusing on practically totally plagiarized texts, the Dissernet deals only with a small tip of scientific fraud in Russia. But even so, in problematic fields such as economics and law, about 3 % of dissertations contain large-scale plagiarism. In pedagogy this fraction is a bit higher, but still below 6 %.

Such large-scale dissertation fraud in Russia is a result of corruption that has paralyzed the whole system of awarding academic degrees: from dissertation councils established by the leading universities, where the PhDs are awarded, through the Higher Attestation Commission (the agency which coordinates and validates the awarding of academic degrees), and finally, what is also very important, to editorial boards of scientific journals, where scientific papers of the prospective doctoral candidates have to be published prior to the defense. It is obvious that if no real research is done, then no relevant scientific papers can be published from such research. Clear affiliations of Russian scientific journals with the fake dissertation industry run by universities (more exactly, certain dissertation councils) have also been traced by the Dissernet. Those three cornerstones (dissertation councils, the Higher Attestation Commission, and journal editorial boards) are the necessary working parts of the mechanism running the conveyor belts of academic fraud in Russia. Very often the same persons serve in these three cornerstone bodies at the same time.
Figure 1. Statistics on false dissertations broken down by scientific fields. (Dissernet data)
Figure 1 shows statistics of fraudulent candidate (Ph.D.) degrees awarded in different scientific fields based on the present Dissernet data (n=5215). As one can see, the most problematic areas are economics, pedagogy, and law. These same areas are the most problematic ones in the everyday (non-academic) life of Russians as well. In my view, this correlation is not accidental. The academic community naturally erects a barrier in the way of fake sciences and mythifications, which could otherwise define a climate for the life of the whole society. In areas where the academic community is strong enough to resist the fraudulent practice of awarding fake academic degrees, the entire non-academic society is not driven by false ideas. In addition, according to SCOPUS, the proportion of fake dissertations in each scientific field is inversely proportional to Russia's international input in these disciplines [2].


Figure 2. Geographic location of the major universities producing fake dissertations. Relative contributions to the total productivity for Moscow and St. Petersburg are given in percentages.
Figure 2 presents the geographic locations of universities that award the fake degrees according to the present Dissernet statistics. Obviously, Moscow and Saint Petersburg play the most important role, as they are among the largest cities. Other cities and towns fall behind. The scale of falsifications in the Caucasus region is relatively large, but on the whole their share in the national statistics isn't that high. This means the phenomenon of scientific fraud in Russia is not a marginal one. It is not localized somewhere on the outskirts of the country. Today it plays the role of an institution that is well integrated into the contemporary Russian state. Why do we qualify this phenomenon as institutional rather than as subject to the free market?

Several recent laws and decrees protect owners of falsified academic degrees. The most important one (see http://www.rg.ru/2013/10/01/stepen-site-dok.html and http://www.saveras.ru/archives/6450) makes it impossible to strip a person of an academic degree if its defense took place before 2011. The authorities are quite reluctant to revoke fake academic degrees, even if the defense happened after 2011. The reactions from those accused of plagiarism by Dissernet vary from ignoring it, through calling it nonsense and accusations that it is politically motivated, to accusing Dissernet members of unprofessionalism and arguing that only the appropriate dissertation councils have the right to assess the quality of dissertations (E. Denisova-Schmidt, personal communication). This point of view is broadly supported by state-owned mass media. Still, as of today, the Dissernet has managed to convince dissertation councils to revoke about one hundred fake academic degrees.

Last but not least, Dissernet investigations are relevant not only for an assessment of the Russian fraudulent academic world. Most importantly, the Dissernet provides a unique view on the deterioration of some institutions' reputations in Russia. In order to illustrate this point, several reference groups may be considered: members of the Russian Academy of Science (RAS), directors of Moscow's primary and secondary schools, chancellors of Russian universities, regional governors, and members of the State Duma. Members of each group are selected if they have been awarded an academic degree during the last 15 years. Dissernet did not detect any falsified dissertations by RAS members. Of 141 dissertations defended by directors of Moscow's primary and secondary schools, 23 satisfied the Dissernet criteria for largely plagiarized texts. This amounts to 16 %, a rate which is more than three times higher than the probability of finding large-scale plagiarism in a random pedagogical dissertation.

Figure 3. Breakdown of fake dissertations by occupation: a reputation crisis.
This implies a silent mechanism at work selecting and supporting those who are prone to falsifications. The next group is chancellors of Russian universities, which has shown an even higher fraction of 21 %. Of that, one third of such universities are in Moscow. The proportion of politicians representing regional governors and members of the State Duma is even higher, reaching 41 % for the latter. In short, Dissernet performs a sort of a litmus test, identifying those dissertations prone to fraud and trickery, depending on the circumstances, and demonstrates the reputation crisis in Russia. This is illustrated in Figure 3. Why are the authorities, which are charged with larger responsibilities, subject to this stronger negative selection? This question will have to be answered by sociologists rather than Dissernet.

Despite aggressive state politics directed at the Dissernet, this public initiative has gained a good reputation and respect in Russian society in general, as evidenced by several awards and the fact that the name itself has become a meme.

[1] Weber-Wulff, D. (2014). False Feathers: A Perspective on Academic Plagiarism. Heidelberg, Berlin: Springer. 
[2] Rostovtsev, A. (2015). Some Observations on the Subject of Dissertation Fraud in Russia. HERB: Higher Education in Russia and Beyond, 3(5), 17–18. Available at https://herb.hse.ru/data/2015/09/22/1075563638/HERB_05_view.pdf

Sunday, November 29, 2015

This and That

Sorry about the long silence. It's not just been my day job and my research. Someone who was unhappy with one of my blog posts had some lawyers get active. I have had to remove a post (can't say which one or it will cost me even more). I am quite disturbed that scientific discussions are more and more overshadowed by legal threats. Enough on that for now, a reader sent me a fine list of interesting links to international articles about plagiarism a while back, so here's a few!
  • The Korea Times reports on "Public officials accused of plagiarism on papers". The most disturbing part of the article is the first two sentences:
    "Plagiarism is everywhere in Korea where novelists, scholars and politicians habitually copy other people's work, making people insensible to this unethical practice. Public officials are no exception." Habitually. Like it's normal. 
  • There's a big row in Korea at the moment about a retraction of a paper about black holes, the Korea Times reports. It seems that a very young PhD published a paper in 2015 together with his advisor [1] that turns out to be textually and mathematically extremely close to a 2002 conference paper by the advisor alone [2]. There is a blog entry at ScholarlyOA about the case and one at RetractionWatch. A retraction notice was published this past week.
    As an amusing aside, the 2002 paper is followed in the conference proceedings by the following figure that is probably some sort of black hole insider's joke:
     
  • The Moscow Times reports that a Russian Official Has Doctorate Revoked After Plagiarism Charges. The Russian academic group Dissernet had documented plagiarism in the law thesis of a politician, who requested that his dissertation be revoked. He has announced that he wants to re-submit the thesis, with the "borrowing" fixed. I've seen announcements like this in a number of instances, and it puzzles me. Is it believable that people who stoop to plagiarism keep exact records of which bits they stole from what source? I think not. The published documentations are not machine-generated exact tracings of all of the plagiarisms, but only of some of what has been found to date. There can be (much) more.
  • On the topic of re-submitted theses, the Neue Zürcher Zeitung and the Tagesanzeiger have both reported on a VroniPlag Wiki documentation of plagiarism in a Swiss habilitation. The university in question responded, when sent the documentation, that this was a documentation of the first version of the habilitation (which appeared in print) and that it has been superseded by a second version. So they consider the case closed. The second version is not (yet) published, so there is no chance to see whether all of the documented text parallels are now properly quoted.

Wednesday, August 20, 2014

News links

I have some plagiarism news links floating around that need recording:
  • The Moscow Times has an interesting article about Dissernet, the Russian group of researchers documenting plagiarism in dissertations of politicians and academics in Russia.
  • According to Le Figaro and Liberation, the vice-president of the University of Grenoble in France, Dominique Rigaux,  has been accused of plagiarism and has left her post. The documentation of the plagiarism was done by Michelle Bergadaà, a French-speaking plagiarism researcher from the Swiss University of Geneva. 
  • VroniPlag Wiki has currently documented plagiarism in 23 doctoral dissertations in medicine from the University of Münster and 14 from the renowned Charité institution in Berlin. There are a number of theses accepted in forensic medicine that borrow heavily from earlier theses submitted to the same committee and under the direction of the same advisor.

    Both the University of Münster and the Charité have stated that they have begun investigations. But since there are still numerous cases (not only in medicine) from both institutions that are still open one or two years later, this may take some time to clear up. 

Tuesday, April 1, 2014

Russian Plagiarism

A reader sent in two interesting links on plagiarism in Russia:
  • The Washington Post (18 March 2014):
    Russia's plagiarism problem: Even Putin has done it!
    Russia has a really big plagiarism problem. So many businessmen, academics and high-ranking government officials — President Vladimir Putin included — have been found to have plagiarized their college and doctoral theses that Russia’s education minister just denounced the revelations, saying they were hurting Russia’s reputation.
    “People not versed in this topic will get the idea that all academics are cheats and liars,” Education and Science Minister Dmitry Livanov just told the Kommersant newspaper, according to a Russian news agency. “It’s a severe reputational problem for Russian science.”

    [...] Olga Khvostunova of the Institute of Modern Russia, a nonpartisan think tank in Washington, wrote about the plagiarism scandal in Russia in this piece, detailing some of the history of the uncovering of the plagiarism scandals in Russia. She wrote about Putin's plagiarism: "The scandal over Putin's dissertation led nowhere. But because the head of state's deed had no repercussions whatsoever, a new trend emerged in the country: plagiarism in the writing and defense of dissertation works began on an unprecedented scale."
    I would say that the "reputational problem" is not in the reporting about the academic misconduct, but in the misconduct itself.

  • University World News (24 February 2014):
    Government to combat plagiarism and illegal degrees
    The biggest scandal over fake dissertations occurred in the summer of 2013 at Moscow State Pedagogical University, one of the country’s top institutions. Due to serious violations in the preparation of dissertations, several people lost their degrees, the local dissertation council was closed and the head of the university was fired.
    It seems that the Russian software company Anti-Plagiat looked at 14,500 history theses and found plagiarism in every tenth one (roughly 1,450 theses), according to the Moscow News.
  • Update: Just found this in RIA Novosti (from 10 February 2014):
    Russia’s Education and Science Minister has denounced a grassroots campaign to expose alleged academic plagiarism among high-ranking state officials.

    Whistleblowers are harming the public image of Russian academics, Dmitry Livanov said in an interview with Kommersant daily published Monday.
    Strange, I thought it was the plagiarists who were harming the public image of Russian academics.
Previous articles on Copy-Shake-Paste about Russia can be found here.

Friday, February 28, 2014

Short news

I again have a pile of important links that need documenting...
  • Nature reports on a system developed by French computer scientist Cyril Labbé that can be used to detect published papers that were generated by SciGen (a rough sketch of the general idea follows after this list). IEEE and Springer had to admit that they had published not one, not two, but at least 120 papers that were utter nonsense! And some of them appear to have co-authors who are not aware of their co-authorship. Labbé had previously demonstrated that one could set up a fake scientist with fake papers and an h-index of 94, essentially proving that the index is not reliable. I think Springer and IEEE have a lot more papers that need close examination and then withdrawal on account of plagiarism.
  • Flurfunk Dresden has a nice summary with links (in German) to the case of Nina Haferkamp. Stefan Weber had published documentation of plagiarism in her doctoral dissertation. The University of Duisburg-Essen has now, after long deliberation, decided that even though there is scientific misconduct in the thesis, since a "scientific kernel" is there, she gets to keep her doctorate. This raises some troubling questions. Weber has apparently been threatened with legal action, although documenting plagiarism in a thesis or paper is a time-honored method of scientific discourse, often referred to as a book or paper review. And if one can plagiarize away in the "unimportant" parts of a paper or dissertation, does that mean everyone can now plagiarize to their hearts' content, as long as there is some little kernel of truth inside? The University of Duisburg-Essen does not tell us as readers how we can differentiate this kernel from the plagiarism-chaff that surrounds it.
  • VroniPlag Wiki has published case #62, #63, #64, and #65, from the University of Münster (again), University of Kiel (again), the National University of Ireland in Maynooth, and the Free University of Berlin (again). The map is getting quite thick with pins.
  • The Russian Education Minister is apparently unhappy with the work of Dissernet, a group of scientists in Russia who have investigated plagiarism in over 350 dissertations of, among others, politicians. Minister Dmitry Livanov is quoted as saying “People not versed in this topic will get the idea that all academics are cheats and liars. It’s a severe reputational problem for Russian science.” If the shoe fits....
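
To give a very rough idea of how automated screening for generated papers can work: Labbé's detector is reported to compare the word-frequency profile of a suspect text with profiles of known SciGen output (so-called inter-textual distance). The following Python fragment is only a minimal sketch of that general idea under my own assumptions, not his implementation; the sample texts and file names are hypothetical.

```python
# Minimal sketch: is a candidate text closer, in word-frequency profile, to known
# SciGen output than to genuine papers? Illustration only -- NOT Labbé's code.
import re
from collections import Counter

def word_profile(text):
    """Relative word frequencies of a text."""
    words = re.findall(r"[a-z]+", text.lower())
    total = len(words) or 1
    return {w: c / total for w, c in Counter(words).items()}

def distance(p, q):
    """Simple distance between two frequency profiles (0 = identical)."""
    vocab = set(p) | set(q)
    return 0.5 * sum(abs(p.get(w, 0.0) - q.get(w, 0.0)) for w in vocab)

def looks_generated(candidate, scigen_samples, genuine_samples):
    """True if the candidate is, on average, closer to the SciGen samples."""
    cp = word_profile(candidate)
    d_gen = sum(distance(cp, word_profile(s)) for s in scigen_samples) / len(scigen_samples)
    d_real = sum(distance(cp, word_profile(s)) for s in genuine_samples) / len(genuine_samples)
    return d_gen < d_real

# Hypothetical usage:
# scigen = [open(f).read() for f in ("scigen_1.txt", "scigen_2.txt")]
# real   = [open(f).read() for f in ("genuine_1.txt", "genuine_2.txt")]
# print(looks_generated(open("suspect.txt").read(), scigen, real))
```

A real detector would of course need a large reference corpus and a carefully chosen threshold; the point is only that generated gibberish leaves a tell-tale vocabulary footprint that simple statistics can pick up.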

Sunday, November 3, 2013

Dr. Z fights corruption and plagiarism in Russia

[I'm posting some oldish news that needs documenting -- dww]
The German daily newspaper Die Welt published an article by Julia Smirnova on August 23, 2013 about "Dr. Z.", a scientist who is fighting corruption and plagiarism in Russia. A Swedish blogger reported on one of his revelations in February 2013, giving Russia Today from February 22 as his source, but unfortunately no link to such an article can be found.

Andrej Zajakin, according to Smirnova, is a physicist who has lived in Spain for the past six years, doing research at the University of Santiago de Compostela. Using the pseudonym "Dr. Z.", he has been publishing extensive documentation of corruption in Russia. In February the Russian blogger Aleksej Navalnyj revealed that the politician Vladimir Pechtin had lied to voters by stating that he did not own foreign property when he actually owned a house in Florida. Pechtin was forced to step down. Navalnyj gave Zajakin as his source.

Zajakin is one of the persons behind Dissernet, a site that documents corruption and plagiarism in dissertations. The site is in Russian, but is said by Smirnova to have documented plagiarism in over 100 dissertations, including those of Russian politicians such as Pavel Astachov (children's rights commissioner), Olga Batalina (member of parliament), Vladimir Burmatov (member of parliament), Vladimir Gruzdev (governor of Tula Oblast), and Oleg Kowalyov (governor of Ryazan Oblast). The plagiarism documentations are linked from the page http://www.dissernet.org/expertise/ and use a type of documentation similar to that found at the German VroniPlag Wiki:
Plagiarism documentation at http://www.dissernet.org/expertise/kozlovaa2005.htm
The German online portal Spiegel online also published an interview (in German) with Andrej Rostovzev, one of the Dissernet scientists, in April 2013, in which he explains that this site is not organized as a wiki, but only permits vetted individuals to contribute to the effort. I don't speak Russian, but I would be happy to offer guest blogging privileges to anyone who would like to report on the progress being made by this group.

Wednesday, June 5, 2013

Plagiarism in Russia

"Nowhere else in the world is there so much forgery, plagiarism, and bribery in order to obtain a doctorate than in Russia. Politicians, oligarchs, and Mafia bosses - all have ambitions to have that prestigious 'Dr.' in front of their names."

Hans-Joachim Hoppe wrote a very long and detailed article in German about the plagiarism situation in Russia: Faked doctorates, misery at university, and an irritated education minister. 

Friday, September 21, 2012

Russia to check doctorates centrally?

One of my bots has turned up a press release from Russia stating that Prime Minister Dmitry Medvedev wants Russia to have software for checking for plagiarism in graduate theses and PhD dissertations, as well as setting up an Open Access repository of these theses. I don't believe that it is any easier to find plagiarism in Russian than in English or German or French, as plagiarism is more than just word-for-word copies. But it is a step in the right direction, and a step more than many countries, for example Germany, are willing to take. Teaching people about plagiarism and how to write scientifically would be more helpful, in my opinion.
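
For what it's worth, such checking software typically boils down to looking for shared word sequences between two texts. The following is a minimal sketch of that principle under my own assumptions, not the algorithm of any particular product; the sample sentences are made up.

```python
# Minimal sketch of word-for-word overlap detection via n-gram "shingling".
# Illustration only -- not any particular plagiarism checker's algorithm.
# Note that a paraphrase shares almost no exact n-grams and so slips through.
import re

def shingles(text, n=5):
    """All n-word sequences (shingles) occurring in a text."""
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(text_a, text_b, n=5):
    """Fraction of text_a's shingles that also occur verbatim in text_b."""
    a, b = shingles(text_a, n), shingles(text_b, n)
    return len(a & b) / len(a) if a else 0.0

original = "The dissertation was accepted although large parts were taken verbatim from earlier work."
copy = "The dissertation was accepted although large parts were taken verbatim from earlier work."
rewrite = "Even though much of the text had been lifted from prior publications, the thesis passed."

print(overlap(copy, original))     # 1.0 -- a word-for-word copy is caught
print(overlap(rewrite, original))  # 0.0 -- a paraphrase goes unnoticed
```

This is exactly why such word-for-word checkers work about equally well (or badly) in any language, and why they miss translations and paraphrased passages entirely.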

Thursday, June 7, 2012

Plagiarism allegations against new Russian minister of culture

Radio Free Europe / Radio Liberty reports on the history dissertation of the new Russian minister of culture, Vladimir Medinsky. The article begins with some gems from the academic writing of Medinsky:

The Molotov-Ribbentrop Pact "deserves a monument."
The U.S.S.R. never occupied the Baltic states, it just "incorporated" them. 
An infamous picture of a Nazi-Soviet military parade in Poland in 1939 was "photoshopped."
Anti-Semitism in Tsarist Russia has been "greatly exaggerated." 
Sure. Why, I remember that ancient version of Photoshop, must have been version -31, like it was yesterday.

But a group of Russian historians, perhaps inspired by VroniPlag Wiki, have documented plagiarism in 16 places in Medinsky's dissertation.

Radio Free Europe / Radio Liberty quotes historian Lev Usyskin as passing judgement on the non-plagiarized portions of the dissertation:
"The bits that weren't plagiarized did not conform to the slightest academic rigor. This is actually a fraudulent scientific degree. The doctor himself knows this perfectly well -- this is a person who is not embarrassed to stand before the world as a fraudster," he adds. "His morals are clear."
It will be interesting to see how this develops.  Just a few weeks ago, the Romanian minister of education had to step down on charges of plagiarism, and at the end of March Hungary's president had his doctorate rescinded.