Showing posts with label predatory publishing.

Thursday, January 9, 2020

Predatory Publishing 2020

It's 2020 and I'm still bogged down, not finished with my notes from half a year ago on the ENAI conference. What can I say? Life and all....

So let's start the new year with a discussion on predatory publishers. Deborah Poff gave a keynote speech at the ENAI conference 2019 on the topic, and as COPE chair she has now published a discussion paper on the topic. There are a number of irritating points, as Elisabeth Bik points out in a Twitter thread, but on the whole this is a good paper to get this very important discussion going in the new year.

How can we tell whether a journal is legitimate? Legitimate in the sense that rigorous peer review is not just promised, but actually performed? We currently live in a world situation in which certain groups attack science because it informs us of uncomfortable truths. Predatory publishers offer them a welcome point of attack, as the weaknesses of the "science" they publish are readily assumed to apply to all science. The "self-regulation" of science has been shown in recent years not to do the work it is supposed to do, despite the efforts of so many to point out issues that need attention.

Researchers need guidance about publication venues. Beall's list was taken down for legal reasons, but there is a web site that publishes an archived copy of the list, captured on 15 January 2017, soon after the 2017 version of the list was published.

There is a useful checklist available at thinkchecksubmit.org, but it is not a list of problematic publications, probably for legal reasons.

We can't keep putting our heads in the sand about the problems of academic misconduct. If we just look away, we let people get away with bad science, and that then reflects on us all.

Monday, January 14, 2019

Flaky Academic Journals

Just a quick link to "Flaky Academic Journals", a blog that is collecting spam emails from, well, flaky academic journals.

Friday, July 20, 2018

Mock Science

Welcome to all new readers of my blog who followed a link from the Süddeutsche Zeitung! An investigative team from the Süddeutsche, WDR, and NDR spent almost a year looking into mock science: predatory publishers and mock conferences. Their TV documentary "Fake Science" airs on Monday, July 23, 2018, on ARD. The NDR has published a short summary in English, and the Süddeutsche has a summary available in German. An Indian journalist from the Indian Express who participated in the team has also published an article online.

I prefer to use the term "mock science" instead of "fake science", because this is different from so-called "fake news". Some of the science that is published by the predatory publishers or presented at the mock conferences is good science, but the authors were lured into thinking that they were writing for well-known journals or presenting at conferences at which they would be able to network with others in their own fields. These publishers and conference organizers are making a mockery out of what should be good science communication.

I also want to make it clear that Open Access is not the villain - there are some very good Open Access publishers out there. I have found too much bad science at so-called traditional publishers that take forever to retract (if it gets retracted at all). Many traditional publishers seem to be much more focused on generating income than on communicating good science.

Wednesday, January 4, 2017

The new predatory publisher list is out

Jeffrey Beall has published the 2017 version of his predatory publisher list. When he started the list in 2011, he had 18 publishers on the list. Now there are 1155! The number of standalone journals has gone from 126 to 1294 in the same time period. Since 2015 he has also been tracking misleading metrics and hijacked journals.

The list is becoming ever more vital as the predatory publishing industry grows. Just the other day a colleague asked my opinion of a publisher she had never heard of, but which had offered to publish a paper of hers. I showed her the site, and we quickly found the publisher on the list. The email to her was summarily filed in the trash. One less researcher to be fooled, but I fear that there are many more. Spread the word to your colleagues that you can check the list before you submit! It's not a guarantee, but it's a tool to help you assess the journal in question.
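Checking a name against such a list is also easy to automate. Here is a minimal sketch; the blocklist entries are invented placeholders (not actual entries from any real list), and the fuzzy-matching step is my own addition to catch lookalike titles that differ from a listed one by only a word or a spelling change:

```python
import difflib

# Invented placeholder entries; in practice you would load the archived list.
BLOCKLIST = [
    "international journal of advanced science and research",
    "global journal of innovative engineering studies",
]

def check_journal(name, blocklist=BLOCKLIST, cutoff=0.85):
    """Look up a journal name: exact match first, then a fuzzy match
    to flag titles suspiciously close to a listed one."""
    normalized = name.strip().lower()
    if normalized in blocklist:
        return "listed"
    if difflib.get_close_matches(normalized, blocklist, n=1, cutoff=cutoff):
        return "suspiciously similar to a listed title"
    return "not on the list (no guarantee of legitimacy)"
```

The fuzzy step matters because predatory publishers often choose names deceptively close to those of respected journals; an exact-match lookup alone would miss them. And as said above: absence from a list is no guarantee of legitimacy.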

Update 20170117: The lists disappeared on January 15; it is assumed that some publisher on the list is attempting a strategic lawsuit against public participation (SLAPP) and has forced a take-down. A detailed discussion can be found at the blog Debunking Denialism.
The lists are still on the Internet Archive for now; the links are given at the above-mentioned blog.

Saturday, July 25, 2015

End of Semester Roundup

Here are some links about plagiarism and academic misconduct from around the world that have been languishing in my inbox:
  • A professor of African-American history at Arizona State University in the US has been demoted from full professor to associate professor, according to the Phoenix New Times, after a second plagiarism scandal about his writing erupted. A blog contains many more details about the plagiarism allegations. Retraction Watch also has an article about this case. The professor is said to have used, among other sources, one that many students find useful: Wikipedia.
  • Writing in a blog at the Daily Pakistan Asif A. Malik points out a wonderful piece by physics professor Pervez Hoodbhoy from the Express Tribune from January 4, 2013. Hoodbhoy starts off with a thought experiment: "Imagine the following experiment aimed at improving Karachi’s police force: suppose that policemen are offered cash prizes for every criminal they kill in a police muqabala, given public recognition and told that promotions to higher posts hinge on their kill count." He spins the story out that, of course, the police would start shooting at anyone, just to increase their pay. That is, in essence, what is happening in academia, except that instead of a "kill count" there are the magic indicators "number of publications" and "citation index". Since 2002-2003 both the pay and promotion for professors in Pakistan depend on the number of papers published and the number of PhD and MSc students graduated. Surprise! These all increased!
    In 2012 Hoodbhoy wrote about the problem of telling the good from the bad, in the Express Tribune column in 2013 he noted that the apparent increase in quality proudly proclaimed by the Higher Education Council (HEC) "was only possible because many university teachers engaged in wholesale plagiarism, faked data and produced research that no one seems to have any use for. As academic ethics went into free fall, university administrators and the HEC turned a blind eye. The new policy — which required learning how to play the numbers game — had the effect of turning many professors into crooks and thieves."
  • The Times of India writes that a professor from the Postgraduate Institute of Medical Education & Research in Chandigarh has been found to have plagiarized from two US sources. Rakesh Sehgal retracted a paper from the journal Tropical Parasitology, published by Wolters Kluwer Medknow. The retraction notice states only that the paper has been retracted, not why. In a previous article, the Times of India noted that according to Indian law, a jail sentence of between 6 months and 3 years can be imposed, and public servants can lose their jobs if found to have plagiarized.
  • Richard de Boer published an article in the Dutch newspaper de Volkskrant on July 4, 2015 called "Welkom in de wereld van nepwetenschap" (welcome to the world of junk science), but the article is unfortunately behind a paywall and in Dutch. I was able to obtain it and use Google Translate. The article deals with mock conferences like WASET and junk journals like those published by the OMICS group, which was removed from the PubMed list in 2013. There are a number of interviews with people attending the conferences, positive and negative, and also with a former colleague of the WASET organizer. The article discusses both Jeffrey Beall's publisher black list and the Directory of Open Access Journals white list, noting problems with each. What a shame such an in-depth article is unavailable to a wider audience.
  • Hatoon Kadi writes in her blog at Arab News the Memoirs of a Saudi Ph.D. student: The menace of plagiarism. She mistakenly believes, as many do, that "[t]here are certain programs that take a few seconds in determining the originality of any research material." No, software can only detect potential text overlap, it cannot determine plagiarism or originality, because all systems suffer from false positives (quotation not seen) and false negatives (source not stored in the database). She discussed the question of hiring a ghostwriter with friends and found to her dismay that many had no qualms about using the work of others. She calls for strict laws to punish plagiarists. I don't think punishment works -- we need to educate people as to why referencing and quotation are important.
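The limitation in that last point is easy to see if you look at what such systems do at their core: they compare textual fingerprints, for example word n-grams. A minimal sketch (a generic illustration of the idea, not the algorithm of any particular product):

```python
def ngrams(text, n=5):
    """Lowercased word n-grams serve as a crude textual fingerprint."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 0))}

def overlap_score(document, source, n=5):
    """Fraction of the document's n-grams that also occur in the source.
    This measures text overlap only: a properly quoted passage scores
    just as high as a copied one (false positive), and a copied source
    absent from the comparison corpus scores zero (false negative)."""
    doc_grams = ngrams(document, n)
    if not doc_grams:
        return 0.0
    return len(doc_grams & ngrams(source, n)) / len(doc_grams)

document = "she said that imitation is the sincerest form of flattery as everyone knows"
source = "imitation is the sincerest form of flattery"
print(overlap_score(document, source))  # substantial overlap, yet this is attributed speech
```

A high score may be a proper quotation; a zero score may just mean the copied source was never in the comparison corpus. Deciding whether overlap is plagiarism remains a human judgment.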
Now, back to grading exams.

Saturday, April 4, 2015

Brazilian Government recommends mock conference

I have been made aware of the following article by Mauricio Tuffani in the online version of the Brazilian daily newspaper Folha de S. Paulo: "Eventos científicos "caça-níqueis" preocupam cientistas brasileiros" ("slot-machine" scientific events worry Brazilian scientists). The article discusses (as far as I can puzzle out with Google Translate) the WASET multiconference to be held in Rio de Janeiro in February 2016. Not one, not ten, but 116 simultaneous scientific meetings are planned to be held in a hotel there. Registration is already open, with rates of up to 450 € for speakers (250 € for listeners only), with a special deal of only 100 € more for an additional paper.

The conference is organized by a publisher, WASET, that is on Jeffrey Beall's list of predatory publishers. A number of universities world-wide warn their academics against submitting to these conferences. Not the Brazilian government, though, according to Folha de S. Paulo: CAPES, the Higher Education Personnel Training Coordination body of the Brazilian Ministry of Education, includes these conferences on its online platform Qualis. This is a list of periodicals and conferences recommended to researchers as venues for publishing their research, as promotion and tenure depend, as in so many places, on the number of published articles and conference presentations, not on their quality.

The conference advertises how well indexed its conferences are. For example, they say that they are indexed in the "International Science Index". Since one of the largest citation databases in the world, the Web of Science, is known as the ISI index (after the Institute for Scientific Information), careless academics could easily jump to the conclusion that this conference is indexed at ISI.

Folha de S. Paulo was unable to get researchers to speak about this on the record, except for an ecologist from Sorocaba. His name is listed as being a member of the scientific committee of one of the 116 events, the "14th International Conference of Geophysics and Environmental Engineering". He was very surprised to hear that he was named there; he did not know the conference and stated that he will take steps to have his name removed from the conference web site.

Folha de S. Paulo asked WASET for comment, but there was no response. The newspaper notes that the company is listed as being in Riverside, California, USA, but the phone contact is in the United Arab Emirates, and they say that the ISSN records for the publication list them as being from Turkey. I was not able to find an ISSN given on the web pages of this multiconference, so I wasn't able to verify that it is indeed listed in Turkey and in the Qualis database.


Looking closer at the web site of WASET [I won't link here for obvious reasons], it is quite easy to see how this operation works. There are multiconferences being held every week in a choice of international locations: Paris, Brussels, Istanbul, Auckland, Taipei, Bali, Dubai, Singapore, London. Conferences are planned up to and including 2027. Inspecting the link for Rio in February, there are, indeed, conferences in 23 categories with varying numbers of individual conferences that all sound similar: International Conference on ..... (fill in the blank). All will take place at the same hotel, which, according to its web page, has only 35 meeting rooms.

The text on the conference pages is boilerplate, identical except for a few subject areas changed to fit the title of the conference. One month is given as the time for peer review by three reviewers. Some of the conference committees are identical across different conferences; sometimes they differ. Not all of the institutions the committee members are said to be affiliated with can be deciphered. The conference photos for the conferences are all the same. If you put the photo's URL into Google's image search, you find it listed as a photo for conferences in Paris, Quebec, London, New York, and San Francisco. One attendee uses it in a university newspaper and identifies herself in the picture, noting that the conference was held in Osaka.

It is high time that universities and research institutions stop using quantitative measures for academic decisions. Predatory publishers and mock conference organizers have perverted the ideas of academic exchange and communication that existed previously and flooded the market with lookalikes. The German research council, DFG, took a step in the right direction in 2010 when they began to base funding decisions not on quantity, but on quality of the research. A researcher can only submit his or her best five publications in applying for grant money, and can only list two publications per year in grant reporting. They also refuse to accept any publication listed as "in press", as some researchers were being quite creative and referring to "in press publications" that hadn't yet been submitted.

Now how do we get the word out to the rest of the world and dry up the funding that is feeding this mock science machine?

Sunday, December 1, 2013

Musings on mock conferences and predatory journals

Jeffrey Beall published the "evaluation form" from a scientist who was lured to one of the many OMICS mock conferences. He describes pretty much all of the behavior that is found at such conferences: no involvement of the people on the committees, shortening the conference, massive no-shows, lots of pictures and awards and a fancy web site. It took a lot of effort on his part to get his name removed from their web site, the entire page has now been pulled. Perhaps scientists should quit attending large conferences at hotels, instead sticking to smaller, focused conferences held at universities?

OMICS also publishes a wide range of "open access" journals that are on the predatory publishing list. I wonder how many of the "editors-in-chief" actually know that they are editors here?

One of the commenters noted that there is now a CWTS Journal Indicator that calculates a "source normalized impact per paper" (SNIP), an impact measure normalized by field, for journals in the SCOPUS database. I looked up a few journals; they seem to have only English-language journals listed. Even just looking at my field, I see so very many journals; how on earth are people able to read all of them? It might be good to check out the journals you are planning on submitting to before you dash off that manuscript.

Saturday, November 30, 2013

Peer Review, Impact Factors, and the Decline of Science

[I've had interesting tabs open for weeks waiting for time to report on them. Sorry for the delay. -dww]

The Economist reported on October 19, 2013 (pp. 21-24) that there is "Trouble at the lab". Indeed. And trouble has been brewing for quite some time without a single identifiable culprit or an easy way to solve the problem. This problem is concerned with predatory publishing, irreproducibility of scientific results, and the use of quantitative data as an attempt to judge quality.

University administrations, search and tenure committees, governments, funding associations, and other bodies need some way of judging people they don't know in order to decide whether to offer them jobs or promotions or funding. This has often boiled down to counting the number of publications, or the impact factors of the journals in which their articles are published. Coupled with the crisis in publishing, with subscription prices exploding, an unhealthy mix is brewing.

Predatory publishers promise quick publication in good-sounding "international" journals, using the Open Access "golden road" to extract fees from authors. They promise peer review, but if any review happens at all, it seems limited to checking the formatting. Established publishers trying to keep up their profits have incorporated more and more journals into their portfolios without keeping a watchful eye on quality control.

Enter John Bohannon. In October 2013 Bohannon published an article in Science, "Who's Afraid of Peer Review?". He details a sting operation that he conducted between January and August 2013, submitting 304 papers with extremely obvious deficiencies to journals that he chose both from Lund University's Directory of Open Access Journals (DOAJ) and from Jeffrey Beall's list of predatory publishers.

Bohannon has put his data online, showing that 82% of the journals chosen from Beall's list accepted the fabricated paper, as did 45% of the journals on the DOAJ list. Predictably, DOAJ is not amused and has accused Bohannon of, among other things, racism, because he chose African-sounding names for the authors (1 - 2).

In August 2013, Nature journalist Richard van Noorden detailed a scheme by publishers called "citation stacking", in which a group of publishers collude to quote extensively from each other's journals in order to avoid being sanctioned for coercive citation. That activity was described in Science in 2012 by Allen W. Wilhite and Eric A. Fong as a process by which authors are instructed to quote from a publisher's own journals in order to increase the so-called impact factor. Van Noorden's article focused on a group of Brazilian journals, so he, too, was accused of racism. This is unfortunate, as it detracts from a very serious problem.

We find ourselves today in a rapidly expanding world, with scientific research being conducted in many different places and much money being invested in producing results. People need publications but have little time for doing peer review, a job that is generally not paid and is performed as a service to the community. Universities in countries without a tradition of rigorous scientific practice have researchers who need publications, and there are people out to make money any way they can. Researchers competing for scarce jobs in countries that are trying to spend less on science and education than they have in the past are also sometimes tempted to follow the path of least resistance and publish with such journals. And some are not aware that they have just selected a publication that merely sounds like one that is well respected, as Beall has noted.

I don't have a solution to offer, other than boycotting the use of quantitative data about publications and getting people to be aware of the scams going on. We need to get serious about peer review, embracing such concepts as open access pre- and post-publication peer review in order to get more rigor into the publication process. I realize that people have been complaining about the decline of science since at least Charles Babbage (Reflections on the Decline of Science in England, And on Some of Its Causes, 1830). But we are in grave danger of letting bad science get the upper hand.

And what happens to those who try to point out some of the dicier parts of science? Nature just published another article by van Noorden, together with Ed Yong and Heidi Ledford, "Research ethics: 3 ways to blow the whistle".

Update 2013-12-01: fixed a typo