
Tuesday, June 4, 2019

WCRI 2019 - Day 1b

https://wcri2019.org

Day 0 - Day 1a - Day 1b - Day 2 - Day 3


Quite refreshed from a long night's sleep and reluctant to venture out into the rain, here's the rest of Day 1 of the WCRI conference 2019!

Session: Principles and Codes 2

Daniel Barr, RMIT University, Melbourne
"Research integrity around the Pacific rim: developing the APEC guiding principles for research integrity"

They looked at integrity systems across the Asia-Pacific Economic Cooperation (APEC) area and collated a close consensus of guidelines:
  • Research integrity systems appear diverse, multi-faceted and dynamic
  • Principles-based policies appear common, but are not uniform
  • Limited coordination of institutions with some exceptions
  • Varied roles in leading or influencing research integrity systems
They did a survey with 499 responses, but 85% of the respondents were from Mexico, so they had to split their analysis into Mexico and non-Mexico. They also conducted a workshop with various participants. In memory of Paul Taylor, they have developed the Draft APEC Taylor Guiding Principles for Research Integrity, a top priority for avoiding breaches:
Honesty, Responsibility, Rigour, Transparency, Respect, Fairness, & Diversity
The topic of Diversity was a principle that came out of the workshop.

Natalie Evans was supposed to speak about "Research integrity and research ethics experiences: a comparative study of Croatia, the Netherlands, and Spain" but there was some planning mix-up so Jillian Barr took over and spoke about research integrity in Australia, which was a shame for me, because that is what she talked about at the pre-conference workshop.

Dr. Sonja Ochsenfeld-Repp has been Deputy Head of the Division Quality and Programme Management at the German Research Foundation (DFG) since 2016. She spoke about the new draft code of conduct, "Guidelines for Safeguarding Good Scientific Practice". The old white paper was first published in 1998 [as a reaction to a very large academic misconduct scandal, although this was not mentioned]; a revision has been underway since 2017.

I asked how many researchers in Germany actually know about and understand these guidelines; she assured me that everyone does. My own experience shows that this is not the case: there are quite a number of university heads who are unaware of the procedures and guidelines set out, even though they are published on their own web pages.

I spoke with another researcher afterwards who conducted an actual study investigating how many people did, indeed, know about and understand the guidelines. The results appear to be sobering; I'll see if I can get hold of the concrete numbers.

Sergey Konovalov, Russian Science Foundation, Moscow
"Research ethics in Russia: challenges for the Russian Science Foundation"

The RSF has existed for 5 years now, but receives less than 10% of the federal budget allocated for science. They audit the accounting of the grants: business class instead of coach, fancy furniture for the bosses' offices. They don't touch the scientific part; they only check whether the expenses are related to the research.

I asked about Dissernet and whether that shows that they need to look beyond the economics to the science itself. He said that they have zero tolerance for plagiarism, but that the researchers are themselves responsible for the scientific correctness of what they research. I'm afraid that he didn't understand my question.

Update 2019-06-18: Sergey writes: "Regrettably, I did not manage to answer your question about Dissernet and if that shows that we need to look beyond the economics to the science itself. Frankly speaking, Dissernet has nothing to do with the Russian Science Foundation activities as they check the theses and dissertations and we deal with the project proposals, which is somewhat different.

Maybe, you missed the point that we do check not only financial part of the projects but also the scientific part (not by RSF staff but it is done by our expert council members), which is equally important to us.
We do not have much of plagiarism concerns but we strictly check the scientific acknowledgements (funding source should be properly indicated in the RSF-funded publications) and duplication of proposal contents submitted to RSF and other funding agencies [...]; these two scientific issues are in our view one of the most common examples of unethical behaviour of researchers in Russia. At least, in our experience (again, our programs cover only 10% of researchers and 15% of research organisations in Russia)."

Session: Predators

I am quite interested in the entire predatory publisher phenomenon, so I decided to attend this session, although there were at least two others in parallel with interesting talks. One was on Dissernet, the Russian plagiarism documenting group (but I know about them already) and the other one was a symposium on "Misdirected allegations of breaches of research integrity" with Ivan Oransky from RetractionWatch.

First up was Rick Anderson from the University of Utah, Salt Lake City with "Predators at the gates: citations to predatory journals in mainstream scientific literature". He identified some predatory journals that had published nonsense in sting operations and then took some papers from each of these journals. He then looked at citations to these papers in the Web of Science, ScienceDirect and PLOS. Yes, there were citations to some of these. I was a bit concerned that he didn't look at the papers themselves to see if they made sense, as misguided individuals will publish good science in bad places. 

Next was Donna Romyn, Athabasca University, St Albert, Canada (a virtual university) on "Confronting predatory publishers and conference organizers: a firsthand account".

She decided to attend a supposed predatory conference on purpose and to chart her journey. She submitted an abstract, "At risk of being lured by a predatory publisher? Not me!". The paper was accepted within 24 hours, so there must have been rigorous peer review done... There was a bit of back and forth about her giving a keynote; she ended up with the exact same abstract but a different title, "Safeguarding evidence-informed nursing practice from predatory publishers." She attended the conference and found about 60 people in attendance, many unaware of the nature of the conference. During the discussion the site thinkchecksubmit.org came up; it has a good checklist of what to look at before submitting a paper.

Miriam Urlings from Maastricht University, Maastricht, looked at "Selective citation in biomedical sciences: an overview of six research fields". She did a citation network analysis for papers in six focused research fields, each with around 100 relevant potential citations, in order to see if there is citation bias. However, many of the publications had only 1-2 citations, with a few highly cited ones in each area, so the results were not conclusive.

Eric Fong, University of Alabama, Huntsville (with Allen W. Wilhite) spoke on "The monetary returns of adding false investigators to grant proposals". He developed an interesting economic model for examining whether adding false investigators (FIs) to grant proposals increases the monetary value of total grants over a 5-year period. They emailed 113,000 potential respondents and had a 9.5% response rate. Their conclusion: researchers who add FIs to their grants apply for more grants, but that does not lead to larger funding per grant application. However, adding FIs significantly increases cumulative total grant dollars over a five-year period.

Vivienne C. Bachelet, Universidad de Santiago de Chile (USACH), Santiago, spoke about the interesting problem of academics putting institutional affiliations on their bylines without actually being employed at the institution. "Author misrepresentation of institutional affiliations: exploratory cross-sectional case study on secondary individual data".

They focused on researchers giving a Chilean university as an affiliation for the year 2016 and tried to verify whether the person was actually affiliated with the university. For around 40% of the authors, it was not possible to verify their connection to the university. This private investigation, done with no funding, was commenced after it became known that one university in Chile was paying prolific, Spanish-speaking researchers to add an affiliation with their university, presumably to increase some metric by which the university is measured.

After this I attended the COPE reception. There was a lot of very good discussion there, and some publishers I had mentioned in my talk were very interested to hear more about my cases.


A colleague (who wishes to remain unnamed) reported from a parallel session; here's her take on those presentations (edited to fit my format and fix typos):

Session: Prevention 1

Michael Kalichman, UC San Diego, San Diego
Research misconduct: disease or symptom?

He surveyed research integrity officers (RIOs) on their perceptions of cases, and got some data suggesting that research misconduct occurs in cases deficient in good research practices (i.e. maybe what these courses really need to teach is record keeping).

He listed ten good research practices (or the lack thereof) and reported, from ~30 RIOs (out of 60 emailed), which practices were in place in the cases they had personally investigated. The results were what one would expect, but it was a very good talk.

Michael Reisig, Arizona State University, Phoenix
"The perceived prevalence, cause, and prevention of research misconduct: results from a survey of faculty at America’s top 100 universities."

He has a forensic background and surveyed about 630 respondents about the prevalence, causes, and prevention of research misconduct. About 50% of the people surveyed were very much in favor of formal sanctions to prevent future misconduct, 29% said that nothing works, and 30% wanted an integrated approach. Questionable research practices (QRPs) are apparently pretty common. But the slides went too fast to catch all the numbers.

Sudarat Luepongpattana, National Science and Technology Development Agency, Thailand, Bangkok
"Research quality development of the Thailand National Science and Technology Development Agency"

Yet another survey; it found that researchers don't really know what authorship entails.

Ignacio Baleztena, European Commission, Brussels
"National practices in research integrity in EU member states & associated countries"

I left during this. My understanding was that representatives from 14 countries were going to have meetings, and then more meetings, and then follow a flow chart of meetings, and then produce a definition of research integrity. I was getting seriously jetlagged, but that's my memory of it. I just don't understand why we need YET ANOTHER document. Are there any rules of thumb for when these are useful? [This is an excellent observation. Everyone is producing their own documents (sometimes by gently plagiarizing other institutions' documents) on academic integrity. But how do we breathe life into them and change the culture? --dww]

[She missed the last talk and went to another session. She caught the tail end of a survey on Finnish attitudes toward QRP; the presenter said that it was hard to find a definition of research integrity.]

Session: Attitudes 3

De-Ming Chau, Universiti Putra Malaysia/Young Scientists Network-Academy of Sciences, Malaysia, Serdang
"Effectiveness of active learning-based responsible conduct of research workshop in improving knowledge, attitude and behaviour among Malaysian researchers"

He did a survey (small sample size) and found that researchers with more experience say they are more likely to "do nothing" if a colleague is engaging in research misconduct.

He's pretty impressive; an NAS grant got him started designing responsible conduct of research (RCR) training in Malaysia, and the programs are being designed by early career researchers.


Sunday, June 2, 2019

WCRI 2019 - Day 0

https://wcri2019.org

Day 0 - Day 1a - Day 1b - Day 2 - Day 3


I am currently attending the World Conference on Research Integrity (WCRI 2019) in Hong Kong, sponsored by a travel grant from the German Academic Exchange Service (DAAD). It is quite an international conference with over 700 attendees from all over the world, although of course given the location there are many Asia-Pacific countries represented. I will be blogging about the talks and workshops that I attend, which are only a small fraction of the talks held, as there are seven tracks in parallel. As usual, there is one paper in each track that I really want to hear, so I will have to flip three coins to see which session I attend.
The conference is being held at the University of Hong Kong, a large university of high-rises, terraces, and steep stairways nestled in between the skyscrapers of Western Hong Kong.

I attended two pre-conference workshops before the opening ceremonies.

1. How to investigate breaches of research integrity and research misconduct

The workshop was designed by Daniel Barr (RMIT Melbourne, Australia), Ton Hol (University of Utrecht) and Paul Taylor (†, formerly of RMIT), and there were around 50 participants. Three talks were held and there was some discussion.
In his introduction, Dan Barr proposed this definition of research integrity:
Research integrity is the coherent and consistent adherence to a set of principles that underpin the trustworthiness of research.
He noted that people are not consistent – they might behave well one month, and use questionable practices the next month. From this I take it that the focus should be on the research and thus the publication or non-publication itself, not on the person.

Ton Hol, head of the School of Law at the University of Utrecht, then spoke on handling allegations of research misconduct in the Netherlands. The Diederik Stapel case appears to have been quite a force in getting Dutch universities to focus on both preventing research misconduct and formalizing the investigative process.

He first looked into the reasons why an institution needs to deal with accusations of research misconduct. Above all, the public's trust in science should not be affected by faulty research. There are other reasons, of course: not only giving satisfaction to the accuser, but also protecting the reputation of the institution and of the accused researcher, and improving the local research practices. He then explained the difference between an accusatorial and an inquisitorial approach: an institution can either respond to an allegation, or it can start investigations on its own.

He noted that complaints from anonymous accusers can be investigated if there are compelling public interests or the factual basis can be investigated without additional input from the complainant (for example, in documented plagiarism cases).

Jillian Barr, Director of Ethics and Integrity at the National Health and Medical Research Council of the Australian Government, then gave the view on investigating breaches from the perspective of an Australian funding agency. In 2018, Australia published a code for the responsible conduct of research. There are many additional guides published, among them one on investigating potential breaches of the code.

One of the most important aspects in convening a panel for investigating potential breaches is deciding who should be on the panel, as there are potential consequences for those involved. Which members of the department or other departments should be included, should there be external members, should they have prior experience with dealing with such issues, how well do they need to know the code, do they have to understand the relevant discipline, are there conflicts of interest or gender/diversity issues to be addressed? And of course, who should chair such a panel, someone with legal experience? Many questions and no easy answers.

Karin Wallace, from the Secretariat for the Responsible Conduct of Research in Canada, was up next. The body she represents sees plagiarism as one of the largest problems, as well as misrepresentation of credentials. However, each case is unique, so it is not easy to set up guidelines for sanctioning.

Investigation reports do not have names on them, so that the focus for the investigation committee is on the facts of the case, not the institution or person involved. She suggests having a standing investigation committee that is familiar with research integrity, with subject matter expertise filled in on an ad hoc basis. She cautions that external members should be familiar with research integrity procedures and be in close proximity, in order to facilitate their participation.

Finally, Chris Graf, from COPE and the Director of Research Integrity and Publishing Ethics at Wiley, gave us the point of view of the publisher in dealing with breaches of integrity. Wiley, a large scientific publisher, has a number of people working full-time on this topic. 

He noted that research publishers create and maintain the formal "version of record". When an error is identified, corrections are restricted to an equally formal set of permanent options: corrections, expressions of concern, retractions, and withdrawals. The latter is what he calls the "nuclear option", to be used when a paper must be deleted for legal reasons or other serious circumstances; however, the formal bibliographic information is left at the DOI.

He presented some interesting numbers on the various types of cases they handled in 2018 and 2019 to date:
58 Misconduct or Questionable behavior
45 Authorship
43 Data issues
34 Redundant or Duplicate publications
33 Plagiarism
30 Copyright
30 Correction of the literature
26 Legal issues
22 Consent for Publication
20 Questionable or Unethical Research
19 Mistakes
13 Conflicts of Interest
6 Peer review
5 Whistleblowers
5 Funding
4 Contributorship
3 Social Media
2 Editorial Independence
2. Workshop on "The Embassy of Good Science"

Guy Widdershoven opened the session by briefly explaining the project and the web site, which is not yet online. It seemed that about half of the room (again around 50 persons in attendance) was somehow involved in the project; I am not sure, but it seemed to be funded by the EU.

I found a short description on the web:
‘The Embassy of Good Science’ is a European initiative to make the normative framework informing research ethics and research integrity (RE+RI) easily accessible, share RE+RI knowledge and experiences, and foster active participation of the scientific community.
The development of The Embassy platform is underpinned by a stakeholder consultation, which consisted of a series of focus groups in three EU countries with diverse levels of research and innovation (The Netherlands, Spain and Croatia, n=59) and an online community discussion with participants from across Europe and beyond (n=52). Participants included researchers, editors, RE or RI committee members, policy-makers, and funders.
It is an excellent idea to collect the information about research ethics and research integrity, as well as the various guidelines that exist, into some sort of easy-to-use repository. I was intrigued by the idea that they want to provide a forum for discussion about cases and issues relating to research activity, but there was just a rudimentary implementation of tools to facilitate the discussion and no clear concept of how they will attract and retain interested persons for the discussions.

The project will be continuing until 2021, so I do hope that they acquire some deeper understanding of how to cultivate an active community.


Opening Ceremony

The opening ceremony, held in the Grand Hall, which looks like it easily seats 1000 people, included the usual words of greeting. Then two talks were given.

Guoqing Dai (Director-General of the Department of Supervision and Scientific Integrity, Ministry of Science and Technology (MOST) of the People's Republic of China, Beijing) spoke about his department, founded in 2018, and tasked with setting up concerted efforts for coping with new challenges for research integrity that have been identified in China. Multiple measures will be undertaken in China to promote research integrity:
  1. Constantly improve the institutional arrangements
  2. Trying to establish sound operational mechanisms
  3. Strictly investigating and punishing breach behaviors (497 researchers were punished last year)
  4. Strengthening the dissemination and education about research integrity (16 million students received instruction last year)
  5. Deepening the reform of science and technology evaluation.
Alan Finkel, Australia's Chief Scientist, from Canberra, used a funny metaphor to describe the current scientific publication process with four million articles published every year: a bridge. What used to be perhaps a simple structure connecting A with B (the scientist with the reader) is now a triple-decker, multi-lane freeway. It's not about to collapse, but it is showing many signs of stress. The publication bridge must be kept open, but there are currently too many trucks with the wrong cargo or contraband (there are over 20,000 retractions in the Retraction Watch database), or with useless cargo that no one wants. Some drivers speed madly in order to make as many trips as possible. There are smugglers (predatory publishers) and researchers jumping off the bridge into the wild waters of open science.

He identified the biggest problem as the incentives. Systems adapt to follow the incentives, so the incentive system in academia must move away from counting publications and toward quality. As an engineer, he proposed a quality assurance initiative that includes mandatory research integrity training for all researchers (not just new ones) and some version of the Rule of Five (a maximum of x publications in the last y years, for values of x and y close to 5). There should be a "Publication Process Quality Assurance" seal that is positively awarded to journals, and grant-giving institutions should insist on publications only in these journals.


After this well-received talk, a Chinese "Lion's Dance" was presented, and then we were treated to some appetizers and a drink. It was lovely to see old acquaintances and meet new people active in the field.


Updated 2019-06-03 to include Karin Wallace's institution

Sunday, July 16, 2017

Keeping tabs on cheating

I tend to keep tabs open in my browser for weeks with interesting articles I want to explore in more depth. Then Firefox decides to update and crashes so miserably that the tabs are gone. So I'll try to at least post them here. No promises that I can do this with any kind of regularity, like Retraction Watch does with its Weekend Reads.
  • The Japan Times has an interesting article debunking an excuse typically used by students from the Far East: "Confucius made me do it." It seems that the difference between allusion and "literary theft" was well known many centuries ago.

    "If East Asian students and researchers plagiarize, it’s not because of some archaic cultural programming; it’s because modern institutional cultures tacitly condone plagiarism, or lack clear policies for explaining and combating it."
  • In the New Scientist there was an interview with Shi-min Fang, published in 2012, who was awarded the Maddox prize for his work on exposing scientific misconduct in China. It seems that there is a lot of controversy around his work.
  • At University College Cork in Ireland there was a spat about widespread contract cheating, as the Irish Times reports. Ireland is currently considering legislation to make advertising for or providing contract cheating services illegal.
  • Down under, Honi Soit, the weekly student newspaper of the University of Sydney, Australia, reports that the university had considered using some anti-cheating software created by former University of Melbourne students, but decided against it after a trial. The idea was to analyse typing patterns and use multiple login questions in order to make it harder for students to submit essays purchased from contract cheating sites. The issues included the necessity of being connected to the Internet to write an essay, forcing students to write with this system and not the editor of their choice, and a massive invasion of privacy that included tracking the locations of the users and comparing them with the locations of their mobile phones. The software was felt to be impractical and invasive.
  • Back in June, the Daily Times reported that the doctorate of the prorector of the Comsats Institute of Information Technology had been revoked by Preston University.
  • The former head of the Toronto school board lost his teaching certificate for plagiarism. According to The Globe and Mail, he has appealed the ruling and is willing to testify under oath about who helped him produce the plagiarisms.

Monday, December 1, 2014

Diverse links

Here are some links that need documenting:
There will be more to come, I'm afraid.