Day 0 - Day 1a - Day 1b - Day 2 - Day 3
Quite refreshed from a long night's sleep and reluctant to venture out into the rain, I'm posting the rest of Day 1 of the WCRI conference 2019!
Session: Principles and Codes 2
Daniel Barr, RMIT University, Melbourne
"Research integrity around the Pacific rim: developing the APEC guiding principles for research integrity"
They looked at integrity systems across the Asia-Pacific Economic Cooperation (APEC) region and collated the guidelines, finding a close consensus:
- Research integrity systems appear diverse, multi-faceted and dynamic
- Principles-based policies appear common, but are not uniform
- Limited coordination of institutions with some exceptions
- Varied roles in leading or influencing research integrity systems
The principles: Honesty, Responsibility, Rigour, Transparency, Respect, Fairness, & Diversity. The topic of Diversity was a principle that came out of the workshop.
Natalie Evans was supposed to speak about "Research integrity and research ethics experiences: a comparative study of Croatia, the Netherlands, and Spain", but there was some planning mix-up, so Jillian Barr took over and spoke about research integrity in Australia. That was a shame for me, because that is what she spoke about at the pre-conference workshop.
Dr. Sonja Ochsenfeld-Repp has been Deputy Head of the Division Quality and Programme Management at the German Research Foundation since 2016. She spoke about the new draft code of conduct, "Guidelines for safeguarding good scientific practice". The old white paper was first published in 1998 [as a reaction to a very large academic misconduct scandal, although this was not mentioned]; a revision has been underway since 2017.
I asked how many researchers in Germany actually know about and understand these guidelines, and she assured me that everyone does. My own experience shows that this is not the case: there are quite a number of university heads who are unaware of the procedures and guidelines set out, even though they are published on their own web pages.
I spoke with another researcher afterwards who had conducted an actual study investigating how many people did, indeed, know about and understand the guidelines. The results appear to be sobering; I'll see if I can get hold of the concrete numbers.
Sergey Konovalov, Russian Science Foundation, Moscow
"Research ethics in Russia: challenges for the Russian Science Foundation"
The RSF has existed for 5 years now, but has less than 10% of the federal budget allocated for science. They audit the accounting of the grants: business class instead of coach, fancy furniture for the bosses' cabin. They don't touch the scientific part, only whether the expenses are related to the research.
I asked about Dissernet and whether that shows that they need to look beyond the economics to the science itself. He said that they have zero tolerance for plagiarism, but the researchers are themselves responsible for the scientific correctness of what they research. I'm afraid he didn't understand my question.
Update 2019-06-18: Sergey writes: "Regrettably, I did not manage to answer your question about Dissernet and if that shows that we need to look beyond the economics to the science itself. Frankly speaking, Dissernet has nothing to do with the Russian Science Foundation activities as they check the theses and dissertations and we deal with the project proposals, which is somewhat different.
Maybe, you missed the point that we do check not only the financial part of the projects but also the scientific part (not by RSF staff but it is done by our expert council members), which is equally important to us.
We do not have many plagiarism concerns but we strictly check the scientific acknowledgements (the funding source should be properly indicated in RSF-funded publications) and duplication of proposal contents submitted to RSF and other funding agencies [...]; these two scientific issues are in our view among the most common examples of unethical behaviour of researchers in Russia. At least, in our experience (again, our programs cover only 10% of researchers and 15% of research organisations in Russia)."
Session: Predators
I am quite interested in the entire predatory publisher phenomenon, so I decided to attend this session, although there were at least two others in parallel with interesting talks. One was on Dissernet, the Russian plagiarism documenting group (but I know about them already) and the other one was a symposium on "Misdirected allegations of breaches of research integrity" with Ivan Oransky from RetractionWatch.
First up was Rick Anderson from the University of Utah, Salt Lake City, with "Predators at the gates: citations to predatory journals in mainstream scientific literature". He identified some predatory journals that had published nonsense in sting operations and then took some papers from each of these journals. He then looked at citations to these papers in the Web of Science, ScienceDirect and PLOS. Yes, there were citations to some of these. I was a bit concerned that he didn't look at the papers themselves to see if they made sense, as misguided individuals will publish good science in bad places.
Next was Donna Romyn, Athabasca University, St Albert, Canada (a virtual university), on "Confronting predatory publishers and conference organizers: a firsthand account".
She decided to attend a supposedly predatory conference on purpose and to chart her journey. She submitted an abstract, "At risk of being lured by a predatory publisher? Not me!". The paper was accepted within 24 hours, so there must have been rigorous peer review done.... There was a bit of back and forth about her giving a keynote; she ended up with the exact same abstract under a different title, "Safeguarding evidence-informed nursing practice from predatory publishers." She attended the conference and found about 60 people in attendance, many unaware of the nature of the conference. During the discussion the site thinkchecksubmit.org came up; it has a good checklist of what to look at before submitting a paper.
Miriam Urlings from Maastricht University, Maastricht, looked at "Selective citation in biomedical sciences: an overview of six research fields". She did a citation network analysis for papers in six focused research fields with around 100 relevant potential citations, in order to see if there is citation bias. However, many of the publications had only 1-2 citations, with a few highly cited ones in each area, so the results were not conclusive.
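To make the idea of a citation network analysis concrete, here is a minimal sketch with entirely made-up data (my own illustration, not Urlings' pipeline): papers are nodes, a directed edge means "cites", and the in-degree shows how unevenly citations are distributed, which is the raw material for spotting citation bias.

```python
# Minimal sketch of a citation network: an edge A -> B means paper A cites
# paper B. The data is invented for illustration; Urlings built such networks
# for six biomedical research fields.
import networkx as nx

citations = [
    ("paper_D", "paper_A"),
    ("paper_E", "paper_A"),
    ("paper_F", "paper_A"),  # paper_A is heavily cited...
    ("paper_F", "paper_B"),  # ...while paper_B and paper_C are barely cited,
    ("paper_E", "paper_C"),  # the kind of skew that hints at selective citation.
]

graph = nx.DiGraph()
graph.add_edges_from(citations)

# In-degree = number of times a paper is cited within this network.
for paper, times_cited in sorted(graph.in_degree(), key=lambda x: -x[1]):
    print(f"{paper}: cited {times_cited} time(s)")
```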
Eric Fong, University of Alabama, Huntsville (with Allen W. Wilhite), spoke on "The monetary returns of adding false investigators to grant proposals". He developed an interesting economic model for examining whether adding false investigators (FI) to grant proposals increases the monetary value of total grants over a 5-year period. They emailed 113,000 potential respondents and had a 9.5% response rate. Their conclusion: if you add FI to your grants, you apply for more grants, but that does not lead to larger funding per grant application. However, adding FI significantly increases cumulative total grant dollars over a five-year period.
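The logic of that conclusion is easy to miss, so here is a toy calculation of my own (the numbers are invented, not from the talk): if the award per funded grant stays the same but adding false investigators lets you submit and win more grants, the cumulative total still grows.

```python
# Toy illustration of the stated conclusion, with invented numbers: adding
# false investigators (FI) does not raise the award per funded grant, but more
# applications can mean more awards, so the five-year total is larger anyway.
AWARD_PER_GRANT = 100_000   # assumed identical with or without FI

grants_won_without_fi = 3   # invented figure
grants_won_with_fi = 5      # invented figure: more applications, more awards

total_without_fi = grants_won_without_fi * AWARD_PER_GRANT
total_with_fi = grants_won_with_fi * AWARD_PER_GRANT

print(f"Five-year total without FI: ${total_without_fi:,}")  # $300,000
print(f"Five-year total with FI:    ${total_with_fi:,}")     # $500,000
```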
Vivienne C. Bachelet, Universidad de Santiago de Chile (USACH), Santiago, spoke about the interesting problem of academics putting institutional affiliations on their bylines without actually being employed at the institution: "Author misrepresentation of institutional affiliations: exploratory cross-sectional case study on secondary individual data".
They focused on researchers giving a Chilean university as an affiliation in 2016 and tried to verify whether each person was actually affiliated with the university. For around 40% of the authors it was not possible to verify their connection to the university. This private investigation, done with no funding, was started after it became known that one university in Chile was paying prolific, Spanish-speaking researchers to add an affiliation with that university, presumably to increase some metric the university is measured by.
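As a small sketch of what such a verification boils down to (my own simplified illustration with placeholder names, not the actual USACH study, which worked from publication bylines and institutional records), one can compare the authors who claim an affiliation against those whose affiliation can actually be confirmed and compute the unverifiable share:

```python
# Simplified sketch: compare authors who list an institutional affiliation in
# their bylines against those who can be verified (e.g. via staff directories
# or institutional repositories). All names here are placeholders.
claimed_authors = {"author_01", "author_02", "author_03", "author_04", "author_05"}
verified_authors = {"author_01", "author_03", "author_05"}  # affiliation confirmed

unverified = claimed_authors - verified_authors
share_unverified = len(unverified) / len(claimed_authors)

print(f"Unverifiable affiliations: {sorted(unverified)}")
print(f"Share unverifiable: {share_unverified:.0%}")  # 40% in this toy example
```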
After this I attended the COPE reception. There was a lot of very good discussion there, and some publishers I had mentioned in my talk were very interested to hear more about my cases.
A colleague (who wishes to remain unnamed) reported from a parallel session; here's her take on those presentations (edited to fit my format and fix typos):
Session: Prevention 1
Michael Kalichman, UC San Diego, San Diego
Research misconduct: disease or symptom?
He surveyed RIOs (research integrity officers) on their perceptions of cases and got some data suggesting that research misconduct occurs in cases deficient in Good Research Practices (i.e. maybe what these courses really need to teach is record keeping).
He listed 10 Good Research Practices (or the lack thereof) and reported, from ~30 RIOs (out of 60 emailed), which practices were in place in cases they had personally investigated. The results are about what you would expect, but it was a very good talk.
Michael Reisig, Arizona State University, Phoenix
"The perceived prevalence, cause, and prevention of research misconduct: results from a survey of faculty at America’s top 100 universities."
He has a forensic background and surveyed about 630 respondents about the prevalence, causes, and prevention of research misconduct. He found that ~50% of the people surveyed were very much in favor of formal sanctions to prevent future misconduct, 29% said that nothing works, and about 30% wanted an integrated approach. QRPs appear to be pretty common. But the slides went by too fast to catch the exact numbers.
Sudarat Luepongpattana, National Science and Technology Development Agency, Thailand, Bangkok
"Research quality development of the Thailand National Science and Technology Development Agency"
Yet another survey; she found that researchers don't really know what authorship entails.
Ignacio Baleztena, European Commission, Brussels
"National practices in research integrity in EU member states & associated countries"
I left during this one. My understanding was that representatives from 14 countries were going to have meetings, and then more meetings, and then follow a flow chart of meetings, and then produce a definition of research integrity. I was getting seriously jetlagged, but that's my memory of it. I just don't understand why we need YET ANOTHER document. Are there any rules of thumb for when these are useful? [This is an excellent observation. Everyone is producing their own documents (sometimes by gently plagiarizing other institutions' documents) on academic integrity. But how do we breathe life into them and change the culture? --dww]
[She missed the last talk and went to another session. She caught the tail end of a survey on Finnish attitudes toward QRPs, in which it was noted that it was hard to find a definition of research integrity.]
Session: Attitudes 3
De-Ming Chau, Universiti Putra Malaysia/Young Scientists Network-Academy of Sciences, Malaysia, Serdang
"Effectiveness of active learning-based responsible conduct of research workshop in improving knowledge, attitude and behaviour among Malaysian researcher"
He did a survey (small sample size) and found that researchers with more experience say they are more likely to "do nothing" if a colleague is engaging in research misconduct.
He's pretty impressive; an NAS grant got him started designing RCR training in Malaysia, and the programs are being designed by early career researchers.