Saturday, September 2, 2017

News from the world of academic integrity

The school year is getting underway in the Northern Hemisphere, so a number of links have been twittered or sent my way in the past week that need passing on:
  • Radio Free Europe published an article by Alan Crosby: Montenegro Education Council Members Resign, Learn Lesson About Plagiarism. It seems their curriculum was plagiarized from the Croatian one without reference. 
  • In Australia the Daily Telegraph authors Chris Harris & Bruce McDougall report on a primary school teacher caught offering contract cheating services: Cheating students on marketplace website offering to buy or sell work. It may seem that the practice of contract cheating is rather widespread in Australia, but I believe that is only because researchers there are actively looking for it. I did a quick search on eBay and found lots of offers and people looking for help in Berlin...
  • Linked from another article I was reading was an article from The Sport Digest from 2016 about South Korea: Moon Dae-sung Suspended as IOC Member Over Plagiarism. It seems that Moon, who had won an Olympic gold medal in Athens (2004) in taekwondo, had been awarded a doctorate in 2007 with a thesis on taekwondo. In 2014 Kookmin University rescinded his doctorate. Moon defended himself in the appeal, stating that he had the permission (!) of the author of the text to use it. 
  • The Swedish newspaper Sydsvenskan reported in June 2017 that a record number of university students were sanctioned for academic misconduct in 2016. The news agency TT questioned a total of 33 universities, which reported 733 sanctions (an average of 2.5 per 1000 full-time equivalent students). In 2013 there were only 533 sanctions meted out. The increase in serious cases registered and sanctioned may be due to teachers doing more checking, or to their getting better at discovering students who cheat on exams or plagiarize.

Friday, July 28, 2017

Fraudulent PhDs in Romania

The Times Higher Education web site has an article published July 27, 2017 about Emilia Sercan's work documenting plagiarism in Romanian doctorates. I had the privilege of meeting her at the Brno conference on plagiarism. She used to be able to obtain the doctoral theses via the national library, but now there are restrictions on access. She has recently published a book (apparently in Romanian) on the Romanian "doctorate factories."

Sunday, July 23, 2017

Clippings from Sweden

A good friend in Sweden was kind enough to send me some clippings about academic integrity issues that he has been collecting from Swedish newspapers over the past year or so. Here are my summaries of the articles:

  • Uppsala Nya Tidning, 12 May 2016. Two professors of economics from the University of Uppsala and an assistant professor and a postdoc from the University of Stockholm, including one member of the Nobel Prize committee, have been accused of research misconduct. The accuser (named in the article) discovered his own work plagiarized in an article the accused published in 2011 that was, oddly, also used to prove the opposite result. The journal found the criticism justified, but decided not to take any action. The accused said that the plagiarized text was only in a preliminary version and was removed when they were made aware of it.
  • Uppsala Nya Tidning, 18 May 2016. A doctorate in the immunology of crayfish that had been awarded by the University of Uppsala has been rescinded after it was discovered that there was extensive manipulation of figures. This is the first time that this university has rescinded a doctorate. No sanctions were imposed on the advisor, as there are no rules for doing so. The thesis was a collection of five published papers, four of which had to be retracted. All attempts to contact the former doctoral student were fruitless. He was last seen working on a postdoc at a university in Texas, but was apparently fired there after just a few months for academic misconduct. 
  • Dagens Nyheter, 13 August 2016. Three professors wrote an opinion piece on academic ethics demanding a reform of the current process of dealing with misconduct. Their four major points: a better definition of academic misconduct that differentiates between misconduct and badly-done research; making clear that institutions understand they have a responsibility to deal with academic misconduct; a national instance with sufficient resources to conduct investigations as necessary; and better protection of the privacy of both the whistleblowers and the accused. [Currently, the names of all concerned are open knowledge, according to the Swedish Freedom of Information laws.]
  • Uppsala Nya Tidning, 9 December 2016. A record number of cases of academic misconduct (17) were reported to the Swedish Central Ethical Review Board (CEPN) in 2016. There were only 10 in 2015, and only 1-3 cases a year since the board was set up in 2010. The head of the board is not sure what has caused the surge, but states that it could be a result of the press coverage of the Macchiarini* affair encouraging other whistleblowers to come forward. The head of the Vetenskapsrådet (the Science Council of Sweden) believes that universities are now referring more cases to the review board, as the Macchiarini affair showed the problems that arise when an institution makes the wrong decision. 
  • Dagens Nyheter, 23 February 2017. Commentary by the newspaper's medicine reporter Amina Manzoor about the suggestions proposed by Margaretha Fahlgren, the person appointed by the government to propose changes to how academic misconduct is to be handled in the aftermath of the Macchiarini scandal. Her suggestions include a national body for investigating cases of suspected misconduct; forcing universities to register all cases with the body and permitting individuals to lodge complaints; having the body decide whether or not misconduct happened, but leaving the sanctioning to the university; and setting up a legal definition of misconduct that covers FFP (falsification, fabrication, and plagiarism), with other forms of cheating to be dealt with by the universities themselves. It is expected that the changes will take effect by 2019. 
  • Uppsala Nya Tidning, 7 May 2017 (other reports on 31 March 2017, 29 April 2017 and 12 May 2017). A long article describes a paper, since retracted from Science, about research conducted at a research station on the island of Gotland on the effects of microplastic particles on fish larvae. There is also commentary thanking the whistleblowers in this case, some of whom are from the University of Uppsala. They had attempted to obtain the raw data for the study, but the laptop with supposedly the only copy of the data was reported stolen 10 days after the first request for the data was sent. There were various other excuses for why the data was not available. The whistleblowers had been at the research station at the time the experiments were said to have been conducted, but they did not see anything of the magnitude of the study taking place. The University of Uppsala had at first found no misconduct, but the CEPN found multiple issues, including missing ethical permission for animal experiments. The university now has to decide if and how it will sanction the researchers. 
  • Dagens Nyheter, 7 June 2017. The number of students sanctioned for cheating is skyrocketing. In 2016 there were 733 sanctions recorded, or 2.5 per 1000 full-time equivalent students; in 2015 there were only 630 sanctions, and in 2013 only 533. The jurist in charge at the Universitetskanslersämbetet stated that the increase is due to the universities being more conscious of the problems and getting better at uncovering cheating.
I find it very encouraging that public discussions about academic integrity are taking place in Sweden. Other countries would be well-advised to follow suit.

* The surgeon Paolo Macchiarini implanted artificial tracheas in three patients in Sweden; two of the patients died, and the third was badly injured and recently also died. Karolinska Institutet, the institution at which the research was done, eventually fired a number of people in 2016 after a TV documentary forced them to take action. 

Sunday, July 16, 2017

Keeping tabs on cheating

I tend to keep tabs open in my browser for weeks with interesting articles I want to explore in more depth. Then Firefox decides to update and crashes so miserably that the tabs are gone. So I'll try to at least post them here. No promises that I can do this with any kind of regularity, like Retraction Watch does with its Weekend Reads.
  • The Japan Times has an interesting article debunking an excuse typically used by students from the Far East: "Confucius made me do it." It seems that the difference between allusion and "literary theft" was well known many centuries ago.

    "If East Asian students and researchers plagiarize, it’s not because of some archaic cultural programming; it’s because modern institutional cultures tacitly condone plagiarism, or lack clear policies for explaining and combating it."
  • The New Scientist published an interview in 2012 with Shi-min Fang, who was awarded the Maddox Prize for his work exposing scientific misconduct in China. It seems that there is a lot of controversy around his work.
  • At the University College Cork in Ireland there was a spat about wide-spread contract cheating, as the Irish Times reports. Ireland is currently considering legislation to make advertising for or providing contract cheating services illegal.
  • Down under, Honi Soit, the weekly student newspaper of the University of Sydney, Australia, reports that the university had considered using anti-cheating software created by former University of Melbourne students, but decided against it after a trial. The idea was to analyse typing patterns and use multiple login questions in order to make it harder for students to submit essays purchased from contract cheating sites. Some of the issues included the necessity of being connected to the Internet to write an essay, forcing students to write with this system and not the editor of their choice, and a massive invasion of privacy that included tracking the locations of the users and comparing them with the locations of their mobile phones. The software was felt to be impractical and invasive.
  • Back in June the Daily Times reported that the doctorate of the prorector of the Comsats Institute of Information Technology has been revoked by Preston University.
  • The former head of the Toronto school board lost his teaching certificate for plagiarism. According to The Globe and Mail, he has appealed the ruling and is willing to testify under oath about who helped him produce the plagiarisms.

Sunday, July 9, 2017

German plagiarism cases in the news

There were four articles in German news this past week or so about a very diverse collection of plagiarism cases. Here are the links and short summaries in English:
  1. The taz published an article by Markus Roth about a biography that Stefan Aust, a well-known German writer, published in 2016, „Hitlers erster Feind. Der Kampf des Konrad Heiden“ (Hitler's first enemy - Konrad Heiden's struggle). Heiden, a writer in exile in France, had published a biography of Hitler in the mid-30s. It seems, however, that Aust liberally used text from Heiden himself, just changing the present tense to the simple past tense or adding an explanation of names that would be clear to someone reading in the 30s but not to present-day readers. Some examples are given in the taz article. Aust himself had apparently recently complained that people were looting Heiden's words, while stating that he was setting a monument to Heiden's works. „Wer erzählt hier eigentlich?“ (Who is actually telling the story here?) is apparently a question difficult to answer, unless one has read much of Heiden's work, as Roth has done (he is also working on a biography of Heiden).
  2. Stern reports on a Facebook posting by German folk music star Stefanie Hertel against hate on the historic occasion of Germany passing legislation permitting homosexual couples to marry. Her fans praised her words, but it turned out they weren't actually hers, but from a TV game show moderator, Michael Thürnau. „Ich fand seine Worte so toll, dass ich ihm einfach nur recht geben konnte", she defended herself according to Stern: "I found his words so awesome, that I just had to say that he's right." 
  3. The DFG, the German funding organization for research, announced that they were reprimanding a scientist "in writing". A life scientist (no name or research institute mentioned) was found to have included extensive word-for-word copies from other publications, without reference, in a grant application. The DFG investigated, and the scientist conceded that s/he had copied material for the "state-of-the-art" section.
    Since I don't know what a "reprimand in writing" means, I have written to the DFG to ask for clarification. 
  4. In other DFG news, a Leibniz Prize (2.5 million €) was awarded to a researcher after all. Just prior to the award ceremony in March 2017, plagiarism allegations arose. The DFG postponed the award in order to investigate. They are satisfied that there was no plagiarism, and thus have now given out the award. The allegations were not made public.
Update: Marco Finetti, the spokesman for the DFG, clarified for me: A reprimand "in writing" is indeed just a letter written to the scientist. But since it has been decided on by the Hauptausschuss, the main body of the DFG, all the scientists in that board and the representatives of the state governments (who finance the universities in Germany) heard the details of the case and decided on this as the weakest sanction. "It is a big blow to the reputation of a scientist", Finetti claimed.

Friday, June 2, 2017

WCRI 2017, Day 4

Day 3

I really wanted to hear John Ioannidis (Stanford University, Stanford, U.S.A.) speak in the morning about "Re-analysis and replication practices in reproducible research," but I was so tired that I didn't make it until later. I did have time, though, to speak with Skip Garner. I learned that eTBLAST, the text comparison and search tool that populated the Déjà vu database from MEDLINE, was turned off when he left his previous school. But there is a follow-on project, HelioBLAST. More on this later.

Ana Marušić led the session on Retractions, which included a very curious case. There were three talks about this case; it was a shame that they could not have been given as one talk three times as long (and without two talks on different topics in between).

Alison Avenell (University of Aberdeen, UK) gave the first two talks. She spoke first about "Novel statistical investigation methods examining data integrity for 33 randomized controlled trials in 18 journals from one research group." While preparing a Cochrane study, she and her colleagues noted a rather odd set of studies by the same Japanese authors that managed to recruit and interview 500 women with Alzheimer's and 280 males and 374 females with stroke in just a few months, interviewing the participants every four weeks over a five-month period. And the studies all had the same results, although the patients were supposedly different.

Doing some statistics on the reported values showed it to be highly unlikely that the data was genuine. They wrote to the authors and quickly received a reply that this was an error and that they would correct it. Instead of a retraction, however, only a correction was published.

By now Alison's group was looking at the 33 other RCT studies from the group that they could find. They were published in 16 journals (the highest-ranked being JAMA) over 15 years, with a total of 26 co-authors at 12 institutions. The group tried to see what the impact of these papers was, that is, in how many reviews this incorrect data was currently being used. They found 12 of the studies in secondary publications, one guideline, and 8 trials that used these results as the basis for their research rationale, involving over 5000 people! That means that even at a conservative cost of $500 / person / study, $2.5 million was spent in the belief that the work being expanded on was solid research. And since they were only looking at English-language publications, the impact was probably even wider.

In all, it took three years to get the JAMA paper retracted. Someone from the audience noted that it is difficult to get journals to retract papers at all, mostly for legal reasons. Andrew Grey (University of Auckland, New Zealand) reported on the problems they had getting any of the papers retracted and getting their own paper about the case published (Neurology. 2016 Dec 6;87(23):2391-2402. Epub 2016 Nov 9). He used a timeline that grew more and more complicated as they kept writing back to unresponsive journals. He identified some interesting issues:
  • How should journals deal with meta-reviews that are based on retracted work?
  • Should journals be more forthcoming in the face of unresolved concerns? If it takes 3 years to retract an article, there will be many people who read the paper and perhaps acted on it.
  • Should published correspondence about retracted papers also be retracted?
  • They also emailed medical societies and institutions at which the authors worked - should they have done this?
One of the other talks was by Marion Schmidt (DZHW, German Centre for Higher Education Research and Science Studies, Berlin, Germany) about an analysis she did of the annotation of retractions in PubMed and the Web of Science. She first determined that the word "retraction" is defined differently by various organizations. She noted that many studies of retractions select the papers marked "Retracted Publication" as the article type in PubMed. She conducted a title-based search in PubMed and in the Web of Science using "Withdrawal" in the title and the article type marked as retracted, then validated the results manually. Surprise! There are retractions in PubMed that are not listed in the WoS, and vice versa. And not all withdrawals are marked at all. Sometimes a withdrawal in one database is marked as a retraction in the other. She concluded that the formats used by publishers do not translate losslessly into the different databases, and wondered how citing authors can even be aware of a retraction if PubMed and the WoS do not agree. Even if there were a database of retractions (the audience noted: Retraction Watch!), people would have to check all their references against it.
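At its core, this kind of cross-database comparison boils down to set arithmetic over the identifiers each database flags as retracted. Here is a minimal Python sketch of the idea; the PMIDs are made up, and in reality the sets would come from querying PubMed and the Web of Science:

```python
# Hypothetical sets of PMIDs flagged as retracted in each database.
retracted_in_pubmed = {"10000001", "10000002", "10000003"}
retracted_in_wos = {"10000002", "10000003", "10000004"}

# Retractions one database knows about but the other does not:
missing_from_wos = retracted_in_pubmed - retracted_in_wos
missing_from_pubmed = retracted_in_wos - retracted_in_pubmed
flagged_in_both = retracted_in_pubmed & retracted_in_wos

print(sorted(missing_from_wos))     # ['10000001']
print(sorted(missing_from_pubmed))  # ['10000004']
print(sorted(flagged_in_both))      # ['10000002', '10000003']
```

Her point is precisely that the two "missing" sets are not empty: an author who checks only one database can miss a retraction entirely.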

The other talk in the session was by Noemi Aubert Bonn (Hasselt University, Diepenbeek, Belgium). For some reason it was in the retractions session, although it was not about retractions but about research on research integrity: How is it performed, how is it published, what are the consequences?

In a plenary session on the Harmonization of RI Initiatives, Maura Hiney from the Health Research Board Ireland (HRB) and the lead author of the ALLEA European Code of Conduct for Research Integrity (2017) charted the development across the WCRI conferences: At the first conference people were discussing whether or not there really was a research integrity problem. Later conferences grappled with defining it, finding methods to investigate it, and defining who is responsible for it; now that there are so many different definitions and methods and policies, how can they be harmonized? Simon Godecharle had presented various maps at the 2013 WCRI showing the wide variations that exist in Europe alone, starting with language. By 2017, at least, there are fewer countries with no policy at all.

Daniel Barr from Deakin University, Australia, spoke on the "Positive Impacts of Small Research Integrity Networks in Asia and Asia-Pacific," referring back to the Singapore Statement and noting that RIOs, research integrity officers, are quickly becoming the norm at universities.

Alison Lerner from the National Science Foundation, U.S.A. spoke about the NSF's role in "Promoting Research Integrity in the United States." She spoke of their processes for auditing and investigating cases of fraud, and noted that they have had some extensive plagiarism cases, some of which also involved fraud. Both PubPeer and Retraction Watch were given a shout-out as non-governmental bodies that work on monitoring integrity.

I then did some more session hopping, as the interesting talks were in different rooms.

Skip Garner talked about finding potential grant double-dippers; the process is similar to finding duplicate abstracts in MEDLINE or duplicate abstracts for papers given at different conferences (or the same conference in different years). He spoke a bit about Déjà vu and how many of the duplicates he uncovered were eventually retracted. But the rate of retractions is lower than the rate at which new questionable manuscripts enter the scientific corpus, which is worrying. Even two years after a retraction, 20 % of retracted papers are not tagged as such, and thus people continue to use them.

For fun (yes, computer people perhaps have different ideas of "fun" than other folks) he downloaded the abstracts from scientific meetings that had more than 5000 abstracts each and permitted a longitudinal investigation because the meeting recurs yearly or every other year. Each abstract was compared with every other abstract from the meeting itself, with all abstracts from the previous meetings, and with his collection of MEDLINE abstracts.
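eTBLAST used far more sophisticated text similarity and search than I can reproduce here, but the basic pairwise comparison can be sketched in a few lines of Python. The function names and the 0.8 threshold below are my own illustrative choices, not Garner's:

```python
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a: str, b: str) -> float:
    """Rough text-overlap ratio between two abstracts (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_duplicates(abstracts, threshold=0.8):
    """Return index pairs of abstracts whose texts overlap heavily."""
    flagged = []
    # Compare every abstract with every other one, exactly once per pair.
    for (i, a), (j, b) in combinations(enumerate(abstracts), 2):
        if similarity(a, b) >= threshold:
            flagged.append((i, j))
    return flagged

corpus = [
    "We study the effect of X on Y in a randomized trial.",
    "We study the effect of X on Y in a randomised trial.",
    "A completely unrelated paper about Z.",
]
print(flag_duplicates(corpus))  # → [(0, 1)]
```

Note the quadratic cost of all-pairs comparison; with tens of thousands of abstracts per meeting, a real system would first narrow the candidates with a keyword index, as eTBLAST did, before doing fine-grained alignment.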

He encountered multiple submissions, replicate abstracts with different presenting authors, replicate abstracts from previous years, and plagiarized abstracts. He assured the audience that he did not run this meeting :)

His double-dipping work has been published (Double dipping, same work, twice the money) and reported on (Funding agencies urged to check for duplicate grants) in Nature in 2013. [Drat, I should have downloaded the first one when I was in Amsterdam. The VU has full access to Nature, my school doesn't. Of course, I could buy it for $18....].

During Q&A he was asked if he had reported the cases he found. Indeed he did, and the journals didn't like it. Seems the US government also subpoenaed his database...

Miguel Roig (St. John’s University, NY, U.S.A.) spoke about editorial expressions of concern (EoC). He and some of the Retraction Watch crew pulled EoCs out of PubMed and examined them. They looked at the wording of the EoC and the eventual fate of the paper. Only 7 % resulted in a correction, 32 % in a retraction, and in 4 % of the cases the matter was resolved. For the rest (almost 58 %!!!) there was no follow-up information to be found, even when the EoC had been published four years previously. He referred to a very recent publication (Feb. 2017) on the same topic: Melissa Vaught, Diana C. Jourdan & Hilda Bastian, "Concern noted: a descriptive study of editorial expressions of concern in PubMed and PubMed Central," Research Integrity and Peer Review 2017 2:10. He closed by encouraging journals to be more specific about the reason for the concern and to use EoCs more often.

Mario Malicki (University of Split School of Medicine, Split, Croatia) spoke about his "hobby project" (i.e. no funding) looking at third-party inquiries about possible duplicate publications. He discovered that the National Library of Medicine will assign a tag of "duplicate publication" in the [pt] field if it finds a pair during manual indexing. But no action is taken, and since the mark is hard to find, people don't see it. He downloaded 555 potential duplicate publications and checked to see if they had been retracted. He contacted 250 editors about the duplicates, although 16 editor emails could not be located at all. Not all editors bothered to answer his inquiry, although a few of the papers were eventually retracted. The correspondence with the editors was evaluated, as there were specific questions asked, such as: Are you aware of the duplicate publication field tagging in MEDLINE? Only 1 was aware of this, 15 said no, and 165 did not bother to answer the additional question!

Mario catalogued the answers and the reasons given for not taking action and, as far as he could obtain the information, the excuses of the authors and above all of the publishers for their errors. It seems common for an article to be published twice in different volumes (104 times), doubly published in a sister journal (64 times), or even published twice in the same volume (21 times). Over the span of 4 years, 9 % of the articles identified have been retracted. He did not determine the publishers or the precedence of the publications.

J.M. Wicherts (Tilburg University, Tilburg, The Netherlands) has a theory, namely that transparency and the integrity of peer review are somehow linked. In order to show this, he set up QOAM: Quality of Open Access Market. Here readers rate a journal on various parameters on a scale of 1 to 5. Since it was not made clear which end is best (1 is the top grade in Germany), this has a cross-cultural issue. To date about 5000 ratings have come in; there is one particularly active person. He saw this positively; I would check to make sure it is not someone hired by certain journals. As a quick test, I chose my favorite rabid anti-vaxxer paper, published in a journal that was on the now-defunct B-list. Sure enough, it was in there, with three reviews and a grade of 4.6. I don't really believe that this is a good idea.

At the closing session Nick Steneck presented the Amsterdam Agenda, worked out over the past days, for assessing the effectiveness of what are seen as the most important ways to promote integrity in research.

It was quite an experience, these two conferences in Brno and in Amsterdam. They were different in focus, but both offered much for me to learn. And it was fantastic to meet in person all these people I have corresponded with by email! The next WCRI will be in 2019 in Hong Kong, jointly organized by people in Hong Kong and Melbourne.

I have one other link that I picked up from a tweet I want to preserve here: The Authorship and Publication subway map from QUT Library and Office of Research Ethics & Integrity.

Over and out!

Thursday, June 1, 2017

WCRI 2017, Day 3

Day 2 | Day 4
Day 3 of the World Conference on Research Integrity began with a plenary session on the role institutions play in research integrity. Bertil Andersson, the president of the Nanyang Technological University Singapore, spoke on "Research and Research Integrity - a key priority for a young and fast rising university." He reported in a refreshingly open manner about quite a number of academic integrity cases his university has dealt with. In addition to much plagiarism, authorship disputes, and self-plagiarism, there was outright fraud. He asked how much a university can do to investigate a case when something has happened that also hits the media. He presented four cases:
  1. NTU retracts NIE academic papers after malpractice investigations (The Straits Times), 2016
    A professor, hired in 2006, was contractually obligated to publish 10 papers in 3 years. An external whistleblower alerted the university to data fabrication that included an invented person and an invented company; eventually the police and the Ministry of Education were involved. The case led to 21 retractions. Such a clause is no longer in contracts.
  2. 3 Singapore-based scientists linked to research fraud (The Straits Times), 2016
    NTU professor fired for data falsification (New Press), 2016
    This case involved manipulation of Western blot imagery and three institutions, in Singapore, the USA, and New Zealand. Two PhDs were revoked (and this has to be done by the president of the country, not the university president), nine papers were retracted, and the professor was dismissed for willful negligence. The national research organization is also seeking repayment. As a problematic side effect, students of this professor are now left without a supervisor, and are often not accepted into joint programmes elsewhere because of the tarnished reputation of the laboratory. They are innocent people who suffer.
  3. Adventures in Copyright Violation: The Curious Case of Utopian Constructions (Blog, Lincoln Cushing), 2012
    This was a case of images being misused. The owner of the copyright on the pictures stumbled upon his pictures being used and wrote to the full professor at the Arts School who was using them. All he wanted was his name on the images and a link to his site. The professor refused; a lengthy investigation involving external people ensued (which also uncovered additional problems) and ended with the professor being dismissed. 
  4. Fake Peer Reviews, the Latest Form of Scientific Fraud, Fool Journals (Chronicle of Higher Education, paywalled), 2012
    A scientist managed to hack into the Elsevier system for referee reports. He added sentences along the lines of "paper X needs to be referenced" in order to increase his own citation index. It turned out that there were 122 instances of such hackings. The scientist resigned, but NTU referred the case to the Singapore Police Force under the "Misuse of Computers" legislation, and the scientist has apparently left the country.
Andersson concluded by discussing the challenges of developing a culture of research integrity in a rapidly developing university in a competitive environment. There are aggravating factors such as hierarchy, a heterogeneous faculty, and tolerance of misconduct, as well as the fact that investigations of research integrity cases need competencies outside the traditional university framework, such as lawyers. He emphasized that no university or institution can be immune to research fraud, and thus they need to have clear procedures defined.

The second speaker in the plenary session was Mai Har Sham, the associate vice president of research at The University of Hong Kong. She noted that just recently there was a meeting of the Asia-Pacific Research Integrity Network with 110 participants from 20 countries. She specified three areas in which the institution is called on to take action:
  1. Determination and commitment - provide policies, resources and infrastructure support
  2. RCR education, skills training, setting up a data management system and supporting platforms
  3. Taking the initiative for quality assurance and risk management
The third speaker was Jay Walsh, the vice president for research at Northwestern University, USA. He noted wryly that if you have never been quoted out of context, you haven't been quoted enough. How true. He then embarked on a short description of how we learn: We gather evidence and we form stories. We then develop a hypothesis about how the words in the stories work. We take data, distill it into information, coalesce that into knowledge, and from that develop wisdom.

But things can go wrong that warp the stories, the data, the information, the knowledge and/or the wisdom. Beyond inadequate methods, poor data, and poor practices, he referred to a paper that recently identified 235 forms of bias.

He feels that the path forward involves the training of researchers in the responsible conduct of research (RCR). Since it is hard to change the curriculum, the funders should make RCR training a requirement. Speaking of funders, he desires a single system for handling FFP (falsification, fabrication & plagiarism) cases, as each funder has a different process. There need to be robust RCR courses, and professors could be given credit for teaching such courses. It is also vital to have a system that allows students and post-docs to come forward with problems without retribution, although this is so easy to say and so hard to do. The root causes of research integrity issues are wrong incentives. These need to be fixed; otherwise we are just treating the symptoms.

I then chaired a session on Authorship. This seemed to me to be such a trivial topic: the focus of publication is on communication from a group of authors to a collective of readers, so I find a ranking of authors unnecessary. But there are many and various forms of author orderings and inclusions, and perceptions of what they all mean. And where there are differences of opinion, there are fights, sometimes quite intense and protracted. It was interesting to see people investigating this from all sides. I'll just list the talks here, as I was busy keeping track of the time and the questions during the session:
  • Authors without borders: Investigating international authorship norms
    among scientists & engineers
    Dana Plemmons (University of California, Riverside, U.S.A.)
  • Experiences of the handling of authorship issues among recent doctors
    in medicine in Sweden
    Gert Helgesson (Karolinska Institutet, Stockholm, Sweden)
  • A philosophical framework for a morally legitimate definition of
    scientific authorship
    Mohammed Hosseini (Utrecht University, Utrecht, The Netherlands)
  • The perceptions of researchers working in multidisciplinary teams on
    authorship and publication ethics
    Zubin Master (Albany Medical College, Albany, NY, U.S.A.)
  • An investigation of researchers’ understanding and experience of
    scientific authorship in South Africa
    Lyn Horn (University of Cape Town, Cape Town, Republic of South Africa)
The final plenary session was about interventions that work. The session chair, Lex Bouter, remarked that the only intervention people have widely learned to use is text-matching software; all other investigations of interventions have shown either no effect or an effect in the wrong direction.

The first speaker was Klaas Sijtsma, who was vice-dean at the Tilburg University School of Social and Behavioral Sciences under the deanship of Diederik Stapel when Stapel confessed to the rector that he had, indeed, committed extensive academic misconduct. Sijtsma was first named interim dean, then continued on as dean and will be stepping down in the coming school year. In his talk "Never Waste a Good Crisis: Towards Responsible Data Management," he spoke about this scandal that also touched the University of Amsterdam and the University of Groningen and involved dozens of articles and book chapters, and affected several PhD theses that were granted on the basis of analysis of fraudulent data.

They were lucky to have had a confession, so that Stapel's contract could be terminated, although many committees were still needed to investigate all of the publications. A few factors were identified that permitted a culture to thrive in which the frauds were possible: Stapel was unusual in that he preferred to work alone, he would not allow his PhD students to collect their own data, and he presented unlikely results to journals.

Tilburg University is taking the following steps to foster a climate of integrity:
  • Each PhD student must have at least 2 supervisors.
  • Master's theses and PhD theses are scanned for plagiarism (although this was already being done).
  • An official formula is read aloud publicly when a doctorate is awarded. It refers to the young doctor's obligation to academia and society to act with integrity.
  • The university has a Code of Conduct.
  • Every staff member must sign an integrity code.
  • There is now an independent Integrity Officer and Research Committee.
Additionally, the School of Social and Behavioral Sciences took two actions:
    •    They intensified classes on research ethics and research integrity.
    •    The dean instituted a "Science Committee" in the Spring of 2012.

This, it seems, was one of the best ideas they had. This committee is tasked with auditing a small sample (about 20 out of 500) of the articles published by members of the school each year. Their task is to assess the quality of the data storage and to look closely at how well the research methods are described. The committee thus learns where there are problems in preserving data, and advises the school's management team and the researchers about data storage, completeness of data sets, honoring subjects' privacy, access to data, and making the data available to others. They are not out to "witch-hunt" for fraudsters, but just to eyeball the data. That, however, keeps the various research groups on their toes and thinking about these aspects of their data before they publish. In turn, this creates a better research atmosphere.

Sijtsma has often been asked why he didn't design a universal data storage system and data management policy first. Well, it seems he understands that computer systems are often too complex, take a lot of time, are very expensive, and tend to encounter unpleasant technical surprises. It would have taken too long, grass would have grown over the scandal, and the sense of urgency would have disappeared. So he installed the committee first. The committee set up rules and regulations and announced annual random audits. Now the groups were motivated to come up with a data policy that suited their needs best.

This worked quite well! Some groups are better than others; people tend to arrange their storage only when they are audited, and when they leave the school, they lose commitment. There is no consistent data storage system, but that was a deliberate choice, so there is still much more to do.

Do these interventions work? He reports that they do. They won't prevent new affairs, but they do encourage RCR and reduce QRP (Questionable Research Practices). He also noted in the Q&A that the university decided not to rescind the doctorates of the people who had used Stapel's fraudulent data, as they did not know that the data was false.

For more information on the Stapel scandal, see the report: Flawed science: The fraudulent research practices of social psychologist Diederik Stapel.

Patricia Valdez, the extramural Research Integrity Officer (RIO) of the National Institutes of Health, spoke on the NIH Perspective on Research Integrity.

Her focus was on the reproducibility crisis, as the NIH invests $30 billion of taxpayer money annually. They don't want to waste money trying to reproduce something that is erroneous. They are focusing on evaluating the rigor of the methodology and the transparency of the research, in the hope that this will have an effect on reproducibility.

She referred to a 2017 book by Richard Harris, Rigor Mortis: How sloppy science creates worthless cures, crushes hope and wastes billions (Basic Books). The take-home message from the book is: Teach students methods the first year, not facts!

Ian Freckelton closed the session speaking on Research Misconduct and the Law: Intervening to Name, Shame and Deter. He is a lawyer (Queen's Counsel at the Victorian Bar in Australia) and a professor of Law and Psychiatry at the University of Melbourne.
He published a book in 2016 called Scholarly Misconduct and the Law (Oxford University Press).

After reading to us the most important bit from Stapel's book about the fraud (so we don't have to read it), he raced us through criminal law, which is invoked to shame and deter, as it has been applied to research misconduct. Then he spoke of a number of cases (I've put in links to press articles or the Wikipedia for more detail):
He also spoke about another book, Tom Nichols' The Death of Expertise (2017), about when experts lie, and noted that there are many other cases of fraud that have not reached the courts. He closed with the observation that the law is a very slow, blunt instrument and that criminal prosecution is not the answer, but noted that research fraud is not victimless. A court decision would, however, vindicate whistleblowers and hopefully present a high deterrence factor.

We then were ferried by boat through the canals of Amsterdam and the Amstel River to our dinner. I'll try to get a short description of day 4 out by tomorrow!