Event Reports

The 47th Annual Meeting of the Molecular Biology Society of Japan Research Ethics Luncheon Seminar "Recent Trends in Scientific Papers: Are Fake Papers Increasing? How to Ensure Scientific Data Quality."

The 47th Annual Meeting of the Molecular Biology Society of Japan was held for three days from November 27th to 29th, 2024 at the Fukuoka International Congress Center and Marine Messe Fukuoka in Fukuoka City. As part of the annual meeting, the Research Ethics Committee organized a luncheon seminar to promote greater awareness of research ethics.
This report summarizes the research ethics luncheon seminar held on November 28, titled "Recent Trends in Scientific Papers: Are Fake Papers Increasing? How to Ensure Scientific Data Quality" (Chair: Dr. Naoko Ohtani).

In this seminar, the key points of ensuring scientific data quality were discussed, mainly focusing on data quality control and data reproducibility. In the panel discussion, topics such as the mindset for ensuring the quality of research data and the unique difficulties of managing research data in molecular biology were discussed.

Transparent Publication and Open Science: From Bench to Journal Publication
Dr. Bernd Pulverer (Head, EMBO Press / Chief Editor, EMBO Reports)
Dr. Bernd Pulverer
Dr. Pulverer is a research ethics expert who serves as Head of EMBO Press (hereinafter referred to as "EMBO" in this report) and who, as one of the co-founders of DORA (the San Francisco Declaration on Research Assessment), has worked on these issues for many years. EMBO systematically controls data quality and has a system in place for identifying problematic papers, which is now regarded as a standard for scientific publishing. In this seminar, he gave a presentation on data quality assurance, supported by specific examples.

Dr. Pulverer introduced various examples of image editing and said that these cases can be divided into three categories: "operational error, sloppiness, ignorance, and beautification of the image," "selective reporting," and "fabrication and falsification." EMBO classifies these as categories I to III and responds accordingly.
He mentioned that it would be effective to use AI-based image duplication detection and to incorporate a source data section into accepted manuscripts, an area in which EMBO has been a pioneer.
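To illustrate the general idea behind flagging duplicated images, the following is a minimal, hypothetical Python sketch that compares two figure panels using a simple average hash. It is an illustration only, not the AI-based screening system EMBO actually uses, and the file names are placeholders.

```python
# Minimal sketch of duplicate-image screening via a simple average hash.
# Illustrative only; real AI-based screening is far more sophisticated.
# Requires Pillow: pip install Pillow
from PIL import Image


def average_hash(path, size=8):
    """Shrink an image to a small grayscale grid and encode each pixel as a
    bit (1 if brighter than the mean), yielding a compact fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]


def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests near-duplicate images."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))


if __name__ == "__main__":
    # Hypothetical file names for two figure panels to compare.
    hash_a = average_hash("figure_panel_a.png")
    hash_b = average_hash("figure_panel_b.png")
    dist = hamming_distance(hash_a, hash_b)
    print(f"Hamming distance: {dist} (0 means identical hashes, a likely duplicate)")
```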

Next, he cited the absence of protocols and original data, statistical issues and selective reporting, and problems with research integrity as reasons for the low reproducibility of basic biology papers, and stated that data and methods/protocols in particular are crucial for improving reproducibility. To address these concerns, Open Science Peer Review was introduced into the peer review process: a system in which AI uses an open-access library to perform quality checks on published papers and original data, after which a data referee reviews the data. He said that the results can then be used to ask authors to revise figures and tables or to withdraw their papers as necessary, which is effective in encouraging authors to make self-corrections.

Finally, in order to make the dissemination of science more resilient, he outlined measures that labs, research institutions, funding agencies, and journals can each promote from their respective positions, and stated that ideally these measures would be implemented through collaboration among the various organizations.

(Below are some of the measures mentioned)
Labs: Open, collaborative environment, independent verification, digital data and protocols/reagents archiving
Institutions: Oversight (pre-submission), open, collaborative environment, reform research assessment, RIO offices
Funding Agencies: Reward the scientific approach (hypothesis + technique / negative + confirmatory data), mandate source data, protocol, and reagent archiving/reporting, professionalize
Journals: Systematic integrity screens, data forensics, support open science

The Importance of Data Verification: Encouraging Careful Research
Dr. Eiji Hara (Research Institute for Microbial Diseases, The University of Osaka)
Dr. Eiji Hara
Dr. Hara is a researcher in the field of cellular aging and is also working on issues related to quality control of research data. In this lecture, he used his own experiences to explain how to eliminate bias and obtain reliable research results.

In papers that Dr. Hara co-authored, he repeatedly encountered concerns about the reproducibility of experiments that other authors had conducted using model mice. Taking his responsibility as a co-author seriously, Dr. Hara investigated, identified points of doubt, and put in a great deal of effort, including exchanging opinions with the providers of the mice, the authors of the papers, and other researchers conducting similar research, as well as performing numerous verification experiments himself. As a result, he was able to determine that the cause lay with the model mice themselves.

Based on this experience, Dr. Hara emphasized the importance of thoroughly checking the content of the experiments of others when co-authoring a paper, and of having the courage to ask questions if there are any doubts about the content. In addition, since experiments using model mice can take as long as two to three years, he pointed out that there is a risk that research time will be wasted if inappropriate mice or methods are used. He explained that in order to prevent such losses, it is important not to blindly accept what is written in published papers, but to first thoroughly verify them on your own before using them, especially when they involve model mice.

Panel Discussion (Efforts to Ensure Data Quality)
Dr. Bernd Pulverer, Dr. Eiji Hara, Dr. Itoshi Nikaido, Dr. Masayuki Miura, Dr. Akihiko Yoshimura, and Dr. Naoko Ohtani (Chairperson)
In response to questions from Dr. Ohtani, the Chairperson, regarding what they pay attention to and keep in mind when ensuring data quality, the following opinions were expressed:
[Dr. Yoshimura]  
  • It is important to look at the data objectively, rather than trying to make the conclusions match what you want to see. To increase the reliability of the data, include "positive controls" and "negative controls" and evaluate the data with a scientific attitude, rather than relying solely on your own experience and intuition as a researcher.
[Dr. Miura]  
  • One issue related to image data is that imaging devices sometimes automatically enhance or beautify images through built-in processing. In addition, initial data collected by students contains a lot of noise, so they need careful, one-on-one instruction on how to collect initial data.
[Dr. Nikaido]  
  • While genomic science is often viewed as impartial because the data is open, in reality, evaluation can still be subjective. Even with the same dataset, researchers may selectively choose and report data that supports their hypotheses. This has a significant impact because the results can be used by others to further their research. The value of benchmarking research has increased in recent years, and the results of studies that compare different methods on the same samples are now being published in top journals. This is a good sign (for proper evaluation).
  • Results vary depending on which data, which programs, which libraries, and which OS are used. To ensure reproducibility, it is essential to prepare the "environment" for experiments and analysis, such as by standardizing the OS and software environment (a minimal sketch of recording such an environment follows this discussion).
[Dr. Hara]  
  • In my lab, I encourage other researchers to reproduce important data whenever possible before submitting a paper. Especially in animal studies, where individual variation is high, we blind the administration of treatments and conduct the analysis separately to avoid bias, only comparing the data at the final stage.
[Dr. Pulverer]  
  • Experimental data in life sciences is inherently noisy, so blinding reagents and experimental subjects is an important and effective measure.
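As an illustration of Dr. Nikaido's point about the analysis "environment," the following is a minimal, hypothetical Python sketch of recording the OS, Python, and library versions alongside analysis outputs so that a run can later be reproduced or compared. The package names are placeholders for whatever a given pipeline actually uses; this is not a tool presented at the seminar.

```python
# Minimal sketch: record the OS and software environment alongside analysis
# results so that a run can later be reproduced or compared.
import json
import platform
import sys
from importlib import metadata


def capture_environment(packages):
    """Return a dict describing the OS, Python interpreter, and package versions."""
    env = {
        "os": platform.platform(),   # e.g. "Linux-5.15...-x86_64"
        "python": sys.version,
        "packages": {},
    }
    for name in packages:
        try:
            env["packages"][name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            env["packages"][name] = "not installed"
    return env


if __name__ == "__main__":
    # Placeholder package list; substitute the libraries your pipeline uses.
    snapshot = capture_environment(["numpy", "pandas", "scipy"])
    # Save the snapshot next to the analysis outputs.
    with open("environment_snapshot.json", "w") as fh:
        json.dump(snapshot, fh, indent=2)
    print(json.dumps(snapshot, indent=2))
```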

During the discussion between the audience and the panelists, several additional thought-provoking insights were shared.
[Dr. Nikaido]  
  • Researchers should take an active interest in experimental tools and methods. Some people think that they should simply buy a kit, follow the instructions, and adopt it if they get good results. However, by learning the principles behind it, they can gain confidence in their experiments, reduce negative results and failures, and enter a positive feedback loop in which they find experimenting enjoyable. If we can create this kind of positive atmosphere throughout the lab, research can be carried out as a team without the Principal Investigator (PI) having to shoulder everything.
[Dr. Hara]  
  • When reviewers make comments (during the peer review process) such as 'We can accept the paper if this data is included' or 'It would be sufficient if this result were obtained,' researchers may feel pressured to forcefully generate the requested data. Such comments should be avoided. It is also important for those around the researchers to be considerate and ensure they are not pressured into producing results through unreasonable efforts.


Many of the participants were faculty members, PIs, and other researchers involved in quality control of research data, and it was a lively seminar in which participants exchanged views on future challenges.
Panel Discussion
From left: Dr. Bernd Pulverer, Dr. Eiji Hara, Dr. Itoshi Nikaido, Dr. Masayuki Miura,
Dr. Akihiko Yoshimura, Dr. Naoko Ohtani (Chairperson)