Scandals Raise Issue of Image Manipulation
Several recent high-profile reports of scientific fraud have catapulted the issue of manipulated scientific images to headline news.
Last month, Science retracted two papers from a Korean lab for falsified photos claiming to show 11 distinct human embryonic stem cell lines, among other evidence of fraud. Weeks later, The New England Journal of Medicine announced an investigation of two reports from a researcher in Norway because of fabricated data that included duplicated photomicrographs in one of them purporting to show different stages of precancerous changes in the mouth.
The incidents have provoked discussions about acceptable standards for altering image data and debate about the responsibilities of researchers, journals, and scientific institutions for making, sharing, and enforcing these rules.
“The big issue arises from the fact that new technologies to handle images and present data have arisen more quickly than the scientific community has gotten together to set the standards,” said Emilie Marcus, editor of Cell. “It’s become apparent that the scientific community in some form needs to define standards as to what is and isn’t acceptable.”
For example, she said, “Is it legitimate to move lanes from two gels and blend them into one without a clear boundary? No. But there’s a debate over whether you can take out a piece of dirt on a gel with the [Photoshop] cloning tool or fill in a tear in a gel to make it look prettier.”
Rossner as Image Cop
Mike Rossner, managing editor of The Journal of Cell Biology (JCB), screens the images in every accepted manuscript, but his efforts to spread the practice to other journals had seemed futile until recently. “Nothing like an international scandal to generate some interest,” he recently wrote in an e-mail to some members of his scientific editorial board.
“Journal editors have a responsibility to protect the published record in any way they can,” Rossner said. “This is one way they can.” At JCB, about 1 percent of accepted manuscripts have had their acceptance revoked after screening detected manipulation that affected the interpretation of the data. Roughly one quarter of accepted manuscripts have at least one figure that must be remade because of tinkering that violates the journal’s standards for image presentation without changing the conclusions, such as exaggerating the contrast until faint, unimportant bands disappear from a gel.
“I’m not convinced this is the best route,” Marcus said. “It seems an odd place in the research process to put a primary quality control for what is a major issue. If a student, postdoc, or PI is getting to the point of submitting papers with figures that are unethically manipulated, then there’s a bigger problem.”
At HMS, rules about what is acceptable or not in manipulating images would fall under the bailiwick of the Faculty Policies on Integrity in Science (www.hms.harvard.edu/integrity) alongside guidelines for authorship, conflict of interest, and letters of reference.
“We don’t have a specific policy on image alteration,” said Margaret Dale, dean for Faculty and Research Integrity. “Many of our policies arose because of an emerging issue.” It is too early to tell whether faculty leaders will develop a separate policy, she said.
To evaluate charges of image improprieties, HMS applies the more general standards of research misconduct, defined by federal regulations as fabrication, falsification, or plagiarism in proposing, performing, or reviewing research or in reporting research results. As with the journals, many apparent improprieties turn out to be honest mistakes or misunderstandings, she said. More serious cases are referred to the federal Office of Research Integrity if federal funding is involved.
“The rules are incredibly clear,” Robinson said. “It’s not necessary to have a specific rule that says do not cheat. An essential part of the ethics of the scientific method is the clear and transparent presentation of what actually happened in any experiment, in part so that the validity of the results and the methods can be judged [and reproduced] by others. It may be legitimate to change an image, but only as long as you indicate to the journal editor or in the manuscript that the image was changed.”
A narrow focus on images should be part of a broader discussion of scientific integrity, an issue emerging as biology itself becomes more complex and multidisciplinary, agreed Adrian Ivinson, director of the Harvard Center for Neurodegeneration and Repair and a former editor at the Nature journal group.
“This is not to say that journals do not have a role to play, but whatever they do, they will only be scraping the surface of authenticity,” Ivinson said. “Images aside, people now tend to collect large amounts of data, perform sophisticated analysis, and present the analyses rather than the raw data. At what point do you hold people’s feet to the fire and make them present all of the data, not just the postanalysis data and interpretation? The biology community is only beginning to take on that idea.”
Structural biologists, whose images do not even pretend to be real data, may be setting the example by publicly posting their X-ray crystallography data for other scientists to reanalyze, said Piotr Sliz, head of the HMS structural biology computing initiative. “[That is] even more important than the structure being correct,” he said. “The structure can be complex. Depending upon what you are looking for—a drug binding site or water conductivity—a scientist is naturally eager to spend more time refining and interpreting the part of the structure that will answer a particular scientific question. Whoever else comes after can validate the structure and examine other portions of it in more detail or with more patience.”