
A once-ignored community of science sleuths now has the research community on its heels

By 37ci3 | Feb 15, 2024

A community of researchers searching for errors in scientific research has sent shockwaves through some of the world’s most prestigious research institutions and the scientific community at large.

High-profile cases of alleged image manipulation in articles authored by the former president of Stanford University and leaders of the Dana-Farber Cancer Institute have made national media headlines, and some top science leaders think this may be just the beginning.

“At the rate things are going, we expect one of these to come up every few weeks,” said one leader in the field.

Sleuths argue that their work is necessary to correct the scientific record and to prevent generations of researchers from chasing dead ends because of flawed papers. And some scientists say it’s time to reform how universities and academic publishers deal with flawed research.

“I understand why the sleuths who find these things get so upset,” says Michael Eisen, a biologist, former editor of eLife, and a leading voice for reform in scientific publishing. “Everyone—the author, the journal, the institution, everyone—is incentivized to minimize their importance.”

For nearly a decade, scientists have identified widespread problems with scientific imaging in published articles, airing concerns online but receiving little attention.

That began to change last summer, when then-Stanford president Marc Tessier-Lavigne, a neuroscientist, stepped down following an investigation into alleged image manipulation in studies he co-authored and a report criticizing his laboratory culture. The scientific panel hired to investigate the allegations did not find that Tessier-Lavigne himself engaged in wrongdoing, but it concluded that members of his lab had manipulated images.

In January, a blogger exposed questionable work by top leaders of the Dana-Farber Cancer Institute; the institute subsequently asked journals to retract six articles and amend dozens more.

In his resignation letter, Tessier-Lavigne noted that the panel did not find that he knew of the misconduct, and that he had never submitted papers he did not believe were accurate. Dana-Farber’s research integrity officer said the institute had taken decisive steps to correct the scientific record and that discrepancies in the images were not necessarily evidence of an author’s intent to deceive.

“We’re certainly living through a moment that feels like a real turning point. It started when the Marc Tessier-Lavigne thing happened and has continued ever since, with Dana-Farber being the latest,” he said.

Now, the long-standing problem is in the national spotlight, and new artificial intelligence tools are only making it easier to detect problems ranging from decades-old mistakes and sloppy science to unethically enhanced images in photo editing software.

This enhanced scrutiny is reshaping how some publishers operate. And it’s prompting universities, journals and researchers to reckon with new technology, the potential backlog of undetected errors and ways to be more transparent when problems are discovered.

It comes at a turbulent time in academia. Bill Ackman, the hedge fund manager, mused in a post on X last month about weaponizing artificial intelligence to detect plagiarism by leaders at top universities with whom he has ideological differences, raising questions about political motivations in plagiarism investigations. More broadly, public trust in scientists and science has steadily declined in recent years, according to the Pew Research Center.

Eisen said he doesn’t think the sleuths’ scrutiny of scientific images has veered into “McCarthyism” territory.

“I think they’re focused on a very specific type of problem in the literature, and they’re right — it’s bad,” Eisen said.

Scientific publishing sits at the heart of what scientists understand about their discipline, and it is the primary way researchers describe new findings to their colleagues. Before publication, scientific journals screen submissions and send them to outside researchers in the field to scrutinize them for errors or flawed reasoning, a process called peer review. Journal editors also check studies for plagiarism and copyedit them prior to publication.

This system is not perfect and still relies on the researchers’ good faith efforts to avoid manipulating their findings.

Over the past 15 years, scientists have become increasingly concerned that some researchers digitally alter images in their papers to skew or accentuate results. Detecting irregularities in images—usually from experiments involving mice, gels, or stains—has become a greater priority for scientific journals.

Jana Christopher, an expert on scientific images who works for the Federation of European Biochemical Societies and its journals, said the field of image integrity verification has grown rapidly since she started working in the field about 15 years ago.

At the time, “nobody was doing it, and people were in denial about research fraud,” Christopher said. “The general perception was that it was very rare, and that only every once in a while would you find someone falsifying their results.”

Today, scientific journals have entire teams dedicated to image integrity. More papers are being retracted than ever: a record of more than 10,000 retractions last year, according to an analysis by Nature.

A loose-knit group of scientific sleuths adds outside pressure. Sleuths often detect and report errors or potential manipulation on the online forum PubPeer, and many receive little or no payment or public recognition for their work.

“To some degree, there’s a vigilantism around it,” Eisen said.

A recent analysis of comments on more than 24,000 articles posted on PubPeer found that over 62% of the comments concerned image manipulation.

For years, sleuths have relied on sharp eyes, sharp pattern recognition, and an understanding of photo manipulation tools. In the past few years, rapidly developing artificial intelligence tools that can scan papers for irregularities have bolstered their work.

Scientific journals now use similar technology to catch errors before publication. Science announced in January that it is using an artificial intelligence tool called Proofig to scan manuscripts in editing and peer review.

Science’s editor-in-chief, Holden Thorp, said the family of six journals “quietly” added the tool to its workflow about six months before the January announcement. Previously, the journals relied on manual visual checks to catch these kinds of problems.

Thorp said Proofig identified several papers late in the editorial process that were not published because of problematic images that were difficult to explain and other cases where authors had “logical explanations” for issues they corrected before publication.

“The serious errors that cause us to not publish a paper are less than 1%,” Thorp said.

Chris Graf, director of research integrity at Springer Nature, said in a statement that the company has developed and tested “image integrity software with built-in artificial intelligence” to check for duplicate images. Graf’s research integrity unit currently uses Proofig to help evaluate articles if concerns arise after publication.

Processes vary across journals, Graf said, but some Springer Nature publications manually check images for manipulation with Adobe Photoshop tools and look for inconsistencies in the raw data of common types of experiments, such as those that visualize cellular components.

“While AI-based tools are useful for speeding up and scaling up our checks, we still consider the human element essential,” Graf said, adding that image-recognition software is not perfect and that human expertise is needed to guard against false positives and false negatives.

No tool can catch every error or every fraudster.

“There are many humans involved in this process, and we’re never going to catch everything,” Thorp said. “As journals, institutions, and authors, we need to handle it better when it happens.”

Many sleuths have grown frustrated after their concerns were ignored, or after investigations proceeded slowly and ended without any public resolution.

Sholto David, who publicly aired his concerns about the Dana-Farber study in a blog post, said he had largely “given up” writing letters to journal editors about the errors he found because their responses were insufficient.

Elisabeth Bik, a microbiologist and longtime image sleuth, said she often flags image problems and “nothing happens.”

Leaving public comments questioning researchers on PubPeer can start a public conversation about questionable research, but authors and research institutions often don’t respond directly to online criticism.

Although journals may issue corrections or retractions, it is usually up to the research institute or university to investigate cases. When cases involve biomedical research supported by federal funding, the federal Office of Research Integrity can investigate.

Thorp said institutions need to move more quickly to take responsibility when wrongdoing is discovered, and to be open and honest about what happened in order to maintain public trust.

“Universities are very slow to react and very slow to work through their processes, and the longer that takes, the more damage it does,” Thorp said. “We don’t know what would have happened if Stanford had said at the outset, ‘These papers are wrong. We’re retracting them. This is our responsibility, and we own up to it,’ instead of launching a drawn-out investigation.”

Some scientists worry that image concerns only scratch the surface of science’s integrity problems—it’s easier to spot problems in images than data errors in spreadsheets.

While managing flawed papers and seeking accountability are important, some scientists think such measures treat only the symptoms of a larger problem: a culture that rewards the careers of those who publish the most eye-catching results, not those whose findings hold up over time.

“The scientific culture itself doesn’t say that we care about being right; it says we care about getting splashy papers,” Eisen said.
