NewsWorld
THE RELIABILITY OF RESEARCH

US-based Science Exchange lets researchers recheck study results

Journals are full of false findings and erroneous conclusions, prompting one company to launch a service that puts the stamp of legitimacy on a lab's results

PUBLISHED : Sunday, 19 August, 2012, 12:00am
UPDATED : Sunday, 19 August, 2012, 10:52am

So many scientific studies are making incorrect claims that a new service has sprung up to fact-check reported findings by repeating the experiments.

Science Exchange, a year-old company based in Palo Alto, California, announced last week its "Reproducibility Initiative", aimed at improving the trustworthiness of published papers. Scientists who want to validate their findings will be able to apply to the initiative, which will choose a laboratory to redo the study and determine whether the results match.

The project sprang from the growing realisation that the scientific literature - from social psychology to basic cancer biology - is riddled with false findings and erroneous conclusions, raising questions about whether such studies can be trusted.

Not only are erroneous studies a waste of money, often taxpayers' money, but they can also cause companies to misspend time and resources as they try to invent drugs based on false discoveries.

"'Published' and 'true' are not synonyms," said Brian Nosek, a psychology professor at the University of Virginia in Charlottesville and a member of the initiative's advisory board. Last year, Bayer Healthcare, which researches and manufactures pharmaceuticals, said its scientists could not reproduce 75 per cent of published findings in cardiovascular disease, cancer and women's health.

In March, Dr Lee Ellis of the M.D. Anderson Cancer Centre, based at the University of Texas, and Dr Glenn Begley, the former head of global cancer research at biopharmaceutical company Amgen, reported that when the company's scientists tried to replicate 53 prominent studies in basic cancer biology, hoping to build on them for drug discovery, they were able to confirm the results of only six experiments.

The new initiative, said Begley, senior vice-president of privately held biotechnology company TetraLogic, "recognises that the problem of non-reproducibility exists and is taking the right steps to address it".

The initiative's 10-member board of prominent scientists will match investigators with a lab qualified to test their results, said Dr Elizabeth Iorns, Science Exchange's co-founder and chief executive. The original lab would pay the second for its work.

How much the original lab pays depends on the experiment's complexity and the cost of study materials, but should not exceed 20 per cent of the original research study's costs. Iorns hopes government and private funding agencies will eventually fund replication to improve the integrity of scientific literature.

The two labs would jointly write a paper, to be published in the journal PLoS ONE, describing the outcome. Science Exchange will issue a certificate if the original result is confirmed.

Founded in 2011, Science Exchange serves as a clearinghouse that connects researchers who want to outsource parts of experiments, from DNA sequencing (US$2.50 per sample) to bioinformatics (US$50 per hour). It is funded largely by venture capitalists and angel investors.

It may not be obvious why scientists would subject their work to a test that might overturn its results, and pay for the privilege, but Iorns is optimistic. "It would show you are a high-quality lab generating reproducible data," she said. "Funders will look at that and be more likely to support you in the future."

If results are reproduced, "it will increase the value of any technology the researcher might try to license", she said. It would also provide assurance to, say, a pharmaceutical company that the result is sound and might lead to a new drug.

Experts not affiliated with Science Exchange noted that if science were working as it should, the initiative would not be necessary.

"Science is supposed to be self-correcting," Begley said. "What has surprised me is how long it takes science to self-correct." There are too many incentives to publish flashy, but not necessarily correct, results, he said.

Virginia's Nosek experienced the temptation first hand. He and his colleagues ran a study in which 1,979 volunteers looked at words printed in different shades of grey and chose which hue on a colour chart - from nearly black to almost white - matched that of the printed words.

Self-described political moderates perceived the greys more accurately than liberals or conservatives, who literally saw the world in black and white, Nosek said.

Nosek and his colleagues redid the study, with 1,300 people. The ideology/shades-of-grey effect vanished. They decided not to publish.

Typically, scientists must show that results have no more than a 5 per cent chance of having occurred randomly. By that measure, one in 20 studies will make a claim about reality that actually occurred by chance alone, said John Ioannidis, a professor at the Stanford University School of Medicine, who has long criticised the profusion of false results.

With some 1.5 million scientific studies published each year, by chance alone some 75,000 are probably wrong.
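The arithmetic behind that 75,000 figure is simple enough to sketch in a few lines of Python. This is a back-of-envelope upper bound using only the numbers quoted above, under the pessimistic assumption that every study is testing an effect that is not actually there:

```python
# Back-of-envelope estimate of chance findings at the p < 0.05 threshold.
# Assumes (pessimistically) that every tested hypothesis is in fact null,
# so 5 per cent of studies clear the significance bar by luck alone.
studies_per_year = 1_500_000   # approximate annual volume of published studies
alpha = 0.05                   # conventional significance threshold

expected_false_positives = int(studies_per_year * alpha)
print(expected_false_positives)  # 75000
```

In reality some studies test genuine effects, so the share of papers that are wrong by pure chance is lower than this ceiling; Ioannidis's argument, discussed below, is that other practices push the true error rate well beyond it.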

In addition, Ioannidis said: "People start playing with how they handle missing data, outliers, and other statistics", which can make a result look real when it's not. "People are willing to cut corners" to get published in a top journal, he said.

There are numerous ways to do that. Researchers can stop collecting data as soon as they obtain the desired result rather than gather more as originally planned. Conversely, they can continue to gather data until they get the desired result.
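The cost of that kind of "optional stopping" can be demonstrated with a small simulation. The sketch below is illustrative, not drawn from the article: it draws data with no real effect, but checks for significance after every batch of ten observations and stops at the first "hit", which inflates the false-positive rate well past the nominal 5 per cent:

```python
import random
import math

def z_pvalue_two_sided(mean, n, sigma=1.0):
    """Two-sided p-value for H0: mu = 0 with known sigma (normal model)."""
    z = abs(mean) * math.sqrt(n) / sigma
    return math.erfc(z / math.sqrt(2))  # P(|Z| > z) for a standard normal

def optional_stopping_trial(max_n=100, peek_every=10, alpha=0.05):
    """Draw pure-noise data, but test after every batch and stop at p < alpha."""
    total = 0.0
    for n in range(1, max_n + 1):
        total += random.gauss(0, 1)          # null data: true mean is zero
        if n % peek_every == 0 and z_pvalue_two_sided(total / n, n) < alpha:
            return True                      # a chance "significant" result
    return False

random.seed(0)
trials = 2000
hits = sum(optional_stopping_trial() for _ in range(trials))
print(f"false-positive rate with peeking: {hits / trials:.3f}")  # well above 0.05
```

A researcher who fixed the sample size in advance and tested once would be wrong only 5 per cent of the time on null data; peeking ten times roughly doubles or triples that rate.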

How common might such sleights of hand be? In a 2005 paper in PLoS Medicine, Ioannidis used statistical and other methods to argue that most published research findings are false.

It remains the most-viewed paper in the journal's eight-year history. "Until recently, people thought you could trust what's published," Ioannidis said. "But for whatever reason, we now see that we can't."
