Without fundamental changes to the academic reward system, there could be long-term consequences for Chinese science. Photo: Xinhua

China must restructure its academic incentives to curb research fraud

Scott Edmunds and Rob Davidson say recent exposés are leading to mass retraction of papers

The idealised view of science as the curiosity-driven pursuit of knowledge to understand and improve the world around us has been tarnished. Recent news tells of systematic fraud and mass retraction of research papers from the Chinese academic system, and allegations of attempts to game the peer-review system on an industrial scale.

With much of our research and development funded by the government, we all hope our tax dollars are spent as wisely as possible, so funders around the world have developed methods of assessing the quality of the work that researchers do. One of the most widely used metrics is an academic journal's "impact factor", which ranks its reputation based on the citations its articles get.

While many countries have tried to broaden their assessment system to assess a researcher's impact in a more balanced way, in China, the only method of judging researchers is by the number of publications they have in ranked journals. Unfortunately, that means money is changing hands in huge amounts, with hundreds of thousands of yuan spent to get a single publication in the top journals.

This focus on one metric above all others has led to large-scale gaming of the system and a black market of plagiarism, not to mention invented research and fake journals. On the heels of previous exposés of an "academic bazaar" system where authorship can be bought, an article published in December uncovered a wider and more systematic network of Chinese "paper mills" that produce ghostwritten papers and grant applications to order. The article linked the problem to a hacking of the peer-review system, which is supposed to protect the quality and integrity of the research.

The first major fallout came last week, when the publisher BioMed Central retracted 43 papers for peer-review fraud, the biggest mass retraction carried out for this reason to date. That single action increased the total number of papers ever retracted for this reason by nearly 50 per cent.

Many other major publishers have been implicated; the Public Library of Science (PLOS), which publishes the world's largest journal, has also issued a statement that it is investigating linked submissions.

BioMed Central should be applauded for taking the time and effort to fix the scientific record so quickly. Besides the need for better policing by publishers, funders and research institutions, there need to be fundamental changes to how we carry out research.

Without fundamental changes to its academic incentive system, there could be long-term consequences for Chinese science, with the danger that this loss of trust will lead to fewer opportunities to collaborate with institutions abroad, and potentially build such scepticism that people will stop using research from China.

While we are rightly proud of Hong Kong's highly regarded and highly ranked university system, we are not immune to the same pressures. Funders in Europe have moved away from citation-based metrics such as the impact factor in their research assessments, yet the University Grants Committee states in its research assessment exercise guidelines that it may use the impact factor informally. Some universities also pay bonuses tied to the impact factor of the journals their researchers publish in. These practices could foster the same temptations and skewed incentives that led to corrupt practices on the mainland.

A look at the list of retracted papers indicates, fortunately, that no Hong Kong-based researchers were implicated. But as our local institutions deepen their ties across the border through new research institutes and hospitals, keeping our universities unblemished will be a challenge.

If the impact factor system is so problematic, what are the alternatives? There are factors that should obviously be taken into account, such as the quality of teaching, and the number of students who go on to do bigger and better things. Impact can be about changing policy, producing open software or data that other research can build upon, or stimulating public interest and engagement through coverage in the media.

These measures can also be subject to gaming, but having a broader range of "alternative metrics" should make the system harder to manipulate. China is overtaking the US to become the biggest producer of published research, but there needs to be more focus on quality than quantity.

More and more funders worldwide are encouraging and enforcing data management and access, and we at Open Data Hong Kong are cataloguing the policies and experiences of the city's research institutions. Hong Kong currently ranks 54th in the Global Open Data Index.

One of the main benefits of open data is transparency. It is encouraging that the government is already promoting the release of public sector data through the newly launched Data.Gov.HK portal, but our research data clearly needs to be treated the same way.

This article appeared in the South China Morning Post print edition as: China must restructure its academic incentives to curb research fraud