Disease research lab mishaps rattle scientists and regulators
After a series of high-profile mistakes, scientists and the agencies they work for debate safety
For the danger they posed, the lapses were appalling. They put lives at risk, that much is clear. But what made them shocking was where they happened: the US government's high-security disease-control laboratories, which house samples of the world's most harmful germs, cannot afford to screw up.
First came news of a single incident. Staff working on deadly bioterrorism agents at the Centers for Disease Control and Prevention (CDC) in Atlanta followed the wrong procedure to "inactivate" batches of Bacillus anthracis, the bacterium that causes anthrax.
Though potentially still lethal, the bugs were sent to another CDC lab where staff were not equipped to handle live spores. A report into the lapse revealed a worrying pattern of staff failures and found that dozens of workers had potentially been exposed. The CDC doled out antibiotics and anthrax vaccine, and affected rooms were sterilised. They were lucky: no one got the disease. But that is hardly the point.
It was not an isolated event. As CDC investigators finalised their report into June's anthrax scare they unearthed a more alarming incident that had gone unreported. In March, lab staff sent samples of a fairly harmless strain of bird flu to scientists at the US Department of Agriculture. To the agricultural team's alarm, every chicken they infected with the virus died.
It was only after 21 birds had succumbed that they discovered why: the CDC samples had been contaminated with a strain of highly lethal H5N1 bird flu. Natural outbreaks of the virus have killed hundreds of people in Asia.
The director of the CDC, Tom Frieden, took swift action. He closed the CDC's anthrax and influenza labs and imposed a ban on the shipment of biological material in or out of the CDC's highest-security labs while safety procedures are revamped.
At a press conference, Frieden said the behaviour of some staff had been "totally unacceptable ... Frankly, I'm angry about it."
Frieden was called before a US House oversight committee to explain himself. The chairman, Tim Murphy, did not hold back. The incident was "troubling" and "completely unacceptable", the CDC's reputation "tarnished".
The incidents cast a long shadow over the organisation charged with protecting the US public. But a third incident points to a broader failure in US biosecurity. In early July, six vials of smallpox were discovered in a storage room at an unguarded US Food and Drug Administration lab in Maryland, a facility that once belonged to the US National Institutes of Health (NIH).
After the eradication of smallpox, a horrific disease that kills 30 per cent of people it infects, official stocks are kept only at the CDC in Atlanta and at the Russian Vector lab in Novosibirsk.
Earlier this month, US federal investigators reported finding more dangerous material in the same room. In all, they found 12 boxes containing 327 vials labelled with various unpleasant pathogens, from influenza and dengue fever to rickettsia and Q fever.
It was the NIH's turn to apologise. Its director, Francis Collins, said that "overlooking such a sample collection for years is clearly unacceptable". To the outside world, the most trusted keepers of lethal germs had shown themselves to be dangerously incompetent.
The failings will have direct consequences at the CDC and NIH, but the fallout from the lapses will be felt far beyond the US. There are major lessons to be learned about human error that even the most vigilant high-security labs in Europe, Asia and elsewhere must heed.
For some scientists, the incidents call for more drastic action. Some want the number of laboratories holding lethal bugs to be slashed. Others want the highest-risk experiments curtailed, arguing that the fresh understanding they bring is not worth the real danger of an accidental outbreak.
At the US hearing on the CDC anthrax scare, Richard Ebright, a biosafety expert at Rutgers University, called for a dramatic reduction in the number of labs permitted to work on the bugs, from 1,500 or so in the US to nearer 50, in order to minimise the risk of a serious accident. He urged the government to set up an independent federal agency to regulate the work, one with real powers to shut down labs that operate dangerously. The US government is unlikely to embrace Ebright's plan.
On July 13, the NIH sacked half of its biosafety panel by email. The move ousted 11 of the government's original advisers, who in the past had raised concerns about experiments to create dangerous new pathogens.
Critics are now waiting to see who will replace the fired advisers. One said the replacements would be "yes men".
Ron Fouchier, a virologist at Erasmus medical centre in Rotterdam, said slashing the number of labs working with dangerous pathogens would be a huge mistake. "The reason so many labs are doing pathogen research is because there is so much to be investigated, in the interest of public and animal health," he said.
The breaches in the US have fuelled concerns about some of the field's more extreme experiments. When the CDC declared its anthrax incident, Marc Lipsitch, professor of epidemiology at the Harvard school of public health, said we should be glad it was only anthrax. He fears that scientists pose far greater risks to the public by intentionally creating dangerous pathogens.
In 2011, Fouchier announced that he had mutated bird flu to make it spread easily in animals through coughs and sneezes. Advocates for these experiments, known as gain-of-function studies, say they give scientists crucial insights into the kinds of viruses to fear in nature.
To Lipsitch and others, the irony is all too clear. In trying to prevent the next pandemic, they say Fouchier and his ilk make a disastrous outbreak more likely.