An antibiotic used to treat gonorrhea has been found no longer effective.
DRUGS first limited to treating a resistant strain of gonorrhea in Hawaii eight years ago have now been recommended for patients nationwide, an alarming development that strengthens the argument for curbing the use of antibiotics outside human medicine.
Data show rapid spread of the "superbug" in 26 cities, with cases rising from 1 percent of all gonorrhea infections to more than 13 percent in less than five years. In Hawaii, resistant gonorrhea accounted for 1.4 percent of cases in 1997, compared with more than 20 percent in 2006.
What's causing concern is that the antibiotic that was the first line of defense against the fast-evolving microbe no longer restrains the disease. The substitute antibiotic now recommended by the Centers for Disease Control and Prevention leaves just the one class of drugs for treatment, with no new antibiotics for gonorrhea being developed.
Antibiotics are one of the most profound achievements in medicine, but decades of widespread and indiscriminate use -- in cosmetics, soaps, and food-animal production -- have rendered many ineffective.
The practice continues. Last month, the Food and Drug Administration, despite counsel from its own experts and from health groups, put on the approval track a cattle antibiotic that is a fourth-generation version of the drug the CDC is now urging for gonorrhea.
The company that makes the antibiotic contends that similar drugs have been used in animals in Europe without harm, but recent data indicate bacterial resistance has grown not only in food animals but in humans as well. Government and private health organizations agree that careful, limited use of antibiotics is crucial to public health as microbes become more and more resistant to them. The spread of resistant gonorrhea is just one indication of the danger.
Source: http://starbulletin.com/2007/04/16/editorial/editorial02.html
Wednesday, April 18, 2007
Germs and the City
Two centuries of success against infectious disease have left us complacent—and vulnerable.
“There have been at work among us three great social agencies: the London City Mission; the novels of Mr. Dickens; the cholera.” Historian Gertrude Himmelfarb quotes this reductionist observation at the end of her chapter on Charles Dickens in The Moral Imagination; her debt is to an English nonconformist minister, addressing his flock in 1853. It comes as no surprise to find the author of Hard Times and Oliver Twist discussed alongside Edmund Burke and John Stuart Mill in a book on moral history. Nor is it puzzling to see Dickens honored in his own day alongside the City Mission, a movement founded to engage churches in aiding the poor. But what’s V. cholerae doing up there on the dais beside the Inimitable Boz? It’s being commended for the tens of millions of lives it’s going to save. The nastiness of this vile little bacterium has just transformed ancient sanitary rituals and taboos into a new science of epidemiology. And that science is about to launch a massive—and ultimately successful—public effort to rid the city of infectious disease.
The year 1853, when a Victorian doctor worked out that cholera spread through London’s water supply, was the turning point. Ordinary people would spend the next century crowding into the cities, bearing many children, and thus incubating and spreading infectious disease. Public authorities would do all they could to wipe it out. For the rest of the nineteenth century, they lost more ground than they gained, and microbes thrived as never before. Then the germ killers caught up—and pulled ahead. When Jonas Salk announced his polio vaccine to the press in April 1955, the war seemed all but over. “The time has come to close the book on infectious disease,” declared William Stewart, the U.S. surgeon general, a few years later. “We have basically wiped out infection in the United States.”
By then, however, infectious diseases had completed their social mission. Public authorities had taken over the germ-killing side of medicine completely. The focus shifted from germs to money—from social disease to social economics. As germs grew less dangerous, people gradually lost interest in them, and ended up fearing germ-killing medicines more than the germs themselves.
Government policies expressed that fear, putting the development, composition, performance, manufacture, price, and marketing of antibiotics and vaccines under closer scrutiny and control than any public utility’s operations and services. The manufacturers of these drugs, which took up the germ-killing mission where the sewer commission left off, must today operate like big defense contractors, mirror images of the insurers, regulatory agencies, and tort-litigation machines that they answer to. Most drug companies aren’t developing any vaccines or antibiotics any more. The industry’s critics discern no good reason for this at all: as they tell it, the big drug companies just can’t be bothered.
These problems capture our attention only now and again; they hardly figure in the much louder debate about how much we spend on doctors and drugs, and who should pay the bills. “Public health” (in the literal sense) now seems to be one thing, and—occasional lurid headlines notwithstanding—not a particularly important one, while “health care” is quite another.
We will bitterly regret this shift, and probably sooner rather than later. As another Victorian might have predicted—he published a book on the subject in 1859—germs have evolved to exploit our new weakness. Public authorities are ponderous and slow; the new germs are nimble and fast. Drug regulators are paralyzed by the knowledge that error is politically lethal; the new germs make genetic error—constant mutation—the key to their survival. The new germs don’t have to be smarter than our scientists, just faster than our lawyers. The demise of cholera, one could say, has been one of the great antisocial developments of modern times.
By withdrawing from the battlefield just long enough to let us drift into this state of indifference, the germs have set the stage for their own spectacular revival. Germs are never in fact defeated completely. If they retire for a while, it’s only to search, in their ingeniously stupid and methodically random way, for a bold new strategy. They’ve also contrived, of late, to get human sociopaths to add thought and order to the search. The germs will return. We won’t be ready.
Microbes discovered the joys of socialism long before Marx did, and in matters of health, they made communists of us all. Since the dawn of civilization, infectious disease has been the great equalizer, with the city serving as septic womb, colony, and mortuary. Epidemic—“upon the people”—is the democracy of rich and poor incinerated indiscriminately by the same fever, or dying indistinguishably in puddles of their own excrement.
The Mao of microbes was smallpox, which killed 300 million people in the twentieth century alone. Sometimes called the first urban virus, it probably jumped from animals to humans in Egypt, Mesopotamia, or the Indus River Valley, at about the same time that the rise of agriculture began drawing people together in towns and cities. Smallpox has also been called nature’s cruelest antidote to human vanity. Princes broke out in the same pustules as paupers, reeked as foully of rotting flesh, and oozed the same black blood from all their orifices. Alongside millions of nameless dead lie kings of France and Spain, queens of England and Sweden, one Austrian and two Japanese emperors, and a czar of Russia.
While the germs reigned, there wasn’t much rest-of-medicine to speak of: infections eclipsed every other cause of illness but malnutrition. And when monarchs were dying, too, language and politics honestly tracked medical reality. The “social” in “social disease” reflected an epidemiological fact. It also pointed to a practical, collective solution. Disease arose and spread when people converged to create societies. It was caused by invisible agents that individuals could not control on their own. It could be eradicated only by social means—public sanitation, slum clearance, education, and, above all, a robust, germ-hating culture. It took a city to erase a cholera.
This was the overarching insight that crystallized in the public consciousness in the first half of the nineteenth century. “In the Victorian version of the Puritan ethic,” Himmelfarb writes, “cleanliness was, if not next to godliness, at least next to industriousness and temperance.” For Dickens, as Himmelfarb and others have observed, the filth in the Thames symbolized the city’s insidious taint, its ubiquitous, effluvial corruption. What social historians often fail to note, however, is that by the time Dickens was placing the Thames at the center of London’s many ills, a new science had emerged to move the river far beyond metaphor.
Epidemiology—the rigorous science of public health—was born with physician William Farr’s appointment as controller of London’s General Register Office in 1838. Directed to do something about the cholera epidemic, Farr began systematically recording who was dying and where. The most important things he discovered were negative. Wealth didn’t protect you from cholera. Neither did occupation, or residing close to the sea. What mattered was how high above the Thames you lived. Farr concluded that the river’s horrendous stench caused the disease. Another English doctor, John Snow, made the right connection in 1853: London’s sewers emptied into the Thames, so the farther down-sewer you lived, the more likely you were to drink foul water. A year later, Snow saved countless lives by persuading parish authorities to remove the handle from the Broad Street pump in Soho.
The rest is history. By pinning down the waterborne pathway of contagion, Farr and Snow had transformed a devastating public disease into a routine exercise in civil engineering. In 1858, Parliament passed legislation, proposed by then-chancellor of the exchequer Benjamin Disraeli, to finance new drains. Charles Dickens published his last novel—Our Mutual Friend, in which the main character is the pestilential Thames—in 1864. London suffered its last cholera epidemic in 1866.
This wasn’t the end of great plagues in the city, or even the beginning of the end, but it was the end of the beginning. In 1872, Disraeli rallied his Tory Party around what his Liberal opponents derided as a “policy of sewage”—reforms involving housing, sanitation, factory conditions, food, and the water supply—and while he served as prime minister, these policies became law. For the next 50 years or so, in the United States as in Britain, public health depended on city bureaucrats above all. They wasted little time with sick patients, other than sometimes ordering them to lock their doors and die alone. They focused instead on eradicating germs before they reached the patient, and that meant attending to the water, sewage, trash, and rats.
In a recent British Medical Journal survey, public sanitation was voted the most important medical advance since the journal was established in 1840. If we don’t think of public sanitation as “medical” any more, it’s only because the municipal bureaucrats who followed Farr cleaned things up so well.
As they wore out their welcome in public spaces, microbes went private. They still had to move from person to person, but there could be no more carefree joyrides on rats or surfing through the water supply. People themselves, however, are almost as infectious as their sewers. Clean water alone could not eliminate coughs, dirty hands, and filthy food.
The systematic pursuit of germs into the flesh of patients didn’t really begin until the very late nineteenth century. Jenner’s smallpox vaccine was already a century old, but it owed its existence to the lucky fact that the human pox had a weak cousin that infected cows. (We give our kids the “cow treatment”—vacca is Latin for cow—every time we do as Jenner did, and challenge their immune systems with a corpse, cousin, or fragment of a horrible microbe.) The systematic production of other vaccines had to await the arrival of Louis Pasteur and Robert Koch, who developed procedures for isolating microbes and then crippling or killing them.
Vaccines, health authorities quickly recognized, are quintessentially public drugs. They expel germs not from the public water but from the not-quite-private lungs, fluids, and intestines of the public itself. When enough people get vaccinations, “herd immunity” protects the rest.
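To put a rough number on that idea (the figures here are standard textbook illustrations, not drawn from the article): if each infected person in a wholly unprotected city would pass the germ to $R_0$ others on average, an outbreak can no longer sustain itself once the immunized share of the population exceeds a critical threshold,

\[
p_c = 1 - \frac{1}{R_0},
\]

so a moderately contagious germ with $R_0 = 5$ requires about 80 percent coverage, while something as contagious as measles, with $R_0$ near 15, requires roughly 93 percent. The more catching the disease, the closer vaccination must come to universal before the unvaccinated can shelter behind the herd.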
Five human vaccines arrived in the late nineteenth century, and many others would follow in the twentieth. They weren’t developed quickly or easily, but they did keep coming. In due course, and for the first time in human history, serious people began to believe that infectious disease might come to an end. Scientists could painstakingly isolate germs that attacked humans. Drug companies then would find ways to cultivate and cripple the germs, and mass-produce vaccines to immunize the public. Disease would fall, one germ at a time, and when they were all gone, good health would be pretty much shared by all.
New laws, vigorously enforced, drafted the healthy public into the war on germs. England mandated universal smallpox vaccination in 1853. Facing a smallpox outbreak, Cambridge, Massachusetts, decreed in February 1902 that all the town’s inhabitants “be vaccinated or revaccinated,” set up free vaccination centers, and empowered a physician to enforce the measure. A certain Henning Jacobson refused to comply, insisting on a constitutional right “to care for his own body and health in such way as to him seems best.” The U.S. Supreme Court disagreed, easily upholding the “power of a local community to protect itself against an epidemic threatening the safety of all.” Later enactments would require the vaccination of children before they could attend public schools. Adults who traveled abroad had to be vaccinated if they planned to come home. Albert Sabin’s polio vaccine took things even further—the vaccine itself was infectious. A child swallowed a live but weakened virus soaked into a sugar cube, and then went home and vaccinated his siblings and parents, too.
The germ killers didn’t really get into the business of private medicine—curing already-sick patients—until the development of sulfa drugs in the 1930s, followed by antibiotics after World War II. Even then, much of the cure still lay in preventing the infection of others. The paramount objective with tuberculosis, for instance, was to wipe out the tubercle bacillus so thoroughly that nobody would need streptomycin any more, because nobody would come into contact with any other infected person or animal.
Year by year, one segment or another of the public sector contrived to take a little more control, directly or indirectly, over the development, distribution, and price of vaccines. Soon after the development of the polio vaccine in the 1950s, Washington launched a program to promote and subsidize the vaccination of children nationwide. At about the same time, the Soviet Union proposed a global campaign to eradicate smallpox. The World Health Organization officially launched the campaign in 1966, and it ended in triumph in 1980, when smallpox was declared eradicated.
A complete socialization of the war against germs seemed sensible. The germs, after all, lived in the public water, floated through the public air, and passed from hand to hand in the public square. Individuals might buy their own vaccines, antibiotics, bottled water, and face masks. But collective means could make all of that forever unnecessary. And they did. Big government attacked the infectious microbes with genocidal determination, expelling them, one by one, from human society. Defiled by monstrous human fratricide, the first five decades of the twentieth century were also the triumphant decades of public health.
To stay prepared, however, human culture apparently requires regular booster shots of smallpox, cholera, plague, or some other serious disease that indiscriminately sickens and kills. Without periodic decimation, ordinary people apparently forget what germs can do, the authorities grow complacent, scientists turn their attention elsewhere, and private capital stops investing in the weapons of self-defense.
As Sherwin Nuland observes in How We Die, AIDS struck just when “the final conquest of infectious disease seemed at last within sight.” In 1981, a weekly Centers for Disease Control report noted a sudden increase in a specific strain of pneumonia in California and New York. Before long, we had it on Oprah Winfrey’s authority that the germs were back and were after us all. “One in five heterosexuals could be dead of AIDS in the next three years,” she declared in February 1987.
Whatever they were thinking about HIV, the heterosexuals had, by that point, plenty of other venereal diseases to worry about. A tragically large number of young women had contracted chlamydial infections serious enough to leave them infertile. Herpes, gonorrhea, syphilis, and some types of sexually transmitted hepatitis were also on the rise. The sexual revolution seems in retrospect to have been led by people who took William Stewart at his word when he consigned infectious disease to the dustbin of history. But rampant promiscuity packs people together tighter than slums, and germs rush in where angels fear to tread. It has taken a great deal of readily avoidable suffering and death to establish that people do need sexual taboos—taboos at the very least robust enough to thwart microbes, if not with less sex, then with more latex.
As social agents go, however, HIV and chlamydia accomplished far less than cholera. It was the demise of a germ-hating culture that had helped clear the way for new epidemics of venereal disease, and the resurrection of that culture still has a long way to go. Many people in positions of authority and influence continue to affirm the tattoo artist’s expressive freedom, the bag lady’s right to sleep next to the sewer, the mainliner’s right to share needles in an abandoned row house, and the affluent parent’s right to interpose a “philosophical objection” between his child and the vaccinations demanded by public schools. They propound grand new principles of freedom, privacy, and personal autonomy to protect septic suicide, even septic homicide. Social doctors in Dickens’s day didn’t have to invade anyone’s privacy to track smallpox—it announced itself on its victims’ faces. Tracking HIV, by contrast, requires a blood test, and privacy police dominated the first 20 years of the fight over testing.
A legal system that affirms the individual’s right to do almost everything at the germ-catching end now struggles to decide when, if ever, we can force the Typhoid Marys of our day to stop pitching what they catch. The law that once ordered a healthy Henning Jacobson to roll up his sleeve can no longer compel a virulent celebrity to zip his fly. Infectious lifestyle, once a crime, is now a constitutional right.
Many people just don’t care much, and it’s easy to see why. Habits and lifestyles that the Victorians learned to shun look a lot less vile when they lose not only their repulsive cankers, pustules, sputum, fevers, diarrhea, dementia, and emaciation, but also their power to impose these horrors on the neighbors. Just three months after Oprah warned heterosexuals about AIDS, President Reagan thought it necessary to remind us that we were battling a disease, not our fellow citizens. Everyone knew why. The gay community had good reason to fear that many Americans might be thinking: HIV isn’t my problem, it’s theirs. The new choleras are indeed much less social than the old. Why shouldn’t they forever remain so?
Over the morning coffee and toast, consumed in our tidy little kitchens, we read that drug-resistant tuberculosis is a cause for growing concern—but mainly in prisons. So too are new drug-resistant staph infections—in tattoo parlors and the foulest of locker rooms. And it’s in private drug dens, bedrooms, and bathhouses, of course, that infectious germs have made their biggest comeback, contriving to get themselves spread by not-quite-private needles and genitalia. True, the germs incubated in abandoned houses, cardboard boxes, and other hovels have drifted into run-down urban hospitals, whose emergency rooms often provide primary care to the patients most likely to harbor the worst germs. But they haven’t moved much farther than that. Sharing the city, it seems, no longer means sharing smallpox and the plague.
The epidemiological facts—beyond serious dispute—support the complacent majority. Germs used to ravage young bodies with inexperienced immune systems; now they mainly take the old. Though death certificates still quite often record an infection as the final cause of death, germs now are mostly epitaph killers, moving in on immune systems terminally crippled by old age, heart disease, cancer, stroke, and Alzheimer’s.
In the pantheon of disease and death, lifestyle and genes have completely eclipsed germs. The great agent of social change today isn’t cholera; it’s cholesterol. It propagates via drive-through windows, not sewers. Crowds don’t flee the city when it strikes; they pay extra for a supersize serving. In the heyday of public health, public money went to clean up public filth. Today, we’re sick because we spend our private money buying bad health by the pack, the bottle, and the Happy Meal. Small wonder, then, that the germ-fighting social norms once ranked next to godliness still seem to many as antiquated as the whalebone hoops that defended Victorian virtue.
If we took the new microbes seriously, we could certainly beat them. The science for tracking, immunizing against, and annihilating germs grows more vigorous, innovative, and youthful with each passing year. But insurers and regulators now control how we use that science, and as the germ-phobic culture has decayed, they’ve grown increasingly slow and rigid. Many competent people at the top echelons of government worry deeply about this problem. Yet as they scramble to address it, they must hack their way through laws and bureaucracies that have accumulated and thickened since the 1960s. Technical know-how isn’t enough; collective will is also necessary. Yet—paradoxical as it sounds—collective will is what was lost as government took over the show.
Good sewers, public sanitation, and fresh water are undoubtedly public ends, best advanced by public means. Yet though germ killing begins in public, it must, as the Victorians grasped, end in private, and this is where the government’s attempt to take charge of everything has had terrible consequences. The Victorians had nothing but culture to wield against germs on private premises, so they taught that clean was virtuous, and dirty sinful, and they taught it very persistently to everyone. But the big, efficient, technocratic government agencies of our day don’t do virtue and sin—they requisition, stockpile, subsidize, proscribe, and mandate. And they teach—implicitly but persistently—that germs are government’s responsibility, not yours. Socialized germ killing makes it a lot easier for people to lose touch with the personal side of germicide.
When the government then tries to clean up human bodies with the same heavy hand that it uses to clean up the sewers, it can end up fighting both amnesiacs and those who remember too well. The forgetful push back because they are sick and tired of being hectored by the universal nanny about washing hands, vaccinating kids, and countless other time-wasting nuisances. The unforgetful seem to believe that the little routines and habits of daily life are too important to entrust to a nanny perched on the banks of the Potomac.
Consider, for example, the most important new vaccine recently licensed, to protect against the human papillomavirus (HPV). Developed by scientists at the National Cancer Institute, it’s designated a “childhood” vaccine, which will give it some (by no means complete) shelter from the tort lawyers. Merck licensed the vaccine, steered it through the FDA, and will be responsible if anything goes wrong. The firm is charging $300 or more for the three-shot dose—ten to 100 times the inflation-adjusted cost of most vaccines in the 1950s. It may well be worth it. This new kids’ vaccine protects against a sexually transmitted virus that causes many cases of cervical cancer. To be effective, however, vaccination must occur before exposure to the virus, and each new sexual partner exposes a girl to a 15 percent chance of infection. The Centers for Disease Control therefore plans to see to it that girls are vaccinated beforehand. That, the federal authorities have concluded, means before they are 12.
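The arithmetic behind that timing is worth spelling out. Taking the article's 15 percent per-partner figure at face value, and assuming for simplicity that each new partner carries the same independent risk, the chance of infection compounds quickly:

\[
P(\text{infected after } n \text{ partners}) = 1 - (1 - 0.15)^n \approx 15\% \ (n=1),\quad 28\% \ (n=2),\quad 39\% \ (n=3),\quad 56\% \ (n=5).
\]

On that simplified reckoning, the odds are better than even by the fifth partner, which is why the public health logic insists that the shot come before the first.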
Quite a few parents have concluded that the federal authorities can go to hell. The amnesiacs are beyond help; they’re probably skipping other vaccines, too. As for the mnemonists—maybe some just remember that sex spreads a lot of other germs, as well, and figure that they have a scheme to protect their little girls from all of them. Others probably aren’t making any conscious calculation about germs; they’re just holding fast to a faith and culture that still seeks to protect little girls from sex itself. People who believe government can achieve anything will say that it should just have handled this one more delicately. Perhaps—but the fact is, the government’s germ killers have ended up at loggerheads with the people they most need as their closest allies: parents who teach the taboos and rules that provide a crucial line of defense against the most persistent and clever killers of children on the face of the planet.
Even when it doesn’t reach the point of turning parents against vaccines, the government takeover has left many people with a triple sense of entitlement—to germ-free life, risk-free drugs, and wallet-free insurance. This in turn has created an almost profit-free economic environment for germ-hunting pharmaceutical companies, which still do much of the basic science and take charge of the essential, delicate, and difficult business of mass drug production.
The new drug law that President Kennedy signed on October 10, 1962, codified a profound change in attitude. With infectious diseases all but finished, the drugs of the future would target human chemistry. A horrified world had just discovered that one such drug, which effectively relieved morning sickness and helped people fall asleep, also halted the growth of a baby’s limbs in the womb. Before, when microbes were the enemy, drugs got the benefit of the doubt. After the 1962 Thalidomide amendments, the unknown cure was officially more dangerous than the known disease. Very strong evidence would be necessary to establish otherwise.
The 1962 drug-law amendments gave decisive weight to human tests and clinical outcomes: a drug would not be deemed “effective” without clinical trials in which human patients started out sick and finished up healthy. Progressing in little steps from that seemingly sensible starting point, the FDA has since reached the point of worrying more about drugs evolving into sugar pills than about currently innocuous germs evolving into plagues.
The FDA has long required that clinical tests demonstrate that a new antibiotic is as good as or better than one already on the shelf, and it wants these trials to be extremely thorough and convincing. It worries that with too few patients tested, statistical anomalies might allow an inferior antibiotic to win its license, and the new one might then become the benchmark for a third even worse one, and so on until the industry slouches its way down to licensed sugar pills. The agency calls this scenario “biocreep.” It’s just the sort of logical but overly theoretical concern that sneaks into government offices where the paramount objective is to avoid mistakes. But nowadays, “no mistakes” means no new drugs licensed until the germs start killing lots of people again. Most serious infectious diseases are rare, and it’s unethical to test a new drug on seriously ill patients when there’s an old, licensed alternative in easy reach. This makes it all but impossible to assemble enough sick-but-not-too-sick patients for statistically meaningful clinical tests.
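For readers who want to see how biocreep could unfold, here is a toy simulation in Python, with invented numbers and a deliberately crude licensing rule rather than anything the FDA actually uses: each candidate antibiotic is truly a little worse than the current standard, but a modest trial judged against a fixed non-inferiority margin keeps waving it through, and each newly licensed drug becomes the benchmark for the next.

```python
import random

def simulate_biocreep(true_cure_start=0.90, margin=0.10, drop_per_gen=0.03,
                      n_patients=200, n_generations=8, seed=1):
    """Toy illustration of 'biocreep' in non-inferiority testing.

    Hypothetical setup: each new antibiotic truly cures slightly fewer
    patients than the current comparator, but finite trials judged against
    a fixed non-inferiority margin keep licensing it, and the newly
    licensed drug then becomes the comparator for the next candidate.
    """
    random.seed(seed)
    comparator_cure = true_cure_start
    history = [("original drug", comparator_cure)]
    for gen in range(1, n_generations + 1):
        candidate_cure = comparator_cure - drop_per_gen  # truly a bit worse
        # Observed cure rates in a head-to-head trial, n_patients per arm
        obs_comp = sum(random.random() < comparator_cure for _ in range(n_patients)) / n_patients
        obs_cand = sum(random.random() < candidate_cure for _ in range(n_patients)) / n_patients
        # Crude rule: license unless the observed shortfall exceeds the margin
        if obs_comp - obs_cand < margin:
            comparator_cure = candidate_cure  # the weaker drug is the new benchmark
            history.append((f"generation {gen}", comparator_cure))
        else:
            break
    return history

if __name__ == "__main__":
    for name, cure in simulate_biocreep():
        print(f"{name:>14}: true cure rate {cure:.0%}")
```

Run a few times with different seeds, the true cure rate of the licensed standard typically drifts from 90 percent down into the 60s and 70s without any single trial sounding an alarm, which is exactly the slide toward licensed sugar pills that the agency fears.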
For example, the FDA recently had to decide whether to approve the use of Cubicin against a vicious germ that infects heart valves. Dubbed the “Darth Vader” bacterium, it is stealthy, difficult to kill, and almost always fatal if untreated. The FDA’s staff wasn’t convinced that enough patients had been tested to establish that the drug was better than the already existing alternatives—and argued that if it was too weak, the germ might evolve into something beyond Darth, resistant to everything. On the other hand, doctors were already using Cubicin off-label—prescribing it in ways never officially vetted or approved by the drug company or the regulator—but in low doses, thus possibly creating an even greater beyond-Darth risk. In the end, the chief of the FDA’s anti-infectives division overruled a staff recommendation and granted the license. But every such decision is a fight, and all the legal, political, and institutional cards are now stacked against quick, bold action.
A letter coauthored by a doctor at Harvard’s medical school and published in Clinical Infectious Diseases in 2002 bluntly linked the FDA’s approach to “the end of antibiotics.” “For nearly two decades,” it began, “antibacterial research has been the ‘Cinderella’ area in the pharmaceutical industry.” Increasingly stringent demands for proof of efficacy, framed in ways that seem “innocent and technical,” the authors argued, have thrown the industry “into a panic.” They have “wreaked irreparable damage to our ability to provide a reliable pipeline of new antibiotics for treatment of serious infections.” And they probably helped propel Lilly and Bristol-Myers Squibb out of antibacterial research and development.
Most fundamentally, the FDA has no mandate—none at all—to prepare us for war against the next cholera. Its mandate is to make sure that we don’t lose the battle against the next Thalidomide. “Safe” and “effective,” the two key standards set out in the 1962 drug law, have intelligible meaning only with a germ to fight and infected patients to fight it in. The agency, in other words, actually needs thriving germs to supply enough really sick patients to provide FDA-caliber evidence to validate the drug that will wipe out the germs. Epidemics that lurk in the future, waiting for germs still under construction, can’t be officially considered at all. But those germs—the ones that don’t yet exist, the ones still evolving, by chance or with human help in a terrorist’s lab—should worry us most.
Today’s public health guardians have made things even harder for vaccines than for antibiotics. Gene-splicing and other bioengineering tools make it far easier today than ever before to build safe corpses, cousins, or fragments of horrible microbes. But many vaccines that could quite easily be developed haven’t been, and probably won’t be, because no drug company will take them to market.
Political support for the FDA depends on the public perception that it’s solving problems. When widely administered, as it must be, a vaccine wipes out the disease that it targets. The disease can thus be eclipsed in the public’s mind by the vaccine’s side effects, however rare or even imaginary. The more effective the vaccine at the outset, the more likely it will be condemned as unsafe at the end. Moreover, the FDA has an acknowledged policy of being extra cautious in licensing any product that healthy people will use, especially children. Nothing could better suit the germs. Healthy children with undeveloped immune systems are their favorite targets.
Judges and juries have even more trouble balancing the interests of the individual who claims that a vaccine has injured him against those of the rest of the disease-free community. New legal standards formulated in the 1950s and ’60s made it much easier to sue vaccine manufacturers, and relaxed rules of evidence soon made junk-science allegations much more common than legitimate ones. When liability claims spiraled to the point where they threatened to cut off the supply of some vaccines entirely, Congress set up an alternative compensation system for children (though not for adults) actually injured by their immunizations, and imposed a broad-based vaccine tax to fund it.
But it was too little, too late. Enveloped in bureaucracy, the germ-killing segment of the drug industry has lost much of its flexibility, resilience, and reserve capacity—and has become painfully slow in developing what new science makes possible. Short-term economics and federal law have converged to create a systematic bias in favor of the germicidal drug invented and licensed decades ago. The principal fiscal concern is who should pay how much for the new, patented drug, or whether the old, cheaper generic might not do as well. The principal regulatory concern centers on the risks of the new drug, not the perils of the new germ. Insurers are cost averse: wherever they can, they favor cheap drugs with expired patents. The FDA is risk averse: when in doubt, it sticks with the old and says no to the new.
Research labs do continue to come up with new vaccines, but that decade-long process is now routinely followed by a second decade (at least) before a commercial product makes it to market. No one doubts that the extra time helps ensure that the vaccine eventually injected into millions of arms is safer and more effective than it might otherwise be. But the whole effort is long and costly, and the likely profits waiting at the end, many manufacturers have concluded, are very modest by comparison. In 1957, five important vaccines were being supplied by 26 companies. By 2004, just four companies were supplying 12. Mergers accounted for some of the attrition, but most of it resulted from companies getting out of the business. More than half of the vaccines routinely given to young children in 2005 came from just one manufacturer; four others had only two suppliers.
Having surrendered on all other social aspects of infectious disease, the health authorities now focus principally on socializing costs. Capping profits is the politically inevitable corollary. The federal government now buys over half of all vaccines used in the United States, and by taking on that role, it has effectively taken control of prices.
After the anthrax mail attacks in late 2001, federal authorities made it clear that if push came to shove, they would rescind patents, too. Just two years earlier, at the government’s specific request, Bayer had asked the FDA to approve and label Cipro for use against inhalational anthrax. The agency granted the license—the first ever for a drug for use in responding to a deliberate biological attack—in a matter of months, basing its approval not on human trials but on the antibiotic’s effectiveness in inhibiting anthrax in rhesus monkeys. Then came the 2001 mail terror. Demand for Cipro soared—and prices collapsed. Yes, collapsed.
It was politically unthinkable for Bayer to raise the retail price in pharmacies, and federal authorities immediately demanded huge discounts on the pills that they wanted to stockpile. The Canadian government initiated its price negotiations by announcing that it would ignore Bayer’s Canadian patent and order a million tablets of a generic version of the drug from another company.
A couple of years later, Congress passed the 2004 BioShield law. It is intended to create a federal stockpile of bioterror vaccines, and to that end, it empowers the Pentagon to bypass certain aspects of the 1962 drug laws. Those provisions have already been invoked once, to cut off litigation against military use of the anthrax vaccine. The federal government also offered almost $1 billion of BioShield cash for the development of a new anthrax vaccine and the provision of 75 million doses. But established drug companies just weren’t interested. The contract went to VaxGen, a tiny startup that had never brought a licensed drug to market and that proposed to supply a bioengineered vaccine that the army had already developed. VaxGen failed to deliver, and last December, the government canceled the contract.
A decades-old alternative with various problems continues to be provided by BioPort, whose declared mission is to develop products for use against infectious diseases “with significant unmet or underserved medical needs” and, most notably, “potential weapons of bioterrorism.” That would mean anthrax, botulism, Ebola, and smallpox, among other killers. BioPort employs about 450 people.
So what would a watchmaker have done—not a blind one, but one with keen eyes and an excellent loupe—if called upon to design a microbe that could thrive among people fortified by fistfuls of vaccines and backed up by dozens of potent antibiotics? Nature got there, without the loupe.
Humans had spent a painstaking century developing vaccines. So nature designed an “immunodeficiency” virus—an all-purpose anti-vaccine, so tiny, quiet, slow, methodical, and gentle that it spread unnoticed for decades, and so innocuous that it never quite gets around to killing you at all. It leaves that to the old guard—the bacteria, protozoa, and viruses that invade when your immune system shuts down, and feast on your brain, lungs, blood, liver, heart, bone marrow, guts, skin, and the surface of your eyes. In its final stages, AIDS is truly horrible.
When the blind watchmaker has been pulling such stunts for 4 billion years, it’s reckless to suppose that HIV was its last or worst. Quite the contrary: our casual willingness to tolerate a septic underclass, so long as it remains insular and out of sight, is certain to hasten the rise of much more and much worse. People as negligent with pills as they are with germs have already helped spawn drug-resistant forms of tuberculosis, by taking enough medicine to kill weaker strains, while leaving hardy mutants alive to take over the business. HIV patients who don’t strictly follow the complex, unpleasant drug regimen used to suppress the virus become human petri dishes, in which microbes multiply and evolve to resist the stew of antibiotics prescribed as a last resort.
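A back-of-the-envelope model makes that mechanism concrete. The numbers below are invented purely for illustration: a large drug-sensitive population, a handful of resistant mutants, a drug that knocks the sensitive strain down tenfold a day while barely denting the mutants, and a patient who quits after five days.

```python
def undertreatment_selection(days_treated=5, days_relapse=7,
                             sensitive=1_000_000.0, resistant=10.0,
                             on_drug_sensitive=0.1,   # net daily multiplier while dosed
                             on_drug_resistant=1.2,
                             off_drug_growth=2.0):
    """Toy selection model with made-up parameters, for illustration only.

    While the drug is taken, the sensitive majority is cut down hard and
    the rare resistant mutants are barely touched; when the patient stops
    early, what regrows is dominated by the resistant strain.
    """
    for _ in range(days_treated):
        sensitive *= on_drug_sensitive
        resistant *= on_drug_resistant
    share = resistant / (sensitive + resistant)
    print(f"treatment stopped early: {resistant:,.0f} resistant organisms, "
          f"{share:.0%} of what survives")

    for _ in range(days_relapse):  # relapse: both strains regrow unchecked
        sensitive *= off_drug_growth
        resistant *= off_drug_growth
    share = resistant / (sensitive + resistant)
    print(f"after the relapse: {resistant:,.0f} resistant organisms, "
          f"{share:.0%} of the renewed infection")

undertreatment_selection()
```

Starting from roughly one resistant mutant in a hundred thousand, the strain the drug cannot touch ends up, in this sketch, as about seven in ten of the infection that comes roaring back.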
The new infectious diseases are already very good at this sort of adaptation—they have evolved to be as nimble as we are now institutionally stolid, as flexible as we are rigid. The influenza virus evolves exceptionally fast, using pigs as its genetic mixing bowl. HIV mutates constantly—one drug-resistant strain of the virus now apparently depends for its survival on a chemical constituent of the drug widely prescribed to stop its advance. Stubbornly persistent pelvic infections are on the rise, along with drug-resistant gonorrhea and even syphilis. A certain staph bacterium responsible for the most common infection acquired in hospitals has developed ways to pass the gene that produces a lethal toxin from one strain to the next and also to certain viruses that can spread it further still.
It’s because the threat is so grave that one must avoid the temptation to propose a simpleminded checklist of reforms to shoehorn somewhere into the middle of the next 1,000-page revision of the federal drug laws and FDA regulations. Germs are terrorists: they let the dead past bury its dead, they are always changing, and the ones you know aren’t the ones that will kill you. If we somehow revive a tough, germ-fearing culture, the risk-averse drug regulators, penny-pinching insurers, overreaching judges, clueless juries, and preening, industry-bashing congressional committees will fall into line. If we don’t, no tinkering will make much difference.
What we need is quite simple. We need many people to be much more frightened than they currently are. And we need a robust, flexible, innovative portfolio of drug companies to sink a lot of new capital into highly speculative ventures, almost all of which will lose money, with just one or two ending up embraced by regulators, eagerly paid for by insurers, vindicated every time by judges and juries, lauded by nonconformist preachers, and so spectacularly profitable for investors that they crowd in to fund more.
If we can’t drum up concern by other means, some dreadful germ will materialize and do the job for us. Nobody knows which one; that’s why we so desperately need the right popular culture and vigorous private enterprise. If the germs in the tattoo parlors today were both virulent and untreatable with current medicine, you wouldn’t be reading this, at least not in the heart of any big city. You’d be heading for the country.
That’s what the rich did when epidemics struck in Dickens’s day. They knew what they were fleeing—the urban pathologies described in Our Mutual Friend in 1864 were as familiar to Londoners as the Thames. And familiar not just to the boatmen who made a living fishing human corpses out of the river but also to the middle class, decimated by a violent cholera outbreak in Soho at the end of August 1854; to the entrepreneurs who made fortunes collecting and sorting mountains of trash; to members of Parliament, who, in June 1858, had to evacuate the House of Commons to escape the pestilential stench of the river; and to Queen Victoria, who lost her husband to another waterborne disease, typhoid fever, in 1861. Small wonder that cholera was a great agency for social change. In the time of cholera, the bacterium itself loved everyone.
Anthrax prefers goats; it finds its way into human bodies only very occasionally, through open wounds. The spores can be inhaled, too, but ordinarily they clump together and don’t spread well through the air. They become mass killers only when people painstakingly coat them with other materials and take special efforts to disperse them. The spores that struck 11 Americans (and killed five) in Washington, D.C., and New York in late 2001 weren’t dispersed through the Potomac or Hudson Rivers; they arrived by U.S. mail. A few pounds, suitably prepared and dispersed in the New York subway, could kill 100,000 people. If cholera is a social disease, weaponized anthrax defines the antisocial bottom of contagion—it’s a microbe that infects humans only with the help of sociopaths.
But we live in an age of sociopaths, and there remains much that we don’t know about germs. Viruses and prions may play a far larger role in genetic malfunctions than we yet fully understand. HIV and influenza demonstrate the boundless viral capacity to mutate and evolve. And while anthrax could never make it on its own in New York, murderous people are scheming to give it help. One way or another, germs will contrive to horrify us again, in some very nasty way. A society’s only real defense is to stay horrified, well ahead of the curve.
Peter W. Huber is a Manhattan Institute senior fellow. His books include Hard Green: Saving the Environment from the Environmentalists and Galileo’s Revenge: Junk Science in the Courtroom.
There have been at work among us three great social agencies: the London City Mission; the novels of Mr. Dickens; the cholera.” Historian Gertrude Himmelfarb quotes this reductionist observation at the end of her chapter on Charles Dickens in The Moral Imagination; her debt is to an English nonconformist minister, addressing his flock in 1853. It comes as no surprise to find the author of Hard Times and Oliver Twist discussed alongside Edmund Burke and John Stuart Mill in a book on moral history. Nor is it puzzling to see Dickens honored in his own day alongside the City Mission, a movement founded to engage churches in aiding the poor. But what’s V. cholerae doing up there on the dais beside the Inimitable Boz? It’s being commended for the tens of millions of lives it’s going to save. The nastiness of this vile little bacterium has just transformed ancient sanitary rituals and taboos into a new science of epidemiology. And that science is about to launch a massive—and ultimately successful—public effort to rid the city of infectious disease.
The year 1853, when a Victorian doctor worked out that cholera spread through London’s water supply, was the turning point. Ordinary people would spend the next century crowding into the cities, bearing many children, and thus incubating and spreading infectious disease. Public authorities would do all they could to wipe it out. For the rest of the nineteenth century, they lost more ground than they gained, and microbes thrived as never before. Then the germ killers caught up—and pulled ahead. When Jonas Salk announced his polio vaccine to the press in April 1955, the war seemed all but over. “The time has come to close the book on infectious disease,” declared William Stewart, the U.S. surgeon general, a few years later. “We have basically wiped out infection in the United States.”
By then, however, infectious diseases had completed their social mission. Public authorities had taken over the germ-killing side of medicine completely. The focus shifted from germs to money—from social disease to social economics. As germs grew less dangerous, people gradually lost interest in them, and ended up fearing germ-killing medicines more than the germs themselves.
Government policies expressed that fear, putting the development, composition, performance, manufacture, price, and marketing of antibiotics and vaccines under closer scrutiny and control than any public utility’s operations and services. The manufacturers of these drugs, which took up the germ-killing mission where the sewer commission left off, must today operate like big defense contractors, mirror images of the insurers, regulatory agencies, and tort-litigation machines that they answer to. Most drug companies aren’t developing any vaccines or antibiotics any more. The industry’s critics discern no good reason for this at all: as they tell it, the big drug companies just can’t be bothered.
These problems capture our attention only now and again; they hardly figure in the much louder debate about how much we spend on doctors and drugs, and who should pay the bills. “Public health” (in the literal sense) now seems to be one thing, and—occasional lurid headlines notwithstanding—not a particularly important one, while “health care” is quite another.
We will bitterly regret this shift, and probably sooner rather than later. As another Victorian might have predicted—he published a book on the subject in 1859—germs have evolved to exploit our new weakness. Public authorities are ponderous and slow; the new germs are nimble and fast. Drug regulators are paralyzed by the knowledge that error is politically lethal; the new germs make genetic error—constant mutation—the key to their survival. The new germs don’t have to be smarter than our scientists, just faster than our lawyers. The demise of cholera, one could say, has been one of the great antisocial developments of modern times.
By withdrawing from the battlefield just long enough to let us drift into this state of indifference, the germs have set the stage for their own spectacular revival. Germs are never in fact defeated completely. If they retire for a while, it’s only to search, in their ingeniously stupid and methodically random way, for a bold new strategy. They’ve also contrived, of late, to get human sociopaths to add thought and order to the search. The germs will return. We won’t be ready.
Microbes discovered the joys of socialism long before Marx did, and in matters of health, they made communists of us all. Since the dawn of civilization, infectious disease has been the great equalizer, with the city serving as septic womb, colony, and mortuary. Epidemic—“upon the people”—is the democracy of rich and poor incinerated indiscriminately by the same fever, or dying indistinguishably in puddles of their own excrement.
The Mao of microbes was smallpox, which killed 300 million people in the twentieth century alone. Sometimes called the first urban virus, it probably jumped from animals to humans in Egypt, Mesopotamia, or the Indus River Valley, at about the same time that the rise of agriculture began drawing people together in towns and cities. Smallpox has also been called nature’s cruelest antidote to human vanity. Princes broke out in the same pustules as paupers, reeked as foully of rotting flesh, and oozed the same black blood from all their orifices. Alongside millions of nameless dead lie kings of France and Spain, queens of England and Sweden, one Austrian and two Japanese emperors, and a czar of Russia.
While the germs reigned, there wasn’t much rest-of-medicine to speak of: infections eclipsed every other cause of illness but malnutrition. And when monarchs were dying, too, language and politics honestly tracked medical reality. The “social” in “social disease” reflected an epidemiological fact. It also pointed to a practical, collective solution. Disease arose and spread when people converged to create societies. It was caused by invisible agents that individuals could not control on their own. It could be eradicated only by social means—public sanitation, slum clearance, education, and, above all, a robust, germ-hating culture. It took a city to erase a cholera.
This was the overarching insight that crystallized in the public consciousness in the first half of the nineteenth century. “In the Victorian version of the Puritan ethic,” Himmelfarb writes, “cleanliness was, if not next to godliness, at least next to industriousness and temperance.” For Dickens, as Himmelfarb and others have observed, the filth in the Thames symbolized the city’s insidious taint, its ubiquitous, effluvial corruption. What social historians often fail to note, however, is that by the time Dickens was placing the Thames at the center of London’s many ills, a new science had emerged to move the river far beyond metaphor.
Epidemiology—the rigorous science of public health—was born with physician William Farr’s appointment as controller of London’s General Register Office in 1838. Directed to do something about the cholera epidemic, Farr began systematically recording who was dying and where. The most important things he discovered were negative. Wealth didn’t protect you from cholera. Neither did occupation, or residing close to the sea. What mattered was how high above the Thames you lived. Farr concluded that the river’s horrendous stench caused the disease. Another English doctor, John Snow, made the right connection in 1853: London’s sewers emptied into the Thames, so the farther down-sewer you lived, the more likely you were to drink foul water. A year later, Snow saved countless lives by persuading parish authorities to remove the handle from the Broad Street pump in Soho.
The rest is history. By pinning down the waterborne pathway of contagion, Farr and Snow had transformed a devastating public disease into a routine exercise in civil engineering. In 1858, Parliament passed legislation, proposed by then-chancellor of the exchequer Benjamin Disraeli, to finance new drains. Charles Dickens published his last novel—Our Mutual Friend, in which the main character is the pestilential Thames—in 1864. London suffered its last cholera epidemic in 1866.
This wasn’t the end of great plagues in the city, or even the beginning of the end, but it was the end of the beginning. In 1872, Disraeli rallied his Tory Party around what his Liberal opponents derided as a “policy of sewage”—reforms involving housing, sanitation, factory conditions, food, and the water supply—and while he served as prime minister, these policies became law. For the next 50 years or so, in the United States as in Britain, public health depended on city bureaucrats above all. They wasted little time with sick patients, other than sometimes ordering them to lock their doors and die alone. They focused instead on eradicating germs before they reached the patient, and that meant attending to the water, sewage, trash, and rats.
In a recent British Medical Journal survey, public sanitation was voted the most important medical advance since the journal was established in 1840. If we don’t think of public sanitation as “medical” any more, it’s only because the municipal bureaucrats who followed Farr cleaned things up so well.
As they ran out their welcome in public spaces, microbes went private. They still had to move from person to person, but there could be no more carefree joyrides on rats or surfing through the water supply. People themselves, however, are almost as infectious as their sewers. Clean water alone could not eliminate coughs, dirty hands, and filthy food.
The systematic pursuit of germs into the flesh of patients didn’t really begin until the very late nineteenth century. Jenner’s smallpox vaccine was already a century old, but it owed its existence to the lucky fact that the human pox had a weak cousin that infected cows. (We give our kids the “cow treatment”—vacca is Latin for cow—every time we do as Jenner did, and challenge their immune systems with a corpse, cousin, or fragment of a horrible microbe.) The systematic production of other vaccines had to await the arrival of Louis Pasteur and Robert Koch, who developed procedures for isolating microbes and then crippling or killing them.
Vaccines, health authorities quickly recognized, are quintessentially public drugs. They expel germs not from the public water but from the not-quite-private lungs, fluids, and intestines of the public itself. When enough people get vaccinations, “herd immunity” protects the rest.
Five human vaccines arrived in the late nineteenth century, and many others would follow in the twentieth. They weren’t developed quickly or easily, but they did keep coming. In due course, and for the first time in human history, serious people began to believe that infectious disease might come to an end. Scientists could painstakingly isolate germs that attacked humans. Drug companies then would find ways to cultivate and cripple the germs, and mass-produce vaccines to immunize the public. Disease would fall, one germ at a time, and when they were all gone, good health would be pretty much shared by all.
New laws, vigorously enforced, drafted the healthy public into the war on germs. England mandated universal smallpox vaccination in 1853. Facing a smallpox outbreak, Cambridge, Massachusetts, decreed in February 1902 that all the town’s inhabitants “be vaccinated or revaccinated,” set up free vaccination centers, and empowered a physician to enforce the measure. A certain Henning Jacobson refused to comply, insisting on a constitutional right “to care for his own body and health in such way as to him seems best.” The U.S. Supreme Court disagreed, easily upholding the “power of a local community to protect itself against an epidemic threatening the safety of all.” Later enactments would require the vaccination of children before they could attend public schools. Adults who traveled abroad had to be vaccinated if they planned to come home. Albert Sabin’s polio vaccine took things even further—the vaccine itself was infectious. A child swallowed a live but weakened virus soaked into a sugar cube, and then went home and vaccinated his siblings and parents, too.
The germ killers didn’t really get into the business of private medicine—curing already-sick patients—until the development of sulfa drugs in the 1930s, followed by antibiotics after World War II. Even then, much of the cure still lay in preventing the infection of others. The paramount objective with tuberculosis, for instance, was to wipe out the tubercle bacillus so thoroughly that nobody would need streptomycin any more, because nobody would come into contact with any other infected person or animal.
Year by year, one segment or another of the public sector contrived to take a little more control, directly or indirectly, over the development, distribution, and price of vaccines. Soon after the development of the polio vaccine in the 1950s, Washington launched a program to promote and subsidize the vaccination of children nationwide. At about the same time, the Soviet Union proposed a global campaign to eradicate smallpox. The World Health Organization officially launched the campaign in 1966, and it ended in triumph 20 years later.
A complete socialization of the war against germs seemed sensible. The germs, after all, lived in the public water, floated through the public air, and passed from hand to hand in the public square. Individuals might buy their own vaccines, antibiotics, bottled water, and face masks. But collective means could make all of that forever unnecessary. And they did. Big government attacked the infectious microbes with genocidal determination, expelling them, one by one, from human society. Defiled by monstrous human fratricide, the first five decades of the twentieth century were also the triumphant decades of public health.
To stay prepared, however, human culture apparently requires regular booster shots of smallpox, cholera, plague, or some other serious disease that indiscriminately sickens and kills. Without periodic decimation, ordinary people forget what germs can do, the authorities grow complacent, scientists turn their attention elsewhere, and private capital stops investing in the weapons of self-defense.
As Sherwin Nuland observes in How We Die, AIDS struck just when “the final conquest of infectious disease seemed at last within sight.” In 1981, a weekly Centers for Disease Control report noted a sudden cluster of cases of a rare form of pneumonia in California and New York. Before long, we had it on Oprah Winfrey’s authority that the germs were back and were after us all. “One in five heterosexuals could be dead of AIDS in the next three years,” she declared in February 1987.
Whatever they were thinking about HIV, the heterosexuals had, by that point, plenty of other venereal diseases to worry about. A tragically large number of young women had contracted chlamydial infections serious enough to leave them infertile. Herpes, gonorrhea, syphilis, and some types of sexually transmitted hepatitis were also on the rise. The sexual revolution seems in retrospect to have been led by people who took William Stewart at his word when he consigned infectious disease to the dustbin of history. But rampant promiscuity packs people together tighter than slums, and germs rush in where angels fear to tread. It has taken a great deal of readily avoidable suffering and death to establish that people do need sexual taboos—taboos at the very least robust enough to thwart microbes, if not with less sex, then with more latex.
As social agents go, however, HIV and chlamydia accomplished far less than cholera. It was the demise of a germ-hating culture that had helped clear the way for new epidemics of venereal disease, and the resurrection of that culture still has a long way to go. Many people in positions of authority and influence continue to affirm the tattoo artist’s expressive freedom, the bag lady’s right to sleep next to the sewer, the mainliner’s right to share needles in an abandoned row house, and the affluent parent’s right to interpose a “philosophical objection” between his child and the vaccinations demanded by public schools. They propound grand new principles of freedom, privacy, and personal autonomy to protect septic suicide, even septic homicide. Social doctors in Dickens’s day didn’t have to invade anyone’s privacy to track smallpox—it announced itself on its victims’ faces. Tracking HIV, by contrast, requires a blood test, and privacy police dominated the first 20 years of the fight over testing.
A legal system that affirms the individual’s right to do almost everything at the germ-catching end now struggles to decide when, if ever, we can force the Typhoid Marys of our day to stop pitching what they catch. The law that once ordered a healthy Henning Jacobson to roll up his sleeve can no longer compel a virulent celebrity to zip his fly. Infectious lifestyle, once a crime, is now a constitutional right.
Many people just don’t care much, and it’s easy to see why. Habits and lifestyles that the Victorians learned to shun look a lot less vile when they lose not only their repulsive cankers, pustules, sputum, fevers, diarrhea, dementia, and emaciation, but also their power to impose these horrors on the neighbors. Just three months after Oprah warned heterosexuals about AIDS, President Reagan thought it necessary to remind us that we were battling a disease, not our fellow citizens. Everyone knew why. The gay community had good reason to fear that many Americans might be thinking: HIV isn’t my problem, it’s theirs. The new choleras are indeed much less social than the old. Why shouldn’t they forever remain so?
Over the morning coffee and toast, consumed in our tidy little kitchens, we read that drug-resistant tuberculosis is a cause for growing concern—but mainly in prisons. So too are new drug-resistant staph infections—in tattoo parlors and the foulest of locker rooms. And it’s in private drug dens, bedrooms, and bathhouses, of course, that infectious germs have made their biggest comeback, contriving to get themselves spread by not-quite-private needles and genitalia. True, the germs incubated in abandoned houses, cardboard boxes, and other hovels have drifted into run-down urban hospitals, whose emergency rooms often provide primary care to the patients most likely to harbor the worst germs. But they haven’t moved much farther than that. Sharing the city, it seems, no longer means sharing smallpox and the plague.
The epidemiological facts—beyond serious dispute—support the complacent majority. Germs used to ravage young bodies with inexperienced immune systems; now they mainly take the old. Though death certificates still quite often record an infection as the final cause of death, germs now are mostly epitaph killers, moving in on immune systems terminally crippled by old age, heart disease, cancer, stroke, and Alzheimer’s.
In the pantheon of disease and death, lifestyle and genes have completely eclipsed germs. The great agent of social change today isn’t cholera; it’s cholesterol. It propagates via drive-through windows, not sewers. Crowds don’t flee the city when it strikes; they pay extra for a supersize serving. In the heyday of public health, public money went to clean up public filth. Today, we’re sick because we spend our private money buying bad health by the pack, the bottle, and the Happy Meal. Small wonder, then, that the germ-fighting social norms once ranked next to godliness still seem to many as antiquated as the whalebone hoops that defended Victorian virtue.
If we took the new microbes seriously, we could certainly beat them. The science for tracking, immunizing against, and annihilating germs grows more vigorous, innovative, and youthful with each passing year. But insurers and regulators now control how we use that science, and as the germ-phobic culture has decayed, they’ve grown increasingly slow and rigid. Many competent people at the top echelons of government worry deeply about this problem. Yet as they scramble to address it, they must hack their way through laws and bureaucracies that have accumulated and thickened since the 1960s. Technical know-how isn’t enough; collective will is also necessary. Yet—paradoxical as it sounds—collective will is what was lost as government took over the show.
Good sewers, public sanitation, and fresh water are undoubtedly public ends, best advanced by public means. Yet though germ killing begins in public, it must, as the Victorians grasped, end in private, and this is where the government’s attempt to take charge of everything has had terrible consequences. The Victorians had nothing but culture to wield against germs on private premises, so they taught that clean was virtuous, and dirty sinful, and they taught it very persistently to everyone. But the big, efficient, technocratic government agencies of our day don’t do virtue and sin—they requisition, stockpile, subsidize, proscribe, and mandate. And they teach—implicitly but persistently—that germs are government’s responsibility, not yours. Socialized germ killing makes it a lot easier for people to lose touch with the personal side of germicide.
When the government then tries to clean up human bodies with the same heavy hand that it uses to clean up the sewers, it can end up fighting both amnesiacs and those who remember too well. The forgetful push back because they are sick and tired of being hectored by the universal nanny about washing hands, vaccinating kids, and countless other time-wasting nuisances. The unforgetful seem to believe that the little routines and habits of daily life are too important to entrust to a nanny perched on the banks of the Potomac.
Consider, for example, the most important new vaccine recently licensed, to protect against the human papillomavirus (HPV). Developed by scientists at the National Cancer Institute, it’s designated a “childhood” vaccine, which will give it some (by no means complete) shelter from the tort lawyers. Merck licensed the vaccine, steered it through the FDA, and will be responsible if anything goes wrong. The firm is charging $300 or more for the three-shot course—10 to 100 times the inflation-adjusted cost of most vaccines in the 1950s. It may well be worth it. This new kids’ vaccine protects against a sexually transmitted virus that causes many cases of cervical cancer. To be effective, however, vaccination must occur before exposure to the virus, and each new sexual partner exposes a girl to a 15 percent chance of infection. The Centers for Disease Control therefore plans to see to it that girls are vaccinated before exposure, which, the federal authorities have concluded, means before they are 12.
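A quick, hedged calculation shows why the timing matters so much. Assuming, as a simplification the essay never spells out, that each new partner carries an independent 15 percent chance of transmitting the virus, the cumulative risk climbs quickly:

```python
# Back-of-envelope sketch of cumulative HPV exposure risk, assuming each new
# partner carries an independent 15 percent chance of transmission (an
# illustrative simplification; the essay gives only the per-partner figure).

PER_PARTNER_RISK = 0.15

def cumulative_risk(partners: int, p: float = PER_PARTNER_RISK) -> float:
    """Chance of at least one infection after `partners` independent exposures."""
    return 1.0 - (1.0 - p) ** partners

for n in (1, 2, 3, 5, 10):
    print(f"{n:2d} partner(s): ~{cumulative_risk(n):.0%} cumulative chance of infection")
```

By this crude arithmetic the odds pass fifty-fifty around the fifth partner, which is why the public health calculus insists on vaccinating before the first.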
Quite a few parents have concluded that the federal authorities can go to hell. The amnesiacs are beyond help; they’re probably skipping other vaccines, too. As for the mnemonists—maybe some just remember that sex spreads a lot of other germs, as well, and figure that they have a scheme to protect their little girls from all of them. Others probably aren’t making any conscious calculation about germs; they’re just holding fast to a faith and culture that still seeks to protect little girls from sex itself. People who believe government can achieve anything will say that it should just have handled this one more delicately. Perhaps—but the fact is, the government’s germ killers have ended up at loggerheads with the people they most need as their closest allies: parents who teach the taboos and rules that provide a crucial line of defense against the most persistent and clever killers of children on the face of the planet.
Even when it doesn’t reach the point of turning parents against vaccines, the government takeover has left many people with a triple sense of entitlement—to germ-free life, risk-free drugs, and wallet-free insurance. This in turn has created an almost profit-free economic environment for germ-hunting pharmaceutical companies, which still do much of the basic science and take charge of the essential, delicate, and difficult business of mass drug production.
The new drug law that President Kennedy signed on October 10, 1962, codified a profound change in attitude. With infectious diseases all but finished, the drugs of the future would target human chemistry. A horrified world had just discovered that one such drug, which effectively relieved morning sickness and helped people fall asleep, also halted the growth of a baby’s limbs in the womb. Before, when microbes were the enemy, drugs got the benefit of the doubt. After the 1962 Thalidomide amendments, the unknown cure was officially more dangerous than the known disease. Very strong evidence would be necessary to establish otherwise.
The 1962 drug-law amendments gave decisive weight to human tests and clinical outcomes: a drug would not be deemed “effective” without clinical trials in which human patients started out sick and finished up healthy. Progressing in little steps from that seemingly sensible starting point, the FDA has since reached the point of worrying more about drugs evolving into sugar pills than about currently innocuous germs evolving into plagues.
The FDA has long required that clinical tests demonstrate that a new antibiotic is as good as or better than one already on the shelf, and it wants these trials to be extremely thorough and convincing. It worries that with too few patients tested, statistical anomalies might allow an inferior antibiotic to win its license, and the new one might then become the benchmark for a third even worse one, and so on until the industry slouches its way down to licensed sugar pills. The agency calls this scenario “biocreep.” It’s just the sort of logical but overly theoretical concern that sneaks into government offices where the paramount objective is to avoid mistakes. But nowadays, “no mistakes” means no new drugs licensed until the germs start killing lots of people again. Most serious infectious diseases are rare, and it’s unethical to test a new drug on seriously ill patients when there’s an old, licensed alternative in easy reach. This makes it all but impossible to assemble enough sick-but-not-too-sick patients for statistically meaningful clinical tests.
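A toy calculation makes the “biocreep” worry concrete. None of the numbers below come from the FDA or the essay; they are invented solely to show the mechanism: if each new antibiotic need only prove itself no more than some margin worse than the current comparator, and each approved drug then becomes the next comparator, true efficacy can ratchet downward one licensed generation at a time.

```python
# Toy illustration of "biocreep" (invented numbers, not FDA policy): each new
# drug is allowed to be up to `margin` worse than the current comparator, and
# each approved drug then becomes the benchmark for the next one.

def biocreep(start_efficacy: float = 0.90, margin: float = 0.10, generations: int = 5):
    comparator, history = start_efficacy, [start_efficacy]
    for _ in range(generations):
        comparator = round(comparator - margin, 2)  # worst case the margin tolerates
        history.append(comparator)
    return history

print(biocreep())  # [0.9, 0.8, 0.7, 0.6, 0.5, 0.4] -- a slide toward sugar pills
```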
For example, the FDA recently had to decide whether to approve the use of Cubicin against a vicious germ that infects heart valves. Dubbed the “Darth Vader” bacterium, it is stealthy, difficult to kill, and almost always fatal if untreated. The FDA’s staff wasn’t convinced that enough patients had been tested to establish that the drug was better than the already existing alternatives—and argued that if it was too weak, the germ might evolve into something beyond Darth, resistant to everything. On the other hand, doctors were already using Cubicin off-label—prescribing it in ways never officially vetted or approved by the drug company or the regulator—but in low doses, thus possibly creating an even greater beyond-Darth risk. In the end, the chief of the FDA’s anti-infectives division overruled a staff recommendation and granted the license. But every such decision is a fight, and all the legal, political, and institutional cards are now stacked against quick, bold action.
A letter coauthored by a doctor at Harvard’s medical school and published in Clinical Infectious Diseases in 2002 bluntly linked the FDA’s approach to “the end of antibiotics.” “For nearly two decades,” it began, “antibacterial research has been the ‘Cinderella’ area in the pharmaceutical industry.” Increasingly stringent demands for proof of efficacy, framed in ways that seem “innocent and technical,” the authors argued, have thrown the industry “into a panic.” They have “wreaked irreparable damage to our ability to provide a reliable pipeline of new antibiotics for treatment of serious infections.” And they probably helped propel Lilly and Bristol-Myers Squibb out of antibacterial research and development.
Most fundamentally, the FDA has no mandate—none at all—to prepare us for war against the next cholera. Its mandate is to make sure that we don’t lose the battle against the next Thalidomide. “Safe” and “effective,” the two key standards set out in the 1962 drug law, have intelligible meaning only with a germ to fight and infected patients to fight it in. The agency, in other words, actually needs thriving germs to supply enough really sick patients to provide FDA-caliber evidence to validate the drug that will wipe out the germs. Epidemics that lurk in the future, waiting for germs still under construction, can’t be officially considered at all. But those germs—the ones that don’t yet exist, the ones still evolving, by chance or with human help in a terrorist’s lab—should worry us most.
Today’s public health guardians have made things even harder for vaccines than for antibiotics. Gene-splicing and other bioengineering tools make it far easier today than ever before to build safe corpses, cousins, or fragments of horrible microbes. But many vaccines that could quite easily be developed haven’t been, and probably won’t be, because no drug company will take them to market.
Political support for the FDA depends on the public perception that it’s solving problems. When widely administered, as it must be, a vaccine wipes out the disease that it targets. The disease can thus be eclipsed in the public’s mind by the vaccine’s side effects, however rare or even imaginary. The more effective the vaccine at the outset, the more likely it will be condemned as unsafe at the end. Moreover, the FDA has an acknowledged policy of being extra cautious in licensing any product that healthy people will use, especially children. Nothing could better suit the germs. Healthy children with undeveloped immune systems are their favorite targets.
Judges and juries have even more trouble balancing the interests of the individual who claims that a vaccine has injured him against those of the rest of the disease-free community. New legal standards formulated in the 1950s and ’60s made it much easier to sue vaccine manufacturers, and relaxed rules of evidence soon made junk-science allegations much more common than legitimate ones. When liability claims spiraled to the point where they threatened to cut off the supply of some vaccines entirely, Congress set up an alternative compensation system for children (though not for adults) actually injured by their immunizations, and imposed a broad-based vaccine tax to fund it.
But it was too little, too late. Enveloped in bureaucracy, the germ-killing segment of the drug industry has lost much of its flexibility, resilience, and reserve capacity—and has become painfully slow in developing what new science makes possible. Short-term economics and federal law have converged to create a systematic bias in favor of the germicidal drug invented and licensed decades ago. The principal fiscal concern is who should pay how much for the new, patented drug, or whether the old, cheaper generic might not do as well. The principal regulatory concern centers on the risks of the new drug, not the perils of the new germ. Insurers are cost averse: wherever they can, they favor cheap drugs with expired patents. The FDA is risk averse: when in doubt, it sticks with the old and says no to the new.
Research labs do continue to come up with new vaccines, but that decade-long process is now routinely followed by a second decade (at least) before a commercial product makes it to market. No one doubts that the extra time helps ensure that the vaccine eventually injected into millions of arms is safer and more effective than it might otherwise be. But the whole effort is long and costly, and the likely profits waiting at the end, many manufacturers have concluded, are very modest by comparison. In 1957, five important vaccines were being supplied by 26 companies. By 2004, just four companies were supplying 12. Mergers accounted for some of the attrition, but most of it resulted from companies getting out of the business. More than half of the vaccines routinely given to young children in 2005 came from just one manufacturer; four others had only two suppliers.
Having surrendered on all other social aspects of infectious disease, the health authorities now focus principally on socializing costs. Capping profits is the politically inevitable corollary. The federal government now buys over half of all vaccines used in the United States, and by taking on that role, it has effectively taken control of prices.
After the anthrax mail attacks in late 2001, federal authorities made it clear that if push came to shove, they would rescind patents, too. Just two years earlier, at the government’s specific request, Bayer had asked the FDA to approve and label Cipro for use against inhalational anthrax. The agency granted the license—the first ever for a drug for use in responding to a deliberate biological attack—in a matter of months, basing its approval not on human trials but on the antibiotic’s effectiveness in inhibiting anthrax in rhesus monkeys. Then came the 2001 mail terror. Demand for Cipro soared—and prices collapsed. Yes, collapsed.
It was politically unthinkable for Bayer to raise the retail price in pharmacies, and federal authorities immediately demanded huge discounts on the pills that they wanted to stockpile. The Canadian government initiated its price negotiations by announcing that it would ignore Bayer’s Canadian patent and order a million tablets of a generic version of the drug from another company.
A couple of years later, Congress passed the 2004 BioShield law. It is intended to create a federal stockpile of bioterror vaccines, and to that end, it empowers the Pentagon to bypass certain aspects of the 1962 drug laws. Those provisions have already been invoked once, to cut off litigation against military use of the anthrax vaccine. The federal government also offered almost $1 billion of BioShield cash for the development of a new anthrax vaccine and the provision of 75 million doses. But established drug companies just weren’t interested. The contract went to VaxGen, a tiny startup that had never brought a licensed drug to market and that proposed to supply a bioengineered vaccine that the army had already developed. VaxGen failed to deliver, and last December, the government canceled the contract.
A decades-old alternative with various problems continues to be provided by BioPort, whose declared mission is to develop products for use against infectious diseases “with significant unmet or underserved medical needs” and, most notably, “potential weapons of bioterrorism.” That would mean anthrax, botulism, Ebola, and smallpox, among other killers. BioPort employs about 450 people.
So what would a watchmaker have done—not a blind one, but one with keen eyes and an excellent loupe—if called upon to design a microbe that could thrive among people fortified by fistfuls of vaccines and backed up by dozens of potent antibiotics? Nature got there, without the loupe.
Humans had spent a painstaking century developing vaccines. So nature designed an “immunodeficiency” virus—an all-purpose anti-vaccine, so tiny, quiet, slow, methodical, and gentle that it spread unnoticed for decades, and so innocuous that it never quite gets around to killing you at all. It leaves that to the old guard—the bacteria, protozoa, and viruses that invade when your immune system shuts down, and feast on your brain, lungs, blood, liver, heart, bone marrow, guts, skin, and the surface of your eyes. In its final stages, AIDS is truly horrible.
When the blind watchmaker has been pulling such stunts for 4 billion years, it’s reckless to suppose that HIV was its last or worst. Quite the contrary: our casual willingness to tolerate a septic underclass, so long as it remains insular and out of sight, is certain to hasten the rise of much more and much worse. People as negligent with pills as they are with germs have already helped spawn drug-resistant forms of tuberculosis, by taking enough medicine to kill weaker strains, while leaving hardy mutants alive to take over the business. HIV patients who don’t strictly follow the complex, unpleasant drug regimen used to suppress the virus become human petri dishes, in which microbes multiply and evolve to resist the stew of antiviral and antibiotic drugs prescribed as a last resort.
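The selection dynamic behind such resistant strains is simple enough to sketch in a few lines. The figures below are invented for illustration only, not drawn from the essay or from any study: an incomplete course of treatment kills nearly all of the susceptible bacteria, barely touches a rare resistant mutant, and lets both regrow between doses.

```python
# Toy model of resistance selection under an incomplete course of treatment.
# All numbers are invented for illustration only.

susceptible, resistant = 1_000_000, 10          # start with a rare resistant mutant
KILL_SUSCEPTIBLE, KILL_RESISTANT = 0.99, 0.05   # per-dose kill rates under the drug
REGROWTH = 2.0                                  # both populations double between doses

for day in range(1, 8):                         # a week of half-hearted treatment
    susceptible = int(susceptible * (1 - KILL_SUSCEPTIBLE) * REGROWTH)
    resistant = int(resistant * (1 - KILL_RESISTANT) * REGROWTH)
    share = resistant / (susceptible + resistant)
    print(f"day {day}: resistant share = {share:.1%}")
```

Within a week, by this cartoon arithmetic, the weaker strains are gone and the hardy mutants own the patient.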
The new infectious diseases are already very good at this sort of adaptation—they have evolved to be as nimble as we are now institutionally stolid, as flexible as we are rigid. The influenza virus evolves exceptionally fast, using pigs as its genetic mixing bowl. HIV mutates constantly—one drug-resistant strain of the virus now apparently depends for its survival on a chemical constituent of the drug widely prescribed to stop its advance. Stubbornly persistent pelvic infections are on the rise, along with drug-resistant gonorrhea and even syphilis. A certain staph bacterium responsible for the most common infection acquired in hospitals has developed ways to pass the gene that produces a lethal toxin from one strain to the next and also to certain viruses that can spread it further still.
It’s because the threat is so grave that one must avoid the temptation to propose a simpleminded checklist of reforms to shoehorn somewhere into the middle of the next 1,000-page revision of the federal drug laws and FDA regulations. Germs are terrorists: they let the dead past bury its dead, they are always changing, and the ones you know aren’t the ones that will kill you. If we somehow revive a tough, germ-fearing culture, the risk-averse drug regulators, penny-pinching insurers, overreaching judges, clueless juries, and preening, industry-bashing congressional committees will fall into line. If we don’t, no tinkering will make much difference.
What we need is quite simple. We need many people to be much more frightened than they currently are. And we need a robust, flexible, innovative portfolio of drug companies to sink a lot of new capital into highly speculative ventures, almost all of which will lose money, with just one or two ending up embraced by regulators, eagerly paid for by insurers, vindicated every time by judges and juries, lauded by nonconformist preachers, and so spectacularly profitable for investors that they crowd in to fund more.
If we can’t drum up concern by other means, some dreadful germ will materialize and do the job for us. Nobody knows which one; that’s why we so desperately need the right popular culture and vigorous private enterprise. If the germs in the tattoo parlors today were both virulent and untreatable with current medicine, you wouldn’t be reading this, at least not in the heart of any big city. You’d be heading for the country.
That’s what the rich did when epidemics struck in Dickens’s day. They knew what they were fleeing—the urban pathologies described in Our Mutual Friend in 1864 were as familiar to Londoners as the Thames. And familiar not just to the boatmen who made a living fishing human corpses out of the river but also to the middle class, decimated by a violent cholera outbreak in Soho at the end of August 1854; to the entrepreneurs who made fortunes collecting and sorting mountains of trash; to members of Parliament, who, in June 1858, had to evacuate the House of Commons to escape the pestilential stench of the river; and to Queen Victoria, who lost her husband to another waterborne disease, typhoid fever, in 1861. Small wonder that cholera was a great agency for social change. In the time of cholera, the bacterium itself loved everyone.
Anthrax prefers goats; it finds its way into human bodies only very occasionally, through open wounds. The spores can be inhaled, too, but ordinarily they clump together and don’t spread well through the air. They become mass killers only when people painstakingly coat them with other materials and take special efforts to disperse them. The spores that struck 11 Americans (and killed five) in Washington, D.C., and New York in late 2001 weren’t dispersed through the Potomac or Hudson Rivers; they arrived by U.S. mail. A few pounds, suitably prepared and dispersed in the New York subway, could kill 100,000 people. If cholera is a social disease, weaponized anthrax defines the antisocial bottom of contagion—it’s a microbe that infects humans only with the help of sociopaths.
But we live in an age of sociopaths, and there remains much that we don’t know about germs. Viruses and prions may play a far larger role in genetic malfunctions than we yet fully understand. HIV and influenza demonstrate the boundless viral capacity to mutate and evolve. And while anthrax could never make it on its own in New York, murderous people are scheming to give it help. One way or another, germs will contrive to horrify us again, in some very nasty way. A society’s only real defense is to stay horrified, well ahead of the curve.
Peter W. Huber is a Manhattan Institute senior fellow. His books include Hard Green: Saving the Environment from the Environmentalists and Galileo’s Revenge: Junk Science in the Courtroom.
Overuse of Antibiotics: Creating Monsters Out of Common Microbes
Health officials are alarmed by the evolution of the causative organisms of common diseases into ‘superbugs’. This, they warn, is the result of the overzealous use of antibiotics.
One such example is the recent announcement by the U.S. Centers for Disease Control and Prevention (CDC) that gonorrhea has become resistant to fluoroquinolone antibiotics. The only class of drugs left for treatment is the cephalosporins, a fact that has many health experts worried.
Says Dr. Kevin Fenton, Director of CDC’s National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention: “There is an urgent need for new, effective medicines to treat gonorrhea. We are running out of options to treat this serious disease. Increased vigilance in monitoring for resistance to all available drugs is essential.”
So where does the problem lie? Antibiotics are among the most profound achievements in medicine, but decades of widespread, indiscriminate use in cosmetics, soaps, and food-animal production have rendered many of them ineffective.
The practice goes on. Last month, for example, the Food and Drug Administration, despite warnings from its own experts and from health groups, gave the green light to treating cattle with an antibiotic that is a fourth-generation version of the one the CDC now urges for gonorrhea. Though the company that makes this antibiotic argued that similar drugs have been used in animals in Europe without harm, recent data indicate that bacterial resistance has grown not only in livestock but in humans as well.
Both government and private health organizations seem to agree that careful, limited use of antibiotics is crucial to public health as microbes become ever more resistant to them. Still, the spread of resistant gonorrhea is one sign that this consensus has yet to be put into action.
Source: Medindia