A Plague on Both Your Houses

The plague – nothing conjures up death quite like that word. After all, the bubonic plague wiped out perhaps half of Europe's population in the 14th century. But there have been others of its ilk, and all probably caused by the same bacterium, Yersinia pestis. Although the then-unnamed infectious agent was identified in the 1890s by the bacteriologist Alexandre Yersin – working at the time for the Pasteur Institute on samples from a plague outbreak in Hong Kong – the name was initially misattributed… Never work for somebody really famous when you discover something important. Personally, I preferred its previous name of Pasteurella pestis, because that's the name I was first taught and I liked the alliteration. But never mind.

The plague has three different presentations, depending upon the organs infected: bubonic plague, from infection of the lymphatic system, localized as buboes (swellings of infected lymph nodes which may become necrotic and turn black in their attempt to defend the body); pneumonic plague, an infection of the lungs, presumably acquired from aerosolized droplets from coughing or the like; and the rarest and likely most fatal of the three, septicaemic plague, an infection of the blood stream. Classically, the bacterium is carried by fleas, which are carried by rats, which then carry it to us.

Although we tend to associate the word ‘plague’ with the infamous ‘Black Death’ of European fame – not least because of the shock value of its name, I suspect – there have been several plagues throughout history. The first was originally thought to have been as early as 430 BCE in Athens, but a study published in the journal Cell in 2015 suggests that it began long before that – about 5,353 years before, actually. But perhaps a more accessible article that outlines the background is found in a BBC news report, also from 2015: http://www.bbc.com/news/health-34603116

‘Samples taken from the teeth of seven bodies contained traces of the bacterial infection in the Bronze Age. They also showed it had, at the time, been unable to cause the bubonic form of plague or spread through fleas – abilities it evolved later.’ You have to love this kind of information, eh?

‘In its early days, it could cause only septicaemic or pneumonic plague – which is nearly always deadly and would have been passed on by coughing. By analysing the bacterium’s genetic code through history, the researchers estimate it took until 1000 BC for plague to evolve into its more familiar form. One mutation – acquiring the ymt gene – allowed the bacterium to survive inside the hostile environment of a flea’s gut. […] Developing a separate gene, called pla, allowed the infection to penetrate different tissues and cause bubonic plague.’

But all things change, don’t they? Things fall apart; the centre cannot hold, in the unforgettable words of Yeats. And yet why would a pathogen evolve to destroy the very hosts on which it depends? Why burn the hotel…?

I suppose an easy explanation might be that of a game in which each side – host and pathogen – continually attempts to outsmart the other. More virulence in the invader leads to more defensive measures in the invaded – measures as overt as quarantine or antibiotics, or as subtle as the hopefully preventative building of immune resources through vaccination and, over the longer term, the adaptation of endogenous immune defences: survival of the fittest.

But for me, the intriguing and still unanswered question remains: why kill your host? Why not coexist as, say, a parasite – or even a commensal – in the gut, or create a chronic condition that might weaken the owners but not eliminate them? Of course, some pathogens are just evolutionary dead-ends – fireworks that illuminate the sky briefly and then disappear as suddenly as they appeared, or maybe finally settle into a desk job and plod along just under the radar. But I suppose even germs want some time on the pedestal. Nothing ventured, nothing gained… Ecological opportunities beg for exploitation – leave a window unlocked, and something will find it.

Of course there are other ways of making a living: attack and retreat to fight again… While not strictly analogous, I am reminded of the Champawat tiger of Nepal (and later of the Kumaon district of India) in the late 19th and early 20th centuries. She used to attack suddenly and then disappear before anybody could do anything about her. True, she was finally shot, but not before she’d managed to kill almost 450 people in different locations and instilled fear of her return for years. Fear is like that – especially fear of what Donald Rumsfeld (a onetime U.S. Secretary of Defense, remember?) oxymoronically called the ‘known unknowns’.

The plague has managed a similar trick over the centuries, flaring up in one region, only to hide, then reappear in a totally different region later –often much later. ‘The most recent plague epidemics have been reported in India during the first half of the 20th century, and in Vietnam during wartime in the 1960s and 1970s. Plague is now commonly found in sub-Saharan Africa and Madagascar, areas which now account for over 95% of reported cases (Stenseth, 2008)’ [https://www.cdc.gov/plague/history/index.html]

But even those of us living in North America are not entirely safe. Remember that Hong Kong plague Yersin was studying in the 1890s? A ship from there arrived in San Francisco in the summer of 1899 with plague found among some stowaways, two of whom escaped and drowned in the Bay. An epidemic of plague hit San Francisco nine months later. Whether it came from them or from rats that swam ashore is not known, but the disease has been with us ever since.

http://www.livescience.com/51792-plague-united-states.html  ‘Plague cases occur sporadically in the United States — between 1970 and 2012, an average of seven plague cases occurred yearly […] But plague cases don’t show up everywhere. Rather, most occur in rural areas in western states […] the CDC says. One reason why cases of plague are restricted to the West is that the rodent populations there carry the disease […] “Prairie dogs are one of the major rodent species that serves as a reservoir for plague, and they tend to be west of the 100th meridian” in the United States. For this reason, this line of longitude is sometimes referred to as the “plague line”.’

What, will the line stretch out to th’ crack of doom? asks Macbeth. I suspect he would have found it fascinating that any of us would think we might be immune from history. And yet, despite all its bad press and the terrifying epithet of ‘Black Death’, plague cases in North America are rare. They can occur when people visit rural areas, says Dr. Adalja, an infectious disease specialist at the University of Pittsburgh’s Center for Health Security, although ‘people are more likely to be infected with tick-borne illnesses such as Lyme disease, than plague.’

Uhmm, I’d be careful with squirrels in California, though…

Kegel Exercises in Pregnancy

Okay, okay, I was wrong! It happens. Sometimes the brain gets in the way of scientific studies –prejudges them. Alters them in little ways so they do not conflict with its own opinions. Or, worse still, is influenced by a confirmation bias that precludes even the perusal of any information that makes it uncomfortable. The brain can be its own editor, redacting reams of otherwise useful knowledge, recusing itself inappropriately. None of us readily admit guilt in this respect, of course. In a sense, we are blind to it… or want to be.

I’m a gynaecologist as well as an obstetrician, so I have long been aware of the value of strengthening the pelvic floor muscles to prevent, amongst other things, urinary incontinence. There is a set of muscles – the levator ani muscles – that act as a kind of pelvic platform and help support the various organs that transit through the area, notably the bladder, uterus, and rectum. Exercising them was proposed by a Dr. Kegel in 1952, albeit to strengthen their ability to narrow the vagina and hence the ease of orgasm. A more frequently admitted use, however, is to reduce urinary incontinence. Indeed, to find the correct muscle for training, a woman need only attempt to stop her urinary stream – the muscle she uses is the one to exercise.

Prominent among the levator ani muscles is the pubococcygeus muscle. (The name merely describes where the muscle starts, the pubic bone, and where it ends, the coccyx, or tail-bone. On its journey it wraps around first the urethra – the tube that empties the bladder – then the vagina, and finally the rectum, like a series of hammocks.) The fact that strengthening it can constrict the vaginal diameter when contracted has always been a kind of two-edged sword for those of us who deliver babies. On the one hand, there is some fairly longstanding and convincing evidence that it can indeed help to prevent the involuntary loss of urine (urinary incontinence). But remember that it not only helps support the bladder and its opening, it is also a hammock that supports and constricts the vaginal canal. Well, that’s what the baby has to squeeze through… So, does the one benefit become a detriment to the other? Are you robbing Petra to pay Paula?

I have to admit that I was one of the exercise skeptics; it made sense to me that the stronger the muscles that surround the vagina – the greater their bulk – the narrower and more difficult the passageway for the baby to pass through at delivery. At the very least, I reasoned, it would take a greater effort on the part of the mother to force her baby through. And all this at a time when she is already exhausted from her labour. Maybe it would make more sense to work on strengthening those muscles in the weeks and months after delivery. Everything in the area was stretched or torn from the effort of actually pushing the baby’s head out, so perhaps the benefits would accrue if those muscles were strengthened then – a sort of postpartum rehabilitation.

In other words, would strong pelvic floor muscles increase complications in either labour or birth? Would there be a higher incidence of Caesarean sections, for example? Or of the need for episiotomy (cutting the skin at the opening of the vagina) to allow more room for the baby’s head to descend? Would there be a greater need for so-called operative delivery (forceps or vacuum extraction)?

Well, here’s where information from large studies is more helpful than personal experience. Each of us carries a bias – acknowledged, or buried deep within our own reminiscences of similar situations. If I, for example, believe that Kegel exercises are a hindrance to normal delivery, I am more likely to remember any episodes in my career where that might indeed have played a role – unaware of, or maybe conveniently forgetting (or not even asking about), times when it didn’t. Confirmation bias again. Limited, or selective, observations are not necessarily a valid reflection of the collective reality. They amount to opinions, not proof, and carry only as much weight as the prestige of the propounder allows. In my case, it was never very much…

The benefit of Kegel exercises in pregnancy remained somewhat controversial in the obstetrical community – at least amongst us iconoclasts – until some Norwegian researchers, notably Kari Bo at the Norwegian School of Sport Sciences, decided to investigate it in a large group of primiparous women (18,865 of them) who practiced Kegel exercises at various weekly frequencies during pregnancy. The group then looked at the outcomes and complications of their labours and deliveries: http://www.ncbi.nlm.nih.gov/pubmed/19461423 There was no difference in outcomes between those who did Kegels religiously in pregnancy and those who did not. Presumably the pelvic floor muscles – as strong (and bulky?) as they had become – were able to relax enough to allow normal passage of the baby.

I learned a lot from that paper – and a lot about the way my beliefs interpret my experience. A lot, too, about the way many of us travel through our lives, influenced as we are by only limited familiarity or exposure to events, and drawing perhaps unwarranted – or at least unproven – conclusions from them. That is inductive reasoning, with all of its inherent uncertainty: generalizing, in other words – probabilistic forecasting from limited available data – where deriving reliable conclusions from a sufficient number of observations can be a problem. An example sometimes given is this: all the swans I’ve ever seen have been white, so it would seem reasonable to conclude that all swans must be white… until, that is, I see a black swan. Obviously, any one person’s experience must be limited, so any conclusions derived from it must also be limited.

All generalizations are false, including this one, as Mark Twain famously observed. I’m not sure I’d go that far, though. I think George Bernard Shaw was closer to what I have learned about depending on one’s own experience to the exclusion of competing views: Beware of false knowledge; it is more dangerous than ignorance.

The Peanut Trap

You know, there are times when the cart should precede the horse and not follow blindly behind it along the same old paths. We are too often seduced by the roads that others have made simply because we know where they go and what we might reasonably expect to encounter along the way. The problem, of course, is that another reasonable expectation of taking the same route is ending up in the same place.

Sometimes the detours that others have taken, whether through ignorance or design, have ended up in far more interesting places. More significant destinations. We might do better if it was them we followed, not the horse.

Childhood allergies have always puzzled me: are they all genetically determined, or is there something else going on? Because their prevalence seems to be increasing – especially peanut allergy. As an editorial in The New England Journal of Medicine (Feb. 26, 2015) suggests ( http://www.nejm.org/doi/full/10.1056/NEJMe1500186 ): ‘In the United States alone, the prevalence has more than quadrupled in the past 13 years, growing from 0.4% in 1997 to 1.4% in 2008 to more than 2% in 2010. Peanut allergy has become the leading cause of anaphylaxis and death related to food allergy in the United States.’ But why?

As the editorial goes on to say: ‘In 2000, largely in response to outcomes reported in infant feeding trials conducted in Europe and the United States, the American Academy of Pediatrics (AAP) recommended that parents refrain from feeding peanuts to infants at risk for the development of atopic disease until the children reached 3 years of age.’ And yet despite that, the number of cases of peanut allergy still continued to rise…

So, ‘In 2008, after reviewing the published literature, the AAP retracted its recommendation, stating that there was insufficient evidence to call for early food avoidance.’

Then, a fortuitous observation: ‘Du Toit et al. noted that the prevalence of peanut allergy among Jewish children in London who were not given peanut-based products in the first year of life was 10 times as high as that among Jewish children in Israel who had consumed peanut-based products before their first birthday. In addition, subsequent studies that evaluated the early introduction of other allergenic foods, including egg and cow’s milk, showed that earlier introduction of egg and milk into an infant’s diet was associated with a decrease in the development of allergy.’ A different path.

The subsequent study, known as LEAP (Learning Early About Peanut allergy), was a truly courageous and, dare I say, lodestar one: http://www.nejm.org/doi/full/10.1056/NEJMoa1414850

The investigators hypothesized that early introduction of peanut-based products (before 11 months of age) would lead to the prevention of peanut allergy in high-risk infants. ‘More than 500 infants at high risk for peanut allergy were randomly assigned to receive peanut products (consumption group) or to avoid them (avoidance group). Approximately 10% of children, in whom a wheal measuring more than 4 mm developed after they received a peanut-specific skin-prick test, were excluded from the study because of concerns that they would have severe reactions. At 5 years of age, the children were given a peanut challenge to determine the prevalence of peanut allergy. The results are striking — overall, the prevalence of peanut allergy in the peanut-avoidance group was 17.2% as compared with 3.2% in the consumption group.’ As a result, the study found that ‘the early introduction of peanut dramatically decreases the risk of development of peanut allergy (approximately 70 to 80%).’ The immune system – and what it considers ‘foreign’ and hence dangerous – develops early in life. Early exposure to something may therefore render it more acceptable – not an allergen…
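Just to spell out the arithmetic behind that last figure (a back-of-the-envelope check of my own, not a calculation taken from the paper): dropping from a prevalence of 17.2% in the avoidance group to 3.2% in the consumption group is a relative reduction of (17.2 − 3.2) ÷ 17.2 ≈ 0.81, or roughly 80%, consistent with the ‘approximately 70 to 80%’ quoted above.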

But, as in all scientific inquiries, one has to be careful not to generalize the results too liberally. Perhaps, despite all the precautions taken to be representative and mindful of conditions that may differ among the populations sampled for the study, the finding may not obtain universally. For example, the editorialist points out: ‘Should we recommend introducing peanuts to all infants before they reach 11 months of age? Unfortunately, the answer is not that simple, and many questions remain unanswered: Do infants need to ingest 2 g of peanut protein (approximately eight peanuts) three times a week on a regular basis for 5 years, or will it suffice to consume lesser amounts on a more intermittent basis for a shorter period of time? If regular peanut consumption is discontinued for a prolonged period, will tolerance persist? Can the findings of the LEAP study be applied to other foods, such as milk, eggs, and tree nuts?’

All good questions – and ones, it should be noted, that troubled the authors sufficiently that they have designed a follow-up study to address them: ‘The question of whether the participants who consumed peanut would continue to remain protected against the development of peanut allergy even after prolonged cessation of peanut consumption requires further study and is under investigation in the LEAP-On study (Persistence of Oral Tolerance to Peanut; ClinicalTrials.gov number, NCT01366846).’

I am impressed with the study because of the determination of the investigators to journey down a road not well travelled – in this case, away from the sure and certain one that Society had come to believe in: that you should avoid all possible allergens from birth or risk irreparable damage to your child. Once something has been generally accepted, to transgress, to wander from the road, is anathema. Folkways have a habit of becoming incontrovertible Laws. Unexaminable tenets. Incontestable. Indisputable.

We humans are like that: we follow the horse because it’s easy. And if the route it takes is falling into disrepair, we merely shrug because we know we are on the road we are supposed to be on. The one everybody knows is safe… But it is a trap – almost like that well-known conditioning experiment with pigeons: when one happened to turn around a few times before the food arrived, it continued to do so each time before it got fed… Why take any chances?

In the case of allergens, it seems less silly, though. Less arbitrary. If one child develops a severe allergy to, say, peanuts, most people would feel it reasonable to avoid exposing the next child to peanuts. It may not work, but why take any chances? To do anything different would seem madness. Everybody would agree with that…

I am reminded of something Mark Twain once said: ‘Whenever you find yourself on the side of the majority, it is time to pause and reflect.’ There is, no doubt, a degree of wisdom in the crowd, but we must beware of conflating the wisdom of our crowd with its blindness.