Human Embryology -Ontologically Awkward Thoughts

Ontogeny recapitulates phylogeny: the developing foetus passes through stages resembling those its species passed through in evolution -Haeckel’s biogenetic law. It’s a phrase -one of the few- I remember from my embryology classes in medical school. It’s a bit simplistic, of course, and long since discredited, but nonetheless illustrative of some phylogenetic similarities in development. It demonstrates the potential for variation in the developmental effects of various controller genes. It also demonstrates just how far embryology has come since the days when lecturers were forced to bore their classes with soporifically-presented, unpronounceable and mysterious Latin names.

In those earlier than early days, I had no inkling that I would end up as an obstetrician and be involved, however vicariously, in the process. Unfortunately, I used to fall asleep in the embryology classes -it’s where I learned to nap with my head carefully balanced upright on my shoulders. I didn’t care what I was missing; in those years, I suspect it was not very much. Nowadays, though, everybody seems to be curious, and I often get asked about baby development. Happily, I don’t have to resort to showing the parents my old, retired embryology textbook with its overly complicated diagrams that puzzled me even then. Now I have a colourful chart on the wall of my examination room that illustrates -simply, and in an appropriate, real-life size- the developing foetus at various gestational weeks.

Those diagrams don’t even attempt to show any of the amazing features in evolutionary development, though, so I am constantly on the lookout for aspects of it that were not appreciated in my student days. Of course, in those heady times I well might have missed some stuff…

For example, the ear-bones of mammals (humans are mammals, remember) have an interesting history. There are three tiny little bones in the middle ear -ossicles- that transmit sound energy from the ear drum to the inner ear, amplifying the vibrations (and damping them when sounds get too loud). Reptiles and various other creatures only have one, the stapes (‘stirrup’ in Latin). Where did our two extra bones come from? Well, those same reptiles (also including birds, dinosaurs, etc.) have a few extra bones in the joints of their jaws which we mammals have co-opted for our middle ears: the incus (‘anvil’ in Latin) and malleus (hammer). Nature doesn’t waste things -the innovation is called exaptation: the process of taking structures that evolution has already produced and re-purposing them. And yes, I guess that means we have to chew differently than dinosaurs, but hey…

Other things persist in us with diminished or unknown function. I’m thinking here of such structures as the appendix. Most of us only know about it if we have to have it removed. Mine had to be taken out when it ruptured in first-year medicine while I was in the anatomy lab -my only regret was that we hadn’t toured that far in our cadaver yet so it was still terra incognita for us. But we all figured it was once a digestive organ (I mean look where it is, eh?). There is a suggestion (C. Choi in Live Science, 2009) that even now the appendix might harbour important bacteria that can reconstitute the gut flora after something like a course of antibiotics has disturbed the normal distribution and population of its micro-organisms. We know that normal bowel bacteria have many functions, including a role in the immune system, so it would be nice to think the appendix might help refurbish our defences. Re-arm us. Unfortunately, that may mean I’m all on my own now… Naked, sort of.

And, of course, there is the caecum (Latin for ‘blind pouch’) to which our appendix is attached. In some herbivores it’s much larger, and it is a hotbed of bacteria that can hydrolyse cellulose. Without them, grass is about as nutritious as cardboard, and I’m assuming our little dead-end package is why we no longer enjoy grazing very much.

I’m intrigued by the coccyx as well. It is the lower end of the spinal column and is what forms the tail in other animals. Believe it or not, the human foetus has a little tail at about days 31-35 of development (I had to cull this one from Wikipedia, I’m afraid, because my old embryology text was fashionably non-committal on the subject. Uncomfortable, even.). I also learned from the same source that because some muscles still attach there, the coccyx hasn’t yet disappeared entirely. Maybe it’ll be gone in a few millennia -if we’re still extant as a species, that is… and don’t discover that we need a tail again.

Finally, I have to confess I have a personal interest in wisdom teeth. Since I don’t have any, and never did, I have always felt cheated. Diminished, as it were, by the thought that I was given neither the wisdom to which I, as a human, was entitled, nor the teeth as a little bonus. A dentist once tried to reassure me that not everybody has them and, in an attempt to impress me with the fact that she had recently graduated with honours from dental school, added that their absence is probably related to the functioning of a gene called PAX9. This only depressed me further, however, and immediately suggested to me that either I didn’t get the gene, or that mine was malfunctioning. She tried a different tack (attempting, no doubt, to distract me from what she was doing in my mouth). Third molars (she had to explain that she was using dental-speak for ‘wisdom teeth’ when I started to salivate excessively) were probably how our ancestors dealt with the plants they were eating. They were grinders. But I was lucky, she continued, probing deeper into my gum, because we changed our diets and evolved smaller jaws. So nowadays there’s no room for the extra teeth. They are vestigial and have to be pulled out, in other words. My persisting skepticism must have been obvious, however, and I could see her mask change shape. Given the drill in her hand and the impatience in her eyes, I pretended to be convinced and dropped the subject. I still worry, though.

Embryology, like its eponymous subjects, has evolved over the years. It has become more interesting and certainly accumulated more frills. DNA and ever-advancing technology have dragged it into social media and otherwise polite conversation -our ancestry titillates us like never before. I mean I could probably even stay awake in the classes nowadays. But all the same, the skill I developed for napping unawares shouldn’t be dismissed as vestigial; it is still useful. I look upon it as an exaptation of my neck muscles and inner ear, an evolutionary strategy for surviving some of the appurtenances that continue to bedevil us in the modern age: there are still concerts I would rather not attend. And as for meetings…


Are Human Rights Contingent?

Here’s a question: are human rights contingent or emergent? In other words, do they depend on the circumstances and the cultural milieu? Do they depend on the context in which the events in question are embedded, and maybe even the time-frame –the Epoch? Or do they exist a priori, no matter what the circumstances, and so transcend any particular event? An existential consideration, to be sure, but an important one.

If I read history correctly, human rights likely began contingently. Indeed the very word ‘right’ suggests an origin as something granted to someone under specific circumstances. They were guaranteed by whoever possessed sufficient power to do so. And as time passed, these rights grew –evolved- and the context enlarged. So, when the power became invested in the populace at large, the rights took on a life of their own and seemed to be inalienable and self-evident -no longer granted, but obviously existing: they were emergent. They accreted new considerations and enlarged, and are evolving still.

Rights continue to be a moving target. As societies mature, their foundational principles come to be read in a new light. Emergent or contingent..? Alas, as we add new limbs to the ever-evolving animal, we tend to be revisionist and judgmental about its more embryonic forms, even though we’re seeing them through modern eyes, and modern prejudices…

And yet some things seem almost suited to temporal bigotry. I’m thinking here of the current controversy over Trinity Western University –a private, Christian institution- and its decision to open a law school: http://www.huffingtonpost.ca/2014/04/11/trinity-western-university-law-school-bc_n_5134482.html This came on the heels of a 2001 decision by the Supreme Court of Canada rejecting a challenge by the British Columbia College of Teachers, which had claimed that the university shouldn’t be able to grant teaching degrees because it opposed gay unions. And, as the article outlines, at the time students were required to sign an agreement not to engage in activities that were “biblically condemned,” including “homosexual behaviour.” The agreement is now known as the ‘Community Covenant’.

The controversy over the proposed law school is not so much the result of any official opposition -the Law Society of British Columbia, the Federation of Law Societies of Canada, and B.C.’s Advanced Education Ministry have all given at least preliminary approval- it is rather that the gay and human rights community questions the suitability of the university to teach supposedly neutral and unbiased Law when it seems to have a fundamental difference in values from much of the society outside its walls. The Huffington Post article reports that the president of the university, Bob Kuhn, is up front about it: Kuhn said prospective students aren’t asked about their sexual orientation during the application process, and he said all students are welcome at the school — as long as they agree to abide by the community covenant. “If a gay or lesbian or bi student wished to come to Trinity Western University and wished to comply with the community covenant as it’s written, then there’s no problem,” he said.

True, students who disagree with the tenets of the university need not apply –there is no compulsion at stake, merely a principle. But isn’t that what should matter the most? Surely those who accept principles that discriminate against minority positions must not expect neutrality at the end. If nothing else, the example it sets is not a balanced one.

My worry is not that an excellent academic institution like Trinity Western would fail to adhere competently and rigorously to all legal requirements, but rather that the underlying spirit of its presentation would be framed in a context that does not reflect the diversity of the multicultural society it must serve; it is a madrassa that pretends to inclusivity.

I’m sure that a graduate of the school would attempt neutrality in the real world. After all, she would be free to refuse a case if it conflicted with her values and refer it to a colleague who perhaps would view the issue in a different light. The law can be a demanding master.

And yet I can’t help but wonder if those less than democratic values that necessitated a ‘community covenant’ are the thin edge of a wedge that threatens to skew the defence of what we have come to regard as basic societal rights and nudge them into an unspoken yet contingent position again. Rights are fragile enough as is. Although guaranteed in law or constitution, they still require more than fear of punishment to ensure they are respected. We all have to feel they are important. And not just when there is no cost to us or society at large. Sometimes there are sacrifices required; sometimes we have to swallow our distaste and step back a little to think about the ramifications of not applying the right equally and under all circumstances.

But rights are about justice, not punishment; laws are about consensus, not fear. And neither rights nor law should live in fear of circumstance. Even Shakespeare recognized this so many years ago and in such a different time:

We must not make a scarecrow of the law,
Setting it up to fear the birds of prey,
And let it keep one shape, till custom make it
Their perch and not their terror.


A Medical Chinese Curse?

Change. We are condemned to live in interesting times, as the Chinese Curse purportedly observed -although there seems to be no evidence that there ever was such a curse, nor does anyone appear to have any idea what it means… But I have always assumed that it had to do with change, and our sometime antipathy to it. Of course things have always been in flux, but it’s only over the last hundred years or so that the pace has seemed exponential. We’ve had to accustom ourselves to continuing and accelerating change and have come to expect that next year -if not tomorrow- may be significantly different from today. But although gradual change is readily assimilable, when the difference is abrupt or requires a significant adjustment we often rebel. Habits die hard. After all, an assumption of predictability and stability is what allows civilizations to function, groups to cohere.

It is under just such conditions that a fundamental dichotomy arises, however: knowledge is the enemy of stasis, and progress requires modification, however incremental –a sea change into something rich and strange, as Shakespeare wrote. Change often comes upon us like contagion on the wind: pollution with an unknown virus. A plague we have never experienced and against which we have no defence. No immunity. And yet, like the Sirens’ song that lured ancient mariners onto the rocks, it is seductive.

It’s hard to know what to make of Change. Not all of it is good; some of it is mere revision. As the poet Robert Frost once observed: ‘Most of the change we think we see in life is due to truths being in and out of favor’. Some of it, however, inevitably represents real advances, or even revolution -think of the concept of the paradigm shift popularized by Thomas Kuhn in his nineteen-sixties book The Structure of Scientific Revolutions. Something is conceived that is so new, so different from what we had come to believe, that it turns our belief system inside out. The change, to be adopted, has to offer distinct advantages over the old, of course; it has to be worth making the effort. Reinvesting. But it is not adopted without a struggle from those with vested interests or careers dependent on the old knowledge…

There are many such changes occurring in Medicine. Some are lauded and universally appreciated: a new treatment for cancer, say, or a fresh insight into the cause of a disease. Others, seemingly trivial, go unremarked -or at least unflagged until pointed out. And yet they may represent paradigm shifts in their own right. For example, new ways of looking at the problem of hospital infections and the spread of resistant bacteria. Hand washing, and the alcohol-based hand-rub dispensers situated outside each patient room along the corridor, are being widely adopted and are an important component of the containment effort.

But there is another approach that, now that I think about it, should have been equally obvious. I first read about it being mandated in some UK hospitals and filed it away somewhere as being a good idea. Now it is being considered here in North America, and none too soon:

Hang Up Your Lab Coat (What Not to Wear — for Patient Care) 

It’s so obvious when you stop and think about it, isn’t it? What is worn from room to room, brushing against patients, rubbing on bedclothes, and stained by anything and everything that it touches? The white lab coat, of course! We see them so often in our hospitals, we’ve come to expect them. And we all know who wears them: lab techs shuffling along the halls, doctors hurrying from room to room, senior nurses… it’s a virtually ubiquitous sight in a hospital. An expectation. And yet, no matter how often and diligently the doctor -or whoever- washes his hands between patients, no matter how devoted to cleanliness, no matter how motivated, if he drags his lab coat -his uniform- from room to room, he’s like a germ duster. A fomite. A Johnny Appleseed for our times. And the admonishment of Bare-below-the-elbows, as the link suggests, makes sense too: it assumes the lab coat has been hung up outside the door. But it’s also one of those things that is clear in retrospect, but almost invisible unless pointed out.

Consider it pointed out.


A Picture is worth…

Communication -explanation- in Medicine is so important that one might even consider it paramount. Except in circumstances where the patient receives a treatment of which she is unaware because of an accident or the severity of the illness, her understanding of the reasons for the therapy and the side effects it may engender often determines whether or not the prescribed regimen will be followed -and sometimes whether or not it will even have the desired effect. A cure is not necessarily the same as the elimination of all symptoms: think of eliminating a headache only to be left with a fuzziness in the head, a feeling of fatigue, a ringing in the ears… To adjust her expectations appropriately and to grasp what the concept of ‘cure’ might entail -and what it is not, in other words- she needs an explanation she can understand, especially if there is a process or mechanism she can visualize. An anatomical correlate with which she can identify. An algorithm, even. In many instances, this necessitates the use of diagrams.

The Diagram has always fascinated me; its etymology (loosely rendered as ‘explanation by lines’) less so: there is a magic in its enlightening power that transcends mere lines, outstrips even the most eloquent vocabulary. It is the Word Incarnate, as it were. And yet there are problems: we each see the world through different eyes. The past influences the present; so does fear… And although diagrams are often drawn to allay anxiety, sometimes they merely distract. Whatever I’ve drawn is open to interpretation and confusion if it is not both clear and commensurate with her own notion of what her internal organs look like. And most people have no idea… I sometimes show my post-op patients pictures taken during their operation -the ovarian cyst that I have removed, for example, or the spot of endometriosis I have coagulated. Unfortunately I suspect that for most, it is a ‘Where’s Waldo’ puzzle -everything is strange but similar and mixed together randomly; for some, an ovary is merely a white meatball in a bowl of extra-large fettuccine noodles. Or sausages, maybe. We see what we are used to.

The idea that my diagrams may not be adding to someone’s understanding of their condition is uncomfortable for me, though. Anathema, maybe. I mean, my uterus looks really similar to the pictures I’ve studied, and I can draw a fairly recognizable Fallopian tube. But if the patient has never seen an ovary, or maybe even thought about a fibroid, the drawing may be totally devoid of meaning for her. She might nod politely as I doodle on about how the fringed end of the Fallopian tube grasps the ovary like little fingers picking up a ball, and hear me explain that it’s why they call them fimbriae: Latin for ‘fringe’… but if she wasn’t linguistically inclined -or was simply nervous- the words might still be meaningless. Unhelpful. Perhaps all she was really seeing were two wiggly lines becoming ragged and imaginatively bankrupt at what might be a ball, or a poorly drawn circle. And since she would know that an ovary should contain eggs she might be wondering why I didn’t draw them as well. Two people crossing a bridge and each seeing a different bridge. Two worlds, two paradigms… one diagram.

I was reminded of this the other day when a patient I had not seen for some time smiled immediately when she saw some of the photocopied diagrams I keep on my desk for immediate explanatory reference. The one most visible is of a uterus with its two Fallopian tubes arching conveniently far from its sides like arms from shoulders. Of all my reference diagrams, I have always been attracted to its simplicity. That it is unmistakably a uterus with Fallopian tubes I thought was obvious; it is so utterly characteristic and self-explanatory there is nothing else it could be mistaken for.

“I see you still have the drawing of the cow,” she said matter-of-factly, and smiled again.

“Pardon me?”

“The cow,” she said, rolling her eyes as if I’d have to be blind not to see it. She sat back in her chair for a moment, to give me time to follow her words, but when I didn’t say anything, she leaned forward again to study it more closely. “Well, I suppose it could be a goat, or something, but…”

She tried, unsuccessfully, to disguise a sigh but when she saw me staring at her, she shrugged and pointed to the uterus. “The head,” she said slowly and carefully, so I could follow her finger on the diagrammatic womb. “And here are the horns.” She enunciated clearly, like a teacher explaining a difficult concept. “See? The tubes are like long, skinny horns sticking out of the top of the head…” She smiled, obviously proud of her explanation.

Two different bridges, I suppose. But the converse can also obtain: the bridge sometimes determines who crosses it. Another patient I hadn’t seen for a while -not the same one, I don’t think- brought in a beautiful black-and-white photo she’d taken of a mountain range. She seemed quite excited about it and immediately plopped it on my desk as she sat down.

I looked at it and smiled. “Beautiful picture,” I said, somewhat taken aback by the irrelevance of its arrival on my desk. I even tried my best to look grateful, in case it was a present. “Did you take it?”

She smiled, but I couldn’t help but notice that behind the smile was a hint of disappointment.

“Can you see them?” she asked, hope creeping into her voice. As if maybe I could redeem myself.

“Uhmm, well I really like the shadows of the trees… Sort of like an Ansel Adams photograph,” I added lamely. “Incredible detail.” I was staring intensely at the picture, not sure what more I could add.

“But what do you see?”

I felt like a child on a school tour of an art gallery. Evidently there was more to the picture than the trees and the mountains. I’d been really proud of my mention of the shadows, but I’d obviously missed the mark.

“Breasts!” she said finally, exasperation evident in the tone of her voice. She pointed to a couple of peaks in adjacent mountains, and waited for any sign of recognition in my face. I suspect what she wanted was a show of admiration for the perspicacity required for her to spot the resemblance.

She was, I knew, a visual artist, and becoming quite well known. I tried to pretend I saw the breasts, but the trees kept bringing me back to earth. And for some reason, all I could think of was my cow diagram. In a feeble attempt at humour I told her I saw a cow -an attempt to defuse the situation, really.

“A cow?” Her eyes widened in admiration. “Really..? Where?”

I sort of randomly moved my finger around the photo, pausing on a tree after skimming over a rather angular mountain.

She sat back, clearly impressed. “You know it’s really amazing how we all see different things, eh?”