An Obstetrical Edition

Miscarriages –early pregnancy losses– have long been a subject of research. They are unfortunately all too common, and until very recently we were aware only of those that occurred after a noticeable menstrual delay –the tip of the iceberg, in other words.

Some progress has been made in understanding why they occur, of course –random genetic mistakes arising during development, for example, or abnormalities already present in the DNA of the sperm or egg involved. But this kind of knowledge usually comes after the fact –insufficient to predict or prevent the problem– although with in vitro fertilization (IVF) there are often techniques available to detect genetic flaws and guide the choice of which fertilized egg to implant. This does little, however, for the much larger population attempting pregnancy in the traditional, unaided fashion.

I was therefore intrigued by an article in the BBC News: http://www.bbc.com/news/health-35301238 that outlined a proposal to genetically modify some human embryos (not for implantation, be aware) to '…understand the genes needed for a human embryo to develop successfully into a healthy baby.'

I realize that, at first glance at any rate, this proposal seems to cross a boundary that has hitherto been sacrosanct: experimenting with human embryos. It trespasses on at least two traditional shibboleths. The first –the more problematic and dogmatically based one– is that from the moment of conception the embryo –or morula, once the fertilized egg has divided into sixteen cells– is a person, or at least entitled to all the respect and privileges of a human being. This is more a belief, a religious or moral tenet, than a demonstrable attribute of the embryo at this stage, though, and a more neutral consideration of its personhood would have to rely either on arguments from potential or on its ability to survive outside the uterus, should that be required.

The other, perhaps less religiously coloured, objection is the issue of unintended (or even intended) consequences: that to interfere with human DNA is to interfere with humanity itself, and perhaps even with the reason we are as we find ourselves –evolutionary adaptations that are solutions to myriad problems of which we may be only dimly aware, if at all. Nor do we really understand what we are doing, or how to do it safely –that is, without inadvertently affecting other things– even if we did. Like any ecosystem, everything is interdependent in one way or another: solve one problem and you may create another that you never suspected was being modulated by the first.

This, of course, is the thrust of the UK proposal. One can reasonably study animal models –mice, for example– only if they have comparable genes for early embryologic development. And as Dr Niakan, from the Francis Crick Institute, said: “Many of the genes which become active in the week after fertilisation are unique to humans, so they cannot be studied in animal experiments.” Initially, the study could have its greatest benefits in IVF work –'Of 100 fertilised eggs, fewer than 50 reach the blastocyst stage, 25 implant into the womb and only 13 develop beyond three months…' “We believe that this research could really lead to improvements in infertility treatment and ultimately provide us with a deeper understanding of the earliest stages of human life.”

Convinced? It’s a difficult one, isn’t it? Clearly, we need to understand how things work (as the study proposes) long before we attempt to modify them in any way. And if gene editing on a human embryo can be done, it is inevitable that it will be done by someone, somewhere, but perhaps with less stringent rules and guidelines to constrain it. So, should we just bite our collective tongues, and bow to progress? And is there really a choice?

I’m not sure where I stand on the issue of genome editing; I don’t think there is a one-size-fits-all solution, but I do think there is a fine compromise available. The issue must be kept open for discussion –made public, in other words– so that at the very least it is not perceived as being conducted in secrecy and without identifiable or appropriate input. The pros and cons must be aired, and in terms that all can understand. And the opinions of the various interest groups –both religious and secular– should be publicly and repeatedly solicited. The left hand must know what the right hand is doing.

No, there is unlikely to be consensus; people will divide along predictable lines, as I have suggested, but at least there will be a chance for an airing of the arguments, and an assessment of their merits or deficiencies available to all who care –a public catharsis. A mitigation…

But in the end, I think we must always be mindful of the dangers that Shakespeare intimated in his Much Ado About Nothing: ‘O, what men dare do! What men may do! What men daily do, not knowing what they do!’

Food for Thought

There’s something encouraging about the fact that we are not simply our genes. We’ve moved on –evolved, I guess. They are still the recipes, the instructions, but as every chef knows, you don’t always have to include all of the ingredients to get a good result. Genes are perhaps more akin to a first draft for a project. Suggestions. Options. They are, in effect, travel guides –road maps– that tell you what you could do and how you might go about doing it; but although the tickets are bought, you don’t have to get on the bus. Who we are –what we are– is not as pre-ordained as we previously thought. Just because there is a light switch on the wall doesn’t mean it has to be turned on unless it’s needed. There is a mechanism, as Wikipedia puts it, for ‘cellular and physiological phenotypic trait variations that are caused by external or environmental factors that switch genes on and off and affect how cells read genes instead of being caused by changes in the DNA sequence.’ It is called epigenetics.

Genetic evolution usually takes a long time –often a very long time– and circumstances can arise that were not originally anticipated. But there are several mechanisms to silence or inhibit genes from carrying out their initial instructions, and these allow extra opportunities for an organism to survive and adapt to circumstances perhaps not present during its initial evolution. Unfortunately, it can be a two-way street…

Food, food, glorious food –well, that’s how I remember the words anyway (apologies to Flanders and Swann). It’s something that is often as pleasant in retrospect as its anticipation is in prospect. Something that transcends the here and now. Like culture, it involves feelings and judgments. It is a part of the fabric of our realities, part of the habits that are difficult to change without conscious effort and strong motivation. We wear our preferences as uniforms –identities. Food is not simply what we consume –it reflects a train of thought. There are allegiances, unspoken loyalties that pass from generation to generation. And it is often how others see us –evaluate us. To change or vary risks awkward questions, at the very least. So it’s fascinating to reflect on the importance of food in defining not only who and what we are, but also on its influence on what we might become. And what our children might become as a result…

It is not a trifling matter. Food has always had a central role in culture, and what a mother eats in her pregnancy has long inspired myths about the child she will deliver. Famines have been instructive: in more recent times, the Dutch Famine of 1944 during World War II led to intrauterine growth restriction and to subsequent chronic diseases, such as coronary heart disease, later in the offspring’s lives.

A similar twist on the importance of prenatal nutrition was highlighted in an article in BBC News: http://www.bbc.co.uk/news/magazine-34222452 –‘A team from Britain’s Medical Research Council, which has been collecting data on births, marriages and deaths in Keneba since the 1940s, discovered some years ago that in this part of The Gambia when you are conceived makes a huge difference to your chances of dying prematurely.’

This seemingly bizarre finding is corroborated in animal experiments in which, ‘it is possible to make the genes in an embryo more active, or turn them off entirely, simply by varying their mother’s diet.’ And indeed, as the author explains, ‘the studies done in The Gambia certainly provide compelling evidence that these so-called “epigenetic changes” may also happen in humans in response to a change in diet. That if, during very early development, a mother eats a diet rich in leafy green vegetables, then this will change forever just how active some of her child’s genes are.’

There are other epigenetic ramifications that are also important: this ‘happens through a process called methylation and researchers in The Gambia have recently shown that babies conceived in the wet season have very different levels of activity of a particular gene that’s important for regulating the immune system.’ As Matt Silver, part of the MRC team, says: “Variation in methylation state in this gene could affect your ability to fight viral infections and it may also affect your chances of survival from cancers such as leukaemia and lung cancer.”

Prenatal influences are far greater than we had ever suspected; we were naïve indeed to feel that the importance of diet was primarily to provide nutrients for the developing fetus –ingredients for the recipe. We were too narrow in our conceptions. Too dull, maybe. There is so much about the world –about ourselves– that we are only beginning to understand. We truly live in exciting times… and yet it has always been exciting times for those interested enough to open their eyes, hasn’t it? We’ve always lived at the edge of some river or other. It merely takes someone curious enough to travel down it. As Shakespeare said: ‘We know what we are, but know not what we may be.’

 

Non-Invasive Prenatal Testing

Chromosomally derived anomalies have been with us for millennia –maintaining structural and functional integrity is obviously difficult when you think about it. We humans have 46 chromosomes whose DNA must be copied faithfully at every cell division as they issue unique and contextual instructions for cell development and maintenance.

The functional components of chromosomes are called genes, and we have around 20,000-30,000 of them, each one built from smaller units –base pairs– like words in a sentence. And depending on the chromosome, each strand of DNA has as many as a couple of hundred million base pairs to supervise. It doesn’t take much imagination to realize that rearranging words in such a sentence, or letters in each word, can alter its meaning. Jumble its information –or even destroy its function…

Throughout recorded history, there has been a recognition that some individuals lacked the same intellectual or emotional attributes as the rest of the community, and yet these people still had a role in the society. They were tolerated and often cherished members of the group and contributed to the weft and warp of the social fabric. Every town had its Village Idiot, to use the ancient (and non-revisionist) term; every village had its special people…

It would seem we live in different times, however, and social values have shifted; there is an expectation of normalcy, if not perfection, in our offspring. The current thrust is early –prenatal– diagnosis of suspected anomalies so that the expectant parents can choose whether the issue lies within their comfort and capability zones.

Prenatal testing has undergone many sea-changes over the years as technology and attitudes have goaded each other. Early tests sought to detect only the most frequent chromosomal anomaly: Down syndrome –or trisomy 21. As time and ability progressed, more genetic abnormalities have received similar surveillance.

But accuracy of prediction has come under scrutiny of late. It is no longer acceptable merely to arouse suspicion of an abnormality. False positives (thinking the anomaly is present when it is not) and false negatives (not detecting the anomaly) each have their own consequences. Risk of error, in other words, needed to be minimized if decisions were to be reliably dependent on the results.

In Canada, there are currently three (and now four –the subject of this essay) options for prenatal screening of genetic abnormalities –still largely for trisomy 21 because it is by far the largest component of the pool of abnormalities:

  • First trimester screening –done between 11 and 14 weeks gestational age, with a detection rate of 87-90% and a false +ve rate of 5%
  • Integrated prenatal screening –consisting of two parts: the first the same as first trimester screening, and the second between 15 and 20 weeks gestational age. This has a detection rate of 87-95% and a false +ve rate of 2-5%
  • Quad screening –done between 15 and 20 weeks, but with a detection rate of only 81% and a false +ve rate of between 5 and 7%

These results are pretty good and statistically acceptable –unless, that is, a mother has to make an irrevocable decision based on them. There was a need for even more accuracy –less risk– and so technology again rose to the challenge: the Non-Invasive Prenatal Test (NIPT). This is a blood test taken from the mother that measures her baby’s DNA floating free in the part of her blood called plasma. It is continually released into the maternal circulation (with a half-life of around 16 minutes), so it is an up-to-date survey of the fetus. Most of that free DNA is maternal, though –the fetal fraction is usually only about 10%– so, to be sure the result is representative, the fetal fraction measured has to be at least 4%… Confused? Well, just remember that it is most reliably measured after 10 weeks gestation, with no upper limit of gestational age; that it has a detection rate of over 98%; and that it has a false +ve rate of less than 0.3% (I’ve taken these figures from the June 2014 edition of the JOGC).
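Why do those false positive rates matter so much? Because the condition being screened for is rare, even a small false positive rate swamps the true positives. A short sketch of the arithmetic, using Bayes’ rule and the detection and false positive figures above –the background prevalence of roughly 1 in 700 pregnancies for trisomy 21 is my own illustrative assumption (the real figure varies strongly with maternal age):

```python
def ppv(sensitivity: float, false_positive_rate: float, prevalence: float) -> float:
    """Positive predictive value: the chance the condition is actually
    present, given a positive test result (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = false_positive_rate * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative prevalence only -- assumed here, not from the essay.
prev = 1 / 700

# Quad screening: ~81% detection, ~5% false positives
print(f"Quad screen PPV: {ppv(0.81, 0.05, prev):.1%}")

# NIPT: >98% detection, <0.3% false positives
print(f"NIPT PPV:        {ppv(0.98, 0.003, prev):.1%}")
```

At that assumed prevalence, only a small minority of positive quad-screen results reflect a true anomaly, while NIPT improves the odds roughly tenfold –yet even then most positives at population-level prevalence would be false, which is one reason the confirmatory amniocentesis is still recommended.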

There are some caveats, of course –there always are- and seemingly a variety of iterations of what can be measured. But by and large it seems close to ideal: high accuracy with minimal if any risk to mother or baby. It is still recommended that a result indicating a chromosomal anomaly be confirmed with an amniocentesis (taking a sample of fluid from around the baby in the uterus) for confirmation, however.

So why don’t we fully embrace NIPT and relegate the other tests to history –tests that were helpful in their time, but that seem naïve on sober reflection? Well, apart from the current high cost, which might preclude its equal availability to all strata of society, there are other ethical considerations. And although these same considerations obtain with any prenatal genetic test, with NIPT they are largely attributable to its accuracy; one could foresee a time when the recommendation for a corroborative amniocentesis to obviate any risk of false positivity might be rescinded, further decreasing the time available for thoughtful and reflective parental decision-making.

Autonomy is the right of an individual to make informed choices for herself. But the key word here is ‘informed’. This implies that the information is both relevant and appropriate. And yet, by necessity, it is provided and constructed by others; it is drawn from social and political contexts that she may not share, and the options it offers may reflect this. Relational autonomy is an ethical theory that considers the ramifications of the choices that are made available to her. More traditional views have tended to treat the person making decisions as an isolated unit; in fact, she is embedded in her own –perhaps differing– culture that influences both the context and the situation in which she has to make her decision.

We do not all react the same to identical information, nor is the ability to make an informed choice simply a function of the amount of information available. Women and doctors have different data priorities. Even different message priorities. We all need time to sift through the context; we need time for processing our feelings. Our needs. Our connection to the simmering culture in which we swim.

And then there is the issue of what we want NIPT to detect. Access to fetal DNA offers boundless opportunities in the future for singling out other aspects of the chromosomes we wish to interrogate –whether with serious concerns, such as hereditary conditions like cystic fibrosis, or with more broad-based anxieties, such as concern about random mutations.

Other, more frivolous concerns –sex selection or, in the foreseeable future, even searches for (and hence management of) certain genetic traits– present a growing tension between individual autonomy and societal values. For that matter, even detection of the trisomies has engendered much controversy, let alone the prospect of finding and perhaps eliminating other abnormalities not shared by the majority. What is the expectation –perceived or otherwise– after an ‘abnormal’ test? And what is abnormal? What should we accept?

I suppose, ultimately, it is for each of us to decide. Of course, Shakespeare offered his opinion long ago: ‘Love looks not with the eyes, but with the mind.’ But are we still that wise? Or have time and circumstance changed that as well?