Like madness is the glory of this life

My grandmother was old when she died -very old, in fact: she died on the morning after her 100th birthday party. Her congratulatory letter from the Queen -or at least someone official claiming to speak for Her Majesty- came the day before. I’m not so sure it was congratulations, really -more a recognition that a subject of the United Kingdom, albeit an émigré, had remained loyal to Her Majesty and her dominions for a century.

My grandmother seemed to enjoy the party we held for her -she was all smiles, and although she seemed a bit confused by it all, she was delighted by the letter. It spoke to her of another life, I think -one that whispered the secrets of a little girl growing up in an English seaside town with a shingled beach and an amusement pier that offered tempting glimpses of a world across the sea -a world she couldn’t know would become her own for most of her life.

We all have lives like that -the present we currently occupy pales in depth, in colour, and even in meaning beside the worlds we have tasted in our incomparably longer past. It only seems appropriate that when our brains tire of sorting through the tyrannies of the moment, we default to the myriad memories of what we lived. The past can be a comfortable place to rest -familiar, at the very least.

I loved visiting my aging granny -even in the hospital where she spent her final days she was always full of stories, full of wisdom, and full of wonder. And although often confused about current events, or what she’d had for breakfast that morning, her eyes would light up when I asked her to tell me about, say, her train journey across the country when she and grampa first arrived on the boat from England.

She would chuckle when she told me of the pioneer stoves they used to cook their food en route, and how each time the locomotive stopped to fill its water tank, everybody would make a mad dash from the railway coaches to find wood and occasional supplies from the little stations along the way. Her eyes would twinkle as she relived the flavours of whatever food they’d had, and she would laugh at the difficulty of cooking on the ever-moving stoves. She had no trouble remembering how everybody helped each other -she even remembered some of their names after more than eighty years.

So whenever she seemed confused at my visits or flustered by my questions about her health, I would smile and settle in a chair beside her and ask her what she remembered about ‘the old days’ as she decided to call them. After all, I think she lived there most of the time -it seemed a place where she was happy. At any rate, it seemed to calm her, and allow her to speak to me as if she were still in the summer garden she’d loved to show me on my visits years ago to the house she and her husband had built near Vancouver. There seemed to be no disorder in the garden, no anxious search for a constantly fading identity, nothing forgotten there -just flowers all around us, and birds singing in the bower of trees she’d planted so long ago.

She loved to speak from there, and even then -especially then- I was happy to sit there with her in her past. I lived happily in the two worlds, and she enjoyed meeting me there; like lovers we would float from dream to dream, escaping from the bewildering clatter of a crowded hospital ward. Who would not prefer her floral ‘then’ to her sterile ‘here-and-now’?

The staff told me of the problems with her confusion, and how she would sometimes wander off looking, as she told one of them, ‘for the garden’. And all the while around us, there were often moans and shouts, and irritable reactions to attempts to tame the ward. Sanity lay somewhere in the past -their patients’ past- but the department seemed hastily conceived as a holding area until beds became available in community nursing homes. Hospital was perhaps the wrong place for most of the elders -they were not sick except, perhaps, for home… or for something that reminded them of home, at any rate.

I have to say that I was pleasantly surprised to come across an essay on retrieving the autobiographical memory of demented seniors in Aeon: https://aeon.co/ideas/the-self-in-dementia-is-not-lost-and-can-be-reached-with-care

It was written by Muireann Irish, an associate professor of psychology at the University of Sydney. ‘Our autobiographical memory… seems crucial to weaving a life story that bridges past and present, and permits us to extrapolate how the future might unfold, all within a meaningful and coherent narrative. So what happens when the tapestry of memory begins to fray, and we lose access to defining memories from the past?’

There are many types of neurodegenerative loss -Alzheimer’s among them, of course- and the loss is progressive. ‘Gradually, as the disease spreads, more distant memories are affected, leading to patchy recall of self-defining events, such as one’s wedding day or the birth of one’s children.’ And without our memories, who are we…? ‘There remains a recalcitrant perception that in parallel with the progressive pathological onslaught in the brain is the inevitable demise of personhood, akin to a ‘living death’.’

But viewing dementia like that is not only depressing but incomplete, according to the author. ‘While the illness is devastating, not all memories are obliterated by Alzheimer’s, and much of the person’s general knowledge and recollection of the distant past is retained. There remains a vast repository of life experiences, personal history, stories and fables that endures, even late into the illness. At moderate to severe stages of dementia, activities such as art, dance and music therapy provide important nonverbal means of communicating and fostering social interaction even when, on the surface, many core capabilities might seem to be lost… As the disease progresses and their self-concept becomes more rooted in their past, people with dementia can feel increasingly divorced from their current surroundings, which no longer make sense or feel familiar. This is the catalyst for behaviours that are commonly couched as ‘challenging’, such as agitation, wandering, attempts to leave a care facility to ‘go home’.’

Irish suggests that instead of confronting the dementia with an enforced ‘now’, ‘a positive approach could be to create a ‘memory box’ in anticipation of the days to come. This could form a repository of photographs, keepsakes, newspaper clippings, objects with personal meaning, even fabrics and smells, that resonate with the person and provide an external memory store. Conversations regarding music and songs from the person’s formative years, and the memories that these tunes evoke, could inspire personalised playlists that foster social interaction and the springboard for reminiscence. For care staff, a memory store of this nature would be as important as taking a detailed medical history.’

As for my grandmother, I was happy to sit with her in her garden while she happily regaled me with stories of her past. And I’d like to think that after she received that letter from her queen, she retreated to the garden to read it again and again as her life washed over her like a cooling summer breeze, and the flowers whispered sweet nothings in her ear.

Age Prejudice

I suppose I should have seen it coming. I suppose I should have laid down firmer tracks, taken a more trodden path. I suppose I shouldn’t have been so influenced by Robert Frost’s poem, ‘The Road Not Taken’. But I was… ‘Two roads diverged in a wood, and I— I took the one less traveled by, And that has made all the difference.’ Well, at least I think I did -I didn’t mean to, or anything; it’s just something that, as I look back through the years, turned out that way.

All our paths are unique, to be sure, and it is likely the conceit of many of us to think that we, alone, have chosen singularities for ourselves –things, if not incomparable, then at least extraordinary: signatures for which only we could have been responsible.

So it is with some dismay that I came across a BBC News article describing a paper by William von Hippel, a professor of psychology at the University of Queensland in Australia, in which he suggests that ‘although many people remain unprejudiced throughout their lives, older adults have a tendency to be more prejudiced than their younger counterparts’: http://www.bbc.com/news/magazine-33523313

Older folks? Now wait a minute… As someone who did not suspect his way of life had ‘fall’n into the sere, the yellow leaf’, I feel almost blind-sided. Of course I didn’t see Harper Lee’s Go Set A Watchman coming either. I have been a lifelong admirer of Atticus Finch -someone whose name I even bestowed upon a cat I once had- and Lee’s previous book To Kill A Mockingbird was always a beacon in troubled waters for me. Something that let me believe that we were more than our fists. That we could, if we chose, rise above the curses and anger that so frequently filled our streets. But even her older Atticus slipped beneath those waves, it seems…

So is this how the path to maturity -the grinding road to wisdom- ends? An abnegation of all the tolerance that we have learned to espouse? An abrogation of all those principles we fought so hard to enshrine in law? A foretaste of the dark night of the soul that lurks, still hidden, behind the shadowed corners of our ever-increasing ages?

In fairness, if you read the original article in Psychological Science (https://www2.psy.uq.edu.au/~uqwvonhi/S.vH.R.PS.09.pdf), it attempts to frame the problem as a sort of neurologic deficit, suggesting that ‘age differences in implicit racial prejudice may be due to age-related deficits in inhibitory ability’ -perhaps absolving us elders of ultimate responsibility. As in dementia, it’s not our fault. Certainly not our wish… Some things, like grey hair or wrinkles, are merely part of the geography of age the trail runs through. But is increasing prejudice a town along the way, or a detour we didn’t have to take? A ghost town, sporting a few boarded-up stores and nobody we recognize on the streets? A place we visited years ago, then left because we didn’t feel at home despite the friends we made?

The BBC report neatly summarizes von Hippel’s neurological hypothesis: The frontal lobes are the last part of the brain to develop as we progress through childhood and adolescence, and the first part of the brain to atrophy as we age. Atrophy of the frontal lobes does not diminish intelligence, but it degrades brain areas responsible for inhibiting irrelevant or inappropriate thoughts. Research suggests that this is why older adults have greater difficulty finding the word they’re looking for – and why there is a greater likelihood of them voicing ideas they would have previously suppressed.

Whoaa! I’m not sure I like that… I’ve always had irrelevant and probably inappropriate thoughts. I’ve always been a loose cannon. And now, I suppose, I should be worried that there is a threshold I might some day cross in which my age would shift me from being considered merely eccentric to a diagnosis of mild dementia –frontal lobe atrophy, no less- by default. Or maybe I should have worried about those mysterious frontal lobes all along –maybe I got a damaged pair from the outset.

Hmm, and I always figured it was the result of getting lost on that less-travelled road. It was, after all, ‘grassy and wanted wear’…

A test for Alzheimer’s Disease…

Now here’s a scientific and epidemiologic conundrum: suppose you develop a test that will give you advance warning of a fatal disease you can neither treat nor prevent. But that foreknowledge might allow an understanding of the really early aspects of the disease -while it was still asymptomatic- that could eventually lead to a treatment. Especially if the disease, as most are, were potentially more treatable in its early stages. What should you do with the test? You need a lot of people to take the test so you can more appropriately generalize the information obtained, and yet you can do nothing for them. And what are the subjects in the test to do with the information? Suppose the result is falsely positive and, despite what it suggests -despite the worry, and the suicides possibly contemplated- they will not actually get the disease. No test is perfect.

In other words, should you screen a particular population with the test, when the value is not so much for the individual tested as for the knowledge that might eventually be useful to someone else? How ethical is it? How cruel is it..?

People have thought about this, fortunately, and in 1968 the World Health Organization offered some guidelines on screening criteria. Among them are the suggestions that, not only should the condition be an important problem, but there should also be a treatment for the condition and an ability to diagnose it accurately. They also suggested the condition should have a latent stage when treatment would be expected to be more efficacious.

The problem I have set forth, of course, is exemplified by the recently announced test for Alzheimer’s Disease. (I have included two articles, the Huffington Post summary being the more easily assimilable of the two.)

http://www.medscape.com/viewarticle/821982?src=iphone&ref=email

http://www.huffingtonpost.co.uk/2014/03/10/dementia-early-detection-blood-test_n_4933188.html

It is obvious that Alzheimer’s disease and dementia are both important health concerns in a time when populations are aging in many countries. It would be helpful to know what facilities might be needed so the appropriate infrastructure could be planned for that particular demographic. But equally, it would be useful to know more about who in that population is particularly at risk so they could be studied. A recent report from the Alzheimer’s Association, for example, suggested that women over 60 are twice as likely to develop Alzheimer’s disease over the rest of their lives as they are to develop breast cancer:

http://www.alz.org/news_and_events_women_in_their_60s.asp

Perhaps of paramount importance is studying the disease at an early stage to search for the cause. To devise a cure. And yet I can’t help thinking about the helpless laboratory animals in our research facilities, poked, prodded and experimentally assigned… But not for their own good. What constitutes a laboratory animal..?

Under what conditions, then, would it be permissible to undertake such a study? Informed consent is mandatory, of course, but what exactly would the participants be consenting to? To knowing about an inexorable decline in cognitive functioning that would rob them of that which they hold dearest: themselves? We are our pasts -they are what knit the fabric of our identities into a pattern we and others can recognize from one day to the next. The present is a transient gift that constantly slips behind us, so we have to pull it along like a shadow as we walk through time. We collect each present and store it on an accessible shelf like books we’ve read. Without them, we become functionally illiterate. Lost. Wandering endlessly through unmarked time as in a dense mist with no signposts we can see, let alone understand.

That this vision may encompass the tundra that is Alzheimer’s is obviously more pessimistic than may obtain: no doubt it is a condition that varies on a spectrum. But the prospects are not appealing, nor the amplitude of changes likely predictable -and I, personally, would not want to know about it until it has captured me and shrouded my awareness of what I had lost.

I suspect this is the reason for the cautionary statements of the investigators and the thrust of the caveats of the WHO parameters. I’m not sure what to do with the test they describe. It is obviously an important step on the road to understanding dementia and yet… I am reminded of that famous “To be, or not to be” speech by Hamlet in which he talks about death, but describes it in terms the more pessimistic among us might suggest could equally apply to Alzheimer’s disease:

The undiscovered country from whose bourn
No traveler returns, puzzles the will
And makes us rather bear those ills we have
Than fly to others that we know not of

I’m sorry, but I don’t think most of us are ready for the test just yet… Or is it only me?