Age Prejudice

I suppose I should have seen it coming. I suppose I should have laid down firmer tracks, taken a more trodden path. I suppose I shouldn’t have been so influenced by Robert Frost’s poem ‘The Road Not Taken’. But I was… ‘Two roads diverged in a wood, and I— I took the one less traveled by, And that has made all the difference.’ Well, at least I think I did – I didn’t mean to, or anything; it’s just something that, as I look back through the years, turned out that way.

All our paths are unique, to be sure, and it is likely the conceit of many of us to think that we, alone, have chosen singularities for ourselves –things, if not incomparable, then at least extraordinary: signatures for which only we could have been responsible.

So it is with some dismay that I came across a BBC News article describing a paper by William von Hippel, a professor of Psychology at the University of Queensland in Australia, in which he suggests that ‘although many people remain unprejudiced throughout their lives, older adults have a tendency to be more prejudiced than their younger counterparts’.

Older folks? Now wait a minute… As someone who did not suspect his way of life had ‘fall’n into the sere, the yellow leaf’, I feel almost blindsided. Of course I didn’t see Harper Lee’s Go Set A Watchman coming either. I have been a lifelong admirer of Atticus Finch –someone whose name I even bestowed upon a cat I once had– and Lee’s previous book To Kill A Mockingbird was always a beacon in troubled waters for me. Something that let me believe that we were more than our fists. That we could, if we chose, rise above the curses and anger that so frequently filled our streets. But even her older Atticus slipped beneath those waves, it seems…

So is this how the path to maturity –the grinding road to wisdom– ends? An abnegation of all the tolerance that we have learned to espouse? An abrogation of all those principles we fought so hard to enshrine in law? A foretaste of the dark night of the soul that lurks, still hidden, behind the shadowed corners of our ever-increasing ages?

In fairness, if you read the original article in Psychological Science, it attempts to frame the problem as a sort of neurologic deficit, suggesting that ‘age differences in implicit racial prejudice may be due to age-related deficits in inhibitory ability’, perhaps absolving us elders of ultimate responsibility. As in dementia, it’s not our fault. Certainly not our wish… Some things, like grey hair or wrinkles, are merely part of the geography of age the trail runs through. But is increasing prejudice a town along the way, or a detour we didn’t have to take? A ghost town, sporting a few boarded-up stores and nobody we recognize on the streets? A place we visited years ago, then left because we didn’t feel at home despite the friends we made?

The BBC report neatly summarizes von Hippel’s neurological hypothesis: ‘The frontal lobes are the last part of the brain to develop as we progress through childhood and adolescence, and the first part of the brain to atrophy as we age. Atrophy of the frontal lobes does not diminish intelligence, but it degrades brain areas responsible for inhibiting irrelevant or inappropriate thoughts. Research suggests that this is why older adults have greater difficulty finding the word they’re looking for – and why there is a greater likelihood of them voicing ideas they would have previously suppressed.’

Whoaa! I’m not sure I like that… I’ve always had irrelevant and probably inappropriate thoughts. I’ve always been a loose cannon. And now, I suppose, I should be worried that there is a threshold I might someday cross at which my age would shift me from being considered merely eccentric to a diagnosis of mild dementia –frontal lobe atrophy, no less– by default. Or maybe I should have worried about those mysterious frontal lobes all along – maybe I got a damaged pair from the outset.

Hmm, and I always figured it was the result of getting lost on that less-travelled road. It was, after all, ‘grassy and wanted wear’…

A test for Alzheimer’s Disease…

Now here’s a scientific and epidemiologic conundrum: suppose you develop a test that will give you advance warning of a fatal disease you can neither treat nor prevent. That foreknowledge, though, might allow an understanding of the really early aspects of the disease –while it was still asymptomatic– that could eventually lead to a treatment, especially since most diseases are potentially more treatable in their early stages. What should you do with the test? You need a lot of people to take the test so you can more appropriately generalize the information obtained, and yet you can do nothing for them. And what are the subjects in the test to do with the information? Suppose it is falsely positive: despite what the results suggest, despite the worry and the suicides perhaps contemplated, they will never actually get the disease. No test is perfect.

In other words, should you screen a particular population with the test, when the value is not so much for the individual tested as for the knowledge that might eventually be useful to someone else? How ethical is it? How cruel is it…?

People have thought about this, fortunately, and in 1968 the World Health Organization offered some guidelines in the form of screening criteria. Among them are the suggestions that not only should the condition be an important problem, but there should also be a treatment for it and an ability to diagnose it accurately. They also suggested the condition should have a latent stage, when treatment would be expected to be more efficacious.

The problem I have set forth, of course, is exemplified by the recently announced test for Alzheimer’s Disease. (I have included two articles, the Huffington Post summary being the more easily assimilable of the two.)

It is obvious that Alzheimer’s disease and dementia are both important health concerns in a time when populations are aging in many countries. It would be helpful to know what facilities might be needed so the appropriate infrastructure could be planned for that particular demographic. But equally, it would be useful to know more about which people in that population are particularly at risk so they could be studied. A recent report from the Alzheimer’s Association, for example, suggested that women over 60 are about twice as likely to develop Alzheimer’s disease over the rest of their lives as they are breast cancer.

Perhaps of paramount importance is studying the disease at an early stage to search for the cause. To devise a cure. And yet I can’t help thinking about the helpless laboratory animals in our research facilities – poked, prodded and experimentally assigned… but not for their own good. What constitutes a laboratory animal…?

Under what conditions, then, would it be permissible to undertake such a study? Informed consent is mandatory, of course, but what exactly would the participants be consenting to? To knowing about an inexorable decline in cognitive functioning that would rob them of that which they hold dearest: themselves? We are our pasts – they are what knit the fabric of our identities into a pattern we and others can recognize from one day to the next. The present is a transient gift that constantly slips behind us, so we have to pull it along like a shadow as we walk through time. We collect each present and store it on an accessible shelf like books we’ve read. Without them, we become functionally illiterate. Lost. Wandering endlessly through unmarked time as in a dense mist, with no signposts we can see, let alone understand.

That this vision may encompass the tundra that is Alzheimer’s is obviously more pessimistic than what may actually obtain: no doubt it is a condition that varies along a spectrum. But the prospects are not appealing, nor is the extent of the changes likely predictable – and I, personally, would not want to know about it until it had captured me and shrouded my awareness of what I had lost.

I suspect this is the reason for the cautionary statements of the investigators and the thrust of the caveats of the WHO parameters. I’m not sure what to do with the test they describe. It is obviously an important step on the road to understanding dementia and yet… I am reminded of that famous “To be, or not to be” speech by Hamlet in which he talks about death, but describes it in terms the more pessimistic among us might suggest could equally apply to Alzheimer’s disease:

The undiscovered country from whose bourn
No traveler returns, puzzles the will
And makes us rather bear those ills we have
Than fly to others that we know not of?

I’m sorry, but I don’t think most of us are ready for the test just yet… Or is it only me?