Imagination bodies forth the forms of things unknown

The poet’s eye, in fine frenzy rolling,
Doth glance from heaven to earth, from earth to heaven;
And as imagination bodies forth
The forms of things unknown, the poet’s pen
Turns them to shapes and gives to airy nothing
A local habitation and a name.
Such tricks hath strong imagination.
Theseus, in Shakespeare’s A Midsummer Night’s Dream

Shakespeare had a keen appreciation of the value of imagination, as that quote from A Midsummer Night’s Dream suggests. But what is imagination? Is it a luxury -a chance evolutionary exaptation of some otherwise less essential neural circuit- or a purpose-made system to analyse novel features in the environment? A mechanism for evaluating counterfactuals -the what-ifs?

A quirkier question, perhaps, would be to ask if it might predate language itself -be the framework, the scaffolding upon which words and thoughts are draped. Or is that merely another chicken versus egg conundrum drummed up by an overactive imagination?

I suppose what I’m really asking is why it exists at all. Does poetry or its ilk serve an evolutionary purpose? Do dreams? Does one’s Muse…? All interesting questions for sure, but perhaps the wrong ones with which to begin the quest to understand.

I doubt that there is a specific gene for imagination; it seems to me it may be far more global than could be encompassed by one set of genetic instructions. In what we would consider proto-humans it may have involved more primitive components: such non-linguistic features as emotion -fear, elation, confusion- but also encompassed bodily responses to external stimuli: a moving tickle in that interregnum between sleep and wakefulness might have been interpreted as spider and generated a muscular reaction whether or not there was an actual creature crawling on the skin.

Imagination, in other words, may not be an all-or-nothing feature unique to Homo sapiens. It may be a series of adaptations to the exigencies of life that eventuated in what we would currently recognize as our human creativity.

I have to say, it’s interesting what you can find if you keep your mind, as well as your eyes, open. I wasn’t actively searching for an essay on imagination -although perhaps on some level, I was… At any rate, on whatever level, I happened upon an essay by Stephen T Asma, a professor of philosophy at Columbia College in Chicago, and his approach fascinated me. https://aeon.co/essays/imagination-is-such-an-ancient-ability-it-might-precede-language

‘Imagination is intrinsic to our inner lives. You could even say that it makes up a ‘second universe’ inside our heads. We invent animals and events that don’t exist, we rerun history with alternative outcomes, we envision social and moral utopias, we revel in fantasy art, and we meditate both on what we could have been and on what we might become… We should think of the imagination as an archaeologist might think about a rich dig site, with layers of capacities, overlaid with one another. It emerges slowly over vast stretches of time, a punctuated equilibrium process that builds upon our shared animal inheritance.’

Interestingly, many archaeologists seem to conflate the emergence of imagination with the appearance of artistic endeavours –‘premised on the relatively late appearance of cave art in the Upper Paleolithic period (c38,000 years ago)… [and] that imagination evolves late, after language, and the cave paintings are a sign of modern minds at work.’

Asma, sees the sequence rather differently, however: ‘Thinking and communicating are vastly improved by language, it is true. But ‘thinking with imagery’ and even ‘thinking with the body’ must have preceded language by hundreds of thousands of years. It is part of our mammalian inheritance to read, store and retrieve emotionally coded representations of the world, and we do this via conditioned associations, not propositional coding.’

Further, Asma supposes that ‘Animals appear to use images (visual, auditory, olfactory memories) to navigate novel territories and problems. For early humans, a kind of cognitive gap opened up between stimulus and response – a gap that created the possibility of having multiple responses to a perception, rather than one immediate response. This gap was crucial for the imagination: it created an inner space in our minds. The next step was that early human brains began to generate information, rather than merely record and process it – we began to create representations of things that never were but might be.’ I love his idea of a ‘cognitive gap’. It imagines (sorry) a cognitive area where something novel could be developed and improved over time.

I’m not sure that I totally understand all of the evidence he cites to bolster his contention, though- for example, the view of philosopher Mark Johnson at the University of Oregon that there are ‘deep embodied metaphorical structures within language itself, and meaning is rooted in the body (not the head).’ Although, ‘Rather than being based in words, meaning stems from the actions associated with a perception or image. Even when seemingly neutral lexical terms are processed by our brains, we find a deeper simulation system of images.’ But at any rate, Asma summarizes his own thoughts more concisely, I think: ‘The imagination, then, is a layer of mind above purely behaviourist stimulus-and-response, but below linguistic metaphors and propositional meaning.’

In other words, you don’t need to have language for imagination. But the discipline of biosemantics tries to envisage how it might have developed in other animals. ‘[Primates] have a kind of task grammar for doing complex series of actions, such as processing inedible plants into edible food. Gorillas, for example, eat stinging nettles only after an elaborate harvesting and leave-folding [sic] sequence, otherwise their mouths will be lacerated by the many barbs. This is a level of problem-solving that seeks smarter moves (and ‘banks’ successes and failures) between the body and the environment. This kind of motor sequencing might be the first level of improvisational and imaginative grammar. Images and behaviour sequences could be rearranged in the mind via the task grammar, long before language emerged. Only much later did we start thinking with linguistic symbols. While increasingly abstract symbols – such as words – intensified the decoupling of representations and simulations from immediate experience, they created and carried meaning by triggering ancient embodied systems (such as emotions) in the storytellers and story audiences.’ So, as a result, ‘The imaginative musician, dancer, athlete or engineer is drawing directly on the prelinguistic reservoir of meaning.’

Imagination has been lauded as a generator of progress, and derided as idle speculation throughout our tumultuous history, but there’s no denying its power: ‘The imagination – whether pictorial or later linguistic – is especially good at emotional communication, and this might have evolved because emotional information drives action and shapes adaptive behaviour. We have to remember that the imagination itself started as an adaptation in a hostile world, among social primates, so perhaps it is not surprising that a good storyteller, painter or singer can manipulate my internal second universe by triggering counterfactual images and events in my mind that carry an intense emotional charge.’

Without imagination, we cannot hope to appreciate the Shakespeare who also wrote, in his play Richard III:

Princes have but their titles for their glories,
An outward honor for an inward toil,
And, for unfelt imaginations,
They often feel a world of restless cares.

Personally, I cannot even imagine a world where imagination doesn’t play such a crucial role… Or can I…?

 

Does the love of heaven make one heavenly?

Why do I find myself so attracted to articles about religion? I am not an adherent -religion does not stick to me- nor am I tempted to take the famous wager of the 17th century philosopher, Pascal: dare to live life as if God exists, because you’ve got nothing to lose if He doesn’t, and everything to gain if He does.

Perhaps I’m intrigued by the etymological roots that underpin the word: Religare (Latin, meaning ‘to bind’) is likely the original tuber of the word. But is that it -does it bind me? Constrain me? I’d like to think not, and yet… and yet…

Even many diehard atheists concede that religion has a use, if only for social cohesion -Voltaire was probably thinking along those lines when he wrote: ‘If God did not exist, it would be necessary to invent him’. Or Karl Marx: ‘Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people’.

And then, of course, there’s Sigmund Freud, an avowed Jewish atheist, who, for most of his life, thought that God was a collective neurosis. But in his later years, when he was dying of cancer of the jaw, he suggested (amongst other, much more controversial things) in his last completed book, Moses and Monotheism, that monotheistic religions (like Judaism) think of God as invisible. This necessitates incorporating Him into the mind to be able to process the concept, and hence likely improves our ability for abstract thinking. It’s a bit of a stretch perhaps, but an intriguing idea nonetheless.

But, no matter what its adherents may think about the value of the timeless truths revealed in their particular version, or its longevity as proof of concept, religions change over time. They evolve -or failing that, just disappear, dissolve like salt in a glass of water. Consider how many varieties and sects have arisen just from Christianity alone. Division is rife; nothing is eternal; Weltanschauungen are as complicated as the spelling.

So then, why do religions keep reappearing in different clothes, different colours? Alain de Botton, a contemporary British philosopher, argues in his book Religion for Atheists, that religions recognize that their members are children in need of guidance and solace. Although certainly an uncomfortable opinion, there is a ring of truth to his contention. Parents, as much as their children, enjoy ceremonies, games, and rituals and tend to imbue them with special significance that is missing in the secular world. And draping otherwise pragmatic needs in holy cloth renders the impression that they were divinely inspired; ethics and morality thus clothed, rather than being perceived as arbitrary, wear a spiritual imprimatur. A disguise: the Emperor’s Clothes.

Perhaps, then, there’s more to religion than a set of Holy caveats whose source is impossible to verify. But is it really just something in loco parentis? A stand-in? I found an interesting treatment of this in a BBC Future article written by Sumit Paul-Choudhury, a freelance writer and former editor-in-chief of the New Scientist. He was addressing the possible future of religion. https://www.bbc.com/future/article/20190801-tomorrows-gods-what-is-the-future-of-religion

‘We take it for granted that religions are born, grow and die – but we are also oddly blind to that reality. When someone tries to start a new religion, it is often dismissed as a cult. When we recognise a faith, we treat its teachings and traditions as timeless and sacrosanct. And when a religion dies, it becomes a myth, and its claim to sacred truth expires… Even today’s dominant religions have continually evolved throughout history.’

And yet, what is it that allows some to continue, and others to disappear despite the Universal Truth that each is sure it possesses? ‘“Historically, what makes religions rise or fall is political support,”’ writes Linda Woodhead, professor of sociology of religion at the University of Lancaster in the UK, ‘“and all religions are transient unless they get imperial support.”’ Even the much-vaunted staying power of Christianity required the Roman emperor Constantine, and his Edict of Milan in 313 CE to grant it legal status, and again the emperor Theodosius and his Edict of Thessalonica in 380 CE to make Christianity the official religion of the Roman Empire.

The first university I attended was originally founded by the Baptists and, at least for my freshman year, there was a mandatory religious studies course. Thankfully, I was able to take a comparative religion course, but in retrospect, I would have liked an even broader treatment of world religions. I realize now that I was quite naïve in those times; immigration had not yet exposed many of us to the foreign customs and ideas with which we are now, by and large, quite familiar. So the very notion of polytheism, for example, where there could be a god dedicated to health, say, and maybe another that spent its time dealing with the weather, was not only fascinating, but also compelling. I mean, why not? The Catholics have their saints who intervene for certain causes, so apart from the number of interveners, what makes Hinduism, with its touted 33 million gods, such an outlier in the West (wherever that is)?

It seems to me that most of us have wondered about the meaning of Life at one time or other, and most of us have reflected on what happens after death. The answers we come up with are fairly well correlated with those of our parents, and the predominant Zeitgeist in which we swim. But as the current changes, each of us is swept up in one eddy or another, yet we usually manage to convince ourselves it’s all for the best. And perhaps it is.

Who’s to say that there needs to be a winner? Religions fragment over time and so do societies; their beliefs, however sacrosanct in the moment, evolve and are in turn sacralized. And yet our wonder of it all remains. Who are we really, and why are we here? What happens when we die? These questions never go away, and likely never will. So maybe, just maybe, we will always need a guide. Maybe we will always need a Charon to row us across the River Styx…

Is the thing translated still the thing?

When I was a student at University, translated Japanese Haiku poetry was all the rage; it seemed to capture the Zeitgeist of the generation to which I had been assigned. I was swept along with others by the simple nature images, but -much like the sonnet, I suppose- I failed to realize how highly structured it was. In fact, I can’t really remember all of its complex requirements -but maybe that’s the beauty of its seeming simplicity in English. However, the contraction of the Japanese phrase -haikai no ku, meaning ‘light verse’- belies the difficulty of translating the poetry into a foreign language while still conserving its structure, its meaning, and also its beauty.

It seems to me that the ability to preserve these things in translation while still engaging the interest of the reader requires no less genius than that of its original creator. While, in poetry as well as in the narrative of story, the authors’ ideas -their images, plots and metaphors- are an intrinsic part of the whole, sometimes the concepts are difficult to convey to a foreign culture. So, what to do with them to maintain the thrust of the original while not altering the charm? And when does the translation actually become a different work of art and suggest the need for a different attribution?

Given my longstanding love for poetry and literature, I have often wondered whether I could truly understand the poetry of, say, Rumi, who wrote mainly in Persian but also in Turkish, Greek and Arabic; or maybe, the more contemporary Rilke’s German-language poetry. I speak none of those languages, nor do I pretend to understand the Umwelten of their times, so how do I know what it is that attracts me, apart from the beauty of their translations? Is it merely the universality of their themes, and perhaps my mistaken interpretations of the images and metaphors, or is there something else that seeps through, thanks to -or perhaps in spite of- the necessary rewording?

Since those heady days in university, I have read many attempts to explain, and even to justify, various methods of translation, and they all seem to adhere to one or both of the only two available procedures: paraphrasing, or metaphrasing (translating word for word). And no matter which is used, I have to wonder if the product is always the poor cousin of the original.

In one of the seminars from university, I remember learning that as far back as St. Augustine and St. Jerome, there was disagreement about how to translate the Bible -Augustine favoured metaphrasis, whereas Jerome felt that there was ‘a grace of something well said’. Jerome’s appealing phrase has stayed with me all these years. Evidently, the problem of translation goes even further back in history, and yet the best method of preserving the author’s intention is still no closer to being resolved.

In my abiding hope for answers, I still continue to search. One such more recent forage led me to an essay in the online publication Aeon by the American translator and author Mark Polizzotti (who, among other honours, is a Chevalier of the Ordre des Arts et des Lettres, the recipient of a 2016 American Academy of Arts and Letters award for literature, and a publisher and editor-in-chief at the Metropolitan Museum of Art in New York). https://aeon.co/essays/is-the-translator-a-servant-of-the-text-or-an-original-artist

He writes, ‘as the need for global communication grows by proverbial leaps, the efficiency of machine-based translation starts looking rather attractive. In this regard, a ‘good’ translation might simply be one that conveys the requisite bytes of information in the shortest time. But translation is about more than data transmission, and its success is not always easy to quantify. This becomes particularly true in the literary sphere: concerned with delivering artistic effect more than facts simple and straight.’

So, ‘We might think that the very indeterminacy of literary translation would earn it more leeway, or more acceptance.’ And yet, ‘many sophisticated readers view translation as no more than a stopgap… it would be disingenuous to claim that the reader of a translation is truly experiencing, in all its aspects, the foreign-language work it represents, or that in reading any text transposed from one language into another there isn’t a degree of difference (which is not the same as loss). The heart of the matter lies in whether we conceive of a translation as a practical outcome, with qualities of its own that complement or even enhance the original, or as an unattainable ideal, whose best chance for relative legitimacy is to trace that original as closely as possible.’

Polizzotti goes on to catalogue various approaches and views of translation and then suggests what I, at least, would consider the best way to think of translation and the obvious need it attempts to fulfil: ‘If instead we take translators as artists in their own right, in partnership with (rather than servitude to) their source authors; if we think of translation as a dynamic process, a privileged form of reading that can illuminate the original and transfer its energy into a new context, then the act of representing a literary work in another language and culture becomes something altogether more meaningful. It provides a new way of looking at a text, and through that text, a world. In the best of cases, it allows for the emergence of an entirely new literary work, at once dependent on and independent of the one that prompted it – a work that neither subserviently follows the original nor competes with it, but rather that adds something of worth and of its own to the sum total of global literatures. This does not mean taking undue liberties with the original; rather, it means honouring that original by marshalling all of one’s talent and all of one’s inventiveness to render it felicitously in another language.

‘To present a work as aptly as possible, to recreate it in all its beauty and ugliness, grandeur and pettiness, takes sensitivity, empathy, flexibility, knowledge, attention, caring and tact. And, perhaps most of all, it takes respect for one’s own work, the belief that one’s translation is worth judging on its own merits (or flaws), and that, if done well, it can stand shoulder to shoulder with the original that inspired it.’

Polizzotti has nailed it. There’s a spirit inherent in good translation -one that inspires a confidence that the original intent of the author is appropriately and befittingly displayed.

One of the reasons I was drawn to Polizzotti’s essay was a recent book I read (in translation): The Elegance of the Hedgehog, by Muriel Barbery and translated from the original French by Alison Anderson. So seamless was the narrative, and so apt were the translated dialogues, I have to confess that I had difficulty believing the book had not originally been written in English. And as it stands, it is one of the most rewarding books I have experienced in years. I’m sure that Ms Barbery is well content with Anderson’s translation, not least because their efforts earned it accolades from various critics, including a posting on the New York Times best-seller list.

It seems to me that one could not expect more from a translator than that.

To hold, as it were, a mirror up to Nature

Who am I? No, really -where do I stop and something else begin? That’s not really as silly a question as it may first appear. Consider, for example, my need to remember something -an address, say. One method is to internalize it -encode it somehow in my brain, I suppose- but another, no less effective, is to write it down. So, if I choose the latter, is my pen (or keyboard, for that matter) now in some sense a functional part of me? Is it an extension of my brain? The result is the same: the address is available whenever I need it.

Ever since my university days, when I discovered the writings of the philosopher Alan Watts, I have been intrigued by his view of boundaries, and whether to consider them as things designed to separate, or to join. Skin was one example that I remember he discussed -does it define my limits, and enclose the me inside, or is it actually my link with the outside world? I hadn’t really thought much about it until then, but in the intervening years it has remained an idea that continues to fascinate me.

Clearly Watts was not alone in his interest in what constitutes an individual, nor in his speculations about the meaning of whatever identities individuals think they possess by virtue of their boundaries. There was an insightful article in Aeon by Derek Skillings, a biologist and philosopher of science at the University of Pennsylvania, entitled ‘Life is not easily bounded’: https://aeon.co/essays/what-constitutes-an-individual-organism-in-biology

‘Most of the time the living world appears to us as manageable chunks,’ he writes, ‘We know if we have one dog or two.’ Why then, is ‘the meaning of individuality … one of the oldest and most vexing problems in biology? …  Different accounts of individuality pick out different boundaries, like an overlapping Venn diagram drawn on top of a network of biotic interactions. This isn’t because of uncertainty or a lack of information; rather, the living world just exists in such a way that we need more than one account of individuality to understand it.’ But really, ‘the problem of individuality is (ironically enough) actually composed of two problems: identity and individuation. The problem of identity asks: ‘What does it mean for a thing to remain the same thing if it changes over time?’ The problem of individuation asks: ‘How do we tell things apart?’ Identity is fundamentally about the nature of sameness and continuity; individuation is about differences and breaks.’ So, ‘To pick something out in the world you need to know both what makes it one thing, and also what makes it different than other things – identity and individuation, sameness and difference.’

What about a forest -surely it is a crowd of individual trees? Well, one way of differentiating amongst individuals is to think about growth -a tree that is growing (in other words, continuing as more of the same)- and to contrast it with producing something new: as in reproduction. And yet even here, there is a difficulty: it’s hard to determine the individual identities of any trees that also grew from the original roots -for example, from a ‘nurse’ tree lying on the ground with shoots and saplings sprouting from it.

But it’s not only plants that confuse the issue. If reproduction -i.e. producing something new- counts as a different entity, then what about entities like bacteria? ‘These organisms tend to reproduce [albeit] by asexual division, dividing in half to produce two clones… and, failing mutation and sub-population differentiation, an entire population of bacteria would be considered a single individual.’ -whatever ‘individual’ might therefore mean.

And what about us, then? Surely we have boundaries, surely we are individuals created as unique entities by means of sexual reproduction. Surely we have identities. And yet, what of those other entities we carry with us through our lives -entities that not only act as symbiotes, but are also integrated so thoroughly into our metabolism that they contribute to such intimate functions as our immune systems, our weight and health, and even function as precursors for our neurotransmitters and hence our moods? I refer, of course, to the micro-organisms inhabiting our bowels -our microbiome. Clearly ‘other’ and yet essential to the functioning person I regard as ‘me’.

And yet, our gut bacteria are mostly acquired from the environment -including the bacteria colonizing our mother’s vagina and probably her breast milk- and so are not evolutionarily prescribed, nor thereby hereditarily transmitted. So, am I merely a we -picking up friends along the way? Well, consider mitochondria -the powerhouses of our cells. They were once free-living bacteria that adapted so well inside our cells that they, too, are integral to cell functioning but have lost the ability to survive separately; they are transmitted from generation to generation. So they are me, right…?

Again I have to ask just who is me? Or is the question essentially meaningless put like that? Given that I am a multitude, and more like a city than a single house, shouldn’t the question be who are we? The fact that all of us, at least in Western cultures, consider ourselves to be distinct entities -separate individuals with unique identities- makes me wonder about our evolutionary history.

Was there a time when we didn’t make the distinctions we do nowadays? A time when we thought of ourselves more as members of a group than as individuals? When, perhaps sensing that we were constantly interacting with things outside and inside us, the boundaries were less important? Is that how animals would say they see the world if they were able to tell us?

Does our very ability to communicate with each other with more sophistication create the critical difference? Is that what created hubris? In Greek tragedy, remember, hubris -excess pride and self-confidence- led inexorably to Nemesis, retributive justice. Were poets in that faraway time trying to tell people something they had forgotten? Is that what this is all about?

I wonder if Shakespeare, as about so many things, was also aware of our ignorance: ‘pride hath no other glass to show itself but pride, for supple knees feed arrogance and are the proud man’s fees.’

Plus ça change, eh?

Wast thou o’erlook’d, even in thy birth?

That Age can do some funny things to the mind seems fairly obvious. The accumulation of years brings with it a panoply of experience that, hopefully, enables a kind of personalized Weltanschauung to emerge -things begin to sort themselves on the proper shelves, and even if they remain difficult to retrieve, there is a satisfaction that they are there, if not completely codified.

Of course, admixed with any elder ruminations are the ever-present intimations of imminent mortality -but it’s not that Age constrains the thought process to memento mori, so much as a flourishing of its antithesis: memento vivere. Age is a time for reflection about one’s life with a perspective from further up the hill.

And yet, for all the experiential input, there are two time frames hidden from each of us -what happens after death is the obvious one to which most of us turn our attention as the final act draws to a close, but there is an equally shrouded area on which few of us spend any time: what, if anything, was preconceptual existence like? Is it the equivalent of death, perhaps minus the loss of an identity not yet acquired?

I wonder if it’s a subject more understandable to the very young, than the gnarled and aged. I remember the very first time I was taken to a movie theatre, somewhere around two or three years of age, I think. When I say ‘remember’, I mean to say I have only one recollection of the event: that of a speeding locomotive filmed in black-and-white from track level, and roaring over the camera. It was very exciting, but I remember my father being very puzzled when I confessed that I’d seen it before. I hadn’t, of course, as he patiently explained to me, and yet it seemed to me I’d seen the same thing years before.

No doubt it was my still-immature neurons trying to make sense of the world, but the picture seemed so intuitively obvious to me at the time. And through the years, the image has stayed with me, as snippets of childhood memories sometimes do, although with the meaning now sufficiently expurgated as to be innocuous, as well as devoid of any important significance.

And then, of course, there was the Bridey Murphy thing that was all the rage when I was growing up in the 1950s. I read the book The Search for Bridey Murphy in my early teenage years, about a Colorado woman, Virginia Tighe, who, under hypnotic regression in the early 1950s, claimed she was the reincarnation of an Irish woman, Bridey Murphy from Cork in the 19th century. I even went to see the movie of the same name as the book. It was all pretty well debunked subsequently, but I suppose it was enough, at a tender age, to make me wonder about what might have happened before I became me.

At any rate, I am puzzled about why the seeming non-existence prior to conception is not something we think about more often. True, we would likely have no identity to put into that side of the equation, nor, for that matter, the loss of anything like friends or, well, existence, on the other, but still it is a comparable void. A wonderful mystery every bit as compelling as death.

I suppose the issue resurfaced for me a few years ago when I had a very vivid dream about our three-score-and-ten of existence. I saw myself as a bubble rising through some boiling water. While I was the bubble, I thought of myself as singular and not only separate from, but possessing an identity totally differentiated and unique from everything else around me. My life was the time it took me to rise to the surface. And yet when I arrived there, and my bubble burst and disappeared, when the me inside dissolved in the air from which I started, it all made sense. In fact, the encapsulated journey itself was an aberration, as was the idea of identity…

The dream lay fallow for several years and then reawakened, Phoenix-like, when I discovered an essay in the online publication Aeon, by Alison Stone, a professor of philosophy at Lancaster University in the UK. https://aeon.co/ideas/thinking-about-ones-birth-is-as-uncanny-as-thinking-of-death

‘Many people feel anxious about the prospect of their death,’ she writes. ‘Indeed, some philosophers have argued that death anxiety is universal and that this anxiety bounds and organises human existence. But do we also suffer from birth anxiety? Perhaps. After all, we are all beings that are born as well as beings that die… Once we bear in mind that we are natal as well as mortal, we see some ways in which being born can also occasion anxiety.’

I don’t believe she is thinking of what it must feel like to be born, so much as the transition from, well, the nothing before sperm and egg meet, to a something -to a somebody. She quotes the thoughts of the bioethicist David Albert Jones in his 2004 book The Soul of the Embryo: ‘We might be telling someone of a memory or event and then realise that, at that time, the person in front of us did not even exist! … If we seriously consider the existence and the beginning of any one particular human being … we realise that it is something strange and profound.’

Stone continues, ‘I began to exist at a certain point in time, and there is something mysterious about this. I haven’t always been there; for aeons, events in the world unfolded without me. But the transition from nonexistence to existence seems so absolute that it is hard to comprehend how I can have passed across it… To compound the mystery further, there was no single crossing point. In reality, we don’t begin in [a] sudden, dramatic way… Rather, I came into existence gradually. When first conceived, I was a single cell (a zygote). Then I developed a formed body and began to have a rudimentary level of experience during gestation. And once out of my mother’s womb, I became involved in culture and relationships with others, and acquired a structured personality and history. Yet the zygote that I began as was still me, even though it had none of this.’ Wow -you see what I mean?

Stone seems to think that all this is rather distressing, but I disagree. All I feel is a sense of profound, unbounded wonder at it all. Reflecting on that time-before-time is not unweaving the rainbow, as Keats was said to have accused Newton of doing because he had destroyed its poetry by actually studying it.

In fact, I’m reminded of something the poet Kahlil Gibran wrote: And when you were a silent word upon Life’s quivering lips, I too was there, another silent word. Then life uttered us and we came down the years throbbing with memories of yesterday and with longing for tomorrow, for yesterday was death conquered and tomorrow was birth pursued.

I have to believe there will still be poetry in the world -with or without us…

Virtues we write in water

I’ve only recently stumbled on the concept of virtue signalling. The words seem self-explanatory enough, but their juxtaposition seems curious. I had always thought of virtue as being, if not invisible, then not openly displayed like chest hair or cleavage. Perhaps it’s my United Church lineage, or the fact that many of my formative years were spent in pre-Flood Winnipeg, but the idea of flaunting goodness still seems anathema to me -too social mediesque, I suppose.

Naturally, I am reminded of that line in Shakespeare’s Henry VIII: Men’s evil manners live in brass; their virtues we write in water. And, although I admit that I am perhaps woefully behind the times -and therefore, hopefully, immune from any accusations of what I have just disparaged- it seems to me that virtue disappears when advertised as such; it reappears as braggadocio. Vanity.

Because I had never heard of the issue, it was merely an accident that I came across it in an article in Aeon: https://aeon.co/ideas/is-virtue-signalling-a-perversion-of-morality

It was an essay written by Neil Levy, a senior research fellow of the Oxford Uehiro Centre for Practical Ethics and professor of philosophy at Macquarie University in Sydney. ‘Accusing someone of virtue signalling is to accuse them of a kind of hypocrisy. The accused person claims to be deeply concerned about some moral issue but their main concern is – so the argument goes – with themselves.’

And yet, as I just wrote, ‘Ironically, accusing others of virtue signalling might itself constitute virtue signalling – just signalling to a different audience… it moves the focus from the target of the moral claim to the person making it. It can therefore be used to avoid addressing the moral claim made.’ That’s worrisome: even discussing the concern casts a long shadow. But is that always ‘moral grandstanding’?

Levy wonders if ‘virtue signalling, or something like it, is a core function of moral discourse.’ Maybe you can’t even talk about virtue, without signalling it, and maybe it signals something important about you -like a peacock’s tail advertising its fitness.

The question to be asked about signalling, though, is whether it is costly (like the resources that are needed to create the tail), or enhances credibility -honesty, I suppose- (like the sacrifice that might be involved in outing, say, an intruder that might harm not only the group, but also the signaller). And while the latter case may also involve a significant cost, it may also earn a significant reward -not only cooperation in standing up en masse to the predator, let’s say, but also commendation for alerting the group: honour, prestige…

Seen in this light, Levy thinks, virtue signalling may in fact be a proclamation to the in-group -the tribe- identifying the signaller as a member. So would this virtue signalling occur when nobody else was around -when only the signaller would know of his own virtue? Would he (Okay, read I) give to charity anonymously? Help someone in need without identifying himself? And if so, would it still be virtue signalling, if only to himself? Is it even possible to be hypocritical to oneself…? Interesting questions.

Of course, memory is itself mutable, and so is it fair to criticize someone who honestly believes they acted honourably? Would it be legitimate to accuse them of virtue signalling, even if evidence suggested another version of the event?

Long ago, when I was a freshman living in Residence at university, a group of us decided to celebrate our newly found freedom from parental supervision and headed off to a sleazy pub near the school that catered to students and was known to be rather forgiving of minimum age requirements for drinks.

For some of us at least, alcohol had not been a particularly significant part of our high school experience and so I quickly found myself quite drunk. I woke up, apparently hours later, lying on my bed and none the wiser about the night. I was wearing my roommate’s clothes, and I could see mine lying clean and neatly folded on the chair beside my desk. My wallet and watch, along with a few coins were arranged carefully on top.

“You passed out in the pub,” Jeff explained when I tried, unsuccessfully, to sit up in bed. “I thought I’d better wash your clothes, after you were sick all over them,” he explained, smiling proudly at his charity. “Well, actually, Brenda put them in the washer -I’m not good at that kind of stuff.” He stared at me for a moment, shaking his head in mock disbelief. “Boy, you were really wasted! It took three of us to get you back…”

I remember trying to focus my eyes on him as I attempted to think about the evening, and then slumped back onto the pillow and slept for most of the morning.

My memory of the pub night is vague now, but I do remember going to the store the next day to buy something, and finding that, apart from the coins, I had no money left -none in the pockets of the freshly washed clothes, of course, but none of the money my parents had given me for my first month’s expenses that had been in my wallet either.

None of this is particularly consequential, I suppose, but it did surface at a class reunion many years later. Jeff was now a high school teacher, Brenda a lawyer, and I had just finished a medical residency and was about to open a consulting practice.

Jeff, as had always been his wont, was holding his own noisy court at the bar, and Brenda -now his wife- was glaring at him. He was slurring his words already, even though the socializing part of the evening had just begun.

Perhaps in an effort to deflect her attention he glanced around the room and when he saw me, waved.

“Remember old G?” he shouted to nobody in particular, and immediately embraced me as soon as I got close enough. I saw a few people I recognized, but even under Brenda’s worried look, Jeff wouldn’t let go of my arm. “G was my roomie…” Jeff explained and signalled the bartender for another beer with his free hand before Brenda waved him off. “He used to get so drunk,” he explained, although I had trouble untangling his words. “Thank the gods that I was around to take care of his, though…”

“His what?” I asked, largely to break the palpable tension between Jeff and Brenda.

Jeff looked surprised. “Take care of him…  Take care of you, roomie. You!” He looked at Brenda and finally let go of my hand. “One night he got so drunk, I had to carry him home, and then lend him my clothes because he’d been sick all over his own…”

The others in the group shuffled nervously and glanced at each other. Brenda seemed angry, but I just shrugged.

“That was good of you, Jeff,” I said. “I obviously needed help that night…” I hadn’t forgotten about the missing money, but now wasn’t the time to mention it.

The others smiled and nodded -rather hesitantly, I thought.

“But, that’s what a real friend does, eh?” Jeff added, as Brenda tugged on his arm to leave. She blinked self-consciously at me as she led him away from the bar. “Nice to see you again, G,” she said, her eyes silently apologizing to me. “Maybe we can talk later, eh…?”

I think she knew more about the missing money than she was willing to admit, even to friends.

Maybe we were all virtue-signalling, though…

Whodunnit?

Popular opinion to the contrary, it seems to me that there are advantages to cultural naïveté -well, literary innocence, at any rate. Being seduced into a novel or short story solely because of the reputation of the author, or the ravings of a friend, risks disappointment -if only in your friend’s lack of sophistication. And even if the choice was successful, there remains, for me at least, a lingering sense of manipulation, of being swept along in a crowd: just another nameless member of the flock. I would much prefer to watch it from the edge, untouched by all but the gentle murmur of its passing.

There is far more pleasure in the unguided discovery of a title or an author unbesmirched by popularity, and hiding, perhaps, in a used book store, or on the shelf of one of those take-one-give-one piles I seem to frequent at neighbourhood bus stops. For me, their anonymity -however transient- is an adventure. But I suppose I’ve always been drawn to the potential of the unsigned, the wisdom of the incognito with no particular affiliation. Graffiti -the polite ones anyway- can be compelling, too. With them, there is seldom need for attribution, and indeed, the recognition of authorship might well detract from the message, and relegate it to partisan politics rather than liberate it to a vox populi, if not a vox dei.

I had feared this was merely a personal conceit, a longing for an unspoiled hilltop from which to evaluate the countryside, but as sometimes happens, I discovered there were others who also wandered lonely as a cloud -although with much more erudition. Tom Geue is perhaps a good example. He is a lecturer in Latin in the School of Classics at the University of St Andrews in Scotland, and wrote a thought-provoking essay on anonymity for the online publication Aeon:  https://aeon.co/essays/lessons-from-ancient-rome-on-the-power-of-anonymity

‘Not knowing the author of a literary work does something powerful to the reader: it makes her experience the words as an exemplary, representative, far-reaching burst of culture, a spark of art that seems to transcend the limits of the singular intelligence… The potential of the anonymous work is in its ability to throw the reader into the realm of apparent universality.’

As a scholar of classical Latin literature, he illustrates many of his arguments with examples from the period. ‘Literature for the Romans was primarily the product of a singular intelligence… A literary text without authorship was often thought of as something dark, mysterious, lacking and disabled. In fact, a whole part-industry of scholarship sprouted up around securing attribution, making sure, that is, that the right texts had their proper authors, and that readers could know the worth of what they read…  Even when there was no clear single point of origin for a work – eg, when the authorship was genuinely shared – Ancient readers invented one: it could never just be the Iliad or the Odyssey; it had to be the Iliad or Odyssey of Homer. There was little space in the culture of authorship for works whose author was properly unknown; and many modern readers have inherited these exclusionary tastes.’

Despite -or maybe because of- the ‘anti-anonymity biases of the Classical canon’ though, Geue seems intrigued with an anonymous historical drama, Octavia, that he admits we have probably heard nothing about. ‘The play is an anonymous masterpiece, and it is about the divorce and exile of Nero’s first wife, Octavia, set in 62 CE. It stages the domestic tension and revolutionary springback of absolute power spinning out of control, and it does so with more ambition and urgency than almost any other piece of drama to survive from Ancient Rome.’ But it is unsigned for an obvious reason: probable political retribution if the author were known. And, as Geue suggests, ‘Names tame certain forces; anonymity unleashes them.’

I see that as a cause for concern, however: information -or propaganda- can obviously wreak havoc if it is false or unattributable. Graffiti are one thing, but social media is another. Since antiquity, it has always been important to know if the source of the information possessed enough expertise to justify acceptance -or was at least trustworthy and otherwise neutral. No doubt this is why Science and its scientists have hitherto enjoyed wide public acceptance. The recent rapid emergence of social media with its anonymous sources, and agenda-laden disinformation, however, has cast some deep shadows over expert opinions. To say the least, this is a troubling development.

And yet that type of writing is not what I am celebrating. Fact-driven compositions will likely continue to need scrutiny -to mislead is to harm, if only the Zeitgeist. But when we’re talking about literature and poetry, anonymity can be tantalizing. Enticing. Character and subject development, skillful storytelling along with evocative metaphors and a seductive plot-line are far more important than author identification in that idiom. Whether, in other words, the Iliad was actually written by a poet named Homer -if he even existed- or whether the stories are merely compilations of the works of many unnamed authors, subtracts nothing from the brilliance of their contents. I think the mystery adds to the allure.

There is beauty in discovery, there is wisdom in metaphors -but there is also a certain charm in the as yet unknown. My father was a Baptist, and came from a non-dancing, non-card-playing family, so his cursing was, well, imaginative to say the least. Most of his expressions were evocative of frustration, or folk wisdom -like ‘it’s not the size of the dog in the fight, but the size of the fight in the dog…’ That sort of thing.

Some, though, defied my childhood comprehension and vocabulary, and I assumed they were special remnants of a world I was too young to have experienced. There was a phrase he said that I always enjoyed: ‘jumped-up mackinaw’. It was my father’s favourite expression and it always made me laugh, so he would too, and then reach out and hug me. I’ve always associated the expression with what I loved about him: he made me happy.

It was long before Google and the internet, and I remember my friends thought ‘jumped-up’ was something bad: swearing. So with considerable trepidation, I asked a teacher what it meant one time after class when she seemed to be in a good mood.

“Well,” she said, after thinking about it, “I know about Mackinaw shirts… They were made of water-repellent wool, or something.” She looked at the ceiling for a moment. “Loggers wore them, I think…”

“So… what about the ‘jumped-up’ part?” I said, and watched her with anxious eyes.

I remember her smiling and shrugging her shoulders. “I don’t know why he’d say that, G. Maybe he read it somewhere, do you think?”

I could only think of the Reader’s Digest books in our bathroom, but I’d read most of them, too, and I was pretty sure I’d never seen it there. Apart from the Bible, I’d never seen him read much else. “I wonder who would write something like that,” I said, frustrated at being no closer to the meaning. “I don’t think it’s in the Bible, is it?”

She shook her head. “Sounds like an anonymous author, don’t you think?”

I looked at her, obviously puzzled at the word.

She smiled and explained. “Anonymous means unknown, or unnamed. So perhaps nobody knows who wrote it.”

After reading Geue’s essay, though, I remembered my father’s expression, and wondered if my teacher had been correct about the anonymity of its generation. I considered Googling it, but decided not to. After all, his expression defined my childhood as much as my father’s smile did, and I’m happy to think he wrote it. It’s ours -and I don’t need it to be from someone I don’t know.

Of course, maybe most of us are actually anonymous, anyway…

Fire burn, and cauldron bubble

I love it when I hear a new word, wrestle with a new concept. Pyrocene -don’t you adore it? Even just sounding it out quietly in your head, it’s hard to miss the excitement, or the imagery.

It takes its shape, as with all great epochs, by combining two Greek words: pur (or pyro), meaning ‘fire’, and kainos (rendered as the suffix -cene), meaning ‘new’. In other words, the Pyrocene is the fire epoch.

When you think about it, Pyrocene is an evocative and descriptive name for what has been going on for some time now. Fire has been tremendously important for our species. First came lightning, setting nature alight, and then, once we discovered we could tame fire, it kept us warm, it cooked, and it protected us from whatever predators remained afraid of it.

But that was just the beginning of our love affair: we began to invent new things it could do -like smelting metals, and boiling water to produce steam. All you needed was enough wood for fuel. And then, serendipitously no doubt, came the discovery of other less obvious sources that burned even hotter such as coal and, eventually, oil. It seems that hominids have embraced fire almost from the beginning; we are the fire-animal.

Unfortunately, fire seems to be in the news a lot lately -too much, in fact: bush fires, forest fires, the Amazon, Fort McMurray here in Canada, California, Europe, Australia… I can’t help but think of the poem by Goethe, The Sorcerer’s Apprentice -or at least its depiction in the animated Disney film Fantasia, in which Mickey Mouse, to the music of the unforgettable symphonic poem by Paul Dukas, tires of his job of cleaning the room of his mentor (the sorcerer) and tries to use magic to make the broom do it for him. He quickly loses control, however.

I have to admit that my thoughts about the history of fire were otherwise quite embryonic and unfocussed until I came across an epiphanic essay about Fire in Aeon, written by Stephen J Pyne, an emeritus professor of Life sciences at Arizona State University: https://aeon.co/essays/the-planet-is-burning-around-us-is-it-time-to-declare-the-pyrocene

He identifies different sources of fire -different ways of producing the energy: ‘Three fires now exist, and they interact in a kind of three-body dynamic. The first fire is nature’s. It has existed since plants first colonised continents… The second fire is humanity’s. It’s what humans have done as they moved from cooking food to cooking landscapes, and because it feeds on the same grasses, shrubs and woods as first-fire, the two fires compete for fuels: what one burns the other can’t, and neither can break beyond the ecological boundaries set by their biotic matrix… Third-fire transcends the others. It burns fossil biomass, a fuel which is outside the biotic box of the living world. Where third-fire flourishes, the others don’t, or can burn only in special preserves or as genuinely wild breakouts. After a period of transition, third-fire erases the others, leaving ecological messes behind. Because it doesn’t burn living landscapes, those combustibles grow and pile up and create conditions for more damaging burns; because it isn’t in a biotic box, its smoke can overwhelm local airsheds and its emissions can clog the global atmosphere.’

So, why does he feel the need for a new name for the epoch in which we live? I mean, we seem deluged by names -some admittedly hubristic and anthropocentric: centered mainly around us, as if everything revolved around our presence; Anthropocene comes to mind.

‘The Pleistocene began 2.58 million years ago. Unusually among geologic periods, it is characterised by climate. The Earth cooled and, atop that trend, it repeatedly toggled between frost and thaw, as 40-50 cycles switched between glacial ice and interglacial warmth. Some 90 per cent of the past 900,000 years have been icy. Our current epoch, the Holocene, is one of the interglacial warm spells, and most calculations reckon that the Earth is due – maybe overdue – to swing back to ice.’

But Pyne argues that we’re really still in the Pleistocene: ‘Other than the fact that it’s our time, and we are sufficiently special in our own eyes to merit our own era, there is little cause to have split it off from the Pleistocene… By the metrics that established the Pleistocene, the Pleistocene persists. Only humanity’s vanity insists on a secessional epoch. The ice will return… Or not. Something seems to have broken the rhythms. That something is us…

‘Or more usefully, among all the assorted ecological wobbles and biotic swerves that humans affect, the sapients negotiated a pact with fire. We created conditions that favoured more fire, and together we have so reworked the planet that we now have remade biotas, begun melting most of the relic ice, turned the atmosphere into a crock pot and the oceans into acid vats, and are sparking a sixth great extinction…  fire has become as much a cause and consequence as ice was before. We’re entering a Fire Age.’ And yet, in the old days, ‘there were limits to human-enabled burning. Burn too much, too quickly, and living landscape cannot recover, and the fires ebb. Once humans started burning fire’s lithic landscapes – fossil fuels – there seemed to be no such limits.’

Apart from nuclear energy -be it fission, or the long-promised fusion technology- the options currently available to power industry and society’s ever-increasing needs, seem in great need of innovative thinking. In a time of changing climatic conditions, reliable sources that are independent of the vagaries of weather events such as droughts or unexpected flooding, unpredictable or destructive winds, not to mention massive uncontrollable fires, are urgently required. Renewable technology is only as good as the foreseeable conditions upon which it depends.

Our addiction to fire has really left us with a Sophie’s choice: either accept the consequences of the damage it is doing to everything that allowed us to flourish in this geologically opportune -albeit temporary- interregnum between Ice-Ages, or… What? Abandon our overweening hubris and slip back into what forests still remain on the horizon’s edge -but this time aware that we are no more important, no more entitled than anything else that shares our world?

And yet, even then, would we make the same mistakes again…? Would our too-active brains mislead us once more? I don’t mean to end with an existential crisis, but I’m reminded of the observations of Shakespeare’s Macbeth -a creature of that old, untethered world: I have no spur to prick the sides of my intent, but only vaulting ambition, which o’erleaps itself, and falls on th’other. . . .

How much do warnings help?

Is my skin becoming too thick? Too insensitive to those things I want to feel? Need to feel? Or has it merely developed callus over areas too frequently assailed?

These are questions that I'm beginning to ask as I notice the burgeoning warnings on virtually every television channel that whatever follows may not be suitable for all audiences, or that parental discretion is advised. Curious as to what offence is about to be committed, I find myself more engaged in searching for possible misdemeanours than attending to the substance of the program -a minor diversion, to be sure, but nonetheless a distraction from evaluating the presentation of the subject matter.

I suppose these warnings are the polite thing to do, although they are certainly missing from the business of everyday life. Still, if there are people out there who feel compelled to guard themselves or their children from strong language, or upsetting scenes of violence, I do not begrudge them that -although I do wonder how they manage it elsewhere in their day.

And I certainly don’t want to see gratuitous savagery in a program purporting to educate me about poverty and how different jurisdictions are managing it. We all have our boundaries, and individual thresholds are sometimes hard to gauge, but perhaps our sights are set rather low on programs whose subject matter should be obvious from their titles. If I choose to watch a crime drama, or a documentary on the ravages of war, I would likely have factored in the probable contents before I tuned in. And if I’m being warned about the content of language I might hear, well, good luck walking past a school at recess, or even along the average city street.

At any rate, I’m beginning to question the motivation for these warnings. Does everything require a warning, or are they mainly hedges against possible lawsuits, or something? And perhaps more importantly, do the warnings actually work? Are there such things as ‘triggers’ whose very presence could cause serious injury unless stringently avoided? And what if I warn, but you do not heed? Would I then be at fault -or would it be you, for not listening…? I wrote about this subject a few years ago: (https://musingsonwomenshealth.com/2016/07/06/the-trigger-warning/) but I’m wondering how much we have learned about these triggers in the intervening years.

More recently, I came across an essay in Aeon that I hoped might shed some new light on the issue. https://aeon.co/ideas/trigger-warnings-dont-help-people-cope-with-distressing-material It was written by Christian Jarrett, a cognitive neuroscientist and a senior editor at Aeon. He too, it seemed, was conflicted about trigger warnings: ‘the use of trigger warnings has since spread… to educational institutions around the world, and further: into theatres, festivals and even news stories. The warnings have become another battlefield in the culture wars, with many seeing them as threatening free speech and the latest sign of ‘political correctness’ gone mad.’

And yet, ‘Ideology aside, one could make a basic ethical case for giving warnings in the sense that it’s the considerate thing to do.’ But is there any proof that a warning is helpful in avoiding psychological damage in people with a history of trauma, or painful memories similar to what is being warned about?

Jarrett cites the counterargument that, far from helping those sensitive to the issues because of past traumas, ‘trigger warnings enable survivors of trauma to avoid re-experiencing the negative associated emotions, [and] critics argue that the avoidance of potentially upsetting material is actually a counterproductive approach because it offers no chance to learn to manage one’s emotional reactions. As a result, fears deepen and catastrophic thoughts go unchallenged.’

I don’t think Jarrett is totally convinced of the evidence -much more study is required- but at least there are competing considerations in the management of so-called triggers. In fact, ‘On the question of whether trigger warnings give people the chance to brace themselves emotionally, a spate of recent studies suggest that this simply isn’t how the mind works.’

Fair enough, I guess -although I personally wouldn’t want to upset someone by mistake. However, he goes on to mention the concept of ‘coddling’ -as developed by ‘the attorney Greg Lukianoff and the social psychologist Jonathan Haidt, authors of the book The Coddling of the American Mind (2018) – namely, that these warnings encourage a belief in the vulnerability of people with a history of trauma and, in fact, in people’s vulnerability in general. For instance… Harvard research found that the use of trigger warnings increased participants’ belief in the vulnerability of people with post-traumatic stress disorder – an unwelcome effect that the researchers described as a form of ‘soft stigma’.’ In other words, not being able to talk through issues for fear of upsetting the other person -avoiding the subject altogether, and hence, treating the person ‘differently’.

But, as Jarrett reiterates, much more study is required to determine the benefits of ‘trigger warnings’. ‘Yet already the results are surprisingly consistent in undermining the specific claim that trigger warnings allow people to marshal some kind of mental defence mechanism. There is also a solid evidence base that avoidance is a harmful coping strategy for people recovering from trauma or dealing with anxiety. The clear message from psychology then is that trigger warnings should come with their own warning – they won’t achieve much, except encourage maladaptive coping and the belief that folk are sensitive and need protecting.’

As my previous essay hinted, I am still undecided about the value of these warnings. I would not knowingly wish to offend anybody, nor traumatize them by something I say, but it can be devilishly difficult to know what might need to be avoided. Every topic is a potential minefield, and yet surely part of the onus is on the recipient to choose -and therefore avoid, or at least warn the offending party about- what they find problematic. To be sure, sometimes the extent of the effect on them is unpredictable, or a surprise -for both sides- and yet if the subject is presented in a sensitive manner, one would expect much of the damage could be mitigated.

Communication, after all, is the exchange of information, but as much as we might try to soften the blow, it will not always be pleasant; to pretend that it will be is disingenuous, at the very least.

Do you understand why I am confused…?

Like madness, is the glory of this life

My grandmother was old when she died -very old, in fact: she died on the morning after her 100th birthday party. Her congratulatory letter from the Queen -or at least someone official claiming to speak for Her Majesty- came the day before. I’m not so sure it was congratulations, really -more a recognition that a subject of the United Kingdom, albeit an émigré, had still remained loyal to her and her dominions for a century.

My grandmother seemed to enjoy the party we held for her -she was all smiles and although she also seemed a bit confused by it all, she was delighted by the letter. It spoke to her of another life, I think -one that whispered the secrets of a little girl growing up in an English seaside town with a shingled beach and an amusement pier that offered tempting glimpses of a world across the sea -a world she couldn’t know would become her own for most of her life.

We all have lives like that -the present we occupy pales in depth, in colour, and even in meaning beside the worlds we have tasted in our incomparably longer past. It only seems appropriate that when our brains tire of sorting through the tyrannies of the moment, we default to the myriad memories of what we have lived. The past can be a comfortable place to rest -familiar, at the very least.

I loved visiting my aging granny -even in the hospital where she spent her final days she was always full of stories, full of wisdom, and full of wonder. And although often confused about current events, or what she’d had for breakfast that morning, her eyes would light up when I asked her to tell me about, say, her train journey across the country when she and grampa first arrived in the boat from England.

She would chuckle when she told me of the pioneer stoves they used to cook their food en route, and how each time the locomotive stopped to refill its water tank, everybody would make a mad dash from the railway coaches to find wood and occasional supplies from the little stations along the way. Her eyes would twinkle as she relived the flavours of whatever food they’d had, and she would laugh at the difficulty of cooking on the ever-moving stoves. She had no trouble remembering how everybody helped each other -she even remembered some of their names after more than eighty years.

So whenever she seemed confused at my visits or flustered by my questions about her health, I would smile and settle in a chair beside her and ask her what she remembered about ‘the old days’, as she decided to call them. After all, I think she lived there most of the time -it seemed a place where she was happy. At any rate, it seemed to calm her, and allow her to speak to me as if she were still in the summer garden she’d loved to show me on my visits years ago to the house she and her husband had built near Vancouver. There seemed to be no disorder in the garden, no anxious search for a constantly fading identity, nothing forgotten there -just flowers all around us, and birds singing in the bower of trees she’d planted so long ago.

She loved to speak from there, and even then -especially then- I was happy to sit there with her in her past. I lived happily in the two worlds, and she enjoyed meeting me there; like lovers we would float from dream to dream, escaping from the bewildering clatter of a crowded hospital ward. Who would not prefer her floral ‘then’ to her sterile ‘here-and-now’?

The staff told me of the problems with her confusion, and how she would sometimes wander off looking, as she told one of them, ‘for the garden’. And all the while around us, there were often moans and shouts, and irritable reactions to attempts to tame the ward. Sanity lay somewhere in the past -their patients’ past- but the department seemed hastily conceived as a holding area until beds became available in community nursing homes. Hospital was perhaps the wrong place for most of the elders -they were not sick except, perhaps, for home… or for something that reminded them of home, at any rate.

I have to say that I was pleasantly surprised to come across an essay on retrieving the autobiographical memory of demented seniors in Aeon: https://aeon.co/ideas/the-self-in-dementia-is-not-lost-and-can-be-reached-with-care

It was written by Muireann Irish, an associate professor of psychology at the University of Sydney. ‘Our autobiographical memory… seems crucial to weaving a life story that bridges past and present, and permits us to extrapolate how the future might unfold, all within a meaningful and coherent narrative. So what happens when the tapestry of memory begins to fray, and we lose access to defining memories from the past?’

There are many types of neurodegenerative loss -Alzheimer’s among them, of course- and each is progressive. ‘Gradually, as the disease spreads, more distant memories are affected, leading to patchy recall of self-defining events, such as one’s wedding day or the birth of one’s children.’ And without our memories, who are we…? ‘There remains a recalcitrant perception that in parallel with the progressive pathological onslaught in the brain is the inevitable demise of personhood, akin to a ‘living death’.’

But viewing dementia like that is not only depressing but incomplete, according to the author. ‘While the illness is devastating, not all memories are obliterated by Alzheimer’s, and much of the person’s general knowledge and recollection of the distant past is retained. There remains a vast repository of life experiences, personal history, stories and fables that endures, even late into the illness. At moderate to severe stages of dementia, activities such as art, dance and music therapy provide important nonverbal means of communicating and fostering social interaction even when, on the surface, many core capabilities might seem to be lost… As the disease progresses and their self-concept becomes more rooted in their past, people with dementia can feel increasingly divorced from their current surroundings, which no longer make sense or feel familiar. This is the catalyst for behaviours that are commonly couched as ‘challenging’, such as agitation, wandering, attempts to leave a care facility to ‘go home’.’

Irish suggests that instead of confronting the dementia with an enforced ‘now’, ‘a positive approach could be to create a ‘memory box’ in anticipation of the days to come. This could form a repository of photographs, keepsakes, newspaper clippings, objects with personal meaning, even fabrics and smells, that resonate with the person and provide an external memory store. Conversations regarding music and songs from the person’s formative years, and the memories that these tunes evoke, could inspire personalised playlists that foster social interaction and the springboard for reminiscence. For care staff, a memory store of this nature would be as important as taking a detailed medical history.’

As for my grandmother, I was happy to sit with her in her garden while she happily regaled me with stories of her past. And I’d like to think that after she received that letter from her queen, she retreated to the garden to read it again and again as her life washed over her like a cooling summer breeze, and the flowers whispered sweet nothings in her ear.