What about Now?

Now can be a tricky thing to police, I think; it keeps changing its clothes, and each time I think I finally recognize it, I realize I’ve mistaken it for somebody else. Someone from a different time, perhaps; someone who looks a lot like a friend in another place, but who is a stranger here with a similar face…

We should all try to live in the now, I’ve read, but where, exactly, is that? And if I ever did run across it as I wander along the streets of my life, how would I recognize it? Or, perhaps more to the point, how could I pause there long enough to know I was in the right place -long enough to use it before it vanished as if it never really was?

There’s a lot of mystery to a now, you have to admit. Quite apart from it being infinitely evanescent, I imagine each one of them is different, if only by shades. A now on, say, Thursday, is no doubt different from a now on any other day, although I’ve never stopped long enough to analyze the contents, let alone committed any one instance to memory well enough for an accurate comparison.

Still, even if each now is in fact unique, why should any one example be privileged over any other? With an ocean to choose from, what advantage can be accorded to a single drop? And anyway, if the drop merely attests to the value found all around it, and is merely a representative of the whole, then is it sort of like the trailer-teaser of a movie, or the sample of a product that is intended to interest you in buying more? In which case, it is the whole that is being advertised, not the part. The part is incomplete: one page of the story, only.

And is any previous now equivalent to any new one? If not, are there any characteristics that should mark it for special consumption? Or should we just draw lots, throw dice, to choose? Even if I could stop long enough to find a now and valorise it, I am concerned I’d end up being saddled with the wrong one. A plain one; a defective one…

Metaphysics is certainly confusing; I see why it, and the most famous of its three children -ontology- have become the province of the Philosophers. Fortunately I stumbled upon an essay on the now by John Martin Fischer, a professor of philosophy at the University of California, Riverside; I have to confess it took me innumerable nows to read it, however. There’s nothing surprising in that, I suppose -still, it made me wonder if I could ever stay put in a now. https://aeon.co/essays/the-metaphysical-claims-behind-the-injunction-to-be-in-the-now

Fischer outlines the belief of various adherents -religious and otherwise- that ‘although it might seem to us that other times – past and future – are appropriate targets of attention, we can come to understand (intellectually and affectively) that in a fundamental sense… there is only the now, and thus our attention should be focused on it.’ The singularity of the now.

He is not convinced of the uniqueness of any particular now, however; he suggests that although we are, in reality, only present in the now, it is actually a trivial observation. Indeed, ‘now is an indexical term. That is, it’s employed flexibly to point to the particular time when it’s used, not the same time every time it’s used. Similarly, the term here is an indexical term, employed flexibly to refer to the place where it’s uttered, not the same place wherever it’s uttered. Now is a temporal indexical, and here is a spatial indexical.’ I like that.

‘It’s thus not true that it’s always now, in the sense that it’s always the same time… Interpreted so that it’s correct, the intuitive idea that it’s always now doesn’t support the crucial inference that we should focus on the present because of its singularity.’ To paraphrase that apocryphal woman who, when challenged to answer what supported the turtle she believed held the world in place, it’s nows all the way down -an infinite number, in fact. Each may well be a singularity… but so what? What makes any one of them so special? There will no doubt be others each claiming to be exceptional, but only because they are indeed different from the rest.

As Fischer says, ‘it’s that there’s no necessity or inevitability to focusing only on the present moment, based on the fact (if it is a fact) that it’s the only moment that exists or is real.’ And, since it is obviously true that we can act neither in the past nor in the future, but only in the present -the now- then shouldn’t we try to stay in it…? Uhmm, I’m not convinced there’s an option, frankly. And anyway, there are inevitable consequences of acting in any given now that spread into the future and so are not a part of that special ‘singularity’. So, let me repeat, why is it so special -and why would I ever want to privilege it as if it actually contained something more than temporal instantaneity? After all, as Fischer points out, ‘every way of inhabiting the now (including ‘being here now’) is also a way of taking up the past and orienting ourselves to the future.’

No, I’m afraid I’m not really convinced there are any special values to the nows that flash past us like individual frames on a celluloid movie reel. It’s the movie as a whole that is ultimately what each now contributes to: the story. That’s where we all live, after all.

I suppose that if we find the story unpleasant in passages, we might benefit by pausing for a now or two -perhaps in meditation, or conversation with a friend- but in the end, we have to join the succession of picture frames and get on with our lives. It’s how it works.

As Fischer concludes, ‘We have a choice about what we focus on, a choice not dictated by the unique present, if there is one. We are free to choose how we wish to be. We should indeed be here now, but not because the now is all we have.’ We think in Time, we love in Time, we live in Time. Perhaps we should enjoy what we have left of it. All of it…!

Fake lies?

Recently, I’ve been thinking a lot about truth, but not for the reasons you might expect. Not because of the abundance of ‘fake news’ about which we seem to be constantly reminded, and not necessarily because I’ve been occasionally embarrassed in a lie, nor because of the tangled web you wove when first you practiced to deceive.

Fake news and deception, not to mention outright lies, have been in the headlines in recent years, but deception is certainly not unique to our era -nor even our species. Think of birds feigning injury to distract predators from their nests, cowbirds that lay their eggs in other nests to trick the host mothers into raising their alien young, or squirrels that pretend to bury acorns in one place but, in case they were observed, actually keep them in their mouths while they find another spot to cache them.

I grant near-universality to the practice of intended deception -especially where there is something being protected, if only reputation or status. And, given its ubiquity and seemingly relentless practice in humans, it has a long history of ethical debate. Deception, of course, is different from lying -deception is more a case of misleading, whereas lying is saying something known to be false.

I am concerned by something a little different, however. I am vexed by what, at first glance, would seem to be a more trivial concern: does a writer of fiction actually lie? And if the medium is one that does not purport to be factual -a novel, say- is it even possible? How important is truth in a fictive world -as long as it is internally consistent? A character in that story can lie, to be sure, but how analogous is that to a real-life character doing the same thing?

Writers have strange thoughts -perhaps that’s why they end up writing- but nonetheless I have been curious about this for some time now. I wonder about the ethics of fiction -not malicious, or scandalous fiction, you understand (although I suspect even those are merely the far edge of the spectrum). As it applies to writing, the very definition of ‘fiction’ -from the Latin fingere, to contrive- suggests imaginative creation, not investigative reportage where false attributions are indeed ethically problematic.

I’ve written fiction for years now (putting aside the fact that I am not at all widely published) so have I been lying all these years? If one of my characters lies, or deceives, and it happens to be read by someone in the ‘real-world’ -trespassing, in other words- have those lies in some sense transgressed the real-world ethics? Soiled our nest?

You’re right, it is perhaps a trifling concern, and yet bothersome nonetheless; I despaired of ever seeing it as the subject of an understandable evaluation. But, on one of my wide-eyed explorations, I happened upon a thoughtful essay by Emar Maier, an assistant professor of philosophy at the University of Groningen. https://aeon.co/essays/how-to-tell-fact-from-fiction-in-fiction-and-other-forms-of-lies

He starts by considering the work of another philosopher, H.P. Grice, who holds that ‘it all comes down to the assumption that communication is fundamentally a cooperative endeavour,’ and postulates what seem to be almost ‘Golden Rule’ maxims of quality in communication: ‘‘do not say what you believe to be false’ and ‘do not say that for which you have insufficient evidence’.’ And yet, we violate these all the time -we tell jokes, we exaggerate, we deceive, we use metaphors, we use sarcasm, and, of course, we tell stories. ‘In all of these cases there is a clear sense in which we are not really presenting the truth, as we know it, based on the best available evidence. But there are vast differences between these phenomena. For instance, while some constitute morally objectionable behaviour, others are associated with art and poetry.’

There is a difference, though, between violating one of Grice’s norms, and flouting it with, say, a sigh and rolling of the eyes. However untrue the assertion, it is readily recognizable as an exaggeration or even a lie that is not meant to be taken as true. On the other hand, ‘Liars… violate the same maxim, but they don’t flout it. Theirs is a covert violation, and hence lying has an altogether different effect on the interpreter than irony, sarcasm or metaphor.’

Fiction, however, is more complicated. A work of fiction ‘consists of speech acts that, for the most part, look like ordinary assertions.’ And yet, ‘As with lies and irony, there is no dedicated grammar or style for constructing fictional statements that would reliably distinguish them from regular assertions.’

So, ‘Is fiction more like the covert violation of the liar, or like the overt violation of the ironical speaker? Unlike the liar, the fiction author doesn’t hide her untruthful intentions.’ There are two ways to look at this, Maier says: either that ‘both fiction and lying are quality-violating assertions – ie, speech acts presenting something believed to be false as if it’s known truth’ or ‘we can analyse fictional discourse as constituting a different type of speech act, where the usual norms and maxims don’t apply in the first place.’

‘[T]he idea that both lying and fiction are just assertions of known falsehoods can be traced back to eminent philosophers such as Plato, who wanted to ban poets from his ideal society, [and] David Hume who called them ‘liars by profession’’.

I, however, am more convinced by the opinion of Albert Camus, who believed that ‘fiction is the lie through which we tell the truth’. At any rate, Maier goes on to observe that a ‘striking difference between fictional statements and lies is the fact that, while most lies are simply false… many philosophers have argued that the statements making up a work of fiction, even those involving clearly nonexistent entities, are not really false, but at least ‘in some sense’ true – viz… true relative to the fictional world in question.’ Now we’re getting somewhere -it’s context that matters.

A second difference between fiction and lies is the emotional response -the so-called paradox of fiction. ‘[W]orks of fiction induce… a ‘willing suspension of disbelief’, allowing us to be emotionally engaged with commonly known falsehoods. Lies evidently lack this property: once a lie is exposed, suspension of disbelief and emotional engagement in accordance with the story’s content become impossible… the difference between fictional statements and regular communicative assertions lies not in some hidden logical operators in the fictional assertion, but in the fact that telling fictional stories is an altogether different speech act from the act of assertion that makes up our talk about the weather, or our newspaper reporting.’ Kind of what I suspected all along. ‘As the English poet and soldier Sir Philip Sidney put it in The Defence of Poesy (1595): ‘Now for the poet, he nothing affirmeth, and therefore never lieth.’’

So, ‘it seems that fiction and lying are mutually exclusive, for they belong to distinct speech act categories, conform to different norms, and affect different cognitive states… since it is the text itself that generates the fictional world, the statements that make up that text should automatically become true in that world. When George Orwell wrote that ‘the clocks were striking thirteen’, it thereby became true in the fictional world of Nineteen Eighty-Four (1949) that the clocks were striking thirteen. Unlike for the historian or the journalist, there is no relevant world outside the text, relative to which we could fact-check whether Orwell miscounted. This line of argument can be summed up in the principle of authorial authority: the statements that make up a work of fiction are true in that fiction.’

Of course there are things like ‘imaginative resistance’ where internal inconsistencies disrupt belief, but writers -and certainly proofreaders and editors- are pretty good at resolving these gaffes before they are hung out to air on the clothesline of publication.

At any rate, I’m not sure I’ve discovered many immutable truths in Maier’s treatment of fictive lying, but I feel better about my own ethics of make-believe. I do still wonder about the boundary markers at that razor-thin edge where well-written fiction seems real and induces real emotion. I suppose edges are usually like that, though: porous…


Remember Plato’s Cave allegory? In his Republic he describes a scenario in which some people have spent their lives chained in a cave so they can only see the wall in front of them. There is a fire behind them that casts shadows on the wall that they have no way of knowing are only shadows. For these people, the shadows are their reality. There seem to be many versions of what happens next, but the variation I prefer is that one of the people escapes from the cave and discovers the sun outside for the first time; he realizes that what he had assumed was real -the shadows- was just that: merely the shadows cast by the actual objects themselves.

Sometimes we become so accustomed to seeing things a certain way, we find it difficult, if not impossible, to believe there could be an alternative view. Or assume that, even if there were another version, it must be wrong. But can we be sure that we are evaluating the alternative fairly and without prejudice? Can we assess it with sufficient objectivity to allow a critical analysis? Or are we unavoidably trapped in the prevailing contemporary Weltanschauung? It’s an interesting question to be sure, and one that begs for examination, if only to explore the world behind the mirror.

I stumbled upon an essay by Julie Reshe, a psychoanalyst and philosopher, who, after recovering from a bout of depression, began to wonder whether depression itself was actually the baseline state, and one that allowed a more accurate view of how things actually are: https://aeon.co/essays/the-voice-of-sadness-is-censored-as-sick-what-if-its-sane

I have to admit that I had to temporarily divorce myself from my usually optimistic worldview to be able to fathom her argument, and I found it rather uncomfortable. But sometimes it’s instructive -even valuable- to look under a rock.

As a philosopher, Reshe felt the need to examine both sides of an argument critically, putting aside preconceptions and biases. ‘Depressogenic thoughts are unpleasant and even unbearable,’ she writes, ‘but this doesn’t necessarily mean that they are distorted representations of reality. What if reality truly sucks and, while depressed, we lose the very illusions that help us to not realise this? What if, to the contrary, positive thinking represents a biased grasp of reality? … What if it was a collapse of illusions – the collapse of unrealistic thinking – and the glimpse of a reality that actually caused my anxiety? What if, when depressed, we actually perceive reality more accurately? What if both my need to be happy and the demand of psychotherapy to heal depression are based on the same illusion?’ In other words, what if I am actually not a nice person? What if there’s a reason people don’t like me?

Whoa! Suppose this is not a counterfactual? After all, other philosophers have wondered about this. For example, Arthur Schopenhauer, whose deeply pessimistic writings about the lack of meaning and purpose of existence I have never understood, or the equally famous German philosopher Martin Heidegger, who felt that anxiety was the basic mode of human existence. ‘We mostly live inauthentically in our everyday lives, where we are immersed in everyday tasks, troubles and worries, so that our awareness of the futility and meaninglessness of our existence is silenced by everyday noise… But the authentic life is disclosed only in anxiety.’ My god, where’s my knife?

And even Freud wasn’t optimistic about the outcome of psychotherapeutic treatment, and was ‘reluctant to promise happiness as a result.’ He felt that ‘psychoanalysis could transform hysterical misery into ‘common unhappiness’’. Great…

And then, of course, there’s the philosophical tradition called ‘depressive realism’ which holds that ‘reality is always more transparent through a depressed person’s lens.’ And just to add more poison to the cake, ‘the Australian social psychologist Joseph Forgas and colleagues showed that sadness reinforces critical thinking: it helps people reduce judgmental bias, improve attention, increase perseverance, and generally promotes a more skeptical, detailed and attentive thinking style.’

All of which is to say, I suppose, ‘The evolutionary function of depression is to develop analytical thinking mechanisms and to assist in solving complex mental problems. Depressive rumination helps us to concentrate and solve the problems we are ruminating about… depressive rumination is a problem-solving mechanism that draws attention to and promotes analysis of certain problems.’

I have presented these deeply troubling ideas merely as an exercise in perspective, I hasten to add. Sometimes it is valuable to try to grasp the Umwelt of the stranger on the other side of the door before we open it. We can only help if we are willing to understand why they are there.

Part of the solution may lie in puzzling out Reshe’s counterfactuals. She seems to want to assign meaning to her former depression, as have many of the other people she mentions, to buttress her point. She also seems to feel that there was a time when that point of view might have seemed more mainstream. That nowadays there is just too much expectation of happiness -unrealistic expectations by and large, which presents a problem in and of itself. If we constantly expect to achieve a goal, but, like a prairie horizon, it remains temptingly close and yet just out of reach, we are doomed to frustration -a self-fulfilling prophecy.

And yet, it seems to me that resigning oneself to unhappiness, or its cousin depression, doesn’t represent a paradigm shift, but rather a rationalization that it must be the default position -and therefore must serve some useful evolutionary purpose; a position benighted and stigmatized because it advertises the owner’s failure to achieve the goal that others seem to have realized.

I’m certainly not disparaging depression, but neither am I willing to accept that it serves any evolutionary strategy except that of a temporary, albeit necessary harbour until the storm passes. And to suggest that positive emotions -happiness, contentment, joy, or pleasure, to name just a few- however short-lived, are illusory, and unrealistic expectations, is merely to excuse and perhaps justify an approach to depression that isn’t working. A trail that only wanders further into the woods.

I’m certainly cognizant of the fact that there is a spectrum of depressions, from ennui to psychosis, and that some are more refractory to resolution than others, but that very fact argues against leaving them to strengthen, lest they progress to an even more untenable and dangerous state.

Perhaps we need to comfort ourselves with the ever-changing, ever-contrasting nature of emotions, and not expect of them a permanence they were likely never evolved to achieve.

Goldilocks, it seems to me, realized something rather profound when she chose the baby bear’s porridge after finding papa bear’s porridge too hot, and mamma bear’s too cold: it was just right…

Imagination bodies forth the forms of things unknown

The poet’s eye, in fine frenzy rolling,
Doth glance from heaven to earth, from earth to heaven;
And as imagination bodies forth
The forms of things unknown, the poet’s pen
Turns them to shapes and gives to airy nothing
A local habitation and a name.
Such tricks hath strong imagination.                                                                                                                   
Theseus, in Shakespeare’s A Midsummer Night’s Dream

Shakespeare had a keen appreciation of the value of imagination, as that quote from A Midsummer Night’s Dream suggests. But what is imagination? Is it a luxury -a chance evolutionary exaptation of some otherwise less essential neural circuit- or a purpose-made system to analyse novel features in the environment? A mechanism for evaluating counterfactuals -the what-ifs?

A quirkier question, perhaps, would be to ask if it might predate language itself -be the framework, the scaffolding upon which words and thoughts are draped. Or is that merely another chicken versus egg conundrum drummed up by an overactive imagination?

I suppose what I’m really asking is why it exists at all. Does poetry or its ilk serve an evolutionary purpose? Do dreams? Does one’s Muse…? All interesting questions for sure, but perhaps the wrong ones with which to begin the quest to understand.

I doubt that there is a specific gene for imagination; it seems to me it may be far more global than could be encompassed by one set of genetic instructions. In what we would consider proto-humans, it may have involved more primitive components: non-linguistic features such as emotion -fear, elation, confusion- but also bodily responses to external stimuli: a moving tickle in that interregnum between sleep and wakefulness might have been interpreted as a spider and generated a muscular reaction whether or not there was an actual creature crawling on the skin.

Imagination, in other words, may not be an all-or-nothing feature unique to Homo sapiens. It may be a series of adaptations to the exigencies of life that eventuated in what we would currently recognize as our human creativity.

I have to say, it’s interesting what you can find if you keep your mind, as well as your eyes, open. I wasn’t actively searching for an essay on imagination -although perhaps on some level, I was… At any rate, on whatever level, I happened upon an essay by Stephen T Asma, a professor of philosophy at Columbia College Chicago, and his approach fascinated me. https://aeon.co/essays/imagination-is-such-an-ancient-ability-it-might-precede-language

‘Imagination is intrinsic to our inner lives. You could even say that it makes up a ‘second universe’ inside our heads. We invent animals and events that don’t exist, we rerun history with alternative outcomes, we envision social and moral utopias, we revel in fantasy art, and we meditate both on what we could have been and on what we might become… We should think of the imagination as an archaeologist might think about a rich dig site, with layers of capacities, overlaid with one another. It emerges slowly over vast stretches of time, a punctuated equilibrium process that builds upon our shared animal inheritance.’

Interestingly, many archaeologists seem to conflate the emergence of imagination with the appearance of artistic endeavours –‘premised on the relatively late appearance of cave art in the Upper Paleolithic period (c38,000 years ago)… [and] that imagination evolves late, after language, and the cave paintings are a sign of modern minds at work.’

Asma sees the sequence rather differently, however: ‘Thinking and communicating are vastly improved by language, it is true. But ‘thinking with imagery’ and even ‘thinking with the body’ must have preceded language by hundreds of thousands of years. It is part of our mammalian inheritance to read, store and retrieve emotionally coded representations of the world, and we do this via conditioned associations, not propositional coding.’

Further, Asma supposes that ‘Animals appear to use images (visual, auditory, olfactory memories) to navigate novel territories and problems. For early humans, a kind of cognitive gap opened up between stimulus and response – a gap that created the possibility of having multiple responses to a perception, rather than one immediate response. This gap was crucial for the imagination: it created an inner space in our minds. The next step was that early human brains began to generate information, rather than merely record and process it – we began to create representations of things that never were but might be.’ I love his idea of a ‘cognitive gap’. It imagines (sorry) a cognitive area where something novel could be developed and improved over time.

I’m not sure that I totally understand all of the evidence he cites to bolster his contention, though -for example, the view of philosopher Mark Johnson at the University of Oregon that there are ‘deep embodied metaphorical structures within language itself, and meaning is rooted in the body (not the head).’ Although, ‘Rather than being based in words, meaning stems from the actions associated with a perception or image. Even when seemingly neutral lexical terms are processed by our brains, we find a deeper simulation system of images.’ But at any rate, Asma summarizes his own thoughts more concisely, I think: ‘The imagination, then, is a layer of mind above purely behaviourist stimulus-and-response, but below linguistic metaphors and propositional meaning.’

In other words, you don’t need to have language for imagination. But the discipline of biosemantics tries to envisage how it might have developed in other animals. ‘[Primates] have a kind of task grammar for doing complex series of actions, such as processing inedible plants into edible food. Gorillas, for example, eat stinging nettles only after an elaborate harvesting and leave-folding [sic] sequence, otherwise their mouths will be lacerated by the many barbs. This is a level of problem-solving that seeks smarter moves (and ‘banks’ successes and failures) between the body and the environment. This kind of motor sequencing might be the first level of improvisational and imaginative grammar. Images and behaviour sequences could be rearranged in the mind via the task grammar, long before language emerged. Only much later did we start thinking with linguistic symbols. While increasingly abstract symbols – such as words – intensified the decoupling of representations and simulations from immediate experience, they created and carried meaning by triggering ancient embodied systems (such as emotions) in the storytellers and story audiences.’ So, as a result, ‘The imaginative musician, dancer, athlete or engineer is drawing directly on the prelinguistic reservoir of meaning.’

Imagination has been lauded as a generator of progress, and derided as idle speculation throughout our tumultuous history, but there’s no denying its power: ‘The imagination – whether pictorial or later linguistic – is especially good at emotional communication, and this might have evolved because emotional information drives action and shapes adaptive behaviour. We have to remember that the imagination itself started as an adaptation in a hostile world, among social primates, so perhaps it is not surprising that a good storyteller, painter or singer can manipulate my internal second universe by triggering counterfactual images and events in my mind that carry an intense emotional charge.’

Without imagination, we cannot hope to appreciate the Shakespeare who also wrote, in his play Richard III:

Princes have but their titles for their glories,
An outward honor for an inward toil,
And, for unfelt imaginations,
They often feel a world of restless cares.

Personally, I cannot even imagine a world where imagination doesn’t play such a crucial role… Or can I…?


Is the thing translated still the thing?

When I was a student at University, translated Japanese haiku poetry was all the rage; it seemed to capture the Zeitgeist of the generation to which I had been assigned. I was swept along with others by the simple nature images, but -much like the sonnet, I suppose- I failed to realize how highly structured it was. In fact, I can’t really remember all of its complex requirements -but maybe that’s the beauty of its seeming simplicity in English. However, the word ‘haiku’ -a contraction of the Japanese haikai no ku, meaning ‘light verse’- belies the difficulty of translating the poetry into a foreign language while still conserving its structure, its meaning, and also its beauty.

It seems to me that the ability to preserve these things in translation while still engaging the interest of the reader requires no less genius than that of the original creator. While, in poetry as well as in the narrative of story, the authors’ ideas -their images, plots and metaphors- are an intrinsic part of the whole, the concepts are sometimes difficult to convey to a foreign culture. So, what to do with them to maintain the thrust of the original while not altering the charm? And when does the translation actually become a different work of art and suggest the need for a different attribution?

Given my longstanding love for poetry and literature, I have often wondered whether I could truly understand the poetry of, say, Rumi who wrote mainly in Persian but also in Turkish, Greek and Arabic; or maybe, the more contemporary Rilke’s German language poetry. I speak none of those languages, nor do I pretend to understand the Umwelten of their times, so how do I know what it is that attracts me, apart from the beauty of their translations? Is it merely the universality of their themes, and perhaps my mistaken interpretations of the images and metaphors, or is there something else that seeps through, thanks to -or perhaps in spite of- the necessary rewording?

Since those heady days in university, I have read many attempts to explain, and even to justify, various methods of translation, and they all seem to adhere to one or both of the only two available procedures: paraphrasing, or metaphrasing (translating word for word). And no matter which is used, I have to wonder if the product is always the poor cousin of the original.

In one of the seminars from university, I remember learning that as far back as St. Augustine and St. Jerome, there was disagreement about how to translate the Bible -Augustine favoured metaphrasis, whereas Jerome felt that there was ‘a grace of something well said’. Jerome’s appealing phrase has stayed with me all these years. Evidently, the problem of translation goes even further back in history though, and yet the best method of preserving the author’s intention is still no closer to being resolved.

In my abiding hope for answers, I still continue to search. One such more recent forage led me to an essay in the online publication Aeon by the American translator and author Mark Polizzotti (who, among other honours, is a Chevalier of the Ordre des Arts et des Lettres, the recipient of a 2016 American Academy of Arts and Letters award for literature, and a publisher and editor-in-chief at the Metropolitan Museum of Art in New York). https://aeon.co/essays/is-the-translator-a-servant-of-the-text-or-an-original-artist

He writes, ‘as the need for global communication grows by proverbial leaps, the efficiency of machine-based translation starts looking rather attractive. In this regard, a ‘good’ translation might simply be one that conveys the requisite bytes of information in the shortest time. But translation is about more than data transmission, and its success is not always easy to quantify. This becomes particularly true in the literary sphere: concerned with delivering artistic effect more than facts simple and straight.’

So, ‘We might think that the very indeterminacy of literary translation would earn it more leeway, or more acceptance.’ And yet, ‘many sophisticated readers view translation as no more than a stopgap… it would be disingenuous to claim that the reader of a translation is truly experiencing, in all its aspects, the foreign-language work it represents, or that in reading any text transposed from one language into another there isn’t a degree of difference (which is not the same as loss). The heart of the matter lies in whether we conceive of a translation as a practical outcome, with qualities of its own that complement or even enhance the original, or as an unattainable ideal, whose best chance for relative legitimacy is to trace that original as closely as possible.’

Polizzotti goes on to catalogue various approaches and views of translation and then suggests what I, at least, would consider the best way to think of translation and the obvious need it attempts to fulfil: ‘If instead we take translators as artists in their own right, in partnership with (rather than servitude to) their source authors; if we think of translation as a dynamic process, a privileged form of reading that can illuminate the original and transfer its energy into a new context, then the act of representing a literary work in another language and culture becomes something altogether more meaningful. It provides a new way of looking at a text, and through that text, a world. In the best of cases, it allows for the emergence of an entirely new literary work, at once dependent on and independent of the one that prompted it – a work that neither subserviently follows the original nor competes with it, but rather that adds something of worth and of its own to the sum total of global literatures. This does not mean taking undue liberties with the original; rather, it means honouring that original by marshalling all of one’s talent and all of one’s inventiveness to render it felicitously in another language.

‘To present a work as aptly as possible, to recreate it in all its beauty and ugliness, grandeur and pettiness, takes sensitivity, empathy, flexibility, knowledge, attention, caring and tact. And, perhaps most of all, it takes respect for one’s own work, the belief that one’s translation is worth judging on its own merits (or flaws), and that, if done well, it can stand shoulder to shoulder with the original that inspired it.’

Polizzotti has nailed it. There’s a spirit inherent in good translation -one that inspires confidence that the original intent of the author is appropriately and befittingly displayed.

One of the reasons I was drawn to Polizzotti’s essay was a recent book I read (in translation): The Elegance of the Hedgehog, by Muriel Barbery, translated from the original French by Alison Anderson. So seamless was the narrative, and so apt were the translated dialogues, I have to confess that I had difficulty believing the book had not originally been written in English. And as it stands, it is one of the most rewarding books I have experienced in years. I’m sure that Ms Barbery is well content with Anderson’s translation, not least because their efforts earned it accolades from various critics, including a posting on the New York Times best-seller list.

It seems to me that one could not expect more from a translator than that.

To hold, as it were, a mirror up to Nature

Who am I? No, really -where do I stop and something else begin? That’s not really as silly a question as it may first appear. Consider, for example, my need to remember something -an address, say. One method is to internalize it -encode it somehow in my brain, I suppose- but another, no less effective, is to write it down. So, if I choose the latter, is my pen (or keyboard, for that matter) now in some sense a functional part of me? Is it an extension of my brain? The result is the same: the address is available whenever I need it.

Ever since my university days, when I discovered the writings of the philosopher Alan Watts, I have been intrigued by his view of boundaries, and whether to consider them as things designed to separate, or to join. Skin, was one example that I remember he discussed -does it define my limits, and enclose the me inside, or is it actually my link with the outside world? I hadn’t really thought much about it until then, but in the intervening years it has remained an idea that continues to fascinate me.

Clearly Watts was not alone in his interest in what constitutes an individual, nor in his speculations about the meaning of whatever identities individuals think they possess by virtue of their boundaries. There was an insightful article in Aeon by Derek Skillings, a biologist and philosopher of science at the University of Pennsylvania, entitled ‘Life is not easily bounded’: https://aeon.co/essays/what-constitutes-an-individual-organism-in-biology

‘Most of the time the living world appears to us as manageable chunks,’ he writes, ‘We know if we have one dog or two.’ Why then, is ‘the meaning of individuality … one of the oldest and most vexing problems in biology? …  Different accounts of individuality pick out different boundaries, like an overlapping Venn diagram drawn on top of a network of biotic interactions. This isn’t because of uncertainty or a lack of information; rather, the living world just exists in such a way that we need more than one account of individuality to understand it.’ But really, ‘the problem of individuality is (ironically enough) actually composed of two problems: identity and individuation. The problem of identity asks: ‘What does it mean for a thing to remain the same thing if it changes over time?’ The problem of individuation asks: ‘How do we tell things apart?’ Identity is fundamentally about the nature of sameness and continuity; individuation is about differences and breaks.’ So, ‘To pick something out in the world you need to know both what makes it one thing, and also what makes it different than other things – identity and individuation, sameness and difference.’

What about a forest -surely it is a crowd of individual trees? Well, one way of differentiating amongst individuals is to think about growth -a tree that is growing (in other words, continuing as more of the same)- and contrasting it with producing something new: as in reproduction. And yet even here there is a difficulty: it is hard to determine the individual identities of any trees that also grew from the original roots -for example, from a ‘nurse’ tree lying on the ground with shoots and saplings sprouting from it.

But it’s not only plants that confuse the issue. If reproduction -i.e. producing something new– counts as a different entity, then what about entities like bacteria? ‘These organisms tend to reproduce [albeit] by asexual division, dividing in half to produce two clones… and, failing mutation and sub-population differentiation, an entire population of bacteria would be considered a single individual.’ -whatever ‘individual’ might therefore mean.

And what about us, then? Surely we have boundaries, surely we are individuals created as unique entities by means of sexual reproduction. Surely we have identities. And yet, what of those other entities we carry with us through our lives -entities that not only act as symbiotes, but are also integrated so thoroughly into our metabolism that they contribute to such intimate functions as our immune systems, our weight and health, and even function as precursors for our neurotransmitters and hence our moods? I refer, of course, to the micro-organisms inhabiting our bowels -our microbiome. Clearly ‘other’ and yet essential to the functioning person I regard as ‘me’.

And yet, our gut bacteria are mostly acquired from the environment -including the bacteria colonizing our mother’s vagina and probably her breast milk- and so are not evolutionarily prescribed, nor thereby hereditarily transmitted. So, am I merely a we –picking up friends along the way? Well, consider mitochondria -the powerhouse of our cells. They were once free-living bacteria that adapted so well inside our cells that they, too, are integral to cell functioning but have lost the ability to survive separately; they are transmitted from generation to generation. So they are me, right…?

Again I have to ask just who is me? Or is the question essentially meaningless put like that? Given that I am a multitude, and more like a city than a single house, shouldn’t the question be who are we? The fact that all of us, at least in Western cultures, consider ourselves to be distinct entities -separate individuals with unique identities- makes me wonder about our evolutionary history.

Was there a time when we didn’t make the distinctions we do nowadays? A time when we thought of ourselves more as members of a group than as individuals? When, perhaps sensing that we were constantly interacting with things outside and inside us, the boundaries were less important? Is that how animals would say they see the world if they were able to tell us?

Does our very ability to communicate with each other with more sophistication create the critical difference? Is that what created hubris? In Greek tragedy, remember, hubris -excess pride and self-confidence- led inexorably to Nemesis, retributive justice. Were poets in that faraway time trying to tell people something they had forgotten? Is that what this is all about?

I wonder if Shakespeare, as about so many things, was also aware of our ignorance: ‘pride hath no other glass to show itself but pride, for supple knees feed arrogance and are the proud man’s fees.’

Plus ça change, eh?

Wast thou o’erlook’d, even in thy birth?

That Age can do some funny things to the mind seems fairly obvious. The accumulation of years brings with it a panoply of experience that, hopefully, enables a kind of personalized Weltanschauung to emerge -things begin to sort themselves on the proper shelves, and even if they remain difficult to retrieve, there is a satisfaction that they are there, if not completely codified.

Of course, admixed with any elder ruminations are the ever-present intimations of imminent mortality -but it’s not that Age constrains the thought process to memento mori, so much as a flourishing of its antithesis: memento vivere. Age is a time for reflection about one’s life with a perspective from further up the hill.

And yet, for all the experiential input, there are two time frames hidden from each of us. What happens after death is the obvious one, to which most of us turn our attention as the final act draws to a close; but there is an equally shrouded area on which few of us spend any time: what, if anything, was preconceptual existence like? Is it the equivalent of death, perhaps minus the loss of an identity not yet acquired?

I wonder if it’s a subject more understandable to the very young, than the gnarled and aged. I remember the very first time I was taken to a movie theatre, somewhere around two or three years of age, I think. When I say ‘remember’, I mean to say I have only one recollection of the event: that of a speeding locomotive filmed in black-and-white from track level, and roaring over the camera. It was very exciting, but I remember my father being very puzzled when I confessed that I’d seen it before. I hadn’t, of course, as he patiently explained to me, and yet it seemed to me I’d seen the same thing years before.

No doubt it was my still-immature neurons trying to make sense of the world, but the picture seemed so intuitively obvious to me at the time. And through the years, the image has stayed with me, as snippets of childhood memories sometimes do, although with the meaning now sufficiently expurgated as to be innocuous, as well as devoid of any important significance.

And then, of course, there was the Bridey Murphy thing that was all the rage when I was growing up in the 1950s. In my early teenage years I read the book The Search for Bridey Murphy, about a Colorado woman, Virginia Tighe, who, under hypnotic regression in the early 1950s, claimed she was the reincarnation of a 19th-century Irish woman, Bridey Murphy from Cork. I even went to see the movie of the same name as the book. It was all pretty well debunked subsequently, but I suppose it was enough, at a tender age, to make me wonder about what might have happened before I became me.

At any rate, I am puzzled about why the seeming non-existence prior to conception is not something we think about more often. True, we would likely have no identity to put into that side of the equation, nor, for that matter, the loss of anything like friends or, well, existence, on the other, but still it is a comparable void. A wonderful mystery every bit as compelling as death.

I suppose the issue resurfaced for me a few years ago when I had a very vivid dream about our three-score-and-ten of existence. I saw myself as a bubble rising through some boiling water. While I was the bubble, I thought of myself as singular and not only separate from, but possessing an identity totally differentiated and unique from everything else around me. My life was the time it took me to rise to the surface. And yet when I arrived there, and my bubble burst and disappeared, when the me inside dissolved in the air from which I started, it all made sense. In fact, the encapsulated journey itself was an aberration, as was the idea of identity…

The dream lay fallow for several years and then reawakened, Phoenix-like, when I discovered an essay in the online publication Aeon, by Alison Stone, a professor of philosophy at Lancaster University in the UK. https://aeon.co/ideas/thinking-about-ones-birth-is-as-uncanny-as-thinking-of-death

‘Many people feel anxious about the prospect of their death,’ she writes. ‘Indeed, some philosophers have argued that death anxiety is universal and that this anxiety bounds and organises human existence. But do we also suffer from birth anxiety? Perhaps. After all, we are all beings that are born as well as beings that die… Once we bear in mind that we are natal as well as mortal, we see some ways in which being born can also occasion anxiety.’

I don’t believe she is thinking of what it must feel like to be born, so much as the transition from, well, the nothing before sperm and egg meet, to a something -to a somebody. She quotes the thoughts of the bioethicist David Albert Jones in his 2004 book The Soul of the Embryo: ‘We might be telling someone of a memory or event and then realise that, at that time, the person in front of us did not even exist! … If we seriously consider the existence and the beginning of any one particular human being … we realise that it is something strange and profound.’

Stone continues, ‘I began to exist at a certain point in time, and there is something mysterious about this. I haven’t always been there; for aeons, events in the world unfolded without me. But the transition from nonexistence to existence seems so absolute that it is hard to comprehend how I can have passed across it… To compound the mystery further, there was no single crossing point. In reality, we don’t begin in [a] sudden, dramatic way… Rather, I came into existence gradually. When first conceived, I was a single cell (a zygote). Then I developed a formed body and began to have a rudimentary level of experience during gestation. And once out of my mother’s womb, I became involved in culture and relationships with others, and acquired a structured personality and history. Yet the zygote that I began as was still me, even though it had none of this.’ Wow -you see what I mean?

Stone seems to think that all this is rather distressing, but I disagree. All I feel is a sense of profound, unbounded wonder at it all. Reflecting on that time-before-time is not unweaving the rainbow, as Keats was said to have accused Newton of doing because he had destroyed its poetry by actually studying it.

In fact, I’m reminded of something the poet Kahlil Gibran wrote: And when you were a silent word upon Life’s quivering lips, I too was there, another silent word. Then life uttered us and we came down the years throbbing with memories of yesterday and with longing for tomorrow, for yesterday was death conquered and tomorrow was birth pursued.

I have to believe there will still be poetry in the world -with or without us…

Virtues we write in water

I’ve only recently stumbled on the concept of virtue signalling. The words seem self-explanatory enough, but their juxtaposition seems curious. I had always thought of virtue as being, if not invisible, then not openly displayed like chest hair or cleavage. Perhaps it’s my United Church lineage, or the fact that many of my formative years were spent in pre-Flood Winnipeg, but the idea of flaunting goodness still seems anathema to me -too social mediesque, I suppose.

Naturally, I am reminded of that line in Shakespeare’s Henry VIII: Men’s evil manners live in brass; their virtues we write in water. And, although I admit that I am perhaps woefully behind the times -and therefore, hopefully, immune from any accusations of what I have just disparaged- it seems to me that virtue disappears when advertised as such; it reappears as braggadocio. Vanity.

Because I had never heard of the issue, it was merely by accident that I came across it, in an article in Aeon: https://aeon.co/ideas/is-virtue-signalling-a-perversion-of-morality

It was an essay written by Neil Levy, a senior research fellow of the Oxford Uehiro Centre for Practical Ethics and professor of philosophy at Macquarie University in Sydney. ‘Accusing someone of virtue signalling is to accuse them of a kind of hypocrisy. The accused person claims to be deeply concerned about some moral issue but their main concern is – so the argument goes – with themselves.’

And yet -and this applies to what I just wrote- ‘Ironically, accusing others of virtue signalling might itself constitute virtue signalling – just signalling to a different audience… it moves the focus from the target of the moral claim to the person making it. It can therefore be used to avoid addressing the moral claim made.’ That’s worrisome: even discussing the concern casts a long shadow. But is that always ‘moral grandstanding’?

Levy wonders if ‘virtue signalling, or something like it, is a core function of moral discourse.’ Maybe you can’t even talk about virtue without signalling it, and maybe it signals something important about you -like a peacock’s tail advertising its fitness.

The question to ask about signalling, though, is whether it is merely costly (like the resources needed to create the tail), or whether the cost enhances credibility -honesty, I suppose- (like the sacrifice that might be involved in outing, say, an intruder that might harm not only the group, but also the signaller). And while the latter case may involve a significant cost, it may also earn a significant reward -not only cooperation in standing up en masse to the predator, let’s say, but also commendation for alerting the group: honour, prestige…

Seen in this light, Levy thinks, virtue signalling may in fact be a proclamation to the in-group -the tribe- identifying the signaller as a member. So would this virtue signalling occur when nobody else was around -when only the signaller would know of his own virtue? Would he (okay, read I) give to charity anonymously? Help someone in need without identifying himself? And if so, would it still be virtue signalling, if only to himself? Is it even possible to be hypocritical to oneself…? Interesting questions.

Of course, memory is itself mutable, and so is it fair to criticize someone who honestly believes they acted honourably? Would it be legitimate to accuse them of virtue signalling, even if evidence suggested another version of the event?

Long ago, when I was a freshman living in Residence at university, a group of us decided to celebrate our newly found freedom from parental supervision and headed off to a sleazy pub near the school that catered to students and was known to be rather forgiving of minimum age requirements for drinks.

For some of us at least, alcohol had not been a particularly significant part of our high school experience and so I quickly found myself quite drunk. I woke up, apparently hours later, lying on my bed and none the wiser about the night. I was wearing my roommate’s clothes, and I could see mine lying clean and neatly folded on the chair beside my desk. My wallet and watch, along with a few coins were arranged carefully on top.

“You passed out in the pub,” Jeff explained when I tried, unsuccessfully, to sit up in bed. “I thought I’d better wash your clothes, after you were sick all over them,” he explained, smiling proudly at his charity. “Well, actually, Brenda put them in the washer -I’m not good at that kind of stuff.” He stared at me for a moment, shaking his head in mock disbelief. “Boy, you were really wasted! It took three of us to get you back…”

I remember trying to focus my eyes on him as I attempted to think about the evening, and then slumped back onto the pillow and slept for most of the morning.

My memory of the pub night is vague now, but I do remember going to the store the next day to buy something, and finding that, apart from the coins, I had no money left -none in the pockets of the freshly washed clothes, of course, but none of the money my parents had given me for my first month’s expenses that had been in my wallet either.

None of this is particularly consequential, I suppose, but it did surface at a class reunion many years later. Jeff was now a high school teacher, Brenda a lawyer, and I had just finished a medical residency and was about to open a consulting practice.

Jeff, as had always been his wont, was holding his own noisy court at the bar, and Brenda -now his wife- was glaring at him. He was slurring his words already, even though the socializing part of the evening had just begun.

Perhaps in an effort to deflect her attention he glanced around the room and when he saw me, waved.

“Remember old G?” he shouted to nobody in particular, and immediately embraced me as soon as I got close enough. I saw a few people I recognized, but even under Brenda’s worried look, Jeff wouldn’t let go of my arm. “G was my roomie…” Jeff explained and signalled the bartender for another beer with his free hand before Brenda waved him off. “He used to get so drunk,” he explained, although I had trouble untangling his words. “Thank the gods that I was around to take care of his, though…”

“His what?” I asked, largely to break the palpable tension between Jeff and Brenda.

Jeff looked surprised. “Take care of him…  Take care of you, roomie. You!” He looked at Brenda and finally let go of my hand. “One night he got so drunk, I had to carry him home, and then lend him my clothes because he’d been sick all over his own…”

The others in the group shuffled nervously and glanced at each other. Brenda seemed angry, but I just shrugged.

“That was good of you, Jeff,” I said. “I obviously needed help that night…” I hadn’t forgotten about the missing money, but now wasn’t the time to mention it.

The others smiled and nodded -rather hesitantly, I thought.

“But, that’s what a real friend does, eh?” Jeff added, as Brenda tugged on his arm to leave. She blinked self-consciously at me as she led him away from the bar. “Nice to see you again, G,” she said, her eyes silently apologizing to me. “Maybe we can talk later, eh…?”

I think she knew more about the missing money than she was willing to admit, even to friends.

Maybe we were all virtue-signalling, though…


Popular opinion to the contrary, it seems to me that there are advantages to cultural naïveté -well, literary innocence, at any rate. Being seduced into a novel or short story solely because of the reputation of the author, or the ravings of a friend, risks disappointment -if only in your friend’s lack of sophistication. And even if the choice was successful, there remains, for me at least, a lingering sense of manipulation, of being swept along in a crowd: just another nameless member of the flock. I would much prefer to watch it from the edge, untouched by all but the gentle murmur of its passing.

There is far more pleasure in the unguided discovery of a title or an author unbesmirched by popularity, and hiding, perhaps, in a used book store, or on the shelf of one of those take-one-give-one piles I seem to frequent at neighbourhood bus stops. For me, their anonymity -however transient- is an adventure. But I suppose I’ve always been drawn to the potential of the unsigned, the wisdom of the incognito with no particular affiliation. Graffiti -the polite ones anyway- can be compelling, too. With them, there is seldom need for attribution, and indeed, the recognition of authorship might well detract from the message, and relegate it to partisan politics rather than liberate it to a vox populi, if not a vox dei.

I had feared this was merely a personal conceit, a longing for an unspoiled hilltop from which to evaluate the countryside, but as sometimes happens, I discovered there were others who also wandered lonely as a cloud -although with much more erudition. Tom Geue is perhaps a good example. He is a lecturer in Latin in the School of Classics at the University of St Andrews in Scotland, and wrote a thought-provoking essay on anonymity for the online publication Aeon: https://aeon.co/essays/lessons-from-ancient-rome-on-the-power-of-anonymity

‘Not knowing the author of a literary work does something powerful to the reader: it makes her experience the words as an exemplary, representative, far-reaching burst of culture, a spark of art that seems to transcend the limits of the singular intelligence… The potential of the anonymous work is in its ability to throw the reader into the realm of apparent universality.’

As a scholar of classical Latin literature, he illustrates many of his arguments with examples from the period. ‘Literature for the Romans was primarily the product of a singular intelligence… A literary text without authorship was often thought of as something dark, mysterious, lacking and disabled. In fact, a whole part-industry of scholarship sprouted up around securing attribution, making sure, that is, that the right texts had their proper authors, and that readers could know the worth of what they read…  Even when there was no clear single point of origin for a work – eg, when the authorship was genuinely shared – Ancient readers invented one: it could never just be the Iliad or the Odyssey; it had to be the Iliad or Odyssey of Homer. There was little space in the culture of authorship for works whose author was properly unknown; and many modern readers have inherited these exclusionary tastes.’

Despite -or maybe because of- the ‘anti-anonymity biases of the Classical canon’ though, Geue seems intrigued with an anonymous historical play, Octavia, that he admits we have probably heard nothing about. ‘The play is an anonymous masterpiece, and it is about the divorce and exile of Nero’s first wife, Octavia, set in 62 CE. It stages the domestic tension and revolutionary springback of absolute power spinning out of control, and it does so with more ambition and urgency than almost any other piece of drama to survive from Ancient Rome.’ But it is unsigned for an obvious reason: probable political retribution if the author were known. And, as Geue suggests, ‘Names tame certain forces; anonymity unleashes them.’

I see that as a cause for concern, however: information -or propaganda- can obviously wreak havoc if it is false and unattributable. Graffiti are one thing, but social media is another. Since antiquity, it has always been important to know whether the source of the information possessed enough expertise to justify acceptance -or was at least trustworthy and otherwise neutral. No doubt this is why Science and its scientists have hitherto enjoyed wide public acceptance. The recent rapid emergence of social media, with its anonymous sources and agenda-laden disinformation, however, has cast some deep shadows over expert opinions. To say the least, this is a troubling development.

And yet that type of writing is not what I am celebrating. Fact-driven compositions will likely continue to need scrutiny -to mislead is to harm, if only the Zeitgeist. But when we’re talking about literature and poetry, anonymity can be tantalizing. Enticing. Character and subject development, and skillful storytelling along with evocative metaphors and a seductive plot-line, are far more important than author identification in that idiom. Whether, in other words, the Iliad was actually written by a poet named Homer -if he even existed- or whether the stories are merely compilations of the works of many unnamed authors, subtracts nothing from the brilliance of their contents. I think the mystery adds to the allure.

There is beauty in discovery, there is wisdom in metaphors -but there is also a certain charm in the as yet unknown. My father was a Baptist, and came from a non-dancing, non-card-playing family, so his curses were, well, imaginative to say the least. Most of them were evocative of frustration, or folk wisdom -like ‘it’s not the size of the dog in the fight, but the size of the fight in the dog…’ That sort of thing.

Some, though, defied my childhood comprehension and vocabulary, and I assumed they were special remnants of a world I was too young to have experienced. There was a phrase he said that I always enjoyed: ‘jumped-up mackinaw’. It was my father’s favourite expression and it always made me laugh, so he would too, and then reach out and hug me. I’ve always associated the expression with what I loved about him: he made me happy.

It was long before Google and the internet, and I remember my friends thought ‘jumped-up’ was something bad: swearing. So with considerable trepidation, I asked a teacher what it meant one time after class when she seemed to be in a good mood.

“Well,” she said, after thinking about it, “I know about Mackinaw shirts… They were made of water-repellent wool, or something.” She looked at the ceiling for a moment. “Loggers wore them, I think…”

“So… what about the ‘jumped-up’ part?” I said, and watched her with anxious eyes.

I remember her smiling and shrugging her shoulders. “I don’t know why he’d say that, G. Maybe he read it somewhere, do you think?”

I could only think of the Reader’s Digest books in our bathroom, but I’d read most of them, too, and I was pretty sure I’d never seen it there. Apart from the Bible, I’d never seen him read much else. “I wonder who would write something like that,” I said, frustrated at being no closer to the meaning. “I don’t think it’s in the Bible, is it?”

She shook her head. “Sounds like an anonymous author, don’t you think?”

I looked at her, obviously puzzled at the word.

She smiled and explained. “Anonymous means unknown, or unnamed. So perhaps nobody knows who wrote it.”

After reading Geue’s essay, though, I remembered my father’s expression, and wondered if my teacher had been correct about the anonymity of its generation. I considered Googling it, but decided not to. After all, his expression defined my childhood as much as my father’s smile did, and I’m happy to think he wrote it. It’s ours -and I don’t need it to be from someone I don’t know.

Of course, maybe most of us are actually anonymous, anyway…

Fire burn, and cauldron bubble

I love it when I hear a new word, wrestle with a new concept. Pyrocene -don’t you adore it? Even just sounding it out quietly in your head, it’s hard to miss the excitement, or the imagery.

It takes its shape, as with all great epochs, by combining two Greek words: pur (or pyro), meaning ‘fire’, and kainos (rendered as the suffix -cene), added to whatever noun and meaning ‘new’. In other words, the Pyrocene is the fire epoch.

When you think about it, Pyrocene is an evocative and descriptive name for what has been going on for some time now. Fire has been tremendously important for our species. First came lightning and its effect of setting nature alight, and then, once we discovered we could tame fire, it kept us warm, it cooked, and it protected us from whatever predators remained afraid of it.

But that was just the beginning of our love affair: we began to invent new things it could do -like smelting metals, and boiling water to produce steam. All you needed was enough wood for fuel. And then, serendipitously no doubt, came the discovery of other less obvious sources that burned even hotter such as coal and, eventually, oil. It seems that hominids have embraced fire almost from the beginning; we are the fire-animal.

Unfortunately, fire seems to be in the news a lot lately -too much, in fact: bush fires, forest fires, the Amazon, Fort McMurray here in Canada, California, Europe, Australia… I can’t help but think of the poem by Goethe, The Sorcerer’s Apprentice -or at least its depiction in the animated Disney film Fantasia, set to the unforgettable symphonic poem by Paul Dukas, in which Mickey Mouse tires of his job of cleaning the room of his mentor (the sorcerer) and tries to use magic to make the broom do it for him. He quickly loses control, however.

I have to admit that my thoughts about the history of fire were otherwise quite embryonic and unfocussed until I came across an epiphanic essay about fire in Aeon, written by Stephen J Pyne, an emeritus professor of life sciences at Arizona State University: https://aeon.co/essays/the-planet-is-burning-around-us-is-it-time-to-declare-the-pyrocene

He identifies different sources of fire -different ways of producing the energy: ‘Three fires now exist, and they interact in a kind of three-body dynamic. The first fire is nature’s. It has existed since plants first colonised continents… The second fire is humanity’s. It’s what humans have done as they moved from cooking food to cooking landscapes, and because it feeds on the same grasses, shrubs and woods as first-fire, the two fires compete for fuels: what one burns the other can’t, and neither can break beyond the ecological boundaries set by their biotic matrix… Third-fire transcends the others. It burns fossil biomass, a fuel which is outside the biotic box of the living world. Where third-fire flourishes, the others don’t, or can burn only in special preserves or as genuinely wild breakouts. After a period of transition, third-fire erases the others, leaving ecological messes behind. Because it doesn’t burn living landscapes, those combustibles grow and pile up and create conditions for more damaging burns; because it isn’t in a biotic box, its smoke can overwhelm local airsheds and its emissions can clog the global atmosphere.’

So, why does he feel the need for a new name for the epoch in which we live? I mean, we seem deluged by names -some admittedly hubristic and anthropocentric: centered mainly around us, as if everything revolved around our presence; Anthropocene comes to mind.

‘The Pleistocene began 2.58 million years ago. Unusually among geologic periods, it is characterised by climate. The Earth cooled and, atop that trend, it repeatedly toggled between frost and thaw, as 40-50 cycles switched between glacial ice and interglacial warmth. Some 90 per cent of the past 900,000 years have been icy. Our current epoch, the Holocene, is one of the interglacial warm spells, and most calculations reckon that the Earth is due – maybe overdue – to swing back to ice.’

But Pyne argues that we’re really still in the Pleistocene: ‘Other than the fact that it’s our time, and we are sufficiently special in our own eyes to merit our own era, there is little cause to have split it off from the Pleistocene… By the metrics that established the Pleistocene, the Pleistocene persists. Only humanity’s vanity insists on a secessional epoch. The ice will return… Or not. Something seems to have broken the rhythms. That something is us…

‘Or more usefully, among all the assorted ecological wobbles and biotic swerves that humans affect, the sapients negotiated a pact with fire. We created conditions that favoured more fire, and together we have so reworked the planet that we now have remade biotas, begun melting most of the relic ice, turned the atmosphere into a crock pot and the oceans into acid vats, and are sparking a sixth great extinction…  fire has become as much a cause and consequence as ice was before. We’re entering a Fire Age.’ And yet, in the old days, ‘there were limits to human-enabled burning. Burn too much, too quickly, and living landscape cannot recover, and the fires ebb. Once humans started burning fire’s lithic landscapes – fossil fuels – there seemed to be no such limits.’

Apart from nuclear energy -be it fission, or the long-promised fusion technology- the options currently available to power industry and society’s ever-increasing needs seem in great need of innovative thinking. In a time of changing climatic conditions, reliable sources that are independent of the vagaries of weather events such as droughts or unexpected flooding, unpredictable or destructive winds, not to mention massive uncontrollable fires, are urgently required. Renewable technology is only as good as the foreseeable conditions upon which it depends.

Our addiction to fire has really left us with a Sophie’s choice: either accept the consequences of the damage it is doing to everything that allowed us to flourish in this geologically opportune -albeit temporary- interregnum between Ice Ages, or… What? Abandon our overweening hubris and slip back into what forests still remain on the horizon’s edge -but this time aware that we are no more important, no more entitled than anything else that shares our world?

And yet, even then, would we make the same mistakes again…? Would our too-active brains mislead us once more? I don’t mean to end with an existential crisis, but I’m reminded of the observations of Shakespeare’s Macbeth -a creature of that old, untethered world: I have no spur to prick the sides of my intent, but only vaulting ambition, which o’erleaps itself, and falls on th’other. . . .