Comfort like cold porridge

I go a lot by taste. It has usually been a fair guide to what I’m eating, but in this era of plant-based meat, I’m no longer as sure. I’m certainly in favour of diminishing my ecological footprint -though a career as a vegan is not really in the cards, I’m afraid- but lately I’ve been wondering how much I can trust the advertised contents, even when they’re listed on the label.

To start with, I’m not always familiar with some of the ingredients -or, for that matter, their possible health effects. It’s all very reassuring to see them listed, but I’m not sure how that helps the average person decide whether or not the product is safe for them, especially long-term. Most of us, I suspect, confine our reading to things like the caloric content, and maybe the amount of salt. I doubt that even the health-conscious delve much deeper than, say, the amount of fibre or perhaps whether or not it is gluten-free. The rest is largely a meaningless blur; we assume that the remaining ingredients have passed some careful scrutiny by the health authorities and have been deemed safe.

Indeed, I suppose the very fact that they are listed on the label means we can trust both their safety and their presence. Short of operating our own food analysis lab, the only other option is to try our luck with another product whose ingredients we find more reassuringly recognizable. Of course, the elephant in the room -or the jar- is the ability to trust whatever is listed. That trust may be a mistake.

And yet, perhaps I was already suspicious after reading a novel years ago during my impressionable university days. As with many of the books I’ve read, I’ve retained only interesting fragments that often had little to do with the main theme, but I do remember (I think) a scene in Upton Sinclair’s 1906 novel The Jungle that dealt with unsanitary practices in the meat packing plants of Chicago at the time. One of the workers fell into a vat of sausage meat, and because the accident was purposely ignored, his body (I assume) was ground into somebody’s sausage. Not only did that have quite an effect on me reading about it some 60 years later, but it must also have shocked the public at the time, and led to greater regulation of the industry and the passing of the Meat Inspection Act.

But that was long ago, and I imagine most of us assume that things have improved steadily since then. Indeed they have, but human nature being what it is, there are always loopholes; there are always lapses in regulatory scrutiny.

I am an obsessive examiner of food labels, and have come to avoid most products that don’t display them. Of course, there are exceptions, where I have to trust the shop from which I buy them. Meat and fish, for example -only the store knows their source and whether they have good reason to believe what they have been told by the original seller.

You have to trust somebody, naturally, but what about the source the store itself relies on? How thoroughly do they investigate their providers? How vigilant should I be? How vigilant can I be?

I happened upon an article by John G. Keogh who is a researcher in Food Chain Transparency at the University of Reading.

‘The globalization of the food chain has resulted in increased complexity and diminished transparency and trust into how and where our foods are grown, harvested, processed and by whom.’ And he suggests that this very complexity has allowed people to exploit the system. Unless the food is locally sourced, the retailer seldom has any personal knowledge of the origin of their groceries.

Indeed, ‘Before modern supermarkets, a local village or town grocery store stocked up to 300 items grown or processed within a 240-kilometre (150-mile) radius. In comparison, our post-modern supermarkets carry an average of 33,000 items that travel 2,400 kilometres or more.’ So, perhaps as a result, the ‘Canadian Food Inspection Agency (CFIA) suggests food fraud affects 10 per cent of commercially sold food. Various academic and industry sources suggest that globally, food fraud ranges from US$10 billion to $49 billion… If you add the sales of fake wines and alcohol, adulterated honey and spices, mislabelled fish and false claims of organic products, wild-caught fish or grain-fed meat, the numbers, and risks, increase significantly.’

Canada, as with most other developed nations, seems to be aware of the risks and has duly enacted regulations in an attempt to curb abuse, but the problem -not unique to Canada, of course- is in the enforcement. Apart from the vigilance required, it is hugely expensive, and the government has turned to genomics and DNA to detect some of the animal product deceptions. For example, there have been ‘a number of research papers uncovering food fraud and revealing the mislabelling of fish species in Canadian restaurants and grocery stores.’ And there was a paper ‘entitled “Re-visiting the occurrence of undeclared species in sausage products sold in Canada” as a followup to a previous study that showed a 20 per cent mislabelling rate for sausage… sampled from grocery stores across Canada [and] contained meats that weren’t on the label. The followup indicated 14 per cent of the 100 sausages tested still contained meat DNA that was undeclared on the label.’

I bought some fish the other day from a store I have come to trust. Of course, the red 25%-off sticker on the wrapper should have alerted something in my hindbrain, but I suppose I was in a hurry and thought it would make a delicious dinner. It purported to be steelhead -one of my favourites- but after I had prepared and barbecued it using a recipe I resort to for special occasions, it tasted suspiciously like pink salmon. I loved it, though, so I wasn’t so much disappointed as surprised. So, even if it was mislabelled, I could think of no reason to complain to the store.

I still don’t buy sausages, though.

The rest is silence

There’s something special about place, don’t you think? For some, it is a beautiful sight -a mountain, perhaps, or a field of flowers basking in the sun. I agree, of course -vision paints a scene- but for me, it does not capture it. Not completely. Photographs are only quiet markers of things that cannot truly live in silence.

I am seduced by sound, but as I age, my ears, too, have yellowed with the years. I worry that I may eventually slip my anchorage and have to rely too much on sight. Already, I am dependent more on memory than I might wish. Audio ergo video, I used to think; but really, it was more like audio ergo sum. For me, place is not merely accompanied by sound, described by sound -it is sound! Make a joyful noise, says one of the Psalms I remember from my childhood. I suppose I must have taken it to heart…

Now I will do nothing but listen. That is from a part of the Song of Myself by the 19th-century American poet Walt Whitman -only a part, because I have long treated myself to paring off the bits and pieces I wish to enjoy. I recognize the limits of judging something with carefully trimmed adjectives, of course, but there you have it. At any rate, Whitman’s decision is a good one -profound in so many ways.

I’ve read Hempton and Grossmann’s book One Square Inch of Silence with interest and not a little dismay. They start their Prologue with a quote from the Nobel Prize-winning bacteriologist Robert Koch: ‘The day will come when man will have to fight noise as inexorably as cholera and the plague.’ They then go on to say that ‘Today silence has become an endangered species. Our cities, our suburbs, our farm communities, even our most expansive and remote national parks are not free from human noise intrusions. Nor is there relief even at the North Pole; continent-hopping jets see to that. Moreover, fighting noise is not the same as preserving silence. Our typical anti-noise strategies … offer no real solution because they do nothing to help us reconnect and listen to the land.’

I am getting old, though: I love sound, and if it’s mixed in with a little noise, I welcome the challenge. After all, noise is really just sound from which meaningful information is missing, or at least difficult to extract -and if it’s not too loud, I say give me the chance. I suppose that’s why I cannot abide earphones when I walk or run through the woods: they imprison me. What I want to hear is the sighing and creaking of the trees in a passing breeze, the mysterious crackling of branches deep in the forest, or the gurgling of a creek hidden somewhere nearby. The distant pounding of a woodpecker is a plus, but I’ll accept the call of a solitary raven searching for who knows what in trees too far away to see. All are Imagination’s guests: the mysteries that scrape quietly against its windows, the rustling shadows that appear like tiny whispering moths heard only in the corners of its ever-listening ears.

A few years ago I remember being summoned into the deep green forest, not by formal writ, or a surfeit of leisure, you understand, but by the lusty song of what I used to call ‘the wind-up bird’ because its song seemed to go on and on like that of a wind-up toy I was given as a child. In fact, its proper name is Troglodytes pacificus, but it prefers just plain Pacific Wren unless it’s being formally introduced at a conference. They’re really quite small, so I’ve never actually seen one in the trees. But since there seemed to be a trail leading towards the sound, I thought it might be an adventure worth pursuing.

It was a mountainous part of the coastal forest I’d never been in before, and I carefully noted the fluorescent red ribbons used as markers for the trail. It’s embarrassing to get lost in the wilderness, and you can get eaten, so I try to be careful. Sometimes, of course, it’s difficult to allocate sufficient neurological resources to keep track of everything at the same time, and I soon lost sight of the ribbons -and also the bird; I suppose it heard me thrashing through the bushes as I fashioned my own clumsy trail.

You sort of know which way is out on a mountain, though. If you started near the bottom, you go down. If you started from the summit, you still pretty well do the same thing. So I wasn’t worried. And besides, there’s quite a lot going on if you really listen. I have my favourites, of course -wind ruffling its way through a forest of needles is one, although with cedars, are they needles, or leaves? That question always keeps me occupied for a while -I don’t think anybody is willing to commit, however.

And then there’s the legendary tree falling in the forest, and whether or not it still has to pretend to make a noise if there’s nobody around to hear it. Mind you, it’s hard to know just how often that happens, but since I figured I was all alone and not shouting or anything, if I did hear one in flagrante delicto, as it were, that should count, eh?

Unfortunately, apart from me tripping on stray roots, and being scratched by curious bushes, I heard no tree fall… or maybe, it didn’t know I was around, and it fell in silence. I believe there are a lot of unreported mysteries on a mountain.

But I also listen for grunts and howls in the woods, random crackles behind me, or heavy breathing close by. These have a different cachet, depending where I am. In a place where I am only noticed if I smell appetizing, a different algorithm is called for. It seems to me that sometimes the Whitman Principle only applies at the starting block: it determines the direction to run. And so when I heard something snuffling nearby, I understood that the time for simply listening was over.

As I started to crash through the undergrowth, I realized that for several carved moments, I was sound incarnate. I began wondering how my frantic uncoordinated scramble must seem to the animals in the forest who were used to hiding in silence for their lives -I was demonstrating there was also life in noise.

It was just a prophylactic journey, of course -a tentative foray into sound; I wasn’t sure where, or what I had heard, but it had suddenly changed from an interesting scrunch of the kind I was so fond of hearing underfoot on the carpet of leaves, to the more menacing sound that predators are decidedly not fond of wasting.

Eventually, I stopped and listened again, but this time it was not menace that greeted me from the foliage, but a burbling stream that danced onomatopoeically over rock and root, humming gaily as it purred over boulders and whispered under fallen logs: a chorister’s dream. And, even though I had always been too shy to sing with others, I joined it in a halting duet of pastoral celebration.

But nowadays, when I walk alone in the forest, I find that I am often consumed with a question -this one of a different existential significance: is a person warbling to a creek really making a joyful noise if there’s no one around to hear him?

Fake lies?

Recently, I’ve been thinking a lot about truth, but not for the reasons you might expect. Not because of the abundance of ‘fake news’ about which we seem to be constantly reminded, and not necessarily because I’ve been occasionally embarrassed in a lie, nor because of the tangled web you wove when first you practised to deceive.

Fake news and deception, not to mention outright lies, have been in the headlines in recent years, but deception is certainly not unique to our era -nor even our species. Think of the bird behaviours meant to distract predators from the nest, cowbirds that lay their eggs in other nests to trick the foreign mothers into raising the alien young, or squirrels that pretend to bury acorns in one place but, in case they were observed, actually keep them in their mouths while they find another spot to cache them.

I grant near-universality to the practice of intended deception -especially where there is something being protected, if only reputation or status. And, given its ubiquity and seemingly relentless practice among humans, it has a long history of ethical debate. Deception, of course, is different from lying -deception is more a case of misleading, whereas lying is saying something known to be false.

I am concerned by something a little different, however. I am vexed by what, at first glance, would seem to be a more trivial concern: does a writer of fiction actually lie? And if the medium is one that does not purport to be factual -a novel, say- is it even possible? How important is truth in a fictive world -as long as it is internally consistent? A character in that story can lie, to be sure, but how analogous is that to a real-life character doing the same thing?

Writers have strange thoughts -perhaps that’s why they end up writing- but nonetheless I have been curious about this for some time now. I wonder about the ethics of fiction -not malicious, or scandalous fiction, you understand (although I suspect even those are merely the far edge of the spectrum). As it applies to writing, the very definition of ‘fiction’ -from the Latin fingere, to contrive- suggests imaginative creation, not investigative reportage where false attributions are indeed ethically problematic.

I’ve written fiction for years now (putting aside the fact that I am not at all widely published) so have I been lying all these years? If one of my characters lies, or deceives, and it happens to be read by someone in the ‘real-world’ -trespassing, in other words- have those lies in some sense transgressed the real-world ethics? Soiled our nest?

You’re right, it is perhaps a trifling concern, and yet bothersome nonetheless; I despaired of ever seeing it as the subject of an understandable evaluation. But, on one of my wide-eyed explorations, I happened upon a thoughtful essay by Emar Maier, an assistant professor of philosophy at the University of Groningen.

He starts by considering the work of another philosopher, H.P. Grice, who holds that ‘it all comes down to the assumption that communication is fundamentally a cooperative endeavour,’ and postulates what seem to be almost ‘Golden Rule’ maxims of quality in communication: ‘‘do not say what you believe to be false’ and ‘do not say that for which you have insufficient evidence’.’ And yet, we violate these all the time -we tell jokes, we exaggerate, we deceive, we use metaphors, we use sarcasm, and, of course, we tell stories. ‘In all of these cases there is a clear sense in which we are not really presenting the truth, as we know it, based on the best available evidence. But there are vast differences between these phenomena. For instance, while some constitute morally objectionable behaviour, others are associated with art and poetry.’

There is a difference, though, between violating one of Grice’s norms, and flouting it with, say, a sigh and rolling of the eyes. However untrue the assertion, it is readily recognizable as an exaggeration or even a lie that is not meant to be taken as true. On the other hand, ‘Liars… violate the same maxim, but they don’t flout it. Theirs is a covert violation, and hence lying has an altogether different effect on the interpreter than irony, sarcasm or metaphor.’

Fiction, however, is more complicated. A work of fiction ‘consists of speech acts that, for the most part, look like ordinary assertions.’ And yet, ‘As with lies and irony, there is no dedicated grammar or style for constructing fictional statements that would reliably distinguish them from regular assertions.’

So, ‘Is fiction more like the covert violation of the liar, or like the overt violation of the ironical speaker? Unlike the liar, the fiction author doesn’t hide her untruthful intentions.’ There are two ways to look at this, Maier says: either that ‘both fiction and lying are quality-violating assertions – ie, speech acts presenting something believed to be false as if it’s known truth’ or ‘we can analyse fictional discourse as constituting a different type of speech act, where the usual norms and maxims don’t apply in the first place.’

‘[T]he idea that both lying and fiction are just assertions of known falsehoods can be traced back to eminent philosophers such as Plato, who wanted to ban poets from his ideal society, [and] David Hume who called them ‘liars by profession’’.

I, however, am more convinced by the opinion of Albert Camus, who believed that ‘fiction is the lie through which we tell the truth’. At any rate, Maier goes on to observe that a ‘striking difference between fictional statements and lies is the fact that, while most lies are simply false… many philosophers have argued that the statements making up a work of fiction, even those involving clearly nonexistent entities, are not really false, but at least ‘in some sense’ true – viz… true relative to the fictional world in question.’ Now we’re getting somewhere -it’s context that matters.

A second difference between fiction and lies is the emotional response -the so-called paradox of fiction. ‘[W]orks of fiction induce… a ‘willing suspension of disbelief’, allowing us to be emotionally engaged with commonly known falsehoods. Lies evidently lack this property: once a lie is exposed, suspension of disbelief and emotional engagement in accordance with the story’s content become impossible… the difference between fictional statements and regular communicative assertions lies not in some hidden logical operators in the fictional assertion, but in the fact that telling fictional stories is an altogether different speech act from the act of assertion that makes up our talk about the weather, or our newspaper reporting.’ Kind of what I suspected all along. As the English poet and soldier Sir Philip Sidney put it in The Defence of Poesy (1595): ‘Now for the poet, he nothing affirmeth, and therefore never lieth.’

So, ‘it seems that fiction and lying are mutually exclusive, for they belong to distinct speech act categories, conform to different norms, and affect different cognitive states… since it is the text itself that generates the fictional world, the statements that make up that text should automatically become true in that world. When George Orwell wrote that ‘the clocks were striking thirteen’, it thereby became true in the fictional world of Nineteen Eighty-Four (1949) that the clocks were striking thirteen. Unlike for the historian or the journalist, there is no relevant world outside the text, relative to which we could fact-check whether Orwell miscounted. This line of argument can be summed up in the principle of authorial authority: the statements that make up a work of fiction are true in that fiction.’

Of course there are things like ‘imaginative resistance’, where internal inconsistencies disrupt belief, but writers -and certainly proofreaders and editors- are pretty good at resolving these gaffes before they are hung out to air on the clothesline of publication.

At any rate, I’m not sure I’ve discovered many immutable truths in Maier’s treatment of fictive lying, but I feel better about my own ethics of make-believe. I do still wonder about the boundary markers at that razor-thin edge where well-written fiction seems real and induces real emotion. I suppose edges are usually like that, though: porous…


Remember Plato’s Cave allegory? In his Republic he describes a scenario in which some people have spent their lives chained in a cave so they can only see the wall in front of them. There is a fire behind them that casts shadows on the wall, shadows they have no way of knowing are only shadows. For these people, the shadows are their reality. There seem to be many versions of what happens next, but the variation I prefer is that one of the people escapes from the cave and discovers the sun outside for the first time; he realizes that what he had assumed was real -the shadows- was just that: merely shadows cast by the actual objects themselves.

Sometimes we become so accustomed to seeing things a certain way, we find it difficult, if not impossible, to believe there could be an alternative view. Or assume that, even if there were another version, it must be wrong. But can we be sure that we are evaluating the alternative fairly and without prejudice? Can we assess it with sufficient objectivity to allow a critical analysis? Or are we unavoidably trapped in the prevailing contemporary Weltanschauung? It’s an interesting question to be sure, and one that begs for examination, if only to explore the world behind the mirror.

I stumbled upon an essay by Julie Reshe, a psychoanalyst and philosopher who, after recovering from a bout of depression, began to wonder whether depression itself was actually the baseline state, and one that allowed a more accurate view of how things actually are.

I have to admit that I had to temporarily divorce myself from my usually optimistic worldview to be able to fathom her argument, and I found it rather uncomfortable. But sometimes it’s instructive -even valuable- to look under a rock.

As a philosopher, Reshe felt the need to examine both sides of an argument critically, putting aside preconceptions and biases. ‘Depressogenic thoughts are unpleasant and even unbearable,’ she writes, ‘but this doesn’t necessarily mean that they are distorted representations of reality. What if reality truly sucks and, while depressed, we lose the very illusions that help us to not realise this? What if, to the contrary, positive thinking represents a biased grasp of reality? … What if it was a collapse of illusions – the collapse of unrealistic thinking – and the glimpse of a reality that actually caused my anxiety? What if, when depressed, we actually perceive reality more accurately? What if both my need to be happy and the demand of psychotherapy to heal depression are based on the same illusion?’ In other words, what if I am actually not a nice person? What if there’s a reason people don’t like me?

Whoa! Suppose this is not a counterfactual? After all, other philosophers have wondered about this -Arthur Schopenhauer, for example, whose deeply pessimistic writings about the lack of meaning and purpose in existence I have never understood, or the equally famous German philosopher Martin Heidegger, who felt that anxiety was the basic mode of human existence. ‘We mostly live inauthentically in our everyday lives, where we are immersed in everyday tasks, troubles and worries, so that our awareness of the futility and meaninglessness of our existence is silenced by everyday noise… But the authentic life is disclosed only in anxiety.’ My god, where’s my knife?

And even Freud wasn’t optimistic about the outcome of psychotherapeutic treatment, and was ‘reluctant to promise happiness as a result.’ He felt that ‘psychoanalysis could transform hysterical misery into ‘common unhappiness’. Great…

And then, of course, there’s the philosophical tradition called ‘depressive realism’ which holds that ‘reality is always more transparent through a depressed person’s lens.’ And just to add more poison to the cake, ‘the Australian social psychologist Joseph Forgas and colleagues showed that sadness reinforces critical thinking: it helps people reduce judgmental bias, improve attention, increase perseverance, and generally promotes a more skeptical, detailed and attentive thinking style.’

All of which is to say, I suppose, ‘The evolutionary function of depression is to develop analytical thinking mechanisms and to assist in solving complex mental problems. Depressive rumination helps us to concentrate and solve the problems we are ruminating about… depressive rumination is a problem-solving mechanism that draws attention to and promotes analysis of certain problems.’

I have presented these deeply troubling ideas merely as an exercise in perspective, I hasten to add. Sometimes it is valuable to try to grasp the Umwelt of the stranger on the other side of the door before we open it. We can only help if we are willing to understand why they are there.

Part of the solution may lie in puzzling out Reshe’s counterfactuals. She seems to want to assign meaning to her former depression, as have many of the other people she mentions, to buttress her point. She also seems to feel that there was a time when that point of view might have seemed more mainstream, and that nowadays there is just too much expectation of happiness -unrealistic expectation, by and large, which presents a problem in and of itself. If we constantly expect to achieve a goal, but, like a prairie horizon, it remains temptingly close and yet just out of reach, we are doomed to frustration -a self-fulfilling prophecy.

And yet, it seems to me that resigning oneself to unhappiness, or its cousin depression, doesn’t represent a paradigm shift, but rather a rationalization that it must be the default position -and therefore must serve some useful evolutionary purpose; a position benighted and stigmatized because it advertises the owner’s failure to achieve the goal that others seem to have realized.

I’m certainly not disparaging depression, but neither am I willing to accept that it serves any evolutionary strategy except that of a temporary, albeit necessary, harbour until the storm passes. And to suggest that positive emotions -happiness, contentment, joy, or pleasure, to name just a few- however short-lived, are illusory and unrealistic expectations is merely to excuse, and perhaps justify, an approach to depression that isn’t working. A trail that only wanders further into the woods.

I’m certainly cognizant of the fact that there is a spectrum of depressions, from ennui to psychosis, and that some are more refractory to resolution than others, but that very fact argues against leaving them to strengthen, lest they progress to an even more untenable and dangerous state.

Perhaps we need to comfort ourselves with the ever-changing, ever-contrasting nature of emotions, and not expect of them a permanence they were likely never evolved to achieve.

Goldilocks, it seems to me, realized something rather profound when she chose the baby bear’s porridge after finding papa bear’s porridge too hot, and mamma bear’s too cold: it was just right…

Imagination bodies forth the forms of things unknown

The poet’s eye, in fine frenzy rolling,
Doth glance from heaven to earth, from earth to heaven;
And as imagination bodies forth
The forms of things unknown, the poet’s pen
Turns them to shapes and gives to airy nothing
A local habitation and a name.
Such tricks hath strong imagination.
Theseus, in Shakespeare’s A Midsummer Night’s Dream

Shakespeare had a keen appreciation of the value of imagination, as that quote from A Midsummer Night’s Dream suggests. But what is imagination? Is it a luxury -a chance evolutionary exaptation of some otherwise less essential neural circuit- or a purpose-made system to analyse novel features in the environment? A mechanism for evaluating counterfactuals -the what-ifs?

A quirkier question, perhaps, would be to ask if it might predate language itself -be the framework, the scaffolding upon which words and thoughts are draped. Or is that merely another chicken versus egg conundrum drummed up by an overactive imagination?

I suppose what I’m really asking is why it exists at all. Does poetry or its ilk serve an evolutionary purpose? Do dreams? Does one’s Muse…? All interesting questions for sure, but perhaps the wrong ones with which to begin the quest to understand.

I doubt that there is a specific gene for imagination; it seems to me it may be far more global than could be encompassed by one set of genetic instructions. In what we would consider proto-humans, it may have involved more primitive components: non-linguistic features such as emotion -fear, elation, confusion- but also bodily responses to external stimuli. A moving tickle in that interregnum between sleep and wakefulness might have been interpreted as a spider, and generated a muscular reaction whether or not there was an actual creature crawling on the skin.

Imagination, in other words, may not be an all-or-nothing feature unique to Homo sapiens. It may be a series of adaptations to the exigencies of life that eventuated in what we would currently recognize as our human creativity.

I have to say, it’s interesting what you can find if you keep your mind, as well as your eyes, open. I wasn’t actively searching for an essay on imagination -although perhaps on some level, I was… At any rate, on whatever level, I happened upon an essay by Stephen T Asma, a professor of philosophy at Columbia College in Chicago, and his approach fascinated me.

‘Imagination is intrinsic to our inner lives. You could even say that it makes up a ‘second universe’ inside our heads. We invent animals and events that don’t exist, we rerun history with alternative outcomes, we envision social and moral utopias, we revel in fantasy art, and we meditate both on what we could have been and on what we might become… We should think of the imagination as an archaeologist might think about a rich dig site, with layers of capacities, overlaid with one another. It emerges slowly over vast stretches of time, a punctuated equilibrium process that builds upon our shared animal inheritance.’

Interestingly, many archaeologists seem to conflate the emergence of imagination with the appearance of artistic endeavours -‘premised on the relatively late appearance of cave art in the Upper Paleolithic period (c38,000 years ago)… [and] that imagination evolves late, after language, and the cave paintings are a sign of modern minds at work.’

Asma sees the sequence rather differently, however: ‘Thinking and communicating are vastly improved by language, it is true. But ‘thinking with imagery’ and even ‘thinking with the body’ must have preceded language by hundreds of thousands of years. It is part of our mammalian inheritance to read, store and retrieve emotionally coded representations of the world, and we do this via conditioned associations, not propositional coding.’

Further, Asma supposes that ‘Animals appear to use images (visual, auditory, olfactory memories) to navigate novel territories and problems. For early humans, a kind of cognitive gap opened up between stimulus and response – a gap that created the possibility of having multiple responses to a perception, rather than one immediate response. This gap was crucial for the imagination: it created an inner space in our minds. The next step was that early human brains began to generate information, rather than merely record and process it – we began to create representations of things that never were but might be.’ I love his idea of a ‘cognitive gap’. It imagines (sorry) a cognitive area where something novel could be developed and improved over time.

I’m not sure that I totally understand all of the evidence he cites to bolster his contention, though -for example, the view of philosopher Mark Johnson at the University of Oregon that there are ‘deep embodied metaphorical structures within language itself, and meaning is rooted in the body (not the head).’ Although, ‘Rather than being based in words, meaning stems from the actions associated with a perception or image. Even when seemingly neutral lexical terms are processed by our brains, we find a deeper simulation system of images.’ But at any rate, Asma summarizes his own thoughts more concisely, I think: ‘The imagination, then, is a layer of mind above purely behaviourist stimulus-and-response, but below linguistic metaphors and propositional meaning.’

In other words, you don’t need to have language for imagination. But the discipline of biosemantics tries to envisage how it might have developed in other animals. ‘[Primates] have a kind of task grammar for doing complex series of actions, such as processing inedible plants into edible food. Gorillas, for example, eat stinging nettles only after an elaborate harvesting and leave-folding [sic] sequence, otherwise their mouths will be lacerated by the many barbs. This is a level of problem-solving that seeks smarter moves (and ‘banks’ successes and failures) between the body and the environment. This kind of motor sequencing might be the first level of improvisational and imaginative grammar. Images and behaviour sequences could be rearranged in the mind via the task grammar, long before language emerged. Only much later did we start thinking with linguistic symbols. While increasingly abstract symbols – such as words – intensified the decoupling of representations and simulations from immediate experience, they created and carried meaning by triggering ancient embodied systems (such as emotions) in the storytellers and story audiences.’ So, as a result, ‘The imaginative musician, dancer, athlete or engineer is drawing directly on the prelinguistic reservoir of meaning.’

Imagination has been lauded as a generator of progress, and derided as idle speculation throughout our tumultuous history, but there’s no denying its power: ‘The imagination – whether pictorial or later linguistic – is especially good at emotional communication, and this might have evolved because emotional information drives action and shapes adaptive behaviour. We have to remember that the imagination itself started as an adaptation in a hostile world, among social primates, so perhaps it is not surprising that a good storyteller, painter or singer can manipulate my internal second universe by triggering counterfactual images and events in my mind that carry an intense emotional charge.’

Without imagination, we cannot hope to appreciate the Shakespeare who also wrote, in his play Richard III:

Princes have but their titles for their glories,
An outward honor for an inward toil,
And, for unfelt imaginations,
They often feel a world of restless cares.

Personally, I cannot even imagine a world where imagination doesn’t play such a crucial role… Or can I…?


Does the love of heaven make one heavenly?

Why do I find myself so attracted to articles about religion? I am not an adherent -religion does not stick to me- nor am I tempted to take the famous wager of the 17th century philosopher, Pascal: dare to live life as if God exists, because you’ve got nothing to lose if He doesn’t, and everything to gain if He does.

Perhaps I’m intrigued by the etymological roots that underpin the word: Religare (Latin, meaning ‘to bind’) is likely the original tuber of the word. But is that it -does it bind me? Constrain me? I’d like to think not, and yet… and yet…

Even many diehard atheists concede that religion has a use, if only for social cohesion -Voltaire was probably thinking along those lines when he wrote: ‘If God did not exist, it would be necessary to invent him’. Or Karl Marx: ‘Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people’.

And then, of course, there’s Sigmund Freud, an avowed Jewish atheist, who for most of his life thought that God was a collective neurosis. But in his later years, when he was dying of cancer of the jaw, he suggested (amongst other, much more controversial things) in his last completed book, Moses and Monotheism, that monotheistic religions (like Judaism) think of God as invisible. This necessitates incorporating Him into the mind to be able to process the concept, and hence likely improves our ability for abstract thinking. It’s a bit of a stretch perhaps, but an intriguing idea nonetheless.

But, no matter what its adherents may think about the value of the timeless truths revealed in their particular version, or its longevity as proof of concept, religions change over time. They evolve -or failing that, just disappear, dissolve like salt in a glass of water. Consider how many varieties and sects have arisen just from Christianity alone. Division is rife; nothing is eternal; Weltanschauungen are as complicated as the spelling.

So then, why do religions keep reappearing in different clothes, different colours? Alain de Botton, a contemporary British philosopher, argues in his book Religion for Atheists, that religions recognize that their members are children in need of guidance and solace. Although certainly an uncomfortable opinion, there is a ring of truth to his contention. Parents, as much as their children, enjoy ceremonies, games, and rituals and tend to imbue them with special significance that is missing in the secular world. And draping otherwise pragmatic needs in holy cloth creates the impression that they were divinely inspired; ethics and morality thus clothed, rather than being perceived as arbitrary, wear a spiritual imprimatur. A disguise: the Emperor’s New Clothes.

Perhaps, then, there’s more to religion than a set of Holy caveats whose source is impossible to verify. But is it really just something in loco parentis? A stand-in? I found an interesting treatment of this in a BBC Future article written by Sumit Paul-Choudhury, a freelance writer and former editor-in-chief of the New Scientist. He was addressing the possible future of religion.

‘We take it for granted that religions are born, grow and die – but we are also oddly blind to that reality. When someone tries to start a new religion, it is often dismissed as a cult. When we recognise a faith, we treat its teachings and traditions as timeless and sacrosanct. And when a religion dies, it becomes a myth, and its claim to sacred truth expires… Even today’s dominant religions have continually evolved throughout history.’

And yet, what is it that allows some to continue, and others to disappear despite the Universal Truth that each is sure it possesses? ‘“Historically, what makes religions rise or fall is political support,”’ writes Linda Woodhead, professor of sociology of religion at the University of Lancaster in the UK ‘“and all religions are transient unless they get imperial support.”’ Even the much vaunted staying power of Christianity required the Roman emperor Constantine, and his Edict of Milan in 313 CE to grant it legal status, and again the emperor Theodosius and his Edict of Thessalonica in 380 CE to make Christianity the official religion of the Roman Empire.

The first university I attended was originally founded by the Baptists and, at least for my freshman year, there was a mandatory religious studies course. Thankfully, I was able to take a comparative religion course, but in retrospect, I would have liked an even broader treatment of world religions. I realize now that I was quite naïve in those times; immigration had not yet exposed many of us to the foreign customs and ideas with which we are now, by and large, quite familiar. So the very notion of polytheism, for example, where there could be a god dedicated to health, say, and maybe another that spent its time dealing with the weather, was not only fascinating, but also compelling. I mean, why not? The Catholics have their saints picked out to intervene for certain causes, so apart from the number of interveners, what makes Hinduism, with its touted 33 million gods, such an outlier in the West (wherever that is)?

It seems to me that most of us have wondered about the meaning of Life at one time or other, and most of us have reflected on what happens after death. The answers we come up with are fairly well correlated with those of our parents, and the predominant Zeitgeist in which we swim. But as the current changes, each of us is swept up in one eddy or another, yet we usually manage to convince ourselves it’s all for the best. And perhaps it is.

Who’s to say that there needs to be a winner? Religions fragment over time and so do societies; their beliefs, however sacrosanct in the moment, evolve and are in turn sacralized. And yet our wonder of it all remains. Who are we really, and why are we here? What happens when we die? These questions never go away, and likely never will. So maybe, just maybe, we will always need a guide. Maybe we will always need a Charon to row us across the River Styx…

Is the thing translated still the thing?

When I was a student at university, translated Japanese Haiku poetry was all the rage; it seemed to capture the Zeitgeist of the generation to which I had been assigned. I was swept along with others by the simple nature images, but -much like the sonnet, I suppose- I failed to realize how highly structured it was. In fact, I can’t really remember all of its complex requirements -but maybe that’s the beauty of its seeming simplicity in English. However, the contracted translation of the Japanese phrase –haikai no ku, meaning ‘light verse’- belies the difficulty in translating the poetry into a foreign language while still conserving its structure, its meaning, and also its beauty.

It seems to me that the ability to preserve these things in translation while still engaging the interest of the reader requires no less genius than that of its original creator. While in poetry, as well as in the narrative of story, the ideas of the authors, and their images, plots and metaphors, are an intrinsic part of the whole, sometimes the concepts are difficult to convey to a foreign culture. So, what to do with them to maintain the thrust of the original while not altering the charm? And when does the translation actually become a different work of art and suggest the need for a different attribution?

Given my longstanding love for poetry and literature, I have often wondered whether I could truly understand the poetry of, say, Rumi who wrote mainly in Persian but also in Turkish, Greek and Arabic; or maybe, the more contemporary Rilke’s German language poetry. I speak none of those languages, nor do I pretend to understand the Umwelten of their times, so how do I know what it is that attracts me, apart from the beauty of their translations? Is it merely the universality of their themes, and perhaps my mistaken interpretations of the images and metaphors, or is there something else that seeps through, thanks to -or perhaps in spite of- the necessary rewording?

Since those heady days in university, I have read many attempts to explain, and even to justify, various methods of translation, and they all seem to adhere to one or both of the only two available procedures: paraphrasing, or metaphrasing (translating word for word). And no matter which is used, I have to wonder if the product is always the poor cousin of the original.

In one of the seminars from university, I remember learning that as far back as St. Augustine and St. Jerome, there was disagreement about how to translate the Bible -Augustine favoured metaphrasis, whereas Jerome felt that there was ‘a grace of something well said’. Jerome’s appealing phrase has stayed with me all these years. Evidently, the problem of translation goes even further back in history though, and yet the best method of preserving the author’s intention is still no closer to being resolved.

In my abiding hope for answers, I still continue to search. One such more recent forage led me to an essay in the online publication Aeon by the American translator and author Mark Polizzotti (who, among other honours, is a Chevalier of the Ordre des Arts et des Lettres, the recipient of a 2016 American Academy of Arts and Letters award for literature, and a publisher and editor-in-chief at the Metropolitan Museum of Art in New York).

He writes, ‘as the need for global communication grows by proverbial leaps, the efficiency of machine-based translation starts looking rather attractive. In this regard, a ‘good’ translation might simply be one that conveys the requisite bytes of information in the shortest time. But translation is about more than data transmission, and its success is not always easy to quantify. This becomes particularly true in the literary sphere: concerned with delivering artistic effect more than facts simple and straight.’

So, ‘We might think that the very indeterminacy of literary translation would earn it more leeway, or more acceptance.’ And yet, ‘many sophisticated readers view translation as no more than a stopgap… it would be disingenuous to claim that the reader of a translation is truly experiencing, in all its aspects, the foreign-language work it represents, or that in reading any text transposed from one language into another there isn’t a degree of difference (which is not the same as loss). The heart of the matter lies in whether we conceive of a translation as a practical outcome, with qualities of its own that complement or even enhance the original, or as an unattainable ideal, whose best chance for relative legitimacy is to trace that original as closely as possible.’

Polizzotti goes on to catalogue various approaches and views of translation and then suggests what I, at least, would consider the best way to think of translation and the obvious need it attempts to fulfil: ‘If instead we take translators as artists in their own right, in partnership with (rather than servitude to) their source authors; if we think of translation as a dynamic process, a privileged form of reading that can illuminate the original and transfer its energy into a new context, then the act of representing a literary work in another language and culture becomes something altogether more meaningful. It provides a new way of looking at a text, and through that text, a world. In the best of cases, it allows for the emergence of an entirely new literary work, at once dependent on and independent of the one that prompted it – a work that neither subserviently follows the original nor competes with it, but rather that adds something of worth and of its own to the sum total of global literatures. This does not mean taking undue liberties with the original; rather, it means honouring that original by marshalling all of one’s talent and all of one’s inventiveness to render it felicitously in another language.

‘To present a work as aptly as possible, to recreate it in all its beauty and ugliness, grandeur and pettiness, takes sensitivity, empathy, flexibility, knowledge, attention, caring and tact. And, perhaps most of all, it takes respect for one’s own work, the belief that one’s translation is worth judging on its own merits (or flaws), and that, if done well, it can stand shoulder to shoulder with the original that inspired it.’

Polizzotti has nailed it. There’s a spirit inherent in good translation -one that inspires a confidence that the original intent of the author is appropriately and befittingly displayed.

One of the reasons I was drawn to Polizzotti’s essay was a recent book I read (in translation): The Elegance of the Hedgehog, by Muriel Barbery and translated from the original French by Alison Anderson. So seamless was the narrative, and so apt were the translated dialogues, I have to confess that I had difficulty believing the book had not originally been written in English. And as it stands, it is one of the most rewarding books I have experienced in years. I’m sure that Ms Barbery is well content with Anderson’s translation, not the least because their efforts earned it accolades from various critics, including a posting on the New York Times best-seller list.

It seems to me that one could not expect more from a translator than that.

To hold, as it were, a mirror up to Nature

Who am I? No, really -where do I stop and something else begins? That’s not really as silly a question as it may first appear. Consider, for example, my need to remember something -an address, say. One method is to internalize it -encode it somehow in my brain, I suppose- but another, no less effective, is to write it down. So, if I choose the latter, is my pen (or keyboard, for that matter) now in some sense a functional part of me? Is it an extension of my brain? The result is the same: the address is available whenever I need it.

Ever since my university days, when I discovered the writings of the philosopher Alan Watts, I have been intrigued by his view of boundaries, and whether to consider them as things designed to separate, or to join. Skin was one example that I remember he discussed -does it define my limits, and enclose the me inside, or is it actually my link with the outside world? I hadn’t really thought much about it until then, but in the intervening years it has remained an idea that continues to fascinate me.

Clearly Watts was not alone in his interest in what constitutes an individual, nor in his speculations about the meaning of whatever identities individuals think they possess by virtue of their boundaries. There was an insightful article in Aeon by Derek Skillings, a biologist and philosopher of science at the University of Pennsylvania, entitled ‘Life is not easily bounded’:

‘Most of the time the living world appears to us as manageable chunks,’ he writes, ‘We know if we have one dog or two.’ Why then, is ‘the meaning of individuality … one of the oldest and most vexing problems in biology? … Different accounts of individuality pick out different boundaries, like an overlapping Venn diagram drawn on top of a network of biotic interactions. This isn’t because of uncertainty or a lack of information; rather, the living world just exists in such a way that we need more than one account of individuality to understand it.’ But really, ‘the problem of individuality is (ironically enough) actually composed of two problems: identity and individuation. The problem of identity asks: ‘What does it mean for a thing to remain the same thing if it changes over time?’ The problem of individuation asks: ‘How do we tell things apart?’ Identity is fundamentally about the nature of sameness and continuity; individuation is about differences and breaks.’ So, ‘To pick something out in the world you need to know both what makes it one thing, and also what makes it different than other things – identity and individuation, sameness and difference.’

What about a forest -surely it is a crowd of individual trees? Well, one way of differentiating amongst individuals is to think about growth -a tree that is growing (in other words, continuing as more of the same)- and contrasting it with producing something new: as in reproduction. And yet even here, there is a difficulty. It’s difficult to determine the individual identities of any trees that also grew from the original roots -for example from a ‘nurse’ tree lying on the ground with shoots and saplings sprouting from it.

But it’s not only plants that confuse the issue. If reproduction -i.e. producing something new– counts as a different entity, then what about entities like bacteria? ‘These organisms tend to reproduce [albeit] by asexual division, dividing in half to produce two clones… and, failing mutation and sub-population differentiation, an entire population of bacteria would be considered a single individual.’ -whatever ‘individual’ might therefore mean.

And what about us, then? Surely we have boundaries, surely we are individuals created as unique entities by means of sexual reproduction. Surely we have identities. And yet, what of those other entities we carry with us through our lives -entities that not only act as symbionts, but are also integrated so thoroughly into our metabolism that they contribute to such intimate functions as our immune systems, our weight and health, and even function as precursors for our neurotransmitters and hence our moods? I refer, of course, to the micro-organisms inhabiting our bowels -our microbiome. Clearly ‘other’ and yet essential to the functioning person I regard as ‘me’.

And yet, our gut bacteria are mostly acquired from the environment -including the bacteria colonizing our mother’s vagina and probably her breast milk- and so are not evolutionarily prescribed, nor thereby hereditarily transmitted. So, am I merely a we –picking up friends along the way? Well, consider mitochondria -the powerhouse of our cells. They were once free-living bacteria that adapted so well inside our cells that they, too, are integral to cell functioning but have lost the ability to survive separately; they are transmitted from generation to generation. So they are me, right…?

Again I have to ask just who is me? Or is the question essentially meaningless put like that? Given that I am a multitude, and more like a city than a single house, shouldn’t the question be who are we? The fact that all of us, at least in Western cultures, consider ourselves to be distinct entities -separate individuals with unique identities- makes me wonder about our evolutionary history.

Was there a time when we didn’t make the distinctions we do nowadays? A time when we thought of ourselves more as members of a group than as individuals? When, perhaps sensing that we were constantly interacting with things outside and inside us, the boundaries were less important? Is that how animals would say they see the world if they were able to tell us?

Does our very ability to communicate with each other with more sophistication create the critical difference? Is that what created hubris? In Greek tragedy, remember, hubris -excess pride and self-confidence- led inexorably to Nemesis, retributive justice. Were poets in that faraway time trying to tell people something they had forgotten? Is that what this is all about?

I wonder if Shakespeare, as about so many things, was also aware of our ignorance: ‘pride hath no other glass to show itself but pride, for supple knees feed arrogance and are the proud man’s fees.’

Plus ça change, eh?

Wast thou o’erlook’d, even in thy birth?

That Age can do some funny things to the mind seems fairly obvious. The accumulation of years brings with it a panoply of experience that, hopefully, enables a kind of personalized Weltanschauung to emerge -things begin to sort themselves on the proper shelves, and even if they remain difficult to retrieve, there is a satisfaction that they are there, if not completely codified.

Of course, admixed with any elder ruminations are the ever-present intimations of imminent mortality -but it’s not that Age constrains the thought process to memento mori, so much as a flourishing of its antithesis: memento vivere. Age is a time for reflection about one’s life with a perspective from further up the hill.

And yet, for all the experiential input, there are two time frames hidden from each of us -what happens after death is the obvious one to which most of us turn our attention as the final act draws to a close, but there is an equally shrouded area on which few of us spend any time: what, if anything, was preconceptual existence like? Is it the equivalent of death, perhaps minus the loss of an identity not yet acquired?

I wonder if it’s a subject more understandable to the very young, than the gnarled and aged. I remember the very first time I was taken to a movie theatre, somewhere around two or three years of age, I think. When I say ‘remember’, I mean to say I have only one recollection of the event: that of a speeding locomotive filmed in black-and-white from track level, and roaring over the camera. It was very exciting, but I remember my father being very puzzled when I confessed that I’d seen it before. I hadn’t, of course, as he patiently explained to me, and yet it seemed to me I’d seen the same thing years before.

No doubt it was my still-immature neurons trying to make sense of the world, but the picture seemed so intuitively obvious to me at the time. And through the years, the image has stayed with me, as snippets of childhood memories sometimes do, although with the meaning now sufficiently expurgated as to be innocuous, as well as devoid of any important significance.

And then, of course, there was the Bridey Murphy thing that was all the rage when I was growing up in the 1950s. I read the book The Search for Bridey Murphy in my early teenage years about a Colorado woman, Virginia Tighe, who, under hypnotic regression in the early 1950s, claimed she was the reincarnation of an Irish woman, Bridey Murphy from Cork in the 19th century. I even went to see the movie of the same name as the book. It was all pretty well debunked subsequently, but I suppose it was enough, at a tender age, to make me wonder about what might have happened before I became me.

At any rate, I am puzzled about why the seeming non-existence prior to conception is not something we think about more often. True, we would likely have no identity to put into that side of the equation, nor, for that matter, the loss of anything like friends or, well, existence, on the other, but still it is a comparable void. A wonderful mystery every bit as compelling as death.

I suppose the issue resurfaced for me a few years ago when I had a very vivid dream about our three-score-and-ten of existence. I saw myself as a bubble rising through some boiling water. While I was the bubble, I thought of myself as singular and not only separate from, but possessing an identity totally differentiated and unique from everything else around me. My life was the time it took me to rise to the surface. And yet when I arrived there, and my bubble burst and disappeared, when the me inside dissolved in the air from which I started, it all made sense. In fact, the encapsulated journey itself was an aberration, as was the idea of identity…

The dream lay fallow for several years and then reawakened, Phoenix-like, when I discovered an essay in the online publication Aeon, by Alison Stone, a professor of philosophy at Lancaster University in the UK.

‘Many people feel anxious about the prospect of their death,’ she writes. ‘Indeed, some philosophers have argued that death anxiety is universal and that this anxiety bounds and organises human existence. But do we also suffer from birth anxiety? Perhaps. After all, we are all beings that are born as well as beings that die… Once we bear in mind that we are natal as well as mortal, we see some ways in which being born can also occasion anxiety.’

I don’t believe she is thinking of what it must feel like to be born, so much as the transition from, well, the nothing before sperm and egg meet, to a something -to a somebody. She quotes the thoughts of the bioethicist David Albert Jones in his 2004 book The Soul of the Embryo: ‘We might be telling someone of a memory or event and then realise that, at that time, the person in front of us did not even exist! … If we seriously consider the existence and the beginning of any one particular human being … we realise that it is something strange and profound.’

Stone continues, ‘I began to exist at a certain point in time, and there is something mysterious about this. I haven’t always been there; for aeons, events in the world unfolded without me. But the transition from nonexistence to existence seems so absolute that it is hard to comprehend how I can have passed across it… To compound the mystery further, there was no single crossing point. In reality, we don’t begin in [a] sudden, dramatic way… Rather, I came into existence gradually. When first conceived, I was a single cell (a zygote). Then I developed a formed body and began to have a rudimentary level of experience during gestation. And once out of my mother’s womb, I became involved in culture and relationships with others, and acquired a structured personality and history. Yet the zygote that I began as was still me, even though it had none of this.’ Wow -you see what I mean?

Stone seems to think that all this is rather distressing, but I disagree. All I feel is a sense of profound, unbounded wonder at it all. Reflecting on that time-before-time is not unweaving the rainbow, as Keats was said to have accused Newton of doing because he had destroyed its poetry by actually studying it.

In fact, I’m reminded of something the poet Kahlil Gibran wrote: And when you were a silent word upon Life’s quivering lips, I too was there, another silent word. Then life uttered us and we came down the years throbbing with memories of yesterday and with longing for tomorrow, for yesterday was death conquered and tomorrow was birth pursued.

I have to believe there will still be poetry in the world -with or without us…

Virtues we write in water

I’ve only recently stumbled on the concept of virtue signalling. The words seem self-explanatory enough, but their juxtaposition seems curious. I had always thought of virtue as being, if not invisible, then not openly displayed like chest hair or cleavage. Perhaps it’s my United Church lineage, or the fact that many of my formative years were spent in pre-Flood Winnipeg, but the idea of flaunting goodness still seems anathema to me -too social mediesque, I suppose.

Naturally, I am reminded of that line in Shakespeare’s Henry VIII: Men’s evil manners live in brass; their virtues we write in water. And, although I admit that I am perhaps woefully behind the times -and therefore, hopefully, immune from any accusations of what I have just disparaged- it seems to me that virtue disappears when advertised as such; it reappears as braggadocio. Vanity.

Because I had never heard of the issue, it was merely an accident that I came across it in an article in Aeon:

It was an essay written by Neil Levy, a senior research fellow of the Oxford Uehiro Centre for Practical Ethics and professor of philosophy at Macquarie University in Sydney. ‘Accusing someone of virtue signalling is to accuse them of a kind of hypocrisy. The accused person claims to be deeply concerned about some moral issue but their main concern is – so the argument goes – with themselves.’

And yet, as I just wrote, ‘Ironically, accusing others of virtue signalling might itself constitute virtue signalling – just signalling to a different audience… it moves the focus from the target of the moral claim to the person making it. It can therefore be used to avoid addressing the moral claim made.’ That’s worrisome: even discussing the concern casts a long shadow. But is that always ‘moral grandstanding’?

Levy wonders if ‘virtue signalling, or something like it, is a core function of moral discourse.’ Maybe you can’t even talk about virtue without signalling it, and maybe it signals something important about you -like a peacock’s tail advertising its fitness.

The question to ask about signalling, though, is whether it is merely costly (like the resources needed to create the tail), or whether it enhances credibility -honesty, I suppose- (like the sacrifice that might be involved in calling out, say, an intruder that might harm not only the group but also the signaller). And while the latter case may carry a significant cost, it may also earn a significant reward -not only cooperation in standing up en masse to the predator, let’s say, but also commendation for alerting the group: honour, prestige…

Seen in this light, Levy thinks, virtue signalling may in fact be a proclamation to the in-group -the tribe- identifying the signaller as a member. So would this virtue signalling occur when nobody else was around -when only the signaller would know of his own virtue? Would he (Okay, read I) give to charity anonymously? Help someone in need without identifying himself? And if so, would it still be virtue signalling, if only to himself? Is it even possible to be hypocritical to oneself…? Interesting questions.

Of course, memory is itself mutable, so is it fair to criticize someone who honestly believes they acted honourably? Would it be legitimate to accuse them of virtue signalling, even if the evidence suggested another version of events?

Long ago, when I was a freshman living in residence at university, a group of us decided to celebrate our newly found freedom from parental supervision and headed off to a sleazy pub near the school that catered to students and was known to be rather forgiving of minimum-age requirements for drinks.

For some of us at least, alcohol had not been a particularly significant part of our high school experience, and so I quickly found myself quite drunk. I woke up, apparently hours later, lying on my bed and none the wiser about the night. I was wearing my roommate’s clothes, and I could see mine lying clean and neatly folded on the chair beside my desk. My wallet and watch, along with a few coins, were arranged carefully on top.

“You passed out in the pub,” Jeff explained when I tried, unsuccessfully, to sit up in bed. “I thought I’d better wash your clothes, after you were sick all over them,” he explained, smiling proudly at his charity. “Well, actually, Brenda put them in the washer -I’m not good at that kind of stuff.” He stared at me for a moment, shaking his head in mock disbelief. “Boy, you were really wasted! It took three of us to get you back…”

I remember trying to focus my eyes on him as I attempted to think about the evening, and then slumped back onto the pillow and slept for most of the morning.

My memory of the pub night is vague now, but I do remember going to the store the next day to buy something and finding that, apart from the coins, I had no money left -none in the pockets of the freshly washed clothes, of course, but also none of the money my parents had given me for my first month’s expenses, which had been in my wallet.

None of this is particularly consequential, I suppose, but it did surface at a class reunion many years later. Jeff was now a high school teacher, Brenda a lawyer, and I had just finished a medical residency and was about to open a consulting practice.

Jeff, as had always been his wont, was holding his own noisy court at the bar, and Brenda -now his wife- was glaring at him. He was slurring his words already, even though the socializing part of the evening had just begun.

Perhaps in an effort to deflect her attention he glanced around the room and when he saw me, waved.

“Remember old G?” he shouted to nobody in particular, and immediately embraced me as soon as I got close enough. I saw a few people I recognized, but even under Brenda’s worried look, Jeff wouldn’t let go of my arm. “G was my roomie…” Jeff explained and signalled the bartender for another beer with his free hand before Brenda waved him off. “He used to get so drunk,” he explained, although I had trouble untangling his words. “Thank the gods that I was around to take care of his, though…”

“His what?” I asked, largely to break the palpable tension between Jeff and Brenda.

Jeff looked surprised. “Take care of him… Take care of you, roomie. You!” He looked at Brenda and finally let go of my arm. “One night he got so drunk, I had to carry him home, and then lend him my clothes because he’d been sick all over his own…”

The others in the group shuffled nervously and glanced at each other. Brenda seemed angry, but I just shrugged.

“That was good of you, Jeff,” I said. “I obviously needed help that night…” I hadn’t forgotten about the missing money, but now wasn’t the time to mention it.

The others smiled and nodded -rather hesitantly, I thought.

“But, that’s what a real friend does, eh?” Jeff added, as Brenda tugged on his arm to leave. She blinked self-consciously at me as she led him away from the bar. “Nice to see you again, G,” she said, her eyes silently apologizing to me. “Maybe we can talk later, eh…?”

I think she knew more about the missing money than she was willing to admit, even to friends.

Maybe we were all virtue-signalling, though…