Are you really my friend?

There was something that Albert Camus, the Algerian-French philosopher, once wrote that has continued to inspire me since I first read it, so many years ago: “Don’t walk in front of me… I may not follow. Don’t walk behind me… I may not lead. Walk beside me… just be my friend.”

Friendship is a magical thing that is hard to define; it is like St. Augustine’s view of Time: you know what it is until someone asks. Poets, perhaps, with their metaphors come closest to capturing it -Shakespeare, for example:

Those friends thou hast, and their adoption tried,
Grapple them unto thy soul with hoops of steel.

Or, the wisdom of Rumi, a 13th century Persian poet: ‘Friend, our closeness is this: anywhere you put your foot, feel me in the firmness under you.’

And even the humour of Oscar Wilde: ‘A good friend will always stab you in the front’.

And yet, despite the feeling that its essence remains just at the tip of our tongues, there has always been an abiding faith in friendships, a trust that, to paraphrase Abraham Lincoln, ‘I destroy my enemies when I make them my friends’. In more modern times, however, the concept of ‘friend’ has undergone a not-so-subtle shift -everything from ‘friending’ people on social media, to online bullying, to trolling individuals for their putative beliefs, to unintended disclosure of confidences in internet postings.

So should a friend always bear his friend’s infirmities, as Cassius asked Brutus, in Shakespeare’s Julius Caesar? There was a time when the answer seemed obvious; now I am not so sure.

Quite by chance, I came across an essay by Leah Plunkett, an associate dean at the University of New Hampshire’s Franklin Pierce School of Law, which raised the question of whether friendship should be policed -whether it should remain a simple state of loyalty or, if declared, entail a legal obligation -like, say, marriage.

The concept caught me totally by surprise. ‘Friendship is the most lawless of our close relationships,’ she writes. Somehow, the idea that there might even be a need of a legal framework for friendship seemed dystopian to me, so I read on.

‘Friends are tied to each other through emotions, customs and norms – not through a legally defined relationship, such as marriage or parenting, that imposes obligations. Anybody can become friends, we believe…  But with the advent of the digital domain, friendship has become more fraught. Online and off, we can share information about our friends without their permission and without legal restriction (except for slander and libel).’ But, of course, that means that ‘Information shared between friends can wind up being seen by people outside the friendship network who are not the intended audience…  confidences can inadvertently find their way to the public domain; all it takes is one careless email or the wrong privacy setting on a Facebook post.’

And there may even be legal consequences to what we or our friends have posted. ‘Digital social networks are already used to detain people trying to cross into the United States when statements by friends in their network are deemed by border agents to be suspicious or threatening.’ And, although most of us are aware that most social media platforms are collecting and selling our information, ‘Fewer recognise the third-party companies typically behind the scenes of our interactions, often using our information in unknown and uncontrollable ways in pursuit of their own goals.’

And yet, ‘Amid all this chaos, friendship itself remains unregulated. You don’t need a licence to become someone’s friend, like you do to get married. You don’t assume legal obligations when you become someone’s friend, like you do when you have a child. You don’t enter into any sort of contract, written or implied, like you do when you buy something.’ There’s no legal definition of ‘friend’, either.

But Plunkett has an interesting idea: some U.S. states (like New Hampshire, her own) have definitions of bullying: the state’s Pupil Safety and Violence Prevention Act (2000) for students in primary and secondary school defines what bullying would entail. She wonders if it might be possible to apply its converse to define friendship. So, instead of saying you can’t harm somebody, a friend would need to support a peer or their property, provide emotional comfort, and so on. And, ‘To engage in cyberfriendship, this behaviour would need to take place electronically.’ Interesting idea.

But, although promoting friendship -online or in person- is worthwhile, one clearly has to be careful about how rigorously it is applied. ‘If you could be punished for not being a friend rather than for being a bully, that would undermine the lawlessness that makes friendship so generative.’

And Plunkett feels one has to be particularly careful about this lawlessness. ‘As friendship becomes less lawless, [and] more guarded by cybersurveillance… it might also become less about loyalty, affinity and trust, and more about strategy, currency and a prisoner’s dilemma of sorts (‘I won’t reveal what I know about you if you don’t reveal it about me’).’

It seems to me, she is correct in suggesting that we would be unwise to imprison friendship in too tight a definition -we might find ourselves confined to stocks for punishment and public humiliation like misbehaving villagers in the 16th and 17th centuries. So, ‘Let’s keep paying our respects to those bonds of friendship that are lawless at heart, opening new frontiers within ourselves.’

And listen to the words of poets like Kahlil Gibran:

When your friend speaks his mind you fear not the “nay” in your own mind, nor do you withhold the “ay.”
And when he is silent your heart ceases not to listen to his heart;
For without words, in friendship, all thoughts, all desires, all expectations are born and shared, with joy that is unacclaimed.
When you part from your friend, you grieve not;
For that which you love most in him may be clearer in his absence, as the mountain to the climber is clearer from the plain.
And let there be no purpose in friendship save the deepening of the spirit.
For love that seeks aught but the disclosure of its own mystery is not love but a net cast forth: and only the unprofitable is caught.

If only…

Look the other way, please.

There really are inconvenient truths, aren’t there? There are some things that seem to slip quietly under the radar -things that go unremarked until they are brought to our attention. And even then, they are perhaps dismissed as unimportant -or worse, accepted and rationalized in an attempt to justify them as tools that enable the greater good of humanity. We, after all, are what it’s all about; our welfare is paramount, not to mention our survival. And when you frame it in those terms, there is little room for noblesse oblige. Survival of the fittest quickly becomes survival of the ruthless -of the remorseless.

Perhaps I should explain. I live on a little hobby farm in the country, and when I was actively breeding sheep, chickens, and llamas, I was well acquainted with interested visitors, both two- and four-legged. Everybody, it seemed, had, or wanted, a stake in the game. Friends wanted eggs for their breakfasts, colleagues wanted lamb for their dinners, and I wanted an escape from the city. But to share with some was to share with all.

That’s how Life works, I suppose: word gets around, and soon there are all manner of uninvited guests -not all of whom knock, or ask permission. Some just appear -like carpenter ants- but some try not to advertise their arrival, and in fact seem to want to stay out of sight, if not out of mind. They’re the ones I used to worry about -if they’re in the barn, where else might they hide?

Of course I’m talking about rats -not so much the mice which kept my three cats busy in the night. No, the rats who hid in the engine of my pickup truck and ate the plastic off the wires to my distributor, or the battery wires in my car; the rats who patrolled the barn and left their distinctive trail through the uneaten bits of grain I fed the sheep; the rats who also holed up in the woodpile in my garage, and wherever else they could gather relatively undisturbed.

And yes, I declared war on them with spring traps baited with peanut butter, and put warfarin-like pellets in short, narrow little PVC pipes so the cats couldn’t get into them, but alas, the rats outlasted my efforts. Only when I retired and the chickens died in a well-fed old age, and only when I sold the sheep and llamas did the supply of grain eventually disappear -only then did the rats disappear. And I’ve never seen a rat, or droppings, since. It reminded me of the last stanza of Longfellow’s poem The Day is Done:

And the night shall be filled with music,
And the cares, that infest the day,
Shall fold their tents, like the Arabs,
And as silently steal away.

I know, I know -they’re only rats, but their leaving seemed so sudden; I came to think of them as having made a collective decision to move their troupe away to greener fields -sort of like the Travellers in Britain with their little trailers, able to leave when conditions are no longer hospitable for them. I suppose I Disneyfied them in my over-active imagination, and yet there was something about their migration that softened their attributes. I’ve never been fond of rats -especially their tails- but on the other hand I’ve always found it hard to believe all of the sinister lore attached to their sneaky habits. After all, they’ve lived with mankind and our middens from the beginning, I would imagine… and we’re both still here in spades. You have to assume a certain degree of intelligence to coexist with us for so long, despite our best efforts to exterminate them.

As these things happen, I tripped over a tantalizing essay co-written by Kristin Andrews, a professor of philosophy at York University in Toronto, and Susana Monsó, a post-doctoral fellow at the Messerli Research Institute in Vienna.

The first three sentences of the article hooked me: ‘In the late 1990s, Jaak Panksepp, the father of affective neuroscience, discovered that rats laugh. This fact had remained hidden because rats laugh in ultrasonic chirps that we can’t hear. It was only when Brian Knutson, a member of Panksepp’s lab, started to monitor their vocalisations during social play that he realised there was something that appeared unexpectedly similar to human laughter.’ And then, okay, they tickled them. ‘They found that the rats’ vocalisations more than doubled during tickling, and that rats bonded with the ticklers, approaching them more frequently for social play. The rats were enjoying themselves.’

Of course, there were some other features that, if further substantiated, we likely don’t want to hear about: ‘We now know that rats don’t live merely in the present, but are capable of reliving memories of past experiences and mentally planning ahead the navigation route they will later follow. They reciprocally trade different kinds of goods with each other – and understand not only when they owe a favour to another rat, but also that the favour can be paid back in a different currency. When they make a wrong choice, they display something that appears very close to regret.’ I’ve left the links intact, for reference, in case the reader’s credulity level sinks to the Fake News level.

But, for me at least, ‘The most unexpected discovery, however, was that rats are capable of empathy…  It all began with a study in which the rats refused to press a lever to obtain food when that lever also delivered a shock to a fellow rat in an adjacent cage. The rats would rather starve than witness a rat suffering. Follow-up studies found that rats would press a lever to lower a rat who was suspended from a harness; that they would refuse to walk down a path in a maze if it resulted in a shock delivered to another rat; and that rats who had been shocked themselves were less likely to allow other rats to be shocked, having been through the discomfort themselves.’

The reason the essay intrigued me, I’m sure, is because it has long been a practice to utilize rats (and mice, of course) as mindless fodder for our experimental quandaries. And, there’s little question that it is better to experiment on an animal than on a human, and especially a time-honoured nuisance and villain like a rat rather than a chimpanzee, or whatever. I don’t think I would be prepared to argue their utility for this, nor that until we have devised non-living alternatives -cell cultures, or AI modelling, perhaps- some things will require validation in functioning organisms to advance our knowledge for the benefit of the rulers (us).

My hope, however, is to point out that our hubris may tend to blind us to the increasing likelihood that rats are not mindless protoplasms living forever in the ‘now’ of their experiences. Are they sentient beings…? I suppose their sentience, like ours, is on a spectrum, isn’t it?

But if we are to continue to utilize them as unwitting research subjects, it seems to me that we should treat them with kindness and a degree of respect. Remember the words of Gloucester after he has been blinded by Cornwall, in Shakespeare’s King Lear: ‘As flies to wanton boys are we to the gods. They kill us for their sport.’ Let us not stoop to that…

Imagination bodies forth the forms of things unknown

The poet’s eye, in fine frenzy rolling,
Doth glance from heaven to earth, from earth to heaven;
And as imagination bodies forth
The forms of things unknown, the poet’s pen
Turns them to shapes and gives to airy nothing
A local habitation and a name.
Such tricks hath strong imagination.
Theseus, in Shakespeare’s A Midsummer Night’s Dream

Shakespeare had a keen appreciation of the value of imagination, as that quote from A Midsummer Night’s Dream suggests. But what is imagination? Is it a luxury -a chance evolutionary exaptation of some otherwise less essential neural circuit- or a purpose-made system to analyse novel features in the environment? A mechanism for evaluating counterfactuals -the what-ifs?

A quirkier question, perhaps, would be to ask if it might predate language itself -be the framework, the scaffolding upon which words and thoughts are draped. Or is that merely another chicken versus egg conundrum drummed up by an overactive imagination?

I suppose what I’m really asking is why it exists at all. Does poetry or its ilk serve an evolutionary purpose? Do dreams? Does one’s Muse…? All interesting questions for sure, but perhaps the wrong ones with which to begin the quest to understand.

I doubt that there is a specific gene for imagination; it seems to me it may be far more global than could be encompassed by one set of genetic instructions. In what we would consider proto-humans it may have involved more primitive components: such non-linguistic features as emotion -fear, elation, confusion- but also encompassed bodily responses to external stimuli: a moving tickle in that interregnum between sleep and wakefulness might have been interpreted as spider and generated a muscular reaction whether or not there was an actual creature crawling on the skin.

Imagination, in other words, may not be an all-or-nothing feature unique to Homo sapiens. It may be a series of  adaptations to the exigencies of life that eventuated in what we would currently recognize as our human creativity.

I have to say, it’s interesting what you can find if you keep your mind, as well as your eyes, open. I wasn’t actively searching for an essay on imagination -although perhaps on some level, I was… At any rate, on whatever level, I happened upon an essay by Stephen T Asma, a professor of philosophy at Columbia College in Chicago, and his approach fascinated me.

‘Imagination is intrinsic to our inner lives. You could even say that it makes up a ‘second universe’ inside our heads. We invent animals and events that don’t exist, we rerun history with alternative outcomes, we envision social and moral utopias, we revel in fantasy art, and we meditate both on what we could have been and on what we might become… We should think of the imagination as an archaeologist might think about a rich dig site, with layers of capacities, overlaid with one another. It emerges slowly over vast stretches of time, a punctuated equilibrium process that builds upon our shared animal inheritance.’

Interestingly, many archaeologists seem to conflate the emergence of imagination with the appearance of artistic endeavours –‘premised on the relatively late appearance of cave art in the Upper Paleolithic period (c38,000 years ago)… [and] that imagination evolves late, after language, and the cave paintings are a sign of modern minds at work.’

Asma sees the sequence rather differently, however: ‘Thinking and communicating are vastly improved by language, it is true. But ‘thinking with imagery’ and even ‘thinking with the body’ must have preceded language by hundreds of thousands of years. It is part of our mammalian inheritance to read, store and retrieve emotionally coded representations of the world, and we do this via conditioned associations, not propositional coding.’

Further, Asma supposes that ‘Animals appear to use images (visual, auditory, olfactory memories) to navigate novel territories and problems. For early humans, a kind of cognitive gap opened up between stimulus and response – a gap that created the possibility of having multiple responses to a perception, rather than one immediate response. This gap was crucial for the imagination: it created an inner space in our minds. The next step was that early human brains began to generate information, rather than merely record and process it – we began to create representations of things that never were but might be.’ I love his idea of a ‘cognitive gap’. It imagines (sorry) a cognitive area where something novel could be developed and improved over time.

I’m not sure that I totally understand all of the evidence he cites to bolster his contention, though -for example, the view of philosopher Mark Johnson at the University of Oregon that there are ‘deep embodied metaphorical structures within language itself, and meaning is rooted in the body (not the head).’ Although, ‘Rather than being based in words, meaning stems from the actions associated with a perception or image. Even when seemingly neutral lexical terms are processed by our brains, we find a deeper simulation system of images.’ But at any rate, Asma summarizes his own thoughts more concisely, I think: ‘The imagination, then, is a layer of mind above purely behaviourist stimulus-and-response, but below linguistic metaphors and propositional meaning.’

In other words, you don’t need to have language for imagination. But the discipline of biosemantics tries to envisage how it might have developed in other animals. ‘[Primates] have a kind of task grammar for doing complex series of actions, such as processing inedible plants into edible food. Gorillas, for example, eat stinging nettles only after an elaborate harvesting and leave-folding [sic] sequence, otherwise their mouths will be lacerated by the many barbs. This is a level of problem-solving that seeks smarter moves (and ‘banks’ successes and failures) between the body and the environment. This kind of motor sequencing might be the first level of improvisational and imaginative grammar. Images and behaviour sequences could be rearranged in the mind via the task grammar, long before language emerged. Only much later did we start thinking with linguistic symbols. While increasingly abstract symbols – such as words – intensified the decoupling of representations and simulations from immediate experience, they created and carried meaning by triggering ancient embodied systems (such as emotions) in the storytellers and story audiences.’ So, as a result, ‘The imaginative musician, dancer, athlete or engineer is drawing directly on the prelinguistic reservoir of meaning.’

Imagination has been lauded as a generator of progress, and derided as idle speculation throughout our tumultuous history, but there’s no denying its power: ‘The imagination – whether pictorial or later linguistic – is especially good at emotional communication, and this might have evolved because emotional information drives action and shapes adaptive behaviour. We have to remember that the imagination itself started as an adaptation in a hostile world, among social primates, so perhaps it is not surprising that a good storyteller, painter or singer can manipulate my internal second universe by triggering counterfactual images and events in my mind that carry an intense emotional charge.’

Without imagination, we cannot hope to appreciate the Shakespeare who also wrote, in his play Richard III:

Princes have but their titles for their glories,
An outward honor for an inward toil,
And, for unfelt imaginations,
They often feel a world of restless cares.

Personally, I cannot even imagine a world where imagination doesn’t play such a crucial role… Or can I…?


To hold, as it were, a mirror up to Nature

Who am I? No, really -where do I stop and something else begins? That’s not really as silly a question as it may first appear. Consider, for example, my need to remember something -an address, say. One method is to internalize it -encode it somehow in my brain, I suppose- but another, no less effective, is to write it down. So, if I choose the latter, is my pen (or keyboard, for that matter) now in some sense a functional part of me? Is it an extension of my brain? The result is the same: the address is available whenever I need it.

Ever since my university days, when I discovered the writings of the philosopher Alan Watts, I have been intrigued by his view of boundaries, and whether to consider them as things designed to separate, or to join. Skin was one example that I remember he discussed -does it define my limits, and enclose the me inside, or is it actually my link with the outside world? I hadn’t really thought much about it until then, but in the intervening years it has remained an idea that continues to fascinate me.

Clearly Watts was not alone in his interest about what constitutes an individual, nor in his speculations about the meaning of whatever identities individuals think they possess by virtue of their boundaries. There was an insightful article in Aeon by Derek Skillings, a biologist and philosopher of science at the University of Pennsylvania entitled ‘Life is not easily bounded’:

‘Most of the time the living world appears to us as manageable chunks,’ he writes, ‘We know if we have one dog or two.’ Why then, is ‘the meaning of individuality … one of the oldest and most vexing problems in biology? …  Different accounts of individuality pick out different boundaries, like an overlapping Venn diagram drawn on top of a network of biotic interactions. This isn’t because of uncertainty or a lack of information; rather, the living world just exists in such a way that we need more than one account of individuality to understand it.’ But really, ‘the problem of individuality is (ironically enough) actually composed of two problems: identity and individuation. The problem of identity asks: ‘What does it mean for a thing to remain the same thing if it changes over time?’ The problem of individuation asks: ‘How do we tell things apart?’ Identity is fundamentally about the nature of sameness and continuity; individuation is about differences and breaks.’ So, ‘To pick something out in the world you need to know both what makes it one thing, and also what makes it different than other things – identity and individuation, sameness and difference.’

What about a forest -surely it is a crowd of individual trees? Well, one way of differentiating amongst individuals is to think about growth -a tree that is growing (in other words, continuing as more of the same)- and contrasting it with producing something new: as in reproduction. And yet even here, there is a difficulty: it is hard to determine the individual identities of any trees that also grew from the original roots -for example, from a ‘nurse’ tree lying on the ground with shoots and saplings sprouting from it.

But it’s not only plants that confuse the issue. If reproduction -i.e. producing something new- counts as a different entity, then what about entities like bacteria? ‘These organisms tend to reproduce [albeit] by asexual division, dividing in half to produce two clones… and, failing mutation and sub-population differentiation, an entire population of bacteria would be considered a single individual.’ -whatever ‘individual’ might therefore mean.

And what about us, then? Surely we have boundaries, surely we are individuals created as unique entities by means of sexual reproduction. Surely we have identities. And yet, what of those other entities we carry with us through our lives -entities that not only act as symbiotes, but are also integrated so thoroughly into our metabolism that they contribute to such intimate functions as our immune systems, our weight and health, and even function as precursors for our neurotransmitters and hence our moods? I refer, of course, to the micro-organisms inhabiting our bowels -our microbiome. Clearly ‘other’ and yet essential to the functioning person I regard as ‘me’.

And yet, our gut bacteria are mostly acquired from the environment -including the bacteria colonizing our mother’s vagina and probably her breast milk- and so are not evolutionarily prescribed, nor thereby hereditarily transmitted. So, am I merely a we -picking up friends along the way? Well, consider mitochondria -the powerhouses of our cells. They were once free-living bacteria that adapted so well inside our cells that they, too, are integral to cell functioning but have lost the ability to survive separately; they are transmitted from generation to generation. So they are me, right…?

Again I have to ask just who is me? Or is the question essentially meaningless put like that? Given that I am a multitude, and more like a city than a single house, shouldn’t the question be who are we? The fact that all of us, at least in Western cultures, consider ourselves to be distinct entities -separate individuals with unique identities- makes me wonder about our evolutionary history.

Was there a time when we didn’t make the distinctions we do nowadays? A time when we thought of ourselves more as members of a group than as individuals? When, perhaps sensing that we were constantly interacting with things outside and inside us, the boundaries were less important? Is that how animals would say they see the world if they were able to tell us?

Does our very ability to communicate with each other with more sophistication, create the critical difference? Is that what created hubris? In Greek tragedy, remember, hubris -excess pride and self-confidence- led inexorably to Nemesis, retributive justice. Were poets in that faraway time, trying to tell people something they had forgotten? Is that what this is all about?

I wonder if Shakespeare, as about so many things, was also aware of our ignorance: ‘pride hath no other glass to show itself but pride, for supple knees feed arrogance and are the proud man’s fees.’

Plus ça change, eh?

Virtues we write in water

I’ve only recently stumbled on the concept of virtue signalling. The words seem self-explanatory enough, but their juxtaposition seems curious. I had always thought of virtue as being, if not invisible, then not openly displayed like chest hair or cleavage. Perhaps it’s my United Church lineage, or the fact that many of my formative years were spent in pre-Flood Winnipeg, but the idea of flaunting goodness still seems anathema to me -too social media-esque, I suppose.

Naturally, I am reminded of that line in Shakespeare’s Henry VIII: Men’s evil manners live in brass; their virtues we write in water. And, although I admit that I am perhaps woefully behind the times -and therefore, hopefully, immune from any accusations of what I have just disparaged- it seems to me that virtue disappears when advertised as such; it reappears as braggadocio. Vanity.

Because I had never heard of the issue, it was merely an accident that I came across it in an article in Aeon:

It was an essay written by Neil Levy, a senior research fellow of the Oxford Uehiro Centre for Practical Ethics and professor of philosophy at Macquarie University in Sydney. ‘Accusing someone of virtue signalling is to accuse them of a kind of hypocrisy. The accused person claims to be deeply concerned about some moral issue but their main concern is – so the argument goes – with themselves.’

And yet, as I just wrote, ‘Ironically, accusing others of virtue signalling might itself constitute virtue signalling – just signalling to a different audience… it moves the focus from the target of the moral claim to the person making it. It can therefore be used to avoid addressing the moral claim made.’ That’s worrisome: even discussing the concern casts a long shadow. But is that always ‘moral grandstanding’?

Levy wonders if ‘virtue signalling, or something like it, is a core function of moral discourse.’ Maybe you can’t even talk about virtue, without signalling it, and maybe it signals something important about you -like a peacock’s tail advertising its fitness.

The question to be asked about signalling, though, is whether it is costly (like the resources that are needed to create the tail), or enhances credibility -honesty, I suppose- (like the sacrifice that might be involved in outing, say, an intruder that might harm not only the group, but also the signaller). And while the latter case may also involve a significant cost, it may also earn a significant reward -not only cooperation in standing up en masse to the predator, let’s say, but also commendation for alerting the group: honour, prestige…

Seen in this light, Levy thinks, virtue signalling may in fact be a proclamation to the in-group -the tribe- identifying the signaller as a member. So would this virtue signalling occur when nobody else was around -when only the signaller would know of his own virtue? Would he (Okay, read I) give to charity anonymously? Help someone in need without identifying himself? And if so, would it still be virtue signalling, if only to himself? Is it even possible to be hypocritical to oneself…? Interesting questions.

Of course, memory is itself mutable, and so is it fair to criticize someone who honestly believes they acted honourably? Would it be legitimate to accuse them of virtue signalling, even if evidence suggested another version of the event?

Long ago, when I was a freshman living in Residence at university, a group of us decided to celebrate our newly found freedom from parental supervision and headed off to a sleazy pub near the school that catered to students and was known to be rather forgiving of minimum age requirements for drinks.

For some of us at least, alcohol had not been a particularly significant part of our high school experience and so I quickly found myself quite drunk. I woke up, apparently hours later, lying on my bed and none the wiser about the night. I was wearing my roommate’s clothes, and I could see mine lying clean and neatly folded on the chair beside my desk. My wallet and watch, along with a few coins were arranged carefully on top.

“You passed out in the pub,” Jeff explained when I tried, unsuccessfully, to sit up in bed. “I thought I’d better wash your clothes, after you were sick all over them,” he added, smiling proudly at his charity. “Well, actually, Brenda put them in the washer -I’m not good at that kind of stuff.” He stared at me for a moment, shaking his head in mock disbelief. “Boy, you were really wasted! It took three of us to get you back…”

I remember trying to focus my eyes on him as I attempted to think about the evening, and then slumped back onto the pillow and slept for most of the morning.

My memory of the pub night is vague now, but I do remember going to the store the next day to buy something, and finding that, apart from the coins, I had no money left -none in the pockets of the freshly washed clothes, of course, but none of the money my parents had given me for my first month’s expenses that had been in my wallet either.

None of this is particularly consequential, I suppose, but it did surface at a class reunion many years later. Jeff was now a high school teacher, Brenda a lawyer, and I had just finished a medical residency and was about to open a consulting practice.

Jeff, as had always been his wont, was holding his own noisy court at the bar, and Brenda -now his wife- was glaring at him. He was slurring his words already, even though the socializing part of the evening had just begun.

Perhaps in an effort to deflect her attention he glanced around the room and when he saw me, waved.

“Remember old G?” he shouted to nobody in particular, and immediately embraced me as soon as I got close enough. I saw a few people I recognized, but even under Brenda’s worried look, Jeff wouldn’t let go of my arm. “G was my roomie…” Jeff explained and signalled the bartender for another beer with his free hand before Brenda waved him off. “He used to get so drunk,” he explained, although I had trouble untangling his words. “Thank the gods that I was around to take care of his, though…”

“His what?” I asked, largely to break the palpable tension between Jeff and Brenda.

Jeff looked surprised. “Take care of him… Take care of you, roomie. You!” He looked at Brenda and finally let go of my arm. “One night he got so drunk, I had to carry him home, and then lend him my clothes because he’d been sick all over his own…”

The others in the group shuffled nervously and glanced at each other. Brenda seemed angry, but I just shrugged.

“That was good of you, Jeff,” I said. “I obviously needed help that night…” I hadn’t forgotten about the missing money, but now wasn’t the time to mention it.

The others smiled and nodded -rather hesitantly, I thought.

“But, that’s what a real friend does, eh?” Jeff added, as Brenda tugged on his arm to leave. She blinked self-consciously at me as she led him away from the bar. “Nice to see you again, G,” she said, her eyes silently apologizing to me. “Maybe we can talk later, eh…?”

I think she knew more about the missing money than she was willing to admit, even to friends.

Maybe we were all virtue-signalling, though…

In scorn of Nature, Art gave lifeless life

Age is an artist that continues to paint experience after experience over the worn and tattered scenes that are no more. For most of us, however, the pentimento is obvious, and never quite disappears beneath the crust of what we insist on adding. And yet, we continue to paint in hopes we’ve got it right at last: that what we are now portraying is what we should have seen those many years ago. All the while, of course, the colours thicken on what we layered on before, adding nothing to our knowledge, only curtains that cast shadows on the canvas -the past no more than tricks of light.

And yet I’m beginning to suspect that there is more to Art than the depiction of long forgotten histories in words or canvas -far more, in fact. Art is the plaque in the cornerstone that reminds us of how things were, the figure-ground that taunts our hallowed view of present days -the stories that we have come to revere.

But we are, all of us, Art; we are the stories that we tell, and the ones that we have heard. We are what we have seen, however vaguely remembered, and parts of us are shadows that follow us around like memories.

So, it occurred to me that Art could function as a synergist: its effect is greater than might be expected from what it depicts. If nothing else, a painting -like an old photograph, perhaps- allows us to see what was and compare it with what is. Some difference is usually to be expected, I suppose, but if the change is sufficiently irreconcilable to our expectations, it may speak to those little ears within that are alert to dissonance. In other words, it may spur us to a conclusion, an action, that we may not have felt was either necessary or justified before: the past ‘screwing our courage to the sticking place’, to slightly paraphrase Shakespeare’s Lady Macbeth.

I hasten to admit that my epiphany is far from original, but I was pleased to find a thorough examination of it in an essay in BBC Future written by Ella Saltmarshe and Beatrice Pembroke, the founders of the Long Time Project which ‘champions art and culture as a route to helping people think and act more long-term.’

‘For most of human history we haven’t needed to think long-term,’ they write. ‘As futurist Jamais Cascio puts it, “In a world of constant, imminent existential threats, the ability to recognise subtle, long-term processes and multi-generational changes wasn’t a particularly important adaptive advantage.” Yet today, the nature of risk has changed. We no longer live in a world of clear, local cause and effect, and the greatest threats to civilisation are happening on the timescale of decades or centuries.’ And yet, ‘While our minds might not be wired to deal with long-term threats and priorities in the abstract, they are wired for two things that we can control: story and emotion… Art can stretch our time frames, helping us develop what geologist Marcia Bjornerud calls “timefulness”: the ability to locate ourselves within eras and aeons, rather than weeks and months.’

The authors go on to document the ‘growing body of deep time work that locates us in the epic geological history of the Universe, evoking awe and wonder.’ And it seems to me that such an approach may help to bridge the ever-widening gap between indifference and despair: the unwillingness to confront the existential threats that seem to beset us at every turn -from the paucity of new antibiotics able to deal with increasing microbial resistance, to the growing mistrust of vaccines in the face of overwhelming evidence for their efficacy, to the elephant looming in the dark, stuffy room of climate change. We are often so frightened by these, and other things, that we turn our heads away, and like children hiding under a blanket, think we have found a refuge from the elephant and his kin. But somewhere inside, we know we have solved nothing, and if we turn again to look, we find that it is staring at us still.

Sometimes, when things seem too remote for action, too unlikely to affect us, or worse, too horrible to contemplate, we benefit from intermediaries we trust to explain what we have failed to understand and to guide us through the fear. Change is normal, but only when it doesn’t colour outside the expected boundaries -then it turns to chaos. In the words of Shakespeare again -this time King Lear- that way madness lies; let me shun that. So, as the authors write: ‘If we can work with art and culture to stretch our time frames so that we care about the long-term future, then hopefully as a species, we will have a future in the long term.’

And sometimes, it is also the little things changing that we’re reluctant to face.

“Is that where you used to live, Grampa?” My 4 year old grandson stared at the picture I had shown him with a doubtful expression on his face. “Can we go and see it…?”

I could only smile at his enthusiasm. I was a child myself when I’d lived there and my parents had long since sold the house to developers, but at the time it was on a quiet, unsidewalked street lined with trees. Now, years later, it was lined with multi-storied apartment blocks and parked cars.

“It’s changed since then, Cas,” I explained. “And our house isn’t there anymore…”

“Where’d it go?” he asked, his face now puzzled.

My answer was a little shrug. In truth, I missed the house with its wide wooden steps and covered porch. It had trees in the front and back, and a garden where my mother used to grow vegetables that she’d preserve for the long, protracted winter season. I’d told Cas about it many times, but had only just found the grainy photograph for him to see.

“Is the street like our street now?” He ran to the front window of the little apartment his mother and my son were renting while they worked their way up their respective corporate ladders. I had agreed to babysit for the afternoon.

I walked over to the window and looked out with him; I had to nod my head. “Yes Cas, very much like this street.”

He stared out the window for a while, and as I started to walk away, he turned to me. “Why did you let them do it, Grampa?”

The question caught me by surprise. “Do what, Cas?”

“Tear down your beautiful house and take away the trees?”

I had to sigh. “I suppose my mommy and daddy were getting old and needed to move to some place smaller that was easier to take care of…” In fact, they were both gone now.

He thought about it for a moment. “Did their new house have trees and a garden, too?”

Cas seemed so earnest that I didn’t want to disappoint him. He’d never met his great-grandparents; he’d never had to endure their gradual decay in the extended care home in which they ended up. So I nodded. “Yes, they moved to a place with trees and a little flower garden.”

A big smile suddenly appeared on his face and his eyes twinkled with pleasure. “That’s good,” he said, with a sudden adult expression on his little face. “My daddy says we’re going to move to a place with trees…” He glanced out of the window again. “Trees are important when you get old, aren’t they Grampa?”

They certainly are Cas, I thought and nodded with a sigh. Trees will always be important.

Why then, can one desire too much of a good thing?

What have we done? Have we become so transfixed with definitions -differences- that we have forgotten where we started? Where we want to go? Has the clarity for which we strove opacified as it cooled? Sometimes the more encompassing the definition, the less useful it becomes.

I suppose that coming from the putative dark side -that is to say, the male portion of the equation- I am credentialed merely with Age and a world view encrusted with a particular zeitgeist; I come from an era of binaries, albeit with categories that allow for shades -rainbows which do not seek to define the boundaries where one colour fades into the next. They allow a melange without, I hope, troubling themselves with the constituents. Or am I being hopelessly naïve?

The more I am engaged with the issues of gendered literature, though, the more I suspect I have been misled all these past years. I have, of course, been aware of the lengthening gender acronym -LGBTQIA…- that threatens, like the old lady who lived in the shoe in that Mother Goose rhyme, to outgrow its useful home. In its quest to include and define each shade of difference -as laudable as that may seem on first glance- it threatens to fragment like shattered glass: useful to nobody as a container. I am, rather oddly, reminded of the advice of the Indian philosopher Jiddu Krishnamurti, who felt that we should not attempt to categorize, or name, something too finely -God, in his example: the name confines and overly limits the concept being promulgated.

The dangers of over-inclusion surfaced when I attempted to read an essay by Georgia Warnke, a professor of political science at the University of California, Riverside, published in Aeon:

‘The famed online Stanford Encyclopedia of Philosophy offers separate articles on analytic and continental feminism (although with a separate article on intersections between the two). The article on analytic feminism notes its commitment to careful argumentation and to “the literal, precise, and clear use of language”, while that on continental feminism notes its interest in unveiling precisely those “non-discursive deep-seated biases and blind spots … not easily detected by an exclusive focus on the examination of arguments”.’ A few minutes of reflection suggested that neither my vocabulary nor my intellect may be up to the task, but I ploughed on, nonetheless -still curious about the subject.

‘The article on analytic feminism emphasises the importance of the philosophy of language, epistemology and logic; that on continental feminism the importance of postmodernism, psychoanalysis and phenomenology.’ Whoa. What was I asking my obviously non-postmodern brain to assimilate? It was only when I stumbled upon ‘we can begin with a core feminist question: namely, who or what are women? Who are the subjects to whose freedom and equality feminist philosophers are committed?’ that I sensed a meadow just through the trees and across the creek.

There have been waves of Feminist philosophy: ‘Yet for later feminists, references to sex and gender have served primarily to highlight differences between different groups of women, and to underscore the difficulty of defining women so as to include all those who ought to be included, and to exclude those who ought not.’ For example, take genetic sex. If ‘woman’ is restricted to somebody who possesses two X chromosomes, then what happens to trans women -or those who don’t see themselves as binarily constrained? Or those who have various abnormalities in the functioning of their hormones which might force them into a different category?

Is it all down to chromosomes then, or can we also include what they look like -or feel like, for that matter? The question, really, is about definitions it seems -equally applicable to the gendering of both chromosomal sexes. ‘When we turn to gender and define women as those who conform to certain socially and culturally prescribed behaviours, roles, attitudes and desires, we run into similar quandaries. Women possess different races, ethnicities, sexualities, religions and nationalities, and they belong to different socioeconomic classes… Such differences can give rise to different concerns and interests… For example, if emancipation for upper- and middle-class white American women who were historically discouraged from working outside the home involves the freedom to take on paid work, for American working-class women and women of colour who historically needed to or were required to work outside the home, emancipation might involve precisely the freedom to care full-time for one’s own family.’ I have to say, that’s a good point -I had not even considered that before. So is there anything that gendered women have in common?

One commonality, suggested by Sally Haslanger, a professor of philosophy and linguistics at MIT, is oppression. ‘To be a woman is to be subordinated in some way because of real or imagined biological features that are meant to indicate one’s female role in reproduction.’ In many ways, this can be inclusive of trans women, etc., but the sticking point is somebody like the Queen of England: ‘if one is not subordinated at all or at least not because of presumptions about one’s biological role – perhaps the Queen of England – then one is not a woman according to this definition.’

There have been other attempts at inclusively defining a woman, of course. Simone de Beauvoir (Jean-Paul Sartre’s lifelong intellectual partner) felt that gender was a result of socialization, whereas Judith Butler, a professor of comparative literature at UC Berkeley, saw it as ‘the imposition of a set of behavioural and attitudinal norms. She suggests that, as such, it is an effect of power.’ An interesting corollary of this, though, is that ‘the challenge turns out to be that women are themselves effects of power, so that emancipation from relations of power and subordination requires emancipation from being women.’

At this point, I have to say, I was beginning to feel like a kitten chasing its own tail. The arguments and counterarguments seemed self-defeating: lofty rhetoric full of sound and fury, yet signifying nothing, if I may borrow from Shakespeare’s Macbeth.

An attempt to escape from this paradox was suggested by Butler herself: ‘by replacing emancipation with what she calls “resignification”, a process of taking up the effects of power and redeploying them. Although women are effects of power, this power is never accomplished once and for all but must be perpetually reinforced and, moreover, we reinforce it in the ways we act as gendered beings… But we can also behave in ways that undermine this supposed naturalness. We can poke fun at our gendered ways of acting and we can act differently. Drag performances, for example, can camp up stereotypical feminine modes of behaviours and by doing so demonstrate their performance elements.’

Now that struck me as ingenious -like ancient Greek theatre undressing the powerful for all to understand how much we all share in common. And anyway, my head was spinning by the time I reached that stage in the essay; I needed something to hold fast to -some sort of solution.

Maybe the suggestion about how drag performances demonstrate the foolishness of our stereotypes about sexual roles is a very apt observation. And yet, remember, we are, all of us, together in this world; we need only step back a bit to see how little official definitions matter. After all, whatever -or whoever- each of us thinks they are is all that matters in the end, isn’t it?  We are such stuff as dreams are made on… Aren’t we?

Love, which alters when it alteration finds

I’m not certain I understand why, but I am being led to believe that Love can be described mathematically using Bayesian Probability Theory… Okay, as a start, I have no idea what subscribing to Bayesian probability theory might entail, except maybe a club membership, and a considerably manipulated personal profile to attract some interest. But, ever alert to new (or any) social possibilities, I decided to read the essay by Suki Finn, a postdoctoral research fellow in philosophy at the University of Southampton in the UK writing in Aeon:

It starts with the not unreasonable premise that there are two basic types of love: conditional, and unconditional. Then, she dips her toes into some background to convince me that Thomas Bayes’ probability theorem is flexible enough to improve my social life.

‘Degrees of belief are called credences. These credences can be given numerical values between 0 and 1 (where 1 is being completely certain), to demonstrate how strong that degree of belief is. Importantly, these values are not forever fixed, and can change when given reason to do so… But how do you rationally alter your credence, and figure out how strong it should be, given the information that you have? Cue Bayesian probability theory to calculate conditional credences. A credence is conditional upon information when it is evaluated with regard to that information, such that the strength of the belief is sensitive to that information and is updated on the basis of it… But what if my credence is completely irresponsive to such evidence? … This is what it is like to have credence 1, in other words, a belief of certainty, which could not be any stronger and cannot be updated. It cannot be updated in either direction – it cannot get stronger because it is already at maximum strength, and it cannot get weaker on the basis of evidence because it was not built on the basis of evidence in the first place.’ Uhmm… easy, right? And these are the rational changes to credence. ‘When your strength of feeling is sensitive to information about how things are, a philosopher would call it rational, as it develops in accordance with that information. Such is loving for a reason: with more reason comes more love, and when the reason goes, the love goes. This type of conditional love is an analogy to rational credences between 0 and 1 (not including the extremes), which change on the basis of evidence.’

Still with me…? I mean with Suki, because I’m not in any way with her…  Okay then, ‘Alternatively, unconditional love is love that will not change according to any information, as it was not built on the basis of information in the first place. This is love without reason… This type of love has an untouchable and irrational mind of its own. As with credence 1, it can change only irrationally – it does not abide by any Bayesian law and so cannot be updated… You fall in and out of unconditional love at the mercy of love itself… This is loving in spite of everything, rather than loving because of something, and so appears unaltered by reason… But this does not make the love stable. It is simply out of your control, and can literally go away for no reason!’
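Since Finn frames credences numerically, her point can even be put into arithmetic. What follows is my own toy sketch of Bayes’ rule -not anything from her essay, and the function name and numbers are invented for illustration: a middling credence shifts when evidence arrives, while a credence of exactly 1 -her analogue of unconditional love- cannot be moved by any evidence at all.

```python
# A minimal sketch of Bayesian updating of a credence (degree of belief).
# Credences live between 0 and 1; Bayes' theorem revises them when
# evidence arrives -except at the extremes, which are immovable.

def update_credence(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior credence after observing the evidence.

    prior: current credence that the hypothesis is true (0..1)
    p_evidence_if_true: probability of seeing this evidence if it is true
    p_evidence_if_false: probability of seeing it if it is false
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# 'Conditional love': a middling credence responds to evidence.
print(update_credence(0.5, 0.9, 0.2))   # rises toward certainty

# 'Unconditional love': credence 1 stays 1, whatever the evidence says.
print(update_credence(1.0, 0.1, 0.9))   # stays exactly 1.0
```

Run it and the first credence climbs from 0.5 toward certainty, while the second refuses to budge from 1 -loving, as she says, in spite of everything rather than because of something.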

It seems to me that the author is saying that conditional love is probably more predictable, or maybe controllable than unconditional love, because it is not subject to random (uncaused) fluctuations. It’s not as liable to be indiscriminately, or inadvertently snatched away. Nice. But have I learned any non-obfuscatory take-home lessons? Is it readily transferrable to any situations other than amongst rhetoricians? Could I use it in the car on the way home, in other words?

Sometimes the grandest ideas fall short of the mark in actual combat… sorry, relationships. How, in practice, and when you’re just getting to know somebody, can you possibly profess conditional love? And why would you? It sounds like a sort of one-time stand thing. It is, of course, but normal rules of courtship require hyperbole. Metaphors -as in: ‘My love is as constant as the northern star, of whose true-fixed and resting quality there is no fellow in the firmament. The skies are painted with unnumbered sparks. They are all fire and every one doth shine, but there’s but one in all doth hold her place.’ As long as she doesn’t know you’ve cribbed the lot from Shakespeare’s Caesar, and you don’t mess up the words, everybody wins.

People are attracted to metaphors -they conjure up sincerity without linking it to unconditionality. Without requiring the intrusion of credences into an otherwise emotionally friable situation. It seems to me there’s nothing but trouble in store for anyone who decides to numerically assign emotional attachment parameters on the way home from a lovely dinner in an expensive restaurant.

Anyway, Thomas Bayes was a Presbyterian minister, and heaven only knows what that entails in terms of the slideability of relationships. I mean, their Regulative principle of worship (according to Wikipedia, at least) ‘specifies that (in worship), what is not commanded is forbidden.’ I’m therefore not entirely convinced that he would approve of the commandeering of his theorems in the somewhat tottery realm of Love, whether or not it is entwined with the idea of worship.

On the other hand, he would no doubt denounce any effort to charm with untruths, or at least equivocatory declarations. I certainly admire Suki Finn’s attempt to clarify intrinsically opaque emotions, but I’m afraid it will not do. And to revert back to Philosophy -her specialty- for a moment, there are just too many perils for any practical application of a Kantian Categorical Imperative here, either.

It seems to me that I blundered into a more satisfactory solution to the declaration of Love: metaphor. It does not require any numerical assignations that might confuse or even spoil the moment; it does not even require positioning the feeling along a Bell curve for comparison with other loves you might have had. Nope, at the party -after you muster up the courage to ask her to dance- you merely say: ‘When you do dance, I wish you a wave o’ th’ sea, that you might ever do nothing but that’, or in the car on the way home, you just have to come up with something like, ‘O, how this spring of love resembleth the uncertain glory of an April day which now shows all the beauty of the sun…’ and let it go at that.

To wear an undeserved dignity

Lately, I’ve been worried about dignity -not my own, you understand, although I’m sure that could use a little work. I’m more concerned that what I assumed was an inherent quality possessed -if not always demonstrated- by us all, may not be as innate as I thought. An essay in the online publication Aeon, by Remy Debes, an associate professor of philosophy at the University of Memphis entitled Dignity is Delicate, helped me to understand some of its issues:

The word itself is derived from the Latin dignus, meaning ‘worthy’, but as with most words, it can be used in different ways, each with slightly different meanings. ‘Dignity has three broad meanings. There is an historically old sense of poise or gravitas that we still associate with refined manners, and expect of those with high social rank… Much more common is the family of meanings associated with self-esteem and integrity, which is what we tend to mean when we talk of a person’s own ‘sense of dignity’… Third, there is the more abstract but no less widespread meaning of human dignity as an inherent or unearned worth or status, which all human beings share equally.’

This latter aspect, which Debes calls the ‘moralized connotation’, ‘is the kind of worth everyone has, and has equally, just because we are persons.’ As Immanuel Kant wrote in his Groundwork for the Metaphysics of Morals in 1785: ‘whatever is above all price, and therefore admits of no equivalent, has a dignity.’ He also argued that we have a duty to treat other humans ‘always at the same time as an end, never merely as a means’ -with respect, in other words. Unfortunately, ‘the Groundwork wasn’t professionally translated until 1836. And even that translation wasn’t easily available until a revised edition appeared in 1869.’

So, in terms of its moral and ethical aspects, the concept of dignity is a recent one. ‘[U]ntil at least 1850, the English term ‘dignity’ had no currency as meaning anything like the ‘unearned worth or status of humans’, and very little such currency well into the 1900s. When the Universal Declaration of Human Rights (1948) used the terminology of human dignity to justify itself, this turned out to be a conceptual watershed.’

What am I missing here? As Debes illustrates in his essay, ‘the idea of human dignity is beset by hypocrisy. After all, our Western ethos evolved from, and with, the most violent oppression. For 200 years, we’ve breathed in the heady aspirations of liberty and justice for all, but somehow breathed out genocide, slavery, eugenics, colonisation, segregation, mass incarceration, racism, sexism, classism and, in short, blood, rape, misery and murder.’ So what is going on? Debes thinks ‘The primary way we have dealt with this shock and the hypocrisy it marks has been to tell ourselves a story – a story of progress… the story’s common hook is the way it moves the ‘real’ hypocrisy into the past: ‘Our forebears made a terrible mistake trumpeting ideas such as equality and human dignity, while simultaneously practising slavery, keeping the vote from women, and so on. But today we recognise this hypocrisy, and, though it might not be extinct, we are worlds away from the errors of the past.’

Of course, a still different way of explaining our abysmal lack of dignity is to suggest, not that we are getting better, but that we are getting worse -that there was a time when it was not so, and we need to try going back to that ‘better time’.

Uhmm, they can’t both be correct. Perhaps, like me, you have noticed the presence of gerunds (verbs functioning as nouns with -ing endings), or implied gerunds, in the description: from the Latin gerundum -‘that which is to be carried on’. In other words, that which is not yet completed, or is in the process of happening, and hopefully will be so in the indefinite future. As Debes writes, ‘facing up to the hypocrisy in our Western ethos requires resisting the temptation to scapegoat both the past and the present. We must not divorce ourselves from the fact that the present is possible only because of our past, the one we helped to create. Likewise, the existential question isn’t, are we really who we say we are? The question is, have we ever been?’

But why is everything so viscid? Humans have always been seen as valuable -the concept evolving through time. ‘The chorus in Sophocles’ Antigone, for example, praises man as the most ‘wondrous’ thing on Earth, a prodigy cutting through the natural world the way a sailor cuts through the ‘perilous’, ‘surging seas’ that threaten to engulf him.’ The word ‘dignity’ was not used, but it seems to me he was on the right track, although perhaps not in the sense that mankind’s value was incommensurable and couldn’t be exchanged for other kinds of worth as Kant had concluded.

Or how about Aristotle: ‘Dignity does not consist in possessing honours, but in deserving them’?

Even Shakespeare’s Hector, debating with Troilus whether Helen of Troy is worth going to war for, says: ‘Value dwells not in a particular will; it holds his estimate and dignity as well wherein ’tis precious of itself as in the prizer.’ In other words, value -dignity- isn’t subjective, it’s intrinsic.

So what has kept us from believing in that ‘inherent or unearned worth or status, which all human beings share equally’? Admittedly we are children of our era, and very few of us can escape from the Weltanschauung of our time, let alone the political and social ethos in which we find ourselves embedded. There is much that conspires to homogenize and temper our views, I suspect.

Maybe it was as simple as a fear of the unknown, and fear of disruption, that kept the lid on the pot -better the devil we know than the devil we don’t. Moral dignity –ethical dignity- did not accord with the status quo: keeping slaves, or a class system that offered wealth and status to the powerful; women were trapped in a never-ending cycle of pregnancies and children, and so were themselves essentially biologically enslaved… A clock will not work unless all of the parts are in their proper places.

So many levels: civilization -well, at least culture- has always been a matryoshka doll -‘a riddle wrapped in a mystery inside an enigma’, as Winston Churchill so famously said about Russia. But maybe, concealed inside the innermost layer, the sanctum sanctorum of the inner doll, a flower lives, not a minotaur.

We can only hope.

Fake Views

Don’t you think we try too hard sometimes? In our zeal to present minorities, or those less favoured in our community, in a more favourable light, I suppose we could be forgiven for cherry-picking examples of their accomplishments, or glossing over issues in which they do not excel, so long as there is no attempt to deceive. No serious effort to hyperbolize. All of us are multitudes, and usually only context decides what face we show. And that can be a problem when we judge the past by current standards. The danger is, as Shakespeare’s Antony explained about men like Caesar, that ‘the good is oft interred with their bones.’

We all do it, although often unconsciously. We pick situations from the past like apples from a tree, and assume that old flavours should accord with current tastes. They seldom match, of course, so the past risks being disappointing unless we paint it differently and demonstrate its relevance and connection to the present.

Usually this involves investigation, verification and interpretation -it is seldom possible to understand a novel by picking a page at random and drawing conclusions about its contents. Especially if the story is one that hasn’t even been written. An article in Aeon, an online publication, delves into the circulation of fake miniature paintings purporting to represent aspects of Islamic science -images that may well be misleading:

The essay was written by Nir Shafir, a historian of the early modern Ottoman Empire, at the University of California San Diego. And as he says, ‘The irony is that these fake miniatures and objects are the product of a well-intentioned desire: a desire to integrate Muslims into a global political community through the universal narrative of science. That wish seems all the more pressing in the face of a rising tide of Islamophobia.’ But he wonders just what science the counterfeiters hoped to find. ‘These fakes reveal more than just a preference for fiction over truth. Instead, they point to a larger problem about the expectations that scholars and the public alike saddle upon the Islamic past and its scientific legacy.’

But Shafir does raise an important question of whether the ends justify the means. ‘Using a reproduction or fake to draw attention to the rich and oft-overlooked intellectual legacy of the Middle East and South Asia might be a small price to pay for widening the circle of cross-cultural curiosity. If the material remains of the science do not exist, or don’t fit the narrative we wish to construct, then maybe it’s acceptable to imaginatively reconstruct them… However, there is a dark side to this progressive impulse. It is an offshoot of a creeping, and paternalistic, tendency to reject the real pieces of Islamic heritage for its reimagined counterparts. Something is lost when we reduce the Islamic history of science to a few recognisably modern objects, and go so far as to summon up images from thin air. We lose sight of important traditions of learning that were not visually depicted, whether artisanal or scholastic. We also leave out those domains later deemed irrational or unmodern, such as alchemy and astrology.’

‘Perhaps there’s a worry that the actual remnants of Islamic science simply can’t arouse the necessary wonder; perhaps they can’t properly reveal that Muslims, too, created works of recognisable genius. Using actual artefacts to achieve this end might demand more of viewers, and require a different and more involved mode of explanation. But failing to embrace this challenge means we lose an opportunity to expand the scope of what counted as genius or reflected wonder in the Islamic past.’

It’s an interesting point that he makes. I wonder how many other things are slipping beneath our radar -information we never had occasion to investigate. We still use pictures to disguise our own histories, of course -to freshen them up, and portray otherwise mundane realities in rosy lights. It’s not the same as adding colours to improve an already vaunted past, I suppose, but we often try to dandy up what we’ve boasted about. And the pictures that we take usually seek to portray things as we promised they’d be -to confirm what we want people to think about our lives. A vacation we hope others will envy, we picture in glowing scenes that disguise those moments of disappointment in the sites we visited or the food we ate.

My grandfather used to describe his early years in glowing terms, and every photograph depicted triumphs or events that made me envious. But for some reason, my father saw them differently. Life was hard for him, and there were few luxuries when he was growing up. Clearly, history is contextual, and there are as many pasts as there are participants in it.

But discrepancies in the telling don’t necessarily invalidate what we’ve tried to illustrate in selected photos. True, it’s unlikely we’ve Photoshopped the pictures, or fabricated them out of whole cloth like the Islamic miniatures, but we’re still trying to sell an image of the past that embodies the story we want believed -a story that casts us in a favourable light despite the way our circumstances may appear today.

And yet, the camouflage itself can be a façade. It hides some things merely because there is a belief they need to be disguised -veneered. But it is sometimes the perspective itself that is deceptive -or, perhaps more accurately, selective. None of us see the world through the same eyes; ‘vanquished’ and ‘victorious’ can both describe the same event, and yet colour it with different adjectives.

I have to wonder whether, in the long run, it really matters. Once it is history, it is up for grabs anyway because there is no one, lasting view of anything in that dark and smoky room. As Shakespeare’s King Henry says, ‘Presume not that I am the thing I was’.