Are you really my friend?

There was something that Albert Camus, the Algerian-French philosopher, once wrote that has continued to inspire me since I first read it, so many years ago: “Don’t walk in front of me… I may not follow. Don’t walk behind me… I may not lead. Walk beside me… just be my friend.”

Friendship is a magical thing that is hard to define; it is like St. Augustine’s view of Time: you know what it is until someone asks. Poets, perhaps, with their metaphors, come closest to capturing it -Shakespeare, for example:

Those friends thou hast, and their adoption tried,
Grapple them unto thy soul with hoops of steel.

Or, the wisdom of Rumi, a 13th century Persian poet: ‘Friend, our closeness is this: anywhere you put your foot, feel me in the firmness under you.’

And even the humour of Oscar Wilde: ‘A good friend will always stab you in the front’.

And yet, despite the feeling that its essence remains just at the tip of our tongues, there has always been an abiding faith in friendships, a trust that, to paraphrase Abraham Lincoln, ‘I destroy my enemies when I make them my friends’. In more modern times, however, the concept of ‘friend’ has undergone a not-so-subtle shift -everything from ‘friending’ people on social media, to online bullying, to trolling individuals for their putative beliefs, to unintended disclosure of confidences in internet postings.

So should a friend always bear his friend’s infirmities, as Cassius asked Brutus, in Shakespeare’s Julius Caesar? There was a time when the answer seemed obvious; now I am not so sure.

Quite by chance, I came across an essay by Leah Plunkett, an associate dean at the University of New Hampshire’s Franklin Pierce School of Law, which raised the question of whether friendship should be policed -whether it should remain a simple state of loyalty or, if declared, entail a legal obligation, like, say, marriage.

The concept caught me totally by surprise. ‘Friendship is the most lawless of our close relationships,’ she writes. Somehow, the idea that there might even be a need of a legal framework for friendship seemed dystopian to me, so I read on.

‘Friends are tied to each other through emotions, customs and norms – not through a legally defined relationship, such as marriage or parenting, that imposes obligations. Anybody can become friends, we believe…  But with the advent of the digital domain, friendship has become more fraught. Online and off, we can share information about our friends without their permission and without legal restriction (except for slander and libel).’ But, of course, that means that ‘Information shared between friends can wind up being seen by people outside the friendship network who are not the intended audience…  confidences can inadvertently find their way to the public domain; all it takes is one careless email or the wrong privacy setting on a Facebook post.’

And there may even be legal consequences to what we or our friends have posted. ‘Digital social networks are already used to detain people trying to cross into the United States when statements by friends in their network are deemed by border agents to be suspicious or threatening.’ And, although most of us are aware that most social media platforms are collecting and selling our information, ‘Fewer recognise the third-party companies typically behind the scenes of our interactions, often using our information in unknown and uncontrollable ways in pursuit of their own goals.’

And yet, ‘Amid all this chaos, friendship itself remains unregulated. You don’t need a licence to become someone’s friend, like you do to get married. You don’t assume legal obligations when you become someone’s friend, like you do when you have a child. You don’t enter into any sort of contract, written or implied, like you do when you buy something.’ There’s no legal definition of ‘friend’, either.

But Plunkett has an interesting idea: some U.S. states (like New Hampshire, her own) have legal definitions of bullying -the state’s Pupil Safety and Violence Prevention Act (2000) defines what bullying entails for students in primary and secondary school. She wonders if it might be possible to apply its converse to define friendship: instead of saying you can’t harm a peer or their property, a friend would need to support a peer or their property, cause emotional comfort, and so on. And, ‘To engage in cyberfriendship, this behaviour would need to take place electronically.’ Interesting idea.

But, although promoting friendship -online or in person- is worthwhile, one clearly has to be careful about how rigorously any such definition is applied. ‘If you could be punished for not being a friend rather than for being a bully, that would undermine the lawlessness that makes friendship so generative.’

And Plunkett feels one has to be particularly careful about this lawlessness. ‘As friendship becomes less lawless, [and] more guarded by cybersurveillance… it might also become less about loyalty, affinity and trust, and more about strategy, currency and a prisoner’s dilemma of sorts (‘I won’t reveal what I know about you if you don’t reveal it about me’).’

It seems to me she is correct in suggesting that we would be unwise to imprison friendship in too tight a definition -we might find ourselves confined to the stocks for punishment and public humiliation, like misbehaving villagers in the 16th and 17th centuries. So, ‘Let’s keep paying our respects to those bonds of friendship that are lawless at heart, opening new frontiers within ourselves.’

And listen to the words of poets like Kahlil Gibran:

When your friend speaks his mind you fear not the “nay” in your own mind, nor do you withhold the “ay.”
And when he is silent your heart ceases not to listen to his heart;
For without words, in friendship, all thoughts, all desires, all expectations are born and shared, with joy that is unacclaimed.
When you part from your friend, you grieve not;
For that which you love most in him may be clearer in his absence, as the mountain to the climber is clearer from the plain.
And let there be no purpose in friendship save the deepening of the spirit.
For love that seeks aught but the disclosure of its own mystery is not love but a net cast forth: and only the unprofitable is caught.

If only…

Wast thou o’erlook’d, even in thy birth?

That Age can do some funny things to the mind seems fairly obvious. The accumulation of years brings with it a panoply of experience that, hopefully, enables a kind of personalised Weltanschauung to emerge -things begin to sort themselves onto the proper shelves, and even if they remain difficult to retrieve, there is a satisfaction that they are there, if not completely codified.

Of course, admixed with any elder ruminations are the ever-present intimations of imminent mortality -but it’s not that Age constrains the thought process to memento mori, so much as a flourishing of its antithesis: memento vivere. Age is a time for reflection about one’s life with a perspective from further up the hill.

And yet, for all the experiential input, there are two time frames hidden from each of us. What happens after death is the obvious one, to which most of us turn our attention as the final act draws to a close; but there is an equally shrouded area on which few of us spend any time: what, if anything, was preconceptual existence like? Is it the equivalent of death, perhaps minus the loss of an identity not yet acquired?

I wonder if it’s a subject more understandable to the very young than to the gnarled and aged. I remember the very first time I was taken to a movie theatre, somewhere around two or three years of age, I think. When I say ‘remember’, I mean to say I have only one recollection of the event: that of a speeding locomotive filmed in black-and-white from track level, roaring over the camera. It was very exciting, but I remember my father being very puzzled when I confessed that I’d seen it before. I hadn’t, of course, as he patiently explained to me, and yet it seemed to me I’d seen the same thing years before.

No doubt it was my still-immature neurons trying to make sense of the world, but the picture seemed so intuitively obvious to me at the time. And through the years, the image has stayed with me, as snippets of childhood memories sometimes do, although with the meaning now sufficiently expurgated as to be innocuous, as well as devoid of any important significance.

And then, of course, there was the Bridey Murphy thing that was all the rage when I was growing up in the 1950s. In my early teenage years I read the book The Search for Bridey Murphy, about a Colorado woman, Virginia Tighe, who, under hypnotic regression in the early 1950s, claimed she was the reincarnation of Bridey Murphy, a 19th-century Irish woman from Cork. I even went to see the movie of the same name. It was all pretty well debunked subsequently, but I suppose it was enough, at a tender age, to make me wonder about what might have happened before I became me.

At any rate, I am puzzled about why the seeming non-existence prior to conception is not something we think about more often. True, we would likely have no identity to put into that side of the equation, nor, for that matter, the loss of anything like friends or, well, existence, on the other, but still it is a comparable void. A wonderful mystery every bit as compelling as death.

I suppose the issue resurfaced for me a few years ago when I had a very vivid dream about our three-score-and-ten of existence. I saw myself as a bubble rising through some boiling water. While I was the bubble, I thought of myself as singular and not only separate from, but possessing an identity totally differentiated and unique from everything else around me. My life was the time it took me to rise to the surface. And yet when I arrived there, and my bubble burst and disappeared, when the me inside dissolved in the air from which I started, it all made sense. In fact, the encapsulated journey itself was an aberration, as was the idea of identity…

The dream lay fallow for several years and then reawakened, Phoenix-like, when I discovered an essay in the online publication Aeon, by Alison Stone, a professor of philosophy at Lancaster University in the UK.

‘Many people feel anxious about the prospect of their death,’ she writes. ‘Indeed, some philosophers have argued that death anxiety is universal and that this anxiety bounds and organises human existence. But do we also suffer from birth anxiety? Perhaps. After all, we are all beings that are born as well as beings that die… Once we bear in mind that we are natal as well as mortal, we see some ways in which being born can also occasion anxiety.’

I don’t believe she is thinking of what it must feel like to be born, so much as the transition from, well, the nothing before sperm and egg meet, to a something -to a somebody. She quotes the thoughts of the bioethicist David Albert Jones in his 2004 book The Soul of the Embryo: ‘We might be telling someone of a memory or event and then realise that, at that time, the person in front of us did not even exist! … If we seriously consider the existence and the beginning of any one particular human being … we realise that it is something strange and profound.’

Stone continues, ‘I began to exist at a certain point in time, and there is something mysterious about this. I haven’t always been there; for aeons, events in the world unfolded without me. But the transition from nonexistence to existence seems so absolute that it is hard to comprehend how I can have passed across it… To compound the mystery further, there was no single crossing point. In reality, we don’t begin in [a] sudden, dramatic way… Rather, I came into existence gradually. When first conceived, I was a single cell (a zygote). Then I developed a formed body and began to have a rudimentary level of experience during gestation. And once out of my mother’s womb, I became involved in culture and relationships with others, and acquired a structured personality and history. Yet the zygote that I began as was still me, even though it had none of this.’ Wow -you see what I mean?

Stone seems to think that all this is rather distressing, but I disagree. All I feel is a sense of profound, unbounded wonder at it all. Reflecting on that time-before-time is not unweaving the rainbow, as Keats was said to have accused Newton of doing because he had destroyed its poetry by actually studying it.

In fact, I’m reminded of something the poet Kahlil Gibran wrote: ‘And when you were a silent word upon Life’s quivering lips, I too was there, another silent word. Then life uttered us and we came down the years throbbing with memories of yesterday and with longing for tomorrow, for yesterday was death conquered and tomorrow was birth pursued.’

I have to believe there will still be poetry in the world -with or without us…

Wearing Life but as the fashion of a hat

Every once in a while I find that I am confronted by an idea which, even were I to have thought of it first, I would have put aside as of little relevance -or worse, of little consequence.

Clothing has always been one of those for me: it’s something you wear, not something you are. And despite the desperate claims by Fashionistas that it reflects an inner self -or at least would, if you let it- I’ve always found the argument largely specious: to reword Samuel Johnson’s quip about marriage, a triumph of hope over expenditure.

And yet, I was drawn into an essay about clothes -albeit reluctantly- written by Shahidha Bari, a lecturer in Romanticism at Queen Mary University of London, for Aeon.

I have to admit the article was not at all what I expected: I was neither deluged with praise for couture, nor subjected to shaming for my sartorial insouciance. At first, I was merely confused by her fascinating ruminations about clothes: ‘Ideas, we languidly suppose, are to be found in books and poems, visualised in buildings and paintings, exposited in philosophical propositions and mathematical deductions. They are taught in classrooms; expressed in language, number and diagram. Much trickier to accept is that clothes might also be understood as forms of thought, reflections and meditations as articulate as any poem or equation. What if the world could open up to us with the tug of a thread, its mysteries disentangling like a frayed hemline?’ What an utterly fascinating thought that what we wear is not merely a passive display, but has a voice of its own.

‘What if clothes were not simply reflective of personality, indicative of our banal preferences for grey over green, but more deeply imprinted with the ways that human beings have lived: a material record of our experiences and an expression of our ambition? What if we could understand the world in the perfect geometry of a notched lapel, the orderly measures of a pleated skirt, the stilled, skin-warmed perfection of a circlet of pearls?’

Do you see why I kept reading? The very idea that clothes have agency in and of themselves is powerful. She goes on to observe that ‘clothes are freighted with memory and meaning… In clothes, we are connected to other people and other places in complicated, powerful and unyielding ways, expressed in an idiom that is found everywhere, if only we care to read it.’

Bari seems to understand that ‘for all the abstract and elevated formulations of selfhood and the soul, our interior life is so often clothed… The garments we wear bear our secrets and betray us at every turn, revealing more than we can know or intend.’

But we cannot hide in clothes -as the poet Kahlil Gibran observes, ‘Your clothes conceal much of your beauty, yet they hide not the unbeautiful’. And Bari goes on to suggest that ‘to entrust to clothes the keeping of our secrets is a seduction in itself.’ I would have thought that this alone would have been fodder for the Philosophers, but as she goes on to explain, ‘the discipline of philosophy has rarely deigned to notice the knowledge to which dress makes claim, preferring instead to dwell on its associations with disguise and concealment.’

She seems to think that Plato had something to do with Philosophy’s aversion to treating clothes as a worthy adversary. ‘Haunted by Plato’s anxiety over how to distinguish truth from its ‘appearance’, and niggled by his injunction to see beyond an illusory ‘cave of shadows’ to a reality to which our back is turned, philosophy’s concept of truth is intractably aligned to ideas of light, revelation and disclosure.’

Still, in fairness, she turns her spotlight on various other philosophers and notes that although appearance has always been a fair topic for discussion, philosophy has rarely concerned itself with physical appearance or dress. And yet, after a tedious, albeit poetically expressed, litany of the views on clothes of characters both fictional and academic, she concludes with a one-sentence précis that I think might have made her point much sooner: ‘Philosophy might have forgotten dress, but all that language cannot articulate – the life of the mind, the vagaries of the body – is there, ready to be read, waiting to be worn.’

I did enjoy her metaphors and evocative language, and I have to admit that, until the latter half of the journey, I was swept along quite contentedly in the current of her thoughts. It reminded me of a recent conversation between two women, both laden with large cloth bags, who plonked themselves down beside me on a couch that break-watered the teeming throng of shoppers in a downtown mall. Both were middle-aged, and both spread themselves out as if I wasn’t there.

I’m not keen on being jostled on a seat, and was about to launch myself into the chaotic tide of passing elbows when I saw the woman next to me pull some garish fabric partly out of her bag to show it to her friend.

“What d’ya think Jesse?” she asked, stuffing whatever it was back in her bag once Jesse had seen it.

Jesse looked frazzled by the crowds, and her once-coiffed, greying hair floated in little strands from her head while her eyes stayed anchored on her face. “Colour’s interesting, Paula…” she said, after a noticeable pause.

“It’s a statement, Jess…” She relaxed her buxom frame further into the couch and settled an elbow into my rib without seeming to notice the infringement. “I think it’s time people noticed me.”

Jesse blinked and a weak smile surfaced on her lips for a moment. “I don’t think you need the hat, dear,” she added, as tactfully as the situation allowed.

I could see Paula’s eyes harden, and then the pressure on my rib cage lessened briefly as her hand searched the pockets of her incredibly wrinkled, ankle-length coat for a Kleenex. She blew her nose untidily and then tried to stuff what was left of the tissue back into the coat somewhere, and her elbow back into my side. “What are you saying, mirror-child?” she shot back. Clearly they were both tired, but I was beginning to enjoy the exchange.

“Just that you don’t have to wear a sign to attract attention…”

Paula’s face somehow retracted further into itself and her eyes peered out through the bars of their lashes like caged animals. And then, just as suddenly, her expression softened, and she shifted the position of her elbow again. “Oh, you mean that blouse, I bought…?” A smile darted onto her lips and stayed there like a runner that had made it safely to second base. “It’s really more me, isn’t it?”

Jesse’s eyes twinkled mischievously as she nodded. “But I don’t think you should wear them together, do you…?”

I could feel, as well as see Paula sigh. “You’re right, dear,” she said, as they both struggled to their feet. “I’m someone else with the hat on, aren’t I?” Another smile surfaced briefly, like a seal. “But it’s always nice to have a choice, Jess,” Paula added, hefting her bag onto her shoulder. Then pulling her friend with her free hand, they both stepped into the ever-passing flood like branches falling together in a river and were swept away.

I think you learn a lot about philosophy in malls if you’re patient…

Bad Samaritans?

I suspect this is an incredibly naïve, not to mention unpopular, opinion, but I suppose in these times of plague, I should be grateful we have borders -fences that keep them out, walls that keep us safe. But I’m not. I’ve always mistrusted borders: I’ve always been suspicious of boundaries that artificially set apart the denizens of one region -that privilege residents as opposed to non-residents, friends versus strangers, our needs compared to theirs.

Call me unworldly, but what makes me special, and you not so? It seems to me the italics I have used to mark differences are as arbitrary as the differences they mark. We are all the same, and deserve the same consideration.

That said, we seem to be stuck with countries determined only to look after their own -even with the global crisis in which we find ourselves in these special, but frightening times. In a desperate attempt at historical recidivism, we are attempting a re-balkanization of the world.

But what is a country, anyway? And does it have a special providence -or provenance, for that matter? I happened upon an interesting essay by Charles Crawford, who once served as the UK Ambassador to Sarajevo and Belgrade, discussing much the same thing:

As he writes -‘There are only two questions in politics: who decides? and who decides who decides? … Who gets to say what is or is not a country? For most of human history, nation states as we now recognise them did not exist. Territories were controlled by powerful local people, who in turn pledged allegiance to distant authorities, favouring whichever one their circumstances suited. In Europe, the tensions in this system eventually led to the Thirty Years’ War which… ended in 1648 with a thorough revision of the relationship between land, people and power. The resulting set of treaties, known as the Peace of Westphalia, introduced two novel ideas: sovereignty and territorial integrity. Kings and queens had ‘their’ people and associated territory; beyond their own borders, they should not meddle.’

Voila, the modern idea of states, with loyalties only to themselves. But embedded in the concept were at least two principles -two problems: ‘The first is self-determination: the idea that an identified ‘people’ has the right to run its own affairs within its own state. The other is territorial integrity: the notion that the borders of an existing state should be difficult to change.’ But borders soon spawned customs and attitudes that were different from those on the other side –theirs were different from ours, so they must be different from us. An oversimplification, to be sure, but nonetheless a helpful guide, perhaps.

Borders can change, of course, but not easily, and often not without considerable turmoil. Think of ‘the separation of Bangladesh from Pakistan in 1971 [which] claimed up to a million lives… Ambiguous ceasefires can drag on indefinitely. Taiwan and its 23 million inhabitants live in a curious twilight zone of international law, recognised by only 22 smaller countries and the Vatican.’ Examples of each, abound.

And not all borders were established to reconcile linguistic, ethnic, or religious differences. There are many examples, but perhaps the most egregious borders in modern times were those largely arbitrary ones in the Middle East drawn by two aristocrats, Mark Sykes from Britain and François Georges-Picot from France, in 1916. As Wikipedia describes it: ‘it was a secret agreement between Britain and France with assent from the Russian Empire and Italy, to define their mutually agreed spheres of influence and control in an eventual partition of the Ottoman Empire.’

A famous quotation that encapsulates the attitude was that of Sykes: ‘At a meeting in Downing Street, Mark Sykes pointed to a map and told the prime minister: “I should like to draw a line from the “e” in Acre to the last “k” in Kirkuk.”’-a straight line, more or less.

Crawford’s essay was intended to explain the continuing tensions in the Balkans, but it raises a pertinent question for these times -namely, ‘Should nations stay within their historical boundaries, or change as their populations do?’ Or, put another way, should boundaries remain impermeable to needs outside what I would term their arbitrary limits?

With the current pandemic, there are, no doubt, many reasons that could be offered for being selective at borders: family-first ones, by and large. We need to close our borders to support our own economy, feed our own people; in the midst of a global epidemic, it is not the time to sacrifice our own needs by offering altruism to others. Actually, it seems to me that the underlying belief is that migration -legal or otherwise- is a large contributor to the spread of the infection. But once a communicable virus is in the country, its own citizens also become vectors -and they far outnumber refugees or migrants.

Rather than being focussed on borders and exclusion, efforts would likely be more intelligently spent on things like temporary isolation of anyone who may have been in areas where the epidemic is less controlled, and enforced social separation (social distancing) for everybody else. Consistent, and frequently publicised, advice and updates about new developments to educate the public -all the public- are key to managing fear. And epidemics have a habit of evolving rapidly.

And testing, testing, testing. Unless and until we know who might have the infection and be a risk to others, we are essentially blinkered. It’s not the strangers among us who pose the risk, it’s those who are infected and either have no symptoms or who are at the earliest stages of an infection that has not yet had time to declare itself.

The World Health Organization (and others) have pointed out that travel restrictions not only divert resources from the containment effort, they also have human costs. ‘Travel measures that significantly interfere with international traffic may only be justified at the beginning of an outbreak, as they may allow countries to gain time, even if only a few days, to rapidly implement effective preparedness measures. Such restrictions must be based on a careful risk assessment, be proportionate to the public health risk, be short in duration, and be reconsidered regularly as the situation evolves. Travel bans to affected areas or denial of entry to passengers coming from affected areas are usually not effective in preventing the importation of cases but may have a significant economic and social impact.’ And, as all of us realize -and expect- by now: ‘Travellers returning from affected areas should self-monitor for symptoms for 14 days and follow national protocols of receiving countries.’ Amen.

Turning away migrants often has some desired political effects, however: diverting attention away from the receiving country’s possible lack of preparedness and foresight. It’s seldom about the Science and more about Nationalism -further stoking fears of the other.

I think that at the moment we are forgetting what was immortalized in that ancient Persian adage: This, too, will pass. The pandemic will exhaust itself, and likely soon become amenable to both a vaccine and other medical therapies. And those affected will not soon forget -nor will those denied entry in their time of need. As our economies rebuild in the pandemic’s wake, we -and they- will need all the allies we can muster. Better to be remembered as a friend who helped than as someone who turned their back.

We really are all in this together. As one of my favourite poets, Kahlil Gibran, writes: ‘You often say, “I would give, but only to the deserving.” The trees in your orchard say not so… They give that they may live, for to withhold is to perish.’

That way madness lies

To portray something -to make it believable- there has to be at least some understanding by the audience of what is being portrayed. Much in the sense, I suppose, that was suggested in the 1974 paper in The Philosophical Review by the American philosopher Thomas Nagel, asking what it would be like to be a bat. Not so much how it would feel to have the added sense of sonar, or to be able to fly in the dark, but more about its consciousness of itself. As Wikipedia explains Nagel’s thinking: ‘an organism has conscious mental states “if and only if there is something that it is like to be that organism—something it is like for the organism to be itself.”’

This is a roundabout way of wondering whether an audience could ever know if an actor is representing something realistically if they cannot imagine what it would be like to be that thing.

Mental illness seems sufficiently prevalent that most of us would be expected to understand whether or not the author, or the actor, has captured its essence accurately. And yet, for those of us who have not experienced the wide panoply of its manifestations -the majority of us, I suspect- we might easily be misled. The more gripping or sensational portrayals of illness might well come to stereotype the lot -to stigmatize the condition.

I was scrolling through the BBC Culture section when I happened upon an article that discusses some of these same issues:

‘… the film industry has generally shown a shaky vision of mental health … It’s not that cinema evades ‘taboo’ themes here; it’s more that it tends to swing wildly from sentimentality to sensationalism.’ To attract an audience -i.e. to make a profit- ‘creative drama is drawn to the complexity and fragility of the mind – but mainstream entertainment still demands a snappy fix. And the definition of ‘insanity’ is inherently problematic.’

I am reminded of the French philosopher Michel Foucault’s book Madness and Civilization -subtitled A History of Insanity in the Age of Reason. He felt that the concept of madness was evolving over time: in the Renaissance, (as a thoughtful summary in Wikipedia puts it) the mad were portrayed in art ‘as possessing a kind of wisdom – a knowledge of the limits of our world – and portrayed in literature as revealing the distinction between what men are and what they pretend to be … but the Renaissance also marked the beginning of an objective description of reason and unreason (as though seen from above) compared with the more intimate medieval descriptions from within society.’

Later, however, ‘in the mid-seventeenth century, the rational response to the mad, who until then had been consigned to society’s margins, was to separate them completely from society by confining them, along with prostitutes, vagrants, blasphemers and the like, in newly created institutions all over Europe.’ (The Great Confinement).

‘For Foucault the modern experience began at the end of the eighteenth century with the creation of places devoted solely to the confinement of the mad under the supervision of medical doctors, and these new institutions were the product of a blending of two motives: the new goal of curing the mad away from their family who could not afford the necessary care at home, and the old purpose of confining undesirables for the protection of society. These distinct purposes were lost sight of, and the institution soon came to be seen as the only place where therapeutic treatment can be administered.’

But, back to the BBC Culture depiction of the role of cinema, ‘our mainstream perceptions of ‘madness’ are still fixated with movie scenes – much more emphatically, in fact, than the novels or memoirs on which they might be based. A classic film like One Flew Over the Cuckoo’s Nest (1975) seals the impression of a soul-destroying psychiatric asylum, where livewire convict RP McMurphy (Jack Nicholson) feigns insanity to escape prison labour – yet is ultimately crushed by the system. The dramatic depiction of patient treatment, particularly its brutal electroconvulsive therapy sequences, had far-reaching impact. In 2011, The Telegraph went so far as to say that the film was responsible for “irreparably tarnishing the image of ECT…’

Unfortunately, unlike many art forms, movies usually require a conclusion, a wrapping up of the story, and a realistic depiction of mental illness may not fit into that convenient format. There may be no black or white: not all characterizations can end either pleasantly or sadly -some are palimpsests, to be sure, but many can reach no definitive conclusions that would satisfy the average moviegoer. Hence the temptation to exaggerate, or at least frighten audiences into an odd manifestation of satisfaction.

The temptation, in other words, to see mental illness as alien, separate -like a creature we could not possibly understand because it is so different. As different, perhaps, as Nagel’s bat. But is it? Or was Foucault really on to something in his analysis of the way ‘madness’ seemed to be viewed in Renaissance literature and art -a view which accepted that at least some of the vagaries, some of the stigmata of mental illness, were merely variations of mental states that any of us could exhibit at times? And indeed, that occasionally intimated unique views on a world from which we might learn some important lessons -a world, though, that we might now discard, or shun as too bizarre. Too frightening. Too… real.

On the other hand, there is a danger of romanticizing the past, of airbrushing its naïveté into soft and reassuring colours; of assuming it was what it was because it had not yet been exposed to the unforgiving exigencies of current knowledge. A time when imagination and reality were sometimes allowed to merge. Encouraged to conflate.

It’s difficult to be certain where present day arts can be placed on this spectrum of understanding mental illness -not the least because it is difficult to know where it should be placed. But, suffice it to say, the more fully the illness is portrayed in all its complexity, the more we might be able to see it as a small, but important part of the tapestry of existence -a fragment of the struggle that marks all our days. And, as for any vicissitude, where there is suffering, we must provide succour and relief, and where there is dissimilarity, offer understanding and acceptance. Tolerance. The soul, says the poet Kahlil Gibran, walks upon all paths. The soul walks not upon a line, neither does it grow like a reed. The soul unfolds itself like a lotus of countless petals.


Should We Bell the Cat?

What should you do at a dinner party if the hostess, say, declares that she believes something that you know to be inaccurate -or worse, that you consider repellent? Abhorrent? Should you wait to see how others respond, or take it upon yourself to attempt to correct her belief? If it is merely a divergence of opinion, it might be considered a doctrinaire exercise -a Catholic vs Protestant type of skirmish- and likely unwinnable.

But, suppose it is something about which you are recognized to have particular credentials, so your response would not be considered to be merely an opinion, but rather a statement of fact? Should that alter your decision as to whether or not to take issue with her pronouncement? Would your silence imply agreement -acquiescence to a view that you know to be not only wrong, but offensive? And would your failure to contradict her signal something about her opinion to the others at the table? If it is an ethical issue, should you attempt to teach?

It is a difficult situation to be sure, and one that is no doubt difficult to isolate from context and the responsibilities incumbent upon a guest. Still, what should you do if, uncorrected, she persists in promulgating her belief? Should you leave the table, try to change the topic, or merely smile and wait to see if she is able to sway those around you to her views?

I can’t say that the situation has arisen all that often for me, to tell the truth -we tend to choose our friends, and they theirs, on the basis of shared values- but what risks might inhere in whatever course of action I might choose? I happened upon an insightful and intriguing article that touched on that very subject in Aeon, an online magazine. It was written by John Schwenkler, an associate professor in philosophy at Florida State University.

He starts by pointing out that ‘Many of our choices have the potential to change how we think about the world. Often the choices taken are for some kind of betterment: to teach us something, to increase understanding or to improve ways of thinking. What happens, though, when a choice promises to alter our cognitive perspective in ways that we regard as a loss rather than a gain?’

And further, ‘When we consider how a certain choice would alter our knowledge, understanding or ways of thinking, we do this according to the cognitive perspective that we have right now. This means that it’s according to our current cognitive perspective that we determine whether a choice will result in an improvement or impairment of that very perspective. And this way of proceeding seems to privilege our present perspective in ways that are dogmatic or closed-minded: we might miss the chance to improve our cognitive situation simply because, by our current lights, that improvement appears as a loss. Yet it seems irresponsible to do away entirely with this sort of cognitive caution… And is it right to trust your current cognitive perspective as you work out an answer to those questions? (If not, what other perspective are you going to trust instead?)’

You can see the dilemma: is the choice or opinion you hold based on knowledge, or simply belief? And here he employs a sort of thought experiment: ‘This dilemma is escapable, but only by abandoning an appealing assumption about the sort of grasp we have on the reasons for which we act. Imagine someone who believes that her local grocery store is open for business today, so she goes to buy some milk. But the store isn’t open after all… It makes sense for this person to go to the store, but she doesn’t have as good a reason to go there as she would if she didn’t just think, but rather knew, that the store were open. If that were the case, she’d be able to go to the store because it is open, and not merely because she thinks it is.’

But suppose that by allowing an argument -an opinion, say- to be aired frequently or uncontested, you fear you might eventually be convinced by it? It’s how propaganda endeavours to convince, after all. What then? Do you withdraw, or smile and smile and see a villain (to paraphrase Hamlet)? ‘If this is on the right track, then the crucial difference between the dogmatic or closed-minded person and the person who exercises appropriate cognitive caution might be that the second sort of person knows, while the first merely believes, that the choice she decides against is one that would be harmful to her cognitive perspective. The person who knows that a choice will harm her perspective can decide against it simply because it will do so, while the person who merely believes this can make this choice only because that is what she thinks.’

This is philosophical equivocation -and Schwenkler himself seems aware of it.

As much as I enjoy the verbiage and logical progression of his argument, I have to admit to being a little disappointed in the concluding paragraph of the article, which seems to admit that he has painted himself into a corner: ‘What’s still troubling is that the person who acts non-knowingly and from a mere belief might still believe that she knows the thing in question: that climate change is a hoax, say, or that the Earth is less than 10,000 years old. In that case, she’ll believe that her choices are grounded in the facts themselves, and not just in her beliefs about them. She will act for a worse sort of reason than the sort of reason she takes herself to have. And what could assure us, when we exercise cognitive caution in order to avoid what we take to be a potential impairment of our understanding or a loss of our grip on the facts, that we aren’t in that situation as well?’

But, I think what this teaches me is the value of critical analysis -not only of statements, but also of context. First of all, obviously, to assess the validity of whatever argument is being aired, but then to decide whether an attempted refutation would contribute anything to the situation, or merely further entrench the individual in their beliefs, if only to save face. And as well, it’s important to step back for a moment and assess the real reason I am choosing to disagree. Is it self-aggrandizement, dominance, or an incontestable conviction -incontestable based on knowledge, or on unprovable belief…?

I realize this is pretty confusing stuff -and, although profound, not overly enlightening- but sometimes we need to re-examine who it is we have come to be. In the words of the poet Kahlil Gibran, The soul walks not upon a line, neither does it grow like a reed. The soul unfolds itself like a lotus of countless petals.

To Be or Not to Be

We are all creatures of our cultures; we are all influenced, if not captured, by the ethos that affected our parents. And for most of us, it is where we feel the most comfortable. It does not require any clarification, or justification -it just is the way things are. The way things are supposed to be. Anything else is not simply an eccentricity, it is an aberration.

I think one of the aspects of Age that sometimes earned respect in times past was the ability of some elders to stand aside from the fray and place things in context -to examine long-held opinions and wonder about their value. Unfortunately, these voices of wisdom were often rare, and societal acceptance even rarer. At least until the time of social media, the Zeitgeist moved like a snail, and only things like war or disasters could reliably hurry it on its journey. We were limited by geography to parochial assumptions of normalcy. We had no reason to doubt that our values were appropriate and, in all likelihood, universal.

Gender was one of those self-evident truths about which there could surely be no dissenting views. We are what genitalia we possess, and our sex is assigned accordingly. Period. Indeed, for years I saw no reason to question this belief. I could understand same-sex relationships easily enough, but the need to interrogate the very idea of ‘sexual identity’ did not occur to me. As I said, I am a creature of my Time, my Culture.

There was a fascinating article in Aeon, an online publication, that helped me to understand how very blinkered my view had been. It was written by Sharyn Graham Davies, an associate professor in the School of Languages and Social Sciences at Auckland University of Technology:

‘[T]he very word ‘transgender’ came into common usage only in the past 20 years or so, and even the word ‘gender’ itself was popularised only in the 1970s. So we couldn’t even talk about transgender in a sophisticated way until practically yesterday.’ Indeed, some people seem to think that the whole idea of ‘transgendered’ people is not only strange, but also a novel, made-up aberration. ‘But there’s a problem if transgender is considered a particularly recent issue, or as a peculiarly Western phenomenon. The perceived novelty of transgender people often leads to the accusation that they don’t have a right to exist, that they are just ‘making it up’, or even trying to cash in on some celebrity status, like that acquired by Caitlyn Jenner, the American TV personality who in 2015 transitioned from Bruce Jenner, the Olympian decathlete.’

Among other cultures, the author highlights the case of the bissu, ‘an order of spiritual leaders (often framed as a priest or shaman) who are commissioned to perform all sorts of tasks for the local community, such as helping those in power to make important decisions on topics such as marriage alliances, crop harvest dates and settlements of debt.’ They were likely first described in a letter written in 1544 to João de Albuquerque, the Portuguese bishop of the Indian state of Goa, by António de Paiva, a Portuguese merchant and missionary, when he was in Sulawesi in Indonesia.

And from the author of the article, we learn that ‘The [local] Bugis people thought that when a being became a woman or a man, that being could no longer communicate with the gods. Men and women were in some sense cut off from the gods that made them. But the gods had a means of communicating with humans: the bissu. Because the gods left the bissu undifferentiated – a combination of woman and man – they are accorded a position of influence. As the bissu bring together woman and man in one person, they can mediate between humans and gods through blessings.’

Interestingly, ‘Early indigenous manuscripts also talk of the bissu occupying a special social position because they combined female and male qualities. But the analytic tools available to these earlier commentators were slim – there was no word for anything like ‘gender’. Therefore, it is difficult to assess whether the bissu were considered a ‘third’ gender or as crossing from ‘one’ gender to the ‘other’ (transgender). However, what we can say is that there was a powerful sense of what today would be called ‘gender pluralism’.’

But this concept of pluralism was not confined to Indonesia by any means. For example, ‘According to Peter Jackson, a scholar at the Australian National University in Canberra, gender was not differentiated in Thailand until the 1800s. Before then, Thai princes and princesses dressed the same, with matching hairstyles. But in the 19th century, to impress the British, the Thai monarchy decided that women and men should be clearly differentiated with unique clothing and hairstyles… So in fact the Western gender system created ‘transgenderism’ – crossing from one gender to another made no sense in Thailand when there weren’t two strictly defined genders.’

So Graham Davies sums up her arguments about transgenderism by saying, ‘Human nature is diverse, and any attempt to split all 7 billion of us into one of just two categories based on mere genitals is both impossible and absurd. Transgender people have played crucial roles in societies throughout history.’

I find the article intriguing for several reasons, but I suppose the most compelling is that it calls into question what seems for most of us to be self-evident: that gender assignation should be commensurate with genital (or, nowadays, chromosomal) possession. Perhaps it is a good starting point in a culture that demarcates societal roles according to gender, but even a short step back would challenge the need for such rigid rules. Yes, maybe DNA does dictate that only the female is able to become pregnant and propagate the species, but why should it also demarcate other aspects of identity? Why should gender matter in a vocation, say, or even in a sport…? Surely it would be better to depend on quality of performance. And, for that matter, why should it be irrevocable once assigned?

I realize how naïve that sounds -self-identity seems to be such an important component of our ability to function in a multifaceted society. As Graham Davies notes, ‘While transgender often implies a crossing from one gender to the next, the use of third gender is a way to try to frame a separate and legitimate space for individuals who want to be considered outside this binary. The debate over terms and labels is fiercely contested, as any comments around the ever-increasing acronym LGBTQIA (for lesbian, gay, bisexual, transgender, queer, intersex and asexual) suggest.’

It seems to me that a helpful way to think about these roles is to understand that they do exist, and they have probably always existed. They are not new phenomena, nor are they bizarre. In fact, they are ‘abnormal’ only in that they usually do not represent the majority in most modern societies. But ‘abnormal’ does not mean aberrant -or necessarily perverse.

And yes, I also realize that the acceptance of cultural relativism swings on a wide pendulum over time, but I have to go back to something one of my favourite poets, Kahlil Gibran, wrote: Say not, “I have found the truth,” but rather, “I have found a truth.” Say not, “I have found the path of the soul.” Say rather, “I have met the soul walking upon my path.” For the soul walks upon all paths.


Beauty is bought by judgement of the eye?

Isn’t it interesting how differently we look at things? How the same bridge crossed by ten people becomes ten bridges? How beauty is so subjective? So ephemeral? Just think of how Shakespeare opened his second sonnet: When forty winters shall besiege thy brow and dig deep trenches in thy beauty’s field, thy youth’s proud livery, so gazed on now, will be a tattered weed, of small worth held.

And yet to some, beauty -however evanescent- seems a prize worth having, no matter the sacrifice. It seems unfair that it should have been doled out to some, but not to others. There are cultures where the inequity of this disparity is taken seriously; there are countries where beauty is felt to be a right to which all should be entitled no matter their social strata.

So accustomed am I to my own cultural mask, I have to admit that I had not realized that Brazil was such a place until I came across an article in The Conversation that addressed the issue. It was written by Alvaro Jarrin, an Assistant Professor of Anthropology at the College of the Holy Cross in Massachusetts. ‘Brazil considers health to be a basic human right and provides free health care to all its citizens. […] In Brazil […] patients are thought of as having the “right to beauty.” In public hospitals, plastic surgeries are free or low-cost.’ But, ‘public hospitals remain severely underfunded, and most middle-class and upper-class Brazilians prefer to use private medical services.’

Jarrin feels there is a darker side to this medical largesse, however, in that the surgeries are frequently performed by more junior surgeons just learning their techniques (albeit likely under the supervision of more experienced surgeons, as is frequently the case even in the USA).

He goes on to say, ‘Yet these patients, most of whom were women, also told me that living without beauty in Brazil was to take an even bigger risk. Beauty is perceived as being so central for the job market, so crucial for finding a spouse and so essential for any chances at upward mobility that many can’t say no to these surgeries.’

‘Plastic surgery is considered an essential service largely due to the efforts of a surgeon named Ivo Pitanguy. In the late 1950s, Pitanguy […] convinced President Juscelino Kubitschek that the “right to beauty” was as basic as any other health need. Pitanguy made the case that ugliness caused so much psychological suffering in Brazil that the medical class could not turn its back on this humanitarian issue. In 1960, he opened the first institute that offered plastic surgery to the poor, one that doubled as a medical school to train new surgeons. It was so successful that it became the educational model followed by most other plastic surgery residencies around the country. In return for free or low-cost surgeries, working-class patients would help surgeons learn and practice their trade.’

The author seems to feel that the reconstructive aspects of plastic surgery -techniques for the treatment of burn victims and those with congenital deformities, etc.- have taken a back seat to techniques geared to aesthetic enhancement, however. ‘Since most of the surgeries in public hospitals are carried out by medical residents who are still training to be plastic surgeons, they have a vested interest in learning aesthetic procedures – skills that they’ll be able to later market as they open private practices. But they have very little interest in learning the reconstructive procedures that actually improve a bodily function or reduce physical pain. Additionally, most of Brazil’s surgical innovations are first tested by plastic surgeons in public hospitals, exposing those patients to more risks than wealthier patients.’

As a retired (gynaecological) surgeon myself, I have to say that I take issue with the naïve view Jarrin seems to have about the training of the resident surgeons he describes. After all, clearly it would be better for the young surgeon to learn techniques under the careful guidance of an experienced mentor than to be expected suddenly to possess the required expertise once she has passed her exams. Indeed, a selection bias is perhaps equally applicable to the anecdotes Jarrin quotes to demonstrate his contention. But, in fairness, I may be guilty of an insidiously perverted form of cultural relativism myself: I see my own world even when it’s not…

Cultural relativism, first popularized in the early twentieth century, attempts to understand and judge other cultures not by our own standards, but by theirs. It is a contextually rooted approach that can be devilishly difficult to achieve. We are all inherently cultural solipsists; we learn customs from the cradle and mistrust or actively disavow any deviations from those to which we have become habituated.

Even beauty itself is fraught. What is beautiful? Surely it is an ill-defined shadow on a rather large spectrum, its position tentative and arbitrary, depending as it must, on time and measurement. Shakespeare knew that. We all know that… Or do we? Are there unequivocal, objective criteria that must be met, or are they entirely subjectively defined? Culturally allotted? Surgically assigned?

No one has defined beauty more bewitchingly, in my opinion, than the poet Kahlil Gibran, a Lebanese-American writer and artist, in The Prophet. When the prophet is asked about beauty, he replies:

… beauty is not a need but an ecstasy.
It is not a mouth thirsting nor an empty hand stretched forth,
But rather a heart enflamed and a soul enchanted.

It is not the image you would see nor the song you would hear,
But rather an image you see though you close your eyes and a song you hear though you shut your ears.
It is not the sap within the furrowed bark, nor a wing attached to a claw,
But rather a garden for ever in bloom and a flock of angels for ever in flight.

… beauty is life when life unveils her holy face.
But you are life and you are the veil.
Beauty is eternity gazing at itself in a mirror.
But you are eternity and you are the mirror.

I cannot criticize the cultural ethos of Brazil, or its need for beauty; I can only wonder whether they will ever find what they are so desperately seeking. Who can touch a rainbow just by reaching?



A Childless Motherhood

Well of course! Did we think there would be no consequences? Did we actually think we could get away with it? That there weren’t two sides to the story that we all needed to hear?

Sometimes I think we are so focused on our journey to right a wrong that we wander off the path to those we hope to save. Things are too partitioned -a modern-day rendition of the biblical Matthew 6:3, where the left hand does not know what the right hand is doing… Or, perhaps, is not doing.

If one side of a page seems to contain all the information I seek, I may miss what’s written on the back. I feel no need to turn it over. An article in the Conversation turned the page for me:

The author, Elizabeth Wall-Wieler, a PhD student in Community Health Sciences at the University of Manitoba, writes that ‘Mothers whose children are placed in foster care are at much higher risk of dying young, particularly due to avoidable causes like suicide. When a child is placed in foster care, most of the resources are focused on the child, with little to no support for the mothers who are left behind.’

In retrospect, of course, it seems obvious -the mother-child bond is not something easily missed, and whether or not we attribute it to physiological changes such as oxytocin levels in her blood, or less reductionist, atavistic mechanisms, it is a powerful thing, dismissed only at her -and our- peril.

The author was involved in two large studies, one of them published in the Canadian Journal of Psychiatry, which ‘[…] looked at suicide attempts and suicide completions among mothers whose children were placed in care.

‘In this study, we compared rates of suicide attempts and suicides between 1,872 mothers who had a child placed in care with sisters whose children were not placed in care. We found that the rate of suicide attempts was 2.82 times higher, and the rate of death by suicide was more than four times higher for mothers whose children were not in their custody. […] Mothers whose children are taken into care often have underlying health conditions, such as mental illness and substance use. In both studies, we took pre-existing health conditions into account, so that was not the reason for the higher mortality rates we found.’

And, the author feels, ‘Most legislation pertaining to child protection services indicates that families should be supported, but the guidelines around what is expected of the child welfare system when it comes to the biological mothers are not clear. The main role of social workers is to ensure that the child is doing well. Social workers are already so busy, so it is often hard for them to justify spending their limited time to help mothers resolve challenges and work with them to address their mental and physical health needs.’

Other studies have also addressed the issue of sending children to foster care: ‘A study in Sweden found that by age 18, more than 16 per cent of children who had been in foster care had lost at least one parent (compared to three per cent of children who had not been in foster care). By age 25, one in four former foster children had lost at least one parent (compared to one in 14 in the general population). This means that many children in foster care don’t get the chance to be reunited with their families.’

I thought that the whole idea of fostering a child was care and sustenance until a more permanent placement was achieved or, ideally, the birthparent was able to reassume custody. This is perhaps more likely if the child can be placed with members of the same family -grandmothers, aunts, etc.- but even then, if the mother does not receive adequate support and treatment for the condition that led to the apprehension of her child, the results are apt to be the same.

In Canada, it seems, the mothers most affected are those from the indigenous community -our First Nations. The Canadian Minister of Indigenous Services, Jane Philpott, addressed indigenous leaders about this issue at a two-day emergency meeting on Indigenous Child and Family Services in Ottawa in January, 2018. ‘The care system is riddled with “perverse incentives”. Children are being apprehended for reasons ranging from poverty to the health and addiction issues faced by their parents. In some provinces, rules around housing mean that your children can be taken away if you don’t have enough windows. “Right now dollars flow into the child welfare system according to the number of kids that are apprehended.” […] If financial incentives were based on “how many children we were able to keep in homes, how well we were able to support families — then in fact there would be no financial reason why the numbers would escalate.”’

But it’s not too difficult to read something else into all of this, of course. Uncondoned behaviour -behaviour frequently associated with poverty or marginalization- is often penalized, isn’t it? Sometimes the penalty is as simple as avoiding the transgressing community, further marginalizing it; but increasingly it is intolerance. Refusal to address the underlying issues. Not even trying to understand.

I admit that it is a difficult journey, and the road that winds between the abused child and its troubled parent is fraught. To empathize with the mother when her conduct may have been so clearly unacceptable is seen as anathema. And yet, an attempt to understand is not a plea for condonation, merely a search for a solution. Nobody should get away with family neglect -but nothing happens in a vacuum. And there are always unintended consequences, aren’t there? Even our best intentions miss something in retrospect -solve one problem, create another. Our focus is often far too narrow -helping one person misses the one standing beside her.

Perhaps it’s time for us to stand back. As Ms Wall-Wieler puts it, ‘Specific guidelines need to be put in place to make sure that mothers are supported when their child is taken into care. This would improve the chances of reunification. And, by virtue of being a human worthy of treatment with dignity, mothers deserve support, even if it does not directly relate to how she interacts with her child(ren).’

‘Of the good in you I can speak, but not of the evil.
For what is evil but good tortured by its own hunger and thirst?’
Kahlil Gibran


In choice, we are so oft beguiled

It’s interesting just how important categories are in our lives, isn’t it? I mean, let’s face it, often they’re just adjectives -subordinate to their nouns. Add-ons. And yet, they can frame context, colour perception, and even determine value. Some, like, say, texture or odour, may be interesting but trivial; some -size, or cost, for example- may be more important although optional in a description. There are, however, categories that seem to thrust themselves upon an object and are deemed essential to its description, essential to placing it in some sort of usable context. To understanding its Gestalt. These often spring to mind as questions so quickly they are almost automatic. Gender is one such category; age, perhaps, another. And depending, I suppose, on the situation, the society, or even the category to which the listener belongs, there may be several others that are deemed necessary to frame the issue appropriately.

The automaticity of a category is critical, however. If the category is felt to be of such consuming importance that it needs to be established before any further consideration can be given to the object, then that object’s worth -or at least its ranking- is contingent. It is no longer being evaluated neutrally, objectively. It comes replete with those characteristics attendant upon its category -intended or not. Age, for example, wears certain qualities, incites certain expectations that might prejudice acceptance of its behaviour. Gender, too, is another category that seems to colour assumptions about behaviour. So, with the assignation of category, comes opinion and its accompanying attitude.

One might well argue about the importance of these categories, and perhaps even strategize ways of neutralizing their influence on reactions, or subsequent treatment. The problem is much more difficult if knowledge of the category is felt to be so essential that it is intuitively provided as part of what we need to know about, for example, a person.

I suspect that in my naïveté, I had assumed that foreknowledge of many of these categories was merely curiosity-driven. Politeness oriented. Important, perhaps, so that I wouldn’t be surprised -wouldn’t embarrass the person at our initial encounter. But I am a doctor, and maybe see the world from a different perspective. A piece in the BBC, however, made me realize just how problematic this automaticity had become. How instinctive.

The article dealt mainly with its effects on racism, and the difficulties of countering it if we accept, as some evolutionary psychologists seem to believe, that it is basically intuitive. Evolved for a reason. Wired-in. ‘[…] if perceiving race is automatic then it lays a foundation for racism, and appears to put a limit on efforts to educate people to be “colourblind”, or put aside prejudices in other ways.’ But, as Tom Stafford, the author of the BBC article puts it, ‘Often, scientific racists claim to base their views on some jumbled version of evolutionary psychology (scientific racism is racism dressed up as science, not racism based on science […]). So it was a delightful surprise when researchers from one of the world centres for evolutionary psychology intervened in the debate on social categorisation, by conducting an experiment they claimed showed that labelling people by race was far less automatic and inevitable than all previous research seemed to show.

‘The research used something called a “memory confusion protocol” […] When participants’ memories are tested, the errors they make reveal something about how they judged the pictures of individuals. […] If a participant more often confuses a black-haired man with a blond-haired man, it suggests that the category of hair colour is less important than the category of gender (and similarly, if people rarely confuse a man for a woman, that also shows that gender is the stronger category). Using this protocol, the researchers tested the strength of categorisation by race, something all previous efforts had shown was automatic. The twist they added was to throw in another powerful psychological force – group membership. People had to remember individuals who wore either yellow or grey basketball shirts. […] Without the shirts, the pattern of errors was clear: participants automatically categorised the individuals by their race (in this case: African American or Euro American). But with the coloured shirts, this automatic categorisation didn’t happen: people’s errors revealed that team membership had become the dominant category, not the race of the players. […] The explanation, according to the researchers, is that race is only important when it might indicate coalitional information – that is, whose team you are on. In situations where race isn’t correlated with coalition, it ceases to be important.’
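Stripped of its apparatus, the protocol boils down to a simple piece of bookkeeping: tally how often the memory errors confuse two individuals who share a given category, and compare across categories. For readers who find such logic easier to follow as code, here is a minimal sketch of that tally. Everything in it -the individuals, the category labels, the error list, the resulting numbers- is invented purely for illustration; it is not the researchers’ data or code.

```python
# A toy sketch of the counting logic behind a "memory confusion protocol".
# All individuals, category values, and errors below are hypothetical.

# Each hypothetical individual carries two candidate categories.
people = {
    "A": {"race": "black", "team": "yellow"},
    "B": {"race": "white", "team": "yellow"},
    "C": {"race": "black", "team": "grey"},
    "D": {"race": "white", "team": "grey"},
}

# Hypothetical memory errors: (person actually shown, person the
# participant mistook them for). Confusions between individuals who
# SHARE a category value suggest that category dominated encoding.
errors = [("A", "C"), ("A", "C"), ("B", "D"), ("A", "B"), ("C", "D")]

def category_strength(errors, people, category):
    """Return the fraction of confusions made within the given category.

    A higher fraction means participants tended to mix up individuals
    who share that category value, i.e. the category was dominant.
    """
    within = sum(
        1 for shown, confused in errors
        if people[shown][category] == people[confused][category]
    )
    return within / len(errors)

for cat in ("race", "team"):
    print(f"{cat}: {category_strength(errors, people, cat):.1f}")
```

With this made-up error list, within-race confusions dominate (0.6 against 0.4 for team); the experiment’s finding was that adding the coloured shirts tipped that balance the other way, toward team membership.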

I don’t know… To me, this type of experiment seems so desperate to wear a scientific mantle that it comes across as contrived -kludged, if you’ll permit an equally unscientific term. But I take their point. If there is some way of defusing the automaticity of our categorizations -or at least of deflecting them into more malleable descriptors (teams, in this case)- perhaps these could be used as exemplars. Wedges to mitigate otherwise uncomfortable feelings. Placebos, to put the concept into more familiar language for me.

Stopgaps, to be sure, and not permanent solutions. But sometimes we have to ease into things less obtrusively. Less confrontationally. A still-evolving example -at least here in Canada- might be gender bias in hockey. Most Canadians have grown up exposed to hockey, and might reasonably be assumed to have an opinion on the conduct of games, players, and even rules. And yet, until relatively recently, the assumption was that hockey players -good ones, at least- were male. For us older folks, it was automatic. No thought required; no need to ask about gender. But that is no longer the case. For a variety of reasons there is still no parity, and yet it is changing -slowly, perhaps, but not conflictually. And so, despite any initial challenges, the change is likely to succeed.

Am I really conflating success in the changing mores of hockey with gender equality? Or basketball teams and how we view their members, with racial equality? Am I assuming that diminishing discrimination in some fields leads to wider societal effects? Yes, I suppose I am. A blotter doesn’t care about the kind, or the colour, of the ink it absorbs; it’s just what it does. What it is. And, in the end, isn’t that what we all are, however vehemently we may protest? However much we may resist the similarities that bind us in relationship for fear of losing our own identities?

But if we step back a little, we may come to appreciate that the correlation need not be like that of a blotter -need not involve a team, or a marriage… I am reminded of the advice of one of my favourite writers, the poet Kahlil Gibran: ‘Love one another, but make not a bond of love: let it rather be a moving sea between the shores of your souls.’

It’s the way I prefer to see the world, anyway…