Bad Samaritans?

I suspect this is an incredibly naïve, not to mention unpopular, opinion, but I suppose in these times of plague, I should be grateful we have borders -fences that keep them out, walls that keep us safe. But I’m not. I’ve always mistrusted borders: I’ve always been suspicious of boundaries that artificially set apart the denizens of one region -that privilege residents over non-residents, friends over strangers, our needs over theirs.

Call me unworldly, but what makes me special, and you not so? It seems to me the italics I have used to mark differences are as arbitrary as the differences they mark. We are all the same, and deserve the same consideration.

That said, we seem to be stuck with countries determined only to look after their own -even amid the global crisis in which we find ourselves in these special, but frightening, times. In a desperate fit of historical recidivism, we are attempting a re-balkanization of the world.

But what is a country, anyway? And does it have a special providence -or provenance, for that matter? I happened upon an interesting essay discussing much the same thing by Charles Crawford, who once served as UK Ambassador in Sarajevo and Belgrade: https://aeon.co/essays/who-gets-to-say-what-counts-as-a-country

As he writes -‘There are only two questions in politics: who decides? and who decides who decides? … Who gets to say what is or is not a country? For most of human history, nation states as we now recognise them did not exist. Territories were controlled by powerful local people, who in turn pledged allegiance to distant authorities, favouring whichever one their circumstances suited. In Europe, the tensions in this system eventually led to the Thirty Years’ War which… ended in 1648 with a thorough revision of the relationship between land, people and power. The resulting set of treaties, known as the Peace of Westphalia, introduced two novel ideas: sovereignty and territorial integrity. Kings and queens had ‘their’ people and associated territory; beyond their own borders, they should not meddle.’

Voilà, the modern idea of states, with loyalties only to themselves. But embedded in the concept were at least two principles -two problems: ‘The first is self-determination: the idea that an identified ‘people’ has the right to run its own affairs within its own state. The other is territorial integrity: the notion that the borders of an existing state should be difficult to change.’ But borders soon spawned customs and attitudes that were different from those on the other side –theirs were different from ours, so they must be different from us. An oversimplification, to be sure, but nonetheless a helpful guide, perhaps.

Borders can change, of course, but not easily, and often not without considerable turmoil. Think of ‘the separation of Bangladesh from Pakistan in 1971 [which] claimed up to a million lives… Ambiguous ceasefires can drag on indefinitely. Taiwan and its 23 million inhabitants live in a curious twilight zone of international law, recognised by only 22 smaller countries and the Vatican.’ Examples of each abound.

And not all borders were established to reconcile linguistic, ethnic, or religious differences. There are many examples, but perhaps the most egregious borders in modern times were those largely arbitrary ones in the Middle East, drawn in 1916 by two aristocrats: Mark Sykes of Britain and François Georges-Picot of France. As Wikipedia describes it: ‘it was a secret agreement between Britain and France with assent from the Russian Empire and Italy, to define their mutually agreed spheres of influence and control in an eventual partition of the Ottoman Empire.’

A famous quotation that encapsulates the attitude was that of Sykes: ‘At a meeting in Downing Street, Mark Sykes pointed to a map and told the prime minister: “I should like to draw a line from the “e” in Acre to the last “k” in Kirkuk.”’ -a straight line, more or less.

Crawford’s essay was intended to explain the continuing tensions in the Balkans, but it raises a pertinent question for these times -namely, ‘Should nations stay within their historical boundaries, or change as their populations do?’ Or, put another way, should boundaries remain impermeable to needs outside what I would term their arbitrary limits?

With the current pandemic, there are, no doubt, many reasons that could be offered for being selective at borders: family-first ones, by and large. We need to close our borders to support our own economy, feed our own people; in the midst of a global epidemic, it is not the time to sacrifice our own needs by offering altruism to others. Actually, it seems to me that the underlying belief is that migration -legal or otherwise- is a large contributor to the spread of the infection. But once a communicable virus is in the country, its own citizens also become vectors -and they far outnumber the refugees or migrants.

Rather than being focussed on borders and exclusion, efforts would likely be more intelligently spent on things like temporary isolation of anyone who may have been in areas where the epidemic may have been less controlled, and enforced social separation (social distancing) of everybody else. Consistent and frequently publicized advice, and updates about new developments to educate the public -all the public- are key to managing fear. And epidemics -they have a habit of evolving rapidly.

And testing, testing, testing. Unless and until we know who might have the infection and be a risk to others, we are essentially blinkered. It’s not the strangers among us who pose the risk, it’s those who are infected and either have no symptoms or are at the earliest stages of an infection that has not yet had time to declare itself.

The World Health Organization (and others) has pointed out that travel restrictions not only divert resources from the containment effort, they also have human costs. ‘Travel measures that significantly interfere with international traffic may only be justified at the beginning of an outbreak, as they may allow countries to gain time, even if only a few days, to rapidly implement effective preparedness measures. Such restrictions must be based on a careful risk assessment, be proportionate to the public health risk, be short in duration, and be reconsidered regularly as the situation evolves. Travel bans to affected areas or denial of entry to passengers coming from affected areas are usually not effective in preventing the importation of cases but may have a significant economic and social impact.’ And, as all of us realize -and expect- by now: ‘Travellers returning from affected areas should self-monitor for symptoms for 14 days and follow national protocols of receiving countries.’ Amen.

Turning away migrants often has some desired political effects, however: diverting attention away from the receiving country’s possible lack of preparedness and foresight. It’s seldom about the Science and more about Nationalism -further stoking fears of the other.

I think that at the moment we are forgetting the wisdom immortalized in that ancient Persian adage: This, too, will pass. The pandemic will exhaust itself, and likely soon become amenable both to a vaccine and to other medical therapies. And those affected will not soon forget -nor will those denied entry in their time of need. As our economies rebuild in its wake, we -and they- will need all the allies we can muster. Better to be remembered as a friend who helped than as someone who turned their back.

We really are all in this together. As one of my favourite poets, Kahlil Gibran, writes: ‘You often say, “I would give, but only to the deserving.” The trees in your orchard say not so… They give that they may live, for to withhold is to perish.’

That way madness lies

To portray something -to make it believable- there has to be at least some understanding by the audience of what is being portrayed. Much in the sense, I suppose, that was suggested in the 1974 paper in The Philosophical Review by the American philosopher Thomas Nagel, asking what it would be like to be a bat. Not so much how it would feel to have the added sense of sonar, or to be able to fly in the dark, but more about the bat’s consciousness of itself. As Wikipedia explains Nagel’s thinking: ‘an organism has conscious mental states, “if and only if there is something that it is like to be that organism—something it is like for the organism to be itself.”’

This is a roundabout way of wondering whether an audience could ever know if an actor is representing something realistically if they cannot imagine what it would be like to be that thing.

Mental illness seems sufficiently prevalent that most of us might be expected to recognize whether or not the author, or the actor, has captured its essence accurately; and yet, for those of us who have not experienced the wide panoply of its manifestations -the majority of us, I suspect- it is easy to be misled. The more gripping or sensational portrayals of illness might well come to stereotype the lot. To stigmatize the condition.

I was scrolling through the BBC Culture section when I happened upon an article that discusses some of these same issues: http://www.bbc.com/culture/story/20180828-how-cinema-stigmatises-mental-illness

‘… the film industry has generally shown a shaky vision of mental health … It’s not that cinema evades ‘taboo’ themes here; it’s more that it tends to swing wildly from sentimentality to sensationalism.’ To attract an audience -i.e. to make a profit- ‘creative drama is drawn to the complexity and fragility of the mind – but mainstream entertainment still demands a snappy fix. And the definition of ‘insanity’ is inherently problematic.’

I am reminded of the French philosopher Michel Foucault’s book Madness and Civilization -subtitled A History of Insanity in the Age of Reason. He felt that the concept of madness evolved over time: in the Renaissance (as a thoughtful summary in Wikipedia puts it), the mad were portrayed in art ‘as possessing a kind of wisdom – a knowledge of the limits of our world – and portrayed in literature as revealing the distinction between what men are and what they pretend to be … but the Renaissance also marked the beginning of an objective description of reason and unreason (as though seen from above) compared with the more intimate medieval descriptions from within society.’

Later, however, ‘in the mid-seventeenth century, the rational response to the mad, who until then had been consigned to society’s margins, was to separate them completely from society by confining them, along with prostitutes, vagrants, blasphemers and the like, in newly created institutions all over Europe.’ (The Great Confinement).

‘For Foucault the modern experience began at the end of the eighteenth century with the creation of places devoted solely to the confinement of the mad under the supervision of medical doctors, and these new institutions were the product of a blending of two motives: the new goal of curing the mad away from their family who could not afford the necessary care at home, and the old purpose of confining undesirables for the protection of society. These distinct purposes were lost sight of, and the institution soon came to be seen as the only place where therapeutic treatment can be administered.’

But, back to the BBC Culture depiction of the role of cinema, ‘our mainstream perceptions of ‘madness’ are still fixated with movie scenes – much more emphatically, in fact, than the novels or memoirs on which they might be based. A classic film like One Flew Over the Cuckoo’s Nest (1975) seals the impression of a soul-destroying psychiatric asylum, where livewire convict RP McMurphy (Jack Nicholson) feigns insanity to escape prison labour – yet is ultimately crushed by the system. The dramatic depiction of patient treatment, particularly its brutal electroconvulsive therapy sequences, had far-reaching impact. In 2011, The Telegraph went so far as to say that the film was responsible for “irreparably tarnishing the image of ECT”…’

Unfortunately, unlike many art forms, movies usually require a conclusion, a wrapping up of the story, and a realistic depiction of mental illness may not fit into that convenient format. There may be no black or white: not all characterizations can end either pleasantly or sadly -some are palimpsests, to be sure, but many can reach no definitive conclusions that would satisfy the average moviegoer. Hence the temptation to exaggerate, or at least frighten audiences into an odd manifestation of satisfaction.

The temptation, in other words, to see mental illness as alien, separate -like a creature we could not possibly understand because it is so different. As different, perhaps, as Nagel’s bat. But is it? Or was Foucault really on to something in his analysis of the way ‘madness’ seemed to be viewed in Renaissance literature and art -a view which accepted that at least some of the vagaries, some of the stigmata of mental illness, were merely variations of mental states that any of us could exhibit at times? And indeed, that occasionally intimated unique views on a world from which we might learn some important lessons -a world, though, that we might now discard, or shun as too bizarre. Too frightening. Too… real.

On the other hand, there is a danger of romanticizing the past, of airbrushing its naïveté into soft and reassuring colours; of assuming it was what it was because it had not yet been exposed to the unforgiving exigencies of current knowledge. A time when imagination and reality were sometimes allowed to merge. Encouraged to conflate.

It’s difficult to be certain where present-day arts can be placed on this spectrum of understanding mental illness -not least because it is difficult to know where it should be placed. But, suffice it to say, the more fully the illness is portrayed in all its complexity, the more we might be able to see it as a small, but important, part of the tapestry of existence -a fragment of the struggle that marks all our days. And, as for any vicissitude, where there is suffering, we must provide succour and relief, and where there is dissimilarity, offer understanding and acceptance. Tolerance. The soul, says the poet Kahlil Gibran, walks upon all paths. The soul walks not upon a line, neither does it grow like a reed. The soul unfolds itself like a lotus of countless petals.

Should We Bell the Cat?

What should you do at a dinner party if the hostess, say, declares that she believes something that you know to be inaccurate -or worse, that you consider repellent? Abhorrent? Should you wait to see how others respond, or take it upon yourself to attempt to correct her belief? If it is merely a divergence of opinion, it might be considered a doctrinaire exercise -a Catholic-versus-Protestant type of skirmish- and likely unwinnable.

But suppose it is something about which you are recognized to have particular credentials, so your response would not be considered merely an opinion, but rather a statement of fact? Should that alter your decision as to whether or not to take issue with her pronouncement? Would your silence imply agreement -acquiescence to a view that you know to be not only wrong, but offensive? And would your failure to contradict her signal something about her opinion to the others at the table? If it is an ethical issue, should you attempt to teach?

It is a difficult situation to be sure, and one that is no doubt difficult to isolate from context and the responsibilities incumbent upon a guest. Still, what should you do if, uncorrected, she persists in promulgating her belief? Should you leave the table, try to change the topic, or merely smile and wait to see if she is able to sway those around you to her views?

I can’t say that the situation has arisen all that often for me, to tell the truth -we tend to choose our friends, and they theirs, on the basis of shared values- but what risks might inhere in whatever course of action I might choose? I happened upon an insightful and intriguing article in Aeon, an online magazine, that touched on that very subject: https://aeon.co/ideas/should-you-shield-yourself-from-others-abhorrent-beliefs It was written by John Schwenkler, an associate professor in philosophy at Florida State University.

He starts by pointing out that ‘Many of our choices have the potential to change how we think about the world. Often the choices taken are for some kind of betterment: to teach us something, to increase understanding or to improve ways of thinking. What happens, though, when a choice promises to alter our cognitive perspective in ways that we regard as a loss rather than a gain?’

And further, ‘When we consider how a certain choice would alter our knowledge, understanding or ways of thinking, we do this according to the cognitive perspective that we have right now. This means that it’s according to our current cognitive perspective that we determine whether a choice will result in an improvement or impairment of that very perspective. And this way of proceeding seems to privilege our present perspective in ways that are dogmatic or closed-minded: we might miss the chance to improve our cognitive situation simply because, by our current lights, that improvement appears as a loss. Yet it seems irresponsible to do away entirely with this sort of cognitive caution… And is it right to trust your current cognitive perspective as you work out an answer to those questions? (If not, what other perspective are you going to trust instead?)’

You can see the dilemma: is the choice or opinion you hold based on knowledge, or simply belief? And here he employs a sort of thought experiment: ‘This dilemma is escapable, but only by abandoning an appealing assumption about the sort of grasp we have on the reasons for which we act. Imagine someone who believes that her local grocery store is open for business today, so she goes to buy some milk. But the store isn’t open after all… It makes sense for this person to go to the store, but she doesn’t have as good a reason to go there as she would if she didn’t just think, but rather knew, that the store were open. If that were the case she’d be able to go to the store because it is open, and not merely because she thinks it is.’

But suppose that by allowing an argument -an opinion, say- to be aired frequently or uncontested, you fear you might eventually be convinced by it? It’s how propaganda endeavours to convince, after all. What then? Do you withdraw, or smile and smile and see a villain (to paraphrase Hamlet)? ‘If this is on the right track, then the crucial difference between the dogmatic or closed-minded person and the person who exercises appropriate cognitive caution might be that the second sort of person knows, while the first merely believes, that the choice she decides against is one that would be harmful to her cognitive perspective. The person who knows that a choice will harm her perspective can decide against it simply because it will do so, while the person who merely believes this can make this choice only because that is what she thinks.’

This is philosophical equivocation, and Schwenkler comes close to admitting as much in his concluding paragraph.

As much as I enjoy the verbiage and logical progression of his argument, I have to admit to being a little disappointed in that conclusion, which seems to concede that he has painted himself into a corner: ‘What’s still troubling is that the person who acts non-knowingly and from a mere belief might still believe that she knows the thing in question: that climate change is a hoax, say, or that the Earth is less than 10,000 years old. In that case, she’ll believe that her choices are grounded in the facts themselves, and not just in her beliefs about them. She will act for a worse sort of reason than the sort of reason she takes herself to have. And what could assure us, when we exercise cognitive caution in order to avoid what we take to be a potential impairment of our understanding or a loss of our grip on the facts, that we aren’t in that situation as well?’

But I think what this teaches me is the value of critical analysis, not only of statements, but also of context. First, obviously, to assess the validity of whatever argument is being aired; then to decide whether an attempted refutation would contribute anything to the situation, or merely further entrench the individual in their beliefs, if only to save face. And as well, it’s important to step back for a moment, and assess the real reason I am choosing to disagree. Is it self-aggrandizement, dominance, or an incontestable conviction -incontestable based on knowledge, or on unprovable belief…?

I realize this is pretty confusing stuff -and, although profound, not overly enlightening- but sometimes we need to re-examine who it is we have come to be. In the words of the poet Kahlil Gibran, The soul walks not upon a line, neither does it grow like a reed. The soul unfolds itself like a lotus of countless petals.

To Be or Not to Be

We are all creatures of our cultures; we are all influenced, if not captured, by the ethos that affected our parents. And for most of us, it is where we feel the most comfortable. It does not require any clarification or justification -it just is the way things are. The way things are supposed to be. Anything else is not simply an eccentricity, it is an aberration.

I think one of the aspects of Age that sometimes earned respect in times past was the ability of some elders to stand aside from the fray and place things in context, to examine long-held opinions and wonder about their value. Unfortunately, these voices of wisdom were rare, and societal acceptance even rarer. At least until the time of social media, the Zeitgeist moved like a snail, and only things like war or disasters could reliably hurry it on its journey. We were limited by geography to parochial assumptions of normalcy. We had no reason to doubt that our values were appropriate and, in all likelihood, universal.

Gender was one of those self-evident truths about which there could surely be no dissenting views. We are what genitalia we possess, and our sex is assigned accordingly. Period. Indeed, for years I saw no reason to question this belief. I could understand same-sex relationships easily enough, but the need to interrogate the very idea of ‘sexual identity’ did not occur to me. As I said, I am a creature of my Time, my Culture.

There was a fascinating article in Aeon, an online publication, that helped me to understand how very blinkered my view had been. It was written by Sharyn Graham Davies, an associate professor in the School of Languages and Social Sciences at Auckland University of Technology:

https://aeon.co/essays/the-west-can-learn-from-southeast-asias-transgender-heritage?

‘[T]he very word ‘transgender’ came into common usage only in the past 20 years or so, and even the word ‘gender’ itself was popularised only in the 1970s. So we couldn’t even talk about transgender in a sophisticated way until practically yesterday.’ Indeed, some people seem to think that the whole idea of ‘transgendered’ people is not only strange, but also a novel, made-up aberration. ‘But there’s a problem if transgender is considered a particularly recent issue, or as a peculiarly Western phenomena. The perceived novelty of transgender people often leads to the accusation that they don’t have a right to exist, that they are just ‘making it up’, or even trying to cash in on some celebrity status, like that acquired by Caitlyn Jenner, the American TV personality who in 2015 transitioned from Bruce Jenner, the Olympian decathlete.’

Among other cultures, the author highlights the case of the bissu, ‘an order of spiritual leaders (often framed as a priest or shaman) who are commissioned to perform all sorts of tasks for the local community, such as helping those in power to make important decisions on topics such as marriage alliances, crop harvest dates and settlements of debt.’ They were likely first described in a letter written in 1544 to João de Albuquerque, the Portuguese bishop of the Indian state of Goa, by António de Paiva, a Portuguese merchant and missionary, when he was in Sulawesi in Indonesia.

And from the author of the article, we learn that ‘The [local] Bugis people thought that when a being became a woman or a man, that being could no longer communicate with the gods. Men and women were in some sense cut off from the gods that made them. But the gods had a means of communicating with humans: the bissu. Because the gods left the bissu undifferentiated – a combination of woman and man – they are accorded a position of influence. As the bissu bring together woman and man in one person, they can mediate between humans and gods through blessings.’

Interestingly, ‘Early indigenous manuscripts also talk of the bissu occupying a special social position because they combined female and male qualities. But the analytic tools available to these earlier commentators were slim – there was no word for anything like ‘gender’. Therefore, it is difficult to assess whether the bissu were considered a ‘third’ gender or as crossing from ‘one’ gender to the ‘other’ (transgender). However, what we can say is that there was a powerful sense of what today would be called ‘gender pluralism’.’

But this concept of pluralism was not confined to Indonesia by any means. For example, ‘According to Peter Jackson, a scholar at the Australian National University in Canberra, gender was not differentiated in Thailand until the 1800s. Before then, Thai princes and princesses dressed the same, with matching hairstyles. But in the 19th century, to impress the British, the Thai monarchy decided that women and men should be clearly differentiated with unique clothing and hairstyles… So in fact the Western gender system created ‘transgenderism’ – crossing from one gender to another made no sense in Thailand when there weren’t two strictly defined genders.’

So Graham Davies sums up her arguments about transgenderism by saying, ‘Human nature is diverse, and any attempt to split all 7 billion of us into one of just two categories based on mere genitals is both impossible and absurd. Transgender people have played crucial roles in societies throughout history.’

I find the article intriguing for several reasons, but I suppose the most compelling is that it calls into question what seems for most of us to be self-evident: that gender assignation should be commensurate with genital (or, nowadays, chromosomal) possession. Perhaps it is a good starting point in a culture that demarcates societal roles according to gender, but even a short step back would challenge the need for such rigid rules. Yes, maybe DNA does dictate that only the female is able to become pregnant and propagate the species, but why should it also demarcate other aspects of identity? Why should gender matter in a vocation, say, or even in a sport…? Surely it would be better to depend on quality of performance. And, for that matter, why should it be irrevocable once assigned?

I realize how naïve that sounds -self-identity seems to be such an important component of our ability to function in a multifaceted society. ‘While transgender often implies a crossing from one gender to the next, the use of third gender is a way to try to frame a separate and legitimate space for individuals who want to be considered outside this binary. The debate over terms and labels is fiercely contested, as any comments around the ever-increasing acronym LGBTQIA (for lesbian, gay, bisexual, transgender, queer, intersex and asexual) suggest.’

It seems to me that a helpful way to think about these roles is to understand that they do exist, and they have probably always existed. They are not new phenomena, nor are they bizarre. In fact, they are ‘abnormal’ only in that they usually do not represent the majority in most modern societies. But ‘abnormal’ does not mean aberrant -or necessarily perverse.

And yes, I also realize that the acceptance of cultural relativism swings like a pendulum over time, but I have to go back to something one of my favourite poets, Kahlil Gibran, wrote: Say not, “I have found the truth,” but rather, “I have found a truth.” Say not, “I have found the path of the soul.” Say rather, “I have met the soul walking upon my path.” For the soul walks upon all paths.

Beauty is bought by judgement of the eye?

Isn’t it interesting how differently we look at things? How the same bridge crossed by ten people becomes ten bridges? How beauty is so subjective? So ephemeral? Just think of how Shakespeare opened his second sonnet: When forty winters shall besiege thy brow and dig deep trenches in thy beauty’s field, thy youth’s proud livery, so gazed on now, will be a tattered weed, of small worth held.

And yet to some, beauty -however evanescent- seems a prize worth having, no matter the sacrifice. It seems unfair that it should have been doled out to some, but not to others. There are cultures where the inequity of this disparity is taken seriously; there are countries where beauty is felt to be a right to which all should be entitled no matter their social strata.

So accustomed am I to my own cultural mask, I have to admit that I had not realized that Brazil was such a place until I came across an article in the Conversation that addressed the issue. It was written by Alvaro Jarrin, an Assistant Professor of Anthropology at the College of the Holy Cross in Massachusetts. https://theconversation.com/in-brazil-patients-risk-everything-for-the-right-to-beauty-94159 ‘Brazil considers health to be a basic human right and provides free health care to all its citizens. […] In Brazil […] patients are thought of as having the “right to beauty.” In public hospitals, plastic surgeries are free or low-cost.’ But, ‘public hospitals remain severely underfunded, and most middle-class and upper-class Brazilians prefer to use private medical services.’

Jarrin feels there is a darker side to this medical largesse, however, in that the surgeries are frequently performed by more junior surgeons just learning their techniques (albeit likely under the supervision of more experienced surgeons, as is frequently the case even in the USA).

He goes on to say, ‘Yet these patients, most of whom were women, also told me that living without beauty in Brazil was to take an even bigger risk. Beauty is perceived as being so central for the job market, so crucial for finding a spouse and so essential for any chances at upward mobility that many can’t say no to these surgeries.’

‘Plastic surgery is considered an essential service largely due to the efforts of a surgeon named Ivo Pitanguy. In the late 1950s, Pitanguy […] convinced President Juscelino Kubitschek that the “right to beauty” was as basic as any other health need. Pitanguy made the case that ugliness caused so much psychological suffering in Brazil that the medical class could not turn its back on this humanitarian issue. In 1960, he opened the first institute that offered plastic surgery to the poor, one that doubled as a medical school to train new surgeons. It was so successful that it became the educational model followed by most other plastic surgery residencies around the country. In return for free or low-cost surgeries, working-class patients would help surgeons learn and practice their trade.’

The author seems to feel that the reconstructive aspects of plastic surgery -techniques for the treatment of burn victims and those with congenital deformities, etc.- have taken a back seat to techniques geared to aesthetic enhancement, however. ‘Since most of the surgeries in public hospitals are carried out by medical residents who are still training to be plastic surgeons, they have a vested interest in learning aesthetic procedures – skills that they’ll be able to later market as they open private practices. But they have very little interest in learning the reconstructive procedures that actually improve a bodily function or reduce physical pain. Additionally, most of Brazil’s surgical innovations are first tested by plastic surgeons in public hospitals, exposing those patients to more risks than wealthier patients.’

As a retired (gynaecological) surgeon myself, I have to say that I take issue with the naïve view Jarrin seems to have about the training of the resident surgeons he describes. After all, clearly it would be better for the young surgeon to learn techniques under the careful guidance of an experienced mentor than suddenly to be expected to possess the required expertise once she has passed her exams. Indeed, a selection bias is perhaps equally applicable to the anecdotes Jarrin quotes to demonstrate his contention. But, in fairness, I may be guilty of an insidiously perverted form of cultural relativism myself: I see my own world even when it’s not…

Cultural relativism, first popularized in the early twentieth century, attempts to understand and judge other cultures not by our own standards, but by theirs. It is a contextually rooted approach that can be devilishly difficult to achieve. We are all inherently cultural solipsists; we learn customs from the cradle and mistrust or actively disavow any deviations from those to which we have become habituated.

Even beauty itself is fraught. What is beautiful? Surely it is an ill-defined shadow on a rather large spectrum, its position tentative and arbitrary, depending, as it must, on time and measurement. Shakespeare knew that. We all know that… Or do we? Are there unequivocal, objective criteria that must be met, or are they entirely subjectively defined? Culturally allotted? Surgically assigned?

No one has defined beauty more bewitchingly, in my opinion, than the poet Kahlil Gibran, a Lebanese-American writer and artist, in The Prophet. When the prophet is asked about beauty, he replies:

… beauty is not a need but an ecstasy.
It is not a mouth thirsting nor an empty hand stretched forth,
But rather a heart enflamed and a soul enchanted.

It is not the image you would see nor the song you would hear,
But rather an image you see though you close your eyes and a song you hear though you shut your ears.
It is not the sap within the furrowed bark, nor a wing attached to a claw,
But rather a garden for ever in bloom and a flock of angels for ever in flight.

… beauty is life when life unveils her holy face.
But you are life and you are the veil.
Beauty is eternity gazing at itself in a mirror.
But you are eternity and you are the mirror.

I cannot criticize the cultural ethos of Brazil, or its need for beauty; I can only wonder whether they will ever find what they are so desperately seeking. Who can touch a rainbow just by reaching?

A Childless Motherhood

Well of course! Did we think there would be no consequences? Did we actually think we could get away with it? That there weren’t two sides to the story that we all needed to hear?

Sometimes I think we are so focused on our journey to right a wrong that we wander off the path to those we hope to save. Things are too partitioned -a modern-day rendition of the biblical Matthew 6:3, where the left hand does not know what the right hand is doing… Or, perhaps, is not doing.

If one side of a page seems to contain all the information I seek, I may miss what’s written on the back. I feel no need to turn it over. An article in the Conversation turned the page for me:

https://theconversation.com/losing-children-to-foster-care-endangers-mothers-lives-93618

The author, Elizabeth Wall-Wieler, a PhD student in Community Health Sciences at the University of Manitoba, writes that ‘Mothers whose children are placed in foster care are at much higher risk of dying young, particularly due to avoidable causes like suicide. When a child is placed in foster care, most of the resources are focused on the child, with little to no support for the mothers who are left behind.’

In retrospect, of course, it seems obvious -the mother-child bond is not something easily missed, and whether or not we attribute it to physiological changes such as oxytocin levels in her blood, or to less reductionist, atavistic mechanisms, it is a powerful thing, dismissed only at her -and our- peril.

The author was involved in two large studies, one of them published in the Canadian Journal of Psychiatry, which ‘[…] looked at suicide attempts and suicide completions among mothers whose children were placed in care.

‘In this study, we compared rates of suicide attempts and suicides between 1,872 mothers who had a child placed in care with sisters whose children were not placed in care. We found that the rate of suicide attempts was 2.82 times higher, and the rate of death by suicide was more than four times higher for mothers whose children were not in their custody. […] Mothers whose children are taken into care often have underlying health conditions, such as mental illness and substance use. In both studies, we took pre-existing health conditions into account, so that was not the reason for the higher mortality rates we found.’

And, the author feels, ‘Most legislation pertaining to child protection services indicates that families should be supported, but the guidelines around what is expected of the child welfare system when it comes to the biological mothers are not clear. The main role of social workers is to ensure that the child is doing well. Social workers are already so busy, so it is often hard for them to justify spending their limited time to help mothers resolve challenges and work with them to address their mental and physical health needs.’

Other studies have also addressed the issue of sending children to foster care: ‘A study in Sweden found that by age 18, more than 16 per cent of children who had been in foster care had lost at least one parent (compared to three per cent of children who had not been in foster care). By age 25, one in four former foster children had lost at least one parent (compared to one in 14 in the general population). This means that many children in foster care don’t get the chance to be reunited with their families.’

I thought that the whole idea of fostering a child was care and sustenance until a more permanent placement was achieved or, ideally, the birthparent was able to reassume custody. This is perhaps more likely if the child can be placed with members of the same family -grandmothers, aunts, etc.- but even then, if the mother does not receive adequate support and treatment for the condition that led to the apprehension of her child, the results are apt to be the same.

In Canada, it seems, the mothers most affected are those from the indigenous community -our First Nations. The Canadian Minister of Indigenous Services, Jane Philpott, addressed indigenous leaders about this issue at a two-day emergency meeting on Indigenous Child and Family Services in Ottawa in January, 2018. http://www.cbc.ca/radio/thecurrent/a-special-edition-of-the-current-for-january-25-2018-1.4503172/we-must-disrupt-the-foster-care-system-and-remove-perverse-incentives-says-minister-jane-philpott-1.4503253 ‘The care system is riddled with “perverse incentives”. Children are being apprehended for reasons ranging from poverty to the health and addiction issues faced by their parents. In some provinces, rules around housing mean that your children can be taken away if you don’t have enough windows. “Right now dollars flow into the child welfare system according to the number of kids that are apprehended.” […] If financial incentives were based on “how many children we were able to keep in homes, how well we were able to support families — then in fact there would be no financial reason why the numbers would escalate.”’

But it’s not too difficult to read something else into all of this, of course. Uncondoned behaviour -behaviour frequently associated with poverty or marginalization- is often penalized, isn’t it? Sometimes it is as simple as avoiding the transgressing community, further marginalizing it, but increasingly it is intolerance. Refusal to address the underlying issues. Not even trying to understand.

I admit that it is a difficult journey, and the road that winds between the abused child and its troubled parent is fraught. To empathize with the mother when her conduct may have been so clearly unacceptable is seen as anathema. And yet an attempt to understand is not a plea for condonation, merely a search for a solution. Nobody should get away with family neglect -but nothing happens in a vacuum. And there are always unintended consequences, aren’t there? Even our best intentions miss something in retrospect -solve one problem, create another. Our focus is often far too narrow -helping one person misses the one standing beside her.

Perhaps it’s time for us to stand back. As Ms Wall-Wieler puts it, ‘Specific guidelines need to be put in place to make sure that mothers are supported when their child is taken into care. This would improve the chances of reunification. And, by virtue of being a human worthy of treatment with dignity, mothers deserve support, even if it does not directly relate to how she interacts with her child(ren).’

‘Of the good in you I can speak, but not of the evil.
For what is evil but good tortured by its own hunger and thirst?’
Kahlil Gibran

In choice, we are so oft beguiled

It’s interesting just how important categories are in our lives, isn’t it? I mean, let’s face it, often they’re just adjectives –subordinate to their nouns. Add-ons. And yet, they can frame context, colour perception, and even determine value. Some, like, say, texture or odour, may be interesting but trivial; some –size, or cost, for example- may be more important although optional in a description. There are, however, categories that seem to thrust themselves upon an object and are deemed essential to its description, essential to placing it in some sort of usable context. To understanding its Gestalt. These often spring to mind as questions so quickly they are almost automatic. Gender is one such category; age, perhaps, another. And depending, I suppose, on the situation, the society, or even the category to which the listener belongs, there may be several others that are deemed necessary to frame the issue appropriately.

The automaticity of a category is critical, however. If the category is felt to be of such consuming importance that it needs to be established before any further consideration can be given to the object, then that object’s worth –or at least its ranking- is contingent. It is no longer being evaluated neutrally, objectively. It comes replete with those characteristics attendant upon its category –intended or not. Age, for example, wears certain qualities, incites certain expectations that might prejudice acceptance of its behaviour. Gender, too, is another category that seems to colour assumptions about behaviour. So, with the assignation of category, comes opinion and its accompanying attitude.

One might well argue about the importance of these categories, and perhaps even strategize ways of neutralizing their influence on reactions, or subsequent treatment. The problem is much more difficult if knowledge of the category is deemed so essential that it is intuitively supplied as part of what we need to know about, for example, a person.

I suspect that in my naïveté, I had assumed that foreknowledge of many of these categories was merely curiosity-driven. Politeness-oriented. Important, perhaps, so that I wouldn’t be surprised -wouldn’t embarrass the person at our initial encounter. But I am a doctor, and maybe see the world from a different perspective. A piece in the BBC, however, made me realize just how problematic this automaticity had become. How instinctive. http://www.bbc.com/future/story/20130423-is-race-perception-automatic?ocid

The article dealt mainly with its effects on racism, and the difficulties of countering it if we accept, as some evolutionary psychologists seem to believe, that it is basically intuitive. Evolved for a reason. Wired-in. ‘[…] if perceiving race is automatic then it lays a foundation for racism, and appears to put a limit on efforts to educate people to be “colourblind”, or put aside prejudices in other ways.’ But, as Tom Stafford, the author of the BBC article, puts it, ‘Often, scientific racists claim to base their views on some jumbled version of evolutionary psychology (scientific racism is racism dressed up as science, not racism based on science […]). So it was a delightful surprise when researchers from one of the world centres for evolutionary psychology intervened in the debate on social categorisation, by conducting an experiment they claimed showed that labelling people by race was far less automatic and inevitable than all previous research seemed to show.

‘The research used something called a “memory confusion protocol” […] When participants’ memories are tested, the errors they make reveal something about how they judged the pictures of individuals. […] If a participant more often confuses a black-haired man with a blond-haired man, it suggests that the category of hair colour is less important than the category of gender (and similarly, if people rarely confuse a man for a woman, that also shows that gender is the stronger category). Using this protocol, the researchers tested the strength of categorisation by race, something all previous efforts had shown was automatic. The twist they added was to throw in another powerful psychological force – group membership. People had to remember individuals who wore either yellow or grey basketball shirts. […] Without the shirts, the pattern of errors were clear: participants automatically categorised the individuals by their race (in this case: African American or Euro American). But with the coloured shirts, this automatic categorisation didn’t happen: people’s errors revealed that team membership had become the dominant category, not the race of the players. […] The explanation, according to the researchers, is that race is only important when it might indicate coalitional information – that is, whose team you are on. In situations where race isn’t correlated with coalition, it ceases to be important.’

I don’t know… To me, this type of experiment seems so desperate to appear to be wearing a scientific mantle that it comes across as contrived –kludged, if you’ll permit an equally non-scientific term. But I take their point. If there is some way of diffusing the automaticity of our categorizations –or at least deflecting them into more malleable descriptors –teams, in this case- perhaps they could be used as exemplars –wedges to mitigate otherwise uncomfortable feelings. Placeboes –to put the concept into more familiar language for me.

Stopgaps, to be sure, and not permanent solutions. But sometimes, we have to ease into things less obtrusively. Less confrontationally. A still-evolving example -at least here in Canada- might be gender bias in hockey. Most Canadians have grown up exposed to hockey, and might be reasonably assumed to have an opinion on the conduct of games, players, and even rules. And yet, until relatively recently, the assumption was that hockey players –good ones, at least- were male. For us older folks, it was automatic. No thought required; no need to ask about gender. But no longer is that the case. For a variety of reasons, there is still no parity, and yet it is changing –slowly, perhaps, but not conflictually. And so, despite any initial challenges, it is likely to succeed.

Am I really conflating success in the changing mores of hockey with gender equality? Or basketball teams and how we view their members, with racial equality? Am I assuming that diminishing discrimination in some fields leads to wider societal effects? Yes, I suppose I am. A blotter doesn’t care about the kind, or the colour, of the ink it absorbs; it’s just what it does. What it is. And, in the end, isn’t that what we all are, however vehemently we may protest? However much we may resist the similarities that bind us in relationship for fear of losing our own identities?

But if we step back a little, we may come to appreciate that the correlation need not be like that of a blotter -need not involve a team, or a marriage… I am reminded of the advice from one of my favourite writers, the poet, Kahlil Gibran: Love one another, but make not a bond of love: let it rather be a moving sea between the shores of your souls.

It’s the way I prefer to see the world, anyway…