Should We Bell the Cat?

What should you do at a dinner party if the hostess, say, declares that she believes something that you know to be inaccurate -or worse, that you consider repellent? Abhorrent? Should you wait to see how others respond, or take it upon yourself to attempt to correct her belief? If it is merely a divergence of opinion, it might be considered a doctrinaire exercise -a Catholic vs Protestant type of skirmish- and likely unwinnable.

But suppose it is something about which you are recognized to have particular credentials, so that your response would be considered not merely an opinion, but a statement of fact? Should that alter your decision as to whether or not to take issue with her pronouncement? Would your silence imply agreement -acquiescence to a view that you know to be not only wrong, but offensive? And would your failure to contradict her signal something about her opinion to the others at the table? If it is an ethical issue, should you attempt to teach?

It is a difficult situation to be sure, and one that is no doubt hard to isolate from context and from the responsibilities incumbent upon a guest. Still, what should you do if, uncorrected, she persists in promulgating her belief? Should you leave the table, try to change the topic, or merely smile and wait to see if she is able to sway those around you to her views?

I can’t say that the situation has arisen all that often for me, to tell the truth -we tend to choose our friends, and they theirs, on the basis of shared values- but what risks might inhere in whatever course of action I might choose? I happened upon an insightful and intriguing article that touched on that very subject in Aeon, an online magazine: https://aeon.co/ideas/should-you-shield-yourself-from-others-abhorrent-beliefs It was written by John Schwenkler, an associate professor of philosophy at Florida State University.

He starts by pointing out that ‘Many of our choices have the potential to change how we think about the world. Often the choices taken are for some kind of betterment: to teach us something, to increase understanding or to improve ways of thinking. What happens, though, when a choice promises to alter our cognitive perspective in ways that we regard as a loss rather than a gain?’

And further, ‘When we consider how a certain choice would alter our knowledge, understanding or ways of thinking, we do this according to the cognitive perspective that we have right now. This means that it’s according to our current cognitive perspective that we determine whether a choice will result in an improvement or impairment of that very perspective. And this way of proceeding seems to privilege our present perspective in ways that are dogmatic or closed-minded: we might miss the chance to improve our cognitive situation simply because, by our current lights, that improvement appears as a loss. Yet it seems irresponsible to do away entirely with this sort of cognitive caution… And is it right to trust your current cognitive perspective as you work out an answer to those questions? (If not, what other perspective are you going to trust instead?)’

You can see the dilemma: is the choice or opinion you hold based on knowledge, or simply on belief? And here he employs a sort of thought experiment: ‘This dilemma is escapable, but only by abandoning an appealing assumption about the sort of grasp we have on the reasons for which we act. Imagine someone who believes that her local grocery store is open for business today, so she goes to buy some milk. But the store isn’t open after all… It makes sense for this person to go to the store, but she doesn’t have as good a reason to go there as she would if she didn’t just think, but rather knew, that the store were open. If that were the case she’d be able to go to the store because it is open, and not merely because she thinks it is.’

But suppose that by allowing an argument -an opinion, say- to be aired frequently or uncontested, you fear you might eventually be convinced by it? It’s how propaganda endeavours to convince, after all. What then? Do you withdraw, or smile and smile and see a villain (to paraphrase Hamlet)? ‘If this is on the right track, then the crucial difference between the dogmatic or closed-minded person and the person who exercises appropriate cognitive caution might be that the second sort of person knows, while the first merely believes, that the choice she decides against is one that would be harmful to her cognitive perspective. The person who knows that a choice will harm her perspective can decide against it simply because it will do so, while the person who merely believes this can make this choice only because that is what she thinks.’

This is philosophical equivocation, and Schwenkler even admits as much: ‘What’s still troubling is that the person who acts non-knowingly and from a mere belief might still believe that she knows the thing in question… In that case, she’ll believe that her choices are grounded in the facts themselves, and not just in her beliefs about them. She will act for a worse sort of reason than the sort of reason she takes herself to have.’

As much as I enjoy the verbiage and logical progression of his argument, I have to admit to being a little disappointed in the concluding paragraph of the article, which seems to admit that he has painted himself into a corner. The person who acts from mere belief ‘might still believe that she knows the thing in question: that climate change is a hoax, say, or that the Earth is less than 10,000 years old… And what could assure us, when we exercise cognitive caution in order to avoid what we take to be a potential impairment of our understanding or a loss of our grip on the facts, that we aren’t in that situation as well?’

But I think what this teaches me is the value of critical analysis, not only of statements, but also of context. First, obviously, to assess the validity of whatever argument is being aired; then to decide whether an attempted refutation would contribute anything to the situation, or merely entrench the individual further in their beliefs, if only to save face. As well, it’s important to step back for a moment and assess the real reason I am choosing to disagree. Is it self-aggrandizement, dominance, or an incontestable conviction -incontestable based on knowledge, or on unprovable belief…?

I realize this is pretty confusing stuff -and, although profound, not overly enlightening- but sometimes we need to re-examine who it is we have come to be. In the words of the poet Kahlil Gibran, The soul walks not upon a line, neither does it grow like a reed. The soul unfolds itself like a lotus of countless petals.


Understanding as…

There is so much stuff out there that I don’t know -things that I hadn’t even thought of as knowledge. Things that I just accepted as ‘givens’. You know, take the ability to understand something like, say, an arrangement of numbers as a series rather than just a jumble of numbers, or the ability to extract meaning from some sounds -words spoken in English, for example- and yet not from others in a different language.

And, perhaps equally mysterious is the moment when that epiphany strikes. What suddenly changes those numbers into a series? Is it similar to what makes figure-ground alterations flip back and forth in my head: aspect perception? Is it analogous to the assignation of meaning to things -or, indeed, picking them out of the chaos of background and recognizing them as somehow special in the first place? Is it what Plato meant when he referred to the Forms –‘chairness’ or ‘tableness’ for example- abstractions that allow us to identify either, no matter how varied the shapes or sizes -the true essence of what things really are?

I suppose I’m becoming rather opaque -or is it obtuse?- but the whole idea of aspect perception, of ‘seeing as’, is an exciting, yet labyrinthine terra incognita, don’t you think? I’m afraid that what started it all was an essay in the online Aeon publication: https://aeon.co/ideas/do-you-see-a-duck-or-a-rabbit-just-what-is-aspect-perception

It was the edited version of an essay written by Stephen Law, the editor of the Royal Institute of Philosophy journal THINK. He begins by discussing some of the figure-ground changes found in, say, Necker cubes, whose sides keep flipping back and forth (a type of aspect perception), and then suggests that ‘A[nother] reason why changes in aspect perception might be thought philosophically significant is that they draw our attention to the fact that we see aspects all the time, though we don’t usually notice we’re doing so… For example, when I see a pair of scissors, I don’t see them as a mere physical thing – I immediately grasp that this is a tool with which I can do various things.’

Another example might be ‘…our ability to suddenly ‘get’ a tune or a rule, so we are then able to carry on ourselves.’ Or, how about religion? ‘The idea of ‘seeing as’ also crops up in religious thinking. Some religious folk suggest that belief in God doesn’t consist in signing up to a certain hypothesis, but rather in a way of seeing things.’ But then the caveat: ‘Seeing something as a so-and-so doesn’t guarantee that it is a so-and-so. I might see a pile of clothes in the shadows at the end of my bed as a monster. But of course, if I believe it’s a monster, then I’m very much mistaken.’

I have always loved wandering around bookstores. Maybe it’s an asylum -a refuge from the noisy street, or a spiritual sanctuary in a chaotic mall -but it’s more likely that the range and choice of books allows me to exercise an epiphanic region of my brain, and to practice ‘seeing as’ to my heart’s content. I’d never thought of bookstores as exercise before, of course, but I suppose the seed of ‘understanding as’ was sown by that article… or maybe it was the little girl.

Shortly after reading the essay, I found myself wandering blissfully through the quiet aisles of a rather large bookstore that seemed otologically removed from the noisy mall in which it hid. Coloured titles greeted me like silent hawkers in a park; the ones that sat dislodged from their otherwise tidy rows sometimes reached out to me with greater promise -curiosity, perhaps, as to why someone might have dislodged them. Nonetheless, I found myself amused at their choices: book shops are catholic in the selection they proffer, and I relish the opportunity to switch my perspectives… and expand my Weltanschauung, as the Philosophy section into which I had meandered might have put it when the thought occurred.

Of course, unexpected concepts like that are one of the delights of a bookstore -turn a corner into a different aisle and the world changes. It’s where I met the little girl talking to her mother about something in a book she was holding.

No more than four or five years old, she was wearing what I suppose was a pink Princess costume, and trying to be very… mature. Her mother, on the other hand, was dressed for the mall: black baseball cap, jeans, sneakers, and a grey sweatshirt with a yellow mustard stain on the front. Maybe they’d just come from a party or, more likely, the Food Court, but the mother was trying to explain something in the book to her little daughter. The aisle wasn’t in the children’s section, but seemed to have a lot of titles about puzzles and illusions, so maybe they’d wandered into it for something different: for surprises.

As I pretended to examine some books nearby, I noticed a Necker cube prominently displayed on the page the girl was holding open.

“Why does it do that, mommy?” Even as she spoke, the perspective of the cube was flipping back and forth, with one face, then another, seeming to be closer.

The mother smiled at this obvious teaching moment.

“It’s a great idea, anyway,” the daughter continued, before she got an answer.

“Idea…?” the mother said, with a patient look on her face. “What’s the idea, Angie?”

Angie scrunched her forehead and gave her mother a rather condescending look. “It’s an exercise book, remember?”

That apparently caught the mother by surprise. “It’s a book of puzzles and magic, sweetheart. I didn’t see any exercises.”

Angie rolled her eyes at her mother’s obvious obtuseness. “The nexercise cube, mommy…!”

“Necker’s cube, sweetie,” she responded, trying to suppress a giggle. “It’s not an exercise cube.”

But Angie was having none of that, and stared at her like a teacher with a slow pupil. “It keeps making my mind move, mommy!” She shook her head in an obviously disappointed rebuke. “That’s exercise.”

I slipped back around the corner, unnoticed by them both I think. I felt I’d intruded on a very intimate moment and I didn’t want to trespass, but I couldn’t help wondering if Angie had come far closer to understanding Plato’s Forms than her mother or I could ever hope to.

Life’s Fitful Fever

I have never been terribly interested in historical statues, I must confess. Pigeon-encrusted metal, or a mouldering stone person staring blankly at nothing and rooted firmly to a static prancing horse, does little to attract the attention of passersby like myself, with lives and histories of their own to contemplate. Its attempts to dominate a plaza, or commemorate a public square, still do not often produce sufficient motivation to inspect its fading plaque. Perhaps I am alone in this, but unless intentionally identified, it is merely background. I do not notice it in the Gestalt.

And yet, it would seem that there are those who would have me think otherwise and devote undue attention to its original function. They would have me reconsider its historical significance.

Historical revisionism has never been a strong suit of mine –the present, with all of its problems, occupies most of my time. As an aging white male, I suppose I am history, or at least have lived through some of its more recent manifestations and survived. But of course I am one of the fortunate ones, and have been largely cosseted by my gender, ethnicity, and geography, so I appreciate the need to consider the lives and background of those not so blessed. And, had I been one of them, I’m certain I would also be less accepting of a majoritarian view of historical interpretation. History, after all, is written by the victors, not to mention the oppressors, and it frequently ignores the darker side -or at least reinterprets it to suit the prevailing ethos of the time.

And so the current movement to amend our view of omnia praeterita is understandable. It’s just that the solution often chosen –pulling down statues, or renaming public edifices- creates adversaries who otherwise wouldn’t exist, and problems that rise like bubbles in a boiling pot. Statues, of course, can be seen as emblems of past injustice, and the contributions they commemorate as either misleading or frankly misinformed. That none of us –not even the protesters- is blemish-free, is lost in the fervour to acknowledge historical repression or exploitation. I neither can, nor wish to, deny any calumny that may be hidden in stone or trapped in rusting metal, but I do hope that there is a middle ground. A workable compromise.

Despite the rush to take sides, I suspect it would be beneficial for all concerned to step back from the abyss of righteousness and look for solutions that neither polarize, nor punish. The past need not be prologue, but the question of historical truth, and unfair representation, is a vexing issue, and solutions are often fraught. In my search for background, however, I found an interesting and fairly balanced discussion of the problems: http://www.bbc.com/news/magazine-41904800 There are, perhaps, no definitive prescriptions, but at least it attempts to set the issues in context. A philosophical analysis, of a sort.

First, there is a succinct formulation of the problem: ‘All around the world, institutions are dealing with a conundrum. What to do about statues or buildings or scholarships or awards, honouring or funded by people we now regard as seriously morally flawed?’

And then there follows a discussion about possible solutions. ‘One approach is to do nothing. The do-nothing advocates say history shouldn’t be rewritten. To do so would be a form of censorship. And, they say, it’s ridiculous to expect every great historical figure to be blemish-free, to have lived a life of unadulterated purity.’ But, as is pointed out, with that approach, ‘Even those held up as saintly figures, such as Mahatma Gandhi or Nelson Mandela, had flaws (Gandhi’s attitude to women is excruciating, seen through 21st Century eyes)… And what message would it send to contemporary philanthropists? Give generously today, and risk having your reputation trashed tomorrow.’

‘But this “do-nothing” position seems too extreme. Imagine that Goebbels had endowed scholarships to Oxford, like Rhodes. Would anybody seriously claim the Goebbels Scholarships shouldn’t be renamed (would anybody want to be a Goebbels Scholar?) or that a Goebbels statue shouldn’t be demolished?’

But, of course, the vast majority of people are neither complete monsters nor complete angels. So, ‘What sort of considerations, then, should come into play? One may be whether the views or actions of the figure in question were typical for their time. If so, that could make them less blameworthy. Another is the extent of their misdeeds and how that is evaluated against their achievements. Churchill held opinions that would disbar him from political office today – despicable yes, but surely massively outweighed by the scale of his accomplishments.’

The article, written by David Edmonds of the BBC, then goes on to point something out that I hadn’t thought of before: ‘… there are what philosophers call consequentialist considerations. How does looking at the statue make passers-by feel? This, in turn, will be connected to whether the history still resonates – an ancient statue of some medieval warlord, however bloody and brutal his conquests, probably won’t bother anybody. And, arguably, a statue of Rhodes in Cape Town will arouse more offence than one of the same man in Oxford.’

In fact, ‘Intergenerational justice is a hugely complex topic, not least because over the passage of time it becomes tricky to identify beneficiaries and victims. Daniel Butt [a politics fellow at Balliol College, Oxford] believes that where there is a clear historical continuity with the past, a modern institution has a duty to remedy wrongs, most especially when the impact of these wrongs is still being felt – for example in racial discrimination. He says Oxford’s complicity in colonialism confers upon it obligations ….’ But that complicity needs recognition -it needs to see the light of day.

There are other solutions that try to mitigate any past harms that the statue or name might invoke. One such approach would be for the institution with the offending statue to offer scholarships, donations, or aid in areas of the world affected by the individual enstatued. Or, perhaps, as has been suggested in the American South, changing the information on the existing plaques to acknowledge the injustices meted out. Make the reader, of whatever persuasion, aware of them.

It seems to me that merely removing statues, or changing a commemorative name, is equivalent to burying the past. Out of sight, out of mind. If there truly has been an injustice, then surely recognizing it, making those hitherto unaware of it confront the issues, is more likely to prevent it from happening again. More likely to be a lesson, a cautionary tale that has to be heard, a violation that has to be seen. As Brutus says in Shakespeare’s Julius Caesar: It is the bright day that brings forth the adder and that craves wary walking.

Memory Vaults

After a certain age, many of us have concerns about our memories. Nothing much at first, of course -just things like forgetting why you went into the kitchen, or where you put your keys. Later, it can progress to having to write down a phone number immediately after you hear it, say, rather than trusting that you will be able to recall it correctly a minute or two later. Often, it’s easier to remember things if you use little tricks -mnemonic aids- although sometimes you forget to use them, too. But why? Are things just wearing out? Are some neurons in the brain short-circuiting, or actively being culled? And why such variation among people -and, apparently, among different populations?

As one might expect, there is intense research in this field, given the demographics of an aging population base. But have you ever wondered why you don’t have many -or any– early memories of when you were a young child -especially when you were very young: a baby, for example? Surely, the brain is a sponge at that stage, and the neurons and neural connections are propagating like mad to help you learn about the new world you are encountering for the first time? This is a puzzle that has always interested me, but perhaps even more so in my dotage, when I finally have the time to reflect more thoroughly on the mystery and its possible ramifications. An article in the BBC Future series written by Zaria Gorvett helped to shed some light on the mysterious gap: http://www.bbc.com/future/story/20160726-the-mystery-of-why-you-cant-remember-being-a-baby

I am reminded of a blurred black-and-white memory of my brother holding me in what seemed like a flower garden when I was a baby. I often refer to it as my first memory, but the faded and black-and-white characteristics suggest that it was more likely hewn from a photograph than any still inchoate proto-memory. But as the above-linked article suggests, ‘On average, patchy footage appears from about three-and-a-half.’ And that, of course, reveals another fascinating thing about what we think we remember: ‘Even if your memories are based on real events, they have probably been moulded and refashioned in hindsight – memories planted by conversations rather than first-person memories of the actual events.’ But I’m getting ahead of myself.

Why can’t we remember being a baby? As with many aspects of brain function, nobody seems to know for sure, but the article discusses several theories that might explain it. One such attempt suggests that ‘Our culture may … determine the way we talk about our memories, with some psychologists arguing that they only come once we have mastered the power of speech. “Language helps provide a structure, or organisation, for our memories, that is a narrative.  By creating a story, the experience becomes more organised, and therefore easier to remember over time,” …  Some psychologists are sceptical that this plays much of a role, however. There’s no difference between the age at which children who are born deaf and grow up without sign language report their earliest memories, for instance.’

Then there is the possibility that ‘we can’t remember our first years simply because our brains hadn’t developed the necessary equipment.’ The hippocampus, an area of the brain that is important for dealing with memories, is still developing new neurons for the first few years of life, and it is only when these additions begin to slow down that our first memories emerge. This adds another layer of mystery to the hippocampus, though: ‘is the under-formed hippocampus losing our long-term memories, or are they never formed in the first place? Since childhood events can continue to affect our behaviour long after we’ve forgotten them, some psychologists think they must be lingering somewhere.’

There is another hazy memory that also haunts me; I would be pretty sure that it’s real, except for my brother again. It was in the days before seat belts or infant car seats; my father was driving and I was sitting on my mother’s lap so I would be high enough to see out of the window. We were somewhere in Manitoba on the way to Winnipeg Beach for the first and only time -it’s on Lake Winnipeg, I think. Apparently we normally went to Grand Beach on the railway my father used to work for, which is how my brother remembered the year. I would have been about two and a half or three years old.

My brother, who must have been around twelve or thirteen at the time, had the back seat to himself, but I think he was mad that he couldn’t sit in the front, so he kept reaching over the seat and pulling my hair. My mother, never the patient one, finally had enough and suddenly threw out her arm to smack him. Unfortunately she hit my father and the car swerved off the highway before he could stop it. None of us were hurt, but it really scared me and I began to sob inconsolably. My parents tried everything to stop me, and finally, my brother -now chastened- told me to look out of the window because there was something out there that I’d never, ever, see again. I remember his emphasis on the ‘ever’, but I also remember not wanting to obey him.

At any rate, many years later, I asked him if he remembered that time we drove up to Winnipeg Beach with our parents.

We were sitting in a little coffee shop and he put down his cup and nodded, a faraway look in his eyes. “You were a little brat then, remember?”

Actually, I thought he’d been the brat, but I supposed he meant my crying. “But you were pulling my hair, remember?”

“And you kicked dad so hard, he lost control and the car swerved off the road…” he said before I interrupted.

“That was mom who knocked his arm accidentally because she was trying to get to you.” I was absolutely clear on that. “Maybe you couldn’t see her well enough from the back seat, though” I added, trying to be diplomatic.

I remember my brother staring at me at that point. “I was sitting in the front seat, too,” he chided. “The old cars had those big wide front seats, remember?” A wry smile appeared on his lips -he still knew how to stir me up. “And, you were having one of your usual temper tantrums and kicked out with your feet. One of them hit the hand dad was driving with and the car skidded off the road… I think it might have been gravel, or something.”

I shook my head vehemently at his pentimento, and he began to laugh like he used to do when he was taunting me. Apparently he, too, was invested in the memory. And yet, how can you verify your recollection of something that happened that long ago? I suppose I sulked for a moment or two, and then it came to me. “Ron, do you remember that I was crying even worse after the accident?” He nodded graciously, content to grant me a little face. “And do you remember that you tried to get me to stop, by telling me to look out of the window? You said if I didn’t look, I’d miss something I’d never, ever, see again?” I tried to emphasize the ‘ever’ like I remembered he had.

He thought about it for a minute. Clearly it hadn’t been a big thing for him at the time -more a trick to silence me. Then, his face brightened up. “Yes… Yes I think I remember…”

“And what was it?” I asked, leading him into my trap.

He smiled in the same smug way he always had when he was in possession of something I wanted but didn’t have. “There was a huge eagle sitting on a tree near the road. I don’t think I’d ever seen one before in the wild.”

I nodded my head pleasantly, as if he’d finally solved a mystery that had been nagging at me all those years. But I knew he was lying… No, that’s unfair! I knew he thought he was remembering what really happened back then, but he was wrong -I’d peeked. There was no eagle, and in fact we were almost at the beach and there weren’t even trees anywhere near the road. I’m sure I would have seen something that big. And yet… And yet, the eagle has been my favourite bird ever since I was a child…

To Be or Not to Be

We are all creatures of our cultures; we are all influenced, if not captured, by the ethos that affected our parents. And for most of us, it is where we feel the most comfortable. It does not require any clarification, or justification -it just is the way things are. The way things are supposed to be. Anything else is not simply an eccentricity, it is an aberration.

I think one of the aspects of Age that sometimes earned respect in times past was the ability of some elders to stand aside from the fray and place things in context -to examine long-held opinions and wonder about their value. Unfortunately, these voices of wisdom were often rare, and societal acceptance even rarer. At least until the time of social media, the Zeitgeist moved like a snail, and only things like war or disasters could reliably hurry it on its journey. We were limited by geography to parochial assumptions of normalcy. We had no reason to doubt that our values were appropriate and, in all likelihood, universal.

Gender was one of those self-evident truths about which there could surely be no dissenting views. We are what genitalia we possess, and our sex is assigned accordingly. Period. Indeed, for years I saw no reason to question this belief. I could understand same-sex relationships easily enough, but the need to interrogate the very idea of ‘sexual identity’ did not occur to me. As I said, I am a creature of my Time, my Culture.

There was a fascinating article in Aeon, an online publication, that helped me to understand how very blinkered my view had been. It was written by Sharyn Graham Davies, an associate professor in the School of Languages and Social Sciences at Auckland University of Technology:

https://aeon.co/essays/the-west-can-learn-from-southeast-asias-transgender-heritage?

‘[T]he very word ‘transgender’ came into common usage only in the past 20 years or so, and even the word ‘gender’ itself was popularised only in the 1970s. So we couldn’t even talk about transgender in a sophisticated way until practically yesterday.’ Indeed, some people seem to think that the whole idea of ‘transgendered’ people is not only strange, but also a novel, made-up aberration. ‘But there’s a problem if transgender is considered a particularly recent issue, or as a peculiarly Western phenomena. The perceived novelty of transgender people often leads to the accusation that they don’t have a right to exist, that they are just ‘making it up’, or even trying to cash in on some celebrity status, like that acquired by Caitlyn Jenner, the American TV personality who in 2015 transitioned from Bruce Jenner, the Olympian decathlete.’

Among other cultures, the author highlights the case of the bissu, ‘an order of spiritual leaders (often framed as a priest or shaman) who are commissioned to perform all sorts of tasks for the local community, such as helping those in power to make important decisions on topics such as marriage alliances, crop harvest dates and settlements of debt.’ They were likely first described in a letter written in 1544 to João de Albuquerque, the Portuguese bishop of the Indian state of Goa, by António de Paiva, a Portuguese merchant and missionary, when he was in Sulawesi in Indonesia.

And from the author of the article, we learn that ‘The [local] Bugis people thought that when a being became a woman or a man, that being could no longer communicate with the gods. Men and women were in some sense cut off from the gods that made them. But the gods had a means of communicating with humans: the bissu. Because the gods left the bissu undifferentiated – a combination of woman and man – they are accorded a position of influence. As the bissu bring together woman and man in one person, they can mediate between humans and gods through blessings.’

Interestingly, ‘Early indigenous manuscripts also talk of the bissu occupying a special social position because they combined female and male qualities. But the analytic tools available to these earlier commentators were slim – there was no word for anything like ‘gender’. Therefore, it is difficult to assess whether the bissu were considered a ‘third’ gender or as crossing from ‘one’ gender to the ‘other’ (transgender). However, what we can say is that there was a powerful sense of what today would be called ‘gender pluralism’.’

But this concept of pluralism was not confined to Indonesia by any means. For example, ‘According to Peter Jackson, a scholar at the Australian National University in Canberra, gender was not differentiated in Thailand until the 1800s. Before then, Thai princes and princesses dressed the same, with matching hairstyles. But in the 19th century, to impress the British, the Thai monarchy decided that women and men should be clearly differentiated with unique clothing and hairstyles… So in fact the Western gender system created ‘transgenderism’ – crossing from one gender to another made no sense in Thailand when there weren’t two strictly defined genders.’

So Graham Davies sums up her arguments about transgenderism by saying, ‘Human nature is diverse, and any attempt to split all 7 billion of us into one of just two categories based on mere genitals is both impossible and absurd. Transgender people have played crucial roles in societies throughout history.’

I find the article intriguing for several reasons, but I suppose the most compelling is that it calls into question what seems for most of us to be self-evident: that gender assignation should be commensurate with genital (or, nowadays, chromosomal) possession. Perhaps it is a good starting point in a culture that demarcates societal roles according to gender, but even a short step back would challenge the need for such rigid rules. Yes, maybe DNA does dictate that only the female is able to become pregnant and propagate the species, but why should it also demarcate other aspects of identity? Why should gender matter in a vocation, say, or even in a sport…? Surely it would be better to depend on quality of performance. And, for that matter, why should it be irrevocable once assigned?

I realize how naïve that sounds -self-identity seems to be such an important component of our ability to function in a multifaceted society. ‘While transgender often implies a crossing from one gender to the next, the use of third gender is a way to try to frame a separate and legitimate space for individuals who want to be considered outside this binary. The debate over terms and labels is fiercely contested, as any comments around the ever-increasing acronym LGBTQIA (for lesbian, gay, bisexual, transgender, queer, intersex and asexual) suggest.’

It seems to me that a helpful way to think about these roles is to understand that they do exist, and they have probably always existed. They are not new phenomena, nor are they bizarre. In fact, they are ‘abnormal’ only in that they usually do not represent the majority in most modern societies. But ‘abnormal’ does not mean aberrant -or necessarily perverse.

And yes, I also realize that the acceptance of cultural relativism swings on a wide pendulum over time, but I have to go back to something one of my favourite poets, Kahlil Gibran, wrote: Say not, “I have found the truth,” but rather, “I have found a truth.” Say not, “I have found the path of the soul.” Say rather, “I have met the soul walking upon my path.” For the soul walks upon all paths.

He’s mad that trusts in the tameness of a wolf

I am an obstetrician, not a neuropsychiatrist, but I feel a definite uneasiness with the idea of messing with brains –especially from the inside. Talking at it, sure –maybe even tweaking it with medications- but it seems to me there is something… sacrosanct about its boundaries. Something akin to black-boxhood -or pregnant-wombhood, if you will– where we have a knowledge of its inputs and outputs, but where the internal mechanisms remain too complex and interdependent to be interrogated other than from without.

I suppose I have a fear of the unintended consequences that seem to dog science like afternoon shadows -a glut of caution born of reading about well-meaning enthusiasms in my own field. And yet, although I do not even pretend to such arcane knowledge as might tempt me to meddle with the innards of a clock, let alone the complexities of a head, I do watch from afar, albeit through a glass darkly. And I am troubled.

My concern bubbled to the surface with a November 2017 article from Nature that I stumbled upon: https://www.nature.com/news/ai-controlled-brain-implants-for-mood-disorders-tested-in-people-1.23031 I recognize that the report is dated, and merely scratches the surface, but it hinted at things to come. The involvement of DARPA (the Defense Advanced Research Projects Agency of the U.S. military) did little to calm my fears, either –they had apparently ‘begun preliminary trials of ‘closed-loop’ brain implants that use algorithms to detect patterns associated with mood disorders. These devices can shock the brain back to a healthy state without input from a physician.’

‘The general approach —using a brain implant to deliver electric pulses that alter neural activity— is known as deep-brain stimulation. It is used to treat movement disorders such as Parkinson’s disease, but has been less successful when tested against mood disorders… The scientists behind the DARPA-funded projects say that their work might succeed where earlier attempts failed, because they have designed their brain implants specifically to treat mental illness — and to switch on only when needed.’

And how could the device know when to switch on and off? How could it even recognize the complex neural activity in mental illnesses? Well, apparently, ‘electrical engineer Omid Sani of the University of Southern California in Los Angeles — who is working with Chang’s team [a neuroscientist at UCSF] — showed the first map of how mood is encoded in the brain over time. He and his colleagues worked with six people with epilepsy who had implanted electrodes, tracking their brain activity and moods in detail over the course of one to three weeks. By comparing the two types of information, the researchers could create an algorithm to ‘decode’ that person’s changing moods from their brain activity. Some broad patterns emerged, particularly in brain areas that have previously been associated with mood.’
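To make the ‘closed-loop’ idea a little more concrete: at its core it is a detect-then-act cycle -a decoder is trained on paired recordings of brain activity and reported mood, and stimulation is triggered only when the decoder flags a problem state. Here is a minimal, purely illustrative sketch in Python; the features, labels, classifier, and threshold are all hypothetical stand-ins of my own, not anything taken from the research described:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical stand-in data: windows of neural features (e.g. band power
# per electrode) paired with self-reported mood labels collected over
# weeks of monitoring (0 = typical, 1 = dysphoric).
rng = np.random.default_rng(seed=42)
X_train = rng.normal(size=(200, 16))    # 200 windows x 16 features
y_train = rng.integers(0, 2, size=200)  # placeholder mood reports

# 'Decode' mood from brain activity: fit a simple classifier on the pairs.
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def closed_loop_step(window: np.ndarray, threshold: float = 0.8) -> str:
    """One tick of the loop: stimulate only when the decoder is confident
    the current window of activity encodes a dysphoric state."""
    p_dysphoric = decoder.predict_proba(window.reshape(1, -1))[0, 1]
    return "stimulate" if p_dysphoric >= threshold else "idle"

# Example: evaluate one incoming window of activity.
print(closed_loop_step(rng.normal(size=16)))
```

Even in this toy form, the worry is visible: everything hangs on whether those ‘broad patterns’ in the features genuinely capture a mood, and on where one dares to set the stimulation threshold.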

Perhaps this might be the time to wonder if ‘broad patterns’ can adequately capture the complexities of any mood, let alone a dysphoric one. Another group, this time in Boston, is taking a slightly different approach: ‘Rather than detecting a particular mood or mental illness, they want to map the brain activity associated with behaviours that are present in multiple disorders — such as difficulties with concentration and empathy.’ If anything, that sounds even broader -even less likely to hit the neural bullseye. But, I know, I know –it’s early yet. The work is just beginning… And yet, if there ever was a methodology more susceptible to causing collateral damage and unintended, unforeseeable consequences, or one more likely to fall afoul of a hospital’s ethics committee, I can’t think of it.

For example, ‘One challenge with stimulating areas of the brain associated with mood … is the possibility of overcorrecting emotions to create extreme happiness that overwhelms all other feelings. Other ethical considerations arise from the fact that the algorithms used in closed-loop stimulation can tell the researchers about the person’s mood, beyond what may be visible from behaviour or facial expressions. While researchers won’t be able to read people’s minds, “we will have access to activity that encodes their feelings,” says Alik Widge, a neuroengineer and psychiatrist at Harvard University in Cambridge, Massachusetts, and engineering director of the MGH [Massachusetts General Hospital] team.’ Great! I assume they’ve read Orwell for some tips.

It’s one of the great conundrums of Science, though, isn’t it? When one stretches societal orthodoxy, and approaches the edge of the reigning ethical paradigm, how should one proceed? I don’t believe merely assuming that someone else, somewhere else, and sometime else will undoubtedly forge ahead with the same knowledge, is a sufficient reason to proceed. It seems to me that in the current climate of public scientific skepticism, it would be best to tread carefully. Science succeeds best when it is funded, fêted, and understood, not obscured by clouds of suspicion or plagued by doubt -not to mention mistrust. Just look at how genetically modified foods are regarded in many countries. Or vaccinations. Or climate change…

Of course, the rewards of successful and innovative procedures are great, but so is the damage if they fail. A promise broken is more noteworthy, more disconcerting, than a promise never made.

Time for a thought experiment. Suppose I’ve advertised myself as an expert in computer hardware, and you come to me with a particularly vexing problem that nobody else seemed able to fix. You tell me there is a semi-autobiographical novel about your life, one you’d been writing in your spare time for years, stored somewhere inside your laptop that you can no longer access. Nothing was backed up elsewhere –you never thought it would be necessary- and now, of course, it’s too late for that. The computer won’t even work, and you’re desperate.

I have a cursory look at the model and the year, and assure you that I know enough about the mechanisms in the computer to get it working again.

So you come back in a couple of weeks to pick it up. “Were you able to fix it?” is the first thing you say when you come in the door.

I smile and nod my head slowly. Sagely. “It was tougher than I thought,” I say. “But I was finally able to get it running again.”

“Yes, but does it work? What about the contents? What about my novel…?”

I try to keep my expression neutral as befits an expert talking to someone who knows nothing about how complex the circuitry in a computer can be. “Well,” I explain, “It was really damaged, you know. I don’t know what you did to it… but a lot of it was beyond repair.”

“But…”

“But I managed to salvage quite a bit of the function. The word processor works now –you can continue writing your novel.”

You look at me with a puzzled expression. “I thought you said you could fix it -the area where my novel is…”

I smile and hand you back the computer. “I did fix it. You can write again -just like before.”

“All that information… all those stories… They’re gone?”

I nod pleasantly, the smile on my face broadening. “But without my work you wouldn’t have had them either, remember. I’ve given you the opportunity to write some more.”

“But… But it was stored in there,” you say, pointing at the laptop in front of you on the counter. “How do I know who I am now?”

“You’re the person who has been given the chance to start again.”

Sometimes that’s enough, I suppose…

A Pound of Flesh?

I’m retired now, and my kids have long since passed the age when, even if I were so disposed, I would dare lay a hand on either them or their children. But of course I wouldn’t -parenting wasn’t like that in my family.

I suspect I rarely hung out in the Goldilocks zone in childhood. I was prey to all of the usual temptations on offer in 1950s Winnipeg, but it’s unclear to me just what I would have had to do to merit corporal punishment. I realize that sounds naïve, even all these years later, but my father was not quick with the hand. In fact, on the one occasion he resorted to it, he seemed more upset by it than me, his recalcitrant offspring. And anyway, I think it was my mother’s idea that he wreak some stronger retribution than she could inflict on me with her voice.

My mother was into noise, actually. I imagine I was a frustrating child for her and she would resort to yelling fits when things didn’t go well. Clearly I have a limited, and no doubt statistically insignificant data set when it comes to the effects of corporal punishment, but I would venture to say that I feared my mother’s mouth far more than my father’s hand. My mother’s facial expression bespoke rage, my father’s, though, suggested sorrow -betrayal…

But I do not mean to disparage either of them, nor to suggest that they meted out cruel and unusual punishments under duress -I’m sure they were well-intentioned. And anyway, anecdotal evidence is a poor substitute for well-designed research, so I was pleased to see a more recent attempt to summarize what has been learned about the effects of, in this case, corporally disciplining children: https://theconversation.com/why-parents-should-never-spank-children-85962 The article was co-written by Tracie O. Afifi, Associate Professor, University of Manitoba, and Elisa Romano, Full Professor of Clinical Psychology, University of Ottawa.

‘The use of spanking has been hotly debated over the last several decades. Supporters state that it is safe, necessary and effective; opponents argue that spanking is harmful to children and violates their human rights to protection.’ Yet despite how common and widespread its use, spanking has been banned in 53 countries and states throughout the world. ‘The research clearly shows that spanking is related to an increased likelihood of many poor health, social and developmental outcomes. These poor outcomes include mental health problems, substance use, suicide attempts and physical health conditions along with developmental, behavioural, social and cognitive problems. Equally important, there are no research studies showing that spanking is beneficial for children.’ And, indeed, ‘An updated meta-analysis was most recently published in 2016. This reviewed and analyzed 75 studies from the previous 13 years, concluding that there was no evidence that spanking improved child behaviour and that spanking was associated with an increased risk of 13 detrimental outcomes. These include aggression, antisocial behaviour, mental health problems and negative relationships with parents.’ I suspect there were other things going on, in both intent and degree, that might have confounded these studies and led to the negative outcomes, though -apples are simply not oranges, and beating or assaulting someone is not the same as striking a buttock with an open hand as a way to deter an unwanted behaviour.

Of course, the researchers hasten to add that ‘this does not make parents who have used spanking bad parents. In the past, we simply did not know the risks.’ I think that lets my father off the hook; I’m not so sure about my mother, though. It seems to me that it is all too easy to condemn corporal punishment while ignoring –or, perhaps, paying less attention to- the other forms of discipline that, intuitively at least, might be expected to result in equally detrimental consequences for a developing child. One of these, of course, is verbal haranguing.

I don’t believe that I was ever subject to verbal abuse, however. I was never demeaned or insulted by my mother –just confronted with my miscreant behaviour, and anointed with the requisite guilt- but I can understand how it could get out of hand under different circumstances and with different personalities. I find that worrisome –alarming, in fact. It is a behaviour that could all too easily slip under the radar. Be explained away.

I recognize that parenting is stressful, and that we all come to it with different temperaments, different abilities to tolerate stress, and different support structures that could be called upon in times of intolerable tension, but I suppose that is just the point. I wrote about this a while ago: https://musingsonwomenshealth.com/2017/05/17/time-out-eh/

But I fear that it sometimes requires the patience of Job to stand down enough to be able to socially isolate the misbehaving child with a time-out. It is clearly preferable to spanking, to be sure, but I still wonder if what precedes it may be just the verbal abuse it seeks to avoid.

So, given our human propensity to react unpredictably and often adversely to stress, what am I advocating? Well, I have to admit that I have neither the background, nor the temerity to suggest that I have any productive answers. But although the Conversation article I quoted above was focused on spanking –physical punishment- it contains some suggestions that I think would be applicable to other punitive modalities like verbal abuse and insults.

‘Research already shows some evidence that parenting programs specifically aimed at preventing physical punishment can be successful. Some evidence for reducing harsh parenting and physical punishment has been found for Parent-Child Interaction Therapy (PCIT), the Incredible Years (IY) program and the Nurse Family Partnership (NFP). Other promising home visiting initiatives and interventions taking place in community and paediatric settings are also being examined for proven effectiveness.’

I know –education, education, education… But sometimes education is merely making people aware that alternatives exist; that there could be support out there of which they were unaware -both among friends and in the community. Remember that African proverb: It takes a village to raise a child.