Bad Samaritans?

I suspect this is an incredibly naïve, not to mention unpopular, opinion, but I suppose in these times of plague, I should be grateful we have borders -fences that keep them out, walls that keep us safe. But I’m not. I’ve always mistrusted borders: I’ve always been suspicious of boundaries that artificially set apart the denizens of one region -that privilege residents over non-residents, friends over strangers, our needs over theirs.

Call me unworldly, but what makes me special, and you not so? It seems to me the italics I have used to mark differences are as arbitrary as the differences they mark. We are all the same, and deserve the same consideration.

That said, we seem to be stuck with countries determined only to look after their own -even amid the global crisis in which we find ourselves in these special, but frightening times. In a desperate act of historical recidivism, we are attempting a re-balkanization of the world.

But what is a country, anyway? And does it have a special providence -or provenance, for that matter? I happened upon an interesting essay by Charles Crawford, who once served as the UK Ambassador to Sarajevo and Belgrade, discussing much the same thing: https://aeon.co/essays/who-gets-to-say-what-counts-as-a-country

As he writes -‘There are only two questions in politics: who decides? and who decides who decides? … Who gets to say what is or is not a country? For most of human history, nation states as we now recognise them did not exist. Territories were controlled by powerful local people, who in turn pledged allegiance to distant authorities, favouring whichever one their circumstances suited. In Europe, the tensions in this system eventually led to the Thirty Years’ War which… ended in 1648 with a thorough revision of the relationship between land, people and power. The resulting set of treaties, known as the Peace of Westphalia, introduced two novel ideas: sovereignty and territorial integrity. Kings and queens had ‘their’ people and associated territory; beyond their own borders, they should not meddle.’

Voilà, the modern idea of states, with loyalties only to themselves. But embedded in the concept were at least two principles -two problems: ‘The first is self-determination: the idea that an identified ‘people’ has the right to run its own affairs within its own state. The other is territorial integrity: the notion that the borders of an existing state should be difficult to change.’ But borders soon spawned customs and attitudes that were different from those on the other side –theirs were different from ours, so they must be different from us. An oversimplification, to be sure, but nonetheless a helpful guide, perhaps.

Borders can change, of course, but not easily, and often not without considerable turmoil. Think of ‘the separation of Bangladesh from Pakistan in 1971 [which] claimed up to a million lives… Ambiguous ceasefires can drag on indefinitely. Taiwan and its 23 million inhabitants live in a curious twilight zone of international law, recognised by only 22 smaller countries and the Vatican.’ Examples of each abound.

And not all borders were established to reconcile linguistic, ethnic, or religious differences. There are many examples, but perhaps the most egregious borders in modern times were those largely arbitrary ones in the Middle East drawn by two aristocrats, Mark Sykes from Britain and François Georges-Picot from France, in 1916. As Wikipedia describes it: ‘it was a secret agreement between Britain and France with assent from the Russian Empire and Italy, to define their mutually agreed spheres of influence and control in an eventual partition of the Ottoman Empire.’

A famous quotation that encapsulates the attitude was that of Sykes: ‘At a meeting in Downing Street, Mark Sykes pointed to a map and told the prime minister: “I should like to draw a line from the “e” in Acre to the last “k” in Kirkuk.”’-a straight line, more or less.

Crawford’s essay was intended to explain the continuing tensions in the Balkans, but it raises a pertinent question for these times -namely, ‘Should nations stay within their historical boundaries, or change as their populations do?’ Or, put another way, should boundaries remain impermeable to needs outside what I would term their arbitrary limits?

With the current pandemic, there are, no doubt, many reasons that could be offered for being selective at borders: family-first ones, by and large. We need to close our borders to support our own economy, feed our own people; in the midst of a global epidemic, it is not the time to sacrifice our own needs by offering altruism to others. Actually, it seems to me that the underlying belief is that migration -legal or otherwise- is a large contributor to the spread of the infection. But once a communicable virus is in the country, its own citizens also become vectors -and they far outnumber refugees or migrants.

Rather than being focussed on borders and exclusion, efforts would likely be more intelligently spent on things like temporary isolation of any who may have been in areas where the epidemic may have been less controlled, and enforced social separation (social-distancing) of everybody else. Consistent, frequently publicized advice and updates about new developments to educate the public -all the public- are key to managing fear. And epidemics -they have a habit of evolving rapidly.

And testing, testing, testing. Unless and until we know who might have the infection and be a risk to others, we are essentially blinkered. It’s not the strangers among us who pose the risk, it’s those who are infected and either have no symptoms or who are at the earliest stages of an infection that has not yet had time to declare itself.

The World Health Organization (and others) have pointed out that travel restrictions not only divert resources from the containment effort, they also have human costs. ‘Travel measures that significantly interfere with international traffic may only be justified at the beginning of an outbreak, as they may allow countries to gain time, even if only a few days, to rapidly implement effective preparedness measures. Such restrictions must be based on a careful risk assessment, be proportionate to the public health risk, be short in duration, and be reconsidered regularly as the situation evolves. Travel bans to affected areas or denial of entry to passengers coming from affected areas are usually not effective in preventing the importation of cases but may have a significant economic and social impact.’ And, as all of us realize -and expect- by now: ‘Travellers returning from affected areas should self-monitor for symptoms for 14 days and follow national protocols of receiving countries.’ Amen.

Turning away migrants often has some desired political effects, however: diverting attention away from the receiving country’s possible lack of preparedness and foresight. It’s less about the Science and more about Nationalism -further stoking fears of the other.

I think that at the moment we are forgetting what was immortalized in that ancient Persian adage: This, too, will pass. The pandemic will exhaust itself, and likely soon become amenable to a vaccine and other medical therapy. And those affected will not soon forget -nor will those denied entry in their time of need. As our economies rebuild in its wake, we -and they- will need all the allies we can muster. Better to be remembered as a friend who helped than as someone who turned their back.

We really are all in this together. As one of my favourite poets, Kahlil Gibran, writes: ‘You often say, “I would give, but only to the deserving.” The trees in your orchard say not so… They give that they may live, for to withhold is to perish.’

Mind Trips

Does your mind ever behave as if you weren’t getting enough fibre in your diet? Does it ever seem to plug up with loge -or whatever the noun form of logy is? Mine does that whenever it doesn’t get sufficient exercise, I find -not enough thinking perhaps. On the other hand, even when I think of something to think, keeping it on track is more like trying to keep a cat on a trail: every time something passes by or rustles in the bushes, it’s off. I love the adventure, mind you, and yet I can’t help but wonder if it’s supposed to wander like that. I mean, is it a design flaw, or a sign of trouble in the pipes somewhere -a detour around a badly maintained section of road? Frankly I’m tired of trying to balance loge with rogue.

Of course, although it’s sometimes a problem in the dead of night when my eyes are unable to distract it, by and large it doesn’t care what’s going on around it, or in what direction it was originally pointed. My mind has a mind of its own.

But maybe all minds are like that; maybe we all have a naughty homunculus (or perhaps, a gynuncula) that sits at the steering wheel somewhere inside and veers from road to road with merry abandon. An insightful essay by Jamie Kreiner, from the University of Georgia, pointed out that the problem was also common amongst medieval monks -who would have guessed? https://aeon.co/ideas/how-to-reduce-digital-distractions-advice-from-medieval-monks

‘They complained about being overloaded with information, and about how, even once you finally settled on something to read, it was easy to get bored and turn to something else… Their job, more than anything else, was to focus on divine communication: to read, to pray and sing, and to work to understand God, in order to improve the health of their souls and the souls of the people who supported them… The ideal was a mens intentus, a mind that was always and actively reaching out to its target.’

Of course, even with the best intent, their minds strayed; something had to be done about it. ‘When the mind wanders, the monastic theorists observed, it usually veers off into recent events. Cut back your commitments to serious stuff, and you’ll have fewer thoughts competing for your attention… Most Christians agreed that the body was a needy creature whose bottomless appetite for food, sex and comfort held back the mind from what mattered most.’ You can see where they’re going with this.

But there is a limit to the extent to which you can deprive a body, and so, ever mindful of the goal, they decided to turn the problems into solutions -well, sort of… ‘Part of monastic education involved learning how to form cartoonish cognitive figures, to help sharpen one’s mnemonic and meditative skills. The mind loves stimuli such as colour, gore, sex, violence, noise and wild gesticulations.’ So, ‘if a nun wanted to really learn something she’d read or heard, she would do this work herself, by rendering the material as a series of bizarre animations in her mind. The weirder the mnemonic devices the better – strangeness would make them easier to retrieve.’

The act of producing these memory aides was supposed to enable concentration and avoid distracting thoughts. Of course, as Kreiner points out, ‘caveat cogitator: the problem of concentration is recursive. Any strategy for sidestepping distraction calls for strategies on sidestepping distraction.’ I used to hate cleaning my teeth when I was a child, so my mother, no doubt fearful of the dental bills she would have to pay, asked me to try not to think of a puppy while I was brushing them. The idea was so bizarre, I’d end up inadvertently finishing the trip around my mouth while trying desperately to forget about the puppy… Uhmm, well maybe the medieval monks were not supposed to use distractions to fight distractions, but then again, my mother wasn’t running a religious institution, just a bathroom.

Anyway, as Kreiner also describes, ‘A more advanced method for concentrating was to build elaborate mental structures in the course of reading and thinking. Nuns, monks, preachers and the people they educated were always encouraged to visualise the material they were processing. A branchy tree or a finely feathered angel…  The point wasn’t to paint these pictures on parchment. It was to give the mind something to draw, to indulge its appetite for aesthetically interesting forms while sorting its ideas into some logical structure.’

Kreiner even teaches these medieval cognitive techniques to her students. She thinks that ‘Constructing complex mental apparatuses gives them a way to organise – and, in the process, analyse – material they need to learn for other classes. The process also keeps their minds occupied with something that feels palpable and riveting. Concentration and critical thinking, in this mode, feel less like a slog and more like a game.’

So, I thought I’d give it a try. I’m currently trying to learn coding -computer programming- and my mind, not understanding a thing about what I’m reading, tends to wander off. The whole thing seems so alien to an older person like me who was brought up on the words and metaphor of poetry that I despaired of ever being able to create stick-figures in it, let alone finely feathered angels -even from the Coding for Kids book I eventually digressed into.

There is, however, an ever-repeating figure that sticks with me from the book. The language I’m learning is called Python and, helpfully, the figure that keeps greeting me is a smiley snake wearing a baseball cap. Just thinking about the smile and the cap helps me to relax -makes me smile- even though if I saw it slithering across the table, my concentration would likely refocus on my legs, not its cap. But safely confined to the book, I find it centers me. It holds my attention like that puppy I had to try to forget. And when it’s hiding somewhere, I can hardly wait for it to reappear, because you never know with snakes, eh?

As for helping me focus on the code, well I can remember it’s called Python, at least… I mean, that’s something, right?
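For anyone wondering what those first lessons look like, they are gentler than you might fear. Something like this little sketch -my own invention, not anything copied from the book- is typical of the opening exercises:

```python
# A beginner's first Python exercise, the kind such books open with:
# store a name, build a greeting from it, and print the result.
name = "Python"
greeting = "Hello, " + name + "!"
print(greeting)  # prints: Hello, Python!
```

Even a mind as prone to wandering as mine can hold onto three lines at a time.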

When beggars die there are no comets seen

When I was growing up, Death was a word I rarely had to use. I suppose that’s the thing about nuclear families: they sometimes privilege the unit at the expense of the wider kin. Occasionally, a distant relative I had never met succumbed, or there would be a report on the news of casualties somewhere from disaster or conflict, but mostly they happened there, and I was here -two very different places. I had no problem coping with it; apart from sadness, there seemed little else that would result –could result- from no longer being able to see the person in the flesh. Of course I felt devastated with the death of a pet, but in time, even that faded as other life unrolled around me. Sadness was an ache that healed.

I don’t mean to sound unfeeling, or indifferent to the suffering of others, it’s just that human Death wasn’t something I had to deal with in my youth. It was only later, with the death of my parents, that the full haze of sorrow descended like a cloud and persisted long after the ceremonies for each. I try to remember them as they were, and still trace events throughout their lives until they were no more. Neither funeral was the end of my memories about either of them, of course -each rite was just another marker in a book I could open any time. The ceremony was important, though: we all need an epilogue, a final demonstration that we mattered -a recognition that, in all of time, we did exist for a while.

Perhaps it’s a gift of Age, but what I’m only now coming to realize, is how devastating a person just disappearing without a trace can be. A thoughtful essay by Andy Owen brought thoughts about Death to my attention once again: https://aeon.co/essays/a-missing-person-is-like-a-story-without-an-ending

‘People aren’t meant to just disappear,’ he writes. ‘Disappearance can expose an existential fear in those left behind – what could make you question your self-esteem more than the knowledge that you could just go missing without a trace? This is why some states and armed groups have deliberately ‘disappeared’ those seen as their greatest threat.’

‘There are some common themes across time and cultures that make what Boss [U.S. psychologist Pauline Boss, who started working with the wives of missing US airmen in the 1970s] calls ambiguous loss – the loss of the missing, a loss that has never been fully confirmed – so difficult. One is the inability to perform the appropriate rituals or rites that help us manage loss.’ Another, of course, is not knowing whether or not the missing person is dead. Whether or not the relatives should mourn, or keep hoping…

‘In The Sense of an Ending (1967), the British literary critic Frank Kermode investigates our need to make sense of our lifespan with fictional stories that have an origin, a middle and an end. ‘What puts our mind at rest,’ he writes, ‘is the simple sequence.’ We are, all of us, stories with a universal grammar; but with the missing, that grammar cannot be followed -cannot be completed. The ending is important in a story: it makes sense of the narrative. And if the ending is not there, we are sometimes forced to make one up.

Owen tells of the aftermath of the 2011 tsunami in Japan in which almost 20,000 people died or went missing. ‘Survivors of the disaster soon began seeing and feeling ghostly presences; people dressed in warm clothes at the height of summer, hailing taxis and then disappearing from the back seat.’ Stories need endings.

When I was still a child in a Winnipeg public school I had a friend named Russel Vásquez. He and his mother had moved from somewhere in Mexico, I remember.

Russel, like me, was shy and although he didn’t talk very much at school, we lived across the lane from each other and I would see him sitting by himself in his backyard pretending to read. I could tell he was pretending because he never turned any pages. It was as if a book gave him an excuse to sit by himself.

One day just after school had finished for the summer, I walked across the lane and called out his name. He smiled and put down the book and stood up to greet me. I don’t think he’d made many friends at school, but he seemed happy to talk to me. His English was heavily accented, and although at first he looked embarrassed, we soon realized we had a lot in common and became the kind of friends only children of that age can be. Before long we began trading information about our parents.

His mother was a teacher, like mine, but although I mentioned that my father was an accountant, Russel didn’t offer to tell me about his. Only when summer fully matured, and we had become best friends, did he admit that he didn’t know what had happened to his father. Both his parents had been involved in something that involved a lot of phone calls, and they got mad at him if he asked about it. One night after he’d gone to bed, Russel heard someone banging on the door to their house, and then what sounded like a fight. His mother had come running into his bedroom a few minutes later and told him somebody had taken his father so they had to leave right away.

“Didn’t your mother phone the police?” I asked.

He just shrugged. “Many gangs in my country, and I think some of police are work for them.”

“But did your father join you later?” I was aghast at what I’d heard -it was nothing like Winnipeg.

Russel shook his head slowly and I could tell he was on the verge of tears. “My mother no say it, but I think he probably dead… We no heard from him since he left,” he added in a soft voice.

“But…”

Russel stared at me for a moment. “We no talk about it now… Okay?”

I realized then that there are some things that even best friends can’t discuss.

But one day at the end of August, I remember Russel running excitedly across the lane when he saw me coming. “I see him today,” he yelled.

“See who?”

“My father!”

“Where?” It sounded too good to be true.

“With some people in the Eaton store,” he said, lowering his voice and looking around furtively. “But when I call him, maybe he no hear, and he disappear in the people again. I try to find him, but…”

“So what did your mother say?”

His face wrinkled. “She outside on a bench waiting for me, and when I tell her, she don’t believe me at first. And she look more worry than happy, you know. She tell me I can no have see him…”

Just then I heard his mother calling him from their back door and he shrugged apologetically. “She think I see things,” he explained. “That I see…” he searched for an English word, then gave up and shrugged again. “…fantasmas…” We both knew what mothers were like, so I just rolled my eyes.

I was too young at the time to understand, but Russel and his mother suddenly moved out of the house across the lane before school started that fall, and before he could even say goodbye.

I never heard from him again. Stories need endings…
A rarer spirit never did steer humanity

Okay, here’s a seemingly obvious and probably self-evident question: What constitutes personhood? I mean I assume that, until recently, it was something only bestowed on us -humans, that is- but what, exactly, is a person? And does the reason we were its exclusive possessors have anything to do with the fact that we are the bestowers? In United States law at any rate, a corporation -in that it has certain privileges, legal responsibilities, and is able to enter into contracts- may be considered a legal person. But even so, it is we who have granted it that status. We, alone, seem to be the arbiters of who gets into our club.

That we are both enamoured of our rank, and also the adjudicators of the contestants is a fine point, perhaps. And yet, there you have it: it’s our ball, so we get to decide who plays. We have decided it has to be a thing that can interact (with us), that has a sense of identity (as a self or as an entity), and that, presumably, can assume and accept responsibility for its actions.

Fair enough, I suppose, although I continue to wonder if those criteria are not a little too restrictive, their legal usefulness notwithstanding. I remain suspicious that things like corporations and their vested interests get the nod, whereas trees, or dogs, say, do not. I think it’s reasonable that some entities that seem to have some personal interest to me, and with which I interact, however indirectly, should qualify as something close to personhood at times: a tree that I pass each day and whose leaves I enjoy seeing dance in the wind, perhaps, or the peak of a mountain that I use to reference my location.

Okay, I realize those examples might be over-stretching the idea of personhood and diluting the whole purpose of the concept, but what if I have named each of them -given them an identity that draws them out of the background, and allows them to interact with me by fulfilling some need, however mundane or whimsical? And no, I don’t imagine the mountain peak whose position is guiding me out of the woods has any consciousness of itself or its purpose any more than an inuksuk in the barrens of northern Canada; it remains what it is: many things -or nothing- to whoever sees it. But, a potentially useful entity nonetheless. And for that matter, so is a corporation with which I have no dealings in another country, I suppose…

They are, each of them, metaphors in a way: things regarded as representatives or symbols of other things. Beneficial items whenever we might need them. And yet, are they persons?

The etymology of ‘person’, although complicated and disputed, is revealing, I think: the Online Etymology Dictionary describes person as ‘a mask, a false face, such as those of wood or clay worn by the actors in later Roman theater. OED offers the general explanation of persona as “related to” Latin personare “to sound through” (i.e. the mask as something spoken through and perhaps amplifying the voice).’ Non-living entities, in other words, that in some situations pretend to be us.

I don’t mean to go overboard in my assignations of personhood, though -I suppose I only wish to defend my penchant for seeing agency in Nature. I recognize that I am inextricably entangled in its web and point out that it is me as much as I am it… So it was with some considerable relief that I discovered that I may not be sufficiently unique to necessitate a mention in the psychiatric DSM-5 bible. Thank you, Aeon. https://aeon.co/ideas/a-rock-a-human-a-tree-all-were-persons-to-the-classic-maya

In an article for the online magazine, Sarah Jackson, an associate professor of anthropology at the University of Cincinnati in Ohio, wrote that ‘For the Maya of the Classic period, who lived in southern Mexico and Central America between 250 and 900 CE, the category of ‘persons’ was not coincident with human beings, as it is for us. That is, human beings were persons – but other, nonhuman entities could be persons, too… the ancient Maya experienced a world peopled by a variety of types of beings, who figured large in stories, imagery, social and ritual obligations, and community identities.’

She asks the intriguing question, ‘Do nonhuman persons need human beings to exist?’ For the Maya, ‘the answer was no. Nonhuman persons were not tethered to specific humans, and they did not derive their personhood from a connection with a human… In a Maya way of thinking, personhood is a resource in the world… The Maya saw personhood as ‘activated’ by experiencing certain bodily needs and through participation in certain social activities.’

But Jackson is careful to point out that for the Mayans it was not a magical world in which all of the things surrounding them were talking, or dispensing advice. ‘Rather, the experience would have been one of potentiality’ -rather like my mountain peak, I imagine. ‘they were prepared to recognise signs of personhood in a wide variety of places, and to respond appropriately when nonhuman entities signalled as such to them.’ Interestingly, ‘There’s one other element to consider, in blurring the boundaries of personhood. Personhood was a nonbinary proposition for the Maya. Entities were able to be persons while also being something else… they continue to be functional, doing what objects do (a stone implement continues to chop, an incense burner continues to do its smoky work). Furthermore, the Maya visually depicted many objects in ways that indicated the material category to which they belonged – drawings of the stone implement show that a person-tool is still made of stone.’

Jackson suggests that this idea is certainly of interest nowadays. ‘Challenging ourselves to illuminate assumptions about personhood (and its associated responsibilities and mutual obligations) sheds light on our own roles in constructing and deconstructing people, and the social and political consequences. Environment, race, immigration, civil discourse, gender identity, #MeToo: all of these topics link in some way to whom, or what, we value in comparison with our own experience of being a ‘person’, and our norms of what shared person-status means for action and interaction.’

Boundaries are porous -I like that; things are multifaceted, not forever confined to one identity -nothing need be either this, or that. It can shift, according to context, and perspective. According to need. My favourite mountain peak is a sleeping bear, by the way. I see it whenever I’m on the ferry and travelling from the island where I live to Vancouver. I miss it when I’m away…

Why then, can one desire too much of a good thing?

What have we done? Have we become so transfixed with definitions –differences– we have forgotten where we started? Where we want to go? Has the clarity for which we strived, opacified as it cooled? Sometimes the more encompassing the definition, the less useful it becomes.

I suppose that coming from the putative dark side -that is to say, the male portion of the equation- I am credentialed merely with Age and a world view encrusted with a particular zeitgeist; I come from an era of binaries, albeit with categories that allow for shades -rainbows which do not seek to define the boundaries where one colour fades into the next. They allow a melange without, I hope, troubling themselves with the constituents. Or am I being hopelessly naïve?

The more I am engaged with the issues of gendered literature, though, the more I suspect I have been misled all these past years. I have, of course, been aware of the lengthening gender acronym -LGBTQIA…- that threatens, like the old lady who lived in the shoe in that Mother Goose rhyme, to outgrow its useful home. In its quest to include and define each shade of difference -as laudable as that may seem on first glance- it threatens to fragment like shattered glass: useful to nobody as a container. I am, rather oddly, reminded of the advice of the Indian philosopher, Jiddu Krishnamurti, who felt that we should not attempt to categorize, or name something too finely -God, in his example: the name confines and overly limits the concept being promulgated.

The dangers of over-inclusion surfaced when I attempted to read an essay by Georgia Warnke, a professor of political science at the University of California, Riverside, published in Aeon: https://aeon.co/essays/do-analytic-and-continental-philosophy-agree-what-woman-is

‘The famed online Stanford Encyclopedia of Philosophy offers separate articles on analytic and continental feminism (although with a separate article on intersections between the two). The article on analytic feminism notes its commitment to careful argumentation and to ‘the literal, precise, and clear use of language’, while that on continental feminism notes its interest in unveiling precisely those ‘non-discursive deep-seated biases and blind spots … not easily detected by an exclusive focus on the examination of arguments’. A few minutes of reflection suggested that neither my vocabulary nor my intellect may be up to the task, but I ploughed on, nonetheless -still curious about the subject.

‘The article on analytic feminism emphasises the importance of the philosophy of language, epistemology and logic; that on continental feminism the importance of postmodernism, psychoanalysis and phenomenology.’ Whoa. What was I asking my obviously non-postmodern brain to assimilate? It was only when I stumbled upon ‘we can begin with a core feminist question: namely, who or what are women? Who are the subjects to whose freedom and equality feminist philosophers are committed?’ that I sensed a meadow just through the trees and across the creek.

There have been waves of feminist philosophy. ‘Yet for later feminists, references to sex and gender have served primarily to highlight differences between different groups of women, and to underscore the difficulty of defining women so as to include all those who ought to be included, and to exclude those who ought not.’ For example, take genetic sex. If a woman is restricted to somebody who possesses two X chromosomes, then what happens to trans women -or those who don’t see themselves as binarily constrained? Or those who have various abnormalities in the functioning of their hormones which might force them into a different category?

Is it all down to chromosomes then, or can we also include what they look like -or feel like, for that matter? The question, really, is about definitions it seems -equally applicable to the gendering of both chromosomal sexes. ‘When we turn to gender and define women as those who conform to certain socially and culturally prescribed behaviours, roles, attitudes and desires, we run into similar quandaries. Women possess different races, ethnicities, sexualities, religions and nationalities, and they belong to different socioeconomic classes… Such differences can give rise to different concerns and interests… For example, if emancipation for upper- and middle-class white American women who were historically discouraged from working outside the home involves the freedom to take on paid work, for American working-class women and women of colour who historically needed to or were required to work outside the home, emancipation might involve precisely the freedom to care full-time for one’s own family.’ I have to say, that’s a good point -I had not even considered that before. So is there anything that gendered women have in common?

One commonality, suggested by Sally Haslanger, a professor of philosophy and linguistics at MIT, is oppression. ‘To be a woman is to be subordinated in some way because of real or imagined biological features that are meant to indicate one’s female role in reproduction.’ In many ways, this can be inclusive of trans women, etc., but the sticking point is somebody like the Queen of England: ‘if one is not subordinated at all or at least not because of presumptions about one’s biological role – perhaps the Queen of England – then one is not a woman according to this definition.’

There have been other attempts at inclusively defining a woman, of course. Simone de Beauvoir (Sartre’s lifelong companion) felt that gender was a result of socialization, whereas Judith Butler, a professor of comparative literature at UC Berkeley, saw it as ‘the imposition of a set of behavioural and attitudinal norms. She suggests that, as such, it is an effect of power.’ An interesting corollary of this, though, is that ‘the challenge turns out to be that women are themselves effects of power, so that emancipation from relations of power and subordination requires emancipation from being women.’

At this point, I have to say, I was beginning to feel like a kitten chasing its own tail. The arguments and counterarguments seemed self-defeating: lofty rhetoric full of sound and fury, yet signifying nothing, if I may borrow from Shakespeare’s Macbeth.

An attempt to escape from this paradox was suggested by Butler herself: ‘by replacing emancipation with what she calls ‘resignification’, a process of taking up the effects of power and redeploying them. Although women are effects of power, this power is never accomplished once and for all but must be perpetually reinforced and, moreover, we reinforce it in the ways we act as gendered beings… But we can also behave in ways that undermine this supposed naturalness. We can poke fun at our gendered ways of acting and we can act differently. Drag performances, for example, can camp up stereotypical feminine modes of behaviours and by doing so demonstrate their performance elements.’

Now that struck me as ingenious -like ancient Greek theatre undressing the powerful for all to understand how much we all share in common. And anyway, my head was spinning by the time I reached that stage in the essay; I needed something to hold fast to -some sort of solution.

Maybe the suggestion that drag performances demonstrate the foolishness of our stereotypes about sexual roles is an apt observation. And yet, remember, we are, all of us, together in this world; we need only step back a bit to see how little official definitions matter. After all, whatever -or whoever- each of us thinks they are is all that matters in the end, isn’t it? We are such stuff as dreams are made on… Aren’t we?

Learned without Opinion…

Sometimes we are almost too confident, aren’t we? Encouraged by something we’ve just read, and recognizing it as being already on file in our internal library, we congratulate ourselves on the depth and breadth of our scope. Perhaps it’s the title of an abstruse article, or even the picture at the top of the page that helped identify it. Or… was it something online, whose excellent graphics made it memorable? And, of course, what does it matter where you saw it? You’ve remembered it; it’s yours. Anyway, you know where to find it if the details are a bit fuzzy.

So, does that mean you know it -have thought it through? Analyzed it? Understood it…? Unfortunately, the answer is too often no. It’s merely filed somewhere, should the need arise. But knowledge, and especially the wisdom that might be expected to accompany it, is often lacking.

This was brought -worrisomely- to my attention in an article in Aeon. https://aeon.co/ideas/overvaluing-confidence-we-ve-forgotten-the-power-of-humility

Drawing, in part, on an essay by the psychologist Tania Lombrozo of the University of California –https://www.edge.org/response-detail/23731– Jacob Burak, the founder of Alaxon, a digital magazine about culture, art and popular science, writes ‘The internet and digital media have created the impression of limitless knowledge at our fingertips. But, by making us lazy, they have opened up a space that ignorance can fill.’ Burak continues: ‘technology enhances our illusions of wisdom. She [Lombrozo] argues that the way we access information about an issue is critical to our understanding – and the more easily we can recall an image, word or statement, the more likely we’ll think we’ve successfully learned it, and so refrain from effortful cognitive processing.’

As Lombrozo writes, ‘people rely on a variety of cues in assessing their own understanding. Many of these cues involve the way in which information is accessed. For example, if you can easily (“fluently”) call an image, word, or statement to mind, you’re more likely to think that you’ve successfully learned it and to refrain from effortful cognitive processing. Fluency is sometimes a reliable guide to understanding, but it’s also easy to fool. Just presenting a problem in a font that’s harder to read can decrease fluency and trigger more effortful processing… It seems to follow that smarter and more efficient information retrieval on the part of machines could foster dumber and less effective information processing on the part of human minds.’ And furthermore, ‘educational psychologist Marcia Linn and her collaborators have studied the “deceptive clarity” that can result from complex scientific visualizations of the kind that technology in the classroom and on-line education are making ever more readily available. Such clarity can be deceptive because the transparency and memorability of the visualization is mistaken for genuine understanding.’

I don’t know about you, but I find all of this disturbing. Humbling, even. Not that I have ever been intellectually arrogant -that requires far more than I have ever had to offer- but it does make me pause to reflect on my own knowledge base, and the reliability of the conclusions derived from it.

So, ‘Are technological advances and illusions of understanding inevitably intertwined? Fortunately not. If a change in font or a delay in access can attenuate fluency, then a host of other minor tweaks to the way information is accessed and presented can surely do the same. In educational contexts, deceptive clarity can partially be overcome by introducing what psychologist Robert Bjork calls “desirable difficulties,” such as varying the conditions under which information is presented, delaying feedback, or engaging learners in generation and critique, which help disrupt a false sense of comprehension.’

To be honest, I’m not sure I know what to think of this. Presentation seems a key factor in memory for me -I remember books by the colours or patterns of their covers, for example. Seeing a book on a shelf often helps me remember, if not its exact contents, then at least how I felt about reading it. But I suppose the point of the article is that remembering is not necessarily understanding.

And yet, the book I see on the shelf may, in some fashion, have been incorporated into my thinking -changed something inside me. I’ve read quite a few books over the years, and been surprised, on re-reading them -or later, reading about them- that what I had learned from them was something totally different from what I suppose the author had likely intended.

A good example is Nobel Prize laureate Hermann Hesse’s Magister Ludi (the Glass Bead Game), which I read in the early 1960s. I completely misremembered the plot (and no doubt the intention of the narrative) and for some reason was convinced that the whole purpose of the story was to suggest that a young student, Knecht, who had devoted his entire life to mastering the Game, comes to realize that his ambition was meaningless in the grand scheme of things -and near the end of the rather long novel, drowns himself as a result. Anybody who has actually read Magister Ludi blinks in disbelief if I tell them what I remember of the story: I hadn’t really understood what Hesse had been trying to say, they tell me…

But, nonetheless, the novel had quite an effect on me. Because I remembered it the way I did, I began to realize how we come to rank our beliefs -prioritize our choices compared to those around us. So, was it worthwhile for Knecht to train for years, dedicate his life, and eventually succeed in becoming the Master of a Glass Bead Game, for goodness’ sake? And if he did, so what? Would that really make a difference anywhere, and to anybody?

For that matter, are there other choices that might have mattered more? How would you know? Maybe any choice is essentially the same: of equal value. I thought Hesse’s point terribly profound at the time -and still do, for that matter, despite the fact he probably didn’t intend my interpretation…

Perhaps you see what I am getting at: ‘understanding’ is multifaceted. I learned something important, despite my memory distorting the actual contents of what I read. I incorporated what I remembered as deeply meaningful, somehow. Was what I learned, however unintended, useful? Was it not a type of understanding of what might have been written between the lines? And even if not, the message I obtained was an epiphany. Can that be bad?

I’m certainly not arguing with Lombrozo, or Burak -their points are definitely intriguing and thought-provoking; I just worry that they are not all-encompassing -perhaps they overlook the side-effects. The unintended consequences. Maybe knowledge -and understanding- is less about what facts we retain, and more about what we glean from the exposure.

So, did I understand the novel? Perhaps not, but what I learned from it is now a part of me -and that’s just as valuable… What author, what teacher, could hope for more?

Masters of their fates?

Sentience derives from sentiens, the present participle of the Latin verb sentire –‘to feel’- but what is it? What does it imply? Consciousness? Thought? Or merely some form of awareness of the surroundings, however indistinct and vague? Is avoidance of a noxious stimulus enough to establish sentience, or does it have to involve an understanding that it is harmful?

How about pain itself, then? What kind of a nervous system can feel pain -not just avoid damage, you understand, but feel it? Because surely feeling pain assumes some sort of an I who perceives it as pain rather than simply moves away reflexively… Are we back to consciousness again?

I suppose it’s easy to posit sentience in something like a dog, or a wary squirrel in whose eyes one can easily see that there is something/someone behind them looking out at the world. It’s more difficult as you move down the phylogenetic chain (if one even can, or should, assign direction or rank to changing phyla): easier with, say, lizards or crocodiles; more difficult with flies and mosquitoes; and impossible -for me, at least- with, oh, tapeworms or amoebae and their ilk.

Yes, and then there are the plants, which react to stimuli, often in a purposive fashion -what do we do with them? What constitutes a feeling of pain -especially since they do not have what most of us would consider a nervous system (although their root structures and associated symbiotic fungal networks might qualify)? Do plants feel some sort of proto-pain -and if they do, so what? The buck, if I may be allowed to paraphrase the sign on the former American president Harry Truman’s desk, has to stop somewhere.

So where do we draw the line with sentience? Is it entirely subjective (ours, at any rate)? Should it be confined to those things we would not think of stepping on or swatting? Or is it enough to be alive to merit consideration -different from a rock, for example?

I don’t know why I worry about such things, but I obviously do -especially when I come across essays like the one in Aeon written by Brandon Keim. https://aeon.co/essays/do-cyborg-cockroaches-dream-of-electric-trash

It was entitled I, cockroach, and delved into whether insects felt pain, or were conscious. The question occurred to him after reading about Backyard Brains, ‘a Kickstarter-funded neuroscience education company.’ The company’s flagship product is apparently RoboRoach, a ‘bundle of Bluetooth signal-processing microelectronics that’s glued to the back of a living cockroach and wired into the stumps of its cut-off antennae. Cockroaches use their antennae to detect objects; they react to electrical pulses sent through these nerves as though they have bumped into something, allowing children to remote‑control them with smartphones.’

I have to admit that I am appalled at this -although I suppose I would think little of swatting a cockroach crawling across the kitchen floor. The difference, I suspect, is somewhat akin to what Keim discusses: using a living creature as a tool in what might be -for the cockroach, at any rate- similar to some higher being wiring us up for whatever questionable purpose to change and study our behaviour and -who knows?- maybe change our reality. It’s hard not to sound overly anthropomorphic in describing my feelings about this, but there you have it.

‘A note on the company’s website does reassure customers that, though it’s unknown if insects feel pain, anaesthesia is used during procedures on cockroaches, and also on earthworms and grasshoppers involved in other experiments.’ But as I’ve already mentioned, and as Keim discusses, ‘You can’t experience pain unless there’s a you — a sense of self, an interior dialogue beyond the interplay of stimulus and involuntary response, elevating mechanics to consciousness. [And] such sentience is quite unlikely in a bug, says Backyard Brains.’ Really?

Even the likes of Darwin wondered about cognitive states in ‘lower’ creatures. In his final book, The Formation of Vegetable Mould Through the Action of Worms, with Observations on Their Habits (1881), he describes in great detail ‘how earthworms plug the entrance to their burrows with precisely chosen and arranged leaf fragments, and how instinct alone doesn’t plausibly explain that.’ ‘One alternative alone is left, namely, that worms, although standing low in the scale of organisation, possess some degree of intelligence.’

And no, as the more observant of my readers will no doubt have noted, worms are not cockroaches. Then how about honey bees as insect stand-ins for roaches? How about their waggle dances: ‘the complicated sequence of gestures by which honeybees convey the location and quality of food to hive-mates’? As Keim notes, ‘scientists have assembled a portrait of extraordinary cognitive richness, so rich that honeybees now serve as model organisms for understanding the neurobiology of basic cognition. Honeybees have a sense of time and of space; they have both short- and long-term memories. These memories combine sight and smell, and are available to bees independent of their immediate environments. In other words, they have internal representations of their worlds. They can learn to recognise patterns, and also concepts: above and below, same or different. They have simple emotions and beliefs, and apply those memories and concepts to their decisions. They likely recognise individuals.’

In fact, ‘Cognition is only one facet of mental activity, and not a stand-in for rich inner experience, but underlying honeybee cognition is [a] small but sophisticated brain, with structures that effectively perform similar functions as the mammalian cortex and thalamus — systems considered fundamental to human consciousness.’

I don’t want to take this too far. Thomas Nagel, the American philosopher, in his 1974 essay What is it like to be a bat? argued that ‘an organism has conscious mental states, “if and only if there is something that it is like to be that organism—something it is like for the organism to be itself.”’ (A fascinating paper, by the way, and well worth the read). But, coming back to cockroaches, as Keim writes, ‘The nature of their consciousness is difficult to ascertain, but we can at least imagine that it feels like something to be a bee or a cockroach or a cricket. That something is intertwined with their life histories, modes of perception, and neurological organisation’ -however impoverished that something might seem in comparison to our own perceptions. Indeed, maybe it would be something like our state of awareness in doing ‘mindless’ tasks like walking down stairs, or picking up a cup of coffee -both purposive, and yet likely unremarked consciously…

There’s even some evidence that cockroaches have a richer social life than most of us might have imagined. According to ethologist Mathieu Lihoreau in his 2012 article for the journal Insectes Sociaux, ‘one can think of them as living in herds. Groups decide collectively on where to feed and shelter, and there’s evidence of sophisticated communication, via chemical signals rather than dances. When kept in isolation, individual roaches develop behavioural disorders; they possess rich spatial memories, which they use to navigate; and they might even recognise group members on an individual basis.’

Maybe the famous English biologist J.B.S. Haldane got it right when, in 1927, he wrote that ‘the universe is not only queerer than we suppose, but queerer than we can suppose’. Then again, I suspect we tend to view things as peculiar or even alien if we feel no connection to them -feel that, as humans, we are not really a part of their world. But remember the words of Gloucester as he stumbles around the moor after being blinded by Regan and Cornwall in Shakespeare’s King Lear: ‘As flies to wanton boys are we to the gods; they kill us for their sport‘.

Whose world are we in, exactly…?