Why then, can one desire too much of a good thing?

What have we done? Have we become so transfixed with definitions -differences- that we have forgotten where we started? Where we want to go? Has the clarity for which we strove opacified as it cooled? Sometimes the more encompassing the definition, the less useful it becomes.

I suppose that coming from the putative dark side -that is to say, the male portion of the equation- I am credentialed merely with Age and a world view encrusted with a particular zeitgeist; I come from an era of binaries, albeit with categories that allow for shades -rainbows which do not seek to define the boundaries where one colour fades into the next. They allow a melange without, I hope, troubling themselves with the constituents. Or am I being hopelessly naïve?

The more I am engaged with the issues of gendered literature, though, the more I suspect I have been misled all these past years. I have, of course, been aware of the lengthening gender acronym -LGBTQIA…- that threatens, like the old lady who lived in the shoe in that Mother Goose rhyme, to outgrow its useful home. In its quest to include and define each shade of difference -as laudable as that may seem at first glance- it threatens to fragment like shattered glass: useful to nobody as a container. I am, rather oddly, reminded of the advice of the Indian philosopher Jiddu Krishnamurti, who felt that we should not attempt to categorize or name something too finely -God, in his example: the name confines and overly limits the concept being promulgated.

The dangers of over-inclusion surfaced when I attempted to read an essay by Georgia Warnke, a professor of political science at the University of California, Riverside, published in Aeon: https://aeon.co/essays/do-analytic-and-continental-philosophy-agree-what-woman-is

‘The famed online Stanford Encyclopedia of Philosophy offers separate articles on analytic and continental feminism (although with a separate article on intersections between the two). The article on analytic feminism notes its commitment to careful argumentation and to ‘the literal, precise, and clear use of language’, while that on continental feminism notes its interest in unveiling precisely those ‘non-discursive deep-seated biases and blind spots … not easily detected by an exclusive focus on the examination of arguments’.’ A few minutes of reflection suggested that neither my vocabulary nor my intellect might be up to the task, but I ploughed on nonetheless -still curious about the subject.

‘The article on analytic feminism emphasises the importance of the philosophy of language, epistemology and logic; that on continental feminism the importance of postmodernism, psychoanalysis and phenomenology.’ Whoa. What was I asking my obviously non-postmodern brain to assimilate? It was only when I stumbled upon ‘we can begin with a core feminist question: namely, who or what are women? Who are the subjects to whose freedom and equality feminist philosophers are committed?’ that I sensed a meadow just through the trees and across the creek.

There have been waves of feminist philosophy. ‘Yet for later feminists, references to sex and gender have served primarily to highlight differences between different groups of women, and to underscore the difficulty of defining women so as to include all those who ought to be included, and to exclude those who ought not.’ Take genetic sex, for example. If a woman is defined as somebody who possesses two X chromosomes, then what happens to trans women -or to those who don’t see themselves as binarily constrained? Or to those whose hormonal variations might force them into a different category?

Is it all down to chromosomes then, or can we also include what they look like -or feel like, for that matter? The question, really, is about definitions, it seems -equally applicable to the gendering of both chromosomal sexes. ‘When we turn to gender and define women as those who conform to certain socially and culturally prescribed behaviours, roles, attitudes and desires, we run into similar quandaries. Women possess different races, ethnicities, sexualities, religions and nationalities, and they belong to different socioeconomic classes… Such differences can give rise to different concerns and interests… For example, if emancipation for upper- and middle-class white American women who were historically discouraged from working outside the home involves the freedom to take on paid work, for American working-class women and women of colour who historically needed to or were required to work outside the home, emancipation might involve precisely the freedom to care full-time for one’s own family.’ I have to say, that’s a good point -I had not even considered it before. So is there anything that gendered women have in common?

One commonality, suggested by Sally Haslanger, a professor of philosophy and linguistics at MIT, is oppression. ‘To be a woman is to be subordinated in some way because of real or imagined biological features that are meant to indicate one’s female role in reproduction.’ In many ways this definition can include trans women, but the sticking point is somebody like the Queen of England: ‘if one is not subordinated at all or at least not because of presumptions about one’s biological role – perhaps the Queen of England – then one is not a woman according to this definition.’

There have been other attempts at inclusively defining a woman, of course. Simone de Beauvoir (the woman who was close to Sartre) felt that gender was a result of socialization, whereas Judith Butler, a professor of comparative literature at UC Berkeley, saw it as ‘the imposition of a set of behavioural and attitudinal norms. She suggests that, as such, it is an effect of power.’ An interesting corollary of this, though, is that ‘the challenge turns out to be that women are themselves effects of power, so that emancipation from relations of power and subordination requires emancipation from being women.’

At this point, I have to say, I was beginning to feel like a kitten chasing its own tail. The arguments and counterarguments seemed self-defeating: lofty rhetoric full of sound and fury, yet signifying nothing, if I may borrow from Shakespeare’s Macbeth.

An attempt to escape from this paradox was suggested by Butler herself: ‘by replacing emancipation with what she calls ‘resignification’, a process of taking up the effects of power and redeploying them. Although women are effects of power, this power is never accomplished once and for all but must be perpetually reinforced and, moreover, we reinforce it in the ways we act as gendered beings… But we can also behave in ways that undermine this supposed naturalness. We can poke fun at our gendered ways of acting and we can act differently. Drag performances, for example, can camp up stereotypical feminine modes of behaviours and by doing so demonstrate their performance elements.’

Now that struck me as ingenious -like ancient Greek theatre undressing the powerful for all to see how much we have in common. And anyway, my head was spinning by the time I reached that stage in the essay; I needed something to hold fast to -some sort of solution.

Maybe the suggestion that drag performances demonstrate the foolishness of our stereotypes about sexual roles is an apt observation. And yet, remember, we are, all of us, together in this world; we need only step back a bit to see how little official definitions matter. After all, whatever -or whoever- each of us thinks we are is all that matters in the end, isn’t it? We are such stuff as dreams are made on… Aren’t we?

Learned without Opinion…

Sometimes we are almost too confident, aren’t we? Encouraged by something we’ve just read, and recognizing it as being already on file in our internal library, we congratulate ourselves on the depth and breadth of our scope. Perhaps it’s the title of an abstruse article, and even the picture at the top of the page that helped identify it. Or… was it something online, whose excellent graphics made it memorable? And, of course, what does it matter where you saw it? You’ve remembered it; it’s yours. Anyway, you know where to find it if the details are a bit fuzzy.

So, does that mean you know it -have thought it through? Analyzed it? Understood it…? Unfortunately, the answer is too often no. It’s merely filed somewhere, should the need arise. But knowledge, and especially the wisdom that might be expected to accompany it, is often lacking.

This was brought -worrisomely- to my attention in an article in Aeon: https://aeon.co/ideas/overvaluing-confidence-we-ve-forgotten-the-power-of-humility

Drawing, in part, on an essay by the psychologist Tania Lombrozo of the University of California -https://www.edge.org/response-detail/23731- Jacob Burak, the founder of Alaxon, a digital magazine about culture, art and popular science, writes: ‘The internet and digital media have created the impression of limitless knowledge at our fingertips. But, by making us lazy, they have opened up a space that ignorance can fill.’ Both argue that ‘technology enhances our illusions of wisdom. She [Lombrozo] argues that the way we access information about an issue is critical to our understanding – and the more easily we can recall an image, word or statement, the more likely we’ll think we’ve successfully learned it, and so refrain from effortful cognitive processing.’

As Lombrozo writes, ‘people rely on a variety of cues in assessing their own understanding. Many of these cues involve the way in which information is accessed. For example, if you can easily (“fluently”) call an image, word, or statement to mind, you’re more likely to think that you’ve successfully learned it and to refrain from effortful cognitive processing. Fluency is sometimes a reliable guide to understanding, but it’s also easy to fool. Just presenting a problem in a font that’s harder to read can decrease fluency and trigger more effortful processing… It seems to follow that smarter and more efficient information retrieval on the part of machines could foster dumber and less effective information processing on the part of human minds.’ And furthermore, ‘educational psychologist Marcia Linn and her collaborators have studied the “deceptive clarity” that can result from complex scientific visualizations of the kind that technology in the classroom and on-line education are making ever more readily available. Such clarity can be deceptive because the transparency and memorability of the visualization is mistaken for genuine understanding.’

I don’t know about you, but I find all of this disturbing. Humbling, even. Not that I have ever been intellectually arrogant -that requires far more than I have ever had to offer- but it does make me pause to reflect on my own knowledge base, and the reliability of the conclusions derived from it.

So, ‘Are technological advances and illusions of understanding inevitably intertwined? Fortunately not. If a change in font or a delay in access can attenuate fluency, then a host of other minor tweaks to the way information is accessed and presented can surely do the same. In educational contexts, deceptive clarity can partially be overcome by introducing what psychologist Robert Bjork calls “desirable difficulties,” such as varying the conditions under which information is presented, delaying feedback, or engaging learners in generation and critique, which help disrupt a false sense of comprehension.’

To be honest, I’m not sure I know what to think of this. Presentation seems a key factor in memory for me -I remember books by the colours or patterns of their covers, for example. Seeing a book on a shelf often helps me remember, if not its exact contents, then at least how I felt about reading it. But I suppose the point of the article is that remembering is not necessarily understanding.

And yet, the book I see on the shelf may, in some fashion, have been incorporated into my thinking -changed something inside me. I’ve read quite a few books over the years, and been surprised, on re-reading them -or later, reading about them- that what I had learned from them was something totally different from what the author had likely intended.

A good example is Nobel laureate Hermann Hesse’s Magister Ludi (The Glass Bead Game), which I read in the early 1960s. I completely misremembered the plot (and no doubt the intention of the narrative), and for some reason was convinced that the whole purpose of the story was to suggest that a young student, Knecht, who had devoted his entire life to mastering the Game, comes to realize that his ambition was meaningless in the grand scheme of things -and, near the end of the rather long novel, drowns himself as a result. Anybody who has actually read Magister Ludi blinks in disbelief when I tell them what I remember of the story: I hadn’t really understood what Hesse had been trying to say, they tell me…

But, nonetheless, the novel had quite an effect on me. Because I remembered it the way I did, I began to realize how we come to rank our beliefs -prioritize our choices compared with those around us. So, was it worthwhile for Knecht to train for years, dedicate his life, and eventually succeed in becoming the Master of the Glass Bead Game, for goodness’ sake? And if he did, so what? Would that really make a difference anywhere, and to anybody?

For that matter, are there other choices that might have mattered more? How would you know? Maybe any choice is essentially the same: of equal value. I thought Hesse’s point terribly profound at the time -and still do, for that matter, despite the fact he probably didn’t intend my interpretation…

Perhaps you see what I am getting at: ‘understanding’ is multifaceted. I learned something important, despite my memory distorting the actual contents of what I read. I incorporated what I remembered as deeply meaningful, somehow. Was what I learned, however unintended, useful? Was it not a type of understanding of what might have been written between the lines? And even if not, the message I obtained was an epiphany. Can that be bad?

I’m certainly not arguing with Lombrozo, or Burak -their points are definitely intriguing and thought-provoking; I just worry that they are not all-encompassing -perhaps they overlook the side-effects. The unintended consequences. Maybe knowledge -and understanding- is less about what facts we retain, and more about what we glean from the exposure.

So, did I understand the novel? Perhaps not, but what I learned from it is now a part of me -and that’s just as valuable… What author, what teacher, could hope for more?

Masters of their fates?

Sentience derives from sentiens, the present participle of the Latin verb sentire -‘to feel’- but what is it? What does it imply? Consciousness? Thought? Or merely some form of awareness of the surroundings, however indistinct and vague? Is avoidance of a noxious stimulus enough to establish sentience, or does it have to involve an understanding that the stimulus is harmful?

How about pain itself, then? What kind of a nervous system can feel pain -not just avoid damage, you understand, but feel it? Because surely feeling pain assumes some sort of an I who perceives it as pain rather than simply moves away reflexively… Are we back to consciousness again?

I suppose it’s easy to posit sentience in something like a dog, or a wary squirrel in whose eyes one can easily see that there is something/someone behind them looking out at the world. It’s more difficult as you move down the phylogenetic chain (if one even can, or should, assign direction or rank to changing phyla): easier with, say, lizards or crocodiles; more difficult with flies and mosquitoes; and impossible -for me, at least- with, oh, tapeworms or amoebae and their ilk.

Yes, and then there are the plants, which react to stimuli, often in a purposive fashion -what do we do with them? What would constitute a feeling of pain -especially since they do not have what most of us would consider a nervous system (although their root structures and associated symbiotic fungal networks might qualify)? Do plants feel some sort of proto-pain -and if they do, so what? The buck, if I may be allowed to paraphrase the sign on the former American president Harry Truman’s desk, has to stop somewhere…

So where do we draw the line with sentience? Is it entirely subjective (ours, at any rate)? Should it be confined to those things we would not think of stepping on or swatting? Or is it enough to be alive to merit consideration -different from a rock, for example?

I don’t know why I worry about such things, but I obviously do -especially when I come across essays like the one in Aeon written by Brandon Keim. https://aeon.co/essays/do-cyborg-cockroaches-dream-of-electric-trash

It was entitled I, cockroach, and delved into whether insects felt pain, or were conscious. The question occurred to him after reading about Backyard Brains, ‘a Kickstarter-funded neuroscience education company.’ The company’s flagship product is apparently RoboRoach, a ‘bundle of Bluetooth signal-processing microelectronics that’s glued to the back of a living cockroach and wired into the stumps of its cut-off antennae. Cockroaches use their antennae to detect objects; they react to electrical pulses sent through these nerves as though they have bumped into something, allowing children to remote‑control them with smartphones.’

I have to admit that I am appalled at this -although I suppose I would think little of swatting a cockroach crawling across the kitchen floor. The difference, I suspect, is somewhat akin to what Keim discusses: using a living creature as a tool in what might be -for the cockroach, at any rate- similar to some higher being wiring us up for whatever questionable purpose to change and study our behaviour and -who knows?- maybe change our reality. It’s hard not to sound overly anthropomorphic in describing my feelings about this, but there you have it.

‘A note on the company’s website does reassure customers that, though it’s unknown if insects feel pain, anaesthesia is used during procedures on cockroaches, and also on earthworms and grasshoppers involved in other experiments.’ But as I’ve already mentioned, and as Keim discusses, ‘You can’t experience pain unless there’s a you — a sense of self, an interior dialogue beyond the interplay of stimulus and involuntary response, elevating mechanics to consciousness. [And] such sentience is quite unlikely in a bug, says Backyard Brains.’ Really?

Even the likes of Darwin wondered about cognitive states in ‘lower’ creatures. In his final book, The Formation of Vegetable Mould Through the Action of Worms, with Observations on Their Habits (1881), he describes in great detail how earthworms plug the entrances to their burrows with precisely chosen and arranged leaf fragments, and how instinct alone doesn’t plausibly explain that: ‘One alternative alone is left, namely, that worms, although standing low in the scale of organisation, possess some degree of intelligence.’

And no, as the more observant of my readers will no doubt have noted, worms are not cockroaches. Then how about honey bees as insect stand-ins for roaches? How about their waggle dances: ‘the complicated sequence of gestures by which honeybees convey the location and quality of food to hive-mates’? As Keim notes, ‘scientists have assembled a portrait of extraordinary cognitive richness, so rich that honeybees now serve as model organisms for understanding the neurobiology of basic cognition. Honeybees have a sense of time and of space; they have both short- and long-term memories. These memories combine sight and smell, and are available to bees independent of their immediate environments. In other words, they have internal representations of their worlds. They can learn to recognise patterns, and also concepts: above and below, same or different. They have simple emotions and beliefs, and apply those memories and concepts to their decisions. They likely recognise individuals.’

In fact, ‘Cognition is only one facet of mental activity, and not a stand-in for rich inner experience, but underlying honeybee cognition is [a] small but sophisticated brain, with structures that effectively perform similar functions as the mammalian cortex and thalamus — systems considered fundamental to human consciousness.’

I don’t want to take this too far. Thomas Nagel, the American philosopher, in his 1974 essay What is it like to be a bat?, argued that an organism has conscious mental states ‘if and only if there is something that it is like to be that organism—something it is like for the organism to be itself’ (a fascinating paper, by the way, and well worth the read). But, coming back to cockroaches, as Keim writes, ‘The nature of their consciousness is difficult to ascertain, but we can at least imagine that it feels like something to be a bee or a cockroach or a cricket. That something is intertwined with their life histories, modes of perception, and neurological organisation’ -however impoverished that something might seem in comparison to our own perceptions. Indeed, maybe it would be something like our state of awareness in doing ‘mindless’ tasks like walking down stairs, or picking up a cup of coffee -both purposive, and yet likely unremarked consciously…

There’s even some evidence that cockroaches have a richer social life than most of us might have imagined. According to ethologist Mathieu Lihoreau in his 2012 article for the journal Insectes Sociaux, ‘one can think of them as living in herds. Groups decide collectively on where to feed and shelter, and there’s evidence of sophisticated communication, via chemical signals rather than dances. When kept in isolation, individual roaches develop behavioural disorders; they possess rich spatial memories, which they use to navigate; and they might even recognise group members on an individual basis.’

Maybe the famous English biologist J.B.S. Haldane got it right when, in 1927, he wrote that ‘the universe is not only queerer than we suppose, but queerer than we can suppose’. Then again, I suspect we tend to view things as peculiar or even alien if we feel no connection to them -feel that, as humans, we are not really a part of their world. But remember the words of Gloucester as he stumbles around the moor after being blinded by Regan and Cornwall in Shakespeare’s King Lear: ‘As flies to wanton boys are we to the gods; they kill us for their sport’.

Whose world are we in, exactly…?

A thousand times goodnight

Am I working against the grain? Or is it just that I’m getting older? Unable to assimilate new situations quickly enough to form a useful opinion? I’d rather think of it as the wisdom of Age, but, of course, I would think that, wouldn’t I? And yet, the realization that first impressions are often premature impressions is something only acquired through experience, I suppose, because it’s difficult to shed the initial suspicion that you may have discovered something really important.

I’m pretty sure I have never made friends like that -friendship (as opposed to acquaintanceship) is acquired slowly, and over time. And as for something akin to ‘love at first sight’, I can only say that for those kinds of feelings to last -at least on my part- they have to be reciprocated. That, too, takes time. ‘Attraction at first sight’ is another thing altogether, though -it is more superficial, and probably less demanding. Love is a deep -dare I say, spiritual- thing, whereas I think attraction sits more tenuously on the rather slippery surface of our attention.

Still, I recognize that as the years slowly thicken around me, they may have dampened the restless partner-seeking vibrissae to which younger, thinner skin is so exposed. I’m not sure that I am completely disqualified, but at least my muffled needs have allowed me time to reflect before deciding -to breathe, before seeking to envelop…

And yet I remain curious about, if not vicariously attracted to, the issue of first impressions, so I just had to read the BBC story that promised to unwrap it like a bedtime story from long ago: http://www.bbc.com/future/story/20190401-is-there-such-a-thing-as-love-at-first-sight

In an essay for the BBC, William Park writes that ‘There is evidence that we are able to make an assessment of someone’s attractiveness in the blink of an eye, but it doesn’t necessarily mean that those assessments are accurate… It takes less than 1/10th of a second to form an assessment of someone’s face. These first impressions predict all kinds of important characteristics, not just attractiveness.’ And, ‘These impressions we make in a split second are not random; they tend to be shared by the majority of the people surveyed. But it doesn’t necessarily make them correct. “A first impression could be misleading,” says professor Alexander Todorov [an academic at Princeton University]… “We only make first impressions about strangers. So naturally they are superficial.”’

‘Whether our predictions are accurate or not, we make them quickly and we stick to them. Even if we are given more time than 1/10th of a second to judge the attractiveness of a face, we are unlikely to arrive at a different conclusion… There are three universal qualities that people infer from a face: attractiveness, trustworthiness and dominance. Evolutionarily, this makes sense. Attractiveness is a mating cue, trustworthiness implies useful social characteristics, like being able to care for children, and assessing dominance is useful to avoid conflict.’

So far, so good, I suppose -if a bit reductionist. But the essay goes on to suggest that we prejudge facial photos using the same categories and ‘portraits taken from a low angle are more likely to be judged as dominant, which is positive for men and negative for women. Whereas the reverse is seen in portraits taken from a high angle.’ -so, my first clue as to what kind of picture to put on a dating site, I guess. But there is a catch: ‘In dating apps, it is a case of love at second sight. When asked to rate the attractiveness of potential partners, if the preceding face was attractive you are more likely to rate the next face as attractive and vice versa.’

Well, that confirms my suspicion that online first impressions are such stuff as dreams are made on. ‘First impressions are rapid but shallow and mutable if you have better information.’ You have to talk to somebody, engage with them, to sustain something more than a passing interest. And then, of course, it is no longer a ‘first’ impression. But I’m only reiterating what Todorov believes: ‘“The only way to tell whether two people will really like each other – they have to talk. People don’t make good predictions for compatibility without talking,” says Professor Todorov.’

Uhmm… I have to say that I began to lose interest at that point. I began to wonder, as I pointed out earlier, whether the essay was more about attraction than love. It’s easy to get them mixed up in the soup of hormones in which we swim. In many ways, the article was a ‘how to’ for the young and restless. I was more intrigued by something Park points out in the dying embers of his article, when he quotes a professor of psychology from California State University, Los Angeles, Karen Wu. ‘Wu studies dating behaviours in Asian-American communities who put a different emphasis on certain values… “Western cultures value individual goals more than group goals. Collectivistic cultures might value niceness more because you’re interested in group benefits rather than individual benefits.”’

In other words, ‘Considering this, it is a miracle that we ever find someone who is as attracted to us as we are to them. The conversation your potential partner had directly before meeting you, their general mood, their cultural background, the angle at which they are looking at you, whether they deem themselves to be more popular than you – the list of factors that could influence whether you hit it off seems endless.’

So, is it any wonder that Age seems like a vacation at the cottage? No compulsion to drive somewhere, and then get up the next day and drive someplace else. No need to worry about the angle from which you take your selfies, or whether the next individual who wanders past is judging you by the standards of the person with whom they last talked.

These all seem like minor things in the bigger picture, and yet they loom large in the quest for partnership, I suppose. Attractiveness, trustworthiness and dominance -is that what we’re expected -okay, designed- to glean from the first glance, without even needing to break the ice with a smile or a kind word? Biologic atavisms, if you ask me… although I am seldom canvassed for that kind of opinion anymore. I’m not sure why.

This thing of darkness

I’m starting to wonder if I was misled during those halcyon days on my father’s knee. It was a time when heroes and villains were easily recognizable -like the white and black hats on the cowboys in the movies that were in vogue then. Apparently, I needed to know who to root for when I was a child -life was a battle between good and evil, God and the Devil. I thought that maybe all stories were supposed to be like that in one way or another -that, in fact, perhaps without that tension, there would be no story worth listening to -nothing and nobody to cheer for. But I’m getting ahead of myself.

I suppose it’s natural, when you are a child, to assume that the patterns learned early are special -sacrosanct, even. Deviations are regarded with suspicion. Doubt. And even now, I usually expect there to be recognizable traits in the characters, and a direction of the storyline that helps me to pick out which side to pull for. So maybe it has been that way from the time stories were first told -the one inviolable rule to keep an audience attentive. In a chaotic world, people must have longed for order, justice -something to hope for, if only in the imaginary tales told in the flickering firelight as eyes watched hungrily from the forest. The eventual triumph of good over evil -surely that’s the purpose of a good story…

But it seems that in the glimmer of recent historical analysis, that’s not how it used to be. I came across a fascinating essay by Catherine Nichols which suggests that the Ancients did not see it quite like that: https://aeon.co/essays/why-is-pop-culture-obsessed-with-battles-between-good-and-evil

Her essay, entitled The good guy/bad guy myth, argues that ‘In old folktales, no one fights for values. Individual stories might show the virtues of honesty or hospitality, but there’s no agreement among folktales about which actions are good or bad. When characters get their comeuppance for disobeying advice, for example, there is likely another similar story in which the protagonist survives only because he disobeys advice. Defending a consistent set of values is so central to the logic of newer plots that the stories themselves are often reshaped to create values for characters such as Thor and Loki – who in the 16th-century Icelandic Edda had personalities rather than consistent moral orientations.

‘Stories from an oral tradition never have anything like a modern good guy or bad guy in them, despite their reputation for being moralising. In stories such as Jack and the Beanstalk or Sleeping Beauty, just who is the good guy? Jack is the protagonist we’re meant to root for, yet he has no ethical justification for stealing the giant’s things. Does Sleeping Beauty care about goodness? Does anyone fight crime? Even tales that can be made to seem like they are about good versus evil, such as the story of Cinderella, do not hinge on so simple a moral dichotomy. In traditional oral versions, Cinderella merely needs to be beautiful to make the story work. In the Three Little Pigs, neither pigs nor wolf deploy tactics that the other side wouldn’t stoop to. It’s just a question of who gets dinner first, not good versus evil.’

That’s interesting, don’t you think? Although I suspect the final verdict is not yet in, we have -all of us- been guilty of inadvertent historical revisionism: we have felt the need to interpret stories as having morals, unaware of the ‘historic shift that altered the nature of so many of our modern retellings of folklore, to wit: the idea that people on opposite sides of conflicts have different moral qualities, and fight over their values. That shift lies in the good guy/bad guy dichotomy, where people no longer fight over who gets dinner, or who gets Helen of Troy, but over who gets to change or improve society’s values.’ In other words, we are judging, or appending qualities, that originally may not have been intended.

‘The situation is more complex in epics such as The Iliad, which does have two ‘teams’, as well as characters who wrestle with moral meanings. But the teams don’t represent the clash of two sets of values in the same way that modern good guys and bad guys do. Neither Achilles nor Hector stands for values that the other side cannot abide, nor are they fighting to protect the world from the other team. They don’t symbolise anything but themselves and, though they talk about war often, they never cite their values as the reason to fight the good fight. The ostensibly moral face-off between good and evil is a recent invention that evolved in concert with modern nationalism – and, ultimately, it gives voice to a political vision not an ethical one.’

‘In her book The Hard Facts of the Grimms’ Fairy Tales (1987), the American scholar Maria Tatar remarks on the way that Wilhelm Grimm would slip in, say, adages about the importance of keeping promises. She argued that ‘Rather than coming to terms with the absence of a moral order … he persisted in adding moral pronouncements even where there was no moral.’’

But now that I am in my autumn years, and have had time to reflect on these things, I think that Nichols only partially captures what’s really going on: although the revised stories incorporate values that have shaped (or been shaped by) society as a whole, they also encompass a whole range of feelings from my childhood that I am rather loath to abandon. Her analysis seems a bit too reductionist for me.

I suppose I am a product of my era, a result of its prevailing zeitgeist, and yet when I was a child, I think I wanted to believe, not so much in the continuing battle of Good with Evil (whatever either of those concepts mean), but more in the hope of an acceptable ending to the story. So, for me, the success of Goldilocks and the Three Bears, for example, didn’t hinge on a moral dichotomy -a Life-lesson. It’s a story that doesn’t really need that kind of tension. A child doesn’t impute evil to Goldilocks for eating Baby Bear’s porridge -okay, I didn’t, anyway. It left me with a warm, fuzzy feeling, and I don’t think I learned a lesson from it -I was entertained. I was fascinated. I was just a child sitting on a lap wanting to hear my father’s reassuring voice. Surely you have to have some stories that don’t teach you anything, except how to smile…

Of thinking too precisely on the event

The right not to know -now there’s an interesting concept in today’s competition for instant news, ‘breaking stories’, and the ever-present titillation of factoids. It seems almost counterintuitive -why would anyone choose not to know something? Surely knowledge trumps ignorance. Surely Hamlet’s timeless question -‘Whether ’tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles and by opposing, end them?’- has been answered in the modern era: it is better to know what hides behind the door than to turn one’s back… Or is that just naïve? Hopelessly romantic?

And yet, at least in Medicine, there is the potential struggle between beneficence -promoting and advocating for the well-being of others- and autonomy -the right of someone to determine their own fate. And when the two are in conflict, there is an ethical dilemma.

But what about the right not to know something? Something that neither party had any reason to anticipate, and of which ignorance could be disastrous? Does the knowledgeable party have an obligation to inform the other, even if they were instructed not to? In everyday affairs, that seems an unlikely scenario, but an interesting article in Aeon by the writer Emily Willingham outlines some examples from medical research that probably cross the line: https://aeon.co/ideas/the-right-to-know-or-not-know-the-data-from-medical-research

A blood sample targeting cholesterol, for example, might show another, seemingly unrelated abnormality. Should your doctor tell you about it, even though the cholesterol value was normal? Of course she should, you would assume. ‘But what if the finding turned up in samples donated for medical research instead of taken for medical testing? … [The] UK Biobank offers a case in point. When participants submit samples to be mined for genetic information, they agree to receive no individual feedback about the results, and formally waive their right to know…’ But that seems unethical, does it not? ‘The reality is that the ‘right’ thing to do about these competing rights to know and not know – and to tell what you do know – varies depending on who’s guiding the discussion. For example, a clinician ordering a test and finding something incidental but worrisome is already in a patient-doctor relationship with at least a tacit agreement to inform. But a researcher collecting DNA samples for a big data biobank has formed no such relationship and made no such commitment.’

Still, there should be some way -perhaps a retroactive clause that would enable a researcher to inform. One way, for example, might be to recognize that ‘people who submit samples for research might benefit from the same process that’s provided to people undergoing genetic testing in the clinic. Genetic counselling is strongly recommended before such testing, and this kind of preparation for research participants could clarify their decisions as well. Investigators who engage with these data on the research side deserve similar preparation and attention to their rights. Before getting involved in such studies, they should be able to give informed consent to withholding findings that could affect a donor’s health. Study investigators should also be unable to link donors and results, removing the possibility of accidental informing, and lifting the burden of the knowledge.’

Of course, the problem doesn’t just start and stop with whether the study participant decides she doesn’t want to know, does it? If the problem has a genetic component, ‘what about the people who were never tested but who are genetic relatives to those with an identified risk or disease? … After all, your genes aren’t yours alone. You got them from your parents, and your biological children will get some of yours from you…  in reality, the revelations – and repercussions – can span generations.’

On a lighter note, I can’t help but be reminded of my friend Brien. Readers of some of my more retirement-centered feuilletons will recall that he is a rather eccentric individual who seems to enjoy living on his porch and watching the world go by, no matter the weather. Rain or shine, summer or winter, I see him ensconced in his seat with a beer in his hand and another one on the railing in case I happen by. A harmless sort, and barely noticed by those who amble past, he is not an infrastructure man. His porch ekes out an existence from day to day in terminal decline. Every time I visit, he assures me that because the sidewalk leading to the house is also deteriorating, it discourages unnecessary visitors. And those who brave the path -me, I suppose he means- know and accept the risks -especially of the dangerous and disintegrating steps onto the porch.

But the last time I was over there, I almost put my foot through a rotting board near his chair, and I felt it had gone too far; I thought he should know. “Brien,” I started, somewhat hesitantly, given his explicit instructions to avoid any criticism of his porchdom, “I just…”

But he silenced me with a regal wave of his beer-hand -a sure sign of displeasure. “You’re gonna tell me something bad, I just know it…”

“No… I was just going to suggest that…”

“Remember the rules, eh?”

Brien can be so annoying sometimes. I think he honestly believes that naming a problem -identifying it by whatever means- gives it the right, formerly denied to it, of existence. So I shrugged, and decided to acquiesce and similarly ignore its right to life. Autonomy, after all, is a right as well.

I didn’t see Brien for a week or two, but when I next happened by, as I often do on my way to the store, he seemed unusually bulky on his chair. He was covered in a thick Hudson’s Bay blanket, of course, but I assumed the cold autumn wind had made him bring it out early this year.

He waved at me from the porch and told me not to worry about the steps anymore. And as I approached, I noticed they were brand new and ready for painting. In fact, as I neared the porch, I noticed some of the boards near his chair had been replaced as well.

“What’s going on?” I asked as soon as he handed me a beer.

He smiled and pulled back the blanket to show me the cast on his leg. “I decided to take your advice…”

“But, I never…”

He held up his hand to silence me. “Sometimes I can hear what you don’t say, G…” he interrupted, calling me by my nickname. “And just because I don’t want you to tell me, doesn’t mean I don’t want to know about it, eh?”

I’m still not sure I feel good about not telling him, though…

More than kin and less than kind

Am I really the me I think I am -the cogito ergo sum I have been led to believe? Or have I been naïve all these years in assuming my identity rests solely inside somewhere -in the uniqueness of my brain, maybe, or in the peculiarities of my experiences that no one else could ever hope to share in the same intimate fashion? Am I, in other words, a self-portrait?

I was raised in a society that values self-fulfillment as if it were a birthright. Even the motto of my high school was ad maiora natus sum -‘I was born for greater things’. Not we, you understand, but I… me. And, of course, my teachers were only too happy to inculcate the values of independence and self-reliance in each and every one of us. Competitions on the sports field, and gradations in our marks, only heightened the feeling that each of us was separate, and in charge of our ranking, somehow. It seemed only natural -to some of us, at any rate- to see ourselves as nascent statues seeking our own pedestals.

There’s nothing wrong with that, I suppose, except that as I grew older and gained more experience, I began to realize that I was not alone. Like my shadow, the world followed me everywhere. Indeed, everything I did, and much of what I thought, was influenced by others -either by assimilation, or by unwitting imitation. The opinions of others, although sometimes shunned, were more often modified and subsequently integrated, as if by disguising them I became the author. And yet, deep down, I realized that the parthenogenesis of ideas was largely fictive. I was swimming in the same waters as everyone else…

But it was not as traumatic as I might have predicted in my tutored youth. In fact, on reflection, it has been more affirming than repudiating, more reassuring than discouraging -almost as if I had finally been accepted as a member of something I had unconsciously coveted all along. I had not capitulated to something I had struggled against, but, instead of staring through its windows like a bewildered shopper, I was welcomed through the door.

But why? Why the initial reluctance to accept my membership in something to which I had always belonged? Some of the answers emerged in an online publication, Aeon, from an essay by Abeba Birhane, a cognitive science student at University College Dublin. https://aeon.co/ideas/descartes-was-wrong-a-person-is-a-person-through-other-persons

‘We know from everyday experience that a person is partly forged in the crucible of community. Relationships inform self-understanding. Who I am depends on many ‘others’: my family, my friends, my culture, my work colleagues… Even my most private and personal reflections are entangled with the perspectives and voices of different people, be it those who agree with me, those who criticise, or those who praise me.’

‘The 17th-century French philosopher [René Descartes] believed that a human being was essentially self-contained and self-sufficient; an inherently rational, mind-bound subject, who ought to encounter the world outside her head with skepticism.’ The only thing I can say for certain is that I am, because I am the entity able to conceptualize it. The rest of the world could be a dream -but not the dreamer… So this leaves the effects of anything else on us in a sort of limbo.

Of course others have tried to get around the problem: the 20th-century Russian philosopher Mikhail Bakhtin ‘believed that it was only through an encounter with another person that you could come to appreciate your own unique perspective and see yourself as a whole entity… Nothing simply is itself, outside the matrix of relationships in which it appears. Instead, being is an act or event that must happen in the space between the self and the world.’

I love that. It suggests that we derive our identity -our very existence as that identity- through our interactions with, and recognition and validation by, others. Think of people in solitary confinement in prisons: ‘studies of such prisoners suggest that their sense of self dissolves if they are punished this way for long enough. Prisoners tend to suffer profound physical and psychological difficulties, such as confusion, anxiety, insomnia, feelings of inadequacy, and a distorted sense of time. Deprived of contact and interaction – the external perspective needed to consummate and sustain a coherent self-image – a person risks disappearing into non-existence.’

And it’s not just in prisons we can disappear. I met her at a bus stop -or, rather, she met me. I happened to be the first person in a lengthy, but orderly, queue waiting in the rain for a long overdue bus.

“I was actually first,” she said, staring at me defiantly. She was well dressed in a grey skirt, and I could just see a frilly white blouse under her upmarket raincoat. Her short, dark hair was barely mussed by the wind and rain it was now enduring. “I was waiting over there… Out of the rain,” she added, as if to prove her point.

I smiled pleasantly at her as the bus pulled up. “I should have done the same,” I said, furling my embarrassingly inadequate umbrella.

“I didn’t want you to think I just came along, you know,” she persisted. “Sometimes people get really upset when they’ve been waiting in line…”

The way she said it made me think this probably wasn’t the first time she’d crashed a queue. “No, please go in front of me,” I said, trying to show I was not upset. “It’s raining and you don’t have an umbrella.”

She promptly boarded the already crowded bus, then signalled me to sit beside her on one of the only remaining seats. “Do you live in the city?” she asked as soon as I was settled. She looked anxious.

I nodded politely, thinking she was just looking for a way to start a conversation.

“A house…?”

I nodded again, but her eyes immediately landed painfully on my face.

“Landlord, or renter?”

“Excuse me?” It seemed like a trick question, and I was immediately wary.

“You live in a house,” she said slowly and carefully, as if I was hard of hearing.

I nodded, carefully.

“Are you renting it?”

This was getting uncomfortable. “Why do you ask?”

Her eyes scratched at my face for a moment, before flying off again. “Because the city is trying to institute rent controls.” She frowned as she said it.

I brushed her cheek with a quick glance before I stared at my lap. I still wasn’t sure why she was asking. “Do you think that’s a good thing…?” I asked, trying to seem tentative.

“Oh yes!” she hurried to answer. I could almost feel the exclamation mark hovering between us. “Landlords shouldn’t be able to take advantage of their tenants.”

“So, I take it you are a renter?” I said kindly.

“Of course! I’m a single woman now. I’ll never be able to buy a house…”

“Do you like the place you rent?” I asked, trying to change the subject a bit.

She blinked at me, probably wondering if I was trying to trap her, but she relaxed a little. “It’s a bit of a hovel, really. The fridge makes a noise and only one of the burners on the stove works. The walls need some paint, and the rug is frayed…” She sighed and fiddled with a button on her coat. “But it’s the only place I could find.” She looked at the person sitting in front of her for a moment. “I thought it had promise when I first saw it, though…”

I had obviously unleashed something.

“The landlord says he wants to fix it up.”

“Certainly sounds like it could use some work,” I said, smiling.

She glared at me for a moment, and then softened her expression. “That’s exactly what he said.” She glanced out of the window at the rain. “But then he said he would have to raise the rent to pay for it.”

“Do you think that’s fair?” I asked.

I could see she was about to say ‘no’, but she changed her mind and turned her head to look out of the window again. Finally, she shrugged. “We’re both caught, aren’t we? On the one hand, I don’t want to pay more, but on the other, I’d love to see the place fixed up.”

“But if rent controls come into effect, he’s also in a bind, isn’t he?”

She nodded sombrely. “I mean, they’re probably a good idea, but…”

“But they don’t work for you or your landlord…”

She sighed again, and then shrugged.

“Of course there is a way out, isn’t there?” I said, thinking I was just stating the obvious.

She nodded. “Let him fix the place, and pay more rent.”

“Could you not come to a fair compromise with him about the price? Or maybe agree to a gradual increase over, say, a year, or something?” I smiled conspiratorially at her. “After all, rent controls only kick in if you complain.”

I could see her eyes widen as she thought about it. She nodded her head, slowly, and a smile quietly spread over her face. “He’s actually a decent guy…” Suddenly she reached for the pull cord. “My goodness, we’ve been talking so much I almost missed my stop,” she said as she stood and squeezed past me. And then she turned to face me as she struggled through the people standing in the aisle. “I’m so glad I talked to you,” she said. “Thank you,” was the last I heard from her as she disappeared through the sea of dripping coats.

Sometimes it’s good to talk about your problems, I thought, and smiled to myself, glad that I might have helped her. ‘A person is a person through other persons’ -wasn’t that the Zulu phrase Abeba Birhane had quoted in that article…?