Errare humanum est

After so many years distant from my university Philosophy courses, I have to admit that I’d come to believe that rationality is a process designed for avoiding mistakes. That to err is to have made a miscalculation in its undertaking. And given that we humans are prone to frequent miscalculations -or, to adopt the aphorism of our time, fall prey to unintended consequences- what does that say about our acumen, let alone our wisdom? Does our seemingly inherent ability to take the wrong path or deviate from the planned course of action mean that we are too easily distracted? Too readily deceived? Or that we weren’t designed to act rationally?

These failures suggest that, far from being rational, we are, at best, credulous about our abilities… or do they? To be able to be deceived, it is necessary to have arrived at some sort of expectation of what is correct or appropriate in the first place. One cannot be fooled if one doesn’t understand anything about what is happening. In a way, then, the ability to err suggests that one has already developed a theory about how it should be -that the failure was not meaningless, in other words. Reasoning that comes to a different conclusion than one that has been widely accepted may still be reasoning.

In a democracy, there are usually several options from which to choose, but the outcome of a vote does not mean the other choices were wrong. It does not invalidate them, nor imply that they were irrational -it merely postpones their serious consideration to another time. That things change over the years does not negate the past; it does not suggest that those living in those benighted years were unable to think properly.

Many of these thoughts were highlighted in a somewhat abstruse essay I came across in Aeon written by Daniel Ward, a lawyer and PhD candidate at Cambridge University: https://aeon.co/essays/i-think-therefore-i-make-mistakes-and-change-my-mind

He writes of a dog watching a card trick being performed. ‘It will just ignore what it perceives as meaningless markings on bits of cardboard. Hence it is immune to deception.’ It has no idea what to expect, because it has no idea what is going on. There is no error in the dog’s mind, presumably, because ‘Susceptibility to error validates rather than detracts from rationality.’

For example, ‘Those who study the human visual system also draw a link between the capacity for error and the capacity for thought.’ But, the ability to be fooled by an optical illusion ‘demonstrates the success rather than the failure of the visual system. That your brain occasionally makes this kind of mistake is testament to the fact that it is doing complex, intelligent things that go beyond merely absorbing incoming sensory data. The antithesis of the view that normal, intelligent people are susceptible to error is a view that treats people as infallible.’ And we certainly aren’t that: ‘incapable of error in a wide range of matters, ranging from day-to-day decisions about how we spend our money to ideological commitments… Treating an individual’s attitudes and preferences as givens – as matters beyond debate or criticism – might seem to promote human dignity by forcing us to treat all views as equally worthy of respect. But such an outlook is likely, if anything, to have the opposite effect. This is because taking seriously a person’s capacity to make mistakes is critical to taking seriously their capacity for rationality. Only by recognising that people are capable of error can we properly value anyone’s goals or engage in rational debate.’

After all, if we had to assume that a rational person with whom we disagreed could not have made a mistake in their reasoning, then we could not depend on an intelligent debate to resolve the issue -only force. No, rationality does not preclude error in and of itself… And that’s okay.

“You do realize that I’ve put my shopping bag on there, don’t you…?” The elderly lady glared at me, and made no effort to move the bag from what I could see was the only empty seat on the bus.

Her statement was obviously correct, and I had neither the desire nor the rhetorical skills to contradict her assertion. I did, however, want to sit down. It had been a long day, and an even longer wait for the already crowded bus.

I decided to meet her challenging expression with a smile and a shrug, but to show her I hadn’t really given up, I continued to stand beside the almost-empty seat and waited for guilt to wreak its havoc on her conscience. Unfortunately she retrieved her eyes and sent them to scout the scenery outside her window. I was just another tree in a forest she did not deign to enter.

I sighed and was about to resign myself to a journey spent swaying on my feet, when I suddenly remembered something, and decided to try my luck again. “I imagine your bag is quite heavy,” I started, pretending I just wanted to engage her in idle conversation. Actually, I was hoping to cash in on a program about logical argumentation in a podcast I’d downloaded from the BBC.

She dragged her eyes back from the window and plonked them on one of my ears. Her lips said nothing, but her face told me to mind my own business.

“My backpack is also heavy,” I continued, hoping I could build on the premise. “And,” I added, trying to twinkle my eyes, “there’s a bit of room left on the seat…” I cleverly added the ellipsis to show there was a conclusion inherent in my prologue.

Her eyes continued to grill me, but her forehead was beginning to wrinkle -so were her lips, for that matter. “And you think that I will be convinced by a faulty syllogism?”

“Which premise was faulty?” I suddenly realized that my memory of the podcast was sketchy at best, so I hoped I had understood the thrust of her rebuttal.

A tiny smile appeared on her face. “It was more the assumption that my bag was heavy, than that because there was room left on the seat, your also-heavy backpack deserved a place beside it.”

I thought about that for a moment. Had I flubbed the first chance I’d had for engaging in a public rhetorical challenge? Had I wasted the podcast?

I must have looked perplexed because her smile suddenly blossomed and she feathered her shopping bag onto her lap as if it were almost empty. “You passed the test,” she said and chuckled.

“Test…?”

Her eyes tapped briefly on my face and then flew off to other perches on the outside of the window. I wondered if she’d read the same article in Aeon.

Truth hath a quiet breast

What makes something ‘real’? For that matter, what does that even mean? Is a character in one of my favourite books any less real than what I remember of an uncle my family used to visit when I was a child? I used to wonder about that until I was old enough to be able to transition from pretending the space underneath the bed was a fort, to the understanding that it was somehow actually -and ‘really’- just a bed.

But imagination -so important to a child at play- assumes a different purpose as we age. It continues to offer an escape from the world around us perhaps, but in the cognitively unimpaired, begins to wear the patina of context -its potential seldom all-consuming, its boundaries identifiable.

And yet, for an adult living in a different perceptual Magisterium, the innocence of a child’s beliefs and the questions arising from them can be difficult to answer in kind. Once the heavy obligations of maturation have hardened the boundaries, even words may require translation, and unintended metaphors may have consequences.

I came across an interesting essay on this in Aeon in which a philosopher from Florida State University, Nathanael Stein, was wondering how to answer his young son’s queries about reality: https://aeon.co/essays/can-a-philosopher-explain-reality-and-make-believe-to-a-child

The difficulty seemed to be in deciding just what his son wanted to know. Was it simply a variation of the universal ‘Why?’ question, or something more deeply probing about reality itself?  As he notes, ‘there are surprisingly many ways of distinguishing what’s real from what isn’t. One of the most familiar contrasts we draw is between reality and appearance… reality is sometimes contrasted with what we might call mere appearance, like the motion we create on screens: pixels are turning on and off, and changing colour, so there’s change going on, but nothing that seems to be moving really is. This is different again from the kind of illusion of motion we get from certain patterns.’

We also distinguish ‘what’s real from what’s merely imagined or dreamt… what has existed at least at some time from what never has. Dinosaurs and ancestors are real in this last sense, but unicorns aren’t.’ His young son, though, was perhaps only trying to differentiate between what was ‘really’ real and what was only pretend-real, or make-believe.

Stein then goes on at length discussing which of the several varieties of reality his child was probably puzzled about, but ends up wondering if philosophy could ever solve the riddle for a non-adult. In fact, his concluding sentence seems to concede this point: ‘My son is only four, and by the time he’s able to explain what he means by Why?, he’ll have forgotten what puzzled him – if he hasn’t already.’

Stein’s difficulty in understanding the Lebenswelt of his son reminded me of a lengthy discussion I had many years ago with my similarly aged daughter.

“Daddy, what’s a ‘stralyer’?”

My daughter had a habit of coming up with sounds, part-words, and checking them out on me.

“You mean trailer, don’t you sweetheart? It’s a thing on wheels that you pull behind you…”

I could see a sly look come over her face as she prepared to correct me. “That’s a wagn, silly.”

Pronunciation was never a strong point with my children. “I asked you about a ‘stralyer’…”

Catherine was only about three feet tall then, so it was hard to look her in the eye without considerable effort. She also insisted on wearing at least one of her golden curls on her face -to hide behind if necessary. She wasn’t hiding, however, so I crouched down as best I could and tried to read her expression. Actually, I was trying to read her lips. She repeated the word with me about six inches away and nose level, but it didn’t help much.

“Where did you hear the word, Cath?” Sometimes you can trace these things.

“From Michael.”

I waited for an explanation, but Godot would have arrived before she caught on. “And what was Michael talking about?” I finally asked.  Michael is my son, and he was terribly precocious for nine, I think. His questions were worse, though, because I understood them.

Catherine looked at me as if I were inordinately dense. “About a ‘stralyer’, of course.”  Sometimes I saw too much of her mother in her, with her hands on her hips, one foot tapping impatiently, and an expression of utter condescension nailed to her forehead. Only with Catherine, it looked benign -comical, almost. They lived with their mother then, so I supposed neither of them would adopt any of my mannerisms.

Children are tautological creatures; they have the good sense to stick to their guns when all else -adults, by and large- fail them. “Ahh, you don’t happen to know what else Michael said, do you?”

She nodded her head vehemently, convinced she was getting somewhere at last.

“Well..?”

She just looked at me. Sometimes I wondered if she was really four, or whether she had forgotten something somewhere around two and a half.

Finally, she got the idea. “He said it was under something.”

That’s what I like about Catherine: just like her mother, she remembered only things that stick out: a flower outside a thousand-year-old French cathedral, the smell of Machu Picchu, the colour of the mud in Manaus… Context, for her, was merely the background against which the really important things were displayed.

“I don’t suppose he happened to mention what it was under, did he?”

She was silent for a moment -no mean feat for Catherine- and then a smile lit up her face and her eyes grew large. “Under the water, I think…”

There are only so many things that sound like trailer and are under stuff -especially water. I took a stab at it. “Australia?” I said in my best adult voice.

“That’s it, Daddy… What is it?”

“Well,” I said, not entirely sure how much she wanted to know, “it’s a country.”

“But we live in a country…”

“Yes.” I also nodded, to give it added strength.

I could see her playing with it for a while before leaving it on whatever shelf she filed such things -Catherine’s face was a movie screen sometimes. But after a minute between shows, I could see a new thought growing. “How many countries are there, Daddy?”

That’s a good question, actually. Does anybody know? I was so relieved that she hadn’t asked me what a country was that I offered to look it up. “Have you ever seen an atlas, Cath?”

A new word! She perked up immediately. “Anatlus? Nope… Is it what reindeer wear, Daddy?”

Where do kids get their ideas nowadays?  “Antlers are what reindeer have, Cath. Atlas is what I’m going to use to count the number of countries,” I said, but I don’t think it stuck. I think she liked the idea of finding countries on reindeer heads.

“But don’t the reindeer have to know where they’re going?”

“Huh?”

“You know. On Christmas eve.”

Actually the thought had never occurred to me. I guess I just figured they did it by the stars, or that Santa kind of navigated by instinct, or something. Kids aren’t satisfied with the old stories anymore. “Ahh, well maybe if you looked at the atlas you’d understand what I mean.”

Her eyes positively sparkled. “You mean you have some reindeer here?” She looked wide-eyed around the room, expecting to see a nose pop out of a closet any moment, I’m sure.

“Cath, we don’t seem to be getting anywhere. Just wait here, okay?” I went into the den and rummaged around for the atlas. It was an old Reader’s Digest variety -you know, the solar system in the front few pages, then what each country did for a living and how many people lived there, at least in 1969. The rest was a smorgasbord of colours and names that brought back painful recollections of Miss Pleasance in Grade 4 or 5 and having to pronounce them in front of the whole class from memory. I could never say ‘Afghanistan’ and everybody would wait for it and laugh. Not Miss Pleasance, though. It’d just get me another turn the next day. I hated geography.

When I returned, Catherine was prowling through the cupboards and sniffing. I didn’t ask why. “This is an atlas, Cath,” I said proudly, holding it in front of me like a jewel.

She took one look at it and her face lost interest. “That’s just another book, Daddy,” she said, her voice pleading with me to say I was kidding.

“Just another book?” I pretended to be hurt. “Catherine, this is a genuine, nothing-else-is-remotely-like-it Reader’s Digest version of the world.”

Her eyes resumed their dinner-plate imitations and her mouth fell open. “The world! In there?” I had the sinking feeling that I’d lost again. “Lemme see,” she said grabbing the book firmly, but reverently from my hands.

I was pleased to see that she at least started from the front, but she whipped through the solar system at a breakneck pace and was halfway through the gross national product of the Netherlands before she slowed down. “Awhh…” She leafed through a couple of pages of countries outlined in their pale reds and yellows, crammed with lines and unreadable letters, and put the book down gently on the table. She looked at me -sadly, I thought- and shook her head. “Daddy,” she said slowly, sounding for all the world like she was choosing her words carefully so as not to offend me. “Daddy, did you pay a lot for the anatlus?”

“Atlas,” I corrected as gently as I could. “No, not a whole lot. Why?”

“Well… I think you got gypped.”

“Huh?”

She stared at me and sighed with a little shake of her head -just like her mother used to do. “I saw the world on T.V. and it’s different.”

She was right, you know. And I’ll bet they pronounced Afghanistan correctly, too.

Wearing Life but as the fashion of a hat

Every once in a while I find that I am confronted by an idea which, even were I to have thought of it first, I would have put aside as of little relevance -or worse, of little consequence.

Clothing has always been one of those for me: it’s something you wear, not something you are. And despite the desperate claims by Fashionistas that it reflects an inner self -or at least would, if you let it- I’ve always found the argument largely specious: to reword Samuel Johnson’s quip about marriage, a triumph of hope over expenditure.

And yet, I was drawn into an essay about clothes -albeit reluctantly- written by Shahidha Bari, a lecturer in Romanticism at Queen Mary University of London, for Aeon: https://aeon.co/essays/why-does-philosophy-hold-clothes-in-such-low-regard?

I have to admit the article was not at all what I expected: I was neither deluged with praise for couture, nor subjected to shaming for my sartorial insouciance. At first, I was merely confused by her fascinating ruminations about clothes: ‘Ideas, we languidly suppose, are to be found in books and poems, visualised in buildings and paintings, exposited in philosophical propositions and mathematical deductions. They are taught in classrooms; expressed in language, number and diagram. Much trickier to accept is that clothes might also be understood as forms of thought, reflections and meditations as articulate as any poem or equation. What if the world could open up to us with the tug of a thread, its mysteries disentangling like a frayed hemline?’ What an utterly fascinating thought that what we wear is not merely a passive display, but has a voice of its own.

‘What if clothes were not simply reflective of personality, indicative of our banal preferences for grey over green, but more deeply imprinted with the ways that human beings have lived: a material record of our experiences and an expression of our ambition? What if we could understand the world in the perfect geometry of a notched lapel, the orderly measures of a pleated skirt, the stilled, skin-warmed perfection of a circlet of pearls?’

Do you see why I kept reading? The very idea that clothes have agency in and of themselves is powerful. She goes on to observe that ‘clothes are freighted with memory and meaning… In clothes, we are connected to other people and other places in complicated, powerful and unyielding ways, expressed in an idiom that is found everywhere, if only we care to read it.’

Bari seems to understand that ‘for all the abstract and elevated formulations of selfhood and the soul, our interior life is so often clothed… The garments we wear bear our secrets and betray us at every turn, revealing more than we can know or intend.’

But we cannot hide in clothes -as the poet Kahlil Gibran observes, ‘Your clothes conceal much of your beauty, yet they hide not the unbeautiful’. And Bari goes on to suggest that ‘to entrust to clothes the keeping of our secrets is a seduction in itself.’ I would have thought that this alone would have been fodder for the Philosophers, but as she goes on to explain, ‘the discipline of philosophy has rarely deigned to notice the knowledge to which dress makes claim, preferring instead to dwell on its associations with disguise and concealment.’

She seems to think that Plato had something to do with Philosophy’s aversion to treating clothes as a worthy adversary. ‘Haunted by Plato’s anxiety over how to distinguish truth from its ‘appearance’, and niggled by his injunction to see beyond an illusory ‘cave of shadows’ to a reality to which our back is turned, philosophy’s concept of truth is intractably aligned to ideas of light, revelation and disclosure.’

Still, in fairness, she turns her spotlight on various other philosophers and notes that although appearance has always been a fair topic for discussion, philosophy has rarely concerned itself with physical appearance or dress. And yet, after a tedious, albeit poetically expressed, litany of the views on clothes of characters, both fictional and academic, she concludes with a one-sentence précis that I think might have made her point much sooner: ‘Philosophy might have forgotten dress, but all that language cannot articulate – the life of the mind, the vagaries of the body – is there, ready to be read, waiting to be worn.’

I did enjoy her metaphors and evocative language, and I have to admit that, until the latter half of the journey, I was swept along quite contentedly in the current of her thoughts. It reminded me of a recent conversation between two women, both laden with large cloth bags, who plonked themselves down beside me on a couch that break-watered the teeming throng of shoppers in a downtown mall. Both were middle-aged, and both spread themselves out as if I wasn’t there.

I’m not keen on being jostled on a seat, and was about to launch myself into the chaotic tide of passing elbows when I saw the woman next to me pull some garish fabric partly out of her bag to show it to her friend.

“What d’ya think Jesse?” she asked, stuffing whatever it was back in her bag once Jesse had seen it.

Jesse looked frazzled by the crowds, and her once-coiffed, greying hair floated in little strands from her head while her eyes stayed anchored on her face. “Colour’s interesting, Paula…” she said, after a noticeable pause.

“It’s a statement, Jess…” She relaxed her buxom frame further into the couch and settled an elbow into my rib without seeming to notice the infringement. “I think it’s time people noticed me.”

Jesse blinked and a weak smile surfaced on her lips for a moment. “I don’t think you need the hat, dear,” she added, as tactfully as the situation allowed.

I could see Paula’s eyes harden, and then the pressure on my rib cage lessened briefly as her hand searched for a pocket in her incredibly wrinkled ankle length coat for a Kleenex. She blew her nose untidily and then tried to stuff what was left of the tissue back in the coat somewhere, and her elbow back into my side. “What are you saying, mirror-child?” she shot back. Clearly they were both tired, but I was beginning to enjoy the exchange.

“Just that you don’t have to wear a sign to attract attention…”

Paula’s face somehow retracted further into itself and her eyes peered out through the bars of their lashes like caged animals. And then, just as suddenly, her expression softened, and she shifted the position of her elbow again. “Oh, you mean that blouse, I bought…?” A smile darted onto her lips and stayed there like a runner that had made it safely to second base. “It’s really more me, isn’t it?”

Jesse’s eyes twinkled mischievously as she nodded. “But I don’t think you should wear them together, do you…?”

I could feel, as well as see, Paula sigh. “You’re right, dear,” she said, as they both struggled to their feet. “I’m someone else with the hat on, aren’t I?” Another smile surfaced briefly, like a seal. “But it’s always nice to have a choice, Jess,” Paula added, hefting her bag onto her shoulder. Then, pulling her friend with her free hand, they both stepped into the ever-passing flood like branches falling together in a river and were swept away.

I think you learn a lot about philosophy in malls if you’re patient…

Should Life be a walking shadow?

I have to admit that I had not heard of the ‘attention economy’ before -never even thought about it like that, in fact. And yet, when I think about attention, I suppose I’ve always heard it used as a currency, a thing that was paid to a specified ‘other’ -the thing attended to, in other words. Inattention was no attention at all; it was an aimless, uncontrolled drift that gathered nothing of importance, and hence was to be discouraged; it was the kind of activity that, if noticed, would inevitably evoke a ‘pay attention’ rebuke from the teacher in a classroom, thus reinforcing the idea that there was only one flavour of the concept: the directed attention.

Indeed, capturing attention is the way business works, isn’t it? Getting you to think about a particular product when the need arises, and making you aware of its attributes and benefits, is the only way to interest you in buying it. We tend to segment reality into little fragments that we grasp, one at a time, to sew into the fabric we call our day; from the moment we awaken until we slip, often exhausted, into sleep, we wear a patchwork quilt of attendances as our realities. Only in retrospect, should we ever choose to examine them, are we able to recognize the qualitative dissonances we have accepted as seamless.

Perhaps the perceptually dissolute may decide to comb the day for information, but somehow that act seems as bad as stripping a sentence of all its adjectives to get at the nouns and thereby losing the colour and vibrancy each was intended to wear. Sometimes attention misses the beauty of the forest by examining the trees too closely. At any rate, I’ve often thought there must be more than one variety of ‘attending to’…

I can’t help but notice people on the street walking past me wearing earphones, or staring at their phones, oblivious of my presence unless we bump. Do they see the series of comic faces in the clouds, or the hawk circling high above the little park? Do they notice the missing brick in the wall of the hospital across the street, or the sad old man leaning on the post outside the Emergency department? Would they stop to smell the flowers in the little pot beside the barbershop; do they wonder if the owner takes it in each night so it doesn’t disappear? Do they even care?

Do they feel the wind on their cheeks, and hear the rustle of leaves as it fights its way into the meadow? Do they see the squirrel that stares at them from a branch above their heads and wonder which tree he calls home? Even more important, do they notice that there are fewer birds singing in the trees above the trail in the nearby woods, and more litter along the way?

I suppose they are entangled in another, more important, world -and yet it’s probably the same one as mine, but without the distracting detours that make the journey as important as the destination. Of course I realize that few of us are monolithic, or so wedded to each moment that we are not tempted by diversions from time to time; we all daydream, I imagine, although some of us are more open to it than others; some of us are not wracked by guilt at the imagined loss of time that should have been better employed.

We have all, no doubt, travelled to someplace new to us, and been completely absorbed in the novelty of discovery: the unusual smells, the strangely loud buzz of traffic, or maybe the unanticipated imaginative architecture, or the flash of unfamiliar clothes hurrying by on unexpectedly familiar bodies. In those moments, we are immersed in the experience and only when the shock wears off do we re-emerge to attend to particulars and grasp at purpose.

Yes, I know I am not alone in seeking different ways of defining what it is to pay attention, but I have to say that I was delighted to find that someone had actually written an essay about it:

https://aeon.co/ideas/attention-is-not-a-resource-but-a-way-of-being-alive-to-the-world

Dan Nixon, a freelance writer and senior researcher at the Mindfulness Initiative in England, writes that, ‘Talk of the attention economy relies on the notion of attention-as-resource: our attention is to be applied in the service of some goal.’ So ‘Our attention, when we fail to put it to use for our own objectives, becomes a tool to be used and exploited by others… However, conceiving of attention as a resource misses the fact that attention is not just useful. It’s more fundamental than that: attention is what joins us with the outside world. ‘Instrumentally’ attending is important, sure. But we also have the capacity to attend in a more ‘exploratory’ way: to be truly open to whatever we find before us, without any particular agenda.’

‘An instrumental mode of attention… tends to divide up whatever it’s presented with into component parts: to analyse and categorise things so that it can utilise them towards some ends.’ There is also, however, an exploratory way of attending: ‘a more embodied awareness, one that is open to whatever makes itself present before us, in all its fullness. This mode of attending comes into play, for instance, when we pay attention to other people, to the natural world and to works of art.’ And it’s this exploratory mode that likely offers us a broader and more inclusive way to experience reality: an attention-as-experience. In fact, it is probably ‘what the American philosopher William James had in mind in 1890 when he wrote that ‘what we attend to is reality’: the simple but profound idea that what we pay attention to, and how we pay attention, shapes our reality, moment to moment, day to day.’ And ‘It is also the exploratory mode of attention that can connect us to our deepest sense of purpose… the American Zen teacher David Loy characterises an unenlightened existence (samsara) as simply the state in which one’s attention becomes ‘trapped’ as it grasps from one thing to another, always looking for the next thing to latch on to. Nirvana, for Loy, is simply a free and open attention that is completely liberated from such fixations.’

I like the idea of liberation; I cherish the notion that by simply opening myself to what is going on around me, I am, in a sense, experiencing what the French mystic Simone Weil called ‘the infinite in an instant’. It sure beats grasping at samsara straws.

Tis in ourselves that we are thus or thus

I must have learned a bit about phenomenology in Philosophy courses at university, but except for the fact that it has something to do with lived experience and consciousness, I have pretty well forgotten almost everything about it in the intervening years, I’m afraid. The name alone was enough for it to merit a place of its own in a dark corner of a barely reachable shelf inside my brain somewhere. Strange names like Husserl and Heidegger stand guard, but in all that time, they were relatively undisturbed by any neuronal probes -any interest whatsoever, in fact.

And now, in my yellow leaf, I’ve stumbled upon it once again, but this time in the context of health, ironically. Given that phenomenology purports to concern itself with experience, and nurses would like -and in fact, need- to understand the subjective experience of those under their care, it seems like a good, if somewhat awkward, fit, I suppose.

After more than 40 years in Medicine myself (as a specialist in Ob/Gyn), I recognize that it would be an advantage for all of us who deal with people with health needs to understand how those individuals experience their worlds. But an essay written by Dan Zahavi, a professor of philosophy at both Oxford and the University of Copenhagen, helped me to realize how nurses, especially, might benefit by looking at it from a more phenomenological perspective: https://aeon.co/essays/how-can-phenomenology-help-nurses-care-for-their-patients

‘By being interested in patient experience and striving to understand people’s experiences of health, illness and care, the discipline of nursing might have more affinities with the social sciences and its qualitative methods than with medicine and its reliance on the quantitative methods of the natural sciences. Indeed, if the aim is to provide proper care for, say, stroke patients, or patients with diabetes or Alzheimer’s disease, it is important to have some understanding of what it is like, subjectively, to live with such conditions, just as it is important to understand the meaning that patients attach to the events that disrupt their lives.’

This is not to diminish the role of Medicine in any way, but merely to suggest that Nursing and Medicine each have complementary roles in the provision of care. After all, ‘This focus on patient experience isn’t simply about monitoring (and increasing) patient satisfaction. It is about obtaining information that will allow for more adequate healthcare… one reason why nursing science became interested in phenomenology was precisely because the latter was seen as a resource that could bridge the gap between research and practice… It might, in short, help to ensure that the academic field of nursing research actually led to an improvement of nursing practice.’ Medical practice as well, but for now, let’s stick with Nursing.

The issue, however, is not to become too entangled with the competing nuances of the various philosophical movements that call Phenomenology home. Does it really matter, for example, that the philosopher Heidegger stressed ‘the ontological difference, inauthenticity, solicitude, average everydayness, thrownness and fallenness’ -whatever in the world that means? Or that Jonathan Smith (a psychologist) ‘has argued that his own approach, which is called Interpretative Phenomenological Analysis (IPA), is phenomenological because it seeks to ‘explore the participant’s view of the world and to adopt, as far as is possible, an “insider’s perspective” of the phenomenon under study’?

How about Max Van Manen distinguishing ‘what he calls the heuristic, hermeneutic, experiential, methodological, eidetic, ontological, ethical, radical and originary reduction as important elements of the phenomenological method’? I mean, come on, eh?

As Zahavi sees it, ‘nursing research’s current use of phenomenology faces three challenges: it risks being too superficial by mistakenly thinking that phenomenology is simply about paying attention to experience; it risks being too philosophical by employing too many theoretical concepts with little clinical relevance; and it risks being misled by misguided methodological requirements.’

But, shouldn’t it be enough to extract what value you find in viewing the world from the point of view of the person under your care -call it what you will? A balance, please: a just-right-baby-bear, Goldilockean approach would do just fine, thank you.

As a now-retired doctor, I have worked with nurses all my career; we have always worked as a team, each with subtly overlapping roles, and yet I blush to admit that it wasn’t until I required a minor surgical procedure that I truly appreciated the difference.

One cold night, as I lay in bed with the covers pulled up to my chin for warmth, I noticed some lumps in my neck. Subsequent specialist medical consultation did little to reassure me -despite the delicacy and empathy with which the differential diagnosis was outlined for me. To further clarify whether the lumps were indeed malignant, as the consultant expected -and if so, their origin- a surgical biopsy would be required.

A speedy diagnosis was deemed essential so that treatment, if necessary, could be started as soon as possible. But there was apparently no expeditiously suitable time available in the operating theatre, so the consultant surgeon agreed to do it under local anaesthetic in the outpatient department of the hospital within the next day or so. That was fine with me -I just wanted a diagnosis.

What I hadn’t anticipated, however, was just how very anxious I would feel as I lay in one of the same rooms -and maybe on the same table- where I had performed many of the gynaecological procedures so common in my own practice. I knew the surgeon, and we talked pleasantly enough about our lives, and how often our specialties intersected. I knew he was trying to be empathetic and set me at ease, but we both realized there was an unbridgeable gap that separated us now, no matter the care we both took to disguise it: I was the patient -and not just a colleague. It’s difficult enough to be a patient, but perhaps even more so when the roles are suddenly reversed.

I knew the nurse in the room, of course -she had helped me on many occasions with the procedures I had booked in the department. But that day, her eyes were seldom far from mine, even though she was helping the surgeon set up some of his equipment. I could sense her concern whenever our eyes met -she’d always been attentive when she’d helped me before, and yet it was subtly different this time: she was dividing her attention between helping the surgeon and making sure I was okay.

But I wasn’t; I was terrified, although I tried my best to disguise it. Even though the local anaesthetic was working, I could still imagine what the surgeon was doing because of the subtle pressure changes I could feel on the skin distant from the lumps -you can’t freeze an entire neck. I tried not to tense any muscles in the area, but I suppose panic was starting to set in…

Suddenly, there it was: a hand gently grasping mine. The warmth of it, skin to skin, was soothing, reassuring, and although I couldn’t turn my head to look, I knew it was the nurse. I also realized she was aware of what I was going through –she had been all along, I sensed. She was living it herself in a way.

Until that moment, I don’t think I really understood the true value of rapport in caring for people. Of course I often used touch to reach out and connect with others in my own practice: on morning hospital visits to my patients after surgery or with new mothers and the babies I had helped deliver, and frequently in the office just to show anxious and fearful patients that I was listening and would try to help… That I wasn’t just a voice from the door, or on the other side of the desk.

And yet, that reassuring hand during the biopsy taught me something else: that there is more to compassion than a reassuring smile, more than just an offer of help. Care involves trying to understand what the other person is going through, and guiding them thoughtfully and kindly along the way. We can probably never really know the pain of another, but we can let them know we are trying.

If that is what Phenomenology offers, then by any other name, it would smell as sweet…

Let shame say what it will

Call me overly sensitive, but I don’t like to be shamed. There, I’ve said it. I suspect it is because shaming causes me to think less of myself: to feel humiliated, demeaned. And yet, there is another side to humiliation that seems to hide in the shadows: the feeling of humility – ‘This amounts not to thinking less of yourself but to thinking of yourself less. The person so ‘humiliated’ becomes less self-centred: her ethical concerns bear witness to a kind of revolution through which her own private and peculiar desires lose credence and authority, a diminution that finally allows her to take notice of what is positively owed to others.’ -so writes Louise Chapman, then a PhD candidate in Philosophy at Pembroke College, Cambridge, in an essay on shaming in Aeon. https://aeon.co/essays/on-immanuel-kants-hydraulic-model-of-moral-education

‘Has the behaviour of another person ever made you feel ashamed? Not because they set out to shame you but because they acted so virtuously that it made you feel inadequate by comparison.  If so, then it is likely that, at least for a brief moment in time, you felt motivated to improve as a person.’

Perhaps, in the embarrassing circumstances of the moment of humiliation, I never stopped to think about it very deeply, but operating behind the scenes was a type of hydraulic system whereby ‘the elevation of one desire in a closed system causes a proportional diminution in another… the 18th-century German philosopher Immanuel Kant presents it as a useful metaphor for capturing the seesawing nature of real psychological forces. In his view, the subordination of self-interest removes, or at least diminishes, hindrances to willing the good. For Kant, the denigration of one’s pathological interests is thus tantamount to removing barriers to acting well. This pivotal mechanism of moral education could be classed as a form of sublimation or diversion, whereby inappropriate desires are channelled into higher pursuits.’

In more recent times, it was Sigmund Freud ‘who claimed that psychic energy can be redirected from lower aims to higher ones, at least when the patient herself recognises that the desiderative drive imperils her.’

This is where exemplary individuals come into the picture. ‘These are people who have the ability to cause profound shifts in the motivational landscapes of their spectators.’ But the exemplars should serve not as models, but only as proof that it really is possible to act in a better way.

Social Media nowadays provides an instructive example. It is tempting to rid ourselves of -to unfollow- those who continually post their successes, and yet ‘while they can stir up the pains of comparative humiliation, in so doing they strike down our tendency towards intellectual and physical torpor, thereby inspiring us to action.’ This could be termed a form of ‘appraisal respect’. We don’t have to engage with them, only to bear witness -and appreciate that we are not being manipulated if we see some merit in their success as an example for ourselves. In theory, at least, ‘Once the spectator has been shamed by the exemplar’s behaviour, external examples of morality are no longer necessary for continuing moral progress.’ Moral hydraulics.

Comparisons with others merely remind us of what we ourselves are capable of -and what, with continuing practice, we can find ourselves achieving. But we do need reminders from time to time.

Take the old man I saw leaning against a lamppost on a main street in downtown Vancouver. It was a typically cool, wet, and windy day in autumn and I was snuggling into my umbrella trying to make the best of it. I almost bumped into him, but when a gust of rain suddenly tore at the umbrella, I jumped to the side in time. Dressed in a dirty brown baseball cap, a torn cloth jacket, and -judging by the cuffs that were rolled up many times- jeans that were obviously too large for him, he still managed a smile at the near collision.

It’s sometimes hard to judge the age of people who frequent the streets, but he looked old, and frail -someone who would have been sitting in a warm room somewhere, had Life not been so harsh on him. He did not have the look of a dissipated life -just an unfortunate one that had dealt him all the wrong cards.

“Spare some change…?” he rasped with an old man’s voice, then coughed as if the effort involved in speaking was too much for him. He sent his eyes to inspect my face, and they hovered over my cheeks like hopeful sparrows looking for a roost, then flittered away when they saw my expression.

I suppose his words caught me off-guard -embarrassed me, perhaps- and I merely pretended to listen, shook my head, and fought another gust of wind as I walked away. My first impression was distrust of the neighbourhood, and yet when I turned, warily -and, in truth, with guilt- to check behind me a few moments later, he was still there, the smile clinging to his face: a default expression – hoping, like its owner, for a reason to survive.

He looked so delicate and elderly that I stopped, uncertain what to do. I was ashamed I had brushed him off so quickly, to tell the truth. His smile, I think, was what had disarmed me -that and the fleeting hope I’d seen written on his face at our chance encounter: an unexpected gift on a cold and blustery day on the street.

Something -perhaps his eyes, still heavy on my shoulders- made me turn to face him. His smile grew and his face crinkled happily at my change of heart. And when I reached him, his hand did not extend as if he expected a reward- just his eyes: two souls searching for my own to touch; two minds joining, if only for a moment in greeting.

I struggled for words, and all I could manage was an apology for being so insensitive. “I’m so ashamed,” I mumbled, reaching into my pocket. “It can’t be easy on the street…” I felt myself blushing as I pulled out the only bill I had -a crumpled ten- and handed it to him. I didn’t want him to think I was just expiating my guilt.

“Don’t be ashamed,” he said, evidently also embarrassed. “You came back… Most people don’t.” And he reached out and shook my hand like a long lost friend.

Looking back, I think he was what we all fear we might become some day. He was my face, in another’s mirror.

Does everything have meaning?

What is the meaning of rain? No, really -what, if anything, does it mean? If we ask the same question of Life, we understand immediately the type of answer required, so what is different about rain? Both are processes, of sorts, although rain has the added advantage of also being a thing -both palpable and visible- I suppose. But should that disqualify it from having meaning?

Meaning is something that stretches beyond the thing described, and expands it in ways perhaps not obvious at first glance: beyond just descriptive definition, beyond attempts at capturing it with a synonym -those are mere tautologies and add little clarity beyond finding other words to say the same thing.

It would be all too tempting to resort to simply describing rain’s cause -its meteorological significance; or suggesting its value in the sustenance of Life -but these would only describe its purpose -what it does- not its meaning. There is surely more to rain than water falling from the sky, just as there is more to Life than growth, reproduction, and change.

No, it seems to me that meaning points to something else, and a grammatical equivalent might be something like a metaphor.

I suspect it was an essay in Aeon by Jeremy Mynott, an emeritus fellow at Wolfson College in Cambridge, that rekindled my wonder about meaning in the world around us: https://aeon.co/essays/the-ancient-world-teemed-with-birds-now-we-think-with-them

As he suggests, ‘Sometimes you need to look at things from outside to see them more clearly.’ And history can do that for many things -birds, for example. Before the days of over-population with its attendant pollution and habitat destruction, the much smaller aggregations of humanity were more intimately exposed to the perils -and beauty- that surrounded them.

‘The Mediterranean world of 2,500 years ago would have looked and sounded very different. Nightingales sang in the suburbs of Athens and Rome; wrynecks, hoopoes, cuckoos and orioles lived within city limits, along with a teeming host of warblers, buntings and finches; kites and ravens scavenged the city streets; owls, swifts and swallows nested on public buildings. In the countryside beyond, eagles and vultures soared overhead, while people could observe the migrations of cranes, storks and wildfowl. The cities themselves were in any case tiny by modern standards – ancient Athens, for example, had a population of about 120,000 at the height of its power in the 5th century BC.’

Things in nature impressed their physical presence on people’s daily lives to a degree now hard to imagine. ‘Not surprising either, therefore, that they also populated people’s minds and imaginations and re-emerged in their culture, language, myths and patterns of thought in some symbolic form.’ Some things -birds in his essay, at least- acquired a meaning beyond their mere physical presence.

Because Mynott is writing about the ‘meaning’ of birds, he goes on to describe how they became metaphors -there is ‘a simple progression from a descriptive fact about a bird (swallows migrate here the same time every spring), to a human comparison (that’s when we change what we wear, too) and then, in a natural and almost imperceptible segue, to making the bird represent something other than itself (a sign of spring, a reminder to start gardening, a valued guest). That is, a metaphor, something that ‘carries us across’ from one dimension of meaning to another.’

I think there is a very obvious parallel with other aspects of the natural world, too -rain, for example. And where he supplies examples of proverbs to bolster his contention that the idea of birds has migrated into the realm of metaphor -‘One swallow doesn’t make a summer’- there is certainly an equivalence in rain proverbs that do the same: ‘You need to save for a rainy day’, or ‘Rain does not fall on one roof alone’.

Metaphors work by having one thing stand symbolically for another, and by so doing, achieve a meaning far larger than the original.

When my children were young and beginning to learn the intricacies of language, they sounded very literal -so much so, that at times it was difficult to explain things to them without endlessly searching for another word to use for clarification: definition again. And yet, often they seemed to be searching for something more than description -and the perpetual ‘Why?’ questions that dog every parent are testament to that. No matter the skillfulness of the answer, it is seldom enough to satisfy their inner quest.

I’m not suggesting that this is necessarily indicative of children’s innate need for meaning so much as simple curiosity born of insufficient exposure to the world -or perhaps incipient mischievousness- but it is interesting that it seems to be a search for more than just a cursory explanation. Perhaps it is a developing awareness that there is more to reality than surface -an early, and tentative, exploration of Philosophy.

“Why does it rain, daddy?” my little daughter once asked. I remember the question because of her drive to understand more about rain.

“Well,” I started, unsure of the answer, to be honest, “… you know how sometimes the air around you feels wet in the summer?” I was on shaky ground already, but I pressed on when she nodded her head enthusiastically. “And sometimes if you look really hard you can see little water droplets on the window glass?”

I have to admit I was making it up as I went along, but her little face seemed so eager for more, I embellished it a bit. “Well, those drops appear when wet air touches something cool like the glass in the window. It’s called condensation,” I added, but more for my sake than hers, I think.

“So, is that where rain comes from, daddy?” She was obviously confused that windows didn’t usually rain.

“Uhmm, no, but it was just a way of explaining that wet air sometimes condenses on cold things, and it’s really cold way up in the sky…”

“So…” I could almost see her processing the information behind her eyes. “So, are there windows up in the sky…?” That didn’t seem right to her, I could tell.

“No, but there are little particles of dust up there, and they’re really cold, so water droplets condense on them. And when there are a lot of them, you see them as clouds…” I was way beyond my depth, so I rather hoped she’d be satisfied with that. But I could see by her face that the machinery inside was still churning.

“So, clouds are rain before it falls…” There, I had told her all I knew about rain -more than I knew, in fact.

Suddenly, a large smile grew on her face, and her eyes twinkled mischievously. “You’re just kidding me, aren’t you daddy?”

My heart sank. We were walking along a trail in the woods at the time, and had stopped to rest in a little clearing; I hadn’t thought to bring an encyclopedia. I can still remember the flowers peeking through the grass like children thinking they could hide in plain sight and I shrugged to hide my embarrassment. “What do you mean, sweetheart?”

She grabbed my hand and looked up at my face. “There’s more to rain than clouds, daddy…”

I tried to look like the wise parent, but she was having none of it.

“Why do you say that, sweetie?” I said and held my breath.

She sighed and rolled her eyes like she’d seen me do so often. Then she pointed to an enormous fluffy cloud that was floating lazily just over our heads. “Miss Janzen at kindergarten says that rain happens when clouds cry…”

I didn’t know whether to nod in agreement -it was a kind of vindication of my explanation- or stay still, in case it was a trap.

She suddenly blinked and stared at the cloud. “You can tell that cloud doesn’t have any rain in it…” I smiled and waited for the explanation. “It looks happy, doesn’t it…?”

I’m not sure, but I suspect my daughter already knew about metaphors, even if she’d never heard the word… and perhaps she’d grasped the meaning of rain, as well…

Tomorrow, and tomorrow, and tomorrow

What is Time, if not a river flowing ever onwards from now -or from an ill-remembered ‘then’ to the same now? Of course, we all know the quotation attributed to Saint Augustine: What then is time? If no one asks me, I know what it is. If I wish to explain it to him who asks, I do not know – but that doesn’t get us very far. It neither allows events to be situated in time, nor allows us to appreciate its passage. Perhaps that’s unfair to ask of a denizen of the fourth century -saint, or no- but an orderly historical conception of time’s progression began long before his birth.

To more fully acknowledge the extent of time, one must be able to measure it -not so much mechanically, as calendrically. And, as Paul J. Kosmin pointed out in an article in Aeon, ‘from earliest recorded history right up to the years after Alexander the Great’s conquests in the late 4th century BCE, historical time – the public and annual marking of the passage of years – could be measured only in three ways: by unique events, by annual offices, or by royal lifecycles.’ https://aeon.co/essays/when-time-became-regular-and-universal-it-changed-history

‘In ancient Mesopotamia, years could be designated by an outstanding event of the preceding 12 months’ -presumably this would make the time frame more easily memorable. The more distant in time events occurred, the more difficult it would be to appreciate any surrounding context. And such a system would be meaningful only to those living in the country or region -so unless its events were spread by conquest or a shared natural disaster, it would be uninterpretable by others.

Finally, though, ‘In the chaos that followed the death of Alexander the Great in Babylon in 323 BCE, all this changed. One of Alexander’s Macedonian generals, who would go on to win an enormous kingdom stretching from Bulgaria to Afghanistan, introduced a new system for reckoning the passage of time. It is known, after him, as the Seleucid Era. This was the world’s first continuous and irreversible tally of counted years. It is the unheralded ancestor of every subsequent era system, including the Christian Anno Domini system, our own Common Era, the Jewish Era of Creation, the Islamic Hijra, the French Revolutionary Era, and so on… For the first time in history, historical time was marked by a number that never restarted, reversed or stopped… Most importantly, as a regularly increasing number, the Seleucid Era permitted an entirely new kind of predictability… to confidently and accurately conceive, name and hold in the imagination a date several years, decades or centuries into the future.’

So, no matter what else happened, the year and all that happened in it was stable -and traceable. Nowadays that may not seem so amazing, but if you think about it, the perception of time itself changes when that happens: ‘Every event must be chained to its place in time before it becomes an available object of historical articulation. And the modes by which we date the world, by which we apprehend historical duration and the passage of time, frame how we experience our present, conceive a future, remember the past, reconcile with impermanence, and make sense of a world far wider, older and more enduring than any of us.’

Of course that’s not to imply the ancients had no concept of travelling through time, but with only remembered time posts as a guide, it made the journey more fraught -more circumscribed. And for many, it must have seemed like they were confined in a room whose walls were events they may not themselves have witnessed. Kosmin quotes a paragraph written by the Norwegian author Karl Ove Knausgård about the introduction of numeric time that describes it nicely: ‘It was as if a wall had been removed in the room they inhabited. The world no longer enveloped them completely. There was suddenly an opening … Their glance no longer met any resistance, but swept on and on through more of the same.’ We take the view for granted.

I remember visiting my grandmother in her final days in hospital. She was approaching her 100th year, and becoming increasingly lost as she wandered along the ever winding trail she’d taken through time. It was often difficult for her to pin down the order of some things, and yet her memory of other details seemed impeccable.

She recounted tales about her early life I had never heard before, but at times they seemed metaphorical -believable only in translation. She meant well, but I suspect that because she found dates elusive, she was trying to compensate with word pictures and comparisons to tell her story. And then, as in pre-Seleucid times, she would necessarily tack her story to past events.

When ‘Do you remember when…?’ didn’t work because the incident was well before my time, she would resort to things like ‘When your mother was a little girl…’ or ‘The year Joe and I got married…’. Then, with no need to worry about correction, she would recount her version of what had happened.

I found it a delightful, albeit opaque, excursion along her personal timeline, but one I could never even hope to verify without considerable effort. Her description of their journey across the country to the west coast on a ‘pioneer train’ as she called it, was a good example.

“Whenever the train would stop to pick up water for the engine,” she said, “it was our signal for the men to jump off the cars and search for firewood…”

I remember her eyes twinkling at the memory. “There were stoves in each car for cooking, so while the men were away, the women would rummage around in their trunks for the rice and beans we were told to pack for the trip.”

I remember being surprised at them having stoves on a moving train -I come from a railway family, and I’d never heard of such a thing. “When did you travel across the country, grandma?” I remember asking her, thinking maybe she meant the train would stop long enough for people to cook at the station, or wherever.

Her eyes looked inward for a while -whether to remember, or relive the experience, it was difficult to tell. “I remember seeing soldiers wandering around on some of the platforms, so maybe it was during the war…” And she shrugged.

That didn’t sound right. “But grandma, soldiers would have been going to the east coast -to Montreal or Halifax -not west to Vancouver…”

Her eyes cleared for a moment and she sent them to reprimand my face for its expression, all the while shaking her head at my inability to follow her story. “The soldiers didn’t get on our train. They were waiting for the next one, G… Try to pay attention, eh?” she added and then sighed like the woman I used to visit when I was young.

I shrugged, embarrassed for doubting her. “So, I suppose that was early in the First World War,” I said, mostly to myself, I suppose. I was trying to establish a time frame that made sense to me -a picture that I could pass on to my own children about her life.

Her eyes, though, were the real storytellers, and at times they seemed impatient as they watched from their increasingly bony redoubt. “I don’t remember the year, G, but you asked me to describe our journey across the country; the year’s not as important as the story, is it?”

Then, she smiled at the scolding and the grandmother of my childhood returned briefly. “You have to open a book to see what’s in it -the cover it’s wrapped in is irrelevant…”

I think my grandmother would have done just fine without the encumbrance of the irreversible tally of counted years.

The colour of truth is gray

It’s back again… Well, actually I suppose it never left. We still seem to be obsessed with the genderization of colours -as if it were an established biological given; as if it were as obvious as handedness, or as necessary as the assignation of gender at birth. ‘Pink is for girls and Blue is for boys’ -its self-evidence is once again being called into question; it seems an endless, pointless cycle.

There have been many attempts to link gendered colour preference to Weltanschauungen, genetic atavisms, and of course, persistent, market-savvy fashion manipulation (even I attempted a commentary in a previous essay: https://musingsonwomenshealth.com/2013/12/06/nature-versus-princess-nurture/) -but none seem adequate explanations for its persistence in our culture. Indeed, those studies that have sought to resolve the issue seem to have canvassed opinions from predominantly western cultures. And apart from the probable sampling bias, there are other factors that likely come into play, as suggested in a 2015 article in Frontiers in Psychology: ‘… red symbolizes good luck in China, Denmark, and Argentina, while it means bad luck in Germany, Nigeria, and Chad (Schmitt, 1995; Neal et al., 2002). White is a color of happiness and purity in the USA, Australia, and New Zealand, but symbolizes death in East Asia (Ricks, 1983; Neal et al., 2002). Green represents envy in the USA and Belgium, while in Malaysia it represents danger or disease (Ricks, 1983; Hupka et al., 1997).’ In other words, ‘this variation in the symbolism of color could lead to variation on color preference between cultures.’ We’d best choose our colours carefully.

But, I suppose what got me interested again in this perpetual, gendered debate was a rather lengthy and thoughtful article (extracted from her book Gender and Our Brains) in Aeon by Gina Rippon, an emerita professor of cognitive neuroimaging at Aston University in Birmingham, UK: https://aeon.co/essays/is-pink-for-girls-and-blue-for-boys-biology-or-conditioning

I have to say I was lured into reading the entire article when she quickly introduced me to the dreadful concept of ‘gender reveal’ parties. They apparently come in two varieties: in one, the pregnant woman for whom the party is held does not know the sex of her fetus, but the organizers do (the ultrasound result, by agreement, has been sent only to them) -it is guarded in a sealed envelope, as is the colour motif; in the second variety, the mother knows, and reveals it with all the appropriately coloured hoopla at the party.

And why, apart from the puerile attempts to colourize the event, do I find it so disagreeable? Well, as Rippon suggests, ‘20 weeks before little humans even arrive into it, their world is already tucking them firmly into a pink or a blue box. And… in some cases, different values are attached to the pinkness or blueness of the news.’

I also read further, in hopes that the author had some convincing insights as to whether the colour assigned to each gender was biologically or culturally determined. Unfortunately, the evidence she cites seems able to support either -or neither- side. One study, however, did make some progress in resolving the problem: ‘American psychologists Vanessa LoBue and Judy DeLoache tracked more closely just how early this preference emerges. Nearly 200 children, aged seven months to five years, were offered pairs of objects, one of which was always pink. The result was clear: up to the age of about two, neither boys nor girls showed any kind of pink preference. After that point, though, there was quite a dramatic change, with girls showing an above-chance enthusiasm for pink things, whereas boys were actively rejecting them. This became most marked from about three years old onwards.’ This suggests a cultural rather than biological explanation: ‘once children learn gender labels, their behaviour alters to fit in with the portfolio of clues about genders and their differences that they are gradually gathering.’

But why, then, the cultural preference? There was recently what may be an Urban Legend suggesting that at one time, the gendered colour preferences were actually reversed and ‘that any kind of gender-related colour-coding was established little more than 100 years ago, and seems to vary with fashion, or depending on whether you were reading The New York Times in 1893 [pink for a boy]… or the Los Angeles Times in the same year [pink for a girl].’

But, at least in our current milieu, the issue is not so much the colour as what it has come to suggest, consciously or not: ‘Pink has become a cultural signpost or signifier, a code for one particular brand: Being a Girl. The issue is that this code can also be a ‘gender segregation limiter’, channelling its target audience (girls) towards an extraordinarily limited and limiting package of expectations, and additionally excluding the non-target audience (boys).’

Of course, as Rippon points out, the fact that pink may signify what is acceptable to females allows it to bridge the gender gap: colour a toy truck pink, and it becomes acceptable for a girl to play with it. Unfortunately, the flip side of that permission can be that ‘pinkification is all too often linked with a patronising undertow, where you can’t get females to engage with the thrills of engineering or science unless you can link them to looks or lipstick, ideally viewed through – literally – rose-tinted glasses.’ And viewed through prevailing stereotypes as well, I might add.

And yet, what determines what constitutes a ‘boy toy’? Is it what the child sees -or what their parents and grandparents saw in the world in which they grew up? In the world today, women drive trucks, operate diggers, become doctors and lawyers -not just secretaries, teachers, and nurses.

There is also a danger in pandering to ill-conceived remedies, of course. Take Rippon’s example of the STEM Barbie doll (STEM -for the older, more naïve readers like me- stands for Science, Technology, Engineering, and Mathematics -traditionally male-dominated fields, apparently): ‘efforts to level the playing field get swamped in the pink tide – Mattel has produced a STEM Barbie doll to stimulate girls’ interest in becoming scientists. And what is it that our Engineer Barbie can build? A pink washing machine, a pink rotating wardrobe, a pink jewellery carousel.’

Only in the final two paragraphs of the article does Rippon come close to answering the question on the reader’s lips from the beginning of her 4,500-word document: ‘It is clear that boys and girls play with different toys. But an additional question should be – why?… The answer to these questions could lie in our new understanding of how, from the moment of birth (if not before), our brains drive us to be social beings – to understand social scripts, social norms, social behaviour – to make sure we understand the groups we should belong to and how we can fit in… our brains are scouring our world for the rules of the social game – and if that world is full of powerful messages about gender, helpfully flagged by all sorts of gendered labelling and gendered colour-coding, our brains will pick up such messages and drive their owners to behave ‘appropriately.’’

Perhaps Rippon is correct, but I wonder whether we are really stuck with gendered colours; I think there is room for hope: what the child sees when she looks around is changing. So I am instead inclined to the view of André Gide, the French author who won the Nobel Prize in Literature in 1947: ‘The colour of truth is gray,’ he wrote.

May we all be free to mix our own colours…

A Predilection for Extinction?

There appears to be a lot of concern about extinctions nowadays -everything from spotted owls to indigenous languages peppers the list. Things around us that we took for granted seem to be disappearing before we even get to know or appreciate them. One has to wonder whether this is accompanied by furtive, yet anxious, glances in the mirror each morning.

Extinction. I wonder what it would be like -or can we even imagine it? If we could, then presumably we’re not extinct, of course, but our view of history is necessarily a short one. Oral traditions aside, we can only confidently access information from the onset of written accounts; many extinctions require a longer time-frame to detect… although, perhaps even that is changing as we become more aware of the disappearance of less threatening -less obvious- species. Given our obsessive proclivity for expanding our knowledge, someone somewhere is bound to have studied issues that have simply not occurred to the rest of us.

And yet, it’s one thing to comment on the absence of Neanderthals amongst us and tut-tut about their extinction, but quite another to fail to fully appreciate the profound changes in climate that are gradually occurring. Could the same fate that befell the Neanderthals be forecasting our own demise -a refashioning of the Cassandra myth for our self-declared Anthropocene?

It would not be the first time we failed to see beyond our own noses, though, would it? For all our perceived sophistication, we often forget the ragged undergarments of hubris we hide beneath our freshly-laundered clothes.

Religion has long hinted at our ultimate extinction, of course -especially the Christian one with which those of us in the West are most familiar- with its talk of End-of-Days. But, if you think more closely about it, this is predicted to occur at the end of Time; extinction, on the other hand, occurs -as with, say, the dinosaurs- within Time. After all, we are able to talk about it, measure its extent, and determine how long ago it happened.

And yet, for most of us, I suspect, the idea of the extinction of our own species is not inextricably linked to our own demise. Yes, each of us will cease to exist at some point, but our children will live on after us -and their children, too. And so on for a long, long time. It is enough to think that since we are here, our children will continue on when we are not. Our species is somehow different from our own progeny…

Darwin, and the subsequent recognition of the evolutionary pressures that favour the more successfully adapted, no doubt planted some concerns, but an essay in Aeon by Thomas Moynihan (who completed his PhD at Oxford) set the issue of extinction in a more historical context for me: https://aeon.co/essays/to-imagine-our-own-extinction-is-to-be-able-to-answer-for-it

Moynihan believes that only after the Enlightenment (generally attributed to the philosophical movement between the late 17th to the 19th century) did the idea of human extinction become an issue for consideration. ‘It was the philosopher Immanuel Kant who defined ‘Enlightenment’ itself as humanity’s assumption of self-responsibility. The history of the idea of human extinction is therefore also a history of enlightening. It concerns the modern loss of the ancient conviction that we live in a cosmos inherently imbued with value, and the connected realisation that our human values would not be natural realities independently of our continued championing and guardianship of them.’

But, one may well ask, why was there no serious consideration of human extinction before then? It would appear to be related to what the American historian of ideas, Arthur Lovejoy, has called the Principle of Plenitude that seemed to have been believed in the West since the time of Aristotle right up until the time of Leibniz (who died in 1716): things as they are, could be no other way. It would be meaningless to think of any species (even human) not continuing to exist, because they were meant to exist. Period. I am reminded -as I am meant to be- of Voltaire’s satirical novel Candide and the uncritical espousal of Leibniz’ belief that they were all living in ‘the best of all possible worlds’ -despite proof to the contrary.

I realize that in our current era, this idea seems difficult to accept, but Moynihan goes on to list several historical examples of the persistence of this type of thinking -including those that led ‘Thomas Jefferson to argue, in 1799, in the face of mounting anatomical evidence to the contrary, that specimens such as the newly unearthed Mammuthus or Megalonyx represented species still extant and populous throughout the unexplored regions of the Americas.’

Still, ‘A related issue obstructed thinking on human extinction. This was the conviction that the cosmos itself is imbued with value and justice. This assumption dates back to the roots of Western philosophy… Where ‘being’ is presumed inherently rational, reason cannot itself cease ‘to be’… So, human extinction could become meaningful (and thus a motivating target for enquiry and anticipation) only after value was fully ‘localised’ to the minds of value-mongering creatures.’ Us, in other words.

And, of course, the emerging findings in geology and archeology helped to increase our awareness of the transience of existence. So too, ‘the rise of demography [the statistical analysis of human populations] was a crucial factor in growing receptivity to our existential precariousness because demography cemented humanity’s awareness of itself as a biological species.’

Having set the stage, Moynihan’s argument is finally ready: ‘And so, given new awareness of the vicissitude of Earth history, of our precarious position within it as a biological species, and of our wider placement within a cosmic backdrop of roaming hazards, we were finally in a position to become receptive to the prospect of human extinction. Yet none of this could truly matter until ‘fact’ was fully separated from ‘value’. Only through full acceptance that the Universe is not itself inherently imbued with value could ‘human extinction’ gain the unique moral stakes that pick it out as a distinctive concept.’

And interestingly, it was Kant who, as he aged, became ‘increasingly preoccupied with the prospect of human extinction…  During an essay on futurology, or what he calls ‘predictive history’, Kant’s projections upon humanity’s perfectibility are interrupted by the plausibility of an ‘epoch of natural revolution which will push aside the human race… Kant himself characteristically defined enlightening as humanity’s undertaking of self-responsibility: and human rationality assumes culpability for itself only to the exact extent that it progressively spells out the stakes involved… This means that predicting increasingly severe threats is part and parcel of our progressive and historical assumption of accountability to ourselves.’

So, I don’t see this recognition of the possibility of human extinction as a necessarily bad thing. The more we consider the prospect of our disappearance, the more we become motivated to do something about it. Or, as Moynihan points out, ‘The story of the discovery of our species’ precariousness is also the story of humanity’s progressive undertaking of responsibility for itself. One is only responsible for oneself to the extent that one understands the risks one faces and is thereby motivated to mitigate against them.’ That’s what the Enlightenment was all about: humanity’s assumption of self-responsibility.

Maybe there is still hope for us… well, inshallah.