Historiognosis

When I was in school, history was just a series of strange and unfamiliar stories -some interesting, most forgettable. Of course, I recognize the irony in describing the effects of teaching methods that are now, themselves, historical, but I still wonder how decisions were made about which facts to focus on. The date of a battle, or the names of the generals who were killed are easily agreed upon, and yet what about things like the beginning of the Enlightenment, or when the Little Ice Age began and ended? They are surely more approximations -opinions- than known ‘facts’.

Again, when I was much younger and my leaves were still green and tender, it seemed that most, if not all, of the important historical figures were male, and by and large, European. Females, if they were mentioned at all, were like adjectives that added colour to their male nouns. There were, of course, exceptions -Hildegard von Bingen, the 12th century Abbess, polymath, and musical composer of sacred monophony is my favourite, I think- but only relatively recently have more historically important females been ‘discovered’. Heaven only knows how many more lie patiently awaiting exhumation.

And, let’s face it, even when I was in high school, Columbus was still felt to have ‘discovered’ America, much to the amused astonishment of the original inhabitants, no doubt. And Africa had never been considered to have hosted any civilizations worthy of the name, let alone exhibited any philosophical thinking, or theological profundities.

I suppose, for an interested but no doubt naïve amateur, it has always been the arbitrariness of the choices about what happened in the past, and often the seemingly limited perspectives on the almost infinite number of possibilities available, that trouble me.

And yes, I understand that the sources from which conclusions are drawn may be unreliable, or reflect the biases of their creators (or historians), and I can imagine that even where the written documents may be clearly worded, their meanings are not fixed for all times. Societal norms and expectations also no doubt influenced what was felt to be important to record. So, although I am intrigued by History, I am wary of any lessons it might purport to offer those of us alive today.

Still, I continue to be attracted to new analyses, and I remain curious about novel ways of approaching and evaluating the Past. So, it was with a frisson of excitement that I embarked upon the exploration of a rather complex essay that suggested there may be a more objective way of appraising history -a mathematical approach that is no doubt old-hat to professional historians, but new to uncredentialled and only part-time acolytes like myself. Amanda Rees, a historian of science in the department of sociology at the University of York, surveyed attempts to objectivize History and bring it more in line with the Natural Sciences, with their use of statistical analyses and the like: https://aeon.co/essays/if-history-was-more-like-science-would-it-predict-the-future

Rees ends her first paragraph with a question: ‘If big data could enable us to turn big history into mathematics rather than narratives, would that make it easier to operationalise our past?’ There have been several unsuccessful attempts to try something like this. For example, ‘In the 19th century, the English historian Henry Thomas Buckle used a broad-brush approach to the past in an effort to identify ‘natural laws’ that governed society… Buckle’s contemporary, the French positivist Auguste Comte, had earlier proposed his ‘law of three stages’ which styled human society as passing through ‘theological’ and ‘metaphysical’ stages, before arriving at a scientific self-understanding through which to build a better society.’

And then there was ‘the more elaborate social Darwinism of Herbert Spencer, who coined the phrase ‘survival of the fittest’. These views were an attempt to marry the organic nature of evolution to history, but they unfortunately became embedded in the Zeitgeist of the time and strike us nowadays as distinctly racist.

Rees, however, spends considerable time explaining the views of Peter Turchin, who in 2010 was an ecologist at the University of Connecticut. ‘Why, Turchin wanted to know, were the efforts in medicine and environmental science to produce healthy bodies and ecologies not mirrored by interventions to create stable societies? Surely it was time ‘for history to become an analytical, and even a predictive, science’… he proposed a new discipline: ‘theoretical historical social science’ or ‘cliodynamics’ – the science of history.’

Of course, unlike objective attributes such as, say, temperature or infective processes, ‘‘historical facts’ are not discrete items that exist independently, awaiting scholars who will hunt them down, gather them up and catalogue them safely. They need to be created and interpreted. Textual archives might seem relatively easy to reproduce, for example, but, just as with archaeological digs, the physical context in which documents are found is essential to their interpretation: what groups, or items, or experiences did past generations value and record… What do the marginalia tell us about how the meanings of words have changed?’

A good example perhaps: ‘is it really possible to gauge subjective happiness by counting how many times words such as ‘enjoyment’ or ‘pleasure’ occur in the more than 8 million books digitised by Google?’ Or another: ‘a quantitative study of American slavery, in which Fogel [Robert Fogel, joint winner of the 1993 Nobel Prize in economics] used plantation records to show that slavery was an economically efficient means of production, and to suggest that Southern slaves were better off than many Northern wage-earners.’ And yet, ‘plantation records didn’t fully capture the nature of slavery. How could they, when they were created by one group of humans at the expense of another?’

It is for this reason, among others, that ‘a positivist language of science – of testing hypotheses against data – sits uncomfortably with the practice of history.’ Are historical facts (a debatable term at best) really ‘things’? Turchin seemed to think so.  ‘Inspired by the work of the American sociologist Jack Goldstone, who in the 1990s had tried to translate Alexis de Tocqueville’s philosophy into mathematical equations, Turchin began to relate population size to economic output (and, critically, levels of economic inequality) as well as social and political instability… Social structure, for example, could be treated as a product of health and wealth inequality – but to measure either, you need to choose approximate and appropriate proxies. The process was further complicated by the fact that, when you’re working with a chronology that spans millennia, these proxies must change over time. The texture of that change might be qualitative as well as quantitative.’ You get the idea: apples and oranges.

And anyway, what makes anybody believe that the Natural Sciences (which Turchin was trying to emulate) are actually capable of producing ‘objective knowledge’? ‘Half a century of research in the history of science has shown that this perspective is deeply flawed. The sciences have their own history – as indeed does the notion of objectivity – and that history is deeply entwined with power, politics and, importantly, the naturalisation of social inequality by reference to biological inferiority. No programme for understanding human behaviour through the mathematical modelling of evolutionary theory can afford to ignore this point… meta-analyses and abstract mathematical models depend, by necessity, on manipulating data gathered by other scholars, and a key criticism of their programme is that cliodynamicists are not sensitive to the nuances and limitations of that data.’

By this stage in the essay, I was not only confused, I was also disappointed. But, could I really expect an answer to the arbitrariness of historical data when even my brother and I, in describing an event from our shared childhood, can never agree on what really happened? It’s all perspective in the end; as Nietzsche said, ‘There are no facts, only interpretations.’

And anyway, my brother doesn’t know what he’s talking about…

I long to hear the story of your life

I like the idea that I am a story which I am still writing. After all, there seems to be a rambling kind of direction to it, and if pressed, I could likely invent a plot. Of course, until the final page, nobody -not even me- really knows how it’s going to turn out, but that’s the beauty of it. The intrigue of it -a story in which we’re kept guessing until the very end.

I suppose, though, any story admits of a certain amount of literary licence, and a life story is definitely no exception -just look at many social media postings where fact and fiction intermingle like roots from a pot-bound plant. It’s hard to know what to do with stories like that; when narrators construct them that way, are they still stories of who the narrators actually are? By altering the way we tell our tales, do we also alter ourselves?

Because our memories are imperfect reproductions of what actually happened, and because the way we sift through them may favour -or reject- the more remarkable ones, do we have a story at all? Could our lives be nothing but random hodgepodges of remembered events on flash-cards we hold up when asked to tell someone about ourselves? And, even when organized in some semblance of sequential chronology, is it a story, or a fantasy?

Sometimes the only thing that stitches the narrative together is the fact that it all seems to have happened to that fraying thread of continuity we identify as ourselves. And yet, an old black and white photograph of me as a child seems not to be the same ‘me’ as the one I feel I am now. I have to believe the writing on the back that identifies it as me, and yet… and yet I do not remember it being taken. That child is not me anymore -there are so many gaps I cannot fill. So how do I construct a narrative that includes him? Am I his story, or is he mine?

I had almost forgotten about the idea of Homo narrans when I happened upon a counter-argument in an essay by Galen Strawson, a British analytic philosopher from the University of Texas at Austin: https://aeon.co/essays/let-s-ditch-the-dangerous-idea-that-life-is-a-story

After reciting a few examples of sundry philosophers and savants who subscribe to the narrative approach, Strawson writes, ‘I think it’s false – false that everyone stories themselves, and false that it’s always a good thing.’ I have to say that it spurred me to read on; I wondered how he would attempt to assail the now-conventional wisdom of the narrativists.

He firmly rejects this thesis. ‘It’s not just that the deliverances of memory are, for us, hopelessly piecemeal and disordered, even when we’re trying to remember a temporally extended sequence of events. The point is more general. It concerns all parts of life, life’s ‘great shambles’, in the American novelist Henry James’s expression. This seems a much better characterisation of the large-scale structure of human existence as we find it. Life simply never assumes a story-like shape for us.’

Strawson seems to feel that identity through narrative is an almost desperate attempt to seize control of the course of one’s life: to develop an autobiographical narrative to act as a lens through which we experience the world. He quotes McAdams, a leading narrativist among social psychologists: ‘Beginning in late adolescence and young adulthood, we construct integrative narratives of the self that selectively recall the past and wishfully anticipate the future to provide our lives with some semblance of unity, purpose, and identity.’

And then, before we decide that describes our ‘story’, Strawson counters with a quote from the British author W Somerset Maugham: ‘I recognise that I am made up of several persons and that the person that at the moment has the upper hand will inevitably give place to another. But which is the real one? All of them or none?’ I suspect this was an attempt to muddy the water, but I pressed on.

I get the impression that Strawson identifies most with the 16th century philosopher and writer Michel de Montaigne, who is mainly remembered for his prodigious output of essays. Montaigne was also known for his memory lapses: ‘I can find hardly a trace of [memory] in myself,’ he writes in his essay ‘Of Liars’ (1580). ‘I doubt if there is any other memory in the world as grotesquely faulty as mine is!’ And why is that important for Strawson?

Because ‘Poor memory protects him [Montaigne] from a disagreeable form of ambition, stops him babbling, and forces him to think through things for himself because he can’t remember what others have said… To this we can add the point that poor memory and a non-Narrative disposition aren’t hindrances when it comes to autobiography in the literal sense – actually writing things down about one’s own life. Montaigne is the proof of this, for he is perhaps the greatest autobiographer, the greatest human self-recorder… Montaigne writes the unstoried life – the only life that matters… He knows his memory is hopelessly untrustworthy, and he concludes that the fundamental lesson of self-knowledge is knowledge of self-ignorance.’

Not to rest the case against the storied life too heavily on faulty memory, though, Strawson borrows from a comment on a book by the American novelist James Salter: ‘Salter strips out the narrative transitions and explanations and contextualisations, the novelistic linkages that don’t exist in our actual memories, to leave us with a set of remembered fragments, some bright, some ugly, some bafflingly trivial, that don’t easily connect and can’t be put together as a whole, except in the sense of chronology, and in the sense that they are all that remains.’ In other words, I suppose, self-knowledge comes mostly in bits and pieces that we fit together as best we can.

I still feel that life is a narrative, though -a series of events, however cobbled together, however rambling, that tell a story. Like an archaeologist sifting through fragments from a midden, we piece together seemingly disparate items and random shards until we have a chronology that makes sense to us, an idea about what was happening and when. Our lives are not so much describable in accurately numbered episodes as they are in contextually sorted potsherds found in the dust we have been walking through for years; we’ve organized them in a way that makes sense to us. So, our lives are only as colourful as the fragments we have stooped to gather.

Shouldn’t that tell us something…?

Why is Wonder?

Sometimes I am accosted by the strangest questions; they remind me of the unanswerable ‘why’ questions that so often bubble out of three-year-olds -the only difference, I suppose, is that I would no longer be satisfied with the unadorned ‘just because’ answers I’m sure I used to get from my frustrated parents.

But it seems to me that once I retired from the practice of Medicine and was no longer required to answer other people’s questions, it was inevitable that I would have to ask some tricky ones of my own -questions I would be forced to answer for myself. And then I realized that I was no longer the authority figure I once thought I was, because the questions I ended up asking were more abstract. Less likely to admit of an easy solution.

Perhaps that’s what Retirement is designed for, though: imponderables. It is close to the denouement of a life and any loose threads of the clothes you once wore are carefully trimmed away and cleaned in preparation for the final scene.

My most recent attempt at cutting away loose strands concerned wonder. Why is wonder? I asked myself one evening, as I sat on a beach watching the cooling sun slowly sinking into the ocean. But when no answers surfaced and I turned to leave, a rim of clouds still glowed orange on the horizon like embers on a dying campfire and I found I couldn’t move. Why did I find myself so filled with awe, so taken with a scene I must have admired in countless versions in magazines or coffee-table books? Why did the experience stop time? Why did I hold my breath? And, what purpose could it possibly serve?

Different from mere beauty, which is shallow in comparison, wonder seems more spiritual -more intense- and less describable in words. It is a feeling, a wordless experience of awe. And yet, ineffable as it may be, is it just a curiosity, or more like an emergent phenomenon: a synergism of factors each of which, taken by itself, is not at all unique? And if so, why would their collective presence have such a major effect on me?

I am far from a religious person, but I am reminded of an idea I found somewhere in the writings of the philosopher/theologian Paul Tillich, years ago. I was wondering what prayer was, apart from an attempt to request a favour from whatever deity you had been taught was in charge. It seemed to me unlikely that words alone could establish any real communication with the god; more was required to make prayers feel like they were being heard. If I understood Tillich, he seemed to imply that prayer was -or should be- like the ephemeral, but almost overwhelming sense of awe felt on seeing, say, the array of burning clouds that could still bathe in the sun setting behind a now-silhouetted mountain. An unknitted communion…

Is that, then, what wonder is: an unintended prayer…? An unasked question? I happened across an essay by Jesse Prinz, a professor of philosophy at the City University of New York, who attempted an examination of wonder, and felt that it shared much in common with Science, Religion, and even Art: https://aeon.co/essays/why-wonder-is-the-most-human-of-all-emotions

The experience of wonder, he writes, seems to be associated with certain bodily sensations which ‘point to three dimensions that might in fact be essential components of wonder. The first is sensory: wondrous things engage our senses — we stare and widen our eyes… The second is cognitive: such things are perplexing because we cannot rely on past experience to comprehend them. This leads to a suspension of breath, akin to the freezing response that kicks in when we are startled… Finally, wonder has a dimension that can be described as spiritual: we look upwards in veneration.’

That may be the ‘what’ of wonder -its component parts- but not the ‘why’; his explanation seemed more like a Bayeux Tapestry, chronicling events rather than explaining why they existed in the first place. I dug deeper into the essay.

Prinz goes on to mention the ‘French philosopher René Descartes, who in his Discourse on the Method (1637) described wonder as the emotion that motivates scientists to investigate rainbows and other strange phenomena.’ Their investigations, in turn, produce knowledge. ‘Knowledge does not abolish wonder; indeed, scientific discoveries are often more wondrous than the mysteries they unravel. Without science, we are stuck with the drab world of appearances.’ Fair enough -Science can not only be motivated by wonder, it also contributes to it.

‘In this respect,’ Prinz writes, ‘science shares much with religion… like science, religion has a striking capacity to make us feel simultaneously insignificant and elevated.’ Awe, an intense form of wonder, makes people feel physically smaller than they are. ‘It is no accident that places of worship often exaggerate these feelings. Temples have grand, looming columns, dazzling stained glass windows, vaulting ceilings, and intricately decorated surfaces.’

Art, too, began to partake of the sublime, especially when it parted company from religion in the 18th century. ‘Artists began to be described as ‘creative’ individuals, whereas the power of creation had formerly been reserved for God alone.’ Artists also started to sign their paintings. ‘A signature showed that this was no longer the product of an anonymous craftsman, and drew attention to the occult powers of the maker, who converted humble oils and pigments into objects of captivating beauty, and brought imaginary worlds to life.’

Interestingly, then, ‘science, religion and art are unified in wonder. Each engages our senses, elicits curiosity and instils reverence. Without wonder, it is hard to believe that we would engage in these distinctively human pursuits.’ Mere survival does not require any of these, and yet they exist. ‘Art, science and religion are all forms of excess; they transcend the practical ends of daily life. Perhaps evolution never selected for wonder itself… For most of our history, humans travelled in small groups in constant search for subsistence, which left little opportunity to devise theories or create artworks. As we gained more control over our environment, resources increased, leading to larger group sizes, more permanent dwellings, leisure time, and a division of labour. Only then could wonder bear its fruit. Art, science and religion reflect the cultural maturation of our species.’

‘For the mature mind, wondrous experience can be used to inspire a painting, a myth or a scientific hypothesis. These things take patience, and an audience equally eager to move beyond the initial state of bewilderment. The late arrival of the most human institutions suggests that our species took some time to reach this stage. We needed to master our environment enough to exceed the basic necessities of survival before we could make use of wonder.’

Maybe, then, that is the answer to the ‘why’ of wonder. Perhaps it’s a fortunate -some might say providential- byproduct of who we are. Not inevitable, by any means, and not meant for any particular purpose, and yet, however accidental, it was the spur that pricked the sides of our dreams, to paraphrase Shakespeare’s Macbeth.

I’m not so sure it’s even that complicated, though. It seems to me that wonder is more of an acknowledgement than anything else: an acknowledgement that we are indeed a part of the world around us; a tiny thread in a larger pattern. And every once in a while, when we step back, we catch a glimpse of the motif and marvel at its complexity. It is, then, an acknowledgement of gratitude that we are even a small part of it all… Yes, a prayer, if you will.

Am I anybody’s keeper?

Is it possible to understand the world as if you were another person? Or, no matter the effort, would you still be imprisoned within yourself -feeling what you assume you would feel if you were in the same circumstance as her? That what you manage to sample of her condition is inevitably filtered through your own experience is far from profound, of course, but it is often buried within the empathy you think you are expressing. Empathy is not really how you feel about something; it is about how the other person feels.

But of course you are not the other person, nor have you lived the same life as her. Perhaps, in fact, it is the other way around: the more she has experienced similar things to you -the more like you she is- the more you can empathize with her feelings. Still, this merely reduces empathy to a set of feelings; I suspect there is more to it than that. An integral component of empathy is understanding. Much like the philosopher Thomas Nagel’s famous question, ‘What is it like to be a bat?’, surely the central question for empathy would be to ask what it would be like to be the person in question -not just how to feel like her. It seems to me there must be a cognitive, as well as an emotional, side to empathy.

I found an insightful essay by Maria Konnikova on this multimodal requirement as exemplified in the fictional character of Sherlock Holmes. She felt that Holmes seemed to be able to put himself in the victim’s mind, if not necessarily in their heart. https://aeon.co/essays/empathy-depends-on-a-cool-head-as-much-as-a-warm-heart

As she observes, according to Holmes ‘whatever is emotional is opposed to that true cold reason which I place above all things… It is of the first importance not to allow your judgment to be biased by personal qualities… The emotional qualities are antagonistic to clear reasoning.’

But how can that be? For Holmes, the ability to understand a problem is based on his creative imagination. ‘In fact, his success stems from the very non-linearity and imaginative nature of his thinking, his ability to engage the hypothetical just as he might the physical here-and-now… So Holmes is an expert at the very thing that makes empathy possible in the first place – seeing the world from another’s point of view. He is entirely capable of understanding someone else’s internal state, mentalising and considering that state.’ But not just that.

An emotional lack may permit a relative freedom from prejudice. ‘[R]ecent research bears this out. Most of us start from a place of deep-rooted egocentricity: we take things as we see them, and then try to expand our perspectives to encompass those of others. But we are not very good at it… Even when we know that someone’s background is different from our own, and that we should be wary of assuming we can understand their situation as though it were our own, we still can’t shake off our own preconceptions in judging them. The more cognitively strained we are (the more we have going on mentally), the worse we become at adjusting our egocentric views to fit someone else’s picture of the world… Our neural networks might be mirroring another’s suffering, but largely because we worry how it would feel for us. Not so Holmes. Because he has worked hard to dampen his initial emotional reactions to people, he becomes more complete in his adjustment, more able to imagine reality from an alternative perspective.’

So, in a way, it is sometimes the other person’s very difference from us that allows us to judge more accurately what they are going through. ‘Empathy, it seems, is not simply a rush of fellow-feeling, for this might be an entirely unreliable gauge of the inner world of others.’

In fact, ‘The ability to see the world from another set of eyes, to experience things vicariously, at multiple levels, is training ground for such feats of imagination and reason that allow a Holmes to solve almost any crime, an Einstein to imagine a reality unlike any that we’ve experienced before (in keeping with laws unlike any we’ve come up with before), and a Picasso to make art that differs from any prior conception of what art can be.’ Imagination, and emotion; there’s a commonality: ‘to be creative, just as to be empathetic, we must depart from our own point of view… The emotional element in empathy is itself a limited one. It is selective and often prejudicial – we tend to empathise more with people whom we know or perceive to be like us.’

I was talking to a friend in a grocery store lineup the other day, masked and socially distanced of course, when an elderly man with his mask hanging from his chin moved into the space ahead of her. He made no apology, nor did he seem to understand the need for distancing in a line. He’d merely seen a space and moved into it.

“The poor old dear,” I muttered, my voice muffled by my mask. I’m not sure if she heard my words, but my friend’s eyes first saucered in surprise at my reaction, and then narrowed into an angry scowl at the intrusion as she turned to glare at him.

“Excuse me, sir,” she said, addressing the old man. “The line starts back there…” And she pointed past a number of people standing behind her.

He turned his head slowly and stared at her for a moment. “There was a space in front of you, though…”

“As there is supposed to be,” she interrupted before he could finish his sentence. She sounded angry -righteously angry.

The only indication that he had understood her anger was to shrug and turn his head away again.

“And you are not wearing your mask, sir,” she continued, her anger obviously unsated.

His response was to turn and point to the mask hanging from his chin and smile. “I can’t breathe very well through it,” he said in a soft, firm voice.

I risked a step forward. “He’s just an old man, Janice,” I said, trying to talk softly, but the sound was no doubt further muffled by the thick mask I was wearing. “He’s probably a little confused. Let it go…”

Janice stared at me for a moment. “Then he shouldn’t be shopping on his own, G,” she said, shaking her head as if she couldn’t understand why I would be defending him. “These are dangerous times…”

I blinked, similarly wondering why she was so adamant.

“There are rules, G!” I could see by the movement of her mask that she had sighed. “We can’t just bend the rules because we feel sorry for someone.”

“Maybe not, but it would create less of a fuss for people in the line if we just let him proceed.” I looked at the line behind me and nobody else seemed upset by his action -if they had even noticed. “And he doesn’t seem bothered by them either…”

She continued to stare at me -blankly at first, uncomprehendingly.

Then, when I smiled behind my mask, I think she saw the wrinkles around my eyes because her aggressive posture seemed to relax. “Well, maybe just this once, eh?” she said, and shrugged.

Empathy in action…

Flowers are slow and weeds make haste

Sometimes it’s obvious that we all need to cope -In the fell clutch of circumstance, I have not winced, nor cried aloud. Under the bludgeonings of chance, my head is bloody but unbowed- in the immortal words of the poet William Ernest Henley. Those words have seen me through many of Life’s crises, but each time, it seemed as if I suffered alone; each time I needed encouragement to endure.

When my own personal needs had decreased, however -when I was less consumed with my own fate- I could understand that everyone suffers at one time or another; my own suffering was far from unique. We all inhabit our own universes; each of us hides a solipsist under our clothes: my fate determines my universe -the only one I have ever experienced.

It should come as no surprise, then, that even in my own universe, other things also feel a need to survive, and when that is in doubt, they suffer. Of course I judge my own travails to be both paramount and likely more intense, but is that fair? I can feel my pain, experience my anguish, but no matter how much empathy I attempt, others are still remote from me; I cannot feel their pain -I must translate it into how I might perceive something similar: I must suffer by proxy, as it were.

Each living thing suffers, though; each living thing strives to correct the situation if it is able, or otherwise succumbs. It is simply not enough to posit that the degree of suffering varies in direct proportion to the complexity of its nervous system. How can we measure suffering? Is it fair to attribute the gasping of a freshly caught fish -no doubt desperate for oxygen when dumped onto the deck of a fishing boat- merely to some mindless reflex reaction? Or to deny that a common housefly, seeking freedom through a closed window, and buzzing desperately at it until it eventually tires and drops to the sill, feels some proto-anguish at the imprisonment it cannot understand?

The very act of avoidance suggests a need to prevent something undesired, but I realize that I have to be careful not to push the analogy too far. Most would exclude the Plant Kingdom, if not most of the Animal Kingdom -Homo sapiens is touchy about sharing the spoils of suffering. Still, we are a little more generous with other things, other words: awarding the ability to be resilient in the face of adversity to many and sundry things -even plants.

It’s a small victory for flowers, I suppose, but I chanced upon an article in Vox about how flowers recover from accidents, which reminded me of that stanza in the poem by Henley that I quoted above. https://www.vox.com/science-and-health/2020/4/14/21208857/pandemic-plants-evolution-beauty

It describes the efforts of flowers which have been knocked down by wind, say, to right both themselves and the blooms on their stalks, and to rotate back, as best they can, into a position better suited for pollination. Not just to enable them to gather more sunlight for photosynthesis, mind you, but to align themselves again both to attract insects, and to enable them to access the pollen the flowers need in order to propagate.

I guess that in the larger scheme of things, this is a trivial observation, and yet think about it.  How could a downed flower reorient itself and its leaves towards the sun to enable photosynthesis? Plants have no muscles -and even if they did, what would organize the response?

Scientists, of course, are curious, and not satisfied with merely a sense of wonder and awe at the determination of the little plant struggling to regain some semblance of the old order; botanists have discovered that a plant hormone -auxin- elongates the cells of the stem on the side that is farthest from the light, pushing the stem or the leaf towards the light. It’s called phototropism.

I know it’s always better to find explanations -mechanisms- for this sort of thing, but I have to confess that I am more amazed that a presumably mindless plant can act as if it had a brain striving to organize its responses. It’s one thing to explain turning towards the light with mechanical changes in cells that don’t receive enough, but another thing entirely to explain the seemingly purposive correction of the floral orientation to improve the likelihood of insect visitation and hence improve pollination. It’s not as simple as turning towards a light -first of all, the flower has to be seen by a bee, say, but the bee also has to be able to land on it properly to acquire the pollen. An upside-down flower in the grass is unlikely to be able to fulfil either of these conditions.

Somehow, the thought of a flower resisting death, and striving to correct its situation no matter the odds against it, is incredibly inspiring -touching, even. I wouldn’t want to suggest it suffers under these conditions -or at least not as we humans would envisage suffering- but I would point out that we tend to judge everything using only those rules which we have come to understand apply to us. As time and knowledge expand, however, our opinions change. There was a time, not so distant, when there was little concern about cruelty to animals -or cruelty to others of our own species, for that matter. Little thought was given to communication systems among animals, and even less to signals among plants. If they didn’t do it like us, then it wasn’t happening…

Times change, though, don’t they? Or maybe it’s not so much the times as our attitudes toward those things with which we interact. I am still thrilled at listening to a bird sing; I am soothed by the wind wandering slowly through the leaves of a tree, enchanted by the sound of water trickling over a rock in a swiftly flowing stream. I am happy for somebody else to work out the mechanisms and reduce them to mathematical modelling. For me, the experience is as important as the explanation.

I am more entranced by the sound of geese flying high above me in a mountain fog, than curious as to their destination. In the words of the poet Kahlil Gibran:

Beauty is life when life unveils her holy face.
But you are life and you are the veil.
Beauty is eternity gazing at itself in a mirror.
But you are eternity and you are the mirror.

I had as lief have been myself alone

Being alone is not easy for many of us -perhaps because it allows an inner dialogue to emerge that is ordinarily submerged in the noise of the crowd. And yet it is in solitude that a still small voice emerges: the one that allows us to assess our actions, and to argue with ourselves.

This, of course, was a central theme of the Jewish-German thinker Hannah Arendt, who fled Nazi Germany for America. I suppose she came to public attention largely because in 1961, The New Yorker commissioned Arendt to cover the trial of Adolf Eichmann, a Nazi SS officer who helped to orchestrate the Holocaust.

I happened upon an essay on solitude by Jennifer Stitt, then studying at the University of Wisconsin-Madison, who was obviously impressed by Arendt’s work: https://aeon.co/ideas/before-you-can-be-with-others-first-learn-to-be-alone

Arendt believed that ‘solitude empowers the individual to contemplate her actions and develop her conscience, to escape the cacophony of the crowd – to finally hear herself think… Arendt was surprised by Eichmann’s lack of imagination, his consummate conventionality. She argued that while Eichmann’s actions were evil, Eichmann himself – the person – ‘was quite ordinary, commonplace, and neither demonic nor monstrous. There was no sign in him of firm ideological convictions.’ She attributed his immorality – his capacity, even his eagerness, to commit crimes – to his ‘thoughtlessness’. It was his inability to stop and think that permitted Eichmann to participate in mass murder… A person who does not know that silent intercourse (in which we examine what we say and what we do) will not mind contradicting himself, and this means he will never be either able or willing to account for what he says or does; nor will he mind committing any crime, since he can count on its being forgotten the next moment.’ The banality of evil.

I also discussed this in an essay I wrote last year in relation to extremism and loneliness: https://musingsonwomenshealth.com/2019/03/27/society-is-no-comfort-to-one-not-sociable/

But here I’m not so concerned with the aberrant aspects of enforced solitude -during a quarantine, say- because being lonely and being alone are separate creatures. Most of us are never really alone when by ourselves -that’s when we meet our inner selves. It’s when there is no one else inside that we feel lonely. ‘Eichmann had shunned Socratic self-reflection. He had failed to return home to himself, to a state of solitude. He had discarded the vita contemplativa, and thus he had failed to embark upon the essential question-and-answering process that would have allowed him to examine the meaning of things, to distinguish between fact and fiction, truth and falsehood, good and evil.’ This suggested to Arendt ‘that society could function freely and democratically only if it were made up of individuals engaged in the thinking activity – an activity that required solitude. Arendt believed that ‘living together with others begins with living together with oneself’… Thinking, existentially speaking, is a solitary but not a lonely business; solitude is that human situation in which I keep myself company. Loneliness comes about … when I am one and without company’ but desire it and cannot find it.’

‘Arendt reminds us, if we lose our capacity for solitude, our ability to be alone with ourselves, then we lose our very ability to think. We risk getting caught up in the crowd. We risk being ‘swept away’, as she put it, ‘by what everybody else does and believes in’ – no longer able, in the cage of thoughtless conformity, to distinguish ‘right from wrong, beautiful from ugly’. Solitude is not only a state of mind essential to the development of an individual’s consciousness – and conscience – but also a practice that prepares one for participation in social and political life. Before we can keep company with others, we must learn to keep company with ourselves.’

Millennia ago, when I was a child in Winnipeg, I remember having to stay away from school and in our house for a week or two because, in those pre-vaccine days, I had the measles. I would stare through the bedroom window at my friends playing in the field outside in the snow and tell my mother how bored I was. After reading every book I could find, and tiring of the adult radio programs she was fond of listening to while she cooked, I would wander into the kitchen and complain that there was nothing to do. She would listen patiently for a while, and then shoo me out of the room.

I still remember the day at breakfast that I announced that I had decided I was going to go out and play with my friends. It was Saturday and everybody was throwing snowballs at each other -I could even see them through the frosty kitchen window. I tried to look determined and crossed my arms over my chest like I’d seen my father do when he was intent on something.

“It’s only been 5 days, G,” she said, shaking her head. “You’re still contagious.”

I shrugged at the argument. “They’ve all had measles, mother… And besides, I’ll be so wrapped up none of my measles could get at them.”

She smiled at me -it was one of those fake smiles she usually put on when she was trying to hide her frustration. “How do you know they’ve all had it, G?” Her face softened when she could see I no longer had my arms crossed over my chest. “What do you think would happen then?”

I thought about it for a moment. The teacher had warned us that measles could be dangerous to some children. She’d never actually told us what that meant, but at recess Jamie told me that his uncle had got a bad case when he was young and something had happened to his head -‘gitis’, or something. He was never the same after it, apparently, but he didn’t explain.

“What could happen, G…?” I was taking too long to answer her question, I suppose.

I remember shrugging and looking first at the window, and then at the floor. “Gitis,” I mumbled guiltily, not confident I had pronounced it correctly. Anyway, I should have thought of that, and prepared a suitable rebuttal.

It had an unexpected result at any rate: she bent down and hugged me. “That’s right, sweetie,” she said after kissing the top of my head. “I knew you understood why I need to keep you home. You just had to think about it, that’s all…”

I now realize that Arendt was on to something. There really is a voice somewhere inside if we stop to listen to it. Mine sounds suspiciously like my mother’s, though…

An accident of birth

For years now, I have picked through the garden of my life -sometimes for pleasure, and sometimes for utility. I weed, of course -the privilege of growing in my aging plot is largely contingent on my having planted it in the first place. Contingent on the purpose for which it was intended. Things that arrive unannounced might be tolerated at times, but the recent discovery of a flower tucked in amongst the lettuce plants, instead of growing where I’d planted others of its kind, spoke more of my neglect than of serendipity.

And now that I’ve been retired long enough to ponder these things, it occurred to me that the peripatetic guest may not have the same value in its new home. It’s still a flower to be sure -it’s still beautiful, and still proffers its petals as seductively to passing bees- but is it really the same flower as one that was the product of my labour? Does the intent flavour the result?

For some, I suspect it’s a trivial question: surely a daisy, say, is a daisy, no matter whether it arrived accidentally or was planted in the spot. It is a gift, they might say -something for which gratitude, not deliberation, is appropriate. In a sense, of course, they are correct. And yet, all the work I may have expended -choosing its pedigree and colour, calculating a location that might offer it the best chance to thrive, and then watering and weeding- do these efforts not affect the appreciation of the resulting flower? And was appreciation not a large part of the original incentive that led to its planting?

For that matter, does a gift carry the same merit as an item obtained through work and planning? Does it even possess the same meaning?

It occurred to me that maybe I simply have too much time on my hands now that I’m retired, and I tried to shelve the thought along with all those books I have been meaning to read once the opportunity presented itself. But the question continued to poke annoyingly at my brain in the evenings whenever my eyes tired of reading. I just could not understand what it was about the problem that was continuing to disturb me; and more, was I the only one who even thought there might be something to it?

I can’t say I actively sought an answer -quite frankly, I couldn’t even think of a way to phrase the question- but I did stumble upon a short philosophical enquiry written by Jonny Robinson, a tutor and ‘casual lecturer’ in the department of philosophy at Macquarie University in Australia: https://aeon.co/ideas/would-you-rather-have-a-fish-or-know-how-to-fish

It touched on a theme that seemed eerily similar: how there may be a difference in the quality of the knowledge of Truth, depending upon how it was acquired. ‘Many are born into severe poverty with a slim chance at a good education, and others grow up in religious or social communities that prohibit certain lines of enquiry. Others still face restrictions because of language, transport, money, sickness, technology, bad luck and so on. The truth, for various reasons, is much harder to access at these times. At the opposite end of the scale, some are effectively handed the truth about some matter as if it were a mint on their pillow, pleasantly materialising and not a big deal. Pride in this mere knowledge of the truth ignores the way in which some people come to possess it without any care or effort, and the way that others strive relentlessly against the odds for it and still miss out.’

Each type is in possession of the same Truth, presumably, although in one case it is a gift and in the other, has required an effort to obtain it. It seems to me there is a difference, though: ‘the person ready to correct herself, courageous in her pursuit of the truth, open-minded in her deliberation, and driven by a deep curiosity has a better relationship to truth even where she occasionally fails to obtain it than does the indifferent person who is occasionally handed the truth on a silver platter.’

So, to my question about the itinerant daisy: does it possess the same intrinsic worth as one that has been purposely planted and nourished? Robinson, for his essay, puts the question slightly differently: ‘Is it better to know, or to seek to know?’ Both seem labyrinthine, and unanswerable -trivial, perhaps- largely because they are both perspectival.

So he rephrases the question in the form of a thought-experiment: ‘Would you rather have a fish or know how to fish?’ If having a fish is the result of knowing how to catch it, that is different from having to wait for someone who knows how to fish, and hoping she will actually give the one she caught to you.

Robinson feels it is the same with knowledge. An isolated fact (knowledge) may be valuable, but if you have learned how to acquire more knowledge, you are not limited to that one fact. It is, in fact, a type of synergism: knowledge plus the ability to add to it turns out to be better than the mere fact of knowledge on its own.

That accidental daisy growing by itself amongst the lettuce is still beautiful, but if it truly was an accident, that may or may not be the end of the line for it -especially if I don’t know how to care for it. It is, in that case, on its own. In fact, given its location, I may even think of it as an undesirable -a weed- and pull it out.

It does seem to suggest that the stray flower has a different value, a different essence, from a bed of cherished Gerbera daisies planted and growing contentedly in their assigned place. In a sense, it is no longer a flower -or, at any rate, not one that I treasure.

One question, though, inevitably leads to another: what is growing alongside the lettuce then…?

Light, seeking light, doth light of light beguile

I have to admit that I have always had trouble with arguments. I dislike confrontation, and whenever it occurs, I seem to get backed into a corner from which I am forced to lash out. Often, I feel that my very identity is at risk: how could any thinking person who was in tune with reality believe what I do? And if my argument is, in fact, wrong, then what does that say about my other opinions that we haven’t yet touched on? Disagreements suggest as much about me as they do about the positions I espouse.

I have had a life-long passion for Philosophy, and I know many of the drills. An argument is seen less as combat or an attempt to disparage the opponent, and more as an exercise in clarification and a search, perhaps, for common ground. So, one hears the opponent’s position and attempts to reword it to show it has been understood. If the opponent agrees that their opinion has been correctly grasped, then ideally, they can state why they disagree with what they’ve heard from me. And so it goes, back and forth -each position clarified and understood before either moves on. Not infrequently, commonalities emerge, and hopefully, the ability to reach some form of compromise begins to materialize.

The problem in most of our encounters, of course, is proceeding without one side being forced to lose face -without feeling that only one side can be correct, or, in the case of being proven incorrect, without feeling unheard. Why, in other words, did the side espousing Fake News, let us say, come to believe it? Shouting at them, or belittling them, is pretty well guaranteed to further entrench them in their views. We all do it, though -okay, I do, anyway.

Sometimes my way of seeing things seems so… obvious to me that I become infuriated with the expression on the other person’s face, or when they shrug, sigh, or even roll their eyes at my opinion. I suppose I don’t feel heard -no, I don’t feel respected…

I was dreading phoning a dear friend of mine who lives on the other side of the country. I hadn’t heard from her for a couple of months, and I wondered if there was something wrong. Since university, we’d always found ourselves on opposite sides of the political and ecological spectrum -we disagreed about almost everything, and so our emails had to be carefully worded; even with phone calls we had to tip-toe around many of the issues. Skype was especially problematic because I could read the frustration in her eyes, and the way she wrinkled her forehead, or clenched her teeth. I realize I probably did the same and that just amplified the conflict. And yet, each time, despite my determination to change, I usually found myself rerouted along the same trail we always seemed to travel.

I’m always looking for helpful hints and so I was drawn to an essay from Australia by Hugh Breakey, a research fellow at Griffith University in Queensland. I wondered if they did things differently in the antipodes. https://theconversation.com/actually-its-ok-to-disagree-here-are-5-ways-we-can-argue-better-121178

Argument is everywhere, he writes, but ‘Unfortunately, we often fail to consider the ethics of arguing. This makes it perilously easy to mistreat others.’ So, there are certain norms we should follow in an argument: ‘we should be open to their views. We should listen carefully and try to understand their reasoning. And while we can’t all be Socrates, we should do our best to respond to their thoughts with clear, rational and relevant arguments… norms are valuable because they promote knowledge, insight and self-understanding… being reasonable and open-minded ensures we treat our partners in argument in a consensual and reciprocal way. During arguments, people open themselves up to attaining worthwhile benefits, like understanding and truth.’ And, ‘obeying the norms of argument shows respect for our partners in argument as intelligent, rational individuals. It acknowledges they can change their minds based on reason.’

It was also encouraging to find that Breakey and I were on the same track. ‘Two arguers, over time, can collectively achieve a shared intellectual creation. As partners in argument, they define terms, acknowledge areas of shared agreement, and mutually explore each other’s reasons. They do something together.’

All fine and good, but sticking to that in the heat of battle has always been my problem. My heart may be in the right place, but my mouth is not. My mind tricks me into thinking my opponent is being illogical -it’s them, and not me, who’s failing to argue properly. So, to counter this, Breakey offers a few tips. Like, trying not to think I’m being attacked, and remembering that I don’t want to lose my opponent as a friend. I should treat them with respect, and not judge their argument (and hence them) as faulty; they may well be open to changing their views -I shouldn’t assume otherwise- and let’s face it, we may both be wrong…

I don’t know why, but I suddenly felt equipped to phone my friend. I can do this, I told myself when she answered.

“Are you phoning to lecture me on climate change again, G?”

Wow, that started early, I thought. My first reaction was to feel hurt, but I caught myself in time. “Well, actually, I wanted to know how you were doing. I haven’t heard from you in a while…”

That seemed to soften her voice. “Oh, that’s nice of you,” she said tenderly. “I would have let you know if I was sick, you know…” I breathed a bit easier. “But you usually only phone when you’ve thought of a new argument to try out on me,” she continued, her voice noticeably harder.

I had to think. Do I argue with that point, or ignore it? I decided to clarify her assertion. “Do you really think that’s why I phone?”

There was a pause at the end of the line. “It seems that way, G.”

I wasn’t sure whether I should become defensive, or agree with her and apologize. I decided on the middle road. “I guess I do come on a bit strong sometimes, don’t I?”

Another pause -she was obviously having difficulty deciding how to reply as well. She finally settled on “I know you mean well…”

Not a victory, but a white flag of sorts I suppose.

Then, “But I don’t think you can convince me, you know…”

Was she trying to say I was incapable of convincing her, or just that I hadn’t approached her the right way? “Well, maybe I can suggest…” was all I could think of to say before she interrupted me.

“Although that article you sent me a while back was certainly worth thinking about…”

“The one on renewables, you mean?”

“Mmm Hmm…” I could hear her breathing into her phone. “I’ve even decided to ride my bike to work.”

It seemed like a turning point. “That’s great, Melissa!” I thought I’d share in her decision. “Maybe I should do the same, eh?”

A friendly chuckle echoed through my phone. “You’re retired G… But maybe you could at least ride down to the store…”

We were friends again; maybe they really have figured out how to argue in Australia.

Are you really my friend?

There was something that Albert Camus, the Algerian-French philosopher, once wrote that has continued to inspire me since I first read it, so many years ago: “Don’t walk in front of me… I may not follow. Don’t walk behind me… I may not lead. Walk beside me… just be my friend.”

Friendship is a magical thing that is hard to define; it is like St. Augustine’s view of Time: you know what it is until someone asks. Poets, perhaps, with their metaphors come closest to capturing it -Shakespeare for example:

Those friends thou hast, and their adoption tried,
Grapple them unto thy soul with hoops of steel.

Or, the wisdom of Rumi, a 13th century Persian poet: ‘Friend, our closeness is this: anywhere you put your foot, feel me in the firmness under you.’

And even the humour of Oscar Wilde: ‘A good friend will always stab you in the front‘.

And yet, despite the feeling that its essence remains just at the tip of our tongues, there has always been an abiding faith in friendships, a trust that, to paraphrase Abraham Lincoln, ‘I destroy my enemies when I make them my friends’. In more modern times, however, the concept of ‘friend’ has undergone a not-so-subtle shift -everything from ‘friending’ people on social media, to online bullying, to trolling individuals for their putative beliefs, to unintended disclosure of confidences in internet postings.

So should a friend always bear his friend’s infirmities, as Cassius put it to Brutus in Shakespeare’s Julius Caesar? There was a time when the answer seemed obvious; now I am not so sure.

Quite by chance, I came across an essay by Leah Plunkett, an associate dean at the University of New Hampshire’s Franklin Pierce School of Law, which raised the question of whether friendship should be policed -whether it should remain a simple state of loyalty or, if declared, entail a legal obligation like, say, marriage. https://aeon.co/ideas/friendship-is-about-loyalty-not-laws-should-it-be-policed

The concept caught me totally by surprise. ‘Friendship is the most lawless of our close relationships,’ she writes. Somehow, the idea that there might even be a need of a legal framework for friendship seemed dystopian to me, so I read on.

‘Friends are tied to each other through emotions, customs and norms – not through a legally defined relationship, such as marriage or parenting, that imposes obligations. Anybody can become friends, we believe…  But with the advent of the digital domain, friendship has become more fraught. Online and off, we can share information about our friends without their permission and without legal restriction (except for slander and libel).’ But, of course, that means that ‘Information shared between friends can wind up being seen by people outside the friendship network who are not the intended audience…  confidences can inadvertently find their way to the public domain; all it takes is one careless email or the wrong privacy setting on a Facebook post.’

And there may even be legal consequences to what we or our friends have posted. ‘Digital social networks are already used to detain people trying to cross into the United States when statements by friends in their network are deemed by border agents to be suspicious or threatening.’ And, although most of us are aware that most social media platforms are collecting and selling our information, ‘Fewer recognise the third-party companies typically behind the scenes of our interactions, often using our information in unknown and uncontrollable ways in pursuit of their own goals.’

And yet, ‘Amid all this chaos, friendship itself remains unregulated. You don’t need a licence to become someone’s friend, like you do to get married. You don’t assume legal obligations when you become someone’s friend, like you do when you have a child. You don’t enter into any sort of contract, written or implied, like you do when you buy something.’ There’s no legal definition of ‘friend’, either.

But Plunkett has an interesting idea. Some U.S. states (like New Hampshire, her own) already have legal definitions of bullying: the state’s Pupil Safety and Violence Prevention Act (2000) spells out what bullying of primary and secondary school students would entail. She wonders whether its converse might be used to define friendship: instead of saying you can’t harm a peer or their property, a friend would be expected to support them, provide emotional comfort, and so on. And, ‘To engage in cyberfriendship, this behaviour would need to take place electronically.’ Interesting idea.

But, although promoting friendship -online or in person- is worthwhile, one clearly has to be careful about how rigorously it is applied. ‘If you could be punished for not being a friend rather than for being a bully, that would undermine the lawlessness that makes friendship so generative.’

And Plunkett feels one has to be particularly careful about this lawlessness. ‘As friendship becomes less lawless, [and] more guarded by cybersurveillance… it might also become less about loyalty, affinity and trust, and more about strategy, currency and a prisoner’s dilemma of sorts (‘I won’t reveal what I know about you if you don’t reveal it about me’).’

It seems to me, she is correct in suggesting that we would be unwise to imprison friendship in too tight a definition -we might find ourselves confined to stocks for punishment and public humiliation like misbehaving villagers in the 16th and 17th centuries. So, ‘Let’s keep paying our respects to those bonds of friendship that are lawless at heart, opening new frontiers within ourselves.’

And listen to the words of poets like Kahlil Gibran:

‘When your friend speaks his mind you fear not the “nay” in your own mind, nor do you withhold the “ay.”
And when he is silent your heart ceases not to listen to his heart;
For without words, in friendship, all thoughts, all desires, all expectations are born and shared, with joy that is unacclaimed.
When you part from your friend, you grieve not;
For that which you love most in him may be clearer in his absence, as the mountain to the climber is clearer from the plain.
And let there be no purpose in friendship save the deepening of the spirit.
For love that seeks aught but the disclosure of its own mystery is not love but a net cast forth: and only the unprofitable is caught.’

If only…

Look the other way, please.

There really are inconvenient truths, aren’t there? There are some things that seem to slip quietly under the radar -things that go unremarked until they are brought to our attention. And even then, they are perhaps dismissed as unimportant -or worse, accepted and rationalized in an attempt to justify them as tools that enable the greater good of humanity. We, after all, are what it’s all about; our welfare is paramount, not to mention our survival. And when you frame it in those terms, there is little room for noblesse oblige. Survival of the fittest quickly becomes survival of the ruthless -of the remorseless.

Perhaps I should explain. I live on a little hobby farm in the country, and when I was actively breeding sheep, chickens, and llamas, I was well acquainted with interested visitors, both two- and four-legged. Everybody, it seemed, had, or wanted, a stake in the game. Friends wanted eggs for their breakfasts, colleagues wanted lamb for their dinners, and I wanted an escape from the city. But to share with some was to share with all.

That’s how Life works, I suppose: word gets around, and soon there are all manner of uninvited guests -not all of whom knock, or ask permission. Some just appear -like carpenter ants- but some try not to advertise their arrival, and in fact seem to want to stay out of sight, if not out of mind. They’re the ones I used to worry about -if they’re in the barn, where else might they hide?

Of course I’m talking about rats -not so much the mice which kept my three cats busy in the night. No, the rats who hid in the engine of my pickup truck and ate the plastic off the wires to my distributor, or the battery wires in my car; the rats who patrolled the barn and left their distinctive trail through the uneaten bits of grain I fed the sheep; the rats who also holed up in the woodpile in my garage, and wherever else they could gather relatively undisturbed.

And yes, I declared war on them with spring traps baited with peanut butter, and put warfarin-like pellets in short, narrow little PVC pipes so the cats couldn’t get into them, but alas, the rats outlasted my efforts. Only when I retired and the chickens died in a well-fed old age, and only when I sold the sheep and llamas did the supply of grain eventually dry up -only then did the rats disappear. And I’ve never seen a rat, or droppings, since. It reminded me of the last stanza of Longfellow’s poem The Day is Done:

And the night shall be filled with music,
     And the cares, that infest the day,
Shall fold their tents, like the Arabs,
     And as silently steal away.

I know, I know -they’re only rats, but their leaving seemed so sudden; I came to think of them as having made a collective decision to move their troupe away to greener fields -sort of like the Travellers in Britain with their little trailers, able to leave when conditions are no longer hospitable for them. I suppose I Disneyfied them in my over-active imagination, and yet there was something about their migration that softened their attributes. I’ve never been fond of rats -especially their tails- but on the other hand I’ve always found it hard to believe all of the sinister lore attached to their sneaky habits. After all, they’ve lived with mankind and our middens from the beginning, I would imagine… and we’re both still here in spades. You have to assume a certain degree of intelligence to coexist with us for so long, despite our best efforts to exterminate them.

As these things happen, I tripped over a tantalizing essay co-written by Kristin Andrews, a professor of philosophy at York University in Toronto, and Susana Monsó, a post-doctoral fellow at the Messerli Research Institute in Vienna. https://aeon.co/essays/why-dont-rats-get-the-same-ethical-protections-as-primates

The first three sentences of the article hooked me: ‘In the late 1990s, Jaak Panksepp, the father of affective neuroscience, discovered that rats laugh. This fact had remained hidden because rats laugh in ultrasonic chirps that we can’t hear. It was only when Brian Knutson, a member of Panksepp’s lab, started to monitor their vocalisations during social play that he realised there was something that appeared unexpectedly similar to human laughter.’ And then, okay, they tickled them. ‘They found that the rats’ vocalisations more than doubled during tickling, and that rats bonded with the ticklers, approaching them more frequently for social play. The rats were enjoying themselves.’

Of course, there were some other findings that, if further substantiated, we likely don’t want to hear: ‘We now know that rats don’t live merely in the present, but are capable of reliving memories of past experiences and mentally planning ahead the navigation route they will later follow. They reciprocally trade different kinds of goods with each other – and understand not only when they owe a favour to another rat, but also that the favour can be paid back in a different currency. When they make a wrong choice, they display something that appears very close to regret.’ I’ve left the links intact, for reference, in case the reader’s credulity level sinks to the Fake News level.

But, for me at least, ‘The most unexpected discovery, however, was that rats are capable of empathy…  It all began with a study in which the rats refused to press a lever to obtain food when that lever also delivered a shock to a fellow rat in an adjacent cage. The rats would rather starve than witness a rat suffering. Follow-up studies found that rats would press a lever to lower a rat who was suspended from a harness; that they would refuse to walk down a path in a maze if it resulted in a shock delivered to another rat; and that rats who had been shocked themselves were less likely to allow other rats to be shocked, having been through the discomfort themselves.’

The reason the essay intrigued me, I’m sure, is that it has long been the practice to use rats (and mice, of course) as mindless fodder for our experimental quandaries. And there’s little question that it is better to experiment on an animal than on a human -and especially on a time-honoured nuisance and villain like a rat rather than on a chimpanzee, or whatever. I don’t think I would be prepared to argue against their utility for this, nor to deny that, until we have devised non-living alternatives -cell cultures, or AI modelling, perhaps- some things will require validation in functioning organisms to advance our knowledge for the benefit of the rulers (us).

My hope, however, is to point out that our hubris may tend to blind us to the increasing likelihood that rats are not mindless protoplasm living forever in the ‘now’ of their experiences. Are they sentient beings…? I suppose their sentience, like ours, is on a spectrum, isn’t it?

But if we are to continue to utilize them as unwitting research subjects, it seems to me that we should treat them with kindness and a degree of respect. Remember the words of Gloucester after he has been blinded by Cornwall, in Shakespeare’s King Lear: ‘As flies to wanton boys are we to the gods. They kill us for their sport.’ Let us not stoop to that…