In praise of an empty brain

How do I love thee, Age? Let me count the ways… Well, actually I’m not going to, because of late, I’ve fallen out with it. Perhaps it’s just my memory that’s failing, though: I was about to parody Shakespeare -it’s what I knew I knew, and yet I didn’t (it was Elizabeth Barrett Browning. I checked). A simple mistake, perhaps, and yet, once again, the hubris of my years led me along the wrong neurons. I feel embarrassed about it now, but suppose I had offered it to someone as a valid Shakespearean quotation and, out of respect for my age, I had not been contradicted? For that matter, what if I’d felt there was no need even to look it up?

Although I am now retired after a long career in medicine, people still ask for my opinion. I answer them, of course, but I do wonder if what I say is still up to date and correct. And as often as not, I will look up the answer when I get home. Whether it be age or temperament, the assumption of knowledge I do not possess sits poorly with me. Nowadays, I am far more likely to shrug and admit that I do not know the answer to the question asked -or at least admit that I am uncertain.

However, for an expert, I suppose it’s a matter of pride to speak with certainty, even if that confidence is apt to block, or even deride, other viewpoints. It seems to me that knowledge should never be a locked door -we can always learn by opening it from time to time.

Of course I have never been able to keep track of my keys, so I suppose I am particularly vulnerable. The other day while I was meandering through my apps, for example, I stumbled upon an intriguing essay in Psyche: https://psyche.co/guides/how-to-cultivate-shoshin-or-a-beginners-mind

The author, Christian Jarrett, a cognitive neuroscientist, writes that ‘The Japanese Zen term shoshin translates as ‘beginner’s mind’ and refers to a paradox: the more you know about a subject, the more likely you are to close your mind to further learning.’ He cites several historical examples of the inability to accept new findings, including one that promises my increasing years the hope of new clothes: ‘belief in the legendary Spanish neuroscientist Santiago Ramón y Cajal’s ‘harsh decree’ that adult humans are unable to grow new neurons persisted for decades in the face of mounting contradictory evidence.’

But of course, this is hardly confined to academia. Expertise in any field breeds hubris. ‘Merely having a university degree in a subject can lead people to grossly overestimate their knowledge… participants frequently overestimated their level of understanding, apparently mistaking the ‘peak knowledge’ they had at the time they studied at university for their considerably more modest current knowledge.’ In fact, ‘there is research evidence that even feeling like an expert also breeds closed-mindedness.’

Jarrett then suggests something obvious: ‘Approaching issues with a beginner’s mind or a healthy dose of intellectual humility can help to counter the disadvantages of intellectual hubris… being intellectually humble is associated with open-mindedness and a greater willingness to be receptive to other people’s perspectives.’

Good idea for sure, but how can a dyed-in-the-wool expert stoop to conquer their own hard-earned arrogance? One way that I thought was clever was ‘to make the effort to explain a relevant issue or topic to yourself or someone else in detail, either out loud or in writing. This exercise makes the gaps in our knowledge more apparent and bursts the illusion of expertise.’ It also makes me think of that famous quote from St. Augustine: ‘What then is time? If no one asks me, I know what it is. If I wish to explain it to him who asks, I do not know.’

But, I have to say there is another method that Jarrett suggests, almost as an addendum, that has to be my favourite: ‘deliberately invoking in oneself the emotion of awe. Several studies have shown that awe quietens the ego and prompts epistemological openness’. In other words, ‘Gaze at the night sky, take a walk in nature, or listen to a stirring symphony. Any activities that invoke in you the emotion of awe (wonder at the enormity and beauty of the world) will increase your feelings of humility, and inspire a more open-minded perspective.’

The essay reminded me of something that happened to me many years ago while I was in thrall to my freshly earned medical degree. Nancy, my date, and I had been invited to her uncle Arvid’s house for dinner, and I suppose I felt a little intimidated sitting across the table from a recently retired professor of history. I don’t know why I was uncomfortable. He was an absolutely delightful man who was so energetic when he spoke that his arms seemed to explode upward as if they were spring-loaded. He wore his snow-white hair long and each time his hands unfurled to make a point, a curly lock would roll onto his forehead and his eyes would twinkle in response as if he found the whole thing hilarious. Sophie, his wife, was all smiles as well, although she wore her hair short and could only resort to smoothing it out each time she laughed.

It was clear from the start that they wanted to put me at ease, and Arvid was careful that he didn’t seed his usually witty remarks with historical references; he didn’t even mention the university position he had held. But I wanted to let Arvid know that I, too, was interested in history and knew something about his area of specialization: the French Enlightenment. Well, actually I only started reading about it when Nancy told me about the dinner.

During a lull in the conversation when we were helping ourselves to dessert, I decided to make my move. “Is it true that the Little Ice Age may have played a part in the French Revolution, Arvid?” I asked, as casually as I could manage.

Arvid smiled at me as he scooped some strawberries from a bowl onto his plate. “Climate was probably a factor, G,” he replied pleasantly. “But all of Europe was affected by that as well.”

“I suppose I was drawing a bit of a parallel with current climate change issues -although certainly not an Ice Age…”

“You mean the effects that major climate shifts may have on political stability?” He seemed genuinely interested.

I nodded and took my turn with the strawberries. “I suspect that our crop yields may suffer as they did in France with the climatic upheavals of the time…” I left the sentence open so he would know I was only offering it as a possible result.

Arvid seemed to think about it as he scooped some ice cream on top of his plate of strawberries. “That’s an interesting comparison, G.” He took a tentative sample of the dessert and studied the spoon for a moment. “I must say we historians sometimes content ourselves with the proximate causes of events: endemic corruption, increased taxes, and the unaffordable price of bread in the case of the French Revolution…” He tasted the heaping spoonful and then attacked the dessert more seriously. “I think you have a point, G. I must look into that a bit more…” he added between bites, then glanced at his wife who had been largely silent so far.

Their eyes touched briefly and she smiled indulgently as she no doubt always did when hosting dinners for his many students over the years.

Historiognosis

When I was in school, history was just a series of strange and unfamiliar stories -some interesting, most forgettable. Of course, I recognize the irony in describing the effects of teaching methods that are now, themselves, historical, but I still wonder how decisions were made about which facts to focus on. The date of a battle, or the names of the generals who were killed are easily agreed upon, and yet what about things like the beginning of the Enlightenment, or when the Little Ice Age began and ended? They are surely more approximations -opinions- than known ‘facts’.

Again, when I was much younger and my leaves were still green and tender, it seemed that most, if not all, of the important historical figures were male, and by and large, European. Females, if they were mentioned at all, were like adjectives that added colour to their male nouns. There were, of course, exceptions -Hildegard von Bingen, the 12th century Abbess, polymath, and musical composer of sacred monophony is my favourite, I think- but only relatively recently have more historically important females been ‘discovered’. Heaven only knows how many more lie patiently awaiting exhumation.

And, let’s face it, even when I was in high school, Columbus was still felt to have ‘discovered’ America, much to the amused astonishment of the original inhabitants, no doubt. And Africa had never been considered to have hosted any civilizations worthy of the name, let alone exhibited any philosophical thinking, or theological profundities.

I suppose, for an interested but no doubt naïve amateur, it has always been the arbitrariness of the choices about what happened in the past, and often, the seemingly limited perspectives of the almost infinite number of possibilities available, that trouble me.

And yes, I understand that the sources from which conclusions are drawn may be unreliable, or reflect the biases of their creators (or historians), and I can imagine that even where the written documents may be clearly worded, their meanings are not fixed for all times. Societal norms and expectations also no doubt influence what was felt to be important to record. So, although I am intrigued by History, I am wary of any lessons it might purport to offer those of us alive today.

Still, I continue to be attracted to new analyses, and I remain curious about novel ways of approaching and evaluating the Past. So, it was with a frisson of excitement that I embarked upon the exploration of a rather complex essay that suggested there may be a more objective way of appraising history -a mathematical approach that is no doubt old-hat to professional historians, but new to uncredentialled and only part-time acolytes like myself. Amanda Rees, a historian of science in the department of sociology at the University of York, surveyed attempts to objectivize History, and bring it more in line with the Natural Sciences with their use of statistical analyses and the like: https://aeon.co/essays/if-history-was-more-like-science-would-it-predict-the-future

Rees ends her first paragraph with a question: ‘If big data could enable us to turn big history into mathematics rather than narratives, would that make it easier to operationalise our past?’ There have been several unsuccessful attempts to try something like this. For example, ‘In the 19th century, the English historian Henry Thomas Buckle used a broad-brush approach to the past in an effort to identify ‘natural laws’ that governed society… Buckle’s contemporary, the French positivist Auguste Comte, had earlier proposed his ‘law of three stages’ which styled human society as passing through ‘theological’ and ‘metaphysical’ stages, before arriving at a scientific self-understanding through which to build a better society.’

And then there was ‘the more elaborate social Darwinism of Herbert Spencer, who coined the phrase ‘survival of the fittest’.’ These views were an attempt to marry the organic nature of evolution to history, but unfortunately became embedded in the Zeitgeist of the time and strike us nowadays as distinctly racist.

Rees, however, spends considerable time explaining the views of Peter Turchin, who in 2010 was an ecologist at the University of Connecticut. ‘Why, Turchin wanted to know, were the efforts in medicine and environmental science to produce healthy bodies and ecologies not mirrored by interventions to create stable societies? Surely it was time ‘for history to become an analytical, and even a predictive, science’… he proposed a new discipline: ‘theoretical historical social science’ or ‘cliodynamics’ – the science of history.’

Of course, unlike objective attributes such as, say, temperature or infective processes, ‘‘historical facts’ are not discrete items that exist independently, awaiting scholars who will hunt them down, gather them up and catalogue them safely. They need to be created and interpreted. Textual archives might seem relatively easy to reproduce, for example, but, just as with archaeological digs, the physical context in which documents are found is essential to their interpretation: what groups, or items, or experiences did past generations value and record… What do the marginalia tell us about how the meanings of words have changed?’

A good example perhaps: ‘is it really possible to gauge subjective happiness by counting how many times words such as ‘enjoyment’ or ‘pleasure’ occur in the more than 8 million books digitised by Google?’ Or another: ‘a quantitative study of American slavery, in which Fogel [Robert Fogel, joint winner of the 1993 Nobel Prize in economic history] used plantation records to show that slavery was an economically efficient means of production, and to suggest that Southern slaves were better off than many Northern wage-earners.’ And yet, ‘plantation records didn’t fully capture the nature of slavery. How could they, when they were created by one group of humans at the expense of another?’

It is for this reason, among others, that ‘a positivist language of science – of testing hypotheses against data – sits uncomfortably with the practice of history.’ Are historical facts (a debatable term at best) really ‘things’? Turchin seemed to think so. ‘Inspired by the work of the American sociologist Jack Goldstone, who in the 1990s had tried to translate Alexis de Tocqueville’s philosophy into mathematical equations, Turchin began to relate population size to economic output (and, critically, levels of economic inequality) as well as social and political instability… Social structure, for example, could be treated as a product of health and wealth inequality – but to measure either, you need to choose approximate and appropriate proxies. The process was further complicated by the fact that, when you’re working with a chronology that spans millennia, these proxies must change over time. The texture of that change might be qualitative as well as quantitative.’ You get the idea: apples and oranges.

And anyway, what makes anybody believe that the Natural Sciences (which Turchin was trying to emulate) are actually capable of producing ‘objective knowledge’? ‘Half a century of research in the history of science has shown that this perspective is deeply flawed. The sciences have their own history – as indeed does the notion of objectivity – and that history is deeply entwined with power, politics and, importantly, the naturalisation of social inequality by reference to biological inferiority. No programme for understanding human behaviour through the mathematical modelling of evolutionary theory can afford to ignore this point… meta-analyses and abstract mathematical models depend, by necessity, on manipulating data gathered by other scholars, and a key criticism of their programme is that cliodynamicists are not sensitive to the nuances and limitations of that data.’

By this stage in the essay, I was not only confused, I was also disappointed. But, could I really expect an answer to the arbitrariness of historical data when even my brother and I, in describing an event from our shared childhood, can never agree on what really happened? It’s all perspective in the end; as Nietzsche said, ‘There are no facts, only interpretations.’

And anyway, my brother doesn’t know what he’s talking about…

I long to hear the story of your life

I like the idea that I am a story which I am still writing. After all, there seems to be a rambling kind of direction to it, and if pressed, I could likely invent a plot. Of course, until the final page, nobody -not even me- really knows how it’s going to turn out, but that’s the beauty of it. The intrigue of it -a story in which we’re kept guessing until the very end.

I suppose, though, any story admits of a certain amount of literary licence, and a life story is definitely no exception -just look at many social media postings where fact and fiction intermingle like roots from a pot-bound plant. It’s hard to know what to do with stories like that; constructed in that way, are they still stories of who the narrators actually are? By altering the way we tell our tales, do we also alter ourselves?

Because our memories are imperfect reproductions of what actually happened, and because the way we sift through them may favour -or reject- the more remarkable ones, do we have a story at all? Could our lives be nothing but random hodgepodges of remembered events on flash-cards we hold up when asked to tell someone about ourselves? And, even when organized in some semblance of sequential chronology, is it a story, or a fantasy?

Sometimes the only thing that stitches the narrative together is the fact that it all seems to have happened to that fraying thread of continuity we identify as ourselves. And yet, an old black and white photograph of me as a child seems not to be the same ‘me’ as the one I feel I am now. I have to believe the writing on the back that identifies it as me, and yet… and yet I do not remember it being taken. That child is not me anymore -there are so many gaps I cannot fill. So how do I construct a narrative that includes him? Am I his story, or is he mine?

I had almost forgotten about the idea of Homo narrans when I happened upon a counter-argument in an essay by Galen Strawson, a British analytic philosopher from the University of Texas at Austin: https://aeon.co/essays/let-s-ditch-the-dangerous-idea-that-life-is-a-story

After reciting a few examples of sundry philosophers and savants who subscribe to the narrative approach, Strawson writes, ‘I think it’s false – false that everyone stories themselves, and false that it’s always a good thing.’ I have to say that it spurred me to read on; I wondered how he would attempt to assail the now-conventional wisdom of the narrativists.

He firmly rejects this thesis. ‘It’s not just that the deliverances of memory are, for us, hopelessly piecemeal and disordered, even when we’re trying to remember a temporally extended sequence of events. The point is more general. It concerns all parts of life, life’s ‘great shambles’, in the American novelist Henry James’s expression. This seems a much better characterisation of the large-scale structure of human existence as we find it. Life simply never assumes a story-like shape for us.’

Strawson seems to feel that identity through narrative is an almost desperate attempt to seize control of the course of one’s life: to develop an autobiographical narrative to act as a lens through which we experience the world. He quotes McAdams, a leading narrativist among social psychologists: ‘Beginning in late adolescence and young adulthood, we construct integrative narratives of the self that selectively recall the past and wishfully anticipate the future to provide our lives with some semblance of unity, purpose, and identity.’

And then, before we decide that describes our ‘story’, Strawson counters with a quote from the British author W Somerset Maugham: ‘I recognise that I am made up of several persons and that the person that at the moment has the upper hand will inevitably give place to another. But which is the real one? All of them or none?’ I suspect this was an attempt to muddy the water, but I pressed on.

I get the impression that Strawson identifies most with the 16th century philosopher and writer Michel de Montaigne, who is mainly remembered for his prodigious output of essays. Montaigne was also known for his memory lapses, and so it is no surprise that, as he writes in his essay ‘Of Liars’ (1580), ‘I can find hardly a trace of [memory] in myself… I doubt if there is any other memory in the world as grotesquely faulty as mine is!’ And why is that important for Strawson?

Because ‘Poor memory protects him [Montaigne] from a disagreeable form of ambition, stops him babbling, and forces him to think through things for himself because he can’t remember what others have said… To this we can add the point that poor memory and a non-Narrative disposition aren’t hindrances when it comes to autobiography in the literal sense – actually writing things down about one’s own life. Montaigne is the proof of this, for he is perhaps the greatest autobiographer, the greatest human self-recorder… Montaigne writes the unstoried life – the only life that matters… He knows his memory is hopelessly untrustworthy, and he concludes that the fundamental lesson of self-knowledge is knowledge of self-ignorance.’

Not to make too great a point of a faulty memory being a helpful argument against the narrative of life, though, Strawson borrows from a comment on a book by the American novelist James Salter: ‘Salter strips out the narrative transitions and explanations and contextualisations, the novelistic linkages that don’t exist in our actual memories, to leave us with a set of remembered fragments, some bright, some ugly, some bafflingly trivial, that don’t easily connect and can’t be put together as a whole, except in the sense of chronology, and in the sense that they are all that remains.’ In other words, I suppose, self-knowledge comes mostly in bits and pieces that we fit together as best we can.

I still feel that life is a narrative, though -a series of events, however cobbled together, however rambling, that tell a story. Like an archaeologist sifting through fragments from a midden, we piece together seemingly disparate items and random shards until we have a chronology that makes sense to us, an idea about what was happening and when. Our lives are not so much describable in accurately numbered episodes as they are in contextually sorted potsherds found in the dust we have been walking through for years; we’ve organized them in a way that makes sense to us. So, our lives are only as colourful as the fragments we have stooped to gather.

Shouldn’t that tell us something…?

Why is Wonder?

Sometimes I am accosted by the strangest questions; they remind me of the unanswerable ‘why’ questions that so often bubble out of three-year-olds -the only difference, I suppose, is that I would no longer be satisfied with the unadorned ‘just because’ answers I’m sure I used to get from my frustrated parents.

But it seems to me that once I retired from the practice of Medicine and was no longer required to answer other people’s questions, it was inevitable that I would have to ask some tricky ones of my own -questions I would be forced to answer for myself. And then I realized that I was no longer the authority figure I once thought I was, because the questions I ended up asking were more abstract. Less likely to admit of an easy solution.

Perhaps that’s what Retirement is designed for, though: imponderables. It is close to the denouement of a life and any loose threads of the clothes you once wore are carefully trimmed away and cleaned in preparation for the final scene.

My most recent attempt at cutting away loose strands concerned wonder. Why is wonder? I asked myself one evening, as I sat on a beach watching the cooling sun slowly sinking into the ocean. But when no answers surfaced and I turned to leave, a rim of clouds still glowed orange on the horizon like embers on a dying campfire and I found I couldn’t move. Why did I find myself so filled with awe, so taken with a scene I must have admired in countless versions in magazines or coffee-table books? Why did the experience stop time? Why did I hold my breath? And, what purpose could it possibly serve?

Different from mere beauty, which is shallow in comparison, wonder seems more spiritual -more intense- and less describable in words. It is a feeling, a wordless experience of awe. And yet, ineffable as it may be, is it just a curiosity, or more like an emergent phenomenon: a synergism of factors each of which, taken by itself, is not at all unique? And if so, why would their collective presence have such a major effect on me?

I am far from a religious person, but I am reminded of an idea I found somewhere in the writings of the philosopher/theologian Paul Tillich, years ago. I was wondering what prayer was, apart from an attempt to request a favour from whatever deity you had been taught was in charge. It seemed to me unlikely that words alone could establish any real communication with the god; more was required to make prayers feel like they were being heard. If I understood Tillich, he seemed to imply that prayer was -or should be- like the ephemeral, but almost overwhelming sense of awe felt on seeing, say, an array of burning clouds still bathing in the light of the sun setting behind a now-silhouetted mountain. An unknitted communion…

Is that, then, what wonder is: an unintended prayer…? An unasked question? I happened across an essay by Jesse Prinz, a professor of philosophy at the City University of New York, who attempted an examination of wonder, and felt that it shared much in common with Science, Religion, and even Art: https://aeon.co/essays/why-wonder-is-the-most-human-of-all-emotions

The experience of wonder, he writes, seems to be associated with certain bodily sensations which ‘point to three dimensions that might in fact be essential components of wonder. The first is sensory: wondrous things engage our senses — we stare and widen our eyes… The second is cognitive: such things are perplexing because we cannot rely on past experience to comprehend them. This leads to a suspension of breath, akin to the freezing response that kicks in when we are startled… Finally, wonder has a dimension that can be described as spiritual: we look upwards in veneration.’

That may be the ‘what’ of wonder -its component parts- but not the ‘why’; his explanation seemed more like a Bayeux Tapestry chronicling events than an account of why they existed in the first place. I dug deeper into the essay.

Prinz goes on to mention the ‘French philosopher René Descartes, who in his Discourse on the Method (1637) described wonder as the emotion that motivates scientists to investigate rainbows and other strange phenomena.’ Their investigations, in turn, produce knowledge. ‘Knowledge does not abolish wonder; indeed, scientific discoveries are often more wondrous than the mysteries they unravel. Without science, we are stuck with the drab world of appearances.’ Fair enough -Science can not only be motivated by wonder, it also contributes to it.

‘In this respect,’ Prinz writes, ‘science shares much with religion… like science, religion has a striking capacity to make us feel simultaneously insignificant and elevated.’ Awe, an intense form of wonder, makes people feel physically smaller than they are. ‘It is no accident that places of worship often exaggerate these feelings. Temples have grand, looming columns, dazzling stained glass windows, vaulting ceilings, and intricately decorated surfaces.’

Art, too, began to partake of the sublime, especially when it parted company from religion in the 18th century. ‘Artists began to be described as ‘creative’ individuals, whereas the power of creation had formerly been reserved for God alone.’ Artists also started to sign their paintings. ‘A signature showed that this was no longer the product of an anonymous craftsman, and drew attention to the occult powers of the maker, who converted humble oils and pigments into objects of captivating beauty, and brought imaginary worlds to life.’

Interestingly, then, ‘science, religion and art are unified in wonder. Each engages our senses, elicits curiosity and instils reverence. Without wonder, it is hard to believe that we would engage in these distinctively human pursuits.’ Mere survival does not require any of these, and yet they exist. ‘Art, science and religion are all forms of excess; they transcend the practical ends of daily life. Perhaps evolution never selected for wonder itself… For most of our history, humans travelled in small groups in constant search for subsistence, which left little opportunity to devise theories or create artworks. As we gained more control over our environment, resources increased, leading to larger group sizes, more permanent dwellings, leisure time, and a division of labour. Only then could wonder bear its fruit. Art, science and religion reflect the cultural maturation of our species.’

‘For the mature mind, wondrous experience can be used to inspire a painting, a myth or a scientific hypothesis. These things take patience, and an audience equally eager to move beyond the initial state of bewilderment. The late arrival of the most human institutions suggests that our species took some time to reach this stage. We needed to master our environment enough to exceed the basic necessities of survival before we could make use of wonder.’

Maybe, then, that is the answer to the ‘why’ of wonder. Perhaps it’s a fortunate -some might say providential- byproduct of who we are. Not inevitable, by any means, and not meant for any particular purpose, and yet, however accidental, it was the spur that pricked the sides of our dreams, to paraphrase Shakespeare’s Macbeth.

I’m not so sure it’s even that complicated, though. It seems to me that wonder is more of an acknowledgement than anything else: an acknowledgement that we are indeed a part of the world around us; a tiny thread in a larger pattern. And every once in a while, when we step back, we catch a glimpse of the motif and marvel at its complexity. It is, then, an acknowledgement of gratitude that we are even a small part of it all… Yes, a prayer, if you will.