Why is Wonder?

Sometimes I am accosted by the strangest questions; they remind me of the unanswerable ‘why’ questions that so often bubble out of 3-year-olds -the only difference, I suppose, is that I would no longer be satisfied with the unadorned ‘just because’ answers I’m sure I used to get from my frustrated parents.

But it seems to me that once I retired from the practice of Medicine and was no longer required to answer other people’s questions, it was inevitable that I would have to ask some tricky ones of my own -questions I would be forced to answer for myself. And then I realized that I was no longer the authority figure I once thought I was, because the questions I ended up asking were more abstract. Less likely to admit of an easy solution.

Perhaps that’s what Retirement is designed for, though: imponderables. It is close to the denouement of a life and any loose threads of the clothes you once wore are carefully trimmed away and cleaned in preparation for the final scene.

My most recent attempt at cutting away loose strands concerned wonder. Why is wonder? I asked myself one evening, as I sat on a beach watching the cooling sun slowly sinking into the ocean. But when no answers surfaced and I turned to leave, a rim of clouds still glowed orange on the horizon like embers on a dying campfire and I found I couldn’t move. Why did I find myself so filled with awe, so taken with a scene I must have admired in countless versions in magazines or coffee-table books? Why did the experience stop time? Why did I hold my breath? And, what purpose could it possibly serve?

Different from mere beauty, which is shallow in comparison, wonder seems more spiritual -more intense- and less describable in words. It is a feeling, a wordless experience of awe. And yet, ineffable as it may be, is it just a curiosity, or more like an emergent phenomenon: a synergism of factors each of which, taken by itself, is not at all unique? And if so, why would their collective presence have such a major effect on me?

I am far from a religious person, but I am reminded of an idea I found somewhere in the writings of the philosopher/theologian Paul Tillich, years ago. I was wondering what prayer was, apart from an attempt to request a favour from whatever deity you had been taught was in charge. It seemed to me unlikely that words alone could establish any real communication with the god; more was required to make prayers feel like they were being heard. If I understood Tillich, he seemed to imply that prayer was -or should be- like the ephemeral, but almost overwhelming sense of awe felt on seeing, say, the array of burning clouds still bathed in the light of a sun setting behind a now-silhouetted mountain. An unknitted communion…

Is that, then, what wonder is: an unintended prayer…? An unasked question? I happened across an essay by Jesse Prinz, a professor of philosophy at the City University of New York, who attempted an examination of wonder, and felt that it had much in common with Science, Religion, and even Art: https://aeon.co/essays/why-wonder-is-the-most-human-of-all-emotions

The experience of wonder, he writes, seems to be associated with certain bodily sensations which ‘point to three dimensions that might in fact be essential components of wonder. The first is sensory: wondrous things engage our senses — we stare and widen our eyes… The second is cognitive: such things are perplexing because we cannot rely on past experience to comprehend them. This leads to a suspension of breath, akin to the freezing response that kicks in when we are startled… Finally, wonder has a dimension that can be described as spiritual: we look upwards in veneration.’

That may be the ‘what’ of wonder -its component parts- but not the ‘why’; his explanation seemed more of a Bayeux Tapestry chronicling events, not why they existed in the first place. I dug deeper into the essay.

Prinz goes on to mention the ‘French philosopher René Descartes, who in his Discourse on the Method (1637) described wonder as the emotion that motivates scientists to investigate rainbows and other strange phenomena.’ Their investigations, in turn, produce knowledge. ‘Knowledge does not abolish wonder; indeed, scientific discoveries are often more wondrous than the mysteries they unravel. Without science, we are stuck with the drab world of appearances.’ Fair enough -Science can not only be motivated by wonder, it also contributes to it.

‘In this respect,’ Prinz writes, ‘science shares much with religion… like science, religion has a striking capacity to make us feel simultaneously insignificant and elevated.’ Awe, an intense form of wonder, makes people feel physically smaller than they are. ‘It is no accident that places of worship often exaggerate these feelings. Temples have grand, looming columns, dazzling stained glass windows, vaulting ceilings, and intricately decorated surfaces.’

Art, too, began to partake of the sublime, especially when it parted company from religion in the 18th century. ‘Artists began to be described as ‘creative’ individuals, whereas the power of creation had formerly been reserved for God alone.’ Artists also started to sign their paintings. ‘A signature showed that this was no longer the product of an anonymous craftsman, and drew attention to the occult powers of the maker, who converted humble oils and pigments into objects of captivating beauty, and brought imaginary worlds to life.’

Interestingly, then, ‘science, religion and art are unified in wonder. Each engages our senses, elicits curiosity and instils reverence. Without wonder, it is hard to believe that we would engage in these distinctively human pursuits.’ Mere survival does not require any of these, and yet they exist. ‘Art, science and religion are all forms of excess; they transcend the practical ends of daily life. Perhaps evolution never selected for wonder itself… For most of our history, humans travelled in small groups in constant search for subsistence, which left little opportunity to devise theories or create artworks. As we gained more control over our environment, resources increased, leading to larger group sizes, more permanent dwellings, leisure time, and a division of labour. Only then could wonder bear its fruit. Art, science and religion reflect the cultural maturation of our species.’

‘For the mature mind, wondrous experience can be used to inspire a painting, a myth or a scientific hypothesis. These things take patience, and an audience equally eager to move beyond the initial state of bewilderment. The late arrival of the most human institutions suggests that our species took some time to reach this stage. We needed to master our environment enough to exceed the basic necessities of survival before we could make use of wonder.’

Maybe, then, that is the answer to the ‘why’ of wonder. Perhaps it’s a fortunate -some might say providential- byproduct of who we are. Not inevitable, by any means, and not meant for any particular purpose, and yet, however accidental, it was the spur that pricked the sides of our dreams, to paraphrase Shakespeare’s Macbeth.

I’m not so sure it’s even that complicated, though. It seems to me that wonder is more of an acknowledgement than anything else: an acknowledgement that we are indeed a part of the world around us; a tiny thread in a larger pattern. And every once in a while, when we step back, we catch a glimpse of the motif and marvel at its complexity. It is, then, an acknowledgement of gratitude that we are even a small part of it all… Yes, a prayer, if you will.

Is Seeing Believing?

Isn’t it interesting that some of us can look at a forest and miss the wind riffling through the leaves, while others see the moon as a ‘ghostly galleon tossed upon cloudy seas’? What determines what we see? Does it have to relate to something we’ve seen before -patterns that we recognize? Is our apprehension of reality an expectation? A sorting through the chaos and discarding what we don’t understand -the noise- until something more familiar emerges? Why do we not all see the same thing?

If patterns are what we are evolved to see, if they are what we use to make sense of the world, are there always patterns everywhere? These are things I wonder about, now that I have time to wonder. Now that I am retired, I suppose I can wade more thoughtfully into the turbulence I once found swirling about my days. Clarity is certainly not a common property of old age, but occasionally it descends as softly as a gossamer thread, and then as quickly drifts away leaving only traces of its presence. Doubts about its visit.

Are these mere hints of what the gifted see? Is peering beyond the horizon just a gift, or is it fleeting and unstable unless learned? There was an interesting essay in Aeon, an online offering that touched on the subject of insightful examination, by Gene Tracy, the founding director of the Center for the Liberal Arts at William and Mary in Williamsburg, Virginia: https://aeon.co/essays/seeing-is-not-simple-you-need-to-be-both-knowing-and-naive

‘When Galileo looked at the Moon through his new telescope in early 1610, he immediately grasped that the shifting patterns of light and dark were caused by the changing angle of the Sun’s rays on a rough surface. He described mountain ranges ‘ablaze with the splendour of his beams’, and deep craters in shadow as ‘the hollows of the Earth’. […] Six months before, the English astronomer Thomas Harriot had also turned the viewfinder of his telescope towards the Moon. But where Galileo saw a new world to explore, Harriot’s sketch from July 1609 suggests that he saw a dimpled cow pie.’ And so, the question must be asked, ‘Why was Galileo’s mind so receptive to what lay before his eyes, while Harriot’s vision deserves its mere footnote in history?’ But, as the author notes, ‘Learning to see is not an innate gift; it is an iterative process, always in flux and constituted by the culture in which we find ourselves and the tools we have to hand. […] the historian Samuel Y Edgerton has argued that Harriot’s initial (and literal) lack of vision had more to do with his ignorance of chiaroscuro – a technique from the visual arts first brought to full development by Italian artists in the late 15th century. By Galileo’s time, the Florentines were masters of perspective, using shapes and shadings on a two-dimensional canvas to evoke three-dimensional bodies in space. […] Harriot, on the other hand, lived in England, where general knowledge of these representational techniques hadn’t yet arrived. The first book on the mathematics of perspective in English – The Art of Shadows by John Wells – appeared only in 1635.’

But is it really as fortuitous as that? As temporally serendipitous? Tracy makes the point that, at least in the case of Science, observations are ‘often complex, contingent and distributed.’ And, ‘By exploring vision as a metaphor for scientific observation, and scientific observation as a kind of seeing, we might ask: how does prior knowledge about the world affect what we observe? If prior patterns are essential for making sense of things, how can we avoid falling into well-worn channels of perception? And most importantly, how can we learn to see in genuinely new ways?

‘Scientific objectivity is the achievement of a shared perspective. It requires what the historian of science Lorraine Daston and her colleagues call ‘idealisation’: the creation of some simplified essence or model of what is to be seen, such as the dendrite in neuroscience, the leaf of a species of plant in botany, or the tuning-fork diagram of galaxies in astronomy. Even today, scientific textbooks often use drawings rather than photographs to illustrate categories for students, because individual examples are almost always idiosyncratic; too large, or too small, or not of a typical colouration. The world is profligate in its variability, and the development of stable scientific categories requires much of that visual richness to be simplified and tamed. […] So, crucially, some understanding of the expected signal usually exists prior to its detection: to be able to see, we must know what it is we’re looking for, and predict its appearance, which in turn influences the visual experience itself.’

‘If the brain is a taxonomising engine, anxious to map the things and people we experience into familiar categories, then true learning must always be disorienting. […] Because of the complexity of both visual experience and scientific observation, it is clear that while seeing might be believing, it is also true that believing affects our understanding of what we see. The filter we bring to sensory experience is commonly known as cognitive bias, but in the context of a scientific observation it is called prior knowledge. […] If we make no prior assumptions, then we have no ground to stand on.’

In his opinion, there is a thrust and parry between learning to see and seeing to learn. I have no trouble with that, but I have to say that Science is only one Magisterium in a world of several. Science is neither omniscient nor omnispective.

I happened across a friend standing transfixed in the middle of a trail in the woods the other day. A gentle breeze was coaxing her hair across her face, but her eyes were closed and she was smiling as if she had just been awarded an epiphany.

At first I wondered if I should try to pass her unannounced, but I suppose she heard my approach and glanced at me before I had made up my mind. Her eyes fluttered over my face for a moment, like birds investigating a place to perch, then landed as softly as a whisper on my cheek.

“I… I’m sorry, Mira,” I stammered, as surprised by her eyes as her expression. “You looked so peaceful, I didn’t want to disturb you…”

Her smile remained almost beatific, rapturous, but she recalled her eyes to brief them for a moment before returning them to me. “I was just listening to that bird,” she said and glanced into the thick green spaces between the trees to show me where, “when I felt the breeze…” I have to say, I hadn’t noticed anything -I hadn’t even heard the bird. “…And it touched my forehead like a kiss,” she said, and blushed for describing it like that. She closed her eyes and thought about it for a moment. “I can’t think of another word,” she added, and slowly walked away from me with a wink, onto a nearby path.

I don’t think that what she was saying was Science, or even meant to require a proof, and yet I felt far better knowing there are people like her in my world. I think I even felt a brief nuzzle by the wind as I watched her disappear into the waiting, excited fondle of the leaves.
Cycling in a Dish

Where do they get this stuff? Menses in a dish –or, to be more academically abstruse, ‘A microfluidic culture model of the human reproductive tract and 28-day menstrual cycle’?

It has a pedantic ring to it, even though it doesn’t exactly roll off the tongue, but I have to ask a simple, quasi-lay interrogative: why? Critical Thinking 101 -at least as they used to teach it- would demand to know why it is important that we make this model. If the answer is vague, or even unnecessarily complicated, then one begins to suspect academic foppery.

The model is unique -I’ll give them that. Let me refer to a succinct description of the model from a BBC News article (http://www.bbc.com/news/health-39421396): ‘The 3D model is made up of a series of cubes that each represent the different parts of the female reproductive system.

‘Each cube contains collections of living cells from the respective bits of this system – fallopian tubes, uterus, cervix and vagina (all human cells), and the ovaries (taken from mice).

‘The cubes are connected together with small tubes, which allow special fluid to flow through the entire system, much like blood.

‘This also means the “mini organs” can communicate with each other using hormones, mimicking what happens in a woman’s body during a “typical” 28-day human menstrual cycle.

‘Tests suggested that the tissues in the system responded to the cyclical ebb and flow of hormones, in a similar way to those of the female body.’

At first, I have to admit, I was skeptical of the need for the model -and yes, I was inclined to see it as foppery. But when I actually read the paper (http://www.nature.com/articles/ncomms14584) I was impressed. Why not just obtain the same cells from, say, hysterectomy specimens, grow them, and subject them to whatever hormones or chemicals you pick to study? It quickly became evident that there are two types of scientists: clinical and laboratory -as a practicing doctor, I’m afraid I fall squarely into the clinical end, and never the twain shall meet… The reason became all too clear as I continued to read.

First of all, as they explain in the introduction, ‘Preclinical studies often begin with individual cells, separated from cellular and physical contacts that are important for biological function. These dispersed cells must be propagated through weekly reduction divisions and maintained on flat plastic; however, these cells are missing the cell physicochemical microenvironment, three-dimensional (3D) tissue-specific architecture, and blood flow perfusion found in natural tissues.’ In other words, they take a long time to grow and aren’t subject to the same environment they would have in the body (i.e. in vivo).

Secondly, what is used to perfuse them may not have the same effect as the blood circulation they would receive in vivo: ‘typical media composition is based on basal nutrients, bovine serum and a few specialized factors that are placed in a static setting with random mixing. As a consequence, cell–cell and tissue-level cytokine and endocrine signals are not integrated into signalling pathways.’

So not only can delicate organs be assessed as if they were still in the body (like the rhythmic beating of the hair cells -cilia- that line the Fallopian tubes), but the effects of different pharmaceuticals can be safely tested in the model before expensive clinical trials are undertaken. ‘Despite large investments in research funding, only ∼8% of drugs for which Investigational New Drug applications have been filed will be approved by the FDA.’

I have to say that I am intrigued, and not a little embarrassed that I was put off by the title of the article. Perhaps to expiate that guilt a little, I mentioned it to Ted, an -I hesitate to say ‘older’- colleague of mine whom I met for lunch the other day.

We are both retired now so, apart from our similar past histories, talk heads more towards hobbies and bowels than to scientific literature. I’m not even sure how it came up, except that his niece was trying -unsuccessfully, it seems- to conceive and his sister had phoned him for advice. She had mentioned several chemicals she’d discovered in an online search -both as potential therapies and as putative endocrine disruptors. She wanted to know what Ted thought.

“So what do you think?” he asked me as he carefully cut his hamburger in quarters so he could manage the entry problems. “You retired a couple of years after me…” he added, as if that meant that I was a couple of years smarter than him –or at least a bit more up to date.

“Ted, you know as well as I do that it’s dangerous to speculate on the effects of chemicals on human pregnancies. There are too many variables, too many potential effects that may not show up till birth, or even later…”

He nodded. “I remember DES for threatened first-trimester miscarriages and the later damage to reproductive anatomy -not to mention the clear-cell cancers of the vagina that didn’t show up till years later…”

“Or thalidomide for morning sickness in pregnancies…” I added solemnly.

He nodded thoughtfully and, after carefully picking out shreds of lettuce that were hiding under the bun, attempted to cram a freshly-cut piece of the hamburger into his waiting mouth, but it leaked somewhere south of his lips. “They can’t even design a portable hamburger,” he said, unsure whether to laugh or blush. He wiped his face, picked up the meat, and then searched for more hidden pieces of lettuce and placed them on his plate.

When he noticed me watching his lettuce hunt, he shrugged. “I’ve always hated lettuce,” he explained. “So does my niece for that matter. Probably hereditary,” he mumbled and then licked his fingers, contritely. “I mean, look,” he said with a crooked smile on his face, “I solved this problem using in vitro techniques.” And he held up his now-clean fingers and licked his lips in embarrassed satisfaction. “There’s gotta be a way to test these drugs without endangering a pregnancy. Mice and rats are not humans and you can’t safely extrapolate the results to us.”

“Amen,” I said, shaking my head sadly. We’d had similar discussions over the years, so I suspect he was merely venting his frustrations again.

We sat in silence for a moment while he tried to improve his in vitro technique, as he insisted on calling it.

“You know, Ted,” I said after it looked as if his practice runs at the lettuce were failing, “I did read about an in vitro way being developed for safely testing things in women. It was in the journal Nature and entitled something like ‘Microfluidic culture model of the human reproductive tract’. I’m not sure they could adapt it to pregnancies, but maybe it might be helpful in seeing what effects drugs or other chemicals could have on reproductive cells in the events leading up to a pregnancy…” I tried to remember more. “And it seems to me they were even able to replicate a 28-day menstrual cycle…”

His eyebrows shot up, and then one stayed elevated, as if scotch-taped to his forehead. “Micro fluidic what…? My god, is your retirement that bad?”

I think I blushed –he made it sound as if I’d been caught downloading porn. “Uhmm, it’s called ‘body on a chip’, or something –or at least this part is…”

He suddenly turned his attention to the chips still languishing on his plate and smiled. “Thanks for the excuse,” he said. Obviously his wife or maybe Chrissie, his sister, had warned him not to order them, let alone eat them.

“Yeah, I gather that the idea, eventually, is to be able to take cells from somebody who needs a particular treatment and test those cells in the model with various chemicals to see what happens.” It seemed quite exciting to me.

“Mmmh,” he said, polishing off the last of his fries, while the puzzled lettuceal remnants stared at him from the plate. “I don’t think I can tell Chrissie about it, though.”

I looked up from my napkin and cocked my head. “Why not, Ted? Wouldn’t she find it interesting?”

He rested his head on his hands and peered at me as if he were looking over a pair of bifocals like a professor with a sleepy student. “She’d be on Craigslist trying to buy one as soon as she got off the phone… In fact she’d probably try for a matching set –a male one as well as a female one,” he added when I stared at him.

I must have still looked puzzled, because he smiled and picked up a piece of lettuce. “Chrissie’s a dietician and she thinks her daughter has a terrible diet… and she certainly remembers mine,” he explained, shaking his head. “She’d be trying to feed this to the model to prove her point.” He sighed. “I’d never hear the end of it…”

I suddenly realized I’d not thought about it that deeply. He’s right: maybe the ‘body on a chip’ thing needs a bit more work before they advertise it on Facebook…

Unregarded Age in Corners Thrown

I worry too much; I didn’t use to, but it kind of crept up on me along with my aches and pains over the years. Age is something that has always been fraught with tensions as we stumble through the calendar, first wanting more, then less, and then, I suppose, trying to forget about it altogether -ignore it when it clearly needs to be addressed. Demands recognition.

Age -especially old age- is one of those concepts that is very much contextually driven. Age-driven, in fact: where one sits on the spectrum very much influences age perception. An ‘elder’ would be granted many fewer years if it is a teenager, rather than a senior, who is canvassed.

But a good case can be made that age is not a mono-dimensional concept. Chronology does not come in one flavour; not all eighty-year-olds, say, are tied to the same constraints. Age might better be considered as a quartet, the other members being Biological -we all age differently; Psychological -some aged people retain their faculties better than others; and Social -some elderly people, whether by fiat or necessity, no longer work outside their homes and are no longer as connected to social networks. Indeed, in times past, ‘old’ might well have been related to usefulness rather than chronology.

http://www.bbc.com/news/magazine-34465190

So there we have it: usefulness. Purpose! Self-worth. All contingent concepts to be sure. And retirement, despite the positive connotations that Society has tried to foist upon it, is still a denouement –however it is rationalized. However many cosmetics are applied. Wallpaper may fool strangers, but it is still wallpaper…

So, you see why I worry. It is not that there might not be new opportunities available in retirement -new venues- but simply the realization that it is a final chapter of a thoroughly read book. An epilogue.

But I digress. It is something of a fool’s errand to attempt to encapsulate Retirement under one banner. It is a chapter as yet unprinted, and at best only sketchily conceived. There are also portions of it written, even if unwittingly, by someone else.

There is a store I visit every so often to buy dog food. It is a large and perhaps corporately criticized chain, but my dog is fussy and became addicted when she was only a puppy to a brand only they seem to offer. To tell the truth, I enjoy the store; I enjoy wandering the aisles and feeling -what?- pride at resisting things I do not need and casting a cold eye on those I do not want. It is a juvenile thing, I suppose, but maybe that’s the point: a recapitulation of past temptations seen through different eyes. Different years…

But on my way in, I saw a face that seemed familiar. It wore the uniform of the store and yet it seemed out of place somehow. A bush growing in a patch of vegetables –or more aptly, perhaps, a tree standing all alone in the middle of a field of wheat. Staid and stolid, watching, bemused, the tender stalks waving frenetically around her feet.

It was not so much her age that separated her from those around her as her composure, her calmness in the Storm of Store. In the eye of the hurricane of shoppers intent on their own missions, her smile was like a shrine erected at the doorway, a refuge offered, but seldom taken. Seldom noticed: the store was not a temple –just another place to visit when the need arose, a series of shelves to inspect. There were no sacred places here, no altars, no need to reflect on the meaning of it all. The store fulfilled a function, not a curb-side meditation.

I have to admit, the face was so unexpected, so completely out of context that I passed it by with barely a thought, although I did stop halfway down a nameless aisle and wonder why. And it was there that it –she– caught up with me.

“Doctor?” she asked tentatively, clearly uncertain, from behind at least, that it was me. And when I turned to face the voice I remembered somewhat shakily from the past, she smiled broadly. I could see it was all she could do to refrain from hugging me; instead, she proffered a bony hand, its skin replete with veins and the brown patches of age.

“Doris!” I somehow managed to retrieve the name from a long-closed memory drawer -although not without an awkward pause, because it was not the Doris I remembered, but an older, frailer model. Doris had been well into her seventies when I had last seen her in consultation, but this face, this figure, was a worn and crumpled copy of that older woman I had once filed away.

Her smile was so wide and welcoming it looked almost painful, but it was her eyes that immediately captivated me. Like delicate pale blue figurines trapped behind the glass of an old cabinet, they begged for release, and when she opened their cages they flew to my face and rested there. “It’s so nice to see you again, doctor,” she said with her joy so evident I was almost taken aback.

Her frailty dissolved as I watched, and the younger Doris emerged as if it had been hiding all the while. I remembered her now as the vibrant woman who had quoted poetry to me when I was trying to take her history. Who had dismissed the referral from her GP as ‘misguided over-concern’ from a young doctor uncomfortable in dealing with a patient older than his grandmother. And as a result she had brokered the compromise of seeing an older specialist. When I also agreed that she really had no cause for concern, she’d bonded with me and even showed up at the office the next week with flowers. I suppose we all like our judgements to be validated.

But on that occasion it had led to a discussion of age, and whether or not to succumb. Whether, as Dylan Thomas had written, to go gentle into that good night. Or to… Rage against the dying of the light. She most emphatically was with Thomas, whereas I, in an uncharacteristic disclosure, had expressed uncertainty as to whether, with identity obscured and purpose thwarted, I would be forced to go gentle into whatever the good night hid in retirement.

“Nonsense!” I still remembered her saying that, her face fierce, her eyes locked on mine. “Is ‘Doctor’ your last name? Age and function do not change who you are –just what you do…” Then her expression softened and her eyes unlocked from me and twinkled when they had returned to her face. She got up from her chair with an enigmatic smile and turned to me as she was walking out of the room. “Do not become what Shakespeare called Unregarded age in corners thrown. I would be very disappointed.”

I took a long hard look at her, standing in the aisle with her uniform proudly displayed and I smiled. “You’ve certainly taken your own advice, Doris. Not too many people your age would have chosen your path. You look happy.”

“Have you ever read any Robert Frost?” she asked after observing me quietly for a moment.

I nodded, suspecting what was to come.

She closed her eyes and a beatific expression emerged as if she were about to pray.

“Two roads diverged in a wood, and I—
I took the one less traveled by,
And that has made all the difference.”

There are forks in every road. Maybe she was praying -praying for me…