Why is Wonder?

Sometimes I am accosted by the strangest questions; they remind me of the unanswerable ‘why’ questions that so often bubble out of 3-year-olds -the only difference, I suppose, is that I would no longer be satisfied with the unadorned ‘just because’ answers I’m sure I used to get from my frustrated parents.

But it seems to me that once I retired from the practice of Medicine and was no longer required to answer other people’s questions, it was inevitable that I would have to ask some tricky ones of my own -questions I would be forced to answer for myself. And then I realized that I was no longer the authority figure I once thought I was, because the questions I ended up asking were more abstract. Less likely to admit of an easy solution.

Perhaps that’s what Retirement is designed for, though: imponderables. It is close to the denouement of a life and any loose threads of the clothes you once wore are carefully trimmed away and cleaned in preparation for the final scene.

My most recent attempt at cutting away loose strands concerned wonder. Why is wonder? I asked myself one evening, as I sat on a beach watching the cooling sun slowly sinking into the ocean. But when no answers surfaced and I turned to leave, a rim of clouds still glowed orange on the horizon like embers on a dying campfire and I found I couldn’t move. Why did I find myself so filled with awe, so taken with a scene I must have admired in countless versions in magazines or coffee-table books? Why did the experience stop time? Why did I hold my breath? And, what purpose could it possibly serve?

Different from mere beauty, which is shallow in comparison, wonder seems more spiritual -more intense- and less describable in words. It is a feeling, a wordless experience of awe. And yet, ineffable as it may be, is it just a curiosity, or more like an emergent phenomenon: a synergism of factors, each of which, taken by itself, is not at all unique? And if so, why would their collective presence have such a major effect on me?

I am far from a religious person, but I am reminded of an idea I found somewhere in the writings of the philosopher/theologian Paul Tillich, years ago. I was wondering what prayer was, apart from an attempt to request a favour from whatever deity you had been taught was in charge. It seemed to me unlikely that words alone could establish any real communication with the god; more was required to make prayers feel like they were being heard. If I understood Tillich, he seemed to imply that prayer was -or should be- like the ephemeral but almost overwhelming sense of awe felt on seeing, say, an array of burning clouds still bathed in the light of a sun setting behind a now-silhouetted mountain. An unknitted communion…

Is that, then, what wonder is: an unintended prayer…? An unasked question? I happened across an essay by Jesse Prinz, a professor of philosophy at the City University of New York, who attempted an examination of wonder, and felt that it had much in common with Science, Religion, and even Art: https://aeon.co/essays/why-wonder-is-the-most-human-of-all-emotions

The experience of wonder, he writes, seems to be associated with certain bodily sensations which ‘point to three dimensions that might in fact be essential components of wonder. The first is sensory: wondrous things engage our senses — we stare and widen our eyes… The second is cognitive: such things are perplexing because we cannot rely on past experience to comprehend them. This leads to a suspension of breath, akin to the freezing response that kicks in when we are startled… Finally, wonder has a dimension that can be described as spiritual: we look upwards in veneration.’

That may be the ‘what’ of wonder -its component parts- but not the ‘why’; his explanation seemed more of a Bayeux Tapestry, chronicling events rather than explaining why they existed in the first place. I dug deeper into the essay.

Prinz goes on to mention the ‘French philosopher René Descartes, who in his Discourse on the Method (1637) described wonder as the emotion that motivates scientists to investigate rainbows and other strange phenomena.’ Their investigations, in turn, produce knowledge. ‘Knowledge does not abolish wonder; indeed, scientific discoveries are often more wondrous than the mysteries they unravel. Without science, we are stuck with the drab world of appearances.’ Fair enough -Science can not only be motivated by wonder, it also contributes to it.

‘In this respect,’ Prinz writes, ‘science shares much with religion… like science, religion has a striking capacity to make us feel simultaneously insignificant and elevated.’ Awe, an intense form of wonder, makes people feel physically smaller than they are. ‘It is no accident that places of worship often exaggerate these feelings. Temples have grand, looming columns, dazzling stained glass windows, vaulting ceilings, and intricately decorated surfaces.’

Art, too, began to partake of the sublime, especially when it parted company from religion in the 18th century. ‘Artists began to be described as ‘creative’ individuals, whereas the power of creation had formerly been reserved for God alone.’ Artists also started to sign their paintings. ‘A signature showed that this was no longer the product of an anonymous craftsman, and drew attention to the occult powers of the maker, who converted humble oils and pigments into objects of captivating beauty, and brought imaginary worlds to life.’

Interestingly, then, ‘science, religion and art are unified in wonder. Each engages our senses, elicits curiosity and instils reverence. Without wonder, it is hard to believe that we would engage in these distinctively human pursuits.’ Mere survival does not require any of these, and yet they exist. ‘Art, science and religion are all forms of excess; they transcend the practical ends of daily life. Perhaps evolution never selected for wonder itself… For most of our history, humans travelled in small groups in constant search for subsistence, which left little opportunity to devise theories or create artworks. As we gained more control over our environment, resources increased, leading to larger group sizes, more permanent dwellings, leisure time, and a division of labour. Only then could wonder bear its fruit. Art, science and religion reflect the cultural maturation of our species.’

‘For the mature mind, wondrous experience can be used to inspire a painting, a myth or a scientific hypothesis. These things take patience, and an audience equally eager to move beyond the initial state of bewilderment. The late arrival of the most human institutions suggests that our species took some time to reach this stage. We needed to master our environment enough to exceed the basic necessities of survival before we could make use of wonder.’

Maybe, then, that is the answer to the ‘why’ of wonder. Perhaps it’s a fortunate -some might say providential- byproduct of who we are. Not inevitable, by any means, and not meant for any particular purpose, and yet, however accidental, it was the spur that pricked the sides of our dreams, to paraphrase Shakespeare’s Macbeth.

I’m not so sure it’s even that complicated, though. It seems to me that wonder is more of an acknowledgement than anything else: an acknowledgement that we are indeed a part of the world around us; a tiny thread in a larger pattern. And every once in a while, when we step back, we catch a glimpse of the motif and marvel at its complexity. It is, then, an acknowledgement of gratitude that we are even a small part of it all… Yes, a prayer, if you will.

An accident of birth

For years now, I have picked through the garden of my life -sometimes for pleasure, and sometimes for utility. I weed, of course -the privilege of growing in my aging plot is largely contingent on my having planted it in the first place. Contingent on the purpose for which it was intended. Things that arrive unannounced might be tolerated at times, but the recent discovery of a flower tucked in amongst the lettuce plants, instead of growing where I’d planted others of its kind, spoke more of my neglect than of serendipity.

And now that I’ve been retired long enough to ponder these things, it occurred to me that the peripatetic guest may not have the same value in its new home. It’s still a flower to be sure -it’s still beautiful, and still proffers its petals just as seductively to passing bees- but is it really the same flower as one that was the product of my labour? Does the intent flavour the result?

For some, I suspect it’s a trivial question: surely a daisy, say, is a daisy, no matter whether it arrived accidentally or was planted in the spot. It is a gift, they might say -something for which gratitude, not deliberation, is appropriate. In a sense, of course, they are correct. And yet, does all the work I may have expended -choosing its pedigree and colour, calculating a location that might offer it the best chance to thrive, and then watering and weeding- not affect the appreciation of the resulting flower? And was appreciation not a large part of the original incentive that led to its planting?

For that matter, does a gift carry the same merit as the same item obtained through work and planning? Does it even possess the same meaning?

It occurred to me that maybe I simply have too much time on my hands now that I’m retired, and I tried to shelve the thought along with all those books I have been meaning to read once the opportunity presented itself. But the question continued to poke annoyingly at my brain in the evenings whenever my eyes tired of reading. I just could not understand what it was about the problem that was continuing to disturb me; and more, was I the only one who even thought there might be something to it?

I can’t say I actively sought an answer -quite frankly, I couldn’t even think of a way to phrase the question- but I did stumble upon a short philosophical enquiry written by Jonny Robinson, a tutor and ‘casual lecturer’ in the department of philosophy at Macquarie University in Australia: https://aeon.co/ideas/would-you-rather-have-a-fish-or-know-how-to-fish

It touched on a theme that seemed eerily similar: how there may be a difference in the quality of the knowledge of Truth, depending upon how it was acquired. ‘Many are born into severe poverty with a slim chance at a good education, and others grow up in religious or social communities that prohibit certain lines of enquiry. Others still face restrictions because of language, transport, money, sickness, technology, bad luck and so on. The truth, for various reasons, is much harder to access at these times. At the opposite end of the scale, some are effectively handed the truth about some matter as if it were a mint on their pillow, pleasantly materialising and not a big deal. Pride in this mere knowledge of the truth ignores the way in which some people come to possess it without any care or effort, and the way that others strive relentlessly against the odds for it and still miss out.’

Each type is in possession of the same Truth, presumably, although in one case it is a gift, and in the other it has required an effort to obtain. It seems to me there is a difference, though: ‘the person ready to correct herself, courageous in her pursuit of the truth, open-minded in her deliberation, and driven by a deep curiosity has a better relationship to truth even where she occasionally fails to obtain it than does the indifferent person who is occasionally handed the truth on a silver platter.’

So, to my question about the itinerant daisy: does it possess the same intrinsic worth as one that has been purposely planted and nourished? Robinson, for his essay, puts the question slightly differently: ‘Is it better to know, or to seek to know?’ Both seem labyrinthine, and unanswerable -trivial, perhaps- largely because they are both perspectival.

So he rephrases the question in the form of a thought-experiment: ‘Would you rather have a fish or know how to fish?’ If having a fish is the result of knowing how to catch it, that is different from having to wait for someone who knows how to fish, and hoping she will actually give the one she caught to you.

Robinson feels it is the same with knowledge. An isolated fact (knowledge) may be valuable, but if you have learned how to acquire more knowledge, you are not limited to that one fact. It is, in fact, a type of synergism: knowledge plus the ability to add to it turns out to be better than the mere fact of knowledge on its own.

That accidental daisy growing by itself amongst the lettuce is still beautiful, but if it truly was an accident, that may or may not be the end of the line for it -especially if I don’t know how to care for it. It is, in that case, on its own. In fact, given its location, I may even think of it as an undesirable -a weed- and pull it out.

It does seem to suggest that the stray has a different value, a different essence, from a bed of cherished Gerbera Daisies planted and growing contentedly in their assigned place. In a sense, it is no longer a flower -or, at any rate, not one that I treasure.

One question, though, inevitably leads to another: what is growing alongside the lettuce then…?

Sapere aude

Sapere aude -‘Dare to know’, as the Roman poet Horace wrote. It was later taken up by the famous Enlightenment philosopher Immanuel Kant, and it seemed like a suitable rallying cry as I negotiated the years that led from youth to, well, Age. Who could argue that ignorance is preferable to knowledge? That understanding something better facilitates an informed decision about whether to believe or reject it? To welcome, or close the door?

Admittedly, knowledge can be a moving target, both in time and perhaps in temperament as well. Whatever ‘knowing’ is that determines the appeal of a particular political philosophy, say, is not immutable, not forever carved in marble like the letters on Trajan’s Column. One could start off in one camp, and then wander into another as the years wear thin. Perhaps it is the gradual friction of experience rubbing on hope that effects the change -but however it works, exposure can alter what we believe. If nothing else, it speeds adaptation, and enables us to habituate to things that we might once have shunned. And it is precisely this ability to acclimatize that may prove worrisome.

An essay by the philosopher Daniel Callcut drew this to my attention a while ago: https://aeon.co/ideas/if-anyone-can-see-the-morally-unthinkable-online-what-then

‘There are at least two senses of ‘morally unthinkable’. The first, that of something you have no inkling of is perhaps the purest form of moral innocence. Not only can you not contemplate doing X: you don’t even know what X is. This is the innocence that parents worry their children will lose online… Then there is the worry that if something becomes thinkable in the imaginative sense, then it might eventually become thinkable in the practical sense too… If virtue depends in part on actions being unthinkable, then the internet doubtless has a tendency to make unvirtuous actions all too thinkable… The idea that being a decent person involves controlling the kinds of thoughts you allow yourself to think can easily be met with resistance. If virtue depends on limits to what is thinkable, and a certain free-thought ideal celebrates no limits, then the potential conflict between freethinking and virtue is obvious.’

Of course, one of the several elephants in the room is the pornographic one -the ‘public discussion of the internet’s potential to undermine virtue focuses on the vast amount of easily accessible pornography… Porn, the research suggests, has the tendency to encourage the prevalence of thoughts that shouldn’t be thought: that women enjoy rape, and that No doesn’t really mean No. More generally, it has the tendency to encourage what the British feminist film theorist Laura Mulvey in the 1970s dubbed the ‘male gaze’: men staring at women’s bodies in a way that bypasses concern for a woman’s consent.’ And, not only that, there was the intriguing suggestion that ‘Liberals, worried about potential censorship, can sometimes find themselves defending the implausible position that great art has great benefits but that junk culture never produces any harms.’

As Callcut writes, ‘What we imagine is not inert: what we think about changes the people we are, either quickly or over time – but it still changes us.’ So, ‘If the image you are looking at is disturbing,’ he asks, ‘is it because it is explicit and unfamiliar to you, or is it because it is wrong? When are you looking at a problem, and when is the problem you?’ There is a definite tension ‘between virtues that by their nature restrict thought and imagination and the prevailing spirit of the internet that encourages the idea that everything should be viewable and thinkable.’

In other words, is it better not to know something? Is Sapere aude anachronistic, inappropriate -dangerous, even?

I find myself drawn back in time to something that happened to me when I was around 13 or 14 years of age. There was no internet, in those days, of course, and word of mouth, or naughty whispers with subtle nudges were sometimes how we learned about adult things.

A somewhat duplicitous friend had lent me a book to read: The Facts of Life and Love for Teenagers, I think it was called. His parents had given it to him when they’d found his stash of overly suggestive magazines hidden in a closet. I wasn’t sure what to make of the loan, but at that tender age, and in those pre-social media days, there was much about life that remained mysterious and hidden from me. I hadn’t yet given much thought to girls; it was still an innocent time.

I remember being embarrassed even handling the book -especially since it didn’t look as if it had even been opened. My first instinct was to hide it somewhere my mother wouldn’t find it. Obviously the closet hadn’t worked for my friend, so, since it was summer, I decided to put it at the bottom of my sock drawer where I kept the ones I only used in winter. She’d never need to burrow down that deeply.

But, oddly enough, a few days later, I discovered the book had acquired a folded piece of paper in the ‘How babies are made’ section. ‘Read this,’ the note said in my mother’s unmistakeable cursive.

The next morning at breakfast I could hardly look up from my plate, but to her credit, she acted as if it was just another summer’s day: the radio on the shelf was playing some music softly in the background, and my father was buried behind his newspaper.

But the discovery triggered an embarrassing walk with my father, who had obviously been delegated by my mother to deliver the Talk, as my friends termed it in those days. And although it turned out well, I couldn’t help but think I had crossed a line in my life. And judging by the gravity with which he approached it, I had just been initiated into a hitherto forbidden club.

In this case, fortunately, the not-yet imagined realm was discussed sensitively and, with many blushes on both our faces, placed in a realistic context -and with what I would later realize was a sensible perspective…

Despite my age, and after all these years, I continue to be naïve about many things, I suspect, and yet I still feel there is a need to defend the ‘Dare to know’ exhortation. Virtue does not depend on actions never considered, nor on a drought of as-yet-unimagined things; decency does not simply require controlling what you allow yourself to think, any more than pulling the covers over your head at night protected you from the bogeyman in the room when you were a child.

Virtue -morality- isn’t the absence of temptation; there is, and probably will continue to be, an allure to what we do not know -to what is kept hidden from us. There will always be a struggle, I imagine, and the more you know about it -and about the world- the more you enable yourself to understand context. I still wonder what type of adulthood I might have wandered into had my mother not found that book and realized there was an opportunity.

Sapere aude, I almost wish she had written instead, in that note to her already nerdy child -I think I would have loved the Latin.

Learned without Opinion…

Sometimes we are almost too confident, aren’t we? Encouraged by something we’ve just read, and recognizing it as being already on file in our internal library, we congratulate ourselves on the depth and breadth of our scope. Perhaps it’s the title of an abstruse article, and even the picture at the top of the page that helped identify it. Or… was it something online, whose excellent graphics made it memorable? And, of course, what does it matter where you saw it? You’ve remembered it; it’s yours. Anyway, you know where to find it if the details are a bit fuzzy.

So, does that mean you know it -have thought it through? Analyzed it? Understood it…? Unfortunately, the answer is too often no. It’s merely filed somewhere, should the need arise. But knowledge, and especially the wisdom that might be expected to accompany it, is often lacking.

This was brought -worrisomely- to my attention in an article in Aeon. https://aeon.co/ideas/overvaluing-confidence-we-ve-forgotten-the-power-of-humility

Drawing, in part, on an essay by the psychologist Tania Lombrozo of the University of California (https://www.edge.org/response-detail/23731), Jacob Burak, the founder of Alaxon, a digital magazine about culture, art and popular science, writes ‘The internet and digital media have created the impression of limitless knowledge at our fingertips. But, by making us lazy, they have opened up a space that ignorance can fill.’ They both argue that ‘technology enhances our illusions of wisdom. She [Lombrozo] argues that the way we access information about an issue is critical to our understanding – and the more easily we can recall an image, word or statement, the more likely we’ll think we’ve successfully learned it, and so refrain from effortful cognitive processing.’

As Lombrozo writes, ‘people rely on a variety of cues in assessing their own understanding. Many of these cues involve the way in which information is accessed. For example, if you can easily (“fluently”) call an image, word, or statement to mind, you’re more likely to think that you’ve successfully learned it and to refrain from effortful cognitive processing. Fluency is sometimes a reliable guide to understanding, but it’s also easy to fool. Just presenting a problem in a font that’s harder to read can decrease fluency and trigger more effortful processing… It seems to follow that smarter and more efficient information retrieval on the part of machines could foster dumber and less effective information processing on the part of human minds.’ And furthermore, ‘educational psychologist Marcia Linn and her collaborators have studied the “deceptive clarity” that can result from complex scientific visualizations of the kind that technology in the classroom and on-line education are making ever more readily available. Such clarity can be deceptive because the transparency and memorability of the visualization is mistaken for genuine understanding.’

I don’t know about you, but I find all of this disturbing. Humbling, even. Not that I have ever been intellectually arrogant -that requires far more than I have ever had to offer- but it does make me pause to reflect on my own knowledge base, and the reliability of the conclusions derived from it.

So, ‘Are technological advances and illusions of understanding inevitably intertwined? Fortunately not. If a change in font or a delay in access can attenuate fluency, then a host of other minor tweaks to the way information is accessed and presented can surely do the same. In educational contexts, deceptive clarity can partially be overcome by introducing what psychologist Robert Bjork calls “desirable difficulties,” such as varying the conditions under which information is presented, delaying feedback, or engaging learners in generation and critique, which help disrupt a false sense of comprehension.’

To be honest, I’m not sure I know what to think of this. Presentation seems a key factor in memory for me -I remember books by the colours or patterns of their covers, for example. Seeing a book on a shelf often helps me remember, if not its exact contents, then at least how I felt about reading it. But I suppose the point of the article is that remembering is not necessarily understanding.

And yet, the book I see on the shelf may, in some fashion, have been incorporated into my thinking -changed something inside me. I’ve read quite a few books over the years, and been surprised, on re-reading them -or later, reading about them- that what I had learned from them was something totally different from what I suppose the author had likely intended.

A good example is Nobel Prize laureate Hermann Hesse’s Magister Ludi (The Glass Bead Game), which I read in the early 1960s. I completely misremembered the plot (and no doubt the intention of the narrative) and for some reason was convinced that the whole purpose of this story was to suggest that a young student, Knecht, who had devoted his entire life to mastering the Game, comes to realize that his ambition was meaningless in the grand scheme of things -and near the end of the rather long novel, drowns himself as a result. Anybody who has actually read Magister Ludi blinks in disbelief if I tell them what I remember of the story: I hadn’t really understood what Hesse had been trying to say, they tell me…

But, nonetheless, the novel had quite an effect on me. Because I remembered it the way I did, I began to realize how we come to rank our beliefs -prioritize our choices compared to those around us. So, was it worthwhile for Knecht to train for years, dedicate his life, and eventually succeed in becoming the Master of a Glass Bead Game, for goodness’ sake? And if he did, so what? Would that really make a difference anywhere, and to anybody?

For that matter, are there other choices that might have mattered more? How would you know? Maybe any choice is essentially the same: of equal value. I thought Hesse’s point terribly profound at the time -and still do, for that matter, despite the fact he probably didn’t intend my interpretation…

Perhaps you see what I am getting at: ‘understanding’ is multifaceted. I learned something important, despite my memory distorting the actual contents of what I read. I incorporated what I remembered as deeply meaningful, somehow. Was what I learned, however unintended, useful? Was it not a type of understanding of what might have been written between the lines? And even if not, the message I obtained was an epiphany. Can that be bad?

I’m certainly not arguing with Lombrozo, or Burak -their points are definitely intriguing and thought-provoking; I just worry that they are not all-encompassing -perhaps they overlook the side-effects. The unintended consequences. Maybe knowledge -and understanding- is less about what facts we retain, and more about what we glean from the exposure.

So, did I understand the novel? Perhaps not, but what I learned from it is now a part of me -and that’s just as valuable… What author, what teacher, could hope for more?

Is your wisdom consumed in confidence?

How do we know what we know? It’s a question I used to think was obvious: if we cannot investigate the answer ourselves, we turn to others –somebody will know. Even the polymaths of old relied on other people for the groundwork on which they built. Nobody can know everything -knowledge is a jigsaw puzzle, the integral pieces of which make little sense on their own. We have to know what fits, and where.

But how do we know who to trust? How do we know who knows? If the foundation on which we construct is badly planned -or worse, wrong- the building will not last. Think of Ptolemy and his epicycles, which became hopelessly complicated in a vain attempt to explain celestial movements and maintain the Earth as the centre of the universe.

And it’s not as if Scientists are always reliable anyway. Consider the disappointment of Fleischmann and Pons’ claims that they had produced ‘cold fusion’ -a nuclear reaction occurring at room temperature. More ominous by far, however, was Andrew Wakefield’s fraudulent 1998 paper in the prestigious British medical journal The Lancet that claimed that the MMR vaccine (measles, mumps, rubella) caused autism. The journal did not fully retract the paper until 2010, and by then, the damage had been done.

My point is that if we are not careful about the source -the reputation- of our information we may be led astray. It’s an almost trite observation, perhaps, but in this era of ‘Fake News’, one best kept in mind. I was again reminded of the importance of this in an essay by Gloria Origgi, an Italian philosopher, and a tenured senior researcher at CNRS (the French National Centre for Scientific Research) in Paris. She was writing in Aeon: https://aeon.co/ideas/say-goodbye-to-the-information-age-its-all-about-reputation-now

As she observes, ‘[T]he greater the amount of information that circulates, the more we rely on so-called reputational devices to evaluate it. What makes this paradoxical is that the vastly increased access to information and knowledge we have today does not empower us or make us more cognitively autonomous. Rather, it renders us more dependent on other people’s judgments and evaluations of the information with which we are faced … we are moving towards the ‘reputation age’, in which information will have value only if it is already filtered, evaluated and commented upon by others … reputation has become a central pillar of collective intelligence today. It is the gatekeeper to knowledge, and the keys to the gate are held by others. The way in which the authority of knowledge is now constructed makes us reliant on what are the inevitably biased judgments of other people, most of whom we do not know … In the best-case scenario, you trust the reputation of scientific research and believe that peer-review is a reasonable way of sifting out ‘truths’ from false hypotheses and complete ‘bullshit’ about nature. In the average-case scenario, you trust newspapers, magazines or TV channels that endorse a political view which supports scientific research to summarise its findings for you. In this latter case, you are twice-removed from the sources: you trust other people’s trust in reputable science.’

So how do we ever know whether we are building on sand or rock? Let’s face it, few of us are competent to judge the raw data of a scientific study, let alone repeat the experiment to verify the results. And how many of us would be inclined to repeat it even if we could? No, some things we simply have to take on trust.

Even so, Origgi offers us another option: ‘What a mature citizen of the digital age should be competent at is not spotting and confirming the veracity of the news. Rather, she should be competent at reconstructing the reputational path of the piece of information in question, evaluating the intentions of those who circulated it, and figuring out the agendas of those authorities that leant it credibility.’ As the Nobel laureate Friedrich Hayek, an Austrian economist and political philosopher, wrote, ‘civilisation rests on the fact that we all benefit from knowledge which we do not possess.’

I’m trying to learn from Origgi, though. I’m trying to pick my filters carefully. Figure out their agendas. Sometimes you can even do that by listening.

I was sitting in my favourite dark corner of Starbucks the other day when two women sat down at the table next to me. I’m not sure they even noticed my ears in the shadows because they seemed to be in the middle of a conversation about technology as they each held their phones in front of them like crucifixes warding off the devil.

“I got a new running app, Fran,” said a tall thin woman with short curly dark hair and attired in expensive looking running gear.

“Which app you using, Dor?” her friend responded, equally attired and reaching for Dor’s phone.

“It’s a new one,” Dor said, holding it out of Fran’s reach. “Supposed to be the best at approximating calorie expenditure. Takes account of your weight, leg length, and then adds in changes in altitude on the run, as well as the time taken.” She looked at it again. “Even asks for a picture so you can post.”

Fran smiled benevolently. “Your IP address and Email, too?”

“Huh?”

“Privacy, Dor. Privacy.”

Dor stared at her quizzically for a moment. “I just figured they were being thorough, eh? More accurate… Anyway, they know all that other stuff nowadays.”

Fran stared back, and then sighed. “I suppose they do, but I refuse to make it easy for them… Sometimes you’re so naïve, my friend.”

“But…”

Fran shook her head. “I’ve just got a simple running app. And they didn’t ask for my picture.”

Dor blinked -rather provocatively I thought. “The more info, the more accurate the assessment, don’t you think?”

Fran rolled her eyes. “Well, we’ve just run together this morning -let’s see if the calorie count is the same.” She glanced at her screen. “I’ve got 725 cals. And 5K for distance. How about you?”

“1100… and 4.85K,” Dor smiled. “I like mine better.”

Fran leaned across the table and peeked at the other screen. “Your app looks pretty well the same as mine… Yours play music?” Dor nodded. “And give verbal encouragement?”

“Uhmm, well I don’t turn on all the audio stuff… But I had to pay to download this one so it probably does.” She started tapping and then turned the screen so Fran could see it. “See? It won some sort of award for excellence.”

Fran sat back in her seat, her expression unreadable. “You paid? Mine’s free…” She began a similar tapping frenzy. “Mine won an award, too… Who makes yours?”

Dor started scrolling down her screen and then turned it towards Fran again. “Can’t pronounce it, but here…”

Fran showed her own screen. “It’s the same company, Dor!”

They were both silent for a moment. Then Dor smiled contentedly. “You get what you pay for, I guess, eh?”

I smiled to myself, still hidden in the shadows, and wondered what Origgi would make of the effort of these two mature citizens of the digital age. At least they were trying -and after all, they had pretty well figured out the intentions and agendas of their source…

Should We Bell the Cat?

What should you do at a dinner party if the hostess, say, declares that she believes something that you know to be inaccurate -or worse, that you consider repellent? Abhorrent? Should you wait to see how others respond, or take it upon yourself to attempt to correct her belief? If it is merely a divergence of opinion, it might be considered a doctrinaire exercise -a Catholic vs Protestant type of skirmish- and likely unwinnable.

But suppose it is something about which you are recognized to have particular credentials, so that your response would not be considered merely an opinion, but rather a statement of fact? Should that alter your decision as to whether or not to take issue with her pronouncement? Would your silence imply agreement -acquiescence to a view that you know to be not only wrong, but offensive? And would your failure to contradict her signal something about her opinion to the others at the table? If it is an ethical issue, should you attempt to teach?

It is a difficult situation to be sure, and one that is no doubt difficult to isolate from context and the responsibilities incumbent upon a guest. Still, what should you do if, uncorrected, she persists in promulgating her belief? Should you leave the table, try to change the topic, or merely smile and wait to see if she is able to sway those around you to her views?

I can’t say that the situation has arisen all that often for me, to tell the truth -we tend to choose our friends, and they theirs, on the basis of shared values- but what risks might inhere in whatever course of action I might choose? I happened upon an insightful and intriguing article that touched on that very subject in Aeon, an online magazine:  https://aeon.co/ideas/should-you-shield-yourself-from-others-abhorrent-beliefs It was written by John Schwenkler, an associate professor in philosophy at Florida State University.

He starts by pointing out that ‘Many of our choices have the potential to change how we think about the world. Often the choices taken are for some kind of betterment: to teach us something, to increase understanding or to improve ways of thinking. What happens, though, when a choice promises to alter our cognitive perspective in ways that we regard as a loss rather than a gain?’

And further, ‘When we consider how a certain choice would alter our knowledge, understanding or ways of thinking, we do this according to the cognitive perspective that we have right now. This means that it’s according to our current cognitive perspective that we determine whether a choice will result in an improvement or impairment of that very perspective. And this way of proceeding seems to privilege our present perspective in ways that are dogmatic or closed-minded: we might miss the chance to improve our cognitive situation simply because, by our current lights, that improvement appears as a loss. Yet it seems irresponsible to do away entirely with this sort of cognitive caution… And is it right to trust your current cognitive perspective as you work out an answer to those questions? (If not, what other perspective are you going to trust instead?)’

You can see the dilemma: is the choice or opinion you hold based on knowledge, or simply belief? And here he employs a sort of thought experiment: ‘This dilemma is escapable, but only by abandoning an appealing assumption about the sort of grasp we have on the reasons for which we act. Imagine someone who believes that her local grocery store is open for business today, so she goes to buy some milk. But the store isn’t open after all… It makes sense for this person to go to the store, but she doesn’t have as good a reason to go there as she would if she didn’t just think, but rather knew, that the store were open. If that were the case she’d be able to go to the store because it is open, and not merely because she thinks it is.’

But suppose that by allowing an argument -an opinion, say- to be aired frequently or uncontested, you fear you might eventually be convinced by it? It’s how propaganda endeavours to convince, after all. What then? Do you withdraw, or smile and smile and see a villain (to paraphrase Hamlet)? ‘If this is on the right track, then the crucial difference between the dogmatic or closed-minded person and the person who exercises appropriate cognitive caution might be that the second sort of person knows, while the first merely believes, that the choice she decides against is one that would be harmful to her cognitive perspective. The person who knows that a choice will harm her perspective can decide against it simply because it will do so, while the person who merely believes this can make this choice only because that is what she thinks.’

This is philosophical equivocation, and Schwenkler even admits as much: ‘What’s still troubling is that the person who acts non-knowingly and from a mere belief might still believe that she knows the thing in question… In that case, she’ll believe that her choices are grounded in the facts themselves, and not just in her beliefs about them. She will act for a worse sort of reason than the sort of reason she takes herself to have.’

As much as I enjoy the verbiage and logical progression of his argument, I have to admit to being a little disappointed in the concluding paragraph of the article, which seems to admit that he has painted himself into a corner: ‘What’s still troubling is that the person who acts non-knowingly and from a mere belief might still believe that she knows the thing in question: that climate change is a hoax, say, or that the Earth is less than 10,000 years old. In that case, she’ll believe that her choices are grounded in the facts themselves, and not just in her beliefs about them. She will act for a worse sort of reason than the sort of reason she takes herself to have. And what could assure us, when we exercise cognitive caution in order to avoid what we take to be a potential impairment of our understanding or a loss of our grip on the facts, that we aren’t in that situation as well?’

But, I think what this teaches me is the value of critical analysis, not only of statements, but also of context. First of all, obviously, to be aware of the validity of whatever argument is being aired, and then to decide whether or not an attempted refutation would contribute anything to the situation, or merely further entrench the individual in their beliefs, if only to save face. And, as well, it’s important to step back for a moment and assess the real reason I am choosing to disagree. Is it self-aggrandizement, dominance, or an incontestable conviction -incontestable based on knowledge, or on unprovable belief…?

I realize this is pretty confusing stuff -and, although profound, not overly enlightening- but sometimes we need to re-examine who it is we have come to be. In the words of the poet Kahlil Gibran, The soul walks not upon a line, neither does it grow like a reed. The soul unfolds itself like a lotus of countless petals.

Understanding as…

There is so much stuff out there that I don’t know -things that I hadn’t even thought of as knowledge. Things that I just accepted as ‘givens’.  You know, take the ability to understand something like, say, an arrangement of numbers as a series rather than a bunch of numbers, or the ability to extract meaning from some sounds -for example words spoken in English- and yet not others in a different language.

And, perhaps equally mysterious is the moment when that epiphany strikes. What suddenly changes those numbers into a series? Is it similar to what makes figure-ground alterations flip back and forth in my head: aspect perception? Is it analogous to the assignation of meaning to things -or, indeed, picking them out of the chaos of background and recognizing them as somehow special in the first place? Is it what Plato meant when he referred to the Forms -‘chairness’ or ‘tableness’, for example- abstractions that allow us to identify either, no matter how varied the shapes or sizes -the true essence of what things really are?

I suppose I’m becoming rather opaque -or is it obtuse?- but the whole idea of aspect perception, of ‘seeing as’, is an exciting, yet labyrinthine terra incognita, don’t you think? I’m afraid that what started it all was an essay in the online Aeon publication: https://aeon.co/ideas/do-you-see-a-duck-or-a-rabbit-just-what-is-aspect-perception

It was the edited version of an essay written by Stephen Law, the editor of the Royal Institute of Philosophy journal THINK. He begins by discussing some of the figure-ground changes found in, say, Necker cubes, whose apparent orientation keeps flipping back and forth (a type of aspect perception), and then suggests that ‘A[nother] reason why changes in aspect perception might be thought philosophically significant is that they draw our attention to the fact that we see aspects all the time, though we don’t usually notice we’re doing so… For example, when I see a pair of scissors, I don’t see them as a mere physical thing – I immediately grasp that this is a tool with which I can do various things.’

Another example might be ‘…our ability to suddenly ‘get’ a tune or a rule, so we are then able to carry on ourselves.’ Or, how about religion? ‘The idea of ‘seeing as’ also crops up in religious thinking. Some religious folk suggest that belief in God doesn’t consist in signing up to a certain hypothesis, but rather in a way of seeing things.’ But then the caveat: ‘Seeing something as a so-and-so doesn’t guarantee that it is a so-and-so. I might see a pile of clothes in the shadows at the end of my bed as a monster. But of course, if I believe it’s a monster, then I’m very much mistaken.’

I have always loved wandering around bookstores. Maybe it’s an asylum -a refuge from the noisy street, or a spiritual sanctuary in a chaotic mall -but it’s more likely that the range and choice of books allows me to exercise an epiphanic region of my brain, and to practice ‘seeing as’ to my heart’s content. I’d never thought of bookstores as exercise before, of course, but I suppose the seed of ‘understanding as’ was sown by that article… or maybe it was the little girl.

Shortly after reading the essay, I found myself wandering blissfully through the quiet aisles of a rather large bookstore that seemed otologically removed from the noisy mall in which it hid. Coloured titles greeted me like silent hawkers in a park; the ones that sat dislodged from their otherwise tidy rows sometimes reached out to me with greater promise -curiosity, perhaps, as to why someone might have dislodged them. But nonetheless, I also found myself amused at their choices: book shops are catholic in the selection they proffer, and I relish the opportunity to switch my perspectives… and expand my Weltanschauung, as the Philosophy section into which I had meandered might have put it when the thought occurred.

Of course, unexpected concepts like that are one of the delights of a bookstore -turn a corner into a different aisle and the world changes. It’s where I met the little girl talking to her mother about something in a book she was holding.

No more than four or five years old, she was wearing what I suppose was a pink Princess costume, and trying to be very… mature. Her mother, on the other hand, was dressed for the mall: black baseball cap, jeans, sneakers, and a grey sweatshirt with a yellow mustard stain on the front. Maybe they’d just come from a party, or, more likely, the Food Court, but the mother was trying to explain something in the book to her little daughter. The aisle wasn’t in the children’s section, but seemed to have a lot of titles about puzzles, and illusions, so maybe they’d wandered into it for something different: for surprises.

As I pretended to examine some books nearby, I noticed a Necker’s cube prominently displayed on the page the girl was holding open.

“Why does it do that, mommy?” Even as she spoke the perspective of the cube was flipping back and forth, with one face, then another seeming to be closer.

The mother smiled at this obvious teaching moment.

“It’s a great idea, anyway,” the daughter continued, before she got an answer.

“Idea…?” the mother said, with a patient look on her face. “What’s the idea, Angie?”

Angie scrunched her forehead and gave her mother a rather condescending look. “It’s an exercise book, remember?”

That apparently caught the mother by surprise. “It’s a book of puzzles and magic, sweetheart. I didn’t see any exercises.”

Angie rolled her eyes at her mother’s obvious obtuseness. “The nexercise cube, mommy…!”

Necker’s cube, sweetie,” she responded, trying to suppress a giggle. “It’s not an exercise cube.”

But Angie was having none of that, and stared at her like a teacher with a slow pupil. “It keeps making my mind move, mommy!” She shook her head in an obviously disappointed rebuke. “That’s exercise.”

I slipped back around the corner, unnoticed by them both I think. I felt I’d intruded on a very intimate moment and I didn’t want to trespass, but I couldn’t help wondering if Angie had come far closer to understanding Plato’s Forms than her mother or I could ever hope to.

Is there nothing either good or bad but thinking makes it so?

Sometimes there are articles that set my head spinning. Or my mind. Ideas that I’d never thought of before. Ideas that make me rummage around deep inside, like I’m searching for a pencil, or for my internal keyboard where I write the things I should remember. I often don’t, of course -remember them, I mean- for how do you summarize and store enlightenment for a day, a week, a lifetime later? Sometimes you just have to explain it to yourself in your own words.

I subscribe to the online Aeon magazine -well, its newsletter, anyway- and I have to say that many of its articles are sufficiently mind-expanding as to qualify as epiphanous. Heuristic.

One such article on belief started me thinking. It was written by Daniel DeNicola, professor and chair of philosophy at Gettysburg College in Pennsylvania: https://aeon.co/ideas/you-dont-have-a-right-to-believe-whatever-you-want-to  It questions whether we have the right to believe whatever we want -the abnegation of which is in itself anathema in many quarters. But surely there’s no harm in having the unfettered right to believe whatever.

And yet, as he asserts, ‘We do recognise the right to know certain things. I have a right to know the conditions of my employment, the physician’s diagnosis of my ailments … and so on. But belief is not knowledge. Beliefs are factive: to believe is to take to be true … Beliefs aspire to truth – but they do not entail it. Beliefs can be false, unwarranted by evidence or reasoned consideration. They can also be morally repugnant… If we find these morally wrong, we condemn not only the potential acts that spring from such beliefs, but the content of the belief itself, the act of believing it, and thus the believer.’

‘Such judgments can imply that believing is a voluntary act. But beliefs are often more like states of mind or attitudes than decisive actions … For this reason, I think, it is not always the coming-to-hold-this-belief that is problematic; it is rather the sustaining of such beliefs, the refusal to disbelieve or discard them that can be voluntary and ethically wrong.’

In other words, I may inherit beliefs from my family, or from the circle of friends I inhabit, but that is no excuse for continuing to hold them if I come to realize they are harmful or factually incorrect.

‘Believing has what philosophers call a ‘mind-to-world direction of fit’. Our beliefs are intended to reflect the real world – and it is on this point that beliefs can go haywire. There are irresponsible beliefs; more precisely, there are beliefs that are acquired and retained in an irresponsible way. One might disregard evidence; accept gossip, rumour, or testimony from dubious sources; ignore incoherence with one’s other beliefs; embrace wishful thinking ….’

Here, though, DeNicola issues a caveat. He does not wish to claim that it is wrong, always, everywhere, and for anyone, to believe anything upon insufficient evidence. ‘This is too restrictive. In any complex society, one has to rely on the testimony of reliable sources, expert judgment and the best available evidence. Moreover, as the psychologist William James responded in 1896, some of our most important beliefs about the world and the human prospect must be formed without the possibility of sufficient evidence. In such circumstances … one’s ‘will to believe’ entitles us to choose to believe the alternative that projects a better life.’

Our beliefs do not have to be true if it is not possible to know what actually is true, or turns out to be the truth. As an example, ‘In exploring the varieties of religious experience, James would remind us that the ‘right to believe’ can establish a climate of religious tolerance.’ But even here, intolerant beliefs need not be tolerated: ‘Rights have limits and carry responsibilities. Unfortunately, many people today seem to take great licence with the right to believe, flouting their responsibility. The wilful ignorance and false knowledge that are commonly defended by the assertion ‘I have a right to my belief’ do not meet James’s requirements. Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas … In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.’

So, do we accept the right of the bearers of those beliefs to silence the rest of us, or should they merely be allowed to coexist in the noise? DeNicola wants to think, ‘[T]here is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs – and that ethic both generates and limits our right to believe.’

But, I wonder if the ethic is truly assignable -the noise can be overwhelming, and those who are the most persistent in its production end up deafening the rest of us. And although any responsibility for their belief should imply accountability, to whom are they accountable -to those in power, or to those who also share the belief? Do those with firmly held beliefs read articles like the ones in Aeon? And would they be swayed by the arguments even if they did?

Is it my responsibility to convince my opponents that my beliefs are right, or rather to set about proving that theirs are wrong? A fine distinction to be sure, and one that seems inextricably embedded in the web of other beliefs I have come to accept as valid markers of reality. And yet I think the thesis of DeNicola’s argument -that a belief, even if possibly untrue, should at the very least not be dangerous, threaten harm, or prevent others from believing something else- is the most defensible. If nothing else, it carries the imprimatur of several thousand years of wisdom: the concept of reciprocity -the Golden Rule, if you will: what you wish upon others, you wish upon yourself. Or, in the Latin maxim long associated with the Hippocratic Oath: Primum non nocere -First of all, do no harm.

Nobody in Particular

Why do we believe something? How do we know that we are right? When I was a child, I was certain that the Fleetwood television set my parents had just purchased was the best. So was the make of our car -and our vacuum cleaner too, come to think of it. But why? Was it simply because authority figures in my young life had told me, or was there an objective reality to their assertions? For that matter, how did they know, anyway? Other parents had different opinions, so who was right?

I was too young to question these things then, but gradually, I came to seek other sources of knowledge. And yet, even these sometimes differed. It’s difficult to know in what direction to face when confronted with disparate opinions. Different ‘truths’. Everybody can’t be right. Usually, in fact, the correct answer lies somewhere in the middle of it all, and it becomes a matter of knowing which truths to discard -choosing the ‘correct’ truth.

Despite the fact that most of us rely on some method like this, it sounds completely counterintuitive. How many truths can there be? Is each a truth, or merely an opinion? And what’s wrong with having a particular opinion? Again, how would we know? How could we know?

Nowadays, with social media algorithms selecting which particular news they report on the basis of our past choices, it’s difficult to know if we are in an echo chamber unless we purposely and critically examine whatever truths we hold dear -step back to burst the bubble. Canvass different people, and sample different opinions. But, even then, without resorting to mythology, or a presumed ‘revealed’ truth that substantiates a particular religious dogma, is there an objective truth that somehow transcends all the others? Conversely, is all truth relative -situationally contextualized, temporally dependent, and ultimately socially endorsed?

Should we, in fact, rely on a random sample of opinions to arrive at an answer to questions that are only a matter of values rather than of realistically verifiable facts -such as the height of a building, say, or the type of bacterium that causes a particular disease? Would that bring us closer to the truth, or simply to yet another truth?

Well, it turns out that the average of a large group of diverse and even contrary opinions has some statistical merit: http://www.bbc.com/future/story/20140708-when-crowd-wisdom-goes-wrong  ‘[T]here is some truth underpinning the idea that the masses can make more accurate collective judgements than expert individuals.’ The Wisdom of Crowds ‘is generally traced back to an observation by Charles Darwin’s cousin Francis Galton in 1907. Galton pointed out that the average of all the entries in a ‘guess the weight of the ox’ competition at a country fair was amazingly accurate – beating not only most of the individual guesses but also those of alleged cattle experts. This is the essence of the wisdom of crowds: their average judgement converges on the right solution.’

But the problem is in the sampling -the diversity of the members of that crowd. ‘If everyone let themselves be influenced by each other’s guesses, there’s more chance that the guesses will drift towards a misplaced bias.’ Of course ‘This finding challenges a common view in management and politics that it is best to seek consensus in group decision making. What you can end up with instead is herding towards a relatively arbitrary position. Just how arbitrary depends on what kind of pool of opinions you start off with. […] copycat behaviour has been widely regarded as one of the major contributing factors to the financial crisis, and indeed to all financial crises of the past. [And] this detrimental herding effect is likely to be even greater for deciding problems for which no objectively correct answer exists. […] All of these findings suggest that knowing who is in the crowd, and how diverse they are, is vital before you attribute to them any real wisdom.’

This might imply that ‘you should add random individuals whose decisions are unrelated to those of existing group members. That would be good, but it’s better still to add individuals who aren’t simply independent thinkers but whose views are ‘negatively correlated’ – as different as possible – from the existing members. In other words, diversity trumps independence. If you want accuracy, then, add those who might disagree strongly with your group.’
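The herding problem -and the cure of deliberately adding contrary voices- can be sketched the same way. In this purely illustrative version (the ‘loud first guess’ and the herding parameter are my own assumptions, not anything from the article), each new guesser blends a private estimate with the running group average; the more they copy the herd, the more the final average drifts toward whatever arbitrary opinion happened to come first:

```python
import random

random.seed(2)

TRUE_WEIGHT = 1200          # assumed true value (lbs)
LOUD_FIRST_GUESS = 1500     # an early, confident -and wrong- opinion

def crowd_average(herding, n=800):
    """Each guesser blends a private estimate with the running group average."""
    guesses = [LOUD_FIRST_GUESS]
    for _ in range(n):
        private = random.gauss(TRUE_WEIGHT, 150)       # what they would guess on their own
        running = sum(guesses) / len(guesses)          # what the herd currently believes
        guesses.append((1 - herding) * private + herding * running)
    return sum(guesses[1:]) / n                        # the crowd's final collective judgement

for h in (0.0, 0.5, 0.9):
    print(f"herding = {h:.1f}: crowd average {crowd_average(h):.0f} lbs")
```

With no herding, the crowd sits near the true value; as the copying increases, its judgement slides toward the loud early guess -which is why independent, or even negatively correlated, opinions are what keep a crowd wise.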

Do you see where I’m going with all this? We should try to be open enough to consider all sides of an argument before making a decision. Let’s face it, you have to know what you’re up against before you can arrive at a compromise. And perhaps the thing you thought you were opposing is not so different from your own view after all.

Even our values fluctuate. Unless we are willing to be historical revisionists, it’s obvious that people in the past often assigned values differently from the way we do today -to sexual orientation, for example, or to racial characteristics and stereotypes. And who nowadays would dare argue that women are not the equals of men, or that they do not deserve the same rights?

There are some things about which we will continue to disagree, no doubt. And yet even a willingness to listen to an opposing opinion -instead of shutting it down without fairly acknowledging whatever merits it might hide, or whatever commonalities it might share with ours- is a step in the right direction.

I’m not at all sure that it’s healthy to agree about everything, anyway, nor to assume we possess the truth. It’s our truth. I think that without some dissenting input, we’d be bored, condemned to float in the increasingly stagnant backwater we chose, while just beyond our banks, a creek runs merrily past, excited to discover another view that lies beyond and behind the next hill.

After all, remember what happened to Caesar after Shakespeare had him boast: “I am constant as the northern star, of whose true-fix’d and resting quality there is no fellow in the firmament.”

Just saying…

 

Places that we’ve come to trust

 

When I was a child, the world was an even stranger place than it is now. I knew so much less then, and the boundaries of almost every experience were unexplored and mysterious. I suppose that’s to be expected when the menu is large, and the stomach limited. So, with no internet to answer each question, and teachers who, despite their qualifications and zeal, were unable to fill in more than a decidedly modest number of the blanks, children my age migrated to the Delphic Oracle of the era: the library.

Although sometimes an imposing stone-and-pillared structure in the middle of a large city, in more modest towns it was often only a converted cottage, or a tiny building that housed the books. But however it was dressed, it was the library with all those answers on the shelves, all that magic in the musty perfume of the books. And yes, there was the reigning priestess, the keeper of the tomes, who seemed to know just how to organize our questions and then lead us directly to the shelf where the answers lay.

It was an enchanted place, the library, and one we children got to know even years before we started school. A place where we would gather each Saturday morning in a little circle on the floor to hear someone read stories to us of faeries that danced on little flowers, of kings and queens who disguised themselves as people just like us, of bears who spoke, and fawns that cavorted through the woods all day then slept in beds of moss each night.

Later, of course, we began to read things for ourselves, and to decide what made sense and what to believe. We would read a book the librarian recommended, and then another that she hadn’t -just to check. I sometimes thought I’d wandered alone and secretly through the new ideas, but then she’d smile and congratulate me on my journey when I saw her at her desk.

I suppose we’re never really on our own when we have a book, though. It is the world, or at least a door that opens inwards. The book is the sacred space, not the shelf on which it is forced to sleep. And I have long suspected that many things are similar to that -a school, for example, or a yoga class, a police officer, or a program on the radio. Each represents an expertise we cannot all possess, a knowledge so extensive we must parcel it out in little bits to make it work. It is what a civilization does; it is what constitutes a society.

I found an incredibly insightful article on the subject, entitled Truth is also a place, in the online magazine Aeon, which helped me to set things in context: https://aeon.co/essays/labs-courts-and-altars-are-also-traveling-truth-spots  It was written by Thomas Gieryn, a sociologist at Indiana University Bloomington, who suggests that ‘Some places make people believe.’ He describes the aura of wisdom ascribed to the ancient Oracle at Delphi, which sat in a place so remote that even getting there was a struggle -something that no doubt augmented the reliability of whatever advice was proffered. Other places, he argues, are similarly sacred: law courts, churches, laboratories, and so forth. The very stability of their location, and their often unique and recognizable architectures, lends an almost sacred air to their functions. ‘Ordinarily, truth-spots stay put over time, and those who seek believable knowledge must travel to them – not the other way around.’

But he wonders if the reliability and permanence of the location is still really necessary to perpetuate the authority. ‘[…] is longevity in a particular location always needed in order for a place to make people believe? Some truth-spots travel: they inhabit a place only temporarily. Sometimes a portable assemblage of material objects might be enough to consecrate an otherwise mundane place as a source for legitimate understandings – but only for the time that the stuff is there, before it moves on. But if a church or lab or courtroom can be folded up like a tent and pitched someplace else, can it really sustain its persuasive powers as a source for truth?’

In the abstract, that seems like an unlikely possibility. After all, part of the solace of religion, say, is in the majesty of the venue -the comfort of the pew, the quiet place that is a refuge from the busy street outside. Or, at other times it may lie in the reverberations of the organ, or the echo of a choir singing somewhere hidden in a large cathedral.

But Gieryn illustrates his thesis with examples of how the authority, if not the venue, is transportable. Travelling justices can set up a court in the most unlikely of locations -a small village in China, for example, with ducks and geese waddling past. Justice can be fairly meted out to the satisfaction of villagers who might otherwise never be able to travel to a big-city courtroom. Religion, too, can be promulgated outside the boundaries of a church, so long as the ceremonial symbols seen as sacred and important accompany the duly recognized religious official.

But I suppose these things are so common nowadays, with our internet connections and social media flurries, that the very idea of immutability has become a myth. With the possible exception of religious structures, buildings permanently dedicated to a particular purpose seem anachronistic. Atavistic. Time itself is out of joint.

Surely we are not so shallow that we think that it is the edifice that contains the authority, so naïve that we confuse the vehicle with the driver. It’s not the library that contains the book, nor even the book itself we need -it’s the ideas, the perspectives, and the wisdom travelling in an ever-expanding ripple that we should attempt to grasp…

And yet… I’d miss the smile of that wonderful lady with the dirty glasses, who sat behind the library desk and watched with motherly pride as I carried out an armful of books for another week. Call me sentimental, or just an old man trapped in reverie, but I think there is still something sacred in a place where a person like her could sit and watch -and smile encouragingly- as we struggle past.