He’s mad that trusts in the tameness of a wolf

I am an obstetrician, not a neuropsychiatrist, but I feel a definite uneasiness with the idea of messing with brains -especially from the inside. Talking at one, sure -maybe even tweaking it with medications- but it seems to me there is something… sacrosanct about its boundaries. Something akin to black-boxhood -or pregnant-wombhood, if you will- where we have a knowledge of its inputs and outputs, but the internal mechanisms remain too complex and interdependent to be other than interrogated from without.

I suppose I have a fear of the unintended consequences that seem to dog science like afternoon shadows -a glut of caution born of reading about well-meaning enthusiasms in my own field. And yet, although I do not even pretend to such arcane knowledge as might tempt me to meddle with the innards of a clock, let alone the complexities of a head, I do watch from afar, albeit through a glass darkly. And I am troubled.

My concern bubbled to the surface with a November 2017 article from Nature that I stumbled upon: https://www.nature.com/news/ai-controlled-brain-implants-for-mood-disorders-tested-in-people-1.23031 I recognize that the report is dated, and merely scratches the surface, but it hinted at things to come. The involvement of DARPA (the Defense Advanced Research Projects Agency of the U.S. military) did little to calm my fears, either –they had apparently ‘begun preliminary trials of ‘closed-loop’ brain implants that use algorithms to detect patterns associated with mood disorders. These devices can shock the brain back to a healthy state without input from a physician.’

‘The general approach —using a brain implant to deliver electric pulses that alter neural activity— is known as deep-brain stimulation. It is used to treat movement disorders such as Parkinson’s disease, but has been less successful when tested against mood disorders… The scientists behind the DARPA-funded projects say that their work might succeed where earlier attempts failed, because they have designed their brain implants specifically to treat mental illness — and to switch on only when needed.’

And how could the device know when to switch on and off? How could it even recognize the complex neural activity in mental illnesses? Well, apparently, ‘electrical engineer Omid Sani of the University of Southern California in Los Angeles — who is working with Chang’s team [a neuroscientist at UCSF] — showed the first map of how mood is encoded in the brain over time. He and his colleagues worked with six people with epilepsy who had implanted electrodes, tracking their brain activity and moods in detail over the course of one to three weeks. By comparing the two types of information, the researchers could create an algorithm to ‘decode’ that person’s changing moods from their brain activity. Some broad patterns emerged, particularly in brain areas that have previously been associated with mood.’
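For what it’s worth, the ‘closed-loop’ idea is, at bottom, a simple feedback loop: decode some proxy for mood from recorded neural features, and stimulate only when that proxy crosses a threshold. A minimal sketch -purely hypothetical, with invented names and a toy linear decoder, not the researchers’ actual algorithm- might look like this:

```python
# Illustrative sketch of a closed-loop stimulator: decode a scalar
# "mood" proxy from neural features, and stimulate only when the
# decoded value falls below a threshold. The linear decoder and all
# names here are hypothetical, not the DARPA teams' real methods.

def decode_mood(features, weights):
    """Hypothetical linear decoder: weighted sum of neural features."""
    return sum(w * f for w, f in zip(weights, features))

def closed_loop_step(features, weights, threshold):
    """Return True (deliver stimulation) only when decoded mood
    drops below the threshold; otherwise stay switched off."""
    return decode_mood(features, weights) < threshold

# Toy example: two recorded feature channels, fixed decoder weights.
weights = [0.8, -0.5]
print(closed_loop_step([1.0, 2.0], weights, 0.0))  # decoded mood -0.2 → True (stimulate)
print(closed_loop_step([2.0, 0.5], weights, 0.0))  # decoded mood 1.35 → False (stay off)
```

The point of the sketch is only that the device acts on a decoded proxy, not on the mood itself -which is exactly where my worry about ‘broad patterns’ comes in.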

Perhaps this might be the time to wonder whether ‘broad patterns’ can adequately capture the complexities of any mood, let alone a dysphoric one. Another group, this time in Boston, is taking a slightly different approach: ‘Rather than detecting a particular mood or mental illness, they want to map the brain activity associated with behaviours that are present in multiple disorders — such as difficulties with concentration and empathy.’ If anything, that sounds even broader -even less likely to hit the neural bullseye. But, I know, I know -it’s early yet. The work is just beginning… And yet, if there ever was a methodology more susceptible to collateral damage and unintended, unforeseeable consequences, or one that might fall more afoul of a hospital’s ethics committee, I can’t think of it.

For example, ‘One challenge with stimulating areas of the brain associated with mood … is the possibility of overcorrecting emotions to create extreme happiness that overwhelms all other feelings. Other ethical considerations arise from the fact that the algorithms used in closed-loop stimulation can tell the researchers about the person’s mood, beyond what may be visible from behaviour or facial expressions. While researchers won’t be able to read people’s minds, “we will have access to activity that encodes their feelings,” says Alik Widge, a neuroengineer and psychiatrist at Harvard University in Cambridge, Massachusetts, and engineering director of the MGH [Massachusetts General Hospital] team.’ Great! I assume they’ve read Orwell for some tips.

It’s one of the great conundrums of Science, though, isn’t it? When one stretches societal orthodoxy, and approaches the edge of the reigning ethical paradigm, how should one proceed? I don’t believe that merely assuming someone else, somewhere else, sometime else, will undoubtedly forge ahead with the same knowledge is sufficient reason to proceed. It seems to me that in the current climate of public scientific skepticism, it would be best to tread carefully. Science succeeds best when it is funded, fêted, and understood -not obscured by clouds of suspicion, or plagued by doubt, not to mention mistrust. Just look at how genetically modified foods are regarded in many countries. Or vaccinations. Or climate change…

Of course, the rewards of successful and innovative procedures are great, but so is the damage if they fail. A promise broken is more noteworthy, more disconcerting, than a promise never made.

Time for a thought experiment. Suppose I’ve advertised myself as an expert in computer hardware, and you come to me with a particularly vexing problem that nobody else seemed able to fix. You tell me there is a semi-autobiographical novel about your life that you’d been writing in your spare time for years, stored somewhere inside your laptop, that you can no longer access. Nothing was backed up elsewhere -you never thought it would be necessary- and now, of course, it’s too late for that. The computer won’t even work, and you’re desperate.

I have a cursory look at the model and the year, and assure you that I know enough about the mechanisms in the computer to get it working again.

So you come back in a couple of weeks to pick it up. “Were you able to fix it?” is the first thing you say when you come in the door.

I smile and nod my head slowly. Sagely. “It was tougher than I thought,” I say. “But I was finally able to get it running again.”

“Yes, but does it work? What about the contents? What about my novel…?”

I try to keep my expression neutral, as befits an expert talking to someone who knows nothing about how complex the circuitry in a computer can be. “Well,” I explain, “it was really damaged, you know. I don’t know what you did to it… but a lot of it was beyond repair.”

“But…”

“But I managed to salvage quite a bit of the function. The word processor works now –you can continue writing your novel.”

You look at me with a puzzled expression. “I thought you said you could fix it -the area where my novel is…”

I smile and hand you back the computer. “I did fix it. You can write again -just like before.”

“All that information… all those stories… They’re gone?”

I nod pleasantly, the smile on my face broadening. “But without my work you wouldn’t have had them either, remember. I’ve given you the opportunity to write some more.”

“But… But it was stored in there,” you say, pointing at the laptop in front of you on the counter. “How do I know who I am now?”

“You’re the person who has been given the chance to start again.”

Sometimes that’s enough, I suppose…


A Pound of Flesh?

 

I’m retired now, and my kids have long since passed the age when, even if I were so disposed, I would dare lay a hand on either them or their children. But of course I wouldn’t -parenting wasn’t like that in my family.

I suspect I rarely hung out in the Goldilocks zone in childhood. I was prey to all of the usual temptations on offer in 1950s Winnipeg, but it’s unclear to me just what I would have had to do to require corporal punishment. I realize that sounds naïve, even all these years later, but my father was not quick with the hand. In fact, on the one occasion he resorted to it, he seemed more upset by it than I, his recalcitrant offspring. And anyway, I think it was my mother’s idea that he wreak some stronger retribution than she could inflict on me with her voice.

My mother was into noise, actually. I imagine I was a frustrating child for her, and she would resort to yelling fits when things didn’t go well. Clearly I have a limited, and no doubt statistically insignificant, data set when it comes to the effects of corporal punishment, but I would venture to say that I feared my mother’s mouth far more than my father’s hand. My mother’s facial expression bespoke rage; my father’s, though, suggested sorrow -betrayal…

But I do not mean to disparage either of them, nor to suggest that they meted out cruel and unusual punishments under duress -I’m sure they were well-intentioned. And anyway, anecdotal evidence is a poor substitute for well-designed research, so I was pleased to see a more recent attempt to summarize what has been learned about the effects of, in this case, corporally disciplining children: https://theconversation.com/why-parents-should-never-spank-children-85962 The article was co-written by Tracie O. Afifi, Associate Professor, University of Manitoba, and Elisa Romano, Full Professor of Clinical Psychology, University of Ottawa.

‘The use of spanking has been hotly debated over the last several decades. Supporters state that it is safe, necessary and effective; opponents argue that spanking is harmful to children and violates their human rights to protection.’ But despite how common and widespread its use has been, it is now banned in 53 countries and states throughout the world. ‘The research clearly shows that spanking is related to an increased likelihood of many poor health, social and developmental outcomes. These poor outcomes include mental health problems, substance use, suicide attempts and physical health conditions along with developmental, behavioural, social and cognitive problems. Equally important, there are no research studies showing that spanking is beneficial for children.’ And, indeed, ‘An updated meta-analysis was most recently published in 2016. This reviewed and analyzed 75 studies from the previous 13 years, concluding that there was no evidence that spanking improved child behaviour and that spanking was associated with an increased risk of 13 detrimental outcomes. These include aggression, antisocial behaviour, mental health problems and negative relationships with parents.’ I suspect there were other things going on, in both intent and degree, that might have confounded these studies and led to the negative outcomes, though -apples are simply not oranges, and beating or assaulting someone is not the same as striking a buttock with an open hand as a way to deter an unwanted behaviour.

Of course, the researchers hasten to add that ‘this does not make parents who have used spanking bad parents. In the past, we simply did not know the risks.’ I think that lets my father off the hook; I’m not so sure about my mother, though. It seems to me that it is all too easy to condemn corporal punishment, while ignoring -or, perhaps, paying less attention to- the other forms of discipline that, intuitively at least, might be expected to result in equally detrimental consequences for a developing child. One of these, of course, is verbal haranguing.

I don’t believe that I was ever subject to verbal abuse, however. I was never demeaned or insulted by my mother -just confronted with my miscreant behaviour, and anointed with the requisite guilt- but I can understand how it could get out of hand under different circumstances and with different personalities. I find that worrisome -alarming, in fact. It is a behaviour that could all too easily slip under the radar. Be explained away.

I recognize that parenting is stressful, and that we all come to it with different temperaments, different abilities to tolerate stress, and different support structures that could be called upon in times of intolerable tension, but I suppose that is just the point. I wrote about this a while ago: https://musingsonwomenshealth.com/2017/05/17/time-out-eh/

But I fear that it sometimes requires the patience of Job to stand down enough to be able to socially isolate the misbehaving child with a time-out. It is preferable to spanking, to be sure, but I still wonder if what precedes it may be just the verbal abuse it seeks to avoid.

So, given our human propensity to react unpredictably and often adversely to stress, what am I advocating? Well, I have to admit that I have neither the background, nor the temerity to suggest that I have any productive answers. But although the Conversation article I quoted above was focused on spanking –physical punishment- it contains some suggestions that I think would be applicable to other punitive modalities like verbal abuse and insults.

‘Research already shows some evidence that parenting programs specifically aimed at preventing physical punishment can be successful. Some evidence for reducing harsh parenting and physical punishment has been found for Parent-Child Interaction Therapy (PCIT), the Incredible Years (IY) program and the Nurse Family Partnership (NFP). Other promising home visiting initiatives and interventions taking place in community and paediatric settings are also being examined for proven effectiveness.’

I know -education, education, education… But sometimes education is merely making people aware that alternatives exist. That there could be support out there of which they may not have been aware -both with friends and in the community. Remember that African proverb: it takes a village to raise a child.

 

The Idea of Ideal

Just when you think that you have a handle on what you’re supposed to look like, just when you’ve lost the weight, dyed your hair, and even forsworn relaxing at the beach on your days off, they up and change it on you. And the worst part: you don’t even know who ‘they’ are so you can’t post something against them on Facebook. But fads are like that, I guess. You never know when somebody is going to start one. You never know when the train is going to leave the station -with or without you.

Fitspiration -a neologism presumably coined to inspire fitness- would seem to encourage not just fitness, but a particularly muscular form of fitness: brawny fitness. Not only should you aspire to being fit, but also to looking fit. Thinspiration, fortunately, is in decline -unless, of course, it is accompanied by visibly toned muscles that reassure anybody who cares to observe, that the wearer is healthy and vigorous. Just dieting can’t do that.

But as an elderly male, I have to be careful here. Presumably I represent the dark side of the equation -or more accurately, I am non-representative of the case at issue. I have observer status at best. And yet, as detached from the fray as I am, I can claim to have witnessed a similar phenomenon in my admittedly testosterone-sodden brethren. I am old enough (barely) to remember those comic book ads for chest-expander-springs that promised relief for thin, but otherwise healthy young men who were constantly having sand kicked in their faces by muscular bullies on beaches populated by attractive, and admiring young women. Laughable in today’s world, they nonetheless suggested that the route to popularity, and attractiveness, was a physical one. A buff one.

And it seems to me that this undue emphasis on muscularity -on power, if you will- is a seductive trope that is no longer gendered. There is no compelling reason why it should ever have been, I suppose, although I have to say that physical power is illusory. It is what those people actually in control -the Mafia dons, for example- hire to protect them. Not command them.

I am clearly a product of my era: a consequence of the prevailing Weltanschauung on offer at the time. I suppose I was conditioned by those around me to view muscularity as a marker of fitness in athletes -male and female- but neither particularly desirable nor realistically attainable in the average person. Some men, to be fair, seemed to flaunt burl as signs of their masculinity, but apart from avoiding them on sandy beaches, I did not feel overly disadvantaged. Of course, they out-competed me for women, but so did everybody else.

My point, if I have to admit it, is that I have grown used to muscularity in men over the years. It seemed a natural thing, I guess, not an ostentatious badge of physicality. But I find it interesting that the fitspiration trend in women has now been noticed by the scientific community, as I discovered in an article written by two PhD candidates, Frances Bozsik at the University of Missouri-Kansas City, and Brooke L. Bennett at the University of Hawaii for the Conversation: https://theconversation.com/the-ideal-female-body-type-is-getting-even-harder-to-attain-91373

‘By now, most women are probably aware of the discrepancy between their bodies and the impossibly thin women who appear on TV and in magazines. This disparity was first identified in a 1980 study that compared the body weights of regular American women to prominent media figures … The researchers found that between 1959 and 1978, average female weights in the general population increased, while the women appearing in the media were actually getting thinner.

‘This matters because, particularly for women, exposure to thinner bodies contributes to body dissatisfaction, which can worsen your mood and lead to lower self-esteem. Those who aspire to this ideal figure can end up engaging in negative behaviors like restrictive eating or purging.’

So, ‘One trend that has gained traction is “fitspiration.” These are images and videos that depict women engaged in workouts or poses that highlight particular muscle groups like the abdomen or buttocks. In promoting muscularity, these images seem to be promoting healthy exercise. But analyses of the text accompanying the images have found that they often include guilt-inducing messages that focus on body image (e.g. “Suck it up now, so you don’t have to suck it in later”). In fact, one study has shown that an overwhelming percentage (72 percent) of these posts emphasize appearance, rather than health (22 percent). And it’s an appearance that’s not only muscular, but also thin.’

The authors go further: ‘You might wonder: Isn’t it healthy that women are increasingly preferring muscularity? Studies have examined the impact of viewing thin and toned bodies, and have found that they have a negative impact on the body image of female viewers. Just like the previous studies on media images that promote thinness, seeing thin, muscular women can lead to a negative mood and decreased body satisfaction.’

I think the aspect of the fitspiration movement that concerns me the most is its emphasis on appearance rather than health. I mean, believe me, I’m all for beauty, but not if it is at the expense of well-being. Not if an inability to live up to some ideal female body form leads to dysfunctional consequences. Heaven only knows there are enough things out there to admire, without requiring membership.

I suppose I could be accused of cherry-picking, though, of selecting an article that just happens to align rather conveniently with my own apparent biases, but there are many other studies out there with similar findings -for example: https://www.tandfonline.com/doi/full/10.1080/10410236.2016.1140273 or https://onlinelibrary.wiley.com/doi/abs/10.1002/eat.22403

And yes, although I try to remain objective, I find I am still conflicted about the muscular trend in women. Fortunately, in my circle of friends, their numbers are still too small to attract much attention -although I guess it’s quite possible that large muscles bulge unseen beneath an increasing number of coats and designer sweat shirts. Maybe, in fact, I should spend more time on beaches than I do. In truth, I’d love to see if sand-kicking has changed gender over the years.

Beauty is but a vain and doubtful good

When I was a child and began discovering myself in a mirror, I wondered about my nose. I thought it different from my friends’ -different from Teddy’s, at any rate. He was my best friend, and we went everywhere together. We had the same kind of jeans, and shared a similar taste in ice cream. We even rode the same kind of bikes to school. But he had a small, straight nose, and mine was fatter and had a little hook in it. Teddy didn’t think much about it -but that was because there was nothing wrong with his. Mine was ugly.

Only later did I begin to understand that it wasn’t ugliness I had been noticing, it was difference. And that looking the same as someone else wasn’t really a sign of beauty, any more than looking different was something shameful or unfair. But that awareness requires some maturity, I think, a Weltanschauung born of more experience than we can expect of a child.

And yet, what is beauty? Can one define it in isolation from what it is not, or must one be forever trapped on a Möbius strip of perspective? And where, on a Bell Curve, does beauty start -or ugliness begin? Short of a Goldilockean definition of ‘just-right-baby-bear’, is beauty actually amenable to definition? Or is it dependent on culture? Historical epoch?

Humanity has struggled with this at least since records have been kept. And the beauty/ugliness antipodes have survived largely as antagonists, dependent on each other as contrast -each is what the other cannot be.

At any rate, always alert to the nuances of the struggle, I was pleased to come across an essay on the history of ugliness in the online Aeon magazine, written by Gretchen Henderson, a teacher at Georgetown University and a Hodson Trust-JCB Fellow at Brown University: https://aeon.co/ideas/the-history-of-ugliness-shows-that-there-is-no-such-thing

‘This word [ugly] has medieval Norse roots meaning ‘to be feared or dreaded’ … Ugliness has long posed a challenge to aesthetics and taste, and complicated what it means to be beautiful and valued. Western traditions often set ugliness in opposition to beauty, but the concept carries positive meanings in different cultural contexts. The Japanese concept of wabi-sabi values imperfection and impermanence, qualities that might be deemed ‘ugly’ in another culture.’

‘‘Ugly’ is usually meant to slander, but in recent decades, aesthetic categories have been treated with growing suspicion. ‘We cannot see beauty as innocent,’ writes the philosopher Kathleen Marie Higgins, when ‘the sublime splendor of the mushroom cloud accompanies moral evil.’’

As Henderson says, ‘When we call something ugly, we say something about ourselves – and what we fear or dread.’

I have to say, I am very attracted to the concept embodied by the Koine Greek word for beautiful: horaios, etymologically related to hora, or ‘hour’. In other words, beauty means being in one’s hour. Wikipedia gives an example: ‘a ripe fruit (of its time) was considered beautiful, whereas a young woman trying to appear older or an older woman trying to appear younger would not be considered beautiful.’

So, can a nose be ugly? And what would that mean, exactly? To Teddy, it seemed a non-issue, and my constant reference to it bothered him.

“You talk about ugly,” he finally yelled over his shoulder at me as we were racing our bikes to school one morning. “Just look at Cindy, eh?”

“What’s wrong with Cindy?” I shouted back in little grunts, as I tried to catch up with him. Cindy was a girl who sat a few seats in front of me in class. I kind of thought she was cute, although I was much too shy to tell her.

“Ever look at her ears?”

I hadn’t, actually. But she had long brown hair that hung down to her shoulders in big, wavy curls, so the only thing that showed on her head was her face. I didn’t need to see her ears. And anyway, if you couldn’t see something, then it didn’t seem fair to call it ugly.

It made me glance up at Teddy’s ears as I finally caught up to him on a corner near the school, though; I hadn’t noticed them before. They were kind of weird -especially the way the little skin lobes hung down and danced around like earrings when he pedalled.

I checked my ears in the big mirror in the boys’ room when I got to school. Mine were normal, at least. Actually, if you ignored the nose, my face wasn’t too bad either. Everything seemed to match, and as far as I could tell, was in the right place. It was reassuring that I wasn’t a total mess.

That day, as I daydreamed in class, I glanced at the kids around me. Jacob’s chin was kind of long for his face, but Brian’s was almost not there -his neck seemed to join his face with only a little bump just below his lips. I hadn’t noticed that before. Janna had greasy hair, but of course that was no surprise -she also had red marks all over her cheeks, like she was infected, or something.

I could only see Cindy’s back from where I sat, of course, but it was wonderful enough to see her hair dancing over her shoulders as she moved around trying to find something on her desk. It was perfect hair -even Teddy couldn’t deny that. And I was pretty sure that it would smell like roses, or whatever. Beautiful girls were like that.

I think she could feel me staring at her, because just before the class ended she suddenly turned her head and glared at me. Her face was tight with… well, with anger, I think, and she screwed it up into a really horrid scowl. I’d never seen her like that before, and for a moment, I wondered if I’d misjudged her. Without warning, her beauty disappeared, and her eyes ripped into me like knives. Fury is really scary when you don’t expect it. Then, seeing that she had succeeded in punishing me, her face relaxed again and she turned away -whether back to beauty, though, I couldn’t tell.

And yet I suppose we are all different people at different times, aren’t we? Looking back at that memory after all these years, it seems obvious to me that the way we see the world is dependent on factors that are often out of our immediate control. Even our appearance in the mirror is contingent… although I’m used to my nose now.

Is there nothing either good or bad but thinking makes it so?

Sometimes there are articles that set my head spinning. Or my mind. Ideas that I’d never thought of before. Ideas that make me rummage around deep inside, like I’m searching for a pencil, or my internal keyboard where I write the things I should remember. I often don’t, of course -remember them, I mean- for how do you summarize and store enlightenment for a day, a week, a lifetime later? Sometimes you just have to explain it to yourself in your own words.

I subscribe to the online Aeon magazine -well, its newsletter, anyway- and I have to say that many of its articles are sufficiently mind-expanding as to qualify as epiphanous. Heuristic.

One such article on belief started me thinking. It was written by Daniel DeNicola, professor and chair of philosophy at Gettysburg College in Pennsylvania: https://aeon.co/ideas/you-dont-have-a-right-to-believe-whatever-you-want-to It questions whether we have the right to believe whatever we want -the denial of which is itself anathema in many quarters. But surely there’s no harm in having the unfettered right to believe whatever we wish?

And yet, as he asserts, ‘We do recognise the right to know certain things. I have a right to know the conditions of my employment, the physician’s diagnosis of my ailments … and so on. But belief is not knowledge. Beliefs are factive: to believe is to take to be true … Beliefs aspire to truth – but they do not entail it. Beliefs can be false, unwarranted by evidence or reasoned consideration. They can also be morally repugnant… If we find these morally wrong, we condemn not only the potential acts that spring from such beliefs, but the content of the belief itself, the act of believing it, and thus the believer.’

‘Such judgments can imply that believing is a voluntary act. But beliefs are often more like states of mind or attitudes than decisive actions … For this reason, I think, it is not always the coming-to-hold-this-belief that is problematic; it is rather the sustaining of such beliefs, the refusal to disbelieve or discard them that can be voluntary and ethically wrong.’

In other words, I may inherit beliefs from my family, or the circle of friends I inhabit, but that is no excuse for continuing to hold them if I come to realize they are harmful or factually incorrect.

‘Believing has what philosophers call a ‘mind-to-world direction of fit’. Our beliefs are intended to reflect the real world – and it is on this point that beliefs can go haywire. There are irresponsible beliefs; more precisely, there are beliefs that are acquired and retained in an irresponsible way. One might disregard evidence; accept gossip, rumour, or testimony from dubious sources; ignore incoherence with one’s other beliefs; embrace wishful thinking ….’

Here, though, DeNicola issues a caveat. He does not wish to claim that it is wrong, always, everywhere, and for anyone, to believe anything upon insufficient evidence. ‘This is too restrictive. In any complex society, one has to rely on the testimony of reliable sources, expert judgment and the best available evidence. Moreover, as the psychologist William James responded in 1896, some of our most important beliefs about the world and the human prospect must be formed without the possibility of sufficient evidence. In such circumstances … one’s ‘will to believe’ entitles us to choose to believe the alternative that projects a better life.’

Our beliefs do not have to be true if it is not possible to know what actually is true, or turns out to be the truth. As an example, ‘In exploring the varieties of religious experience, James would remind us that the ‘right to believe’ can establish a climate of religious tolerance.’ But even here, intolerant beliefs need not be tolerated: ‘Rights have limits and carry responsibilities. Unfortunately, many people today seem to take great licence with the right to believe, flouting their responsibility. The wilful ignorance and false knowledge that are commonly defended by the assertion ‘I have a right to my belief’ do not meet James’s requirements. Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas … In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.’

So, do we accept the right of the bearers of those beliefs to silence the rest of us, or should they merely be allowed to coexist in the noise? DeNicola would like to think, ‘[T]here is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs – and that ethic both generates and limits our right to believe.’

But, I wonder if the ethic is truly assignable -the noise can be overwhelming, and those who are the most persistent in its production end up deafening the rest of us. And although any responsibility for their belief should imply accountability, to whom are they accountable -to those in power, or to those who also share the belief? Do those with firmly held beliefs read articles like the ones in Aeon? And would they be swayed by the arguments even if they did?

Is it my responsibility to convince my opponents that my beliefs are right, or rather to set about proving that theirs are wrong? A fine distinction to be sure, and one that seems inextricably embedded in the web of other beliefs I have come to accept as valid markers of reality. And yet I think the thesis of DeNicola’s argument -that a belief, even if possibly untrue, should at the very least, not be dangerous, threaten harm, or prevent others from believing something else- is the most defensible. If nothing else, it carries the imprimatur of several thousand years of wisdom: the concept of reciprocity -the Golden Rule, if you will: what you wish upon others, you wish upon yourself. Or, in the Latin of the Hippocratic Oath: Primum non nocere -First of all, do no harm.

Is Seeing Believing?

Isn’t it interesting that some of us can look at a forest and miss the wind riffling through the leaves, while others see the moon as a ‘ghostly galleon tossed upon cloudy seas’? What determines what we see? Does it have to relate to something we’ve seen before -patterns that we recognize? Is our apprehension of reality an expectation? A sorting through the chaos and discarding what we don’t understand -the noise– until something more familiar emerges? Why do we not all see the same thing?

If patterns are what we are evolved to see, if they are what we use to make sense of the world, are there always patterns everywhere? These are things I wonder about, now that I have time to wonder. Now that I am retired, I suppose I can wade more thoughtfully into the turbulence I once found swirling about my days. Clarity is certainly not a common property of old age, but occasionally it descends as softly as a gossamer thread, and then as quickly drifts away leaving only traces of its presence. Doubts about its visit.

Are these mere hints of what the gifted see? Is peering beyond the horizon just a gift, or is it fleeting and unstable unless learned? There was an interesting essay in Aeon, an online offering that touched on the subject of insightful examination, by Gene Tracy, the founding director of the Center for the Liberal Arts at William and Mary in Williamsburg, Virginia: https://aeon.co/essays/seeing-is-not-simple-you-need-to-be-both-knowing-and-naive

‘When Galileo looked at the Moon through his new telescope in early 1610, he immediately grasped that the shifting patterns of light and dark were caused by the changing angle of the Sun’s rays on a rough surface. He described mountain ranges ‘ablaze with the splendour of his beams’, and deep craters in shadow as ‘the hollows of the Earth’. […] Six months before, the English astronomer Thomas Harriot had also turned the viewfinder of his telescope towards the Moon. But where Galileo saw a new world to explore, Harriot’s sketch from July 1609 suggests that he saw a dimpled cow pie.’ And so, the question must be asked, ‘Why was Galileo’s mind so receptive to what lay before his eyes, while Harriot’s vision deserves its mere footnote in history?’ But, as the author notes, ‘Learning to see is not an innate gift; it is an iterative process, always in flux and constituted by the culture in which we find ourselves and the tools we have to hand. […] the historian Samuel Y Edgerton has argued that Harriot’s initial (and literal) lack of vision had more to do with his ignorance of chiaroscuro – a technique from the visual arts first brought to full development by Italian artists in the late 15th century. By Galileo’s time, the Florentines were masters of perspective, using shapes and shadings on a two-dimensional canvas to evoke three-dimensional bodies in space. […] Harriot, on the other hand, lived in England, where general knowledge of these representational techniques hadn’t yet arrived. The first book on the mathematics of perspective in English – The Art of Shadows by John Wells – appeared only in 1635.’

But is it really as fortuitous as that? As temporally serendipitous? Tracy makes the point that, at least in the case of Science, observations are ‘often complex, contingent and distributed.’ And, ‘By exploring vision as a metaphor for scientific observation, and scientific observation as a kind of seeing, we might ask: how does prior knowledge about the world affect what we observe? If prior patterns are essential for making sense of things, how can we avoid falling into well-worn channels of perception? And most importantly, how can we learn to see in genuinely new ways?’

‘Scientific objectivity is the achievement of a shared perspective. It requires what the historian of science Lorraine Daston and her colleagues call ‘idealisation’: the creation of some simplified essence or model of what is to be seen, such as the dendrite in neuroscience, the leaf of a species of plant in botany, or the tuning-fork diagram of galaxies in astronomy. Even today, scientific textbooks often use drawings rather than photographs to illustrate categories for students, because individual examples are almost always idiosyncratic; too large, or too small, or not of a typical colouration. The world is profligate in its variability, and the development of stable scientific categories requires much of that visual richness to be simplified and tamed. […] So, crucially, some understanding of the expected signal usually exists prior to its detection: to be able to see, we must know what it is we’re looking for, and predict its appearance, which in turn influences the visual experience itself.’

‘If the brain is a taxonomising engine, anxious to map the things and people we experience into familiar categories, then true learning must always be disorienting. […] Because of the complexity of both visual experience and scientific observation, it is clear that while seeing might be believing, it is also true that believing affects our understanding of what we see. The filter we bring to sensory experience is commonly known as cognitive bias, but in the context of a scientific observation it is called prior knowledge. […] If we make no prior assumptions, then we have no ground to stand on.’

In his opinion, there is a thrust and parry between learning to see, and seeing to learn. I have no trouble with that, but I have to say that Science is only one Magisterium in a world of several. Science is neither omniscient, nor omnispective.

I happened across a friend standing transfixed in the middle of a trail in the woods the other day. A gentle breeze was coaxing her hair across her face, but her eyes were closed and she was smiling as if she had just been awarded an epiphany.

At first I wondered if I should try to pass her unannounced, but I suppose she heard my approach and glanced at me before I had made up my mind. Her eyes fluttered over my face for a moment, like birds investigating a place to perch, then landed as softly as a whisper on my cheek.

“I… I’m sorry, Mira,” I stammered, as surprised by her eyes as her expression. “You looked so peaceful, I didn’t want to disturb you…”

Her smile remained almost beatific, rapturous, but she recalled her eyes to brief them for a moment before returning them to me. “I was just listening to that bird,” she said and glanced into the thick green spaces between the trees to show me where, “when I felt the breeze…” I have to say, I hadn’t noticed anything -I hadn’t even heard the bird. “…And it touched my forehead like a kiss,” she said, and blushed for describing it like that. She closed her eyes and thought about it for a moment. “I can’t think of another word,” she added, and slowly walked away from me with a wink, onto a nearby path.

I don’t think that what she was saying was Science, or even meant to require a proof, and yet I felt far better knowing there are people like her in my world. I think I even felt a brief nuzzle by the wind as I watched her disappear into the waiting, excited fondle of the leaves.

Nobody in Particular

Why do we believe something? How do we know that we are right? When I was a child, I was certain that the Fleetwood television set my parents had just purchased was the best. So was the make of our car -and our vacuum cleaner too, come to think of it. But why? Was it simply because authority figures in my young life had told me, or was there an objective reality to their assertions? For that matter, how did they know, anyway? Other parents had different opinions, so who was right?

I was too young to question these things then, but gradually, I came to seek other sources of knowledge. And yet, even these sometimes differed. It’s difficult to know in what direction to face when confronted with disparate opinions. Different ‘truths’. Everybody can’t be right. Usually, in fact, the correct answer lies somewhere in the middle of it all, and it becomes a matter of knowing which truths to discard -choosing the ‘correct’ truth.

Despite the fact that most of us rely on some method like this, it sounds completely counterintuitive. How many truths can there be? Is each a truth, or merely an opinion? And what’s wrong with having a particular opinion? Again, how would we know? How could we know?

Nowadays, with social media algorithms selecting which particular news we see on the basis of our past choices, it’s difficult to know if we are in an echo chamber unless we purposely and critically examine whatever truths we hold dear -step back to burst the bubble. Canvass different people, and sample different opinions. But, even then, without resorting to mythology, or a presumed ‘revealed’ truth that substantiates a particular religious dogma, is there an objective truth that somehow transcends all the others? Conversely, is all truth relative -situationally contextualized, temporally dependent, and ultimately socially endorsed?

Should we, in fact, rely on a random sample of opinions to arrive at an answer -not merely to questions that are a matter of values, but to questions of realistically verifiable fact, such as the height of a building, say, or maybe the type of bacterium that causes a particular disease? Would that bring us closer to the truth, or simply yet another truth?

Well, it turns out that the average of a large group of diverse and even contrary opinions has some statistical merit: http://www.bbc.com/future/story/20140708-when-crowd-wisdom-goes-wrong  ‘[T]here is some truth underpinning the idea that the masses can make more accurate collective judgements than expert individuals.’ The Wisdom of Crowds ‘is generally traced back to an observation by Charles Darwin’s cousin Francis Galton in 1907. Galton pointed out that the average of all the entries in a ‘guess the weight of the ox’ competition at a country fair was amazingly accurate – beating not only most of the individual guesses but also those of alleged cattle experts. This is the essence of the wisdom of crowds: their average judgement converges on the right solution.’

But the problem is in the sampling -the diversity of the members of that crowd. ‘If everyone let themselves be influenced by each other’s guesses, there’s more chance that the guesses will drift towards a misplaced bias.’ Of course ‘This finding challenges a common view in management and politics that it is best to seek consensus in group decision making. What you can end up with instead is herding towards a relatively arbitrary position. Just how arbitrary depends on what kind of pool of opinions you start off with. […] copycat behaviour has been widely regarded as one of the major contributing factors to the financial crisis, and indeed to all financial crises of the past. [And] this detrimental herding effect is likely to be even greater for deciding problems for which no objectively correct answer exists. […] All of these findings suggest that knowing who is in the crowd, and how diverse they are, is vital before you attribute to them any real wisdom.’

This might imply that ‘you should add random individuals whose decisions are unrelated to those of existing group members. That would be good, but it’s better still to add individuals who aren’t simply independent thinkers but whose views are ‘negatively correlated’ – as different as possible – from the existing members. In other words, diversity trumps independence. If you want accuracy, then, add those who might disagree strongly with your group.’
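Galton’s averaging effect, and the way herding spoils it, can be sketched in a small simulation. The true ox weight of 1,198 lb is from Galton’s account; the noise levels and the herding rule below are my own illustrative assumptions, not anything from the article:

```python
import random

random.seed(42)
TRUE_WEIGHT = 1198  # Galton's ox weighed 1,198 lb

# Independent crowd: each guess is noisy but unbiased,
# so errors cancel and the average converges on the truth.
independent = [TRUE_WEIGHT + random.gauss(0, 150) for _ in range(1000)]

# Herding crowd: one loud, confidently wrong early voice anchors
# everyone else, who guess near the running average of prior guesses.
herded = [TRUE_WEIGHT + 300]  # the biased first guess
for _ in range(999):
    anchor = sum(herded) / len(herded)
    herded.append(anchor + random.gauss(0, 45))  # small deviations only

avg_independent = sum(independent) / len(independent)
avg_herded = sum(herded) / len(herded)

print(f"independent crowd average: {avg_independent:.0f} lb")
print(f"herded crowd average:      {avg_herded:.0f} lb")
```

The independent crowd lands within a few pounds of the true weight, while the herded crowd stays stuck near the arbitrary early anchor -exactly the ‘drift towards a misplaced bias’ the article describes.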

Do you see where I’m going with all this? We should try to be open enough to consider all sides of an argument before making a considered decision. Let’s face it, you have to know what it is that you’re up against before you can arrive at a compromise. And perhaps, the thing you thought you were opposing is not so different from your own view after all.

Even our values fluctuate. Unless we are willing to be historical revisionists, it’s obvious that people in the past often assigned values differently from how we do today -to sexual orientation, for example, or to racial characteristics and stereotypes. And who nowadays would dare argue that women are not the equal of men, or do not deserve the same rights?

There are some things about which we will continue to disagree, no doubt. And yet, even a willingness to listen to an opposing opinion instead of shutting it down without a fair acknowledgment of whatever merits it might have hidden within it, or commonalities it might share with ours, is a step in the right direction.

I’m not at all sure that it’s healthy to agree about everything, anyway, nor to assume we possess the truth. It’s our truth. I think that without some dissenting input, we’d be bored, condemned to float in the increasingly stagnant backwater we chose, while just beyond our banks, a creek runs merrily past, excited to discover another view that lies beyond and behind the next hill.

After all, remember what happened to Caesar after Shakespeare had him boast: “I am constant as the northern star, of whose true-fix’d and resting quality there is no fellow in the firmament.”

Just saying…