Is Everybody a Petard?

Sociology is certainly interesting; it turns out that none of us are normal -well, perhaps more revealingly, there is no normal ‘us’. We are, at best, data points spread out on a rather amorphous bell curve, vaguely generalizable depending on the homogeneity of the group chosen, but often unrepresentative of populations further afield.

And yet, why should that be a surprise to anybody who has vacationed in a different hemisphere -or, for that matter, simply walked through a poorer section of their own town? Or mingled with members of another ethnic community? Or even talked to a different age group…?

We seem enamoured with reducing people to numbers -statistics- as if by accumulating and analyzing them appropriately, we have proven something… Undoubtedly we have demonstrated something, but what? And how applicable is it over time and culture?

I have to admit that I have long felt that the generalizations were overdone, and in the current era of rapid dissemination of ideas that seem as stable as clothes in a washing machine, not terribly relevant. But the idea was reintroduced to me in an essay by Kensy Cooperrider, a cognitive scientist in the Department of Psychology at the University of Chicago.

His contention was that ‘On all continents, even in the world’s remotest regions, indigenous people are swapping their distinctive ways of parsing the world for Western, globalised ones. As a result, human cognitive diversity is dwindling… This marks a major change of course for our species. For tens of thousands of years, as we fanned out across the globe, we adapted to radically different niches, and created new types of societies; in the process, we developed new practices, frameworks, technologies and conceptual systems. But then, some time in the past few centuries, we reached an inflection point. A peculiar cognitive toolkit that had been consolidated in the industrialising West began to gain global traction. Other tools were abandoned. Diversity started to ebb.’

The toolkit he is referencing is captured in the acronym WEIRD -Western, Educated, Industrialized, Rich and Democratic- describing the students used as fodder for the studies being published in the sociological literature. He references a famous paper, published in 2010 and led by the psychologist Joe Henrich at the University of British Columbia, entitled ‘The Weirdest People in the World?’

And in that paper, Henrich made two claims: the first was that ‘researchers in the behavioural sciences had almost exclusively focused on a small sliver of humanity: people from Western, educated, industrialised, rich, democratic societies. The second was that this sliver is not representative of the larger whole, but that people in London, Buenos Aires and Seattle were, in an acronym, WEIRD.’ They were definitely not representative of the world at large, and yet, since this type of group was being referenced constantly, the psychologist Paul Rozin at the University of Pennsylvania felt it might be how otherwise disparate groups were beginning to see themselves: where he found cross-cultural differences, ‘they were more pronounced in older generations. The world’s young people, in other words, are converging.’

One example, as I have mentioned, is our obsession with numbers to quantify and measure things. There is nothing inherently wrong with this, of course, and yet it does represent a unique Weltanschauung that ignores other, no less valid, ways of engaging with everyday reality.

Another might be our fixation on Time -that artificial construct we append to every action, whether actual or impending. Again, for those of us who are tied to schedules it seems not only appropriate, but also necessary. How else could we survive and prosper in the life in which we are enmeshed?

There are other examples of the stamp our culture has had on far flung peoples, but the one that intrigues me the most is language. The currently evolving Lingua Franca (a strikingly ironic oxymoron) could reasonably be argued to be English. And why might that be important? ‘English is an egocentric language whereas most others are allocentric: English-speakers describe objects’ location in relation to themselves or other people, and not to other objects (for example, ‘the bike is five metres to my left’ rather than ‘the bike is next to the fire hydrant’).’

I had never thought of my language like that, I must admit, but if the contention is valid, the ramifications are interesting and it affects the kinds of studies that are carried out. ‘Our cultural bias means that not only do we ignore concepts that might be important in other countries – such as face, caste or honour – but that you often end up testing for an English-language concept (‘shame’, for example) which might have no direct equivalent in another society, or have different connotations.’

Henrich argued that ‘what we think of as science is all too often ‘WEIRD’ science… Between 2003 and 2007, 96 per cent of experimental volunteers in the leading psychology journals were WEIRD; 68 per cent of papers relied exclusively on US subjects; and in the prestigious Journal of Personality and Social Psychology, 67 per cent of total subjects were US psychology students. ‘Many fields have a model organism that they study… A lot of medicine is done with mice, a lot of genetics is done with fruit flies. And in psychology, the model organism is the American undergraduate.’ Perhaps things have changed since those statistics were collated, and yet, I’m sure fiscal constraints still limit both the amount of diversity attainable and the ability to replicate and validate whatever conclusions were obtained.

But, apart from paring off a few charming idiosyncrasies, and allowing -forcing?- strangers to adapt to how we in the WEIRD west view the world, is there any harm done? It’s still valuable information, right?

All information is no doubt valuable, but is it useful? Cooperrider summarizes his concern at the end: ‘For much of human history, one of our most distinctive traits as a species has been our sheer diversity.’ So, is that something we can afford to lose?

Not that I have any realistic say in the matter, but now that I understand the trend, I have to ask myself if I really want to live in a vanilla ice-cream world -one with no lumps in it. No mysterious colours, no fireside tales of how each of us came to be.

Are we not such stuff as dreams are made on?


Should We Bell the Cat?

What should you do at a dinner party if the hostess, say, declares that she believes something that you know to be inaccurate -or worse, that you consider repellent? Abhorrent? Should you wait to see how others respond, or take it upon yourself to attempt to correct her belief? If it is merely a divergence of opinion, it might be considered a doctrinaire exercise -a Catholic vs Protestant type of skirmish- and likely unwinnable.

But suppose it is something about which you are recognized to have particular credentials, so your response would be considered not merely an opinion, but a statement of fact? Should that alter your decision as to whether or not to take issue with her pronouncement? Would your silence imply agreement -acquiescence to a view that you know to be not only wrong, but offensive? And would your failure to contradict her signal something about her opinion to the others at the table? If it is an ethical issue, should you attempt to teach?

It is a difficult situation to be sure, and one that is hard to isolate from context and the responsibilities incumbent upon a guest. Still, what should you do if, uncorrected, she persists in promulgating her belief? Should you leave the table, try to change the topic, or merely smile and wait to see if she is able to sway those around you to her views?

I can’t say that the situation has arisen all that often for me, to tell the truth -we tend to choose our friends, and they theirs, on the basis of shared values- but what risks might inhere in whatever course of action I might choose? I happened upon an insightful and intriguing article that touched on that very subject in Aeon, an online magazine; it was written by John Schwenkler, an associate professor of philosophy at Florida State University.

He starts by pointing out that ‘Many of our choices have the potential to change how we think about the world. Often the choices taken are for some kind of betterment: to teach us something, to increase understanding or to improve ways of thinking. What happens, though, when a choice promises to alter our cognitive perspective in ways that we regard as a loss rather than a gain?’

And further, ‘When we consider how a certain choice would alter our knowledge, understanding or ways of thinking, we do this according to the cognitive perspective that we have right now. This means that it’s according to our current cognitive perspective that we determine whether a choice will result in an improvement or impairment of that very perspective. And this way of proceeding seems to privilege our present perspective in ways that are dogmatic or closed-minded: we might miss the chance to improve our cognitive situation simply because, by our current lights, that improvement appears as a loss. Yet it seems irresponsible to do away entirely with this sort of cognitive caution… And is it right to trust your current cognitive perspective as you work out an answer to those questions? (If not, what other perspective are you going to trust instead?)’

You can see the dilemma: is the choice or opinion you hold based on knowledge, or simply belief? And here he employs a sort of thought experiment: ‘This dilemma is escapable, but only by abandoning an appealing assumption about the sort of grasp we have on the reasons for which we act. Imagine someone who believes that her local grocery store is open for business today, so she goes to buy some milk. But the store isn’t open after all… It makes sense for this person to go to the store, but she doesn’t have as good a reason to go there as she would if she didn’t just think, but rather knew, that the store were open. If that were the case she’d be able to go to the store because it is open, and not merely because she thinks it is.’

But suppose that by allowing an argument -an opinion, say- to be aired frequently or uncontested, you fear you might eventually be convinced by it? It’s how propaganda endeavours to convince, after all. What then? Do you withdraw, or smile, and smile, and be a villain (to paraphrase Hamlet)? ‘If this is on the right track, then the crucial difference between the dogmatic or closed-minded person and the person who exercises appropriate cognitive caution might be that the second sort of person knows, while the first merely believes, that the choice she decides against is one that would be harmful to her cognitive perspective. The person who knows that a choice will harm her perspective can decide against it simply because it will do so, while the person who merely believes this can make this choice only because that is what she thinks.’

This is philosophical equivocation, and Schwenkler comes close to admitting as much in his conclusion.

As much as I enjoy the verbiage and logical progression of his argument, I have to admit to being a little disappointed by the concluding paragraph of the article, which seems to admit that he has painted himself into a corner: ‘What’s still troubling is that the person who acts non-knowingly and from a mere belief might still believe that she knows the thing in question: that climate change is a hoax, say, or that the Earth is less than 10,000 years old. In that case, she’ll believe that her choices are grounded in the facts themselves, and not just in her beliefs about them. She will act for a worse sort of reason than the sort of reason she takes herself to have. And what could assure us, when we exercise cognitive caution in order to avoid what we take to be a potential impairment of our understanding or a loss of our grip on the facts, that we aren’t in that situation as well?’

But I think what this teaches me is the value of critical analysis, not only of statements, but also of context: first, obviously, to assess the validity of whatever argument is being aired, and then to decide whether an attempted refutation would contribute anything to the situation, or merely further entrench the individual in their beliefs, if only to save face. It’s also important to step back for a moment and assess the real reason I am choosing to disagree. Is it self-aggrandizement, dominance, or an incontestable conviction -incontestable based on knowledge, or on unprovable belief…?

I realize this is pretty confusing stuff -and, although profound, not overly enlightening- but sometimes we need to re-examine who it is we have come to be. In the words of the poet Kahlil Gibran, The soul walks not upon a line, neither does it grow like a reed. The soul unfolds itself like a lotus of countless petals.

Understanding as…

There is so much stuff out there that I don’t know -things that I hadn’t even thought of as knowledge. Things that I just accepted as ‘givens’. You know, take the ability to understand something like, say, an arrangement of numbers as a series rather than a mere bunch of numbers, or the ability to extract meaning from some sounds -words spoken in English, for example- and yet not from others in a different language.

And, perhaps equally mysterious is the moment when that epiphany strikes. What suddenly changes those numbers into a series? Is it similar to what makes figure-ground alterations flip back and forth in my head: aspect perception? Is it analogous to the assignation of meaning to things -or, indeed, picking them out of the chaos of background and recognizing them as somehow special in the first place? Is it what Plato meant when he referred to the Forms –‘chairness’ or ‘tableness’ for example- abstractions that allow us to identify either, no matter how varied the shapes or sizes -the true essence of what things really are?

I suppose I’m becoming rather opaque -or is it obtuse?- but the whole idea of aspect perception, of ‘seeing as’, is an exciting, yet labyrinthine terra incognita, don’t you think? I’m afraid that what started it all was an essay in the online Aeon publication.

It was the edited version of an essay written by Stephen Law, the editor of the Royal Institute of Philosophy journal THINK. He begins by discussing some of the figure-ground changes found in, say, Necker cubes, whose sides keep flipping back and forth (a type of aspect perception), and then suggests that ‘A[nother] reason why changes in aspect perception might be thought philosophically significant is that they draw our attention to the fact that we see aspects all the time, though we don’t usually notice we’re doing so… For example, when I see a pair of scissors, I don’t see them as a mere physical thing – I immediately grasp that this is a tool with which I can do various things.’

Another example might be ‘…our ability to suddenly ‘get’ a tune or a rule, so we are then able to carry on ourselves.’ Or, how about religion? ‘The idea of ‘seeing as’ also crops up in religious thinking. Some religious folk suggest that belief in God doesn’t consist in signing up to a certain hypothesis, but rather in a way of seeing things.’ But then the caveat: ‘Seeing something as a so-and-so doesn’t guarantee that it is a so-and-so. I might see a pile of clothes in the shadows at the end of my bed as a monster. But of course, if I believe it’s a monster, then I’m very much mistaken.’

I have always loved wandering around bookstores. Maybe it’s an asylum -a refuge from the noisy street, or a spiritual sanctuary in a chaotic mall -but it’s more likely that the range and choice of books allows me to exercise an epiphanic region of my brain, and to practice ‘seeing as’ to my heart’s content. I’d never thought of bookstores as exercise before, of course, but I suppose the seed of ‘understanding as’ was sown by that article… or maybe it was the little girl.

Shortly after reading the essay, I found myself wandering blissfully through the quiet aisles of a rather large bookstore that seemed otologically removed from the noisy mall in which it hid. Coloured titles greeted me like silent hawkers in a park -the ones that sat dislodged from their otherwise tidy rows sometimes reaching out to me with greater promise: curiosity, perhaps, as to why someone might have dislodged them. Nonetheless, I found myself amused at their choices: bookstores are catholic in the selection they proffer, and I relish the opportunity to switch my perspectives… and expand my Weltanschauung, as the Philosophy section into which I had meandered might have put it when the thought occurred.

Of course, unexpected concepts like that are one of the delights of a bookstore -turn a corner into a different aisle and the world changes. It’s where I met the little girl talking to her mother about something in a book she was holding.

No more than four or five years old, she was wearing what I suppose was a pink Princess costume, and trying to be very… mature. Her mother, on the other hand, was dressed for the mall: black baseball cap, jeans, sneakers, and a grey sweatshirt with a yellow mustard stain on the front. Maybe they’d just come from a party, or, more likely, the Food Court, but the mother was trying to explain something in the book to her little daughter. The aisle wasn’t in the children’s section, but seemed to have a lot of titles about puzzles, and illusions, so maybe they’d wandered into it for something different: for surprises.

As I pretended to examine some books nearby, I noticed a Necker cube prominently displayed on the page the girl was holding open.

“Why does it do that, mommy?” Even as she spoke the perspective of the cube was flipping back and forth, with one face, then another seeming to be closer.

The mother smiled at this obvious teaching moment.

“It’s a great idea, anyway,” the daughter continued, before she got an answer.

“Idea…?” the mother said, with a patient look on her face. “What’s the idea, Angie?”

Angie scrunched her forehead and gave her mother a rather condescending look. “It’s an exercise book, remember?”

That apparently caught the mother by surprise. “It’s a book of puzzles and magic, sweetheart. I didn’t see any exercises.”

Angie rolled her eyes at her mother’s obvious obtuseness. “The nexercise cube, mommy…!”

“Necker’s cube, sweetie,” she responded, trying to suppress a giggle. “It’s not an exercise cube.”

But Angie was having none of that, and stared at her like a teacher with a slow pupil. “It keeps making my mind move, mommy!” She shook her head in an obviously disappointed rebuke. “That’s exercise.”

I slipped back around the corner, unnoticed by them both I think. I felt I’d intruded on a very intimate moment and I didn’t want to trespass, but I couldn’t help wondering if Angie had come far closer to understanding Plato’s Forms than her mother or I could ever hope to.

Life’s Fitful Fever

I have never been terribly interested in historical statues, I must confess. Pigeon-encrusted metal, or a mouldering stone figure staring blankly at nothing and rooted firmly to a static prancing horse, does little to attract the attention of passersby like myself, with lives and histories of their own to contemplate. Its attempts to dominate a plaza, or commemorate a public square, still do not often produce sufficient motivation to inspect its fading plaque. Perhaps I am alone in this, but unless intentionally identified, it is merely background. I do not notice it in the Gestalt.

And yet, it would seem that there are those who would have me think otherwise and devote undue attention to its original function. They would have me reconsider its historical significance.

Historical revisionism has never been a strong suit of mine –the present, with all of its problems, occupies most of my time. As an aging white male, I suppose I am history, or at least have lived through some of its more recent manifestations and survived. But of course I am one of the fortunate ones, and have been largely cosseted by my gender, ethnicity, and geography, so I appreciate the need to consider the lives and background of those not so blessed. And, had I been one of them, I’m certain I would also be less accepting of a majoritarian view of historical interpretation. History, after all, is written by the victors, not to mention the oppressors, and it frequently ignores the darker side -or at least reinterprets it to suit the prevailing ethos of the time.

And so the current movement to amend our view of omnia praeterita is understandable. It’s just that the solution often chosen -pulling down statues, or renaming public edifices- creates adversaries who otherwise wouldn’t exist, and problems that rise like bubbles in a boiling pot. Statues, of course, can be seen as emblems of past injustice, and the contributions they commemorate as either misleading or frankly misinformed. That none of us -not even the protesters- is blemish-free is lost in the fervour to acknowledge historical repression or exploitation. I neither can, nor wish to, deny any calumny that may be hidden in stone or trapped in rusting metal, but I do hope that there is a middle ground. A workable compromise.

Despite the rush to take sides, I suspect it would be beneficial for all concerned to step back from the abyss of righteousness and look for solutions that neither polarize nor punish. The past need not be prologue, but the question of historical truth, and of unfair representation, is a vexing one, and solutions are often fraught. In my search for background, however, I found an interesting and fairly balanced discussion of the problems. There are, perhaps, no definitive prescriptions, but it at least attempts a balanced context -a philosophical analysis, of a sort.

First, there is a succinct formulation of the problem: ‘All around the world, institutions are dealing with a conundrum. What to do about statues or buildings or scholarships or awards, honouring or funded by people we now regard as seriously morally flawed?’

And then there follows a discussion about possible solutions. ‘One approach is to do nothing. The do-nothing advocates say history shouldn’t be rewritten. To do so would be a form of censorship. And, they say, it’s ridiculous to expect every great historical figure to be blemish-free, to have lived a life of unadulterated purity.’ But, as is pointed out, with that approach, ‘Even those held up as saintly figures, such as Mahatma Gandhi or Nelson Mandela, had flaws (Gandhi’s attitude to women is excruciating, seen through 21st Century eyes)… And what message would it send to contemporary philanthropists? Give generously today, and risk having your reputation trashed tomorrow.

‘But this “do-nothing” position seems too extreme. Imagine that Goebbels had endowed scholarships to Oxford, like Rhodes. Would anybody seriously claim the Goebbels Scholarships shouldn’t be renamed (would anybody want to be a Goebbels Scholar?) or that a Goebbels statue shouldn’t be demolished?’

But, of course, the vast majority of people are neither complete monsters nor complete angels. So, ‘What sort of considerations, then, should come into play? One may be whether the views or actions of the figure in question were typical for their time. If so, that could make them less blameworthy. Another is the extent of their misdeeds and how that is evaluated against their achievements. Churchill held opinions that would disbar him from political office today – despicable yes, but surely massively outweighed by the scale of his accomplishments.’

The article, written by David Edmonds of the BBC, then goes on to point something out that I hadn’t thought of before: ‘… there are what philosophers call consequentialist considerations. How does looking at the statue make passers-by feel? This, in turn, will be connected to whether the history still resonates – an ancient statue of some medieval warlord, however bloody and brutal his conquests, probably won’t bother anybody. And, arguably, a statue of Rhodes in Cape Town will arouse more offence than one of the same man in Oxford.’

In fact, ‘Intergenerational justice is a hugely complex topic, not least because over the passage of time it becomes tricky to identify beneficiaries and victims. Daniel Butt [a politics fellow at Balliol College, Oxford] believes that where there is a clear historical continuity with the past, a modern institution has a duty to remedy wrongs, most especially when the impact of these wrongs is still being felt – for example in racial discrimination. He says Oxford’s complicity in colonialism confers upon it obligations…’ But that complicity needs recognition -it needs to see the light of day.

There are other solutions that try to mitigate any past harms the statue or name might invoke. One such approach would be for the institution with the offending statue to offer scholarships, donations, or aid in areas of the world affected by the individual enstatued. Or perhaps, as has been suggested in the American South, changing the information on the existing plaques to acknowledge the injustices meted out -making the reader, of whatever persuasion, aware of them.

It seems to me that merely removing statues, or changing a commemorative name, is equivalent to burying the past. Out of sight, out of mind. If there truly has been an injustice, then surely recognizing it, making those hitherto unaware of it confront the issues, is more likely to prevent it from happening again. More likely to be a lesson, a cautionary tale that has to be heard, a violation that has to be seen. As Brutus says in Shakespeare’s Julius Caesar: It is the bright day that brings forth the adder and that craves wary walking.