To wear an undeserved dignity

Lately, I’ve been worried about dignity -not my own, you understand, although I’m sure that could use a little work. I’m more concerned that what I assumed was an inherent quality possessed -if not always demonstrated- by us all may not be as innate as I thought. An essay in the online publication Aeon -Dignity is Delicate, by Remy Debes, an associate professor of philosophy at the University of Memphis- helped me to understand some of the concept’s complications: https://aeon.co/essays/human-dignity-is-an-ideal-with-remarkably-shallow-roots?

The word itself is derived from the Latin dignus, meaning ‘worthy’, but as with most words, it can be used in different ways, each with slightly different meanings. ‘Dignity has three broad meanings. There is an historically old sense of poise or gravitas that we still associate with refined manners, and expect of those with high social rank… Much more common is the family of meanings associated with self-esteem and integrity, which is what we tend to mean when we talk of a person’s own ‘sense of dignity’… Third, there is the more abstract but no less widespread meaning of human dignity as an inherent or unearned worth or status, which all human beings share equally.’

This latter aspect, which Debes calls the ‘moralized connotation’, is ‘the kind of worth everyone has, and has equally, just because we are persons.’ As Immanuel Kant wrote in his Groundwork for the Metaphysics of Morals in 1785: ‘whatever is above all price, and therefore admits of no equivalent, has a dignity.’ He also argued that we have a duty to treat other humans ‘always at the same time as an end, never merely as a means’ -with respect, in other words. Unfortunately, ‘the Groundwork wasn’t professionally translated until 1836. And even that translation wasn’t easily available until a revised edition appeared in 1869.’

So, in terms of its moral and ethical aspects, the concept of dignity is a recent one. ‘[U]ntil at least 1850, the English term ‘dignity’ had no currency as meaning anything like the ‘unearned worth or status of humans’, and very little such currency well into the 1900s. When the Universal Declaration of Human Rights (1948) used the terminology of human dignity to justify itself, this turned out to be a conceptual watershed.’

What am I missing here? As Debes illustrates in his essay, ‘the idea of human dignity is beset by hypocrisy. After all, our Western ethos evolved from, and with, the most violent oppression. For 200 years, we’ve breathed in the heady aspirations of liberty and justice for all, but somehow breathed out genocide, slavery, eugenics, colonisation, segregation, mass incarceration, racism, sexism, classism and, in short, blood, rape, misery and murder.’ So what is going on? Debes thinks ‘The primary way we have dealt with this shock and the hypocrisy it marks has been to tell ourselves a story – a story of progress… the story’s common hook is the way it moves the ‘real’ hypocrisy into the past: ‘Our forebears made a terrible mistake trumpeting ideas such as equality and human dignity, while simultaneously practising slavery, keeping the vote from women, and so on. But today we recognise this hypocrisy, and, though it might not be extinct, we are worlds away from the errors of the past.’

Of course, a still different way of explaining our abysmal lack of dignity is to suggest, not that we are getting better, but that we are getting worse -that there was a time when it was not so, and we need to try going back to that ‘better time’.

Uhmm, they can’t both be correct. Perhaps, like me, you have noticed the presence of gerunds (verbs functioning as nouns with –ing endings), or implied gerunds, in these descriptions: from the Latin gerundum –‘that which is to be carried on’. In other words, that which is not yet completed, or is in the process of happening, and hopefully will remain so into the indefinite future. As Debes writes, ‘facing up to the hypocrisy in our Western ethos requires resisting the temptation to scapegoat both the past and the present. We must not divorce ourselves from the fact that the present is possible only because of our past, the one we helped to create. Likewise, the existential question isn’t, are we really who we say we are? The question is, have we ever been?’

But why is everything so viscid? Humans have always been seen as valuable -the concept evolving through time. ‘The chorus in Sophocles’ Antigone, for example, praises man as the most ‘wondrous’ thing on Earth, a prodigy cutting through the natural world the way a sailor cuts through the ‘perilous’, ‘surging seas’ that threaten to engulf him.’ The word ‘dignity’ was not used, but it seems to me Sophocles was on the right track, although perhaps not in the sense that mankind’s value was incommensurable and couldn’t be exchanged for other kinds of worth, as Kant would later conclude.

Or how about Aristotle: ‘Dignity does not consist in possessing honours, but in deserving them.’

Even Shakespeare’s Hector, arguing with Troilus about whether Helen of Troy is worth going to war for, says: ‘value dwells not in particular will; it holds his estimate and dignity as well wherein ’tis precious of itself as in the prizer.’ In other words, value -dignity- isn’t subjective; it’s intrinsic.

So what has kept us from believing in that ‘inherent or unearned worth or status, which all human beings share equally’? Admittedly we are children of our era, and very few of us can escape from the Weltanschauung of our time, let alone the political and social ethos in which we find ourselves embedded. There is much that conspires to homogenize and temper our views, I suspect.

Maybe it was as simple as a fear of the unknown, and of disruption, that kept the lid on the pot -better the devil we know than the devil we don’t. Moral dignity -ethical dignity- did not accord with the status quo: keeping slaves, or maintaining a class system that offered wealth and status to the powerful; and women, trapped in a never-ending cycle of pregnancies and children, were themselves essentially biologically enslaved… A clock will not work unless all of the parts are in their proper places.

So many levels: civilization -well, at least culture- has always been a matryoshka doll -‘a riddle wrapped in a mystery inside an enigma’, as Winston Churchill so famously said about Russia. But maybe, concealed inside the innermost layer, the sanctum sanctorum of the inner doll, a flower lives, not a minotaur.

We can only hope.

Understanding as…

There is so much stuff out there that I don’t know -things that I hadn’t even thought of as knowledge. Things that I just accepted as ‘givens’. You know, take the ability to understand something like, say, an arrangement of numbers as a series rather than as a bunch of numbers, or the ability to extract meaning from some sounds -words spoken in English, for example- and yet not from others in a different language.

And, perhaps equally mysterious, is the moment when that epiphany strikes. What suddenly changes those numbers into a series? Is it similar to what makes figure-ground alternations flip back and forth in my head: aspect perception? Is it analogous to the assignment of meaning to things -or, indeed, to picking them out of the chaos of background and recognizing them as somehow special in the first place? Is it what Plato meant when he referred to the Forms -‘chairness’ or ‘tableness’, for example- abstractions that allow us to identify either, no matter how varied the shapes or sizes -the true essence of what things really are?

I suppose I’m becoming rather opaque -or is it obtuse?- but the whole idea of aspect perception, of ‘seeing as’, is an exciting yet labyrinthine terra incognita, don’t you think? I’m afraid that what started it all was an essay in the online publication Aeon: https://aeon.co/ideas/do-you-see-a-duck-or-a-rabbit-just-what-is-aspect-perception

It was the edited version of an essay written by Stephen Law, the editor of the Royal Institute of Philosophy journal THINK. He begins by discussing the kind of figure-ground change found in, say, the Necker cube, whose faces keep flipping back and forth (a type of aspect perception), and then suggests that ‘A[nother] reason why changes in aspect perception might be thought philosophically significant is that they draw our attention to the fact that we see aspects all the time, though we don’t usually notice we’re doing so… For example, when I see a pair of scissors, I don’t see them as a mere physical thing – I immediately grasp that this is a tool with which I can do various things.’

Another example might be ‘…our ability to suddenly ‘get’ a tune or a rule, so we are then able to carry on ourselves.’ Or, how about religion? ‘The idea of ‘seeing as’ also crops up in religious thinking. Some religious folk suggest that belief in God doesn’t consist in signing up to a certain hypothesis, but rather in a way of seeing things.’ But then the caveat: ‘Seeing something as a so-and-so doesn’t guarantee that it is a so-and-so. I might see a pile of clothes in the shadows at the end of my bed as a monster. But of course, if I believe it’s a monster, then I’m very much mistaken.’

I have always loved wandering around bookstores. Maybe it’s an asylum -a refuge from the noisy street, or a spiritual sanctuary in a chaotic mall -but it’s more likely that the range and choice of books allows me to exercise an epiphanic region of my brain, and to practice ‘seeing as’ to my heart’s content. I’d never thought of bookstores as exercise before, of course, but I suppose the seed of ‘understanding as’ was sown by that article… or maybe it was the little girl.

Shortly after reading the essay, I found myself wandering blissfully through the quiet aisles of a rather large bookstore that seemed otologically removed from the noisy mall in which it hid. Coloured titles greeted me like silent hawkers in a park -the ones that sat dislodged from their otherwise tidy rows sometimes reaching out to me with greater promise: curiosity, perhaps, as to why someone might have dislodged them. But nonetheless, I also found myself amused at their choices: book shops are catholic in the selection they proffer, and I relish the opportunity to switch my perspectives… and expand my Weltanschauung, as the Philosophy section into which I had meandered might have put it when the thought occurred.

Of course, unexpected concepts like that are one of the delights of a bookstore -turn a corner into a different aisle and the world changes. It’s where I met the little girl talking to her mother about something in a book she was holding.

No more than four or five years old, she was wearing what I suppose was a pink Princess costume, and trying to be very… mature. Her mother, on the other hand, was dressed for the mall: black baseball cap, jeans, sneakers, and a grey sweatshirt with a yellow mustard stain on the front. Maybe they’d just come from a party or, more likely, the Food Court, but the mother was trying to explain something in the book to her little daughter. The aisle wasn’t in the children’s section, but seemed to have a lot of titles about puzzles and illusions, so maybe they’d wandered into it for something different: for surprises.

As I pretended to examine some books nearby, I noticed a Necker’s cube prominently displayed on the page the girl was holding open.

“Why does it do that, mommy?” Even as she spoke, the perspective of the cube was flipping back and forth, with one face, then another, seeming to be closer.

The mother smiled at this obvious teaching moment.

“It’s a great idea, anyway,” the daughter continued, before she got an answer.

“Idea…?” the mother said, with a patient look on her face. “What’s the idea, Angie?”

Angie scrunched her forehead and gave her mother a rather condescending look. “It’s an exercise book, remember?”

That apparently caught the mother by surprise. “It’s a book of puzzles and magic, sweetheart. I didn’t see any exercises.”

Angie rolled her eyes at her mother’s obvious obtuseness. “The nexercise cube, mommy…!”

“Necker’s cube, sweetie,” she responded, trying to suppress a giggle. “It’s not an exercise cube.”

But Angie was having none of that, and stared at her like a teacher with a slow pupil. “It keeps making my mind move, mommy!” She shook her head in an obviously disappointed rebuke. “That’s exercise.”

I slipped back around the corner, unnoticed by them both I think. I felt I’d intruded on a very intimate moment and I didn’t want to trespass, but I couldn’t help wondering if Angie had come far closer to understanding Plato’s Forms than her mother or I could ever hope to.

Medical Revisionism

Words -that’s all they are: sounds that by their very presence magically communicate meaning. They are more than mere noise or background. They are not the wind rustling through the leaves, nor the sounds of a frog in a pond; in a way, they are entities that resolve uncertainty and, inasmuch as they can be interpreted, contain information. Data. So, in a sense, they transcend Time: the information in the words of an ancient document still exists. But information is subject to interpretation; the same data may be seen as having different meanings as time and societal norms change. But does that change the information conveyed? I think not.

I’ve covered this topic in previous blogs (for example: https://musingsonwomenshealth.wordpress.com/2013/11/01/whats-in-a-name-cancer/ ), but it remains a source of continuing intrigue for me, so I was once again interested in seeing it broached in an article in the BBC News last fall: http://www.bbc.com/news/blogs-ouch-34385738. It seems we are constant and insatiable revisionists. It’s as if by changing the descriptor, we somehow alleviate the pejoration its ancestor accumulated. And yet the information remains; only the colour changes.

I suppose that this is useful, but I can’t help but wonder if there is some other way of doing it. Of course, some words seem to have been coined originally with a belittling intent -Cripple springs to mind- and even without our penchant for viewing the machinations of history through modern eyes, the word is disparaging; it is simply not fair. It derives from the Old English word crypel, which carries the suggestion of creeping. The condition was clearly in need of a new term.

Other words were more naive attempts at description -designations that were no doubt thought to help others picture what was being named. There was unlikely to have been any attempt at denigration -despite how they might now offend or upset us. Mongolism is one such term. According to the New Oxford American Dictionary: ‘mongol, or Mongoloid, was adopted in the late 19th century to refer to a person with Down syndrome (named after John L. H. Down [1828–96], the English physician who first described it), owing to the similarity of some of the physical symptoms of the disorder with the normal facial characteristics of eastern Asian people. The syndrome itself was thus called mongolism.’ But the problem remains -what happens when the term ‘Down Syndrome’ itself also becomes offensive?

Sometimes, it seems to me, words will also change for no apparent reason. Think of the succession of terms for sexual diseases over the years, and the somewhat clumsy attempts to strip the prejudice out of them. When I first started medical school, the expression was ‘venereal disease’ -or VD. Then, when that became too pejorative, or at least discriminatory, it morphed into STD (‘sexually transmitted disease’), and currently STI, for ‘sexually transmitted infection’… Or am I already out-of-date? The reason for any of these transformations, however, is totally beyond me.

Words, it seems -or maybe it’s me- just can’t keep up. Maybe, like Fashion, they’re bound to change because of user-boredom or a need for novelty, but I think it’s probably deeper than that. I suspect that it relates more to societal attitudes than societal ennui. And I think that it may be a lost cause to expect consistency of usage. As we change our approach to issues and our opinions, so we change the words we use to describe them. It starts off with the more curmudgeonly amongst us -usually those for whom tradition provides a stable and secure platform- proclaiming the changes to be ‘political correctness’, to use the current phrase. But then, gradually, sometimes imperceptibly, the expression achieves common parlance, and not using it courts sideways glances, or even incomprehension. It is, perhaps, an aurally measurable example of society’s changing attitudes, if not its mores.

My biggest complaint, however -although minor in the scheme of things- is that it seems a waste of perfectly good words. One of my favourites, ‘awe’, and its brother ‘awesome’, which used to bespeak a form of reverence, were ripped from my useful vocabulary only a few years ago, and I’ve never really gotten over it. The words now have little value -they’re the scrapings from a different, grander time. Crumbs. Leftovers.

I am reminded of the words of Moth, the page of the soldier Don Armado in Love’s Labour’s Lost by Shakespeare: ‘They have been at a great feast of languages, and stol’n the scraps.’ 

The Linguistic Pregnancy

What is pregnancy? What’s in a name, for that matter…? Is it true that a rose by any other name would smell as sweet, or is there something in the name itself that alters and affects that to which it refers? Neo-Whorfianism, in other words…

For example, one Chinese expression for what we in English call ‘pregnancy’ is youxi (transliterated, of course). If you break it apart, though, it is composed of two Chinese characters: you -which means something like ‘to have’- and xi, which means ‘joy’, or words to that effect. Only when strung together as a unit does it mean ‘pregnancy’.

There are many other similar examples, of course; the one that comes to mind here in bilingual Canada is the French word for pregnancy: la grossesse –largeness. Or how about Spanish: embarazada –etymologically it derives from the same root as does the English ‘embarrassed’.

But is this really telling us anything important about the culture – or anything at all? Look up our own English word ‘pregnant’. It derives (probably), says the Oxford English Dictionary, from two Latin words: prae –meaning ‘before’ and the base of gnasci, or nasci –be born. Not much to talk about there… Time for a little background.

In the 1930s, Benjamin Whorf hypothesized that language alters how its users view reality. If a society has no words for numbers, then how could its members count? He claimed that the Hopi language -spoken by a Native American people- had no markers of time -no ‘later’ or ‘earlier’, for example. So maybe its speakers considered past, present and future the same? No words for time, no sense of time… The hypothesis put the cart before the horse, it would seem, but the idea caught on… for a while, anyway.

Tempting as it may be to read cultural and etymological significance into the words that have come to be used for pregnancy -are Spaniards really embarrassed about being pregnant, for example?- many linguists have suggested that there is little if any validity in doing so. Well, perhaps they’re right -all I know about language is what the experts tell me, and that seems to change over time.

So maybe I can take my pick of the plethora of linguistic opinions. I mean, it all seems to hinge on which theory is ascendant, which linguist is the most convincing/charismatic, and which theory gets the most press -a rare thing at best in Language Theory. But sensitivities do change, and revisionism usually rears its head to correct insensitive contentions. Data appears to go in and out of fashion; each side argues about it and then, poof, a paradigm shift, and they’re off again. It’s almost like watching a hockey game.

From a decidedly lay position, though -one firmly rooted in popular mythology- I’ve come to suspect that linguists are trying to take the soul out of language: the fun. So I’m throwing in my lot with the opposition. Languages are alive; they simmer and bubble neologistically; they evolve according to need. They incorporate metaphor… They are metaphor until a suitable word is created to fill a niche.

The richness of a language resides both in the changes it undergoes and what it does with the remnants. With Semantic Drift, nothing is wasted; old ideas -old words- hide just beneath the surface, noticed only when pointed out. Borrowed words from other languages and other times play with meanings like colours play with fashion. It’s likely the same in all languages, all cultures, but I’d be stretching the obvious if I pretended to comment intelligently about anything other than English.

So does that make me a culturalist -or whatever the term would be for someone who loves to think each culture adds its own unique iridescence to the mix? And am I really harming anyone -or any society- if I smile at how some languages have managed to add a whiff of descriptive ingenuity to a word as important as ‘pregnancy’? Isn’t it wonderful to think that a language could transmute one or two words, conceal them in plain sight (or hearing?) -but unobtrusively, so they don’t stand out like hitchhikers- and have them function as ambassadors for something totally new? And yet, like ‘Where’s Waldo’, they are there all the while, chuckling in the background at their clever disguises.

Personally, I think the world is more of a family if we can search inside each culture’s heritage for these shared gems without the fear of opening a racial Pandora’s box. To unveil them should not court accusations of malevolent intent, or of naïve generalization. That one of the terms for being pregnant in Russian (beremennaya, in transliteration) translates, roughly, as ‘load’, ‘burden’, or even ‘punishment’, for example, says little more about the culture’s attitude to pregnancy than that it has a sense of perspective -and humour. Should we seriously speculate that, because of their word for it, Malawians (in the Chichewa language) really, deep down, consider pregnancy an illness?

I think everybody should just lighten up and enjoy the archeologized meanings for what they are: a demonstration of the incredible ability of humans to bend their words and meld them into new and intricate designs. I don’t know, sort of like Isaiah’s idea of beating swords into ploughshares… Or would that be denigrated as a neo-neo-Whorfianism?