The colour of truth is gray

It’s back again… Well, actually I suppose it never left. We still seem to be obsessed with the genderization of colours -as if it were an established biological given; as if it were as obvious as handedness, or as necessary as the assignation of gender at birth. ‘Pink is for girls and Blue is for boys’ -its self-evidence is once again being called into question; it seems an endless, pointless cycle.

There have been many attempts to link gendered colour preference to Weltanschauungen, genetic atavisms, and of course, persistent, market-savvy fashion manipulation (even I attempted a commentary in a previous essay: https://musingsonwomenshealth.com/2013/12/06/nature-versus-princess-nurture/) -but none seem adequate explanations for its persistence in our culture. Indeed, those studies that have sought to resolve the issue seem to have canvassed opinions from predominantly western cultures. And apart from the probable sampling bias, there are other factors that likely come into play, as suggested in a 2015 article in Frontiers in Psychology: ‘… red symbolizes good luck in China, Denmark, and Argentina, while it means bad luck in Germany, Nigeria, and Chad (Schmitt, 1995; Neal et al., 2002). White is a color of happiness and purity in the USA, Australia, and New Zealand, but symbolizes death in East Asia (Ricks, 1983; Neal et al., 2002). Green represents envy in the USA and Belgium, while in Malaysia it represents danger or disease (Ricks, 1983; Hupka et al., 1997).’ In other words, ‘this variation in the symbolism of color could lead to variation on color preference between cultures.’ We’d best choose our colours carefully.

But, I suppose what got me interested again in this perpetual, gendered debate was a rather lengthy and thoughtful article (extracted from her book Gender and Our Brains) in Aeon by Gina Rippon, an emerita professor of cognitive neuroimaging at Aston University in Birmingham, UK: https://aeon.co/essays/is-pink-for-girls-and-blue-for-boys-biology-or-conditioning

I have to say I was lured into reading the entire article when she quickly introduced me to the dreadful concept of ‘gender reveal’ parties. They apparently come in two varieties: in one, the pregnant woman for whom the party is held does not know the sex of her fetus, but the organizers do (the ultrasound result, by agreement, has been sent only to them) -it is guarded in a sealed envelope, as is the colour motif; in the second variety, the mother knows, and reveals it with all the appropriately coloured hoopla at the party.

And why, apart from the puerile attempts to colourize the event, do I find it so disagreeable? Well, as Rippon suggests, ‘20 weeks before little humans even arrive into it, their world is already tucking them firmly into a pink or a blue box. And… in some cases, different values are attached to the pinkness or blueness of the news.’

I also read further, in hopes that the author had some convincing insights as to whether the colour assigned to each gender was biologically or culturally determined. Unfortunately, the evidence she cites seems able to support either -or neither- side. One study, however, did make some progress in resolving the problem: ‘American psychologists Vanessa LoBue and Judy DeLoache tracked more closely just how early this preference emerges. Nearly 200 children, aged seven months to five years, were offered pairs of objects, one of which was always pink. The result was clear: up to the age of about two, neither boys nor girls showed any kind of pink preference. After that point, though, there was quite a dramatic change, with girls showing an above-chance enthusiasm for pink things, whereas boys were actively rejecting them. This became most marked from about three years old onwards.’ This suggests a cultural rather than biological explanation: ‘once children learn gender labels, their behaviour alters to fit in with the portfolio of clues about genders and their differences that they are gradually gathering.’

But why, then, the cultural preference? There was recently what may be an Urban Legend suggesting that at one time, the gendered colour preferences were actually reversed and ‘that any kind of gender-related colour-coding was established little more than 100 years ago, and seems to vary with fashion, or depending on whether you were reading The New York Times in 1893 [pink for a boy]… or the Los Angeles Times in the same year [pink for a girl].’

But, at least in our current milieu, the issue is not so much the colour as what it has come to suggest, consciously or not: ‘Pink has become a cultural signpost or signifier, a code for one particular brand: Being a Girl. The issue is that this code can also be a ‘gender segregation limiter’, channelling its target audience (girls) towards an extraordinarily limited and limiting package of expectations, and additionally excluding the non-target audience (boys).’

Of course, as Rippon points out, the fact that Pink may be a signifier of what is acceptable to females, allows it to bridge the gender gap: colour a toy truck pink, and it becomes acceptable for a girl to play with it. Unfortunately, the other side of the permission can be that ‘pinkification is all too often linked with a patronising undertow, where you can’t get females to engage with the thrills of engineering or science unless you can link them to looks or lipstick, ideally viewed through – literally – rose-tinted glasses.’ And viewed through prevailing stereotypes as well, I might add.

And yet, what determines what constitutes a ‘boy toy’? Is it what the child sees -or what their parents and grandparents saw in the world in which they grew up? In the world today, women drive trucks, operate diggers, become doctors and lawyers -not just secretaries, teachers, and nurses.

There is also a danger to pandering to ill-conceived remedies, of course. Take Rippon’s example of the STEM Barbie doll (STEM -for the older, more naïve readers like me- stands for Science, Technology, Engineering, and Mathematics -traditionally male-dominated fields, apparently): ‘efforts to level the playing field get swamped in the pink tide – Mattel has produced a STEM Barbie doll to stimulate girls’ interest in becoming scientists. And what is it that our Engineer Barbie can build? A pink washing machine, a pink rotating wardrobe, a pink jewellery carousel.’

Only in the penultimate and last paragraph of the article does Rippon come close to answering the question on the reader’s lips from the beginning of her 4,500-word essay: ‘It is clear that boys and girls play with different toys. But an additional question should be – why?… The answer to these questions could lie in our new understanding of how, from the moment of birth (if not before), our brains drive us to be social beings – to understand social scripts, social norms, social behaviour – to make sure we understand the groups we should belong to and how we can fit in… our brains are scouring our world for the rules of the social game – and if that world is full of powerful messages about gender, helpfully flagged by all sorts of gendered labelling and gendered colour-coding, our brains will pick up such messages and drive their owners to behave ‘appropriately.’’

Perhaps Rippon is correct, but I wonder if it’s more accurate to say that we were stuck with gendered colours: I think there is room for hope, because what the child sees when she looks around is changing. So I am instead inclined to the view of André Gide, the French author who won the Nobel Prize in Literature in 1947: ‘The colour of truth is gray,’ he wrote.

May we all be free to mix our own colours…

A Predilection for Extinction?

There appears to be a lot of concern about extinctions nowadays -everything from spotted owls to indigenous languages peppers the list. Things around us that we took for granted seem to be disappearing before we even get to know or appreciate them. One has to wonder whether this is accompanied by furtive, yet anxious, glances in the mirror each morning.

Extinction. I wonder what it would be like -or can we even imagine it? If we could, then presumably we’re not extinct, of course, but our view of history is necessarily a short one. Oral traditions aside, we can only confidently access information from the onset of written accounts; many extinctions require a longer time-frame to detect… although, perhaps even that is changing as we become more aware of the disappearance of less threatening -less obvious- species. Given our obsessive proclivity for expanding our knowledge, someone somewhere is bound to have studied issues that have simply not occurred to the rest of us.

And yet, it’s one thing to comment on the absence of Neanderthals amongst us and tut-tut about their extinction; it’s quite another to fully appreciate the profound changes in climate that are gradually occurring around us. Could the same fate that befell the Neanderthals be forecasting our own demise -a refashioning of the Cassandra myth for our self-declared Anthropocene?

It would not be the first time we failed to see our own noses, though, would it? For all our perceived sophistication, we often forget the ragged undergarments of hubris we hide beneath our freshly-laundered clothes.

Religion has long hinted at our ultimate extinction, of course -especially the Christian one with which those of us in the West are most familiar- with its talk of End-of-Days. But, if you think more closely about it, this is predicted to occur at the end of Time; extinction, on the other hand, occurs -as with, say, the dinosaurs- within Time. After all, we are able to talk about it, measure its extent, and determine how long ago it happened.

And yet, for most of us, I suspect, the idea of extinction of our own species is not inextricably linked to our own demise. Yes, each of us will cease to exist at some point, but our children will live on after us -and their children, too. And so on for a long, long time. It is enough to think that since we are here, our children will continue on when we are not. Our species is somehow different than our own progeny…

Darwin, and the subsequent recognition of the evolutionary pressures that favour the more successfully adapted, no doubt planted some concerns, but an essay in Aeon by Thomas Moynihan (who completed his PhD at Oxford) set the issue of Extinction in a more historical context for me: https://aeon.co/essays/to-imagine-our-own-extinction-is-to-be-able-to-answer-for-it

Moynihan believes that only after the Enlightenment (generally attributed to the philosophical movement between the late 17th to the 19th century) did the idea of human extinction become an issue for consideration. ‘It was the philosopher Immanuel Kant who defined ‘Enlightenment’ itself as humanity’s assumption of self-responsibility. The history of the idea of human extinction is therefore also a history of enlightening. It concerns the modern loss of the ancient conviction that we live in a cosmos inherently imbued with value, and the connected realisation that our human values would not be natural realities independently of our continued championing and guardianship of them.’

But, one may well ask, why was there no serious consideration of human extinction before then? It would appear to be related to what the American historian of ideas, Arthur Lovejoy, has called the Principle of Plenitude, which seemed to have been believed in the West from the time of Aristotle right up until the time of Leibniz (who died in 1716): things as they are could be no other way. It would be meaningless to think of any species (even human) not continuing to exist, because they were meant to exist. Period. I am reminded -as I am meant to be- of Voltaire’s satirical novel Candide and Pangloss’s uncritical espousal of Leibniz’s belief that they were all living in ‘the best of all possible worlds’ -despite proof to the contrary.

I realize that in our current era, this idea seems difficult to accept, but Moynihan goes on to list several historical examples of the persistence of this type of thinking -including those that led ‘Thomas Jefferson to argue, in 1799, in the face of mounting anatomical evidence to the contrary, that specimens such as the newly unearthed Mammuthus or Megalonyx represented species still extant and populous throughout the unexplored regions of the Americas.’

Still, ‘A related issue obstructed thinking on human extinction. This was the conviction that the cosmos itself is imbued with value and justice. This assumption dates back to the roots of Western philosophy… Where ‘being’ is presumed inherently rational, reason cannot itself cease ‘to be’… So, human extinction could become meaningful (and thus a motivating target for enquiry and anticipation) only after value was fully ‘localised’ to the minds of value-mongering creatures.’ Us, in other words.

And, of course, the emerging findings in geology and archeology helped to increase our awareness of the transience of existence. So too, ‘the rise of demography [the statistical analysis of human populations] was a crucial factor in growing receptivity to our existential precariousness because demography cemented humanity’s awareness of itself as a biological species.’

Having set the stage, Moynihan’s argument is finally ready: ‘And so, given new awareness of the vicissitude of Earth history, of our precarious position within it as a biological species, and of our wider placement within a cosmic backdrop of roaming hazards, we were finally in a position to become receptive to the prospect of human extinction. Yet none of this could truly matter until ‘fact’ was fully separated from ‘value’. Only through full acceptance that the Universe is not itself inherently imbued with value could ‘human extinction’ gain the unique moral stakes that pick it out as a distinctive concept.’

And interestingly, it was Kant who, as he aged, became ‘increasingly preoccupied with the prospect of human extinction… During an essay on futurology, or what he calls ‘predictive history’, Kant’s projections upon humanity’s perfectibility are interrupted by the plausibility of an ‘epoch of natural revolution which will push aside the human race… Kant himself characteristically defined enlightening as humanity’s undertaking of self-responsibility: and human rationality assumes culpability for itself only to the exact extent that it progressively spells out the stakes involved… This means that predicting increasingly severe threats is part and parcel of our progressive and historical assumption of accountability to ourselves.’

So, I don’t see this recognition of the possibility of human extinction as a necessarily bad thing. The more we consider the prospect of our disappearance, the more we become motivated to do something about it. Or, as Moynihan points out, ‘The story of the discovery of our species’ precariousness is also the story of humanity’s progressive undertaking of responsibility for itself. One is only responsible for oneself to the extent that one understands the risks one faces and is thereby motivated to mitigate against them.’ That’s what the Enlightenment was all about: humanity’s assumption of self-responsibility.

Maybe there is still hope for us… well, inshallah.

Illeism, or Sillyism?

Who would have thought that it might be good to talk about yourself in the third person? As if you weren’t you, but him? As if you weren’t actually there, and anyway, you didn’t want yourself to find out you were talking about him in case it seemed like, well, gossip? I mean, only royalty, or the personality-disordered, are able to talk like that without somebody phoning the police.

Illeism, it’s called -from ille, the Latin for ‘he’- and it’s an ancient rhetorical technique that was used by various equally ancient personages -like, for example, Julius Caesar in the accounts he wrote about his exploits in various wars. It’s still in occasional use, apparently, but it stands out like a yellow McDonald’s arch unless you’re a member of a small cabal, sworn to secrecy.

Now that I mention it, I remember trying it once when I was very young, and blamed our cat for scattering cookies all over the floor; but I suppose that doesn’t count because my mother instantly realized I was actually using the third-person-singular in its grammatical sense, and sent me to my room for fibbing -without the cat. I didn’t even get a hug for my clever use of ancient rhetoric.

The episode kind of put me off third-personism until I read a little more about it in an article by David Robson, originally published by The British Psychological Society’s Research Digest and adapted for Aeon. Robson is a science journalist and a feature writer for the BBC: https://aeon.co/ideas/why-speaking-to-yourself-in-the-third-person-makes-you-wiser

It seems illeism can be an effective tool for self-reflection -and a better one than simple rumination, ‘the process of churning your concerns around in your head’, because ‘research has shown that people who are prone to rumination also often suffer from impaired decision making under pressure, and are at a substantially increased risk of depression…’

Robson was intrigued by the work of the psychologist Igor Grossmann at the University of Waterloo in Canada, writing in PsyArXiv, which suggests that third-person thinking ‘can temporarily improve decision making… [and] that it can also bring long-term benefits to thinking and emotional regulation’ -presumably related to the perspective change allowing the user to bypass -or at least appreciate- their previously held biases.

Grossmann, it seems, studies wisdom, and ‘[w]orking with Ethan Kross at the University of Michigan in the United States… found that people tend to be humbler, and readier to consider other perspectives, when they are asked to describe problems in the third person.’

Hmm…

He read the article with a fair soupçon of wariness. Might this not, he wondered, be academic legerdemain? It managed to fool Robson, but surely not he who reads with not even a hatchet to grind. He, after all, is a retired GYN who is only accustomed to addressing freshly delivered newborns and their unique anatomical appendages with the appropriate third-person labels. It’s hard to do otherwise with the unnamed. Indeed, it had always seemed situationally germane, given the circumstances. To turn that on himself, however, might be contextually confusing -as well as suspicious.

So, his days as an accoucheur long past, he decided there would be little harm in trying it out in front of a mirror before he unleashed his full third-person on an unsuspecting face in Starbucks.

It seemed… unusual at first: he knew the individual in the reflection as well as himself, and addressing him as ‘he’ felt rude -creepy, actually. He did manage to get around the vertigo by pretending he was actually talking to a younger version of his brother, though, and ignored the fact that his brother was moving his lips at the same time and apparently not listening.

“Your brother,” he started, “is wondering if he should use the third-person approach when he is anxious about whether or not to order the sausage and egg bagel or just a cookie for breakfast at Starbucks.” A sudden thought occurred to him: “He could pretend he was sent to order it for his friend who is currently guarding a table for the two of them.”

He stared at the image in the mirror and frowned, suddenly remembering the cat-and-cookie incident.

He was uncertain where this was going. Was he supposed to ask what he -that is ‘himself’- thought about the idea? And who, exactly, would be answering? The whole thing seemed like an endless hall of mirrors, an infinite regression of Matryoshka dolls.

“Okay,” he added, to assuage the guilt he assumed he would have fibbing to the barista, “He is just trying an experiment in non-gendered, non-directional conversation to solve a personal decisional paralysis. So, he is not trying to be weird or anything. He is actually just asking for your advice: would bagel or cookie be a better breakfast?”

Suddenly, an unexpected epiphany arrived -maybe produced by the comparative ‘better’, but nonetheless apparent in the way in which the third person had phrased his question. Of course the bagel with its protein-rich contents was the ‘better’ breakfast! He was pretty sure that First-person-singular would never have seen that with such clarity –could never have seen it. Only by divorcing himself from his stomach, and mentioning it as if he were discussing a friend, did it become clear.

He stepped away from his brother at the mirror and smiled to himself. He’d discovered a way of distancing himself from himself long enough to see who he was from an outside perspective. Still, there was a nagging question that kept tugging at his sleeve: who was he when he asked those questions? And did he risk permanently closing the door to the person he used to be, or was it sort of like losing himself in a story and then swapping realities when he closed the book…? But, what if he preferred what he was reading to what he was living…?

Whoa -pretty heavy stuff, that.

You know, it’s harder coming back to First-person than closing the book after a while, and I found myself switching back and forth for the longest time. I really wonder how hard Grossmann and Kross had thought this through. And I wonder if Robson got caught up in their web as well. Nobody mentioned anything about collateral damage -but of course, they wouldn’t, would they?

All I can say is be careful, readers -there might be a third-person Minotaur at the end of the labyrinth.

Who’s afraid of the Deodand?

Sometimes Philosophy hides in plain sight; interesting questions emerge, unbidden, when you least expect them. A few months ago I was waiting in a line to order a coffee in a poorly-lit shop, when the woman behind bumped into me as she struggled to read the menu posted on the wall over the counter.

“They don’t make it easy in here, do they?” she grumbled in a token apology.

I turned and smiled; I’d been having the same difficulty. “I should have brought a flashlight,” I added, trying to make light of it.

“Photons should be free,” she mumbled. “It’s not like we should have to carry them with us to get a coffee…” She looked at me with a mischievous grin creeping across her shadowed face. “I mean they don’t have to pay by the pound for them like bananas, or anything…”

I chuckled. “Photons have mass…? I didn’t realize they were Catholic.” It was a silly thing to say, I suppose, but it just popped out.

She actually laughed out loud at that point. “That’s very clever…” she said, and despite the dim light, I could feel her examining me with more interest.

But I found myself standing in front of the barista at that point, so I ordered my coffee, and headed for a table in the corner. A moment later, the woman from the lineup surfaced out of the darkness and sat beside me under a feeble wall light at the next table.

“Do you mind if I sit here?” she asked, not really waiting for my reply.

I smiled pleasantly in response, but in truth, I had been looking forward to the solitude usually offered by a dark coffee-shop corner.

“I’m sorry,” she said, immediately sensing my mood. “It’s just that you cheered me up in that horrid line, and I wanted to thank you…”

“It was a bit of a trial, wasn’t it?”

She nodded as she sipped her coffee. “Your comment on the mass of photons was hilarious -I’m a Science teacher at the Mary Magdalene Women’s College, so I enjoyed the reference to Catholics. My students will love it.”

I looked at her for a moment and shrugged. “I’m afraid it’s not original, but thank you.”

She chuckled at my honesty and picked up her coffee again. “I don’t recognize it,” she added after a moment’s reflection, still holding her steaming cup in front of her and staring at it like a lover.

“I think maybe it was one of my favourite comedians who said it…” But I wasn’t sure.

“Oh? And who might that be?” she asked, smiling in anticipation of a shared interest.

I thought about it for a moment. “I don’t know… Woody Allen, perhaps.”

She put down her cup with a sudden bang on the table and stared at me. Even in the dim light, I could feel her eyes boring into my face. “A horrid man!” she said between clenched teeth. “How could you ever think that anything he said was funny?” she muttered.

I was beginning to find her eyes painful. I was aware of the controversies about Woody, of course, but I suppose I was able to separate them from his humour. And yet, I have to admit, that when the woman reminded me of his behaviour, I felt guilty -as if by laughing at his jokes, I was tacitly approving of his other activities.

It’s a puzzling, and yet fascinating relationship we have with things used by, or even owned by, people we consider evil: deodands. The word, once used in English Common Law, was originally from Medieval Latin –Deo dandum, ‘a thing to be given to God’. The idea was that if an object had caused a human death, it had to be forfeited to the Crown, and its value given as compensation to charity, or to the family of the victim.

The question, though, is why we feel such revulsion for something that, through no fault of its own, was used in the commission of a crime. It could have been any knife, say, that was used in a stabbing, so why is this particular knife somehow different? Does the aura of what it did cling to it? Haunt it…? Would Woody Allen’s unrelated jokes -or, for that matter, Bill Cosby’s- be funny if we didn’t know their sources?

I have to admit that humour is a lot more reflective of the personality that created it than, for example, an assassin’s gun, or a criminal’s knife, but in isolation -i.e. divorced from context- is there really any difference? I certainly have no answer, but I have to say that I was pleasantly surprised that the issue was not one that I was puzzling over on my own. I came across an essay in an issue of Aeon by Paul Sagar, a lecturer in political theory at King’s College London, that looked at first as if it might be helpful: https://aeon.co/essays/why-do-we-allow-objects-to-become-tainted-by-chance-links

He wrote that ‘It is not uncommon to find that one’s enjoyment of something is irrevocably damaged if that thing turns out to be closely connected to somebody who has committed serious wrongs…  knowledge of somebody – or something – having done a bad thing can deeply affect how we view the status of the thing itself.’ But why should that be?

Obviously, the answer is not easily obtained, and in a roundabout way he throws himself on the mercy of the 18th-century Scottish Enlightenment thinker Adam Smith, and his first book, The Theory of Moral Sentiments (1759). ‘Smith thought it undeniable that we assess the morality of actions not by their actual consequences, but by the intentions of the agent who brings them about.’ And yet, if a person were to throw a brick over a wall and hit someone accidentally, he would also be judged by the consequences even though he hadn’t intended to injure anyone. ‘Smith thought that our moral sentiments in such cases were ‘irregular’. Why do we respond so differently to consequences that have bad outcomes, when those outcomes are purely a matter of luck? Smith was confident that, although he could not explain why we are like this, on balance we should nonetheless be grateful that we are indeed rigged up this way.’

Have patience -this may slowly lead us to a sort of answer. First of all, ‘if, in practice, we really did go around judging everybody solely by their intentions, and not by the actual consequence of their actions, life would be unliveable. We would spend all our time prying into people’s secret motivations, fearing that others were prying into ours, and finding ourselves literally on trial for committing thought crimes.’ Only a god on Judgement Day should be allowed that privilege.

Also, for social reasons, it is good to be bothered by consequences rather than just by hidden intentions: you have to actually do good things to get praise, not just intend to do them. And conversely, you have to actually do the bad things to get the punishment. Uhmm… Well, okay, but that doesn’t really explain deodands, or anything.

At this point, Sagar kind of gives up on Smith’s attempts at moral philosophy and heads off on his own wandering trail to find an answer. ‘It is good that we feel aversion to artifacts (be they physical objects, films, records or whatever) associated with sex crimes, murders and other horrors – even if this is a matter of sheer luck or coincidence – because this fosters in us not only an aversion to those sorts of crimes, but an affirmation of the sanctity of the individuals who are the victims of them.’ Somehow that makes us less likely to act the same way? Whoaa…

In the last paragraph, he essentially throws up his hands in frustration (or maybe those were my hands…) and as good as admits he doesn’t know why we would even think about deodands.

And me? How should I have responded to the woman in the coffee shop? Well, probably not by talking about Adam Smith -but changing the subject might have been a good first step, though…

Too cute for words?

I love cute as much as anyone else, I suppose, although it’s not a quality I have possessed since early childhood, I’m afraid. Many things are cute, though: puppies, babies, toddlers… and they all seem to have certain attributes in common: large, or prominent, eyes, a larger than expected head, and so on –neoteny, it’s called. They all seem vulnerable, and deserving of protection. Needing cuddling.

And yet, apart from agreeing affably with doting parents describing their newborns, or singles obsessing over their new puppies, I didn’t give cuteness any extra thought, I have to admit. I mean, cute is, well, cute. There didn’t seem to be any need to dwell on the features, or deify their appearance. But an older article about cuteness in Aeon by Joel Frohlich, then a PhD student at UCLA, did pique my interest: https://aeon.co/ideas/how-the-cute-pikachu-is-a-chocolate-milkshake-for-the-brain

Perhaps it was the etymology of the word that initially intrigued me. ‘The word emerged as a shortened form of the word ‘acute’, originally meaning sharp, clever or shrewd. Schoolboys in the United States began using cute to mean pretty or attractive in the early 19th century. But cuteness also implies weakness. Mignon, the French word for cute or dainty, is the origin of the English word ‘minion’, a weak follower or underling… It was not until the 20th century that the Nobel laureates Konrad Lorenz and Niko Tinbergen described the ‘infant schema’ that humans find cute or endearing: round eyes, chubby cheeks, high eyebrows, a small chin and a high head-to-body-size ratio. These features serve an important evolutionary purpose by helping the brain recognise helpless infants who need our attention and affection for their survival.’

In other words, ‘cute’ was a mechanism to elicit protection and caring. Indeed it seems to be neurologically wired. MRI studies of adults presented with infant faces revealed that the ‘brain starts recognising faces as cute or infantile in less than a seventh of a second after the face is presented.’ These stimuli activate ‘the nucleus accumbens, a critical piece of neural machinery in the brain’s reward circuit. The nucleus accumbens contains neurons that release dopamine.’

But it can be tricked, so ‘baby-like features might exceed those of real infants, making the character a supernormal stimulus: unbearably adorable, but without the high maintenance of a real baby.’ So, is cuteness, in these circumstances, actually a Trojan Horse? An interesting thought.

Cuteness is situational -or at least, should be. Cuteness out of context can be frightening, and even grotesque. Think of the clown in Stephen King’s novel It for example. Imitation, when recognized as such, seems out of place. Wrong. Cute is a beginning -an early stage of something that will eventually change as it grows up. Its transience is perhaps what makes it loveable. At that stage it is genderless, asexual, and powerless. It poses no threat -in fact, it solicits our indulgence. Think what would happen if it were a trick, however: our guard would be down and we would be vulnerable.

But there’s a spectrum of cuteness; there must be, because it –or its homologues- seem to be appearing in situations that don’t remotely suggest innocence, youth, or vulnerability. Think of the proliferation of cutesy Emojis. As Simon May, a visiting professor of philosophy at King’s College London points out in an essay (also in Aeon https://aeon.co/ideas/why-the-power-of-cute-is-colonising-our-world ) ‘This faintly menacing subversion of boundaries – between the fragile and the resilient, the reassuring and the unsettling, the innocent and the knowing – when presented in cute’s frivolous, teasing idiom, is central to its immense popularity… Cute is above all a teasing expression of the unclarity, uncertainty, uncanniness and the continuous flux or ‘becoming’ that our era detects at the heart of all existence, living and nonliving. In the ever-changing styles and objects that exemplify it, it is nothing if not transient, and it lacks any claim to lasting significance. Plus it exploits the way that indeterminacy, when pressed beyond a certain point, becomes menacing – which is a reality that cute is able to render beguiling precisely because it does so trivially, charmingly, unmenacingly. Cute expresses an intuition that life has no firm foundations, no enduring, stable ‘being’.’

Perhaps that’s what makes non-contextual cute so inappropriate, so menacing. ‘This ‘unpindownability’, as we might call it, that pervades cute – the erosion of borders between what used to be seen as distinct or discontinuous realms, such as childhood and adulthood – is also reflected in the blurred gender of many cute objects… Moreover, as a sensibility, cute is incompatible with the modern cult of sincerity and authenticity, which dates from the 18th century and assumes that each of us has an ‘inner’ authentic self – or at least a set of beliefs, feelings, drives and tastes that uniquely identifies us, and that we can clearly grasp and know to be truthfully expressed. Cute has nothing to do with showing inwardness. In its more uncanny forms, at least, it steps entirely aside from our prevailing faith that we can know – and control – when we are being sincere and authentic.’

Whoa -that takes cute down a road I don’t care to travel: it’s an unnecessary detour away from the destination I had intended. Away from attraction, and into the Maelstrom of distraction. Contrived cute is uncute, actually. It is a tiger in a kitten’s mask.

I think there is a depth to beauty which -as ephemeral and idiosyncratic as it may seem- is missing from cute. Both hint at admirability, and yet in different ways: one describes a surface, the other a content -much as if the value of a wallet could be captured by its outward appearance alone.

For some reason, cute reminds me of a Latin phrase from Virgil’s Aeneid -although in a different context, to be sure: Timeo Danaos et dona ferentes –‘I fear the Greeks even bearing gifts’- a reference to the prolonged Greek war against Troy and their attempt to gain entrance to the city with their gift of the infamous Trojan Horse. Cute is a first impression, not a conclusion. I suppose it’s as good a place to start as any, but I wonder if, in the end, it’s much ado about nothing…

On the perils of ad hominism

I think one of the first Latin expressions I learned was ad hominem. I was 14 years old, and we were having a discussion about plays in our English literature class. Mr. Graham, our teacher and apparently a writer himself, had asked us what we thought of the first scene of Macbeth that we had been assigned to read as homework for the class.

Everybody shuffled in their seats, because only a few of us had actually read it. Gladis, of course, had. She always sat in the seat beside mine in the second row -I think it was an alphabetical thing to help Mr. Graham remember our names- and she was a fastidious student. She’d even made some notes the night before on what she considered the salient features of the opening scene of Macbeth. Naturally she put up her hand to attract his attention.

“It was kinda short, Mr. Graham -only 13 lines…”

Gladis always seemed to make me bristle: why would she put up her hand for that? “You actually counted the lines, Gladis?” I said contemptuously -trying to shame her, I suppose.

I remember her looking at me, tensing her face, and then blinking. Slowly. “Some things are so obvious -if you read them, that is…” She shifted her gaze to Mr. Graham. “But there is a reason that Shakespeare made such a brief conversation into an entire scene,” she added sweetly.

Mr. Graham took it as a teaching point. “Anybody other than Gladis have an idea why Shakespeare made it into an entire scene?” There were no hands, unsurprisingly, so he stared at me. “What do you think, G?” Everybody used my nickname in those days.

It was entirely expected, though: I was the usual go-to seat when everybody else was quiet -probably because of my proximity to Gladis, but also maybe because I usually had my hand up. “Uhmm… Well, Shakespeare was probably trying to grab our attention at the start -you know, capture our interest right away so we’ll be curious about what follows.” I was going to stop there, but I noticed the sarcastic expression on Gladis’ face, so I kept going. “Isn’t that what you writers try to do, Mr. Graham: make the first paragraph so riveting everybody will want to read more?”

Gladis snuck a quick look at me, thought I was trying to curry Mr. Graham’s favour, and then decided to expand on her initial statement. “I think the scene set the mood for the whole play: ambition, paradox, and evil…” She smirked at me, and then continued. “I mean, ‘fair is foul, and foul is fair’ is really dark. You can just tell what you’re in for.”

She glared at me for a moment, and then smiled innocently at Mr. Graham: teacher’s pet.

I think he could see the dynamic developing, and thought he might use it to stimulate some discussion in the silent majority surrounding us.

“Anybody else?” he said, casting his eyes about the room. Nobody looked up from their notebooks. “What about ‘When the battle’s lost and won…’? What do you think that means?” His eyes settled on one of the quiet students who seldom volunteered an answer. “Kerry?”

Kerry looked up, totally surprised that he’d been singled out. “I… Uhmm… Well, I suppose there’s going to be a fight somewhere later in the play, and…”

Gladis turned and stared at him. “The first scene was so short, didn’t you even peek at the next scene?”

Kerry stared at her defiantly. “Mr. Graham just assigned the first scene, Gladis. I didn’t want to get confused with too much information, eh?” The class snickered in relieved agreement.

Gladis somersaulted her eyes and sent them rolling and tumbling towards Mr. Graham. “Anybody who was at all interested in plays would have read further, Kerry,” she said and sighed theatrically.

Kerry stared down at his desk in embarrassment.

But, if she thought that might have curried the teacher’s favour, she was sadly mistaken. Mr. Graham noticed Kerry’s distress, frowned briefly and then loosed his eyes on me again, for some reason. “G, do you know what an argument is called when you attack the person rather than their position?” Another teaching moment, I supposed.

I thought about it for a moment, and then shrugged.

“Anybody…?” he asked, once again hoping for a response from the class. “It’s called an ad hominem -Latin, meaning ‘to the person’. It’s a type of argument that is often very difficult to refute, because the individual who uses it usually does so in frustration because he or she cannot counter the argument itself and so attacks the person in an attempt to win that way.” He let his eyes rest on Gladis again -but only briefly.

“I mention it now because later in the play you’re going to realize that when Lady Macbeth argues with Macbeth about killing the king, she almost always uses ad hominem arguments… Just warning you,” he added, and winked at Gladis in a subtle rebuke that wasn’t lost on me. Very clever, I thought.

Over the years, I’ve come to realize that resorting to an ad hominem offers only a kind of pyrrhic victory -if not a defeat- for the user. Still, I have to admit that there were occasions when I felt I’d be losing more than just face if I backed down. Of course, the tone of my voice, or the blush on my face, usually unmasked my efforts, and I’d end up apologizing, rather than wearing any ill-gotten gains.

But I ran across an interesting variation on that theme in an essay in Aeon by Moti Mizrahi, an associate professor of philosophy from the School of Arts and Communication at the Florida Institute of Technology: https://aeon.co/ideas/how-ad-hominem-arguments-can-demolish-appeals-to-authority

‘According to the Urban Dictionary site,’ he writes, ‘“Ad hominems are used by immature and/or unintelligent people because they are unable to counter their opponent using logic and intelligence.” But isn’t this definition itself an ad hominem attack on those who make ad hominem arguments?’ Food for thought. Although ‘… ad hominem arguments can be good arguments, especially when they are construed as rebuttals to appeals to authority.’

Seeking advice from experts is something which we all find ourselves doing from time to time -none of us can know everything. But suppose, as he posits, ‘children respond to their parents’ plea to refrain from smoking by saying: “You use tobacco, so why shouldn’t I?” … Arguments against the person are attempts to undermine what someone says, not by engaging with what is said but by casting aspersions on the person who says it. For example, the child’s retort is directed at the parents, in light of their failure to set a positive example, not at their parents’ concerns about smoking.’

I like that example -it somehow proves to me that nothing is so sacred that it can’t be re-evaluated from a different perspective. You’re a fool if you don’t believe in evolution… Or are you not allowed to ad hominem yourself?

Whisper music to my weary spirit

Is music just sounds -a series of notes bundled together, like words in a conversation, or shapes in a painting? Like them, is musical appreciation an attempt by the brain to assign meaning, relevance, and structure to differentiate it from the ambient sounds we encounter every day: the whistle of wind leaking through a partially opened window, the rustle of leaves in a forest, the chirrup of the first robin in a nearby tree at dawn?

Do we break music down like we do the grammar of sentences: subject, object, verb, noun, adjective…? Is music, in other words, merely the phonetic equivalent of morphemes strung out, not into mere sentences, or even paragraphs, but into whole stories?

In a way, we can maybe see the similarity of a memorable story with that of a catchy tune, or perhaps a moving symphony; and yet, should we -can we- equate the information and emotional content of a novel, say, with that of a concerto, or maybe a choral requiem? There seems to be a qualitative difference -each may be stirring, but somehow in non-identical ways.

I have wondered about this ever since I was an admittedly nerdy child. What was the difference between a gorgeous sunset, and an inspiring story; between a Rachmaninoff prelude, and a poem by Robert Frost…? In those early, naive days, I suspect I was wont to conflate things that drew me into them -things that had the magic quality of dissolving whatever boundaries confined me inside my own thoughts. Music was a potent drug, and so many of the intervening years have been occupied with a search for more and more purveyors: dealers.

I have therefore been attracted to articles dealing with that hard to describe boundary between music and, well, the rest of reality. There was an article I found in Aeon that caught my eye: https://aeon.co/essays/music-is-in-your-brain-and-your-body-and-your-life

It was written by Elizabeth Hellmuth Margulis, director of the music cognition lab at the University of Arkansas. I suppose what initially intrigued me was her contention that ‘the past few decades of work in the cognitive sciences of music have demonstrated with increasing persuasiveness that the human capacity for music is not cordoned off from the rest of the mind. On the contrary, music perception is deeply interwoven with other perceptual systems, making music less a matter of notes, the province of theorists and professional musicians, and more a matter of fundamental human experience.’ In other words, that music is somehow different, something more than mere sounds piled one on top of each other -more than whatever order may be ascribed to the pattern emerging from a dropped deck of cards.

Indeed, ‘Brain imaging produces a particularly clear picture of this interconnectedness. When people listen to music, no single ‘music centre’ lights up. Instead, a widely distributed network activates, including areas devoted to vision, motor control, emotion, speech, memory and planning. Far from revealing an isolated, music-specific area, the most sophisticated technology we have available to peer inside the brain suggests that listening to music calls on a broad range of faculties, testifying to how deeply its perception is interwoven with other aspects of human experience. Beyond just what we hear, what we see, what we expect, how we move, and the sum of our life experiences all contribute to how we experience music.’

And as she writes, ‘Music, it seems, is a highly multimodal phenomenon. The movements that produce the sound contribute essentially, not just peripherally, to our experience of it – and the visual input can sometimes outweigh the influence of the sound itself. Visual information can convey not only information about a performance’s emotional content, but also about its basic structural characteristics.’

I was struck by the picture that begins her essay: a photograph of the late Janis Joplin performing at The Fillmore, San Francisco in 1968; she was a particular, and long-time, favourite of mine. Just seeing Janis, with her head tilted back, and eyes closed, I felt I could hear her again. Feel the energy… I could hardly stop my foot from tapping out the rhythm of Try (Just a Little Bit Harder), and I was transported back to all the various other concerts of the ’60s I had attended. Amazing, eh? That a memory, a photograph, can bundle so much together. That music can knit the ravelled sleeve of care, and ‘paint an embodied picture of music-listening, where not just what you see, hear and know about the music shapes the experience, but also the way you physically interact with it matters as well. This is true in the more common participatory musical cultures around the world, where everyone tends to join in the music-making, but also in the less common presentational cultures, where circumstances seem to call for stationary, passive listening.’

‘Neuroimaging has revealed that passive music-listening can activate the motor system. This intertwining of music and movement is a deep and widespread phenomenon, prevalent in cultures throughout the world. Infants’ first musical experiences often involve being rocked as they’re sung to. The interconnection means not only that what we hear can influence how we move, but also that how we move can influence what we hear.’

I have always found music to be so much more than the sound or the rhythm, and I have to admit that although I have never felt compelled to dance, I have never been able to remain motionless -or for that matter, emotionless- in its presence. And, as with everything else in life, I am affected more by some songs, some genres, some performances than others, but these things, too, vary. Music isn’t static, any more than a particular recipe always tastes the same no matter the cook.

As the author, Margulis, writes, ‘Music cannot be conceptualised as a straightforwardly acoustic phenomenon. It is a deeply culturally embedded, multimodal experience. At a moment in history when neuroscience enjoys almost magical authority, it is instructive to be reminded that the path from sound to perception weaves through imagery, memories, stories, movement and words.’

The threads that music has woven through my years have not frayed; unlike the patchwork pattern of my life it has held together -indeed, held me together. I am reminded of a proverb I read somewhere: A bird does not sing because it has an answer. It sings because it has a song. And sometimes, you know, that is really all you need…

Does Beauty live with Kindness?

I don’t know how many times I’ve written about beauty, but it continues to intrigue me. Not so much about what it is -its constituent parts, its definitions, or even its historical and sociological roots- but more its ability to morph -mutate, if you will- from something that is to something that isn’t. How, in other words, can beauty -or its antonym, ugliness- change to its opposite without materially altering anything about its appearance?

To be sure, the duality has not gone unnoticed in historical philosophy (the appearance vs the charisma of Socrates), literature (think of the handsome Dorian Gray and his increasingly ugly portrait), or even in fairy tales (Hans Christian Andersen’s The Ugly Duckling), but its seeming capriciousness only adds to the mystique, I think.

For years, centuries, indeed millennia, we have sought to decipher beauty, and yet apart from vague generalizations like youthfulness, proportionality, or perhaps, symmetry, it has eluded our grasp, and slipped through our fingers like slowly moving mist. The most apt description for me, comes from Koine Greek, where beauty was associated with being of one’s hour -not trying to appear older or younger: authentic, I suppose. And yet even here, beauty remains a moving target, doesn’t it?

Amongst the many attempts to pigeonhole the concept, I am always on the lookout for seemingly unique approaches -although I fully recognize that over the centuries, pretty well every perspective has likely been canvassed. At any rate, I found myself drawn to an article in Aeon by the British philosopher Panos Paris: https://aeon.co/essays/how-virtue-morphs-into-beauty-in-the-eye-of-the-beholder

His opening sentence certainly captured my interest: ‘Have you ever thought that someone is far from attractive – perhaps even ugly – only to later come to find that person beautiful?’ For sure this would not be a unique experience for any of us, and yet it made me wonder how such a perceptual change could happen -was it merely that we had come to know that person better and so ignored their outward appearance, or was there an actual phase-change somehow?

Paris links our perceptions to moral qualities: ‘[B]eauty and morality, and ugliness and immorality, are intrinsically linked. Specifically, the moral virtues – honesty, kindness, fairness, empathy, etc – are beautiful character traits, and the moral vices – their contraries – are ugly.’

That seemed a little too simplistic a view, but it was enough to make me read further. He qualifies it almost immediately: ‘Of course, the kind of beauty or ugliness in question is independent of physical appearances – it belongs to characters and actions.’ He calls it the ‘moral beauty’ view, and further qualifies it by saying ‘This view is rather unfashionable today. Contemporary philosophical and lay orthodoxy construes the realms of aesthetics and morality as distinct. It regards theories such as the moral-beauty view as signs of past conceptual immaturity that we have since thankfully shaken off our intellectual shoulders.’

But then he points to diverse historical languages and how many of these (admittedly cherry-picked examples) conflated beauty and morality. ‘In Ancient Greek, kalon meant both beautiful and good, while the [African] Yoruba word ewa, normally translated as ‘beauty’, is primarily used to refer to human moral qualities.’ Or, more recently, ‘Adam Smith wrote in The Theory of Moral Sentiments (1759) that ‘benevolence bestows upon those actions which proceed from it, a beauty superior to all others, [while] the want of it, and much more the contrary inclination, communicates a peculiar deformity to whatever evidences such a disposition’.’

And, Paris explains, this conflation was not because of linguistic poverty. ‘[T]he Enlightenment philosophers did have the terminology to distinguish not only between beauty and goodness, but also between natural and artistic beauty, inner and outer beauty, and so on. Thus, their acknowledgement of an aesthetic dimension in morality, far from evincing confusion, seems to me to have reflected ordinary experience.’ This seemed a bit of a stretch to me -a mistaking of metaphor for prose, perhaps- but I pressed on nevertheless.

‘[W]hen people encountered others who were morally virtuous or vicious in their everyday life or in art… they felt, respectively, the sort of pleasure and displeasure evoked by other beautiful and ugly objects, and this phenomenon found its way into their language and thought.’ But with time, this view of beauty began to fade, and various detractors criticized the old approach -people like Edmund Burke, who in 1757 considered it ‘a loose and inaccurate manner of speaking, [that] misled us both in the theory of taste and of morals’.

So, ‘beauty was thought to be mostly a matter of pleasure in the form of an object, and ugliness of displeasure in deformity; and form was limited to the visible or aural properties of an object. By contrast, goodness, and traits such as honesty and kindness, or selfishness and cowardice, are not like that; they are imperceptible, psychological traits, the goodness or badness of which stems from adherence to or violation of rational principles… Moreover, while the good is, or should be, desirable for its own sake, the beautiful is desirable because it’s pleasurable. So linking beauty and goodness might lead to a corruption or degeneration of moral motivation by encouraging the pursuit of goodness for its beauty.’

I began to lose interest at this point in his sine-wave and ultimately reductionist type of historical approach to beauty. I mean, let us suppose that beauty is largely subjective whereas morality, because of the duties and obligations associated with being moral, is more objective… What does that mean? Is it an important distinction…?

Or… are we merely throwing everything into the pot in our frantic need for definition? Are we so desperate for a word, for a concept, that describes the pleasurable sensation of encounter, and engagement, that we flounder in the stew ourselves? Could it be that all the while, beauty was simply a metaphor -a way of saying we are pleased, and that what we are really struggling with is a way of expressing this?

And could it be why the word metaphor is so apt? Not to over-emphasize the need to delve into etymological derivations whenever we are stuck for something to say, but its component morphemes are instructive: phore meaning ‘bearer of’, and meta designating an analysis at a higher, more abstract level. Personally, I think the famous 19th-century French writer Stendhal defined beauty the best: he called it la promesse de bonheur (the promise of happiness).

Do we really need more than that…?

An Achilles Heel?

I’m going to go out on a limb and suggest that the average person, even if they’re only vaguely aware of Homer’s poems The Iliad, or The Odyssey, even if they are mildly conversant with the story of the siege of Troy and the Trojan horse, even if they have sort of heard of the Grecian heroes Odysseus and Achilles or perhaps the Trojan hero Aeneas, and even if they could pretend they remember that the author -not to mention the stories and characters- may or may not have been reality based… even if this were the case, the colours of their skin and hair probably do not rank particularly high in the recollection. Frankly, I -certainly not a card-carrying member of any historical society- had not given it much thought. Well, none, actually -some things are just not that important, I guess.

And when I think of the way Homer was taught in my freshman class in university, I suppose I merely assumed that detailed descriptions were unnecessary -obviously, they would each look similar to how we have portrayed Christ in all the medieval religious art: vaguely Caucasian. And in my student days, the zeitgeist of academia, as well as the rest of western society, seemed to be swimming in what we might now call white privilege. Of course the ancient Greeks were white -I mean, just look at the white marble statues they have bequeathed to us. The fact that they were originally brightly painted was not known -or at least not communicated to most of us in my day.

So, although the article in an edition of Aeon that questioned the skin colour of Achilles did not shock me, it did make me think about the long-held western conceit that the ancient Greeks, on whom we have modelled so many of our democratic ideas, were fair-skinned. Even as I put this assumption into words, I realize that, however unintended, it seems terribly racist. And yet, some things do need to be probed, clarified: https://aeon.co/essays/when-homer-envisioned-achilles-did-he-see-a-black-man

The essay, written by Tim Whitmarsh, a professor of Greek culture at the University of Cambridge, attempts to make sense of what little historical evidence exists from those almost prehistoric times. ‘The poems are rooted in ancient stories transmitted orally, but the decisive moment in stabilising them in their current form was the period from the 8th to the 7th centuries BCE. The siege of Troy, the central event in the mythical cycle to which the Homeric poems belong, might or might not be based on a real event that took place in the earlier Bronze Age, in the 13th or 12th century BCE. Historically speaking, the poems are an amalgam of different temporal layers: some elements are drawn from the contemporary world of the 8th century BCE, some are genuine memories of Bronze Age times… Achilles was not a historical personage; or, rather, the figure in the poem might or might not be distantly connected to a real figure, but that isn’t the point. Achilles, as we have him and as the Greeks had him, is a mythical figure and a poetic creation. So the question is not ‘What did Achilles look like?’ but ‘How does Homer portray him?’’

Fragments of evidence exist, but many are fraught with translational discrepancies and contemporaneous social conventions that confuse the issue. For example, at the time, ‘females are praised for being ‘white-armed’, but men never are. This differentiation finds its way into the conventions of Greek (and indeed Egyptian) art too, where we find women often depicted as much lighter of skin than men. To call a Greek man ‘white’ was to call him ‘effeminate’.’

Also, ‘Achilles is said in the Iliad to have xanthos hair. This word is often translated as ‘blond’… [But] the Greek colour vocabulary simply doesn’t map directly onto that of modern English. Xanthos could be used for things that we would call ‘brown’, ‘ruddy’, ‘yellow’ or ‘golden’.’ And, ‘Weirdly, some early Greek terms for colour seem also to indicate intense movement… xanthos is etymologically connected to another word, xouthos, which indicates a rapid, vibrating movement. So, while xanthos certainly suggests hair in the ‘brown-to-fair’ range, the adjective also captures Achilles’ famous swift-footedness, and indeed his emotional volatility.’

‘So to ask whether Achilles and Odysseus are white or black is at one level to misread Homer. His colour terms aren’t designed to put people into racial categories, but to contribute to the characterisation of the individuals, using subtle poetic associations… Greeks simply didn’t think of the world as starkly divided along racial lines into black and white: that’s a strange aberration of the modern, Western world, a product of many different historical forces, but in particular the transatlantic slave trade and the cruder aspects of 19th-century racial theory. No one in Greece or Rome ever speaks of a white or a black genos (‘descent group’). Greeks certainly noticed different shades of pigmentation (of course), and they differentiated themselves from the darker peoples of Africa and India… but they also differentiated themselves from the paler peoples of the North.’ In other words, concludes Whitmarsh, ‘Greeks did not, by and large, think of themselves as ‘white’.’

This information would be filed in the ho-hum section of our need-to-know list for most of us, I think, and yet Whitmarsh, in his introduction, points out that ‘in an article published in Forbes, the Classics scholar Sarah Bond at the University of Iowa caused a storm by pointing out that many of the Greek statues that seem white to us now were in antiquity painted in colour. This is an uncontroversial position, and demonstrably correct, but Bond received a shower of online abuse for daring to suggest that the reason why some like to think of their Greek statues as marble-white might just have something to do with their politics.’

That there are people out there who seem threatened by knowledge which doesn’t accord with their own confirmation biases is, to me, more deeply troubling than mere disagreement. After all, we can disagree with something without being threatened by it. Disagreement allows for discussion, and possible attempts at rebuttal, using other evidence. Or countering with other interpretations of the same facts. In the end, isn’t it all just a game? An academic exercise which, after the initial flurry of excitement and barrage of words, should end, like all closely fought games, with a glass of wine?

The primrose path?


Every so often, I feel I have been blindsided -kept out of the loop either because I haven’t been diligent in my reading, or, more likely, haven’t thought things through adequately.

Philosophy concerns itself with the fundamental nature of reality, so I had always assumed there were few, if any, territories left untouched. In fact, I would have thought that the very nature of the discipline would have enticed its members to explore the more problematic subjects, if only to test the waters.

Of course, it’s one thing to continue to study the big topics -Beauty, Truth, Knowledge, and so on- but quite another to subject the more controversial, unpleasant issues like, say, Garbage or Filth to critical philosophical analysis. At best one might argue it would be a waste of time commenting on their existential value. In fact, even suggesting that they might be worthy of philosophical consideration borders on the ridiculous, and the pointless -yet another example of a discipline grown dotty with age.

I have always felt that Plato was on to something in his insistence that what we experience are only particular and incomplete examples of what he called ideal Forms. We can all recognize a chair, for example, despite the fact that chairs can assume many forms, with innumerable shapes and sizes. And yet somehow, out of all the variations, even a child can recognize a chair: they can recognize the chairness of the object, if you will. So, it seems we can all understand the idea that any one particular example of a chair, or a triangle, say, is only a sample of the Forms of chairness, or triangleness… And because the Forms are only describable in the particular, we can never experience the true Forms except in our imagination. The Forms are, in effect, perfect and unchanging, unlike their earthly examples.

Where am I going with this? Well, although we might accept that this imaginary and essentially indescribable Form of what we’re calling chairness is ‘perfect’, could we say the same of other objects that make up our everyday reality -Garbage, for example? Is there an analogously ‘perfect’ Form for Garbage? Even thinking about that seems, well, valueless. Silly.

But, then again, uncharted waters have always attracted the brave -some may say, the unusual- among us. For my part, I was on my way elsewhere when I tripped over an article sticking out like a root on a forest trail. I suppose I should have known better than to start reading it: https://aeon.co/ideas/philosophy-should-care-about-the-filthy-excessive-and-unclean

‘[C]an the ‘unclean’ – dirt, mud, bodily wastes, the grime of existence – be relevant to the philosopher’s quest for wisdom and the truth?’ the author, Thomas White, asks. ‘Philosophers don’t often discuss filth and all its disgusting variations, but investigating the unclean turns out to be as useful an exercise as examining the highest ideals of justice, morality and metaphysics. In his dialogue Parmenides, Plato gives us an inkling of the significance of philosophising about the unclean, which he names ‘undignified objects’, such as hair, mud and dirt.’ When Parmenides questions Socrates about the issue, even Socrates is troubled and changes the subject. What hope is there, then, to include it as a legitimate topic for philosophical inquiry?

As White observes, ‘The unclean’s ‘undignified objects’ represent a kind of outer twilight zone – a metaphysical no-man’s land – that eludes overarching theories about the meaning of reality… The unclean’s raw existence is a great intractable that rudely interrupts a philosopher’s thinking when it fails to fit neatly into the theory of forms, thus forcing the philosopher to curb hasty, ambitious generalisations, and think even harder and more clearly.’ Of course, it has been suggested that ‘Plato attacked his own theory of Platonic ideas in order to know the truth, not to defend his own preconceived views.’ Indeed, maybe we need to be careful about insisting that any one particular philosophical model should be able to explain everything. Even the discipline of physics admits that quantum theory and Newtonian theory seem to belong to separate Magisteria: each has its own domain -its own kingdom. Its own validity…

And yet for some reason, even in my dotage, I am reluctant to abandon Plato’s idea of Forms, no matter how societally objectionable the subject matter. Is there something to be said for, let’s say, filth -as in ‘not clean’- for which there may be a perfect Form? A ‘not-cleanness’ even a child could recognize?

When my children were young -so young that the world was fresh and new- they felt the need to explore: to climb whatever presented itself to their eyes, to look under things for what might be hidden there, and, of course, to taste whatever titillated their imaginations, or seduced their gaze.

As a parent, I have to admit that I assumed I should restrict their investigations to what I felt was safe and otherwise to what I found personally acceptable, but I couldn’t microscope them every second they were in my charge.

I remember one time, shortly after my daughter had learned to toddle around, I took her and her older brother out for a walk in a park near my house. The day was warm, and there was only one available park bench suitably placed as a base from which to watch the two of them wander around noisily within a little grassy clearing.

I must have dozed off in the sunlight, because when I opened my eyes the two of them were preternaturally quiet, huddled over something in the grass. Curious to see what they’d found so interesting, I sauntered over to discover my daughter contentedly munching away at something.

It didn’t look particularly edible, so I gently disentangled it from her mouth. I’m not sure what it was, and although parts of it were white, other parts, where she had managed to break through the exterior, were brown and, frankly, disgusting.

“That’s not a good thing to eat, Cath,” I said, as her face contorted into a proto-wail.

“She thought it was popcorn,” my son explained, with a theatrical shrug.

I saw another similar white object on the grass nearby that promptly disintegrated as I picked it up. “That’s not popcorn, Michael,” I said as I brought it as close to my nose as I dared.

He shrugged again, as Catherine began to cry. “I didn’t think it was,” he explained. “And anyway, I didn’t try any…” he added, rather guiltily I thought.

I picked up my daughter to calm her and stared at Michael. “Then why did you let her eat it?” I asked, shaking my head disapprovingly.

His little eyes slid up my face with all the innocence of childhood. “She thought it was pretty…” he explained.

I looked at the aged piece of canine detritus with new eyes. It was kind of attractive, I had to admit…