Illeism, or Sillyism?

Who would have thought that it might be good to talk about yourself in the third person? As if you weren’t you, but him? As if you weren’t actually there, and anyway, you didn’t want yourself to find out you were talking about him in case it seemed like, well, gossip? I mean, only royalty, or the personality-disordered, are able to talk like that without somebody phoning the police.

Illeism, it’s called -from the Latin ille, ‘he’- and it’s an ancient rhetorical technique used by equally ancient personages -Julius Caesar, for example, in the accounts he wrote of his exploits in various wars. It’s still in occasional use, apparently, but it stands out like a yellow McDonald’s arch unless you’re a member of a small cabal, sworn to secrecy.

Now that I mention it, I remember trying it once when I was very young and blaming our cat for scattering cookies all over the floor; but I suppose that doesn’t count, because my mother instantly realized I was actually using the third person singular in its grammatical sense, and sent me to my room for fibbing -without the cat. I didn’t even get a hug for my clever use of ancient rhetoric.

The episode kind of put me off third-personism until I read a little more about it in an adaptation of an article by David Robson, originally published by The British Psychological Society’s Research Digest and edited for Aeon. Robson is a science journalist and a feature writer for the BBC: https://aeon.co/ideas/why-speaking-to-yourself-in-the-third-person-makes-you-wiser

It seems illeism can be an effective tool for self-reflection. And although you may be tempted to opt for simple rumination, the article cautions against it: ‘the process of churning your concerns around in your head… research has shown that people who are prone to rumination also often suffer from impaired decision making under pressure, and are at a substantially increased risk of depression…’

Robson was intrigued by the work of the psychologist Igor Grossmann at the University of Waterloo in Canada, writing in PsyArXiv, which suggests that third-person thinking ‘can temporarily improve decision making… [and] that it can also bring long-term benefits to thinking and emotional regulation’ -presumably because the change in perspective allows the user to bypass -or at least appreciate- their previously held biases.

Grossmann, it seems, studies wisdom, and, ‘[w]orking with Ethan Kross at the University of Michigan in the United States… found that people tend to be humbler, and readier to consider other perspectives, when they are asked to describe problems in the third person.’

Hmm…

He read the article with a fair soupçon of wariness. Might this not, he wondered, be academic legerdemain? It managed to fool Robson, but surely not he who reads with not even a hatchet to grind. He, after all, is only a retired GYN, accustomed to addressing freshly delivered newborns and their unique anatomical appendages with the appropriate third-person labels. It’s hard to do otherwise with the unnamed. Indeed, it had always seemed germane, given the circumstances. To turn that on himself, however, might be contextually confusing -as well as suspicious.

So, his days as an accoucheur long past, he decided there would be little harm in trying it out in front of a mirror before he unleashed his full third-person on an unsuspecting face in Starbucks.

It seemed… unusual at first: he knew the individual in the reflection as well as himself, and addressing him as ‘he’ felt rude -creepy, actually. He did manage to get around the vertigo by pretending he was actually talking to a younger version of his brother, though, and ignored the fact that his brother was moving his lips at the same time and apparently not listening.

“Your brother,” he started, “is wondering if he should use the third-person approach when he is anxious about whether to order the sausage and egg bagel or just a cookie for breakfast at Starbucks.” A sudden thought occurred to him: “He could pretend he was sent to order it for his friend who is currently guarding a table for the two of them.”

He stared at the image in the mirror and frowned, suddenly remembering the cat-and-cookie incident.

He was uncertain where this was going. Was he supposed to ask what he -that is, ‘himself’- thought about the idea? And who, exactly, would be answering? The whole thing seemed like an endless hall of mirrors, an infinite regress of Matryoshka dolls.

“Okay,” he added, to assuage the guilt he assumed he would have fibbing to the barista, “He is just trying an experiment in non-gendered, non-directional conversation to solve a personal decisional paralysis. So, he is not trying to be weird or anything. He is actually just asking for your advice: would bagel or cookie be a better breakfast?”

Suddenly, an unexpected epiphany -maybe produced by the comparative ‘better’, but nonetheless apparent in the way the third person had phrased his question. Of course the bagel, with its protein-rich contents, was the ‘better’ breakfast! He was pretty sure that First-person-singular would never have seen that with such clarity –could never have seen it. Only by divorcing himself from his stomach, and mentioning it as if he were discussing a friend, did it become clear.

He stepped away from his brother at the mirror and smiled to himself. He’d discovered a way of distancing himself from himself long enough to see who he was from an outside perspective. Still, there was a nagging question that kept tugging at his sleeve: who was he when he asked those questions? And did he risk permanently closing the door to the person he used to be, or was it sort of like losing himself in a story and then swapping realities when he closed the book…? But what if he preferred what he was reading to what he was living…?

Whoa -pretty heavy stuff, that.

You know, it’s harder coming back to First-person than closing the book after a while, and I found myself switching back and forth for the longest time. I really wonder how thoroughly Grossmann and Kross had thought this through. And I wonder if Robson got caught up in their web as well. Nobody mentioned anything about collateral damage -but of course, they wouldn’t, would they?

All I can say is be careful, readers -there might be a third-person Minotaur at the end of the labyrinth.

Remembering Forgetting

We have to be careful, don’t we? Sometimes, we have to force ourselves to step back for a moment. When we want something –need something- to reassure us that we will be okay despite signs to the contrary, it’s all too easy to believe. All too easy to slip back into the warm, reassuring arms of a parent who tells us what we want so desperately to hear: that everything will turn out all right…

And I suppose that each of us has her favourite skeleton. However farfetched it may seem to others, it is a source of undue angst whenever the subject is broached, albeit innocently. With my mother, it was her curls. She lived in the sure and certain knowledge that when she got old, her hair would turn as straight as hay. It didn’t, but then again, I was never privy to whether or not her hairdresser was an accomplice.

My father, on the other hand, worried about God –but only, it has to be revealed, after I began to bring home my university textbooks on Philosophy to try their arguments out on him. At the time, I think I felt I was sharing my newfound freedom of ideas, but in retrospect I realize it was unkind. His background religious beliefs had not prepared him for the persuasive power of rhetoric in destroying what clever minds had decided were untenable arguments. He had not learned to step back; he had not learned to consider the source. Nor had I, for that matter…

It is why I have to be careful. It is one thing to cherish words and venerate ideas, and another to be convinced only by those that flatter the allegiances I have already formed. Perhaps that’s unfair not only to me, but to the ideas, and yet there is something distinctly unsettling about pernicious change. It’s why, throwing critical thinking to one side on occasion, I revel in reassurance. I want to believe in good-news experiments that cradle me, however briefly, in their arms.

There was a brief summary in a CBC News Second Opinion section with the title ‘Remembering forgetting could be a good thing.’ Now, how could that not attract the attention of someone whose bête noire is just that? Someone who chafes at the declining powers of a once proud memory? Someone who wants to blame it on age, and yet dares not –and whose mind, scrabbling among shards of memory, is persistently reassured that it can still remember the lament of Macbeth before his battle with Macduff at Dunsinane: ‘My way of life is fall’n into the sere, the yellow leaf, and that which should accompany old age, as honor, love, obedience, troops of friends, I must not look to have, but, in their stead, curses, not loud but deep, mouth-honor, breath which the poor heart would fain deny and dare not.’ Some things burrow deeply into the unguarded psyche, however irrelevant.

But the article, reporting on a study by Dr. Philip Gerretsen (a clinician scientist at Toronto’s Centre for Addiction and Mental Health) published in the Journal of Clinical Psychiatry (http://www.psychiatrist.com/JCP/article/Pages/2017/v78n08/16m11367.aspx), said that ‘Using brain imaging data and other clinical information from more than 1,000 patients with early cognitive decline, his new study suggests there’s a relationship between a person’s level of awareness of memory issues, and their risk of future disease.’ I cling desperately to fragments like this. ‘”Most intriguingly it’s the patients that seem to be hyper-aware of having some cognitive problems relative to their caregivers that actually don’t go on to develop dementia,” Gerretsen said, adding that those people might be suffering memory loss for other reasons, including anxiety or depression.’

And not only do I derive some satisfaction from his findings, I’ve also learned a new word that I hope to sprinkle surreptitiously into a conversation if I can actually remember it long enough: anosognosia — a neurological term for not knowing that you’re sick. Not realizing, in other words, that you’re forgetting things. ‘Gerretsen says there’s a suggestion that Alzheimer’s disease might be affecting the brain regions involved in illness awareness.’ I’ve decided that’s what I now think, too. It’s another straw to grasp, I suppose.

And yet, true to its etymology, the concept of anosognosia is not very well known. I was in a hospital elevator one afternoon on my way to the subterranean parking lot after visiting a friend. Normally crowded, the little chamber held only two older, tired-looking nurses huddled in the corner and leaning heavily on the walls; one of them was shaking her head slowly. “I get so annoyed with myself, Fran,” she continued, hardly noticing the novelty of my presence.

Fran, a stout woman with short, messy hair, managed to raise her eyes enough to rest them on her friend’s face. “Why’s that, Judy?” She didn’t really sound that engaged in the conversation –just polite.

Judy, equally stout but looking the more refreshed of the two -perhaps because of her bright red dress- sighed. “I always forget where I parked the car.”

The thought seemed to perk Fran up a little. “Happens to me all the time… I guess we park here so often, one space seems just like any other.”

“Yeah, but I really tried this morning… I did something or saw something I was sure would help me remember…”

Fran chuckled, more fully awake at the thought. “And now you can’t remember?”

Judy shook her head, smiling. “Worrisome, eh?”

They were both silent for a moment, and then Judy rescued her body from the wall in preparation for leaving, and glanced at her friend. “Do you think remembering that I’m forgetting things is a good sign…?”

Fran thought about it for a moment. “I would think that forgetting that you’re forgetting things would be worse…” she said as the elevator door opened and the two of them got out, giggling like schoolgirls.

Maybe some things are intuitive. Maybe hope is one of those things.

Fairness Which Strikes the Eye

Sometimes it seems we cannot help ourselves –the pull of the tide is just too strong to resist. And sometimes an argument, when considered too quickly, too uncritically, captures us with its ostensibly intuitive wisdom. We have no need to question it. No need to probe the basis of its logic.

The rhetoricians of old were well versed in this form of argument –the art of persuasion and how best to achieve it. Aristotle, for example, suggested three essential features of a convincing argument: ethos –the credibility of the speaker; pathos –an appeal to the needs and emotions of the audience; and logos –the patterns of reasoning and the words chosen. His wisdom, although modified and woven into the contemporary tapestry, has not been lost in modern times.

What could provoke a greater sense of outrage in a population than the 1% contention? That is to say, in at least one of the iterations fostered by the Occupy Movement, the claim that in the United States, 1% of the population controls 40% of the wealth. And to many, that unequal distribution of wealth is symptomatic of what is wrong with Capitalism. It certainly resonates with those of us in the 99% who hear it. It begs for remonstrance; it demands rectification.

And yet there are usually many sides to a story –or at least to this one, at any rate. There are times when we need to move back a step or two in order to appreciate the different perspectives. Even so, I have to admit that an article in the BBC Future series came as an intriguing surprise (http://www.bbc.com/future/story/20170706-theres-a-problem-with-the-way-we-define-inequality). It allowed me to entertain an alternative that I had not even considered.

As they tease at the beginning, ‘Some researchers argue that income disparity itself may not be the main problem. The issue, they say, is not the existence of a gap between rich and poor, but the existence of unfairness. Some people are treated preferentially and others unjustly – and acknowledging that both poverty and unfairness are related may be the challenge that matters more […] While many people may already view inequality as unfairness, making the distinction much clearer is important.’

They go on to say that ‘In a paper published in April in the journal Nature Human Behaviour called ‘Why people prefer unequal societies’, a team of researchers from Yale University argue that humans – even as young children and babies – actually prefer living in a world in which inequality exists. […] Because if people find themselves in a situation where everyone is equal, studies suggest that many become angry or bitter if people who work hard aren’t rewarded, or if slackers are over-rewarded.

‘“We argue that the public perception of wealth inequality itself being aversive to most people is incorrect, and that instead, what people are truly concerned about is unfairness,” says Christina Starmans, a psychology post-doc at Yale who worked on the paper.

“In the present-day US, and much of the world, these two issues are confounded, because there is so much inequality that the assumption is that it must be unfair. But this has led to an incorrect focus on wealth inequality itself as the problem that needs addressing, rather than the more central issue of fairness.” And as Mark Sheskin, one of the co-authors, remarks, ‘“People typically prefer fair inequality to unfair equality”’.

In a way, a lot of the argument hinges on definitions. There are, after all, several ways to look at inequality: equality of opportunity, equality of distribution of benefits, and of course, equality of outcome. Must all of them be addressed, or is there a priority? Is the existence of a super-rich 1% the problem, or would it be more helpful ‘to concentrate more on helping those less fortunate, who via a lack of fairness, are unable to improve their situation’?

‘Harry G Frankfurt is a professor emeritus of philosophy at Princeton University. In his book On Inequality, he argues that the moral obligation should be on eliminating poverty, not achieving equality, and striving to make sure everyone has the means to lead a good life.’ Poverty, in other words, is the problem; it is unfair…

I suppose, when considered practically, it would be unrealistic and unduly Utopian to think that we could ever do away with income disparity entirely; at least some degree of it seems inevitable. People ‘don’t typically work, create or strive without the motivation to do so’. It seems to me that the unfairness does not lie in the money fairly accumulated for work done, so much as in the fact that ‘not everyone is afforded the same opportunities to succeed, even if they put in that hard work.’

But, on the other hand, it’s not all simply a matter of the equality of opportunity, nor even of equality per se. Fairness is something else; it belongs in a different Magisterium altogether. I’m Canadian, and I believe that no one should have to live in poverty. Not everyone has the skills, or indeed the capacity, to hold a job, even if an opportunity presents itself. Some are disadvantaged by appearance, or gender; some are discriminated against by virtue of their origins, or lifestyle; some, even, have succumbed to past failures and have given up trying… It is unfair to give up on them –any of them- simply because of the lotteries of birth or circumstance.

Fairness, it seems to me, is universally available and accessible health care. It is a living wage that allows even the poorest to feed their family. It is safe and obtainable shelter. It is the respect afforded even to those we do not understand. It is toleration of difference, even when the rest of us may not understand, or agree with it.

It seems to me that inequality, by itself, is not what drives revolutions. Inequality is not what causes societies to weaken and their moral fabric to unweave. Inequality is just the chipped and discoloured veneer most easily visible on the surface. What festers directly underneath, sometimes only detectable when the surface weakens or is pulled asunder, is inequity. Injustice. Unfairness… Poverty, unlike wealth, offers little protection. And that is the iniquitous thing.

For some reason, I’m reminded of Shakespeare’s King Lear: ‘Through tattered clothes great vices do appear; robes and furred gowns hide all. Plate sin with gold, and the strong lance of justice hurtless breaks; arm it in rags, a pigmy’s straw does pierce it.’

Prove me wrong…