Is Lateral a Direction?

Damn! There they go again, pulling the masks off the faces of those of us who grew up hoping we were uniquely creative; those of us who eschewed the logical pathway of thoughts and instead stepped off the trail to see if anything was hiding in the bushes. That’s what we lateral thinkers like to think we do -I say ‘we’ laterally, of course. Although I certainly wasn’t a child in the 1960s, it was a time when my hormones had settled sufficiently to allow me to think of things further afield -more laterally than chromosomally. In fact, I suspect my thinking still wandered more than I would have liked, but the era was wrapped in sunlight and the resulting chiaroscuro made it hard to look in one direction only.

The stage was set, it seems, for someone like Edward de Bono, a Maltese doctor and researcher, to write The Use of Lateral Thinking, which espoused just what I had found myself doing: relaxing the need for vertical (logical?) thinking. Actually, he decided that several things needed to change if we wanted to be creative: things like recognizing that some ideas were especially persuasive, so we needed to find different ways of looking at them -and this meant pursuing a different way of approaching and solving problems, even if it involved incorporating serendipity.

It struck me as a relaxing way to approach life and although I never really went in for piercings or drugs, I could see how it might be seductive to some people. But it’s hard to maintain a belief when people keep poking holes in it. And it’s especially hard to feel safe within a dogma when the innards of the pillars you thought were supporting the roof are showing signs of decay. Still, I imagine stuff evolves as time moves on.

I suppose I did, anyway, although I couldn’t quite surrender the suspicion that there was value in approaching questions as if they were actually answers in disguise -well, at least that was the message I took from the lateral thinking craze. And then, I happened upon an essay in Aeon by Antonio Melechi, an honorary research fellow in the department of sociology at the University of York, which seemed to suggest that lateral thinking was, of all things, a pseudoscience: https://aeon.co/essays/lateral-thinking-is-classic-pseudoscience-derivative-and-untested

Imagine my disappointment when I read that ‘Historians of science questioned why de Bono invested so much in the genius ‘eureka’ moment, when invention and paradigm shifts were more commonly the work of communal endeavour and disputation.’ Not that I’ve ever experienced a genius moment, or anything, but I’ve always revelled in sudden surges of understanding that seemed to spring from a good night’s sleep. Melechi goes on: ‘Psychologists had more questions than most. Lateral thinking clearly overplayed the importance of the creative breakthrough at the expense of trial and error, feedback and reflection, not to mention unconscious incubation.’

I suppose I’ve always been an incubator, though -much as the process may cast the strength of my underlying gender into shadowy regions. But incubation, however unconscious, is still eurekoid, don’t you think? It’s still sort of lateral -something the conscious mind waits on until ideas can be suddenly, and perhaps even mysteriously, hatched from wherever.

De Bono and his lateral thinking have been criticized as being more derivative than seminal; but does it matter who first named it? Melechi points out that there has been ‘a long history of research into creativity, a rich treasury of thought and experiment that had almost certainly provided lateral thinking with most of its magpie principles and pre-owned methods.’ For example, in a lecture in 1880, the famous American philosopher and psychologist William James ‘observed that the ‘highest order of minds’ had a knack for straying from the ‘beaten track of habitual suggestion… we seem suddenly introduced into a seething caldron of ideas, where everything is fizzing and bobbing about in a state of bewildering activity, where partnerships can be joined or loosened in an instant, treadmill routine is unknown, and the unexpected seems the only law.’

Actually, ‘Known to some Enlightenment philosophes as ‘negative imagination’, this mercurially creative sensibility remained in the shadow of pathology and degeneration for much of the 19th century, and it fell to a new wave of French psychologists to push for its study and rehabilitation.’ So, ‘the mathematician and polymath Henri Poincaré dug deep into the ‘sudden illuminations’ that punctuated his research. The unbidden insights that had propelled him to make discoveries across various fields were, Poincaré insisted, evidence of complex work being undertaken subliminally, over days and weeks, as he busied himself with unrelated issues.’

Gestalt psychology, too, had arrived at the very notion of lateral thinking in all but name. ‘Wertheimer [psychologist Max Wertheimer] noted that logical-analytical thinking, or reproductive thinking, was hostage to repetition, habit and intellectual precedent. Insight and breakthrough, in science and everyday life, needed the irruption of ‘productive thinking’, the ability to look at a situation or problem from a new perspective.’

On and on goes the evidence to suggest that only the name ‘Lateral Thinking’ was new, and there I was thinking that I was finally surfing on a wave I could handle. That somehow it had unlocked the door to a personal heuristic (to use a lateral-thinkingly-derived change of idiom).

Perhaps there are no shortcuts to wisdom, though. There’s no sense, it seems, in simply wandering through the woods only to stumble into an unsuspected field of flowers rather than the missing answer for which you were searching.

Is it a sign of Age, though, that searching always needs to be teleologically driven? Planned, in other words? That there needs to be a reason for the search, other than bald, ungarnished curiosity? That you need an already prepared question to get an answer…?

Answers lie all around us, scattered like those wildflowers in the meadow; surely what we really need to do is find the right questions. The right keys that fit the locks. I don’t know about you, but I have always travelled with questions stuffed in my pockets. And so, if I happen to stumble upon an answer I hadn’t expected to be sleeping just off the trail somewhere, I merely fumble around in my jacket for the suitable question that I didn’t even know I was carrying.

Is that Lateral Thinking?

Should Life be a walking shadow?

I have to admit that I had not heard of the ‘attention economy’ before -never even thought about it like that, in fact. And yet, when I think about attention, I suppose I’ve always heard it used as a currency, a thing that was paid to a specified ‘other’ -the thing attended to, in other words. Inattention was no attention at all; it was an aimless, uncontrolled drift that gathered nothing of importance, and hence was to be discouraged. It was the kind of activity that, if noticed, would inevitably evoke a ‘pay attention’ rebuke from the teacher in a classroom, thus reinforcing the idea that there was only one flavour of the concept: directed attention.

Indeed, capturing attention is the way business works, isn’t it? Getting you to think about a particular product when the need arises, and making you aware of its attributes and benefits, is the only way to interest you in buying it. We tend to segment reality into little fragments that we grasp, one at a time, to sew into the fabric we call our day; from the moment we awaken until we slip, often exhausted, into sleep, we wear a patchwork quilt of attendances as our realities. Only in retrospect, should we ever choose to examine them, are we able to recognize the qualitative dissonances we have accepted as seamless.

Perhaps the perceptually dissolute may decide to comb the day for information, but somehow that act seems as bad as stripping a sentence of all its adjectives to get at the nouns and thereby losing the colour and vibrancy each was intended to wear. Sometimes attention misses the beauty of the forest by examining the trees too closely. At any rate, I’ve often thought there must be more than one variety of ‘attending to’…

I can’t help but notice people on the street walking past me wearing earphones, or staring at their phones, oblivious of my presence unless we bump. Do they see the series of comic faces in the clouds, or the hawk circling high above the little park? Do they notice the missing brick in the wall of the hospital across the street, or the sad old man leaning on the post outside the Emergency department? Would they stop to smell the flowers in the little pot beside the barbershop; do they wonder if the owner takes it in each night so it doesn’t disappear? Do they even care?

Do they feel the wind on their cheeks, and hear the rustle of leaves as it fights its way into the meadow? Do they see the squirrel that stares at them from a branch above their heads and wonder which tree he calls home? Even more important, do they notice that there are fewer birds singing in the trees above the trail in the nearby woods, and more litter along the way?

I suppose they are entangled in another, more important, world -and yet it’s probably the same one as mine, but without the distracting detours that make the journey as important as the destination. Of course I realize that few of us are monolithic, or so wedded to each moment that we are not tempted by diversions from time to time; we all daydream, I imagine, although some of us are more open to it than others; some of us are not wracked by guilt at the imagined loss of time that should have been better employed.

We have all, no doubt, travelled to someplace new to us, and been completely absorbed in the novelty of discovery: the unusual smells, the strangely loud buzz of traffic, or maybe the unexpectedly imaginative architecture, or the flash of unfamiliar clothes hurrying by on unexpectedly familiar bodies. In those moments, we are immersed in the experience, and only when the shock wears off do we re-emerge to attend to particulars and grasp at purpose.

Yes, I know I am not alone in seeking different ways of defining what it is to pay attention, but I have to say that I was delighted to find that someone had actually written an essay about it:

https://aeon.co/ideas/attention-is-not-a-resource-but-a-way-of-being-alive-to-the-world

Dan Nixon, a freelance writer and senior researcher at the Mindfulness Initiative in England, writes that ‘Talk of the attention economy relies on the notion of attention-as-resource: our attention is to be applied in the service of some goal.’ So ‘Our attention, when we fail to put it to use for our own objectives, becomes a tool to be used and exploited by others… However, conceiving of attention as a resource misses the fact that attention is not just useful. It’s more fundamental than that: attention is what joins us with the outside world. ‘Instrumentally’ attending is important, sure. But we also have the capacity to attend in a more ‘exploratory’ way: to be truly open to whatever we find before us, without any particular agenda.’

‘An instrumental mode of attention… tends to divide up whatever it’s presented with into component parts: to analyse and categorise things so that it can utilise them towards some ends.’ There is also, however, an exploratory way of attending: ‘a more embodied awareness, one that is open to whatever makes itself present before us, in all its fullness. This mode of attending comes into play, for instance, when we pay attention to other people, to the natural world and to works of art.’ And it’s this exploratory mode that likely offers us a broader and more inclusive way to experience reality: an attention-as-experience. In fact, it is probably ‘what the American philosopher William James had in mind in 1890 when he wrote that ‘what we attend to is reality’: the simple but profound idea that what we pay attention to, and how we pay attention, shapes our reality, moment to moment, day to day.’

And ‘It is also the exploratory mode of attention that can connect us to our deepest sense of purpose… the American Zen teacher David Loy characterises an unenlightened existence (samsara) as simply the state in which one’s attention becomes ‘trapped’ as it grasps from one thing to another, always looking for the next thing to latch on to. Nirvana, for Loy, is simply a free and open attention that is completely liberated from such fixations.’

I like the idea of liberation; I cherish the notion that by simply opening myself to what is going on around me, I am, in a sense, experiencing what the French mystic Simone Weil called ‘the infinite in an instant’. It sure beats grasping at samsara straws.

Is there nothing either good or bad but thinking makes it so?

Sometimes there are articles that set my head spinning. Or my mind. Ideas that I’d never thought of before. Ideas that make me rummage around deep inside, like I’m searching for a pencil, or my internal keyboard where I write the things I should remember. I often don’t, of course -remember them, I mean. How do you summarize and store enlightenment for a day, a week, a lifetime later? Sometimes you just have to explain it to yourself in your own words.

I subscribe to the online Aeon magazine -well, its newsletter, anyway -and I have to say that many of its articles are sufficiently mind-expanding as to qualify as epiphanous. Heuristic.

One such article, on belief, started me thinking. It was written by Daniel DeNicola, professor and chair of philosophy at Gettysburg College in Pennsylvania: https://aeon.co/ideas/you-dont-have-a-right-to-believe-whatever-you-want-to It questions whether we have the right to believe whatever we want -the denial of which is in itself anathema in many quarters. But surely there’s no harm in having the unfettered right to believe whatever we want.

And yet, as he asserts, ‘We do recognise the right to know certain things. I have a right to know the conditions of my employment, the physician’s diagnosis of my ailments … and so on. But belief is not knowledge. Beliefs are factive: to believe is to take to be true … Beliefs aspire to truth – but they do not entail it. Beliefs can be false, unwarranted by evidence or reasoned consideration. They can also be morally repugnant… If we find these morally wrong, we condemn not only the potential acts that spring from such beliefs, but the content of the belief itself, the act of believing it, and thus the believer.’

‘Such judgments can imply that believing is a voluntary act. But beliefs are often more like states of mind or attitudes than decisive actions … For this reason, I think, it is not always the coming-to-hold-this-belief that is problematic; it is rather the sustaining of such beliefs, the refusal to disbelieve or discard them that can be voluntary and ethically wrong.’

In other words, I may inherit beliefs from my family, or the circle of friends I inhabit, but that is no excuse for continuing to hold them if I come to realize they are harmful or factually incorrect.

‘Believing has what philosophers call a ‘mind-to-world direction of fit’. Our beliefs are intended to reflect the real world – and it is on this point that beliefs can go haywire. There are irresponsible beliefs; more precisely, there are beliefs that are acquired and retained in an irresponsible way. One might disregard evidence; accept gossip, rumour, or testimony from dubious sources; ignore incoherence with one’s other beliefs; embrace wishful thinking ….’

Here, though, DeNicola issues a caveat. He does not wish to claim that it is wrong, always, everywhere, and for anyone, to believe anything upon insufficient evidence. ‘This is too restrictive. In any complex society, one has to rely on the testimony of reliable sources, expert judgment and the best available evidence. Moreover, as the psychologist William James responded in 1896, some of our most important beliefs about the world and the human prospect must be formed without the possibility of sufficient evidence. In such circumstances … one’s ‘will to believe’ entitles us to choose to believe the alternative that projects a better life.’

Our beliefs do not have to be true when it is not possible to know what actually is true, or what will turn out to be the truth. As an example, ‘In exploring the varieties of religious experience, James would remind us that the ‘right to believe’ can establish a climate of religious tolerance.’ But even here, intolerant beliefs need not be tolerated: ‘Rights have limits and carry responsibilities. Unfortunately, many people today seem to take great licence with the right to believe, flouting their responsibility. The wilful ignorance and false knowledge that are commonly defended by the assertion ‘I have a right to my belief’ do not meet James’s requirements. Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas … In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.’

So, do we accept the right of the bearers of those beliefs to silence the rest of us, or should they merely be allowed to coexist in the noise? DeNicola prefers to think that ‘[T]here is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs – and that ethic both generates and limits our right to believe.’

But I wonder if the ethic is truly assignable -the noise can be overwhelming, and those who are the most persistent in its production end up deafening the rest of us. And although any responsibility for their beliefs should imply accountability, to whom are they accountable -to those in power, or to those who also share the belief? Do those with firmly held beliefs read articles like the ones in Aeon? And would they be swayed by the arguments even if they did?

Is it my responsibility to convince my opponents that my beliefs are right, or rather to set about proving that theirs are wrong? A fine distinction, to be sure, and one that seems inextricably embedded in the web of other beliefs I have come to accept as valid markers of reality. And yet I think the thesis of DeNicola’s argument -that a belief, even if possibly untrue, should at the very least not be dangerous, threaten harm, or prevent others from believing something else- is the most defensible. If nothing else, it carries the imprimatur of several thousand years of wisdom: the concept of reciprocity -the Golden Rule, if you will: what you wish upon others, you wish upon yourself. Or, in the Latin maxim long associated with the Hippocratic Oath: Primum non nocere -First of all, do no harm.