An Achilles Heel?


I’m going to go out on a limb and suggest that the average person -even if they’re only vaguely aware of Homer’s poems, the Iliad and the Odyssey; even if they are mildly conversant with the story of the siege of Troy and the Trojan horse; even if they have sort of heard of the Grecian heroes Odysseus and Achilles, or perhaps the Trojan hero Aeneas; and even if they could pretend they remember that the author -not to mention the stories and characters- may or may not have been based in reality- probably does not rank the colours of their skin and hair particularly high in the recollection. Frankly, I -certainly not a card-carrying member of any historical society- had not given it much thought. Well, none, actually -some things are just not that important, I guess.

And when I think of the way Homer was taught in my freshman class at university, I suppose I merely assumed that detailed descriptions were unnecessary -obviously, they would each look similar to the way we have portrayed Christ in medieval religious art: vaguely Caucasian. And in my student days, the zeitgeist of academia, as well as the rest of western society, seemed to be swimming in what we might now call white privilege. Of course the ancient Greeks were white -I mean, just look at the white marble statues they have bequeathed to us. The fact that they were originally brightly painted was not known -or at least not communicated to most of us in my day.

So, although the article in an edition of Aeon that questioned the skin colour of Achilles did not shock me, it did make me think about the long-held western conceit that the ancient Greeks, on whom we have modelled so many of our democratic ideas, were fair-skinned. Even as I put this assumption into words, I realize that, however unintended, it seems terribly racist. And yet, some things do need to be probed, clarified: https://aeon.co/essays/when-homer-envisioned-achilles-did-he-see-a-black-man

The essay, written by Tim Whitmarsh, a professor of Greek culture at the University of Cambridge, attempts to make sense of what little historical evidence exists from those almost pre-historical times. ‘The poems are rooted in ancient stories transmitted orally, but the decisive moment in stabilising them in their current form was the period from the 8th to the 7th centuries BCE. The siege of Troy, the central event in the mythical cycle to which the Homeric poems belong, might or might not be based on a real event that took place in the earlier Bronze Age, in the 13th or 12th century BCE. Historically speaking, the poems are an amalgam of different temporal layers: some elements are drawn from the contemporary world of the 8th century BCE, some are genuine memories of Bronze Age times… Achilles was not a historical personage; or, rather, the figure in the poem might or might not be distantly connected to a real figure, but that isn’t the point. Achilles, as we have him and as the Greeks had him, is a mythical figure and a poetic creation. So the question is not ‘What did Achilles look like?’ but ‘How does Homer portray him?’

Fragments of evidence exist, but many are fraught with translational discrepancies and contemporaneous social conventions that confuse the issue. For example, at the time, ‘females are praised for being ‘white-armed’, but men never are. This differentiation finds its way into the conventions of Greek (and indeed Egyptian) art too, where we find women often depicted as much lighter of skin than men. To call a Greek man ‘white’ was to call him ‘effeminate’.’

Also, ‘Achilles is said in the Iliad to have xanthos hair. This word is often translated as ‘blond’… [But] the Greek colour vocabulary simply doesn’t map directly onto that of modern English. Xanthos could be used for things that we would call ‘brown’, ‘ruddy’, ‘yellow’ or ‘golden’.’ And, ‘Weirdly, some early Greek terms for colour seem also to indicate intense movement… xanthos is etymologically connected to another word, xouthos, which indicates a rapid, vibrating movement. So, while xanthos certainly suggests hair in the ‘brown-to-fair’ range, the adjective also captures Achilles’ famous swift-footedness, and indeed his emotional volatility.’

‘So to ask whether Achilles and Odysseus are white or black is at one level to misread Homer. His colour terms aren’t designed to put people into racial categories, but to contribute to the characterisation of the individuals, using subtle poetic associations… Greeks simply didn’t think of the world as starkly divided along racial lines into black and white: that’s a strange aberration of the modern, Western world, a product of many different historical forces, but in particular the transatlantic slave trade and the cruder aspects of 19th-century racial theory. No one in Greece or Rome ever speaks of a white or a black genos (‘descent group’). Greeks certainly noticed different shades of pigmentation (of course), and they differentiated themselves from the darker peoples of Africa and India… but they also differentiated themselves from the paler peoples of the North.’ In other words, concludes Whitmarsh, ‘Greeks did not, by and large, think of themselves as ‘white’.’

This information would be filed in the ho-hum section of our need-to-know list for most of us, I think, and yet Whitmarsh, in his introduction, points out that ‘in an article published in Forbes, the Classics scholar Sarah Bond at the University of Iowa caused a storm by pointing out that many of the Greek statues that seem white to us now were in antiquity painted in colour. This is an uncontroversial position, and demonstrably correct, but Bond received a shower of online abuse for daring to suggest that the reason why some like to think of their Greek statues as marble-white might just have something to do with their politics.’

That there are people out there who seem threatened by knowledge which doesn’t accord with their own confirmation biases is, to me, more deeply troubling than mere disagreement. After all, we can disagree with something without being threatened by it. Disagreement allows for discussion, and possible attempts at rebuttal, using other evidence. Or countering with other interpretations of the same facts. In the end, isn’t it all just a game? An academic exercise which, after the initial flurry of excitement and barrage of words, should end, like all closely fought games, with a glass of wine?

The primrose path?


Every so often, I feel I have been blindsided -kept out of the loop either because I haven’t been diligent in my reading, or, more likely, haven’t thought things through adequately.

Philosophy concerns itself with the fundamental nature of reality, so I had always assumed there were few, if any, territories left untouched. In fact, I would have thought that the very nature of the discipline would have enticed its members to explore the more problematic subjects, if only to test the waters.

Of course, it’s one thing to continue to study the big topics -Beauty, Truth, Knowledge, and so on- but quite another to subject more controversial, unpleasant issues like, say, Garbage, or Filth, to critical philosophical analysis. At best one might argue it would be a waste of time commenting on their existential value. In fact, even suggesting that they might be worthy of philosophical consideration borders on the ridiculous and the pointless -yet another example of a discipline grown dotty with age.

I have always felt that Plato was on to something in his insistence that what we experience are only particular and incomplete examples of what he called ideal Forms. We can all recognize a chair, for example, despite the fact that chairs can assume many forms, with innumerable shapes and sizes. And yet somehow, out of all the variations, even a child can recognize a chair: they can recognize the chairness of the object, if you will. So, it seems we can all understand the idea that any one particular example of a chair, or a triangle, say, is only a sample of the Forms of chairness, or triangleness… And because the Forms are only describable in the particular, we can never experience the true Forms except in our imagination. The Forms are, in effect, perfect and unchanging, unlike their earthly examples.

Where am I going with this? Well, although we might accept that this imaginary and essentially indescribable Form of what we’re calling chairness is ‘perfect’, could we say the same of other objects that make up our everyday reality -Garbage, for example? Is there an analogously ‘perfect’ Form for Garbage? Even thinking about that seems, well, valueless. Silly.

But, then again, uncharted waters have always attracted the brave -some may say, the unusual– among us. For my part, I was on my way elsewhere when I tripped over an article sticking out like a root on a forest trail. I suppose I should have known better than to start reading it. https://aeon.co/ideas/philosophy-should-care-about-the-filthy-excessive-and-unclean

‘[C]an the ‘unclean’ – dirt, mud, bodily wastes, the grime of existence – be relevant to the philosopher’s quest for wisdom and the truth?’ the author, Thomas White, asks. ‘Philosophers don’t often discuss filth and all its disgusting variations, but investigating the unclean turns out to be as useful an exercise as examining the highest ideals of justice, morality and metaphysics. In his dialogue Parmenides, Plato gives us an inkling of the significance of philosophising about the unclean, which he names ‘undignified objects’, such as hair, mud and dirt.’ When Parmenides questions Socrates about the issue, even Socrates is troubled and changes the subject. What hope is there, then, to include it as a legitimate topic for philosophical inquiry?

As White observes, ‘The unclean’s ‘undignified objects’ represent a kind of outer twilight zone – a metaphysical no-man’s land – that eludes overarching theories about the meaning of reality… The unclean’s raw existence is a great intractable that rudely interrupts a philosopher’s thinking when it fails to fit neatly into the theory of forms, thus forcing the philosopher to curb hasty, ambitious generalisations, and think even harder and more clearly.’ Of course, it has been suggested that ‘Plato attacked his own theory of Platonic ideas in order to know the truth, not to defend his own preconceived views.’ Indeed, maybe we need to be careful about insisting that any one particular philosophical model should be able to explain everything. Even the discipline of physics admits that quantum theory and Newtonian theory seem to belong to separate Magisteria: each has its own domain -its own kingdom. Its own validity…

And yet for some reason, even in my dotage, I am reluctant to abandon Plato’s idea of Forms, no matter how societally objectionable the subject matter. Is there something to be said for, let’s say, filth -as in ‘not clean’- for which there may be a perfect Form? A ‘not-cleanness’ even a child could recognize?

When my children were young -so young that the world was fresh and new- they felt the need to explore: to climb whatever presented itself to their eyes, to look under things for what might be hidden there, and, of course, to taste whatever titillated their imaginations, or seduced their gaze.

As a parent, I have to admit that I assumed I should restrict their investigations to what I felt was safe, and otherwise to what I found personally acceptable, but I couldn’t microscope them every second they were in my charge.

I remember one time, shortly after my daughter had learned to toddle around, I took her and her older brother out for a walk in a park near my house. The day was warm, and there was only one available park bench particularly appropriate as a base from which to watch the two of them wander around noisily within a little grassy clearing.

I must have dozed off in the sunlight, because when I opened my eyes the two of them seemed preternaturally quiet, huddled over something in the grass. Curious to see what they had found so interesting, I sauntered over to discover my daughter contentedly munching away at it.

It didn’t look particularly edible, so I gently disentangled it from her mouth. I’m not sure what it was, and although parts of it were white, other parts, where she had managed to break through the exterior, were brown and, frankly, disgusting.

“That’s not a good thing to eat, Cath,” I said, as her face contorted into a proto-wail.

“She thought it was popcorn,” my son explained, with a theatrical shrug.

I saw another similar white object on the grass nearby that promptly disintegrated as I picked it up. “That’s not popcorn, Michael,” I said as I brought it as close to my nose as I dared.

He shrugged again, as Catherine began to cry. “I didn’t think it was,” he explained. “And anyway, I didn’t try any…” he added, rather guiltily I thought.

I picked up my daughter to calm her and stared at Michael. “Then why did you let her eat it?” I asked, shaking my head disapprovingly.

His little eyes slid up my face with all the innocence of childhood. “She thought it was pretty…” he explained.

I looked at the aged piece of canine detritus with new eyes. It was kind of attractive, I had to admit…

A snowball’s chance… where?

Remember when Goldilocks sampled the porridge in the three bears’ cottage? One bowl was too hot, another too cold, but baby bear’s was just right. Well, when it comes right down to it, I think I am pretty well a just-right-baby-bear kind of person. In fact, until recently, I figured we all were… But, as it usually turns out when I declare my allegiance to one side or the other, I’ve just discovered I made the wrong choice. Again.

I mean, it just makes sense to split the difference, eh? Try to choose the middle of the bell curve so you’ll have room to maneuver if -or in my case, when- you back the wrong horse. From the middle, you can always say you were actually leaning towards the winning side -which you can’t from across the room. I learned that as a child who was owned by a railroad family which moved every year or so to a different part of Canada.

When we lived in the Prairies, I tried to pretend I liked the cold, but apart from throwing snowballs at passing buses, or hurling myself down snowdrifts on a piece of cardboard, I actually hated winter -it was far too cold. And on each blizzard-filled journey to and from the neighbourhood school -we were expected to walk in those days, not be driven- I was bundled up in so many layers, and my face shrouded by a scarf wrapped around it a hundred times, that I would sometimes trundle off in the wrong direction until my mother ran out to point me another way. I was quite young then, of course, and each time I hoped she was coming to tell me school had been cancelled; I soon realized that in Winnipeg, they only cancelled classes if one of the rivers flooded.

The summers were not much better there -but they were even worse in the parts of Ontario where we ended up on our next several migrations. Put simply, even if you discounted the mosquitoes, the black flies, and the pollen, and were careful not to step on snakes, or wander through poison ivy, or, for that matter, follow the dog through the bush and end up having your mother pull ticks off your arms and legs when you got home, it was far too hot. Far too muggy. We couldn’t have afforded an air conditioner in those days -even if they had been invented- so I had to fight my brother to sit directly in front of the household’s only fan, and never behind him, because, well, my brother smelled like a gym-bag when he perspired.

But I had always felt there was a credible argument for compromise. And, let’s face it, with temperature, it’s probably easier to don a coat or a sweater if it’s a little chilly than to start stripping down if it’s too hot. I mean, I know you can’t please everybody, but I always thought that my compromises could stand the rough and tumble of any contrarian opinion. Until, that is, I bumped into an article in the Smithsonian Magazine that reported on a study published in PLOS One by researchers Tom Chang and Agne Kajackaite: https://www.smithsonianmag.com/smart-news/chilly-rooms-may-cool-womens-productivity-180972279

Their work suggested that ‘cold temperatures can negatively impact women’s cognitive performance.’ It would seem that ‘Temperature systems in many modern offices follow a decades-old model based on the resting metabolic rate of an “average male,” which is typically faster than a woman’s metabolic rate. Faster metabolisms also generate more body heat, which in turn means that women are often left shivering in the workplace.’

Now, we’re not talking Antarctic conditions in the room, or anything, and the performance differences measured were not Trump-resigns-under-pressure headlines, for sure, but nevertheless differences there were: ‘An increase in temperature of just 1.8 degrees Fahrenheit was associated with a 1.76 percent increase in the number of math questions that female participants answered correctly—which may not seem like a lot, but it is nearly half of the four percent performance gap that exists between male and female high school students on the math section of the SAT … Increasing the temperature by 1.8 degrees Fahrenheit also boosted women’s performance on the verbal task by around one percent. Men, on the other hand, performed more poorly on the math and verbal tests in warmer temperatures.’

But wait a minute here. ‘[W]omen’s enhanced cognitive performance in warmer environments seemed to be driven by the fact that they were answering more of the test questions; the dip in male cognitive performance, on the other hand, was linked to a decrease in the number of questions answered.’ Uhmm… Isn’t that a little like equating absence of evidence with evidence of absence? (I always enjoy using that aphorism whenever I can fit it in.)

Anyway, I have no reason to question the results, and I have to say I was further softened by one author’s explanation that ‘the students might simply have felt better, which in turn prompted them to exert more effort.’ Fair enough -that’s something a Winnipeg kid would understand: it’s hard to concentrate with a scarf wrapped around your face, or wherever.

There may be a little more work to do in resolving the so-called ‘battle of the thermostat’, however. ‘[T]he pool of participants [543 students from universities in Berlin], though large, was made up solely of college students. The research is, in other words, not representative of the age and education level of the general population.’ Still, ‘the study suggests that dismantling the “thermostat patriarchy” is about more than fostering women’s comfort—it’s also a question of productivity.’

Too bad they couldn’t have done a study like that during a Winnipeg blizzard when I was young and wrapped. But then again, the sample studied -male or female- would have been horribly biased: only those of us who actually made it to school would have survived to take the test. And who knows anything about those whose mothers weren’t watching the direction their little tykes were pointed when they left the safety of the house? Could we use the ‘evidence of absence’ thing again…?

Sapere aude

Sapere aude – ‘Dare to know’, as the Roman poet Horace wrote. The phrase was later taken up by the famous Enlightenment philosopher Immanuel Kant, and it seemed like a suitable rallying cry as I negotiated the years that led from youth to, well, Age. Who could argue that ignorance is preferable to knowledge? That understanding something better facilitates an informed decision about whether to believe or reject? To welcome, or close the door?

Admittedly, knowledge can be a moving target, both in time and perhaps in temperament as well. Whatever ‘knowing’ is that determines the appeal of a particular political philosophy, say, is not immutable, not forever carved in marble like the letters on Trajan’s column. One could start off in one camp, and then wander into another as the years wear on. Perhaps it is the gradual friction of experience rubbing on hope that effects the change -but however it works, exposure can alter what we believe. If nothing else, it speeds adaptation, and enables us to habituate to things that we might once have shunned. And it is precisely this ability to acclimatize that may prove worrisome.

An essay by the philosopher Daniel Callcut drew this to my attention a while ago: https://aeon.co/ideas/if-anyone-can-see-the-morally-unthinkable-online-what-then

‘There are at least two senses of ‘morally unthinkable’. The first, that of something you have no inkling of is perhaps the purest form of moral innocence. Not only can you not contemplate doing X: you don’t even know what X is. This is the innocence that parents worry their children will lose online… Then there is the worry that if something becomes thinkable in the imaginative sense, then it might eventually become thinkable in the practical sense too… If virtue depends in part on actions being unthinkable, then the internet doubtless has a tendency to make unvirtuous actions all too thinkable… The idea that being a decent person involves controlling the kinds of thoughts you allow yourself to think can easily be met with resistance. If virtue depends on limits to what is thinkable, and a certain free-thought ideal celebrates no limits, then the potential conflict between freethinking and virtue is obvious.’

Of course, one of the several elephants in the room is the pornographic one -the ‘public discussion of the internet’s potential to undermine virtue focuses on the vast amount of easily accessible pornography… Porn, the research suggests, has the tendency to encourage the prevalence of thoughts that shouldn’t be thought: that women enjoy rape, and that No doesn’t really mean No. More generally, it has the tendency to encourage what the British feminist film theorist Laura Mulvey in the 1970s dubbed the ‘male gaze’: men staring at women’s bodies in a way that bypasses concern for a woman’s consent.’ And, not only that, there was the intriguing suggestion that ‘Liberals, worried about potential censorship, can sometimes find themselves defending the implausible position that great art has great benefits but that junk culture never produces any harms.’

As Callcut writes, ‘What we imagine is not inert: what we think about changes the people we are, either quickly or over time – but it still changes us.’ So, ‘If the image you are looking at is disturbing,’ he asks, ‘is it because it is explicit and unfamiliar to you, or is it because it is wrong? When are you looking at a problem, and when is the problem you?’ There is a definite tension ‘between virtues that by their nature restrict thought and imagination and the prevailing spirit of the internet that encourages the idea that everything should be viewable and thinkable.’

In other words, is it better not to know something? Is Sapere aude anachronistic, inappropriate -dangerous, even?

I find myself drawn back in time to something that happened to me when I was around 13 or 14 years of age. There was no internet in those days, of course, and word of mouth, or naughty whispers with subtle nudges, were sometimes how we learned about adult things.

A somewhat duplicitous friend had lent me a book to read: The Facts of Life and Love for Teenagers, I think it was called. His parents had given it to him when they’d found his stash of overly-suggestive magazines hidden in a closet. I wasn’t sure what to make of the loan, but at that tender age, and in those pre-social media days, there was much about life that remained mysterious and hidden from me. I hadn’t yet given much thought to girls; it was still an innocent time.

I remember being embarrassed even handling the book -especially since it didn’t look as if it had even been opened. My first instinct was to hide it somewhere my mother wouldn’t find it. Obviously the closet hadn’t worked for my friend, so, since it was summer, I decided to put it at the bottom of my sock drawer where I kept the ones I only used in winter. She’d never need to burrow down that deeply.

But, oddly enough, a few days later, I discovered the book had acquired a folded piece of paper in the ‘How babies are made’ section. ‘Read this,’ the note said in my mother’s unmistakeable cursive.

The next morning at breakfast I could hardly look up from my plate, but to her credit, she acted as if it was just another summer’s day: the radio on the shelf was playing some music softly in the background, and my father was buried behind his newspaper.

But the discovery triggered an embarrassing walk with my father who had obviously been delegated by my mother to deliver the Talk, as my friends termed it in those days. And although it turned out well, I couldn’t help but think I had crossed a line in my life. And judging by the gravity with which he approached it, I had just been initiated into a hitherto forbidden club.

In this case, fortunately, the not-yet imagined realm was discussed sensitively and, with many blushes on both our faces, placed in a realistic context -and with what I would later realize was a sensible perspective…

Despite my age, and after all these years, I suspect I continue to be naïve about many things, and yet I still feel there is a need to defend the ‘Dare to know’ exhortation. Virtue does not depend on actions never considered, nor on a drought of as-yet-unimagined things; decency does not simply require controlling what you allow yourself to think, any more than pulling the covers over your head at night protected you from the bogeyman in the room when you were a child.

Virtue -morality- isn’t the absence of temptation; there is, and probably will continue to be, an allure to what we do not know -to what is kept hidden from us. There will always be a struggle, I imagine, and the more you know about it -and about the world- the more you enable yourself to understand context. I still wonder what type of adulthood I might have wandered into had my mother not found that book and realized there was an opportunity.

Sapere aude, I almost wish she had written instead, in that note to her already nerdy child -I think I would have loved the Latin.

Do you play crib?

I’m afraid I was a user, but long ago, you understand -before I really knew what I was doing. At that age, you have to depend on your parents, I suppose, but we all know what a lottery that is… At any rate, so the story goes, I escaped unscathed when the contraption I was using tipped over in the parental bed during the night.

It was a crib my father had built and carefully re-sanded so his youngest son would not suffer the same splinters his older child had gathered in his cheek from the same container. Even parental beds are inherently unstable and tippy -a property he felt would work in his favour to rock the baby and ensure a modicum of sleep for my exhausted mother… and him, of course.

It was a clunky thing though, I’m told. It had high walls to prevent inadvertent crawl-out, but no breast-holes for ease of night-feeding. It also failed to position its center of gravity low enough to counter any endogenous, let alone exogenous activity, and apparently all three of us were, well, active in the depths of night. The result was predictable: the crib and I spent the rest of my useful infancy on the floor near -but not too near- the bed.

I was reminded of this autobiographical detail from my early life by a delightful article written by Christina Szalinski in the Smithsonian Magazine about novel ways of conquering the nocturnally insomnial tendencies of babies: https://www.smithsonianmag.com/innovation/history-cribs-other-brilliant-bizarre-inventions-getting-babies-to-sleep-180972138

‘Throughout history parents have invented places for their babies to rest—rockers, hammocks, swings, carriers, cribs and more… The original baby rockers were likely hammocks. Wooden cradles came later, and in the nineteenth century, metal became popular for hygienic reasons.’ But, of course, rocking required work -repetitive work- and a person would get tired, not to say bored after a while so ‘turn of the twentieth-century inventors added cogs, spring motors or hand cranks to cradles so they could rock, at least for a while, on their own.’ And, as technology and catchy labels evolved, new and improved models soon took over: ‘Now we have the Bluetooth-enabled 4moms mamaRoo 4 swing that “moves like you do,” the Graco Sense2Soothe swing with “Cry Detection Technology,” the SNOO Smart Sleeper bassinet with “calming sensations of the womb,” and Ford’s Max Motor Dreams crib that was made to mimic a car ride (however, this one was never sold to the public).’ No need for sanding -my father would have loved them.

But I can’t help but think he didn’t do much in the way of historical research into his project. As a matter of fact, for years the only books I remember in the bathroom library were Reader’s Digests. Szalinski tells us that ‘A half barrel with all but three slats removed, one on each side and one on top, was probably the world’s first device designed for nighttime sleep. Called an “arcuccio” or “arcutio,” Italian for “little arch,” this seventeenth-century creation was put on the mother’s bed with baby inside, allowing a mother to sleep and breastfeed throughout the night without the possibility of rolling onto her infant, or having her infant roll out of bed.’

But, in a way, I’m glad my father was who he was -a no-nonsense, practical inventor who was unswayed by neonatal fashionistas- because there was an American pediatrician, Luther Emmett Holt, who wrote a book called The Care and Feeding of Children. In it, he said he believed that ‘“fresh air is required to renew and purify the blood”’ and that ‘“those who sleep out of doors are stronger children.”’

We lived in Winnipeg in those halcyon days, and exposing me to the whims of a prairie winter would have been counterproductive (I was born in December); mind you, the summer recourse to which city dwellers apparently resorted was to ‘put baby in a cage suspended out the window, much like an air conditioning unit.’ Apparently, writes Szalinski, ‘Eleanor Roosevelt used one in their townhouse window for their daughter, Anna, until a neighbor threatened to report her for child cruelty.’ We only had a one-story house at the time, so my plight might have gone unnoticed for weeks… Okay, hours…

The crib that I found a bit creepy, though, was one invented in 1944 by the experimental psychologist B.F. Skinner (of Skinner Box fame, for studying animal behaviour using -amongst other things- operant conditioning). His baby box, which he called the ‘air crib’, was ‘a completely enclosed crib with three solid walls and a ceiling, and a safety glass front, that allowed both temperature and humidity to be controlled for baby.’ He was apparently concerned that ‘being bundled up meant a child’s self-directed movement would be inhibited.’ But -surprise- what with his widely publicized animal experiments, the crib seemed a little too familiar and never caught on -especially amongst his lab associates.

Anyway, speaking of dealing with the very young and their undeniable penchant for rocking, Szalinski brings us up to the simple, why-didn’t-they-think-of-this-before, bi-gendered methodology of my own parental era: wearing them, of course. I mean, how hard is that?

And yet, ‘Babywearing fell out of favor in the mid- to late-nineteenth century in European and U.S. cities when roads were paved and strollers became a status symbol.’ Nevertheless, I can remember many a hike I took with my son comfortably strapped to my chest in a Snugli. I suppose I was lucky, though -first of all because, so far as I remember, neither my wife nor I tripped very much, and ‘because attachment theory shifted parenting attitudes in the 1970s and 80s. Warm, sensitive care and physical contact was no longer seen as a threat to a baby’s development of autonomy (like it was from the turn of the 20th century to the 1960s)—you could hold your baby (again) without “spoiling” them.’

I’m trying to remember whether or not I was spoiled. I don’t recall ever being carried around, nor, except for the crib-episode in those proto-Anthropocene, Snugliless days, ever being dropped, so I guess it all worked out. My father never taught me any carpentry, though.