There is nothing either good or bad but thinking makes it so.

If I’m brutally honest with myself, I suspect I side with Goldilocks in her preference for the just-right-baby-bear stance: not too hard, not too soft. I have always been more comfortable in the middle of the Bell Curve with lots of wiggle room on either side; I’m not really cut out to be an extremist. And I was always taught to be sensible and use things wisely: buy good quality and look after whatever I bought so it would last. But times change, I guess, and as the price of goods declined with improved production, and fashion began to favour frequent change, the temptation to buy and replace increased.

Still, making-do was what my parents did, although perhaps it was easier in the days when planned obsolescence was just an economic dream. Unfortunately, now I replace things more often than I reuse them – and more often than I really need to, as well. As the months slip past and the years pile up one upon another on the shelf, I fear I have also become a creature of things. A collector of more than just Time.

And yet I have always harboured a suspicion that new is not necessarily better, nor is more always preferable to less. I would probably make a rather poor salesman, with the menace of the sharp needle of conscience only millimetres away – although I have to confess to feeling the thrill of buying a new and different watch each time the five-year battery runs out. I justify it by telling myself I’m ensuring that someone will have a job – and anyway, I’m keeping the economy afloat… Aren’t I?

Dichotomies like these make it difficult to choose sides, though, don’t they? Sometimes it helps to step back far enough to see both sides of the street, and an essay by the author Nick Thorpe in Aeon put things in a rather intriguing perspective for me: https://aeon.co/essays/we-should-love-material-things-more-than-we-do-now-not-less

‘We’ve got used to the transitory nature of our possessions, the way things are routinely swept aside and replaced,’ he writes. ‘It’s one of the challenges facing the UK Department of Energy and Climate Change, whose chief scientific adviser, Professor David MacKay, in January bemoaned “the way in which economic activity and growth currently is coupled to buying lots of stuff and then throwing it away”… According to data aggregated by the Global Footprint Network, it takes the biosphere a year to produce what humanity habitually consumes in roughly eight months.’
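Just to make that arithmetic explicit – this is my own back-of-the-envelope reading of the quoted figure, not the Global Footprint Network’s published methodology – if the biosphere needs twelve months to regenerate what we consume in roughly eight, then

\[
\text{overshoot ratio} \;=\; \frac{\text{regeneration time}}{\text{consumption time}} \;=\; \frac{12\ \text{months}}{8\ \text{months}} \;=\; 1.5
\]

which is to say we would need about one and a half Earths to break even, and a year’s worth of biocapacity is spent by roughly the end of August.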

One could try to avoid consumer goods altogether of course, and yet things do wear out. Things do break. And some things become sufficiently outmoded that they no longer function in the rapidly evolving technosphere. So I suppose we have to persevere ‘with what the British psychologist Michael Eysenck calls the “hedonic treadmill”, holding out the unlikely hope that the spike of satisfaction from our next purchase will somehow prove less transitory than the last.’ But as Thorpe observes, ‘If Western consumer culture sometimes resembles a bulimic binge in which we taste and then spew back things that never quite nourish us, the ascetic, anorexic alternative of rejecting materialism altogether will leave us equally starved.’

The answer, as Goldilocks knew all along, lies in compromise. It’s not that we value things too much, but rather that we don’t value them enough. ‘The challenge is to cherish our possessions enough to care about where they came from, who made them, what will happen to them in the future.’

We seem to be innate categorizers, more attuned to the hierarchies of price than to intrinsic value. And yet, with only a slight shift in perspective, wouldn’t it be valuable for our possessions to ‘retain the pulse of their making’, as the British ceramicist Edmund de Waal put it? Much as a gift wears a different aura, adding a value to the object beyond what is merely contained in its function, so might a consideration of its history and its craft.

In the modern world we are too far from the source to marvel at the genius of its production. I am reminded of one of the novels of Mark Twain – A Connecticut Yankee in King Arthur’s Court – in which an American engineer from Connecticut sustains a head injury and awakens in the medieval English court of the mythical King Arthur. That world hadn’t yet developed the processes and gadgets we take for granted nowadays, so it fell to the engineer to devise them from scratch.

I tried to imagine making a working bicycle and the need to fashion it from whatever raw materials were at hand. And had I succeeded – which would have been well beyond my skill – I certainly would have viewed it in a different light than nowadays. The same had I built a clock, or fashioned reading glasses, or even devised a flashlight… My midden would not have required weekly removal.

Still, it’s not just about keeping everything you’ve ever bought – that would be hoarding – but about valuing the item enough to repair it and continue using it: Repair, Reuse, then Recycle. Thorpe writes about a growing trend (at least in his part of the UK) of repair shops, and about a delightfully named social enterprise, Remade in Edinburgh, which ‘is one of a growing network of community repair shops dedicated to teaching ordinary people to mend and reuse household goods.’

Of course, the ability to avail oneself of this sort of thing depends on the initial quality of the item, as well as on the chance to avoid the planned obsolescence of a technology wrapped in its own hubris. It also requires role models we can all admire and emulate – people, and things, of proven worth.

Much as we tend to look up to sports heroes, say, or famous scientists, perhaps we need look no further than ourselves, and our biology’s long evolutionary history of successful strategies for reusing what we already have. Sometimes there is no need to develop new genes, or even new organs, to increase our success: existing equipment can be repurposed. Exaptation is the word that the palaeontologist and evolutionary biologist Stephen Jay Gould and his colleague Elisabeth Vrba proposed in 1982 to describe a trait that comes to serve a function for which it was not originally fashioned. An example might be the feathers on dinosaurs – they may have been helpful for warmth, but they were obviously not originally suited to flight as in their later descendants, the birds. Or another example, drawn from my own specialty (Obstetrics), but brought to my attention in an essay in Livescience by Wynne Parry: ‘All vertebrates have sutures between the bones of their skulls to allow for growth, but in young mammals these sutures have acquired an additional use – easing birth by allowing the skull to compress as it passes through the birth canal.’

Inventions are seldom wasted in nature; why should we think our own artifacts need be an exception? There are so many obvious precedents that should encourage it – maybe would encourage it – if they were more widely known.

Okay, I’ll admit it’s quite a stretch, and perhaps unduly naïve to think that it would have much of an effect on the average person. But sometimes I think it’s important for us to believe that we have permission to rethink our obsession with novelty, and to realize that we are here today largely because of repurposing. Reusing. And then, ultimately, being ourselves recycled so we can all begin anew…

A Plague on Both Your Houses

The plague – nothing conjures up death quite like that word; after all, the bubonic plague wiped out half of Europe in the 14th century. But there have been others of its ilk – and all probably caused by the Yersinia pestis bacterium. Although the as-yet-unnamed infectious agent was identified in the 1890s by the bacteriologist Alexandre Yersin – working at the time for the Pasteur Institute on the plague outbreak in Hong Kong – the name was initially misattributed… Never work for somebody really famous when you discover something important. Personally, I preferred its previous name of Pasteurella pestis because that’s the name I was first taught, and I liked the alliteration. But never mind.

The plague has three different presentations, depending upon the organs infected: bubonic plague, from infection of the lymphatic system and localized as buboes (swellings of infected lymph nodes which may become necrotic and turn black in their attempt to defend the body); pneumonic plague – infection of the lungs, presumably from aerosolized droplets from coughing or the like; and the rarest and likely most fatal of the three, septicaemic plague, an infection of the bloodstream. Classically, the bacterium is carried by fleas, which are carried by rats, which then carry it to us – although the pneumonic form can also pass directly from person to person.

Although we tend to associate the word ‘plague’ with the infamous ‘Black Death’ of European fame – not least because of the shock value of its name, I suspect – there have been several plagues throughout history. The first was long thought to have been as early as 430 BCE in Athens, but a study published in the journal Cell in 2015 suggests that the bacterium had been infecting humans far earlier – since the Bronze Age, more than 5,000 years ago. But perhaps a more digestible article outlining the background is found in a BBC news report, also from 2015: http://www.bbc.com/news/health-34603116

‘Samples taken from the teeth of seven bodies contained traces of the bacterial infection in the Bronze Age. They also showed it had, at the time, been unable to cause the bubonic form of plague or spread through fleas – abilities it evolved later.’ You have to love this kind of information, eh?

‘In its early days, it could cause only septicaemic or pneumonic plague – which is nearly always deadly and would have been passed on by coughing. By analysing the bacterium’s genetic code through history, the researchers estimate it took until 1000 BC for plague to evolve into its more familiar form. One mutation – acquiring the ymt gene – allowed the bacterium to survive inside the hostile environment of a flea’s gut. […] Developing a separate gene, called pla, allowed the infection to penetrate different tissues and cause bubonic plague.’

But all things change, don’t they? ‘Things fall apart; the centre cannot hold,’ in the unforgettable words of Yeats. And yet why would a pathogen evolve to destroy the very hosts on which it depends? Why burn the hotel…?

I suppose an easy explanation might be that of a game in which each side – host and pathogen – continually attempts to outsmart the other. More virulence in the invader leads to more defensive mechanisms in the invaded – measures as overt as quarantine or antibiotics, or as subtle (but hopefully preventative) as the immune resources built by vaccination and, over the longer term, the adaptation of endogenous immune defences: survival of the fittest.

But for me, the intriguing unanswered question remains: why kill your host? Why not coexist as, say, a parasite – or even a commensal – in the gut, or create a chronic condition that might weaken the owners but not eliminate them? Of course, some pathogens are just evolutionary dead ends – fireworks that illuminate the sky briefly and then disappear as suddenly as they appeared, or maybe finally settle into a desk job and plod along just under the radar. But I suppose even germs want some time on the pedestal. Nothing ventured, nothing gained… Ecological opportunities beg for exploitation – leave a window unlocked, and something will find it.

Of course there are other ways of making a living: attack, then retreat to fight again… While not strictly analogous, I am reminded of the Champawat tiger of Nepal (and later of the Kumaon district of India) in the late 19th century. She used to attack suddenly and then disappear before anybody could do anything about her. True, she was finally shot, but not before she had managed to kill almost 450 people in different locations and instilled fear of her return for years. Fear is like that – especially fear of what Donald Rumsfeld (a one-time U.S. Secretary of Defense, remember?) oxymoronically called the ‘known unknowns’.

The plague has managed a similar trick over the centuries, flaring up in one region, only to hide, then reappear in a totally different region later –often much later. ‘The most recent plague epidemics have been reported in India during the first half of the 20th century, and in Vietnam during wartime in the 1960s and 1970s. Plague is now commonly found in sub-Saharan Africa and Madagascar, areas which now account for over 95% of reported cases (Stenseth, 2008)’ [https://www.cdc.gov/plague/history/index.html]

But even those of us living in North America are not entirely safe – remember that Hong Kong plague Yersin was studying in the 1890s? A ship from there arrived in San Francisco in the summer of 1899 with plague found among some stowaways, two of whom escaped and drowned in the Bay. An epidemic of plague hit San Francisco nine months later. Whether it came from them or from rats that swam ashore is not known, but the disease has been with us ever since.

http://www.livescience.com/51792-plague-united-states.html  ‘Plague cases occur sporadically in the United States — between 1970 and 2012, an average of seven plague cases occurred yearly […] But plague cases don’t show up everywhere. Rather, most occur in rural areas in western states […] the CDC says. One reason why cases of plague are restricted to the West is that the rodent populations there carry the disease […] “Prairie dogs are one of the major rodent species that serves as a reservoir for plague, and they tend to be west of the 100th meridian” in the United States. For this reason, this line of longitude is sometimes referred to as the “plague line”.’

‘What, will the line stretch out to th’ crack of doom?’ asks Macbeth. I suspect that he would have found it fascinating that any of us would think we might be immune from history. And yet, despite all its bad press and the terrifying epithet of ‘Black Death’, plague cases in North America are rare. They can occur when people visit rural areas, says Dr. Adalja, an infectious disease specialist at the University of Pittsburgh’s Center for Health Security, although ‘people are more likely to be infected with tick-borne illnesses such as Lyme disease, than plague.’

Uhmm, I’d be careful with squirrels in California, though…