Wednesday, April 21, 2021

“Daddy, I am going to talk about my experience in an asylum” (by Paulo Coelho)

 

‘I entered a tiled cubicle. There was a bed covered with a rubber sheet and beside the bed some sort of apparatus with a handle.

“So you’re going to give me electric shock treatment,” I said to Dr Benjamim Gaspar Gomes.

“Don’t worry. It’s far more traumatic watching someone being treated than actually having the treatment yourself. It doesn’t hurt at all.”

I lay down and the male nurse put a kind of tube in my mouth so that my tongue wouldn’t roll back. Then, on either temple, he placed two electrodes, rather like the earpieces of a telephone.

I was looking up at the peeling paint on the ceiling when I heard the handle being turned. The next moment, a curtain seemed to fall over my eyes; my vision quickly reduced down to a single point, and then everything went dark.

The doctor was right; it didn’t hurt at all.’

 

The scene I have just described is not taken from my book, “Veronika Decides to Die”. It comes from the diary I wrote during my second stay in a mental hospital. That was in 1966, the beginning of the blackest period of Brazil’s military dictatorship (1964-1989), and, as if by some natural reflex of the social mechanism, that external repression was gradually becoming internalised (not unlike what is happening in the United States today, where a man doesn’t even dare look at a woman without having a lawyer by his side). So much so that good middle-class families found it simply unacceptable that their children or grandchildren should want to be ‘artists’. In Brazil at the time, the word ‘artist’ was synonymous with homosexual, communist, drug addict and layabout.

 

When I was 18, I believed that my world and that of my parents could coexist peacefully. I did my best to get good marks at the Jesuit school where I was studying, I worked every afternoon, but at night, I wanted to live out my dream of being an artist. Not knowing quite where to begin, I became involved in an amateur theatre group. Although I had no desire to act professionally, at least I was amongst people with whom I felt some affinity.

 

Unfortunately, my parents did not share my belief in the peaceful coexistence of two such diametrically opposed worlds. One night, I came home drunk, and the following morning, I was woken by two burly male nurses.

‘You’re coming with us,’ one of them said.

My mother was crying, and my father was doing his best to hide any feelings he might have.

‘It’s for your own good,’ he said. ‘We’re just going to have some tests done.’

 

And thus began my journey through various psychiatric hospitals. I was admitted, I was given all kinds of different treatments, and I ran away at the first opportunity, travelling around for as long as I could bear it, then going back to my parents’ house. We enjoyed a kind of honeymoon period, but, after a while, I again started to get into what my family called ‘bad company’, and the nurses reappeared.

 

There are some battles in life that have only two possible outcomes: they either destroy us or they make us strong. The psychiatric hospital was one such battle.

 

One night, talking to another patient, I said:

‘You know, I think nearly everyone, at some point in his life, has dreamed of being President of the Republic. But neither you nor I can ever aspire to that, because our medical record won’t let us.’

‘Then we’ve got nothing to lose,’ said the other man. ‘We can just do whatever we want to do.’

 

It seemed to me he was right. The situation I found myself in was so strange, so extreme, that it brought with it something unprecedented: total freedom. All my family’s efforts to make me the same as everyone else had exactly the opposite result: I was now completely different from all the other young men of my own age.

 

That same night, I considered my future. One option was to become a writer; the other, which seemed more viable, was to go properly mad. I would be supported by the State, I would never have to work or take on any responsibility. I would, of course, have to spend a great deal of time in mental institutions, but I knew from my own experience that patients there do not behave like the mad people you see in Hollywood films. Apart from a few pathological cases of catatonia or schizophrenia, all the other patients were perfectly capable of talking about life and had their own highly original ideas on the subject. Every now and then, they would suffer panic attacks, bouts of depression or aggression, but these did not last.

 

The greatest risk I ran in hospital was not of losing all hope of ever becoming President of the Republic, nor of feeling marginalised or unfairly treated by my family – because in my heart I knew that having me admitted to hospital was a desperate act of love and over-protectiveness on their part. The greatest risk I ran was of coming to think of that situation as normal.

When I came out of hospital for the third time – after the usual cycle of escaping from hospital/travelling around/going back home/enjoying a honeymoon period with my family/getting into bad company again/being readmitted into hospital – I was nearly twenty and had become accustomed to that rhythm of events. This time, however, something had changed.

 

Although I again got into ‘bad company’, my parents were growing reluctant to have me readmitted to a mental hospital. Unbeknown to me, they were by then convinced that I was a hopeless case, and preferred to keep me with them and to support me for the rest of my life.

 

My behaviour went from bad to worse, I became more aggressive, but still there was no mention of hospital. I experienced a period of great joy as I tried to exercise my so-called freedom, in order, finally, to live the ‘artist’s life’. I left the new job my parents had found for me, I stopped studying, and I dedicated myself exclusively to the theatre and to frequenting the bars favoured by intellectuals. For one long year, I did exactly as I pleased; but then the theatre group was broken up by the political police, the bars became infiltrated by spies, my stories were rejected by every publisher I sent them to, and none of the girls I knew wanted to go out with me – because I was a young man without a future, with no real career, and who had never even been to university.

 

So, one day, I decided to trash my bedroom. It was a way of saying, without words: ‘You see, I can’t live in the real world. I can’t get a job, I can’t realise my dream. I think you’re absolutely right: I am mad, and I want to go back to the mental hospital!’

 

Fate can be so ironic! When I had finished wrecking my room, I was relieved to see that my parents were phoning the psychiatric hospital. However, the doctor who usually dealt with me was on holiday. The nurses arrived with a junior doctor in tow. He saw me sitting there surrounded by torn-up books, broken records, ripped curtains, and asked my family and the nurses to leave the room.

‘What’s going on?’ he asked.

I didn’t reply. A madman should always behave like someone not of this world.

‘Stop playing around,’ he said. ‘I’ve been reading your case history. You’re not mad at all, and I won’t admit you to the hospital.’

He left the room, wrote a prescription for some tranquillisers and (so I found out later) told my parents that I was suffering from ‘admission syndrome’. Normal people who, at some point, find themselves in an abnormal situation – such as depression, panic, etc. – occasionally use illness as an alternative to life. That is, they choose to be ill, because being ‘normal’ is too much like hard work. My parents listened to his advice and never again had me admitted into a mental institution.

 

From then on, I could no longer seek comfort in madness. I had to lick my wounds alone, I had to lose some battles and win others, I often had to abandon my impossible dream and work in offices instead, until, one day, I gave it all up for the nth time and I went on a pilgrimage to Santiago de Compostela. There I realised that I could not keep refusing to face up to my fate of ‘being an artist’, which, in my case, meant being a writer. So, at 38, I decided to write my first book and to risk entering into a battle which I had always subconsciously feared: the battle for a dream.

 

I found a publisher and that first book (The Pilgrimage – about my experience on the Road to Santiago) led me to The Alchemist, which led me to others, which led to translations, which led to lectures and conferences all over the world. Although I had kept postponing my dream, I realised that I could do so no longer, and that the Universe always favours those who fight for what they want.

 

In 1997, after an exhausting promotional tour across three continents, I began to notice a very odd phenomenon: what I had wanted on that day when I trashed my bedroom seemed to be something a lot of other people wanted too. People preferred to live in a huge asylum, religiously following rules written by who knows who, rather than fighting for the right to be different. On a flight to Tokyo, I read the following in a newspaper:

 

According to Statistics Canada: 40% of people between 15 and 34, 33% of people between 35 and 54, and 20% of people between 55 and 64 have already had some kind of mental illness. It is thought that one in every five individuals suffers from some form of psychiatric disorder.

 

I thought: Canada has never had a military dictatorship and is considered to have the best quality of life in the world, so why are there so many mad people there? Why aren’t they in mental hospitals?

 

That question led me on to another: what exactly is madness?

I found the answers to both those questions. First, people aren’t in mental institutions because they continue to be socially productive. If you are capable of getting in to work at 9.00 a.m. and staying until 5.00 p.m., then society does not consider you incapacitated. It doesn’t matter if, from 5.01 p.m. until 8.59 a.m., you sit in a catatonic state in front of the television, indulge in the most perverted sexual fantasies on the Internet, stare at the wall, blaming the world for everything and feeling generally put upon, feel afraid to go out into the street, are obsessed with cleanliness or a lack of cleanliness, or suffer from bouts of depression and compulsive crying. As long as you can turn up for work and do your bit for society, you don’t represent a threat. You’re only a threat when the cup finally overflows and you go out into the street with a machine gun in your hand, like a character in a child’s cartoon, and kill fifteen children in order to alert the world to the pernicious effects of Tom and Jerry. Until you do that, you are deemed to be normal.

 

And madness? Madness is the inability to communicate.

 

Between normality and madness, which are basically the same thing, there exists an intermediary stage: it is called ‘being different’. And people were becoming more and more afraid of ‘being different’. In Japan, after giving much thought to the statistical information I had just read, I decided to write a book based on my own experiences. I wrote Veronika Decides to Die, in the third person and using my feminine ego, because I knew that the important subject to be addressed was not what I personally had experienced in mental institutions, but, rather, the risks we run by being different and yet our horror of being the same.

When I had finished, I went and talked to my father. Once the difficult time of adolescence and early youth was over, my parents never forgave themselves for what they did to me. I always told them that it really hadn’t been that bad and that prison (for I was imprisoned three times for political reasons) had left far deeper scars, but my parents refused to believe me and spent the rest of their lives blaming themselves.

 

‘I’ve written a book about a mental institution,’ I said to my 85-year-old father. ‘It’s a fictional work, but there are a couple of pages where I speak as myself. It means going public about the time I spent in mental hospitals.’

My father looked me in the eye and said:

‘Are you sure it won’t harm you in any way?’

‘Yes, I’m sure.’

‘Then go ahead. I’m tired of secrets.’

 

Veronika Decides to Die came out in Brazil in August 1998. By September, I had received more than 1,200 e-mails and letters relating similar experiences. In October, some of the themes touched on in the book – depression, panic attacks, suicide – were discussed in a seminar that had national repercussions. On 22 January 1999, Senator Eduardo Suplicy read out passages from my book to the other senators, and managed to get approval for a law which they had been trying to get through the Brazilian Congress for the last ten years, a law forbidding arbitrary admissions into mental institutions.

 

                    Paulo Coelho

                    Translated by Margaret Jull Costa

 

5 MIN READING: “Daddy, I am going to talk about my experience in an asylum” (paulocoelhoblog.com)

 

Wednesday, April 14, 2021

The journey of the wounded healer (by Matt Licata)

 

Many of us interested in spirituality and healing have been wounded in our lives – physically, emotionally, or at a soul level. Whether this wounding takes form by way of relational or attachment trauma, or through personal and archetypal betrayal, it has a way of coloring our perception and affecting our capacity to feel safe.


Our increasingly speedy and fragmented culture has come to pathologize valid human experiences such as grief, melancholy, anger, and uncertainty, giving rise to a psychiatric and self-help community determined to “cure” or “transcend” dimensions of the psyche that contain important (and even holy) data for our unique paths of individuation, creativity, and meaning.

For some, initiation occurs only by way of transition, dissolution, and loss, through an embodied confrontation with the unconscious and the unlived life. These experiences are not signs of error or mistake, but calls to depth and evidence of how our wounding can serve an initiatory function.

At times, deeper healing will require that the wound disclose itself in more subtle ways within the psyche and the body, where it can seem like things are getting worse. Tending to this organic unfolding of the healing process requires newfound levels of courage, patience, and trust.

It is not an easy life, that of the wounded healer – one that we do not choose consciously – but it is the honorable and noble inheritance of many who are called to the path.

It requires that we walk in this world against the grain, remain open to further wounding and revelation of shadow, and dare to consider the radical possibility that the ally will appear in infinite ways. Not to harm, but to reveal.

Even though it may seem as if we are alone – and in part we must walk this path by ourselves – we are never truly alone: unseen helpers, friends, and companions are always nearby, though at times they will take forms that are not immediately recognizable.

 

A Healing Space... reflections on love, meaning, and the aliveness of immediate experience: The journey of the wounded healer (alovinghealingspace.blogspot.com)

 

Monday, April 12, 2021

But the silence in the mind (by R.S. Thomas)

 

But the silence in the mind

is when we live best, within

listening distance of the silence

we call God. This is the deep

calling to deep of the psalm-

writer, the bottomless ocean

we launch the armada of

our thoughts on, never arriving.

 

It is a presence, then,

whose margins are our margins;

that calls us out over our

own fathoms. What to do

but draw a little nearer to

such ubiquity by remaining still?


by R.S. Thomas, from the collection Counterpoint, 1990

But the silence in the mind | Faithful to Science (grievingturtle.com)


Saturday, April 10, 2021

True gratitude is a communal emotion, not a wellness practice (by Michal Zechariah)

 

In March 2020, Europeans started gathering on balconies and by windows to cheer, applaud and show gratitude to their healthcare workers providing life-saving services during the pandemic. The regular cheering and clapping became a symbol of hope: human solidarity triumphing over fear and enforced isolation. Contrast that scene with another. Following the remission of her COVID-19 symptoms in July, Jazmin Grimaldi (daughter of Albert II, Prince of Monaco) told her thousands of followers on Instagram: ‘I am so thankful that I am starting to finally feel like myself today … I am grateful to be alive and healthy at this present moment.’

How were these situations different? Why is it inspiring to hear about the people of Europe shouting thankfulness from their rooftops, whereas Grimaldi’s message, while it induces sympathy, doesn’t inspire?

Besides the obvious discrepancy in the magnitude of the two displays, another clue is found in the question: to whom were these people thankful? The gratitude of the European public toward their healthcare workers is highly relatable. We understand the sacrifices that medical professionals make, and know that we owe them a debt that can never be paid in full. Our inability to adequately reciprocate their efforts only increases our gratefulness. By contrast, Grimaldi’s gratitude, however heartfelt, lacked an addressee – as do similar public utterances made by countless others. It wafted into digital space and dispersed, clinging to no one in particular. Grimaldi’s message conveyed an understandable sense of happiness and relief, but her sentiment didn’t necessarily establish a bond with other people.

Grimaldi’s style of gratitude is part of a wider societal pattern. The disappearance of benefactors (or donors) from scenes of thanksgiving has become particularly endemic to current American thought about gratitude, as its focus has shifted from the interpersonal function of thankfulness to its personal advantages. This is partly down to the influence of positive psychology: in the past two decades, scientists have grown increasingly attuned to the contribution of gratitude to both personal and interpersonal flourishing, crediting it with improving emotional wellbeing and promoting prosocial behaviours, changing our brains to help emotion regulation, and even with relieving symptoms of asthma. However, only a minority of studies have highlighted the social nature of gratitude, with most focusing on its benefits for the grateful subject. The aforementioned studies privileged personal rather than interpersonal aspects of gratitude. Test subjects were instructed to keep a written record of things for which they felt grateful or to perform a gratitude meditation, rather than to share their thankfulness with anyone else.

Seen as a personal emotion with considerable benefits, gratitude is increasingly marketed as a self-help instrument, as epitomised in the popularity of gratitude journals such as Good Days Start with Gratitude (2017): diaries designed to keep track of events, people and circumstances for which one feels grateful. With promises that they will bring a host of personal benefits, these journals translate some of the scientific findings about thankfulness and wellbeing into a wellness practice. And since the journal is a private document meant only for the writer’s eyes, any benefactors mentioned in it will probably never learn about the journal-keeper’s feelings.

The contemporary preoccupation with gratitude as an individual experience, and viewing it as a path to psychological wellness, is significantly different from how the emotion was understood historically. Whereas earlier theories of gratitude also concentrated on the importance of gratitude as an inward disposition, such theories nevertheless emphasised that gratitude derived its value from its interpersonal nature. In his treatise on religious affections, the 18th-century American preacher Jonathan Edwards described gratitude as a natural affection felt toward another who has benefited us. The power of gratitude is so great, according to Edwards, that it can momentarily induce positive feelings even toward our enemies (Edwards gives an example from the Old Testament with Saul’s thankfulness to his enemy David for sparing his life). By this view, although our gratitude might be rooted in a basic concern for our personal interests, its effect is decidedly interpersonal. Edwards considered gratitude one of the ‘better principles of human nature’ and viewed ingratitude as an especially heinous sin for its unnaturalness.

Recognition of the importance of gratitude as an interpersonal sentiment extends into antiquity, when bestowing gifts and reciprocating them was a central aspect of economic life. The Stoic philosopher Lucius Annaeus Seneca’s book On Benefits (composed after the year 56 CE, and newly translated by Miriam Griffin and Brad Inwood in 2011) offered the most extensive discussion of gratitude in the ancient world, and it continued to shape the Western concept of gratitude for centuries. In it, Seneca treats gratitude as a virtue that ought to be cultivated for social purposes:

That gratitude is an attitude to be chosen for itself follows from the fact that ingratitude is something to be avoided in itself, because nothing so dissolves and disrupts the harmony of mankind as this vice. For what else keeps us safe, except helping each other by reciprocal services? Only one thing protects our lives and fortifies them against sudden attacks: the exchange of benefits.

Seneca argues that the generosity of benefactors and the gratitude of recipients are the glue that holds society together and guarantees its survival. As a Stoic thinker who prioritised inner dispositions over outward circumstances, Seneca emphasised that feeling gratitude was more important than acting to reciprocate benefits received, but that the feeling was morally virtuous only insofar as it was directed at a benefactor. This interpersonal aspect of gratitude is essential: if a person only feels fortunate without crediting anyone for their good fortune (as in the social media messages and gratitude diary entries written by so many), they are not really being grateful at all. Contemporary philosophers even propose that being grateful for general states of affairs, rather than to any specific person, is a misnomer: when I am grateful for my general health, or grateful that it didn’t rain on my wedding day, what I’m actually feeling is not gratitude but appreciation.

Aside from positive psychology’s influence, why else is gratitude coming to be understood as a personal rather than interpersonal emotion, and even less as a virtue? Another part of the answer is surely that the interpersonal bonds and duties with which true gratitude saddles us are not always pleasant.

I remember a time when I was visiting my hometown in Israel and ran into a relative with whom I had a strained relationship. I’d popped into a coffee shop to change a large bill for coins for the bus, and she was there waiting on her order. Surprised but happy to see me, my relative insisted on giving me the change herself. It was a modest act of goodwill on her part but I found myself struggling to receive it: as I suppressed my instinct to refuse the gesture and cupped my hands to take the coins, an image of holding burning coals flashed through my mind. Why was I reacting so dramatically to such a small kindness? It wasn’t the monetary value of the gift that made me reluctant to accept it, but rather the interpersonal bond that accepting it would imply.

The French sociologist Marcel Mauss captured the essence of my predicament when he wrote in 1950 that a gift is received ‘with a burden attached’ because it binds the recipient to the donor. My mother tongue, Hebrew, reflects this hold of benefactors over their recipients in the expression assir todah – the equivalent of the English word ‘grateful’, it means literally a prisoner of thankfulness. The coins my relative gave me had little value, but taking them placed me in a debt of gratitude that, given the history of our relationship, I found hard to accept. On this occasion, gratitude did not feel good.

Literature too sometimes pushes against the claims of positive psychology about the personal benefits of gratitude. For example, in Samuel Beckett’s novel Molloy (1951), the titular protagonist, old and disabled, is apprehended by the police and interrogated aggressively at the police station. At last, he is approached by a woman whom he suspects to be a social worker. When she offers him a cup of tea, Molloy reflects:

Against the charitable gesture there is no defence, that I know of. You sink your head, you put out your hands all trembling and twined together and you say, Thank you, thank you lady, thank you kind lady.

The feeling that Molloy identifies with effusive thanksgiving is not gratitude in the contemporary sense, nor appreciation even, but rather humiliation. Even if he hadn’t wanted a cup of tea in the first place and preferred to be left alone, the cultural expectation to reciprocate the social worker’s kindness with an equal or greater measure of gratitude placed him in her immediate debt. To be grateful, Molloy says, is to cede power to the benefactor. By extension, Beckett’s protagonist offers a troubling alternative to Seneca’s egalitarian vision: for Seneca, gratitude offers a way in which even the poorest members of society can reciprocate the greatest benefits bestowed on them, simply by being thankful. But Molloy suggests that the expectation of gratitude risks deepening existing social gaps, since the less powerful in society will be forced into perpetual humiliating indebtedness, with the more powerful left to enjoy the role of charitable benefactors.

A similar concern about this aspect of gratitude is blown to an epic scale in John Milton’s Paradise Lost (1667). After leading an unsuccessful rebellion against God that cost him his heavenly position, Satan is on the brink of repentance:

What could be less than to afford him [God] praise,
The easiest recompense, and pay him thanks,
How due! Yet all his good proved ill in me,
And wrought but malice; lifted up so high
I sdeign’d subjection, and thought one step higher
Would set me high’st, and in a moment quit
The debt immense of endless gratitude, …

Satan faces a difficult problem: he understands that he ought to be grateful to God for everything he has received from him, but he can’t bear the emotional consequences. To be grateful would mean being burdened by the endless, joyless debt he owes his creator.

The recent popularity of gratitude as an instrument for enhancing personal wellbeing has obscured some of the complexity of this mental phenomenon: far from being only a personal emotion that points us to our blessings, it is primarily an interpersonal emotion that points us to our benefactors. This doesn’t mean that gratitude can’t feel good. More often than not, it does feel good to acknowledge others’ kindness toward us. But like any emotion that connects us to other people, gratitude can also be psychologically challenging. If as a society, we can recover the interpersonal significance of gratitude, it will confront us with the extent of our dependence on other people and their power over us. At the same time, and as Seneca argued, recognising this aspect of gratitude has the potential to bind us closer to one another, to strengthen our communities and relationships.

True gratitude is a communal emotion, not a wellness practice | Psyche Ideas


Wednesday, April 7, 2021

Nature is good for you. That doesn’t mean we should prescribe it (by Jeremy Mynott)

 

Iris Murdoch is best-known as a novelist but she was also a professional philosopher. Here she is in The Sovereignty of the Good (1970), reflecting on the transformative power of attention, in this case attention to the natural world:

I am looking out of my window in an anxious and resentful state of mind, oblivious of my surroundings, brooding perhaps on some damage done to my prestige. Then suddenly I observe a hovering kestrel. In a moment everything is altered. The brooding self with its hurt vanity has disappeared. There is nothing now but kestrel.

The passage looks back to a long tradition of reflections about the restorative powers of nature, but it also looks forward to what has, in recent years, become a torrent of public affirmations of nature as therapy. The idea is now almost a commonplace: confirmed by the moving testimony of many personal memoirs; supported by scientific research that quantifies the effects on our mental and physical health; and enthusiastically endorsed by all the big wildlife and conservation bodies. Indeed, the UK government itself confidently announces in its new Environment Bill: ‘Nature plays a vital role in public health and wellbeing.’ So, it’s official. Nature is good for you. But is it really so simple? Are they all talking about the same ‘nature’ and the same human needs? And where does ‘human nature’ come into it? I think Murdoch points us to a different and deeper insight.

It’s easy to believe in a general way in the positive benefits of nature. Don’t we all instinctively feel better for a walk in the fresh air, a view of some greenery and the sound of bird song? Even more so, surely, if we actively explore the natural world and engage with it in some way. I feel that myself, very strongly. Indeed, I’m co-author with two other naturalists, Michael McCarthy and Peter Marren, of a recent book, The Consolation of Nature (2020), sharing just such experiences. But in writing for this book, even as I found myself more than ever absorbed in the wonders and delights of nature, I grew increasingly sceptical about its proclaimed status as primarily a medical commodity.

To begin with, can nature really provide experiences that are either a necessary or a sufficient condition of enjoying good health?

Not a necessary condition, surely, since we can all think of people in excellent physical and psychological health who have little interest in nature. Nor a sufficient condition either, since conversely there are many people devoted to nature who are not thereby protected from serious illness, depression or stress. Richard Mabey’s book Nature Cure (2005) is regularly invoked as the inspiration for a whole genre of ‘nature therapy’ memoirs, celebrating the healing power of nature against various forms of bereavement, depression, addiction and despair. But, in fact, though Mabey’s work is a literary classic by a great naturalist, its title is somewhat misleading. His severe depression prevented him from responding as he usually did to nature, and it was only after he began to be cured that he could again enjoy what had been a lifelong passion. The other titles, too, all tell more complicated personal stories, with a range of (sometimes unhappy) outcomes. The idea of a genre of effective nature cure books, in short, is more of a publisher’s invention than a reliable medical library.

What these books do demonstrate, however, is that there’s no ‘nature pill’ or ‘green Prozac’ you can simply take for a quick fix. It doesn’t work that way. Which way it does work, though, is much harder to say. If we’re now considering the much more modest claim that it helps some people, some of the time, and in some respects, we need to be more specific. Does it make a difference, for example, which part or aspect of nature we’re exposed to? Are plants and birds better for us than mammals or insects? Some birds better than others? Wild or tame? Helen Macdonald, after all, the author of another literary tour de force, H is for Hawk (2014), found her solace in a captive bird. How about slime-moulds, snakes and spiders? Or bacteria and viruses – all part of nature? And a key theme in much current environmental rhetoric is to emphasise that we ourselves are also inextricably a part of nature. But aren’t other people part of what we’re trying to avoid in getting more in touch with nature and the wild …?

Ever since Roger Ulrich’s paper was published in the journal Science in 1984, demonstrating that patients recovering from gall bladder surgery made substantially quicker and better recoveries if they had a view from their beds looking outward to trees and greenery rather than inward to the brick walls of the ward, scientists have been trying to isolate such variables. Ulrich himself pointed to some of the difficulties in identifying the crucial factors, however. His paper is now more cited than read, but he was careful to emphasise its limitations. Would it have made a difference if those brick walls had attractive pictures on them? Was it the trees that made the difference to the lucky patients with an outside view, or would a view of inanimate nature – sky, mountains or water – have done so as well? How would they have reacted to a busy urban street scene? Was their main problem boredom or anxiety?

Subsequent studies have gone some way to answering such questions, in particular through the measurement of associated physiological phenomena such as blood-pressure levels and neurochemical rewards in the form of serotonin, dopamine and endorphins. It’s become something of an industry, in fact. If you Google ‘scientific studies of nature and health’, you will in less than a second get more than 1 billion results, referencing research in universities worldwide, both in mainline departments of biological, medical and environmental studies and in such emerging sub-disciplines as ecopsychology and ecotherapy. To the extent that such studies rely on patients’ own reports of their sense of wellbeing, however, the headline results remain very generic. What does it really mean, for example, to be told: ‘Go Wild – And Feel 30 Per Cent Healthier And Significantly Happier’, as The Sunday Times said on 31 May 2020, reporting on research from Derby University? Or ‘How Much Nature Is Enough? 120 Minutes A Week, Doctors Say’ as The New York Times said on 13 June 2019, reporting the study done at Exeter University? We are still a very long way from a functional analysis of what specific items in ‘nature’ produce what effects.

But there’s a larger problem, too, in this line of thought. It treats such medical rewards as just one more of the ‘services’ we receive from the natural world, alongside the pollination of our crops, waste recycling, carbon capture, flood protection, ecotourism and so on. The environmentalist Tony Juniper wrote a very persuasive book, What Has Nature Ever Done for Us? (2013), quantifying such benefits in monetary terms in order to attract the sympathetic attention of policymakers. The demonstration of these natural services has become another minor publishing industry, establishing important evidence for conservationists to deploy in their campaigns.

These might indeed be the only arguments that have decisive political force, but effective arguments aren’t the same as real reasons. These utilitarian considerations are not why we thrill to a nightingale’s song, a peacock butterfly’s fragile beauty or a bluebell wood in spring. As individuals, we respond to such things directly and for their own sakes, not after or because of some financial calculation. To appreciate the natural world with a sense of wonder, awe, curiosity, joy or affinity is to recognise an intrinsic value, not an instrumental one. Henry David Thoreau made a similar point in his graduation address at Harvard College in 1837:

This curious world which we inhabit is more wonderful than it is convenient; more beautiful than it is useful; it is more to be admired and enjoyed than used.

It’s the same with creative work in art, music, poetry and science, surely. They can all be demonstrated to bring quantifiable public benefits, ranging from cultural tourism to various practical applications. But such beneficial consequences are far from fully explaining the motivations of the practitioners themselves.

Indeed, logic itself requires that there must be goods with intrinsic value that need no further justification. Something can be good as a means only if there are some other things that are good as ends, otherwise the question ‘Good for what?’ leads to an infinite regress. Beauty, truth and happiness are all examples of intrinsic goods. And it’s the common experience, as well as the credo of naturalists, that the natural world offers us one form of direct access to such values.

But there’s a final catch, Murdoch’s catch. She goes on from the passage I quoted at the beginning to say:

A self-directed enjoyment of nature seems to me to be something forced. More naturally, as well as more properly, we take a self-forgetful pleasure in the sheer alien pointless independent existence of animals, birds, stones and trees.

The authors of the ‘nature cure’ books were discovering that nature could offer some balm for their ills. But, to find that relief, they had to attend directly to what they could see, hear, touch or smell; and then, if they were lucky, the psychological and other benefits might follow. It’s a corner-of-the-eye thing. There’s no point in putting beauty, wonder, inspiration, understanding and the other positive experiences we rightly associate with nature on some sort of ‘to do’ list. They are what philosophers call ‘supervenient’ on the experiences themselves. Try too hard and you’ll fail. It’s a subtle distinction but a fundamental one. It’s the act of attention, as Murdoch says, that takes you out of yourself and so delivers delights that a preoccupation with self would deny you. You have to lose yourself to find yourself.

Nature is good for you. That doesn’t mean we should prescribe it | Psyche Ideas

Should Computers Run the World? - with Hannah Fry


Algorithms are sensitive. People are specific. We should exploit their respective strengths

The capabilities of algorithms and human brainpower overlap, intersect and contrast in a multitude of ways, argues Hannah Fry, an associate professor in the mathematics of cities at University College London, in this lecture at the Royal Institution from 2018. And, says Fry, planning for an efficient, ethical future demands that we carefully consider the respective strengths of each without stereotyping either as inherently good or bad, while always keeping their real-world consequences in mind. Borrowing from her book Hello World: Being Human in the Age of Algorithms (2018), Fry’s presentation synthesises fascinating studies, entertaining anecdotes and her own personal experiences to build a compelling argument for how we ought to think about algorithms if we’d like them to amplify – and not erode – our humanity.

Video by The Royal Institution

Algorithms are sensitive. People are specific. We should exploit their respective strengths | Aeon Videos

The joy of being animal (by Melanie Challenger)

Human exceptionalism is dead: for the sake of our own happiness and the planet we should embrace our true animal nature

When I visited my grandmother at the undertakers, an hour or so before her funeral, I was struck by how different death is from sleep. A sleeping individual shimmers with fractional movements. The dead seem to rest in paused animation, so still they look smaller than in life. It’s almost impossible not to feel as if something very like the soul is no longer present. Yet my grandmother had also died with Alzheimer’s. Even in life, something of who she was had begun to abandon her. And I wondered, as her memories vanished, had she become a little less herself, a little less human?

These end-of-life stages prick our imaginations. They confront us with some unsettling ideas. We don’t like to face the possibility that irreversible biological processes in our bodies can snuff out the stunning light of our individual experience. We prefer to deny our bodies altogether, and push away the dark tendrils of a living world we fear. The trouble for us is that this story – that we aren’t really our bodies but some special, separate ‘thing’ – has made a muddle of reality. Problems flow from the notion that we’re split between a superior human half and the inferior, mortal body of an animal. In short, we’ve come to believe that our bodies and their feelings are a lesser kind of existence. But what if we’re wrong? What if all parts of us, including our minds, are deeply biological, and our physical experiences are far more meaningful and richer than we’ve been willing to accept?

As far as we know, early hunter-gatherer animist societies saw spirit everywhere. All life possessed a special, non-physical essence. In European classical thought, many also believed that every living thing had a soul. But souls were graded. Humans were thought to have a superior soul within a hierarchy. By the time of theologians such as the Italian Dominican friar and philosopher Thomas Aquinas, in the 13th century, this soulful view of life had retreated, leaving humans the only creature still in possession of an immortal one. As beings with a unique soul, we were more than mere animals. Our lives were set on a path to salvation. Life was now a great chain of being, with only the angels and God above us.

But, as the Middle Ages came to a close in the 16th century, a fresh, apparently rational form of exceptionalism began to spread. The origins for this shift lie in the thinking of René Descartes, who gave the world a new version of dualism. Descartes argued that thought is so different from the physical, machine-like substance of the body that we should see humans as having two parts: the thoughtful mind and the thoughtless, physical body. This was religion refocused through a rational lens. The division between humans and the rest of nature was no longer the soul – or, at least, not only the soul – but rather our intellectual capabilities: our reason, our moral sensibilities, our gifts for abstraction. He assumed, of course, that other animals don’t think.

Enlightenment figures such as John Locke and Immanuel Kant in the 17th and 18th centuries developed this further. According to them, it was the fruits of our intelligence that made us truly human. Through mental powers, humans live more meaningfully than other beings. In other words, we humans have a soulful mind. It was even suggested that we are our thoughts, and that these phantasmal mental aspects of humans are more important and even, daringly, separable from the impoverished biology that we share with other animals.

In many ways, Darwinism posed a threat to this intensifying vision of the human and our place in nature. Charles Darwin disrupted both the idea of a neat divide between humans and other forms of life, and also complicated the possibilities for mind-body dualism. If humans had evolved from earlier, ancestral primates, then our minds, too, must have emerged through ordinary, evolutionary processes with deep roots in nature. It’s easy to forget today just how shattering Darwinism was for a whole generation. Darwin himself wrote to his friend, the American botanist Asa Gray, to express his acute fear of seeing humans as a fully integrated part of a seemingly amoral natural world, where there’s ‘too much misery’. Perhaps we shouldn’t be surprised, then, to find the redoubling of efforts to assert new forms of human redemption in the years after publication of On the Origin of Species (1859).

One such effort came in the form of the ‘human revolution’ – the idea that some kind of cognitive leap took place in the recent evolution of Homo sapiens that forever split us from other species. Another was in the 20th-century reworking of Enlightenment humanism that sought to find scientific proofs of human exception, and to argue that only these ultimately matter. Modern humanism promised to be about the ‘complete realisation of human personality’ in an onward march ‘to move farther into space and perhaps inhabit other planets’. The history of global philosophy on humans and other animals might be mischievously summarised as a long study in mental bias.

Having a humanlike mind has become a moral dividing line

Today, our thinking has shifted along with scientific evidence, incorporating the genetic insights of the past century. We now know we’re animals, related to all other life on our planet. We’ve also learned much about cognition, including the uneasy separation between instinct and intention, and the investment of the whole body in thought and action. As such, we might expect attitudes to have changed. But that isn’t the case. We still live with the belief that humans, in some essential way, aren’t really animals. We still cling to the possibility that there’s something extrabiological that delivers us from the troubling state of being an organism trapped by flesh and death. In the words of the philosopher Derek Parfit, ‘the body below the neck is not an essential part of us.’ Many of us still deny that human actions are the result of our animal being, instead maintaining that they’re the manifestation of reason. We think our world into being. And that’s sometimes true. The trouble comes when we think our thoughts are our being.

There are real-world consequences to these ideas. Having a humanlike mind has become a moral dividing line. In our courts, we determine what we can and can’t do to other sentient beings on the basis of whether or not they possess a mind with features like ours. Those things that look too disturbingly body-centred, like impulse or agency, regardless of their outcomes or role in flourishing, are viewed as lower down on the moral scale. Meanwhile, the view that physical, animal properties (many of which we share with other species) have little significance has left us with the absurd idea that we can live without our bodies. So it is that we pursue biological enhancement in search of the true essence of our humanity. Some of the world’s largest biotech companies are developing not only artificial forms of intelligence but brain-machine interfaces in the hope that we might one day achieve superintelligence or even mental immortality by downloading our minds into a synthetic form. It follows that our bodies, our flesh and our feelings – from laughing with our friends to listening to music to cuddling our children – can be seen as a threat to this paradigm.

Why is this animal-denialism so entrenched in the human psyche, across cultures and millennia of time? The orthodox (if, still, speculative) story from evolutionary biology, as suggested by figures such as the American zoologist Richard Alexander in the 1970s, is that our subjective, imaginative mind has its origins in a bundle of adaptations for social cognition. As primates who lived in groups, our ancestors needed one another to survive; yet their social environment was also competitive, and that tension between cooperation and rivalry favoured the kind of cognition that gives us our sense of ‘me’. Add to that the need to gather insight into our own motivations and those of others, and to incorporate a rich, layered memory of experience, and we’re left with a staggering attention to internal states and external stimuli – the exact flavour of which consciousness researchers endlessly battle about. These many biological routes to attention gift us our selfhood.

Unfortunately for us, this self-salience has left us with the bizarre sensation that who we really are is some kind of floating mind, our identity a kind of thinking, or rather, a thinking about thinking, rather than the whole feeling, sensing, sometimes instinctual colony of cells that makes up the entire unit of our animal being. Our selfhood gives rise to the sensation that we’re a thing trapped inside a body. And we can speculate that several things flow from this. We have a heightened awareness of the threats we face as animals – not least an awareness that we’ll die one day. As W B Yeats put it in his poem ‘Sailing to Byzantium’ (1928), we are ‘fastened to a dying animal’. And, because we feel as if we’re somehow more than our bodies, we’re reassured that we can escape the frightening limits of our flesh. In other words, our sensation of mental distinctiveness becomes our hope for salvation.

But humans don’t only have selfhood – we also have insight into other selves. By making sound judgments about the internal states of others and communicating as we do, we have an extraordinary network of exchanges available to us. When the pathogen SARS-CoV-2 hit our societies causing havoc and heartbreak, we nonetheless had systems of communication, healthcare infrastructure, and methods for understanding the virus that would be impossible without the biological mechanisms that encourage us to work together for shared benefit. One of the glues of our cooperation is our ability to think into and make judgments about each other’s minds, experiences and intentions.

The ideas that bring us together have physical consequences

There’s still more evidence for the adaptive nature of cooperation in something we’ve come to call ‘social buffering’. This is the way that measurable stress can be reduced by proximity to a member of our community. Closeness and good relationships affect our wellbeing, modulating the release of stress hormones such as cortisol that can suppress our immune systems. A hug, the holding of a partner’s hands during a tough situation, access to our group in times of stress – all these create measurable effects on our health, and improve our ability to cope with the knocks of life. These benefits accumulate across a lifetime and have been found elsewhere among group-living animals, especially mammals.

But it’s a little more complicated for an animal like us. We use ideas to bring about the kinds of buffering responses created by our relationships. Where other social animals gain support by physical proximity to a relative or group-member, humans gain this through psychological proximity as well. In other words, the ideas that bring us together have physical consequences. This buffering is active for any worldview or ideology that facilitates group-belonging – and that might be something as innocuous as a local football team.

But there’s a little twist in this tale.

Evidence shows that social buffering often involves ranking the minds and skills of our own group (including the largest such group, Homo sapiens itself) as higher than those of others. Research by the Dutch psychologist Carsten De Dreu has revealed how some of the beliefs about the superior mental content of our own groups affect oxytocin, reinforcing our bonds with each other and increasing our commitment to the thoughts and feelings of our compatriots. Elsewhere, the work of the Italian psychologist Jeroen Vaes has demonstrated how fears and dangers prompt people to renew their group bonds, and this includes seeing group members as more human than those outside the group (with ‘more human’ measured as individuals with higher intelligence and greater signals of secondary emotions such as empathy or pride). In other words, we see the minds of our own group as superior to the minds of those on the outside, and when we want to reinforce that – especially if we feel under threat – we increase our beliefs in the superior judgment of our own centre of belonging, and can denigrate anyone or anything that contradicts this. While that ‘group’ is often the culture or ideology with which we identify, for humans that group can also simply be us.

Intriguingly, recent studies have shown that the idea that we’re not really animals – and especially the idea that humans are hierarchically superior forms of life – is one of those profoundly reassuring ideas that we favour. Nour Kteily, a researcher at Northwestern University in Illinois, studies the ways that groups of people interact. He has developed the ‘ascent of man measure’, which exploits the progressive idea of humans rising to the top of a biological hierarchy. What he found is that stress or the presence of threats can prime us to favour human uniqueness. This generates a curious paradox, of course, if we view being animal as a threat in and of itself.

So why does this matter now? Nobody is denying that humans are exceptional. The concept of human uniqueness is only a problem when we deny the beauty and necessity both of our animal lives and the lives of other animals. No matter whether our origin stories tell us we’re possessors of spiritual properties or our courts tell us we’re ‘persons’ with dignity, we privilege the transcendent over the physical. The root word for ‘exception’ is the Latin excipere, which means ‘to take out’. We have always longed to be saved, to be ‘taken out’ from what we dislike or fear of our animal condition. But the pursuit of escape becomes more serious once we have powerful technologies to engineer and exploit biology.

These days, there is substantial investment in different technical routes to escape the limits or dangers of being animal, whether through DNA repair or stem-cell treatments or the transfer of more and more of ourselves to synthetic or machine forms. Google, Amazon and Elon Musk’s Neuralink are just three of the major corporations working in some of these areas. These are all part of a general trend to control and technologise more and more of our animal life. But, in seeking ways to enhance ourselves, people rarely acknowledge what we’d be leaving behind. As we start to use these new powers, it’s imperative that we dwell on what we stand to lose. The point here is not to argue that we ought to act as animals but rather that we are animals, and that a huge amount of the quality of our experience lies in a fully embodied animal life.

Some of the most important stages of life happen in the womb and in the early bonds with our carers in the weeks and months after birth. And the quality of those bonds and the wellbeing of our mothers can have lasting effects on us and the people we come to be. As the Israeli psychologist Ruth Feldman has written: ‘Later attachments … repurpose the basic machinery established by the mother-offspring bond during early “sensitive periods”.’ These crucial years in human development involve crosstalk between hormones, environment and touch that influence how the baby’s neural networks are organised. The central nervous system, the resilience to stress, all bear the marks of the early, deeply embodied years of our lives. When a parent and child embrace, the effects are staggering, regulating body temperature, heart rate and respiration. People in a temporary alliance, whether queer or straight, old or young, conservative or liberal, synch in ways that are measurable, from hormonal shifts to oscillations of the gamma and alpha rhythms of our brains, and these nourishing alliances reservice those first intimate, mammalian bonds.

Far from being solely the product of our brains and self-direction, then, humans are intimately affected by their whole physical being and its environment. Some devastating evidence for this comes from the children of Romania’s orphanages, who were abandoned with little physical or sensory affection amid the cruelties and excesses of the regime of the country’s leader, Nicolae Ceaușescu. This neglect left them with lifelong struggles, extending to language delays and visual-spatial disruption. These are painful reminders that our ability to flourish and express ourselves is profoundly influenced by the way our bodies are treated and valued in the earliest stages of our lifecycle.

Time online crowds out time spent in physical contact with others and in contact with the physical world

And that should matter to those who seek ways to define what’s important about human life. There’s the exciting possibility to some in the research and business community that we might soon exploit new biotechnologies such as the CRISPR gene and genome-editing tool, or have access to a kind of embryo ‘agriculture’ through frontier reproductive tools such as in-vitro gametogenesis (a technology that could reprogram a cell such as a skin cell into a stem cell, and from there into egg or sperm), and thereby select the most disease-resistant or intelligence-scored embryos as our children. It’s of note, however, that the pursuit of this would disrupt and industrialise human life from its inception. In other words, we wouldn’t make our babies through sex, and nurturing might be done by engineering rather than by love and touch.

In less dramatic disruptions, we’re increasingly turning over our lives to our smartphones, with little attention paid to how our whole bodies influence who we become and how we thrive. The fact that children learn better through physical movement and gesture, including in language acquisition, is ignored by those who want to operationalise teaching online. The research community is troublingly divided on how time online affects our mental and physical wellbeing. One study from a reputable source announces that social media doesn’t affect mental health, while another, equally reputable, provides convincing evidence that it does. But a little common sense is useful here. What we can say for sure is that time online crowds out time spent in physical contact with others and in contact with the physical world. Only a belief that our animal lives are somehow less important than our mental lives can allow us to minimise what that reduction of our physical experience might mean.

And this is to say nothing of what our denigration of being animal means for the other animals. We have spent thousands of years arguing that we’re the moral overlords of our world. That’s looking harder to justify now that we’re the agents of extinction and pollution. For centuries, we have tamped down these contradictions. But it’s no longer possible to ignore the long shadow we cast. Mammal sizes have been shrinking on our watch, and are now the smallest they’ve been since dinosaurs roamed the Earth. Our planet’s biomass of mammals now breaks down into a mere 4 per cent wild species, around 30 per cent humans, and the rest are animals we produce for food.

Of course, as we explore our animal being, we’re confronted by the inconvenient possibility that these animals that are disappearing have worlds of experience that ought to press on our moral circuitry far harder than we’ve allowed up to now. Life on Earth is full of diverse forms of intelligence and purpose. We’re only at the beginning of scientific discoveries about the way memory and intentions grip animal bodies from tip to claw. Eventually, we’re going to have to reckon with the true complexity of the other lives that surround us. The more we learn about other animals, the more we recognise other experiences that ought to matter if, by this logic, our own do.

It might well be in the rallying of our own bodily resources that our greatest opportunities lie. When we reconsider all that we gain by being animals, we’re confronted by some powerful resources for positive change. Just think of the gobsmacking beauty of bonding. If you have a dog beside you as you read this, bend down, look into her eyes, and stroke her. Via the hypothalamus inside your body, oxytocin and dopamine – organic chemicals implicated in animal bonding – will get to work and, before you know it, you’ll be feeling good, even in the dark times of a pandemic. And, as it happens, so will your dog, who will experience a similar physical response to the bond between you both. Oxytocin is produced in the hypothalamus of all mammals. In other words, our bodies might well be our best and most effective tool in the effort to strike a new balance between humans and the rest of the living world. If we can tip ourselves more into a bonding frame of mind, we might find it easier to recognise the beauty and intelligence that we’re hellbent on destroying. By accepting that we’re animals too, we create the opportunity to think about how we might play to the strengths of our evolutionary legacies in ways that we all stand to gain from. If we can build a better relationship with our own reality and, indeed, a better relationship with other animals, we’ll be on the road to recovery.

To be fully human, we must also be fully embodied animal | Aeon Essays