Updated: Oct 10, 2020
How – as individuals – we can be narcissistic, hyper-connected, radically solitary and broken, sometimes all at once
What are the principal forces that shape the self, and what are the stories we tell ourselves about them? These questions spring from long-standing debates on the prevalence of nature or nurture, free will or determinism, the formative role of childhood experiences, and the more elusive impact of cultural change.
Answers, drawn from sociology, biology, psychology, economics, or, more recently, neuroscience, have always been historically contingent, reflecting the larger concerns of the time – hence Margaret Thatcher’s statement in 1981 that “Economics are the method: the object is to change the soul”. Today, we must consider the extent to which the internet and the various electronic devices and apps on which we rely are altering not just our social habits, but the so-called hard-wiring of our brains, adversely affecting our cognitive powers, stress levels, and ability to empathize. Many fear that the technologies designed to render our lives easier have instead become mental and physical health hazards. Michael Harris, the author of Solitude: In pursuit of a singular life in a crowded world, is far from alone in asking whether the new social media have made us “socially obese – gorged on constant connection but never properly nourished”. It may be increasingly easy to project an image of oneself, but what of the quality of the source? There is a growing body of evidence to suggest that new media render us lonelier, more narcissistic, more depressed and more anxious than our forebears.
The question of the self becomes more disturbing when we consider specific values so widespread and deeply rooted in our culture as to appear natural. In Selfie: How we became so self-obsessed and what it’s doing to us, Will Storr explores the history of an idea at the very heart of Western individualism – that of the infinitely perfectible self. Storr traces the evolution of this premiss from ancient Greek philosophy through Christianity, the Industrial Revolution, the rise of the natural sciences and modern psychology, to Silicon Valley. For him, the “extroverted, slim, individualistic, optimistic, hard-working, popular, socially aware yet high-self-esteeming individual with entrepreneurial guile” has become the prevailing model for the contemporary self, in whose shadow silently weep all those who cannot achieve such an exalted sense of themselves. This rests on the assumption that we are all free and able to shape who we are and what we do. Those who fail are encouraged to judge themselves harshly. As a result, both in the US and in the UK, where Storr bases his study, self-harm, eating disorders, depression, anxiety and body dysmorphia are on the rise.
Storr argues persuasively that the origins of the idea of the perfectible self lie in the ancient Greek notions of citizenship, individual freedom and, particularly, Aristotle’s conception of the human as the political animal. By contrast, in the Far East the dominant model of the self was developed in relation to group harmony. In part, this was owing to the nature of the Chinese landscape and to the agricultural methods adopted to benefit from it: massive river-irrigation and water-conservation projects, as well as large-scale farming schemes (for the cultivation of rice, for example) were only possible through sustained group effort. In Confucian philosophy, these geo-cultural realities found their equivalent in an emphasis on the interconnectedness of all things. Storr contrasts this with some Christian conceptions of interiority and the “pure” inner self, before moving on to the first self-help manuals, in the second half of the nineteenth century, the work of William James and the “mind-cure movement”, with its emphasis on “positive thinking”. Freud is mentioned primarily to illustrate that “most people tend to significantly over-estimate the extent to which others share their feelings and beliefs”: the founder of psychoanalysis concluded that, because he wanted to sleep with his mother, everybody else must want to sleep with theirs as well.
While Freud may have overestimated the significance of the Oedipus complex, Storr in turn underestimates the profound impact of the psychoanalytic paradigm in the twentieth century, and, above all, its role in shaping the phenomena on which he concentrates next: the fast-spreading pop-cultural appeal of humanistic psychology, including the Human Potential Movement and the Esalen Institute in California, which developed from the 1960s counterculture and whose “humanistic alternative education” programmes are going from strength to strength. Instead, Storr focuses on thinkers such as Carl Rogers, Fritz Perls and Will Schutz, who believed in our unlimited capacity to achieve ever better, ever more authentic versions of ourselves, and helped to redefine psychotherapeutic interventions as tools to bring about personal growth rather than recovery from mental illness. Their “be what you are” mantra has become common cultural currency. One of the more bizarre characters aligned with the movement – which has counted Aldous Huxley and Jane Fonda among its disciples – was the Californian politician John Vasconcellos, a Human Potential evangelist who in 1986 managed to secure state funding for a task force to promote self-esteem. Initially a much-mocked initiative, the task force set out to prove that self-esteem is “the vaccine for all social disease”, including child abuse, educational failure, teenage pregnancy, alcohol and drug abuse, welfare dependency, social violence, and criminal activity. This resulted in self-esteem lessons being taught in schools, some of which subsequently chose to prioritize the boosting of self-confidence over the imparting of academic knowledge. Instead of studying maths and English, children in “I’m loveable and capable” T-shirts would spend their days chanting “We’re kids! We’re super-cool! We’re great!”, and be awarded sports trophies simply for participating.
Next, Storr turns to the Russian-born novelist Ayn Rand, whose The Fountainhead (1943) and Atlas Shrugged (1957) are celebrated by neo-liberals for advancing the doctrine of “virtuous selfishness”. Among her many admirers are the former Chairman of the Federal Reserve, Alan Greenspan, and President Trump. The connection between the Californian self-esteem boom in the second half of the twentieth century and the “free minds require a free market” ideology is one of the darker revelations in Storr’s history. Another is the measurable increase in narcissism. In their book The Narcissism Epidemic: Living in the age of entitlement (2009), Jean Twenge and Keith Campbell trace a steep rise since the 1980s in the number of American college students endorsing narcissistic traits (in themselves and others), and show that almost 10 per cent of people in their twenties have experienced “Narcissistic Personality Disorder”. All of which fuels the promotion of self-actualization above group-orientated values, such as caring for the community and, indeed, the planet.
Generation Self-Esteem has given birth to Generation Selfie, and scientists are now investigating the links between hollow forms of self-esteem boosting, parental overpraise and narcissistic behaviour. Most parents would probably agree that shouting “You’re a prodigy! You should be on TV!” when their child comes second to last in a competition is not the best parenting strategy. Yet the fact that a well-known corporation is currently investing millions in the development of a tiny selfie drone that sits on the wrist like a watch ready to fly to the perfect angle is not reassuring. Clearly those in Silicon Valley – themselves raised on the values of self-realization and self-esteem – do not expect the trend to die out any time soon.
The self that overestimates its own importance is closely linked to the self that is digitally hyper-connected, yet paradoxically lonely and cut off from its own deeper selfhood. Solitude, a compelling study of the subtle ways in which modern life and technologies have transformed our behaviour and sense of self, proposes an ancient antidote. Distinguishing between true solitude and “the failed solitude that we call loneliness”, Harris makes a case for being alone as a critically underestimated resource, a fertile state of mind that can engender new ideas, better health and a fuller understanding of ourselves and our relation to others. The strength of Harris’s argument lies in his showing how seemingly harmless new technologies insidiously influence our ways of being, from Google Maps, which denies us the pleasure of those discoveries that occur only when we get lost, to digital games structured by “addictive, nihilistic ludic loops” that erase the potential benefits of time spent alone, and the pernicious taste management by platforms such as Netflix, with their “if you liked x you will also like y” recommendations. There is a self-help dimension to all this, much of it fairly obvious, as Harris proposes ways in which we can discover ourselves within an increasingly digitally connected world, rather than simply turning our back on reality: daydream, get lost, reconnect with nature, disconnect from devices.
Still, for some, perhaps, it really must be all or nothing, and Michael Finkel’s The Stranger in the Woods: The extraordinary story of the last true hermit speaks to that impulse, telling the moving tale of a modern-day recluse. For twenty-seven years, Christopher Knight lived alone in the woods in Maine. He survived by stealing food from neighbouring cabins, and was forty-seven years old when he was finally caught by the police. He spoke just one word during his years in retreat, when he encountered a hiker: “Hi”. The self that shuns all connection with others, renouncing its status as a social being, has long been an object of fascination, finding its first literary treatment as early as the third millennium BC, in the Epic of Gilgamesh. Those who turn their back on civilization tend to fall into one of three categories, Finkel explains: protester, pilgrim, or pursuer. Lao-tzu, Siddhartha Gautama, Jesus, St Antony and Muhammad all spent extended periods alone before they experienced their spiritual epiphanies. In eighteenth-century England, having a hermit on one’s estate became a fashion among the upper classes: advertisements were even placed in newspapers for “ornamental hermits” who were “slack in grooming and willing to sleep in a cave”. The job had its attractions: it was well paid, often included a daily free meal, and contracts frequently lasted as long as seven years. The English aristocracy apparently believed that hermits “radiated kindness and thoughtfulness”. And they were not alone (as it were). Throughout history, hermits have repeatedly been venerated as oracles or sages.
This is not the case with Finkel’s subject. At the age of twenty, Knight abandoned his home and job to begin a new life alone in the woods. The decision was made on impulse. He is unable to describe what he was turning his back on (the internet was not then the all-pervasive presence it is today, and apps had not yet been invented), or what he was looking for. He built a secret camp, which he would perfect over the years, just three minutes away from the nearest cabin. He never lit a fire or left any visible trace of his presence. He survived by stealing food and essential supplies, such as propane gas and batteries, from the cabins in the vicinity. When he was eventually caught, he confessed to 1,000 break-ins. Although he was relatively ethical about what he stole, never taking more than he needed to survive, he inflicted psychological harm on his victims, many of whom felt violated and afraid as a result of the repeated burglaries. He regularly listened to the radio and even owned a television for a while, powering his devices with stolen batteries. He remained neat and clean, had a sweet tooth, liked a drink or two, and read anything he could get his hands on, including pornography.
There was nothing spiritual about this quest; the only reason he gave for quitting civilization was that he preferred to be by himself – which is not to say that he came to know himself better (indeed, Finkel quotes the American Trappist monk Thomas Merton: “the true solitary does not seek himself, but loses himself”). Finkel is unable to unearth any dark secrets or troubled family history, and Knight has no special wisdom to impart. When pressed to reveal his greatest insight, his answer was: “Get enough sleep”. We also learn that Knight’s closest companion was probably a mushroom. He watched it grow over the years, “unhurriedly, wearing a Santa’s hat of snow all winter”, eventually reaching “the size of a dinner plate, striated with black and gray bands”. While Finkel’s account is engrossing – one can imagine Werner Herzog making a film about this strangest of lives – the story is curiously anti-climactic. It is tempting to conclude that Knight had a personality disorder. Several clinical psychologists interviewed by Finkel do precisely this, their diagnoses including Asperger’s, depression, schizoid personality disorder, a damaged amygdala, an oxytocin deficiency, and an imbalance of endorphins. Knight’s model of complete withdrawal from the world – the self that rejects all connectivity – remains problematic. Like the mushroom he venerated, Knight was something of a parasite: he could not survive without drawing on the resources of others, and he did so in a manner that caused them considerable distress.
There are echoes of this in Andrew O’Hagan’s gracefully intelligent The Secret Life: Three true stories, which considers the nature of selfhood in the digital age. While Selfie and Solitude take the form of classic transformation tales – the reporting self departs on a journey of self-discovery and emerges positively altered – O’Hagan’s essays, by contrast, are exercises in style and deliberate genre-crossing, vacillating between reportage, essayistic reflection and what he calls “actuality-seeking”. He charts not positive transformations but forms of failure, two of the three essays in the book being accounts of writing projects that ran aground. The first focuses on Julian Assange, the founder of WikiLeaks, whose autobiography O’Hagan was commissioned to ghost-write; the second, on O’Hagan’s experiment with constructing a false identity in cyberspace; and the third, on Craig Steven Wright, who may or may not be “Satoshi Nakamoto”, the legendary inventor of the crypto-currency Bitcoin. In what the author describes as “bulletins from the edge of modern selfhood”, O’Hagan presents “a few carnivalesque men” who are “bent out of shape – by their pasts, by their ambitions, by their illusions or by me – under the internet’s big tent”. He explores the seamier recesses of the web – hacking, cyber crime, online security and identity theft – and those of the human psyche, with both Assange and Wright appearing to struggle with some kind of personality disorder. The essays capture two “ghosts in the gleaming machine”, individuals who are their own worst enemies and whose selfhood hinges on complex ethical arguments pertaining to questions of freedom, transparency and the powers of the market. This is perhaps nowhere clearer than in the case of Assange, who believes, in principle, in the absolute freedom of information, no matter how sensitive. 
When it came to his autobiography, he both desired and feared its publication, driving forward and then thwarting the process of self-exposure in ways that were both sophisticated and, for O’Hagan, highly frustrating. We gain insight into a brilliant mind that cannot tolerate dissent and criticism, a man with atrocious table manners who spends most of his time raging about perceived traitors (including almost all of his former supporters). For Assange, as for Wright, the sense of self – and self-worth – is dictated by the same digital world for which they designed the parameters. Simultaneously masters and subjects, they not only provide potent fables of the connected age and studies in psychopathology, but also embody the core themes of classical tragedy: hubris and self-punishment.
Modern selfhood can be narcissistic, hyper-connected, radically solitary and broken. It can be all these things at once. These four books confirm that the relationship between the world and the self remains fundamentally dialectical – there is no model of selfhood that can rise above its context. Just as a fuller understanding of selfhood requires us to draw on the findings of a range of disciplines, ancient and modern, in order to appreciate what the French sociologist Edgar Morin terms “human complexity”, so, too, it requires us to reflect on the ways in which selfhood shapes those disciplines and the wider world. Freud famously wrote of the three great wounds inflicted by science on humanity’s “naive self-love”: the first by Copernicus, the second by Darwin, and the third by Freud himself. I have never found any of these intellectual revolutions particularly disconcerting, least of all Freud’s – after all, we seem to be the only animal that insists on throwing spanners into its own works, as is most obviously the case with Assange and Wright. Much more troubling is the more insidious socio-economic determination of our supposedly sovereign selves: as Storr and Harris argue so persuasively, our personal behaviours, aspirations and values are shaped to an ever larger extent by cultural forces that extend far beyond the influence of our parents, teachers and immediate surroundings, and that include technologies and economic models over which we have little if any control. Should, then, sociology – the science that studies the interaction between social structures and individual agency – not be counted as the fourth insult to our sense of sovereign selfhood? 
It is perhaps not surprising to learn that the nature vs nurture debate has recently been revisited by researchers in Queensland, Australia, who have analysed the conclusions of 2,748 papers on the topic and discovered that nurture “wins” – albeit by a margin as narrow as that of the Leave campaign in the Brexit referendum, with its championing of sovereignty, “taking back control”, and defining one’s own borders.