From Socialist Register 2002, pp. 229–244.
Two hundred years ago the personal columns of newspapers read by the literate middle class stressed the most general social and personal characteristics of the advertiser and of the prospective respondent. From The Times (London), Tuesday, 15 December, 1801: ‘Gentleman (a Bachelor), about 26 ... man of good property, agreeable person, and in an old-established profitable Business ... Any Lady (Widow or Spinster) not exceeding 30 years of age’; or, from The Times, Wednesday, 28 December, 1803: ‘A Tradesman, in a pleasant part of London wishes to meet with a Partner for Life ... an agreeable, prudent Person; a Widow would not be objected to if her age did not much exceed his own which is under 30. Some fortune is expected’.
Today the typical advertiser concentrates on his or her singular characteristics, idiosyncratic proclivities, desires, interests, passions, hates and so on, and those of the target reader. Thus, from a recent issue of The New York Review: ‘loving good music, cats, nature’, ‘great listener/ great lover’, ‘seeking non-smoking, dynamic, sensitive, Jewish male 40–55’, ‘gay’, ‘deep feelings for nature and expressionistic art’, ‘socially/environmentally responsible, enjoys long walks, the arts, travel, mountains, skiing’, ‘loves to travel, go for walks, and have good conversations’, ‘attracted to Moroccan pillows’.
In two hundred years personal advertisers made the passage from representing themselves in terms of a wider social identity (a lineage or family, an occupation), through presenting themselves as individual personalities, increasingly free of a social dimension, to, finally, presenting themselves as a collection of attributes or characteristics without a unifying principle. The transition marked a momentous change in the socio-psychological landscape of market society as it lurched into its industrial-urban phase.
Curious things happened in the course of the eighteenth century in Europe, England in particular. The impersonal, single-shot, single-purpose transaction between strangers which constitutes the essence of a market broke free from age-old constraints, and took over economic life. God began to be approached on His day, not on any other day; the person one married was no longer a person one lived near; the space between people became more real than the people themselves; an ‘enough’ society gave way to a ‘more’, or ‘never quite enough’ one.
The transition was catastrophic for most of those who experienced it. Their family and social networks were torn apart. They were detached from fixed locations and clear identities. Their material life deteriorated. Their moral world, their understandings, their sense of self were damaged, often beyond repair. For a few the changes amounted to an anastrophe (‘a coming together of disparate elements to form a coherent, connected whole’). For them a new, living creation emerged – the modern capitalist market system. Driven by the interactions between people, rather than by individual natures, this society became free to pursue its logic to the limits of human capacity, and beyond. It created ever-rising standards of performance, physical and mental – standards that few achieve, that some approach, that many aspire to but fail to attain.
True, the vast mass of Market Being that resulted – average Market Beings – are better fed, better protected against disease, more knowledgeable, able to accomplish more than is conceivable in any other society, past or present. The exceptional individual in our midst would appear little short of superhuman anywhere else. But that is not the whole story. Even the most remarkable of individuals could not accomplish one iota of the achievements routinely expected of ordinary Market Being – in terms of strength, speed, agility, consistency, concentration – without help.
These miracles are possible because human capabilities are augmented by machines – by desktop computers that compute at three million times the speed of a numerically competent human, tirelessly and with (mostly) perfect reliability; cars that typically carry their occupants 20–30 times faster than they can walk – without tiring; airplanes that multiply that multiple by a factor of 8 or 10. At best a human can lift 50 kilos one metre, compared to an earth-moving machine’s 2,700 kilos and seven metres in the same time; a fighter can kill a handful where a nuclear weapon can wipe out millions.
But these machines really confirm people in their deficiencies, encouraging them to settle for what they can do without effort, and result in the kind of dependence that reduced the American GI in Vietnam to impotence: can’t move without the chopper, can’t see without the glasses, can’t smell without the sniffer, can’t hear without the booster, can’t sense without apparatus, adjuncts, aids. In short, can’t live without a vast supply train – the pills, the pre-cooked food, the ice cream. For the mass of people machines widen the gap between the possible and the probable. What people can accomplish themselves loses importance. People lose coherence, find it easy to dissociate persona from person or even to replace a vestigial person with many personas.
Consider the entertainment business, where self-estrangement and self-thingification reach an apogee. Hear Arnold Schwarzenegger – who redesigned his body, punishingly, over many years – speaking about himself (if ‘self’ can be said to come into it):
I know one thing. I have been very fortunate that every year the public’s interest in me and in my films has gone up, whatever is on the market. If it is magazine covers, they sell better this year than last year, and last year they sold better than the year before, so there is increase all the time. When you look at the last five years there has been a steady increase. We hope it will continue like that ... I really only had a plan to creating and publicizing myself... I went to a lot of acting classes, voice classes, accent removal classes and on and on and on. I mean I covered myself really well.
The effort clearly paid off. By the mid-to-late nineties the fees demanded by his agents started approaching the GDP of a small country.
Or look at Michael Jackson, constantly recreated at vast expense, constantly remodelled, constantly mutating. Supported by a retinue of plastic surgeons, dermatologists, hair specialists, and other personscape artists, he produced a supremely marketable product, neither black nor white, neither man nor woman; ageless, with narrowed nose, straightened hair, sculpted eyebrows, bleached skin – an embodiment of fantasy, and a far cry from the rather appealing little boy launched into superstardom in the early seventies.
Schwarzenegger and Jackson are, however, only highly visible exponents of a culture in which we permit ourselves to be treated like commodities in the hope that we may, one day, be treated like valuable commodities, a culture which encourages extreme professionalization, a slide from participation to spectacle, and the use of drugs to enhance (and recover from) performance.
It is a culture that supports an army of counsellors and therapists devoted to creating ‘new expectations’ and to persuading people to ‘rethink their self-images’; that has spread psychometric testing, adapted from educational, military and clinical uses, to selecting applicants for almost any job; that rewards behavioural consultants with six-figure salaries for helping managers handle workplace relationships by projecting an acceptable image; that brings fashion consultants and sartorial psychologists to match dress to station (rather than to self); which divorces behaviour from feeling (as in Tokyo’s smile classes); which supports a multi-billion dollar market for cosmetic surgery, cosmetic products, beauty salons, health farms, fitness centres, body shapers, exercise instructors, cosmetic prostheses and so on, each with its magazines, manuals, treatises and experts; and which brings a third of all adults in Britain (41 percent of women) to try slimming at any one time. Altogether the personal packaging business, the business of presenting a self-other-than-self, of separating persona from person, of constructing a presence distinct from the essence, accounts for perhaps $200 billion in sales worldwide. And this is apart from the huge sums spent on clothing, housing and generally providing this persona with its accoutrements.
It is a culture which demands of the body, particularly the female body, heroic feats of adaptation. The ideal figure, the one composed by fashion designers on their drawing boards, is one that is impossible in nature. Magazine art departments elongate necks, enhance breasts, narrow waists. And where depiction is not enough, the plastic surgeon is invoked: $3 billion a year was spent on some 2 million cosmetic procedures in the US alone in the early nineties, keeping as many as 15,000 cosmetic surgeons in relative luxury. In the early 1960s there were 108 plastic surgery clinics in Tokyo, serving 200,000 women a year. Today South Africa is bolstering its currency with ‘scalpel safaris’, in which women from Europe fly in, have eyes and other parts lifted, get a tan while watching a wild beast or two in a national park, and return home – all for one third of the cost of the procedure (minus tan and tour) at home. A recent development in the field is the adjustable breast implant, into which silicone can be added or siphoned out, according to the dictates of fashion or the tastes of the current partner in a woman’s life.
The sense of self rests, ultimately, on three bases, or pillars, which the death of subjectivity undermines.
First, it depends on a broad, balanced understanding of the social and physical world, and of one’s place in it – an understanding which can withstand the test of performance, which can therefore be self-reinforcing and cumulative. This has been seriously weakened by the narrowing of cognitive modes in market society. Of the four major modes of understanding on which our sense of self is based – the scientific, the religious, the intuitive, and the sensuous – only the scientific has triumphed and been dignified. Exhausted it may be, too weak to carry the full weight of traffic between society and its environment, or between the individuals who make up society, it is nevertheless active, powerful, and the more imposing because of its rivals’ even greater decline.
The second pillar of the self is the sense, based on accomplishment, of being an autonomous agent in the world. That sense is not encouraged by current arrangements. Natural events – puberty, ageing, change, death and bereavement – have become the province of specialists, consultants, counsellors. Every organ, every biological event, every infection has its experts; health comes to depend on drugs and specialist care. Life is medicalized; health, illness, pregnancy, birth, sexuality, death, eating and drinking, are standardized. We lose our capacity to take charge of our own physical state and to face the events and trials of our own existence. We don’t recover; we are cured. We don’t die in the natural course of life, but are carried off by a disease in hospital, and then often only with the doctor’s permission. Death ceases to be for amateurs: it’s a professional business. You don’t control even the passages of your life: surgery takes over the function of the fallopian tubes and womb-management increases in scope.
This passivity is reinforced by the disconnection of most work from a direct relationship with nature and by its migration from the household. For most people work takes place in a dedicated workplace, separate from social and cultural life (which also becomes the business of specialists), separate from the household and the wider community which it sustains, separate from everything but personal survival. Dependency supplants agency. Loss of control over work, its culture and hygiene, leads to a desire to avoid or escape from it and, by a natural progression, to the development, by specialists, of ‘objective’, ‘scientific’ criteria for ‘justified’ idleness and unjustified ‘malingering’.
Medicine aligns itself with school, army, and prisons, in producing individuals adjusted, by chemical means if need be, to the social roles assigned them: company doctors steer workers back to jobs they cannot abide; prison and army psychiatrists try to make prisoners and troops adjust to inhuman conditions. The loss of agency leads to the treatment of symptoms rather than underlying causes, and converts medical care into, ultimately, a licensed purveyor of normative Market Being.
The third pillar of self – the third condition for self to exist – is that it be recognized by others. It cannot exist without relatedness, in two senses. First, to be aware of one’s self is to see something distinct from its physical or cultural surroundings, capable of a changing relationship; something in motion, with an orientation which might be changing at a different pace or in a different direction from that of its surroundings. That orientation forms the substance of all our feelings of meaningfulness – the deepest substratum of ethics. A condition of self – as every physician, counsellor or priest knows – is having ‘something to live for’. Second, the self cannot exist in a void – it is formed by bonding to others from birth (gradually enlarging the ‘I’ into a ‘we’ which includes the ‘I’, but is distinct from it). It requires a secure base, derived in large measure from the relative permanence of our bonds-folk and our surroundings.
One source of the strain on the pillars of self is the pervasive uncertainty about where one stands, a relativism which, at best, turns moral life into an optional extra, and at worst, into a void, empty of purpose, experienced as a lack of reference to what is good, significant, or meaningful. In our world, activities may have purpose, but life no longer has. Individual lives do not assimilate different conducts, or make sense as part of a larger entity; they are no longer a focus of meaning. Nor is there a single entity they might represent. You might be a paid up member of a family, a loyal colleague at work, a dedicated fan or hobbyist, a good neighbour – a multitude of things each of which is enmeshed in a larger entity. But there is no compelling reason for all these connexions to coalesce.
Another pressure undermining the sense of self comes from the transformation of intention into behaviour, of purpose into law, whereby the individual is relieved of responsibility for outcomes so long as he or she conforms.
The self has also suffered from increasingly rapid changes in social and physical circumstances, which turn attachment into contact – a superficial, evanescent thing; from the narrowing of family relationships to the conjugal or sub-conjugal family; from urbanization and the loss of direct links with nature, which have dulled our sense of the uniqueness and variety of individual constitutions, and replaced our easy acceptance of variety with seeing it as something discordant, unnerving (and which have also promoted a slide from strong associations through mutual stimulation, to the weak associations of agglomeration and contiguity).
The weakening of self feeds back to the social detachment and disconnectedness which are its chief cause. It occurs most easily when the individual is denied the repeated relationships and exchanges of community life, and is propelled into a series of mobile, changing, revocable associations designed for highly specific ends – a series of partial relationships.
In these circumstances, other people’s motives cannot be inferred with conviction; what happens, while apparently being the intended action of a human agent, is often unintelligible and non-negotiable. We do not know what to make of it, how to respond to it, how to distinguish between human responsibility and the nature of things. It is a world of bafflement and dissociation. It is also a world of rage, of watchfulness, of dulled enthusiasms, suspicion, pessimism, defensiveness and quickness to take offence. In pre-capitalist times people knew what other people thought or believed or saw or heard – they shared common values and assumptions, common experiences and purposes. Nowadays you don’t know without asking. Consumer surveys, opinion polls, elections, referenda have taken the place of shared awareness. The basis for relatedness has moved from the unmediated and spontaneous to the external and formally-structured.
One of the ways personal detachment expresses itself is in criminal behaviour. People with little or no stake in society do not feel bound by its rules, and may even have little idea of what they are. A more general expression is the way people typically behave in an urban setting. Even in a small suburb, like Nassau County in New York, one can meet 11,000 people on the street within a ten-minute radius; in mid-town Manhattan the figure climbs to 220,000. In these conditions personal contacts become narrow and superficial, reduced to formulae, measured in terms of their cost in psychic energy, protected by inhibitory screens. Norms of non-involvement evolve to cope with the threatened sensory overload (norms which also permit greater tolerance of the unusual than is likely to be found in a small community). At the extreme this moral atonality allows indifference to homicide, or homicidal indifference, to flourish, as when neighbours hear or even watch murder or rape without doing anything about it.
Ironically, the cultivation of non-involvement leads to understimulation and attempts to create artificially, and within a controlled sphere, the stimuli which it denied in the first place. So urban dwellers manufacture unnecessary problems which they then have to, and can, solve, in games such as Scruples; they overreact to normal stimuli by compulsive intimacy or confessionalism.
Another expression of personal detachment is the conversion of normal activities into specialist ‘recreational’ pursuits – compulsive gratification unattenuated by contextual restraints.
Sex, stripped of an emotional and social setting, becomes a mechanical pastime, repeated to the limits of physical capacity, drained of content as a means of communication or propagation. Eating becomes detached from energy requirements and, at the extreme, from social intercourse, and turns into ingestion or ‘grazing’. Travel ceases to be a purposeful activity, used ‘to regulate imagination by reality’ (as enjoined by Dr Johnson), and becomes an accumulation of experiences divorced from everyday life – time taken out of real time: most cruises no longer have much to do with going anywhere in particular. Each activity becomes a uniform, monotonic pursuit, unrelated to others, a single thread, attracting measurement rather than appreciation, not part of a web.
A common companion of personal detachment, or loneliness, is gregariousness – being amongst, but not with, others, relating to an outside agent in common, not to each other. It is encouraged by, and sustains, the gigantism of modern urban facilities – enormous astrodomes and arenas, huge rallies organized for pop concerts or politics or sport. Often it is only when these gigantic facilities break down that social intercourse takes place. Gregariousness substitutes a spurious identity – nation, class, occupation, belief – for personal identity. For too many people ‘I am I’ is replaced by ‘I am as you wish me to be’.
The impairment of spontaneous, unforced relatedness has not gone without response. Large hotel chains have automated check-in kiosks and room service in order to eliminate contacts between guests and staff. Banks are beginning to charge account holders for communicating with a human being rather than an automated telephone system or an ATM. A Japanese theatrical agency runs a stable of amateur actors who play the parts of friends, colleagues, bosses, university professors and teachers – even siblings and parents – at weddings: fake colleagues with no speeches to give other than deflecting inquisitive questions begin at around 30,000 yen (£190) plus travel expenses. ‘Old friends of the family’, who might have to propose toasts or deliver long, sentimental reminiscences, might cost ten or twenty times more. In China, ‘apology companies’ are sprouting in the spaces emerging between the ever-present bureaucratic cells. Charities and voluntary organizations everywhere attach themselves to the lonely and frightened with help-lines (‘Contemplating suicide? Contact the Samaritans – 1-800 784 2433 – or a local crisis centre’), soup kitchens, Christmas treats and so on. The clergy trundle through hospitals which also deploy their own social workers. Mandatory altruism in the form of community service is proposed for high schools in Texas as a condition of graduation. The Japanese Ministry of Labour publishes advice on how to avoid becoming lonely.
The most extreme form of dissociation is estrangement from self, or self-alienation. People who suffer from it experience themselves as they experience things, with the senses, but without a sense of self. They do not feel they are the subject of their experiences, their thoughts, their feelings, their decisions, their judgments, their actions. They do not feel they are the centre of their world, creators of their own acts. They do not have a sense of wholeness and proven agency that is their own, not others’. They are as out of touch with themselves as they are with others. Alienation at work is a core element of this inside-out experience. As mechanical power took over from human power and standardized machine production replaced craft production, the deep-seated need to see oneself and one’s abilities reflected in the things one produced was progressively denied, and with it the affirmation of identity. One became an anonymous part of an external production system, a disposable, interchangeable appendage.
Alienation is not a natural condition. At the extreme, it is pathological, denying the social nature of the human species and its cultural development, threatening the very existence of those affected. It is at least suggestive, not only of changes in perception and fashion amongst psychotherapists, but of the underlying change in the predicament of their clients, that the dominant patterns of psychopathology changed markedly over the last century, from hysteria, phobias and fixations, to ‘ego loss’ – a sense of emptiness, flatness, futility, lack of purpose or loss of self-esteem, a pathological disorientation in which people do not know what to think of, or where they stand on, issues of cardinal importance to them. Its signs are all around. There is the enforced isolation, the institutionalized suspicion – living constantly with Big Brother, or rather a lot of little brothers chattering and exchanging information over computer networks, professionally suspicious of you and, by their very presence, stirring up your doubts about others. Public streets and private malls, stores and halls, churches and parks are open to the unblinking eyes of their TV cameras. Many campuses have police officers on patrol three shifts a day, seven days a week. They are beaded with intensified lighting and strewn with emergency telephones that can call up a policeman within ninety seconds. Every large commercial building has its security guards and checks; many have entry passes. Nothing is considered safe, and no one does not have the finger pointing at him or her. Avoid contact; avoid being listed; dig a moat around yourself.
But can you? On a typical day, you might be tracked to work by an intelligent traffic system, your employer might listen, legally, to your telephone conversations, or tap into your computer, e-mail and voice-mail. You may be tracked by the ever-present closed circuit camera when shopping, and have your image stored. There are peepholes in the fitting rooms of clothes stores, and hidden microphones. The supermarket will log information about your purchases if you belong to its buyers’ or loyalty club. Your credit card company keeps tabs on where and what you buy, and sells the information to eager marketers. Tollfree numbers identify a caller’s number even if it is unlisted, and may sell it on. Should you change your job, your new employer may legally obtain your medical history from your insurance company, and your credit history from a credit bureau, ask you to submit to a drug test, a lie-detector test, or a personality test, ask you to disclose the prescription drugs you take and whether you have smoked in the preceding twelve months. If you’re a nanny your new employers might track you with a concealed camera.
To see alienation’s private face, walk down main street anywhere in the world. As likely as not you will encounter a run of Walkman-wearing, eye-shaded individuals, secluded, avoiding contact in the most crowded circumstances. Get on a commuter train, tube or bus, to find the passengers pressed against each other, reading, avoiding eye contact. Shuttered faces. Stop for a moment, observe gait, posture, personal furniture: many, particularly if young, saunter rather than walk; act cool and laid back rather than purposeful; carry cans of Coke, lager, even bottles of Evian, and crisps (‘I eat and I drink whenever I want to, wherever I want to; it’s no business of yours’); telephone on the trot, isolated in crowds, and crowded in their isolation. They look defiant rather than compliant; distant. See the clothes inspired by every style in the world and by none at all. All of which is saying, ‘make of it what you will, but don’t try to rule it. I’ll do as I please. I’m as good as you’.
Dissociation goes farther than the weakening and dissolution of social links. It colours our view of our place in nature. If we are all there is, the final arbiters – encompassing, not coexisting with, all reality; if we inflict such damage on the world as to make it unpredictable; if we threaten to become the largest untamed living beings; if we consider there to be no limits to our powers to define and ‘create’ nature through genetic and other engineering, and our individual fate is only loosely connected with that nature; then the world becomes a lonely place, a vortex of meaninglessness, a source of anguish and of personal dislocation, in which the mind, in weariness or leisure, recurs constantly to its obsessions.
The void in subjectivity and self, and the widening gap between ever-more demanding standards of achievement and improving but increasingly inadequate accomplishments, between ‘achievers’ and others, are teeming with illness and unhappiness. Stress is undoubtedly more pervasive and acute for today’s Market Beings than it was for people in pre-capitalist times. The kinds of stress that we share with them – due to natural disasters, poverty, bereavement – are much less easily dissipated through ritual, religious or other forms of public expression, and the individual is also less protected by community, family, kinship or neighbourhood group. In other words, the stressful life-events might be similar – and even attenuated – but the individual is more exposed to them.
A standard objective measure of stress is the presence of coronary heart disease (CHD). A sudden heart seizure in an apparently healthy person was first reported in the US only in the 1920s. Heart attacks were not even mentioned in most medical textbooks until after World War II. Now cardiovascular diseases cause 30 percent of all deaths worldwide. Although some of the increase might reflect changes in reporting procedures and conventions (which bias diagnosis towards a talked-about disease), the real increase has nonetheless been dramatic, despite new drugs, closer monitoring, and attempts at prevention through changes in lifestyle. Accompanying the increase has been a drop in the social level of patients.
Another objective indicator of stress is obesity, which is spreading in epidemic proportions. In Britain in 1998 21 percent of women and 17 percent of men were classed as sufficiently overweight to seriously endanger their health (compared with 8 percent and 6 percent respectively in 1980), and the proportion of overweight people rose by more than one-sixth to 54 percent of men and 45 percent of women. [1] In the US, more than half of adults are overweight and over three-tenths clinically obese – only arthritis, high blood pressure and diabetes affect more people, and only cigarette smoking causes more deaths.
The most obvious and dramatic of the objective indicators of stress are mental illness and madness. Difficult as it is to plot their epidemiological history (for reasons of definition, diagnosis, official policy and coverage, and for reasons to do with the late emergence of psychiatry as a distinct profession with lobbying power), they are attaining a progressively greater prominence: mental disorders were not included in the World Health Organization’s International Classification of Disease until its 6th Revision in 1948, when they accounted for 10 pages out of 187 (5 percent). The 7th Revision (1957) also had 10 pages devoted to mental disorders (slightly less as a share of the total); the 8th (1967) had 14 pages (3 percent), the 9th (1977) 38 pages (7.4 percent) and the 10th (1992) 76 pages (6.2 percent). By then, the WHO thought it necessary to publish a separate 362-page volume on Mental and Behavioral Disorders, one third as long as the main book. The WHO predicts that clinical depression will be the leading cause of disability and the second leading contributor to the global burden of disease, accounting for 15 percent of the total, by 2020. [2] And British insurers have announced that stress, anxiety and depression have overtaken back problems as the main causes of long-term absenteeism from work.
Beyond mental illness lies madness – a profound, prolonged inability (including refusal) to know and deal in a rational and autonomous way with oneself and one’s social and physical environment. It is unreason, disorder, a condition which the ruling culture of the time finds difficult, for whatever reason, to engage with positively. Modern market society is particularly conducive to madness, or so it seems. In its first phase a large population of misfits found themselves hauled into the Gulag-cum-madhouse of the time. One-tenth of all arrests made in Paris from the mid-seventeenth century to the end of the eighteenth concerned ‘the insane’. As time progressed, those who could not, or would not, adapt to the new system and its work ethic were separated out semi-permanently so that they became, by the nineteenth century, a confined population. Madness became distilled, concentrated, not part of everyday life. It was segregated physically behind bars, morally and emotionally beyond the realm of meaning and control, a demon to be avoided except as an object of fascinated horror.
In time the frontier of madness drew back, but never again to its pre-market society lines, most dramatically from the middle of the last century. Until then the number of involuntary inmates in mental hospitals in the US rose steadily from 183 per hundred thousand population in 1904 to peak at 409 in 1945, putting severe strains on medical budgets. The trend changed direction in the second half of the century when funding was centralized and psychotropic drugs, primarily chlorpromazine (Thorazine), which tamed the recipients (or made them tractable), became available. Far cheaper than bricks and mortar, and cheaper to administer, they paved the way for a privatization and individualization of control. Numbers in institutions fell steeply – to 380 per hundred thousand in 1956 and 194 in 1970. All but the criminally insane were let loose into ‘the community’ – an effective de-institutionalization rate of 91.3 percent from 1955 to 1994. [3] In England and Wales, prescriptions of anti-depressants rose from 27 per thousand male patients in 1994 to 33.3 in 1996 (19.2 to 71.3 for female patients).
Madness is defined with ever-greater fastidiousness: there is mental retardation (physiological impairment); personality disorders (the disposition to behave in abnormal ways present continuously from early adult life); mental disorders (abnormalities of behaviour with a recognizable onset); adjustment disorders (in response to stressful, changed circumstances); and ‘other’ disorders (significantly, abnormalities of sexual preference, drug dependence). The perimeters of madness are under constant review and adjustment, fought over with bitterness and tenacity – it was only after a postal ballot held in 1974 that homosexuality was removed from the American Psychiatric Association’s list of mental illnesses.
At the same time madness is becoming more ambiguous. A realm of conditional sanity is opening up, pacified by the psychotropic bombs that subdue, cow and curtail. There are two issues here. One is of definition: whether to consider the neurotic and, especially, the chemically-constrained, mad or sane. In terms of the culture’s ability to make them function in some sort of concert with the majority, they are sane, if only conditionally. In this sense, psychotropic drugs are a shining example of modern medical culture, enabling people with a culturally-defined deficit to live and function adequately without becoming ill. They also illustrate the supreme vulnerability of a society where normalcy is dependent on the existence of an intricate logistical system to supply and administer the means of maintaining it to a large, and growing, clientele scattered amongst the general, not quite so mad, population.
The other issue is of scope: in the heartlands of the modern world the number of the potentially-mad, as defined by the culture, is growing, but being held down. In the world at large, it is exploding: mental disorders accounted for 10 percent of the world’s disease burden in 1999 (as measured by years of healthy life lost), and are expected by the World Health Organization to account for 15 percent by 2020. [4] In rich countries the cost per unit of madness, the cost of ‘treatment’ or prevention or control per ‘mental patient’, has come down. But world spending on madness, the madness market as it were, has not dipped at all.
Monitoring is sketchy. The social infrastructure is poor. The distinctions between personal and institutional intractability, and between political and subcultural intractability, are blurred. When, on 14 February 1989, Ayatollah Khomeini, then spiritual leader of Iran, issued a fatwa calling on all Muslims to kill Salman Rushdie and his publishers for The Satanic Verses, who was criminally insane? Iran’s ruling circles knew it was Rushdie. Many others thought differently ...
Another view of stress is subjective: people who feel their ability to cope with normal life so impaired that its daily routine is disrupted. They include those who seek medical advice and consume medication, and a larger number who believe themselves to be ill without having recourse to a physician. The number of times people sought professional care for illness more than doubled in the USA last century, from an average of under once a year in the 1920s to more than twice a year in the late 1980s. In Britain the number of consultations with a National Health Service general practitioner rose by half between 1975 and 1996, from 196 million to 294 million. [5] Even more significant is the growing recourse to complementary and alternative medicine (CAM), which is more purely voluntary since it is not tied to the requirements of employers, insurance companies, state agencies, police authorities and so on. In 1997 42.1 percent of Americans resorted to CAM, compared with 33.8 percent in 1990 (an increase of a quarter in just seven years). [6]
Beyond the host of people who seek help publicly and formally there are those, almost certainly greater in number, who do so hidden from view, as part of their normal round, and the even larger number who cope under strain. The ultimate effect of strain is, of course, death. And nowhere is karoshi (death as a result of overwork) more closely monitored than in Japan where, on some lawyers’ estimates, it claims between 10,000 and 30,000 victims a year. In Russia a sharp rise in the number of deaths following the collapse of the Soviet regime is attributed not, as might be expected, to simple poverty or the increasingly parlous state of the post-Soviet health service, but to a ‘psycho-social crisis’ due to greatly rising insecurity. Life expectancy for men at birth was 15 years less in 1995 than in 1983, before the transition, as 20–65-year-olds succumbed to a complex cocktail of homicide, suicide, accidents, falls, burns, drownings and other symptoms of psychological stress and alcoholic bingeing. [7]
There is a huge and growing market for psychotherapy, ranging from psychoanalysis (an intricate, demanding craft), to its simplified, routinized, modular, gadget-ridden but more-easily-marketed derivations – an industry dedicated to reassembling or constructing the self. A lot of it provides little more than comfort. Some of it addresses the underlying condition, and sometimes succeeds in self-realization or (re)assembly. The growth of psychotherapy since the day in 1896 when Freud first hung out his shingle in Vienna has been remarkable, and shows no signs of abating. One survey dating from 1959 listed thirty-six different kinds. Less than twenty years later, in 1977, no fewer than 200 conceptually distinct psychotherapies were noted. The growth in professional activity has been far greater than the growth in the activity as such. Psychological and emotional cures and comforting have always been available, part of normal social intercourse. That is still the case, within functioning families and friendship networks. But it has been extended and partially supplanted by external, professional, marketed versions.
Embracing both the subjective and objective indicators of stress are the epidemics of hysteria that sweep ever more frequently through the market system’s heartlands – from belief in alien abduction to satanic ritual abuse – a ‘cultural symptom of anxiety and stress’ in the words of Elaine Showalter, the phenomenon’s theorist and historian. [8]
‘I’m sick and can’t work’ often stands in for ‘I can’t work and so I’m sick’. Absenteeism from work is a mass phenomenon: in Western Europe alone it embraces around 14 percent (one in seven) of the total number of people not at work for whatever reason. In some countries declared illness accounts for astonishingly high proportions: 60 percent of time lost in the US, 43 percent in Belgium, 35 percent in Germany, 34 percent in Ireland and Portugal, 31 percent in Italy and Holland, 28 percent in France, 25 percent in Canada, 20 percent in the UK and Spain. In the early nineties days lost to sickness in Western Europe averaged 80–90 times the number lost in labour disputes. These figures do not constitute cast-iron evidence: different states define absenteeism and its causes differently and different social security laws and regulations induce different behaviour on the part of workers. The figures also do not fully account for fake or phantom illnesses, logged in order to take up unused entitlements to paid sick leave, to join a public rally, to pursue romance or revenge, to care for children or others, or to slop about and enjoy simply being. There are many, many reasons to avoid work, even in a society that ultimately punishes non-work. On the other hand the figures do not allow either for those who turn up to work feeling ill, but bound by duty, fear, solidarity, circumspection or pressing material need.
If early retirement is considered a form of absenteeism, as surely it must be, when not imposed by mass firing – a meta-absenteeism if you like – the evidence is striking. Labour-force participation of men aged 60–64 has declined from over 80 percent in most rich countries in 1960 to 50 percent in America and below 35 percent in Germany, Italy and France. In Britain eight out of ten employees retire before reaching the normal pension age as defined by their pension-scheme rules, two-thirds before the age of sixty. By contrast, late retirement – after normal pension age – is negligible, even in countries with no compulsory retirement age. [9]
Then there is drug taking. It combines a number of functions, some of them mutually contradictory. On one view it is a promoter of connectedness – a social activity, a pipe of peace, a shared joint, which lowers barriers, removes or reduces inhibition, lubricates a common experience. On another view it is precisely the opposite – an escape from connectedness, a substitute for sociability, a ‘solitary vice’, an avoidance of taking responsibility for one’s own life. On both views it relieves stress. Drug-taking of all sorts is rising. Of course drugs do not enter mouths or veins unassisted. They are pushed in many ways – through advertising (legal drugs) and through personal inducement and example (illegals). There are large numbers of people with a stake in extending the market for each and every one. But that goes for all of them. The important point is that the drugs most indicative of social detachment – which, incidentally, are the most restricted in the marketing methods available to them – are the ones that are spreading fastest.
The supreme subjective indicator of stress is suicide. It lies at one extreme of a range of actions that start with unconscious self-harming behaviour like incautious cutting of vegetables, jay-walking or swimming out of depth, and progress through ‘indirect self-poisoning’ by over-eating or wrong-eating, to deliberate self-harm such as impulsive overdosing without the preparation (of dosage, secrecy) that would ensure ‘success’, to, finally, the successful denouement. It is mostly committed by men, sometimes suffering from a psychiatric disorder, often under great strain from incompatible roles or duties or expectations. Not all suicides are a cry of despair or for help – they can be a response to unbearable physical pain, a fulfilment of a sense of public duty, a quest for martyrdom, or whatever. Significantly, urban suicide rates are higher in areas with relatively large populations of boarding-house residents, immigrants and divorcees, and in the spring and early summer, when more people are flaunting their attractions and thereby underlining the suicide’s isolation.
It is not easy to gauge the prevalence of real suicide – that is, ‘completed’, successful suicide. It is usually planned carefully, with precautions taken against discovery, and often camouflaged so that the death appears to be accidental (traffic accidents, falls, drownings). Recording is haphazard; coroners’ courts in England and their equivalents elsewhere operate under different regimes of stringency for evidence; doctors and officials often connive with families to shift the cause of death to morally safer (and insurance-covered) ground. Changes in reporting practice compound the difficulties. In many countries the true figure, as attested by psychiatrists, is way above the official figure. Suspect as the figures are, the number of recorded suicides is rising steadily, increasing by 122 percent between the early-to-mid 1950s and the mid-to-late 1990s in the twenty-nine countries, comprising a seventh of the world’s population, for which comparisons could be made. [10]
Confused, with a damaged sense of self and ragged connectedness with others, Market Being is, unsurprisingly, ill, uncertain, stressed, inadequate and in need of comfort. At any one time a quarter of the world’s population exhibit signs of physical morbidity and mental distress. And the proportion is growing.
Although not all of it can be attributed directly to market society, the coincidence of its spread and deepening with the increasing prevalence of illness and distress is remarkable. Recorded spending on health care is on the up-and-up: from an (unweighted) average of 3.8 percent of GDP in 1960 to 8.5 percent in 1998 across eighteen rich countries; and from 115 international (PPP) dollars a head to 1,974 dollars in the same years. It rose from 5.1 to 13.6 percent of GDP in the US, from 3.0 to 7.6 in Japan, 6.3 (1970) to 10.6 in Germany, 4.2 to 9.6 in France, 3.6 to 8.4 in Italy, and 3.9 to 6.7 in the UK. Spending per head rose eighty-seven times in Spain in the same years, seventy times in Japan, fifty-three times in Norway, thirty-six times in Italy, thirty-one times in Switzerland, twenty-nine times in the US and France, and twenty times in Britain.
Saturation medication is becoming general: more than half of all adults and nearly one-third of children in Britain take some form of medication every day. In Britain and the US there are as many renewal prescriptions for psychotropic drugs as there are inhabitants. In England the number of medical prescriptions rose from 6.7 per head in 1982 to 9.7 in 1995.
Yet people are not noticeably healthier. In the mid-1970s, the life expectancy of a person over 60 in France was just two years higher than in 1900. For men it remained constant. While life expectancy in the US and Britain has been rising, the proportion of healthy people (alive and not diseased or disabled) has declined – from 81.8 percent to 79.2 percent for men in the US between 1970 and 1980 (from 81 percent to 77.8 percent for women); and from 83.1 to 81.8 percent for men, 81.1 to 79.2 percent for women, 1976 to 1985, in the UK.
In France the mortality rate for fifteen-to-twenty-year-olds is rising at 2 percent per year. In all rich countries, for men aged forty to fifty it rose from the mid-1960s to the mid-1970s. For British workers over fifty it is higher than during the 1930s. In some countries (and not only the ‘usual suspects’ of the former Soviet Union and AIDS-afflicted sub-Saharan Africa) life expectancy at birth has stopped increasing or even entered a decline.
There are several ways to understand the apparent diminishing returns to medical intervention which these statistics suggest. One is that the underlying morbidity of the population is rising faster than medicalization in all its forms (drugs, treatment, hospitalization), because of a deterioration in social circumstances or a decline in human quality, or both. It might be difficult for readers in rich countries to grasp the fact that most people are getting poorer as their environment deteriorates. Since the agents of health and disease are largely social (between 80 and 90 percent of the differences in life expectancy across the world can be explained by the presence of clean drinking water and of literacy), it is understandable that health suffers. In addition, countless studies show that city life is debilitating psychologically, and that migration contributes greatly to psychopathology.
A deterioration also seems to be occurring in the human stock. Modern genetics has highlighted the ubiquity of inborn disease. In Britain about one child in thirty is born with a genetic defect of some kind. Over a third of registered blind people are blind for genetic reasons and more than a half of all cases of severe mental handicap have an inherited cause. If diseases that have an inherited component, such as cancer or heart disease, are included, two thirds of the population will suffer from, and possibly die of, a genetic disease. Advances in medical and public health techniques have permitted debilitating mutations to accumulate through the generations, without being cleansed by natural selection through death in infancy and childhood.
Another way of interpreting the apparent diminishing returns to medical intervention, particularly in the rich countries, is to say that the refinement of medical treatment – its fragmentation into ever-narrower specialisms, its reductionism, its resistance to participation by the patient, the support it lends to the loss of self and to medical dependency – has made it less effective in securing well-being. And it is a fact that a fifth of the patients in a typical research hospital in the US acquire a doctor-induced illness in the course of their treatment. But at a certain point these distinctions divert attention from a more fundamental and obvious truth: it is in the nature of market society to blight a large proportion of its people with mens insana in corpore insano.
This essay is drawn largely from my book, The Presence of the Future: the costs of capitalism and the transition to ecological society, forthcoming 2002.
1. Report from UK National Audit Office, reviewed in The Financial Times, 15 February 2001.
2. See http://www.who.int/whosis.
3. E. Fuller Torrey, Out of the Shadows: Confronting America’s Mental Illness Crisis, New York: John Wiley & Sons, 1997, pp. 206–7.
4. See http://www.who.int/whosis.
5. UK Department of Health, General Medical Services Statistics, England and Wales, 1994, National FSA, Table 4.20.
6. House of Lords, Select Committee on Science and Technology, Complementary and Alternative Medicine, London: The Stationery Office, 2000, Table 2, p. 14.
7. Helen Epstein, ‘Time of Indifference’, The New York Review, 12 April 2001, pp. 36–7.
8. Elaine Showalter, Hystories: Hysterical Epidemics and Modern Culture, London: Picador, 1997.
9. World Resources Institute, UN Environment Program, UN Development Program, World Bank, 1998–99 World Resources, a Guide to the Global Environment, New York and Oxford: Oxford University Press, 1998, data table &.2, pp. 246–7.
10. UN Demographic Yearbook, 1955 (Table 29B), 1966 (Table 20) and 1998 (Table 21).