
On Saturday I was part of Cape Town’s SlutWalk, the local manifestation of a global movement which emerged in response to a Toronto policeman’s daft comments, in January this year, about rape and women’s ‘slutty’ choice of clothes. The event was a resounding success: the most fun, friendly, and good-natured march I’ve ever been on. According to the Mail & Guardian and – hurrah! – the Washington Post, about 2,000 people marched from Prestwich Memorial to Green Point stadium. I was struck by the number of men there, and by the range of ages represented among the marchers. (This is my report for FeministsSA.)

The posters were brilliant, and people came dressed in ball gowns, angel wings, bunny ears, leotards, jeans and t-shirts, fishnets and thigh-high boots, and (almost) nothing at all. In many ways, it was a typically Capetonian event: we gathered outside hip Truth Coffee beforehand, and the march began half an hour late. It was also overwhelmingly middle-class and, really, for an anti-rape protest to make any sense in Cape Town, it should have been held in Khayelitsha or Manenberg.
But I don’t want to detract from the success of the event. In particular, I hope that it’ll prove to be the basis for a campaign against street harassment. SlutWalk is, inadvertently, a protest against the constant low-level harassment of women in public spaces. I was, though, deeply unsettled by the vitriol aimed at SlutWalk when it was announced that South African marches were in the offing. Commentators on SlutWalk Cape Town’s Facebook page accused the organisers of being irresponsible and stupid, and of contributing to – rather than solving – the problem of victim blaming.

If anything, those remarks demonstrated the extent to which women are still held responsible for rape. One particularly unpleasant contributor insisted that only one per cent of all reported rapes are ‘genuine’ – the rest, he alleged, are simply made up by women. What many of these angry men (and they were mainly men) had in common was a fear of a group of scantily-clad women marching together in public: a belief that the amount of naked flesh on display would have catastrophic – though, alas, undefined – ramifications for the women on the march.
Another commentator explained that she opposed the event because she prefers women to ‘have a little mystery’ about them. Unfortunately, she didn’t specify if this was to be achieved by wearing false moustaches, speaking in strange foreign accents, or investing in trench coats.
Women’s bodies, argue the anti-SlutWalk brigade, need to be covered and contained. Because female nakedness is usually sexualised, it’s seen as excessive, dangerous, and disruptive. Clothing is, then, one way of controlling women in patriarchal societies. We are told to cover ourselves up for our own good – because our bodies exercise too powerful an influence over terminally suggestible, weak-willed men.
Food is another means of exercising control over women. As I’ve written in the past, the current vogue for cupcakes is partly a product of the fact that they are the acceptable face of feminine eating: they’re small, childlike (indeed, they’re children’s party food), and pretty – like the women who are supposed to eat them. (I should add, for the record, that after SlutWalk my friends and I picnicked on cheesecake, samoosas, egg sandwiches, naartjies, and breast-shaped cupcakes.)

This link between women’s diet and the control of their bodies can be traced to the eighteenth century. A few weeks ago, I mentioned the influential Enlightenment physician George Cheyne (1671-1743), whose writing on health and eating was not only extraordinarily popular among the English upper classes, but was also partly responsible for a shift in the understanding of the ideal physical form during the 1750s. Partly as a result of his own obesity, Cheyne associated excess flesh with excessive behaviour and a kind of moral laxity. Whereas fleshiness had previously been a sign of good health, slimness was increasingly associated with physical and moral health, strength, and beauty.
Cheyne’s audience, and the patients whom he treated at his fashionable practice in even more fashionable Bath, were primarily female. In a society where eating meat had long been associated with masculinity – an association with even deeper roots in the ancient humoral system, which linked meat and spicy food with blood, the most ‘manly’ of the four humours – Cheyne advocated the renunciation of all meat and the adoption of a dairy-rich, vegetarian diet. Men, in other words, needed to eat like women.

During this period, the female body was slowly being reconceptualised as more delicate – more easily upset – than the male body, and as ruled by unpredictable emotions rather than by the rational, sober intellect. Although gendered, this emotions-intellect binary did not necessarily privilege one over the other: the Romantic cult of sensibility celebrated the emotional and irrational, for example. But male and female bodies – or, more accurately, middle-class male and female bodies – needed to be fed differently.
Cheyne was unusual in his implacable opposition to meat-eating, but he and other physicians were united in the belief that a moderate diet was essential for good health – and this was particularly important for women. Cheyne became interested in the ‘nervous’ complaints which seemed to plague his female patients, and connected their diet to their psychological well-being. Essentially, the less women ate, the better. Anita Guerrini explains:
Cheyne’s audience, the aristocracy and new merchant class that frequented Bath, was also the audience for William Law’s exhortations in his popular devotional work A Serious Call (1728). He provided contrasting models of female character in the ‘maiden sisters’ Flavia and Miranda, who ‘have each of them two hundred pounds a year,’ a comfortable middle-class income. While Flavia spent her income on clothes, luxurious foods, sweetmeats, and entertainment, the ascetic Miranda ate only enough to keep herself alive and spent her income on charity. Miranda, said Law, ‘will never have her eyes swell with fatness, or pant under a heavy load of flesh;’ such excess flesh was not only morally depraved, it was physically disgusting. Cheyne’s patients, like the doctor himself, grew in spirit as they wasted in flesh.
In the early 1720s, Catherine, the adolescent daughter of British Prime Minister Robert Walpole, was referred to Cheyne because of his specialisation in nutrition and nervous diseases. She suffered from loss of appetite, fainting, and chronic pain, and died in 1722, aged eighteen. Cheyne tried his best to treat her, but could not find a way of making her eat more.

This association of femininity – of physical and moral beauty – with not eating persisted into the nineteenth century and, I would suggest, into the present. Even though records indicate that people, and particularly young women, had purposefully starved themselves to death since the Middle Ages – usually for religious reasons – anorexia nervosa was only isolated as a specific ailment by William Withey Gull (1816-1890), in a paper he presented to the Clinical Society of London on 24 October 1873. He argued that this ‘peculiar form of disease occurring mostly in young women, and characterised by extreme emaciation’ was not a symptom of the catch-all feminine disorder ‘hysteria’, but a separate condition with its own symptoms and treatment.
As Joan Jacobs Brumberg notes, this identification of anorexia nervosa occurred within a wider cultural concern about the phenomenon of ‘fasting girls’: young, adolescent women who denied themselves food on religious grounds. Sarah Jacob, a young girl from Wales, claimed that her piety was such that she was able to live without eating at all:
Some British doctors regarded Sarah Jacob’s claim to total abstinence as a simple fraud and, therefore, an affront to science… Consequently, they called for a watch, with empirical standards, which deprived the girl of all food and, not surprisingly, killed her within 10 days because she was already severely undernourished. Some British doctors attributed Sarah Jacob’s condition to girlhood hysteria, provoked by religious enthusiasm and her celebrity status.
In other words, girls’ decision to starve themselves moved from the realm of religion and mysticism to that of science and medicine. It was now a disorder which could be described and treated. The French psychiatrist Charles Lasègue (1816-1883), for example, suggested that anorexia should be treated by examining the dynamics of middle-class family life. He
noted the difficult relation between anorectics and their parents but went on to elaborate how the girl obsessively pursued a peculiar and inadequate diet – such as pickled cucumbers in café au lait – despite the threats and entreaties of her anxious parents. ‘The family has but two methods at its service which it always exhausts,’ he wrote, ‘entreaties and menaces…. The delicacies of the table are multiplied in the hope of stimulating the appetite, but the more solicitude increases the more the appetite diminishes’.
This shift was due to the increasing medicalisation of the body, and also to the secularisation of public life. By the 1870s, doctors exercised the same authority as ministers – or even more. But what had not changed over the course of the eighteenth and nineteenth centuries was the association of femininity with eating very little.

Anorexia is caused by a range of factors, but the connection of ideal femininities with a restricted diet only exacerbates the condition. Just as rape isn’t really about sex, anorexia isn’t entirely about food: it’s a manifestation of (mainly, but not exclusively) women’s attempts to exercise control over their circumstances through their bodies. Because of the wider cultural approval of feminine thinness and not eating, these starving young women receive a kind of affirmation for their self-denial.
It’s easy to talk glibly about encouraging a ‘positive attitude’ towards food and eating. We can only achieve this when we acknowledge that women’s bodies are still perceived as dangerous – as needing to be contained by their clothes, kept pure by a range of hygiene products, and made small through dieting and exercise. This is why we still need feminism. In South Africa – where the ANC Women’s League and Lulu Xingwana’s Department of Women, Children, and Disabled Persons have shown a singular lack of enthusiasm for leading a feminist movement – I hope that SlutWalk represents the beginnings of a new, stronger feminism.

Further Reading
Texts cited here:
Joan Jacobs Brumberg, ‘“Fasting Girls”: Reflections on Writing the History of Anorexia Nervosa,’ Monographs of the Society for Research in Child Development, vol. 50, no. 4/5, History and Research in Child Development (1985), pp. 93-104.
Anne Charlton, ‘Catherine Walpole (1703-22), an Eighteenth-Century Teenaged Patient: A Case Study from the Letters of the Physician George Cheyne (1671 or 73-1743),’ Journal of Medical Biography, vol. 18, no. 2 (May 2010), pp. 108-114.
Anita Guerrini, ‘The Hungry Soul: George Cheyne and the Construction of Femininity,’ Eighteenth-Century Studies, vol. 32, no. 3, Constructions of Femininity (Spring, 1999), pp. 279-291.
Erin O’Connor, ‘Pictures of Health: Medical Photography and the Emergence of Anorexia Nervosa,’ Journal of the History of Sexuality, vol. 5, no. 4 (Apr., 1995), pp. 535-572.
Roy Porter, Flesh in the Age of Reason: How the Enlightenment Transformed the Way We See Our Bodies and Souls (London: Penguin, [2003] 2004).
Martha J. Reineke, ‘“This Is My Body”: Reflections on Abjection, Anorexia, and Medieval Women Mystics,’ Journal of the American Academy of Religion, vol. 58, no. 2 (Summer, 1990), pp. 245-265.
Edward Shorter, ‘The First Great Increase in Anorexia Nervosa,’ Journal of Social History, vol. 21, no. 1 (Autumn, 1987), pp. 69-96.
Other sources:
I. de Garine, Food, Diet, and Economic Change Past and Present (Leicester: Leicester University Press, 1993).
Sander L. Gilman, Fat: A Cultural History of Obesity (Cambridge: Polity, 2008).
Harvey A. Levenstein, ‘The Perils of Abundance: Food, Health, and Morality in American History,’ in Food: A Culinary History from Antiquity to the Present, eds. Jean-Louis Flandrin and Massimo Montanari, English ed. by Albert Sonnenfeld (New York: Columbia University Press, 1999), pp. 516-529.
Harvey A. Levenstein, Revolution at the Table: The Transformation of the American Diet (New York: Oxford University Press, 1988).
Susie Orbach, ‘Interpreting Starvation,’ in Consuming Passions: Food in the Age of Anxiety, eds. Sian Griffiths and Jennifer Wallace (Manchester: Mandolin, 1998), pp. 133-139.
Kerry Segrave, Obesity in America, 1850-1939: A History of Social Attitudes and Treatment (Jefferson, NC: McFarland, 2008).
Peter N. Stearns, Fat History: Bodies and Beauty in the Modern West (New York: New York University Press, 1997).
Doris Witt, Black Hunger: Food and the Politics of US Identity (New York and Oxford: Oxford University Press, 1999).
Brave New Food
The TV series I most want to watch at the moment is Portlandia. Set in Portland, Oregon, it satirises and celebrates the city which originated the ur-hipster. It includes a scene in a restaurant – which I’ve watched only on YouTube, alas – in which a couple questions their waitress about the provenance of the chicken on the menu. Assured that it’s free range, local, and organic – partly because their waitress provides them with its papers and name – they leave the restaurant to go and have a look at it.
This is hilarious because it so closely mimics reality: the menus which list the provenance of all the produce used in the restaurant; the farmers’ market stalls with photographs of happy animals pre-slaughter; the recipes which insist upon free-range, organic ingredients.
I laugh, but I’m just as implicated in this hyper-sensitivity about where my food comes from and how it was treated before it arrived on my plate. I don’t want to eat animals that suffered so that I can continue being an omnivore. I eat relatively little meat and am prepared to pay for free-range chicken, pork, and beef. (I’m not terribly fussed about it being ‘organic’ – whatever we may mean by that.)
It is a scandal how animals are treated in factory farms, and increasing demand for red meat is environmentally unsustainable. So how should we eat meat without causing harm? If vegetarianism is implicated in the meat economy – veal is a by-product of the dairy industry, for example – and veganism seems far too difficult, then one way out of this impasse is to consider synthetic alternatives.
I’ve been amused by the overwhelming response to reports about the apparent viability of lab-grown meat. ‘Eeew’ and ‘yuk’ seem to sum up how people feel about it. But lab-grown meat is only the most recent panacea for the world’s food crisis to be produced by scientists – and our views on it say a great deal about our changing feelings about the relationship between food and technology.
The meat in question is being grown by Dr Mark Post at Maastricht University. He’s being funded by an anonymous donor who’s concerned about the greenhouse gas emissions produced by cattle farming. Using stem cells from cows, Post’s team have grown sheets of muscle between pieces of Velcro; these are shocked with an electric current to develop their texture and density.
Post hopes to produce a burger by October.
When I read the earliest reports about Post’s work, I thought immediately of a scene in Margaret Atwood’s Oryx and Crake, where the protagonist visits a lab which grows chicken breasts out of stem cells. This is a dystopian novel which plays on our suspicion of food grown in laboratories. It seems strange, now, for us to consider synthetic, artificial, man-made food to be superior to all that is ‘fresh’, ‘natural’ and ‘authentic’. But this is a relatively new way of thinking about food.
During the 1950s, a decade when science seemed to offer the possibility of a cleaner, healthier, and better organised world, there was a brief, but intense enthusiasm for Chlorella pyrenoidosa, a high-protein algae which grew rapidly and abundantly and was fed by sunlight and carbon dioxide.
The post-war baby boom gave rise to anxieties in the 1950s that the world would be unable to feed its growing population. Of course, we now know that innovations in agriculture during this period – including the wholesale mechanisation of farming, the increased use of pesticides, hormones, and antibiotics, and breeding high-yielding livestock – and the Green Revolution of the 1960s and 1970s produced the crops and farming methods which, at enormous environmental cost, still feed seven billion of us. But at the time, politicians worried that hungry nations would create a politically unstable world.
Algae looked like a sensible solution to the problem. Easy and cheap to grow, and apparently highly nutritious, this seemed to be the Brave New World of food production. In today’s terms, as Warren Belasco shows, chlorella was a superfood. Scientists fell over themselves in excitement: Scientific American and Science reported on it in glowing terms; the Rockefeller Foundation funded research into it; and some calculated that a plantation the size of Rhode Island would be able to supply half the world’s daily protein requirements.
In the context of a mid-century enthusiasm for all that was efficient, systematic, and man-made, algae’s appeal was immediate: it was entirely usable and produced little or no waste; its farming was not dependent on variable weather and rainfall; it was clean and could be transformed into something that was optimally nutritious.
So why didn’t I have a chlorella burrito for supper?
Unfortunately, chlorella didn’t live up to the hype. Not only did the production of grains and soybeans increase exponentially during the 1950s – meaning that farmers were loath to switch to a new and untested crop – but further research revealed that chlorella production would be more complicated and expensive than initially envisaged. Growing chlorella in the quantities needed to make it financially viable required expensive equipment, and it proved susceptible to changes in temperature. Harvesting and drying it was even more of a headache.
On top of this, chlorella tasted terrible. There were some hopes that the American food industry might be able to transform bitter green chlorella into an enticing foodstuff – in much the same way that it used additives and preservatives to manufacture the range of processed foods which bedecked the groaning supermarket shelves of 1950s America. Edible chlorella was not, after all, a world away from Primula cheese.
Those who were less impressed by the food industry suggested that chlorella could be used to fortify bread and pasta – or even transformed into animal feed. But research demonstrated that heating chlorella destroyed most of its nutrients. Even one of its supporters called it ‘a nasty little green vegetable.’ By the 1960s, it was obvious that at $1,000 a ton, and inedible, chlorella was not going to be the food of the future.
All was not lost for chlorella, though. It proved to be surprisingly popular in Japan, where it is still sold as a nutritional supplement. Nor has the West’s enthusiasm for algae dimmed.
Ironically, the food that was supposed to feed the world is now the preserve of the wealthy, health-conscious middle classes – those who suffer most from the diseases of affluence – who can afford to buy small jars of powdered algae.
I hope that Post’s project manages to create a viable product which can be used to supplement people’s diets. I’m not particularly revolted by the idea of lab-grown meat, and if it means a reduction in the number of factory farms, then that can only be a good thing.
What concerns me more are the potential motives of the businesses which would produce lab-grown meat. If it is taken up by the global food industry – which has a patchy record on environmental sustainability and social responsibility – will we be able to trust that industry to provide us with meat which is both healthy and ethically produced?
Source
Warren Belasco, Meals to Come: A History of the Future of Food (Berkeley: University of California Press, 2006).