
Posts from the ‘history’ Category

Brave New Food

The TV series I most want to watch at the moment is Portlandia. Set in Portland, Oregon, it satirises and celebrates the city that gave us the ur-hipster. It includes a scene in a restaurant – which I’ve watched only on YouTube, alas – in which a couple questions their waitress about the provenance of the chicken on the menu. Assured that it’s free range, local, and organic – partly because their waitress provides them with its papers and name – they leave the restaurant to have a look at it.

This is hilarious because it so closely mimics reality: the menus which list the provenance of all the produce used in the restaurant; the farmers’ market stalls with photographs of happy animals pre-slaughter; the recipes which insist upon free-range, organic ingredients.

I laugh, but I’m just as implicated in this hyper-sensitivity about where my food comes from and how it was treated before it arrived on my plate. I don’t want to eat animals that suffered so that I can continue being an omnivore. I eat relatively little meat and am prepared to pay for free-range chicken, pork, and beef. (I’m not terribly fussed about it being ‘organic’ – whatever we may mean by that.)

It is a scandal how animals are treated in factory farms, and increasing demand for red meat is environmentally unsustainable. So how should we eat meat without causing harm? If vegetarianism is itself implicated in the meat economy – veal is a by-product of the dairy industry, for example – and veganism seems far too difficult, then one way out of this impasse is to consider synthetic alternatives.

I’ve been amused by the overwhelming response to reports about the apparent viability of lab-grown meat. ‘Eeew’ and ‘yuk’ seem to sum up how people feel about it. But lab-grown meat is only the most recent panacea for the world’s food crisis to be produced by scientists – and our views on it say a great deal about our changing feelings about the relationship between food and technology.

The meat in question is being grown by Dr Mark Post at Maastricht University. He’s being funded by an anonymous donor who’s concerned about the greenhouse gas emissions produced by cattle farming. Using stem cells from cows, Post’s team have grown sheets of muscle between pieces of Velcro, which are shocked with an electric current to develop their texture and density:

Post said he could theoretically increase the number of burgers made from a single cow from 100 to 100m. ‘That means we could reduce the number of livestock we use by 1m,’ he said.

Meat grown in the laboratory could have several advantages, because its manufacture is controlled at each step. The tissue could be grown to produce high levels of healthy polyunsaturated fatty acids, or to have a particular texture.

He believes it will be a relatively simple matter to scale up the operation, since most of the technical obstacles have already been overcome. ‘I’d estimate that we could see mass production in another 10 to 20 years,’ he said.

Post hopes to produce a burger by October.

When I read the earliest reports about Post’s work, I thought immediately of a scene in Margaret Atwood’s Oryx and Crake, where the protagonist visits a lab which grows chicken breasts out of stem cells. This is a dystopian novel which plays on our suspicion of food grown in laboratories. It seems strange, now, for us to consider synthetic, artificial, man-made food to be superior to all that is ‘fresh’, ‘natural’ and ‘authentic’. But this is a relatively new way of thinking about food.

During the 1950s, a decade when science seemed to offer the possibility of a cleaner, healthier, and better organised world, there was a brief, but intense enthusiasm for Chlorella pyrenoidosa, a high-protein algae which grew rapidly and abundantly and was fed by sunlight and carbon dioxide.

The post-war baby boom gave rise to anxieties in the 1950s that the world would be unable to feed its growing population. Of course, we now know that innovations in agriculture during this period – including the wholesale mechanisation of farming, the increased use of pesticides, hormones, and antibiotics, and breeding high-yielding livestock – and the Green Revolution of the 1960s and 1970s produced the crops and farming methods which, at enormous environmental cost, still feed seven billion of us. But at the time, politicians worried that hungry nations would create a politically unstable world.

Algae looked like a sensible solution to the problem. Easy and cheap to grow, and apparently highly nutritious, this seemed to be the Brave New World of food production. Warren Belasco writes:

The alluring news came from pilot projects sponsored by the Carnegie Institution and conducted by the Stanford Research Institute in Menlo Park and by Arthur D. Little, Inc. in Cambridge. Initial results suggested that chlorella algae was an astounding photosynthetic superstar. When grown in optimal conditions – sunny, warm, shallow ponds fed by simple carbon dioxide – chlorella converted upwards of 20 per cent of solar energy…into a plant containing 50 per cent protein when dried. Unlike most plants, chlorella’s protein was ‘complete’, for it had the ten amino acids then considered essential, and it was also packed with calories, fat, and vitamins.

In today’s terms, chlorella was a superfood. Scientists fell over themselves in excitement: Scientific American and Science reported on it in glowing terms; the Rockefeller Foundation funded research into it; and some calculated that a plantation the size of Rhode Island would be able to supply half the world’s daily protein requirements.

In the context of a mid-century enthusiasm for all that was efficient, systematic, and man-made, algae’s appeal was immediate: it was entirely usable and produced little or no waste; its farming was not dependent on variable weather and rainfall; it was clean and could be transformed into something that was optimally nutritious.

So why didn’t I have a chlorella burrito for supper?

Unfortunately, chlorella didn’t live up to the hype. Not only did the production of grains and soybeans increase exponentially during the 1950s, meaning that farmers were loath to switch to a new and untested crop, but further research revealed that chlorella production would be more complicated and expensive than initially envisaged. Growing chlorella in the quantities needed to be financially viable required expensive equipment, and it proved to be susceptible to changes in temperature. Harvesting and drying it was even more of a headache.

On top of this, chlorella tasted terrible. There were some hopes that the American food industry might be able to transform bitter green chlorella into an enticing foodstuff – in much the same way as it used additives and preservatives to manufacture the range of processed foods which bedecked the groaning supermarket shelves of 1950s America. Edible chlorella was not a world away from Primula cheese.

Those who were less impressed by the food industry suggested that chlorella could be used to fortify bread and pasta – or even transformed into animal feed. But research demonstrated that heating chlorella destroyed most of its nutrients. Even one of its supporters called it ‘a nasty little green vegetable.’ By the 1960s, it was obvious that at $1,000 a ton, and inedible, chlorella was not going to be the food of the future.

All was not lost for chlorella, though. It proved to be surprisingly popular in Japan, where it is still sold as a nutritional supplement. The West’s enthusiasm for algae also hasn’t dimmed:

The discovery in the 1960s of the blue-green algae spirulina in the Saharan Lake Chad and in Mexico’s Lake Texcoco gave another boost to the health food uses of algae. Spirulina has a high-nutrient profile similar to chlorella’s but without…production problems….

Ironically, the food that was supposed to feed the world is now the preserve of the wealthy, health-conscious middle classes – those who suffer most from the diseases of affluence – who can afford to buy small jars of powdered algae.

I hope that Post’s project manages to create a viable product which can be used to supplement people’s diets. I’m not particularly revolted by the idea of lab-grown meat, and if it reduces the number of factory farms, then that can only be a good thing.

What concerns me more are the potential motives of the businesses which would produce lab-grown meat. If it is taken up by the global food industry – which has a patchy record on environmental sustainability and social responsibility – will we be able to trust it to provide us with meat which is healthy for us, and ethically produced?

Source

Warren Belasco, Meals to Come: A History of the Future of Food (Berkeley: University of California Press, 2006).


A world in your coffee cup

My friend Elizabeth and I have breakfast together every Friday morning. For the past month or so, we’ve managed to eat at a different cafe each week – our only criteria being that they’re in central Cape Town and open early. This week we went to The Power and the Glory, a restaurant and club now irredeemably associated with the city’s burgeoning population of hipsters. But it serves an excellent breakfast. (More evidence that hipsters can serve breakfast well.) And it is – inadvertently – immensely entertaining. As I sat at a window, waiting for Elizabeth to arrive, a hipster customer came in to buy a take-away coffee.

The scene was almost a parody of hipster-ness: hipster customer was wearing a high-waisted print skirt, brogues, and an elaborate tattoo; hipster waitress behind the serving counter was in a red vintage frock with a tousled pixie hairdo. Both were very pale, and very skinny. (I think we need a term to describe the extreme thinness of hipsters.) Hipster customer removed her hipster shades and asked for a cappuccino.

An awkward silence fell.

Hipster cafes don’t sell cappuccinos. They sell flat whites. Asking for a flat white is as much an indicator of hipster membership as a subscription to The Gentlewoman.

This left hipster waitress in a difficult position. Should she forgo her hipster principles for a moment, ignore the faux pas and order her customer a flat white? Or should she correct her? Was the hipster customer an influential hipster, and not worth insulting? Or was this the time to establish which of the pair was the real hipster?

The barista, a beefy non-hipster who’d been watching this with some amusement, stepped in. ‘I think you mean a flat white,’ he said.

‘I do!’ said hipster customer.

And all was resolved.

Even if this hilarious moment of hipster awkwardness was very much of its time and place – it was at once typically Capetonian and typical of a particular sub-culture – the fact that it happened over a coffee gives it an almost timeless quality.

Coffee is unusual in that it has managed to remain fashionable since its arrival in Europe at the beginning of the seventeenth century. Flat whites are only the most recent manifestation of cool coffee. They seem to have originated in Auckland in the late 80s, and differ from cappuccinos or lattes – the more familiar, Italianate forms of hot coffee-and-milk – in that the milk is heated until it’s thick and warm, rather than only frothy.

Flat whites arrived in London four or five years ago, with the opening of a series of small coffee shops in the cooler parts of east and central London by Kiwi expats. Chains like Costa and Starbucks have since added flat whites to their menus, but – as hipsters know – a flat white is defined as much by the cafe and the person who makes it as by its ratio of coffee to milk.

And that is the issue. Coffee is coffee, but we’ve come to associate particular meanings with the ways in which we prepare it: there is a world of difference between someone who buys their coffee from Origin or Truth in Cape Town and someone who only drinks instant, chicory-flavoured Ricoffy with UHT milk. (Which is, incidentally, my idea of culinary hell.) Both are forms of coffee, but they are socially and culturally miles apart. Studying shifting patterns in coffee fashion is fascinating in itself, but these patterns become more interesting when we think of them within the complex networks of trade and finance which allow us to buy coffee at restaurants and in supermarkets.

The coffee craze in Europe in the seventeenth and eighteenth centuries contributed to a boom in the coffee trade. Coffee had been available since the early 1600s, having been imported to Europe from Turkey via Venice. Mixed with milk and sugar, it became popular with the new European middle classes. It was associated with exotic sophistication – and also became a marker of intellectual adventurousness. It’s difficult to overestimate the extent to which drinking coffee and the culture and politics of the Enlightenment were entangled, as Anne EC McCants writes:

The expression ‘to break bread together’ now has an archaic feel to it. A proximate contemporary substitute, albeit devoid of the powerful religious significance of bread, is to ‘go out for a cup of coffee’, which is at least as much about conversation as it is about nourishment per se. Historians associate this total reorientation of the culture of food and drink with the substitution of coffeehouses for taverns; the wider dissemination of public news; trading on the stock exchange; new table etiquette and table wares; new arrangements of domestic and public space; the ability to sustain new industrial work schedules despite their tedium….

One of the best depictions of the appeal of the new, middle-class coffee culture is JS Bach’s Coffee Cantata (1732-1735), in which a ‘disobedient’ and ‘obstinate’ young woman’s addiction to coffee so annoys her father that he threatens not to allow her to marry unless she gives up coffee. In the end she agrees, but – without her father knowing – resolves to include a clause in her marriage contract which stipulates that she must have a steady supply of coffee.

The first coffee house opened in Britain in 1650, and within a decade there were around 3,000 of them in London. These were places where men could meet to talk in relative freedom. In 1675, Charles II tried to close them for fear that coffee house patrons were plotting to overthrow him. (Given his father’s sticky end, a paranoia about the middle classes was always inevitable.) Monarchical and official suspicion of coffee houses never really ended, though. These were places where the free exchange of information allowed for the dissemination of the Enlightenment ideas that transformed the eighteenth-century world.

But trade was also changing this world. When the Dutch managed to get hold of coffee plants from Arab traders in 1690, they established plantations in Java, where they already cultivated a range of spices. The French began to grow coffee in the West Indies at the beginning of the eighteenth century, and over the course of the next hundred years or so, coffee was planted in West Africa and parts of Latin America.

The plantation system – in many ways the origins of modern capitalism – was dependent on slave labour. Europe’s taste for coffee was satisfied by slavery. But even after the abolition of slavery in the early and mid-nineteenth century, European demand for coffee shaped the economies of countries very far away.

The domestication of coffee consumption in the nineteenth century – when women began to drink coffee, and more of it was served at home – caused demand to spike. Improvements in transport meant that coffee could be shipped over longer distances far quicker and in greater quantities than ever before. During the 1820s and 1830s, coffee cultivation became a way of linking the economies of newly independent nations in Latin America to global trade. Coffee production in Guatemala, Nicaragua, Costa Rica, and El Salvador increased exponentially, and governments introduced measures to facilitate the industry: new transport infrastructure, tax breaks for landowners, low or no export duties, and legislation to lower the cost of labour.

Plentiful land and cheap labour were secured by progressively disenfranchising Indian populations, whose right to own property and to work where they pleased was eroded by pro-plantation legislation. Uprisings against governments and landowners were stamped out – usually with the help of the military. The argument for increased coffee production just seemed so compelling. By the end of the nineteenth century, ninety per cent of the world’s coffee came from South America.

Brazil was the largest single Latin American supplier of coffee, and from 1906 onwards was the controller of the international coffee trade. The Brazilian government bought up beans, stockpiled them, and then released them into the market, thereby regulating the coffee price. European and North American countries encouraged African countries to begin cultivating coffee on a grander scale too.

African producers tended to grow Robusta coffee varieties, which are generally hardier, but less tasty, than the Arabica coffee produced in Latin America. This meant that when demand for instant coffee grew in the 1950s, coffee production in postcolonial African states, whose governments subsidised coffee farmers and facilitated the free movement of labour, flourished. The entry of African coffee growers into the world market meant that the price began to plummet – and the Kennedy administration in the US realised that this was an ideal opportunity for some Cold War quiet diplomacy.

The 1962 International Coffee Agreement was meant to stabilise Latin American economies and to immunise them against potential Soviet-backed revolutions by introducing production quotas for every major coffee producing nation. Even if the ICA did include African producers, it favoured the US and Brazil, effectively giving them veto rights on any policy decisions.

The collapse of the Agreement in the late eighties – partly as a result of the increased production of non-signatories, like Vietnam – caused a major decline in the price of coffee. For consumers and cafe owners, this was a distinctly good thing: good coffee was cheaper than ever before. Coffee shops in the US, in particular, fuelled a demand for good, ‘real’ coffee.

But for Rwanda, the collapse of the international coffee price and the end of regulation had disastrous implications. In 1986 and 1987, Rwanda’s annual coffee sales more than halved. The government was bankrupted and increasingly dependent on aid from international institutions, including the World Bank, which demanded the privatisation of state enterprises, cuts in government spending, and trade liberalisation. (Hmmm – sound familiar?) The government could no longer fund social services, and schools and hospitals closed. This exacerbated existing political tensions, and created a large unemployed population, many of whom became volunteers for the paramilitary groups which carried out the genocide in 1994.

It’s supremely ironic that Rwanda has turned – again – to coffee to pull itself out of the disaster of the nineties. This time, though, coffee is being produced in ways which are meant to be more sustainable – both ecologically and economically. There are, though, problems with this. Isaac A. Kamola writes:

However, widely lauded ‘fair-trade’ coffee is not without its own contradictions. First, fair-trade coffee is an equally volatile market, with much of the additional price paid to growers dependent upon goodwill consumption. Such consumption patterns are highly vulnerable to economic fluctuations, changes in cultural and ethical patterns, education campaigns, and individual commitment. Furthermore, fair-trade coffee also faces an oversupply problem, with more fair-trade coffee being produced than there are consumers of it.

In Mexico, for instance, the current instability in global food prices – caused partly by food speculation – is placing incredible pressure on small farmers who cultivate coffee: the fluctuating coffee price has shrunk their incomes at a time when maize has never been so expensive. And even prosperity brings problems. Kenyan coffee is of particularly good quality, and the increase in the coffee price has benefitted local farmers. It has also brought an increase in crime, as gangs steal coffee berries and smuggle them out of the country.

Demand abroad fuels coffee production in Africa, Latin America, and elsewhere. No other commodity demonstrates the connectedness of global patterns of consumption and production better than coffee. As Kamola points out, we need to make this system fairer, but the fair-trade model still ensures that African farmers are dependent on demand abroad:

This does not mean that fair trade should be discouraged. It should be underscored, however, that reforms in First World consumption patterns are not alone sufficient to ensure the protection of people from the violent whims of neoliberal markets.

As much as coffee is associated with sophistication in the West – as much as it helped to facilitate the Enlightenment – it has also been the cause of incredible deprivation and suffering elsewhere. Invented in New Zealand, popularised in the UK, and made from Rwandan beans certified by the Fairtrade Foundation based in London, a flat white in Cape Town tells a global story.

Further Reading

Sources cited here:

Anne E.C. McCants, ‘Poor consumers as global consumers: the diffusion of tea and coffee drinking in the eighteenth century,’ Economic History Review, vol. 61, no. 1 (2008), pp. 172-200.

Isaac A. Kamola, ‘Coffee and Genocide,’ Transition, no. 99 (2008), pp. 54-72.

Dale Pendell, ‘Goatherds, Smugglers, and Revolutionaries: A History of Coffee,’ Whole Earth (June 2002), pp. 7-9.

Craig S. Revels, ‘Coffee in Nicaragua: Introduction and Expansion in the Nineteenth Century,’ Conference of Latin Americanist Geographers, vol. 26 (2000), pp. 17-28.

Other sources:

Joyce Appleby, The Relentless Revolution: A History of Capitalism (New York: WW Norton, [2010] 2011).

Merid W. Aregay, ‘The Early History of Ethiopia’s Coffee Trade and the Rise of Shawa,’ The Journal of African History, vol. 29, no. 1, Special Issue in Honour of Roland Oliver (1988), pp. 19-25.

Roy Love, ‘Coffee Crunch,’ Review of African Political Economy, vol. 26, no. 82, North Africa in Africa (Dec., 1999), pp. 503-508.

Sidney W. Mintz, Tasting Food, Tasting Freedom (Boston: Beacon Press, 1996).

Sidney W. Mintz, Sweetness and Power: The Place of Sugar in Modern History (New York: Penguin, 1985).

Stefano Ponte, ‘Behind the Coffee Crisis,’ Economic and Political Weekly, vol. 36, no. 46/47 (Nov. 24-30, 2001), pp. 4410-4417.

Wolfgang Schivelbusch, Tastes of Paradise: A Social History of Spices, Stimulants, and Intoxicants, trans. David Jacobson (New York: Random House, 1992).

James Walvin, Fruits of Empire: Exotic Produce and British Taste, 1660-1800 (Basingstoke and London: Macmillan, 1997).


Modernism, Postmodernism, Authenticism?

I’m not entirely sure what it says about me, but the first article I read in the Observer is always Jay Rayner’s restaurant review. (In fact, I started reading the Observer in high school because of Jay Rayner’s reviews – it came as a pleasant surprise that there was a really good newspaper organised around them.) Last week’s was on Viajante in Bethnal Green, which seems to specialise in a kind of sub-Adrià-esque complicated, miniaturised cuisine. Rayner was not impressed:

In its eagerness to be so very now and forward thinking, the food at Viajante manages at times to feel curiously dated; it recalls the first flush of Hestomania, when even he has moved on and is now cooking up big platefuls of heartiness at Dinner.

Modern techniques are great. They’re brilliant. If you want to cook my steak by banging it round the Large Hadron Collider, be my guest. Dehydrate my pig cheeks. Spherify my nuts. But only do so if the result tastes nicer. At Viajante deliciousness is too often forced to give way to cleverness.

Rayner’s point is that the modernist cooking presented by Viajante is beginning to feel old hat. Even if – as he’s admitted – restaurant critics are ‘rampant neophiliacs,’ it does seem that enthusiasm for the molecular gastronomy espoused most famously by Heston Blumenthal and Ferran Adrià has peaked. Or that, rather, it’s become so integrated into the repertoires of high-end chefs that it no longer seems to be so very experimental.

I was surprised when I first heard molecular gastronomy described as ‘modernist cuisine’ – a term now probably forever associated with Nathan Myhrvold and Chris Young’s five-volume tome Modernist Cuisine: The Art and Science of Cooking. This was published last year – long after what most people would agree to be the end of literary and cultural modernism in the 1950s and 1960s. (I wonder how we should define the cuisine of the modernist movement during the early twentieth century? I tend to think of Virginia Woolf’s descriptions of feasts in To the Lighthouse and A Room of One’s Own.)

If anything, this should be postmodern cuisine. The purpose of molecular gastronomy is to reconsider the processes which underpin cooking: to understand them, and then reconfigure them. It’s all fairly similar to Derrida’s deconstruction – and Adrià has described his technique in precisely the same terms.

When I was in London at the end of last year, I went with a friend to the V&A’s exhibition, ‘Postmodernism: Style and Subversion, 1970-1990’. It was a strange exhibition: in its attempt to cover everything that could be considered postmodern in design and architecture, it took a scattergun approach to what it included. It felt curiously empty – but I’m not sure if that’s the fault of the curator, or of the movement itself.

One of the oddest features of the exhibition was a strange preponderance of teapots. It was a pity that this was as far as the V&A got in thinking about postmodernism and food – because nouvelle cuisine, the food of the postmodern moment, was so design-heavy. Even if the point of nouvelle cuisine was to liberate high-end cuisine from the heavy, meaty, and flour-based-sauce cooking of the 1960s and 1970s, it was also characterised by incredibly careful plating and presentation. In many ways, garnishes were as important as the food itself.

There are strong links, I think, between nouvelle cuisine and molecular gastronomy. Both disregard the orthodoxy established by classic French cooking and experiment with ideas and ingredients from other culinary traditions – best exemplified by the late-90s enthusiasm for ‘fusion food’, done well by Peter Gordon, done badly by legions of others – and with the techniques of cooking itself. Apart from the fact that molecular gastronomy is underpinned by the work of the scientists Hervé This and Nicholas Kurti, it also differs from nouvelle cuisine in its playfulness – its refusal to take itself seriously, something which places it firmly within the postmodern moment. But, as Rayner suggests, it would seem that molecular gastronomy has had its day: Adrià has transformed El Bulli into a foundation, and Blumenthal is serving hearty, historical meals at Dinner.

Two years ago I taught an introduction to historiography at Goldsmiths in London, and was struck by how dated postmodern theory felt. When I studied it a decade ago – crucially, pre-9/11 – it seemed, even then, to be an exciting and useful way of understanding the world, particularly because of its emphasis on the relationship between language and power. I didn’t – and still don’t – agree with the critiques of history offered up by Hayden White and Keith Jenkins, but they were thought-provoking.

After the events of 11 September 2001, the War on Terror, the 2008 economic crash, and the Arab Spring, postmodernism appears even more the product of its time: of the prosperous, confident 1980s and 1990s, when the end of communism seemed to signal Francis Fukuyama’s end of history. I find it easier to take seriously the postmodernism and poststructuralism of the 1970s and earlier – when philosophers, linguists, and theorists were attempting to find a new way of thinking about reality – partly by emphasising the extent to which narratives and discourses are contingent and rooted in their particular contexts. Jean-François Lyotard’s The Postmodern Condition (1979) is still an arrestingly original document.

This act of de-privileging dominant discourses – or indeed any discourse – has also been its undoing, as Edward Docx argues in a recent article for Prospect:

by removing all criteria, we are left with nothing but the market. The opposite of what postmodernism originally intended. … If we de-privilege all positions, we can assert no position, we cannot therefore participate in society or the collective and so, in effect, an aggressive postmodernism becomes, in the real world, indistinguishable from an odd species of inert conservatism.

So what follows postmodernism? Docx suggests that it is something he dubs ‘authenticism’. He explains:

we can detect this growing desire for authenticity all around us. We can see it in the specificity of the local food movement or the repeated use of the word ‘proper’ on gastropub menus. We can hear it in the use of the word ‘legend’ as applied to anyone who has actually achieved something in the real world. … We can identify it in the way brands are trying to hold on to, or take up, an interest in ethics, or in a particular ethos. … Values are important once more…

…we can see a growing reverence and appreciation for the man or woman who can make objects well. We note a new celebration of meticulousness…. We uncover a new emphasis on design through making…. Gradually we hear more and more affirmation for those who can render expertly, the sculptor who can sculpt, the ceramist, the jeweller, even the novelist who can actually write.

It’s telling that the various manifestations of the new, global food movement – from Occupy Food to the hundreds of local campaigns for small-scale agriculture and unadulterated food – tend to refer to themselves as ‘real food’ (as opposed to Big Food – or the plastic, ‘Frankenstein’ food it produces).

This is a good way of understanding the recent trend in food – which Docx identifies – for the artisanal (whatever we may mean by that), the handmade, the local, the ‘old-fashioned’ (again, this is open to debate and redefinition), and the ethical. It says a great deal that the chef of the moment is René Redzepi, the Danish chef and owner of Noma, who sees himself as much a food activist as a cook. This demand for ‘authentic’ food is, strange as it may seem, political: it’s a refusal to buy into the advertising and branding of the food industry, even if it’s an act that only a very small proportion of people can afford. But it’s a beginning, and a welcome one.


Milking It

This week the committee organising the 2012 Olympics in London caused widespread anger when it announced that breastfeeding mothers would have to buy an extra ticket to bring their babies into sports venues. Some venues have a few discounted tickets for children, but others don’t. One commentator posted on Mumsnet

that while she and her husband were lucky enough to get tickets to an equestrian event in August, organisers had told her there are no children’s tickets so she will have to pay £95 for a three-month old in a sling.

Those who can’t afford an extra ticket, or who lose out in the next round of ticket allocation, are advised to stay away. Unsurprisingly, Britain’s Equality and Human Rights Commission has suggested that this is potentially a case of ‘indirect sex discrimination’ because it will affect considerably more women than men.

This situation is ridiculous in so many ways. What angers me the most is that the Olympic committee took this decision in a country where the National Health Service advises that babies be breastfed exclusively for the first six months of life. The members of the committee seem either to think that women shouldn’t breastfeed in public – an irritating view about which I am going to be extraordinarily rude at some stage – or that mothers with babies have no desire to attend public events.

In the midst of the uproar, The Ecologist tweeted an article which it had published six years ago about the debate over whether women should breast- or bottle-feed their babies. It’s an argument that parents, doctors, and policy makers have been holding since at least the beginning of the twentieth century, and it’s to the credit of Pat Thomas that her piece provides a good overview of shifting attitudes towards infant feeding over the course of the past hundred years or so.

But it’s also a problematic piece of writing, and one which demonstrates particularly well why so many mothers feel bullied about how they decide to feed their babies. Thomas makes no attempt to hide her view that all mothers should breastfeed their children. She begins with a terrifying list of statistics:

The health consequences – twice the risk of dying in the first six weeks of life, five times the risk of gastroenteritis, twice the risk of developing eczema and diabetes and up to eight times the risk of developing lymphatic cancer – are staggering. With UK formula manufacturers spending around £20 per baby promoting this ‘baby junk food’, compared to the paltry 14 pence per baby the government spends promoting breastfeeding, can we ever hope to reverse the trend?

I’d love to know where she found these figures – particularly given her opening statement that women have breastfed for ‘nearly half a million years’. (How does she know this? Why the coy, qualifying ‘nearly’?) Thomas is, though, correct to point to the compelling evidence that breastfed babies tend to be healthier than those who are fed on formula, and that breastfed children may do better at school and have stronger immune systems. Also, there is a direct and proven link between the use of baby formula and high child mortality rates in the developing world.

She blames the slow decline of breastfeeding over the course of the twentieth century on the medicalization of childcare, and on the advertising strategies employed by formula companies – most notoriously Nestlé. I have little to add to her second point, other than that, broadly, I agree with her. The International Code of Marketing of Breast-milk Substitutes, a response to the Nestlé boycott of the late seventies, needs to be properly implemented. But her argument about the medicalization of women’s experiences of childbirth and childrearing is not entirely correct. She quotes Mary Renfrew from the Mother and Infant Research Unit at the University of York:

‘If you look at medical textbooks from the early part of the 20th century, you’ll find many quotes about making breastfeeding scientific and exact, and it’s out of these that you can see things beginning to fall apart.’ This falling apart, says Renfrew, is largely due to the fear and mistrust that science had of the natural process of breastfeeding.

In particular, the fact that a mother can put a baby on the breast and do something else while breastfeeding, and have the baby naturally come off the breast when it’s had enough, was seen as disorderly and inexact. The medical/scientific model replaced this natural situation with precise measurements – for instance, how many millilitres of milk a baby should ideally have at each sitting – which skewed the natural balance between mother and baby, and established bottlefeeding as a biological norm.

During the early years of the twentieth century, global concern about high rates of child mortality animated a child welfare movement which aimed to improve the conditions in which children were raised. In Europe, North America, Australia, New Zealand, and parts of Africa and Latin America, medical professionals held up rational and scientific methods of feeding and caring for babies as the best means of eradicating the ‘ignorant’ practices which, many believed, caused babies to die. This new emphasis on hygiene, speedy medical intervention, and regular monitoring of babies’ development and health at clinics and hospitals did lower rates of morbidity – as did declining fertility rates, the control of infectious disease, economic prosperity, and increased school attendance.

Doctors and specialists in the relatively new field of paediatrics were particularly interested in how babies were fed. Contrary to what Thomas suggests, the nineteenth-century orthodoxy that breastfeeding was the healthiest and best option for both mothers and babies lasted well into the 1940s. Innovations in artificial formulas provided mothers who couldn’t breastfeed – for whatever reason – with good alternatives, and doctors did recommend them. There were anxieties that malnourished mothers’ milk would not feed babies sufficiently, and doctors recommended ‘top ups’ with formula or other liquid.

The real difference between nineteenth- and twentieth-century attitudes towards breastfeeding was that it was increasingly controlled and patrolled by trained professionals. As Renfrew notes, mothers were told how much milk their babies needed at each feed, and there was a lot of debate in medical journals and in other professional forums about how and when babies should be fed.

The set of guidelines formulated by the incredibly influential, New Zealand-based Dr Truby King emphasised the importance of routine in feeding. King’s mothercraft movement – which established clinics and training centres around the British Empire during the first half of the twentieth century – taught mothers to feed ‘by the clock’. At five months, a baby was to be fed only five times per day – and at the same time every day – while one-month-old babies had an extra, sixth feed.

Like many childcare professionals of the period, King believed that feeding on demand was not only unhealthy – it placed babies at risk of under- or overfeeding – but it was morally and intellectually damaging too. Babies who understood that crying would cause them to be fed would become spoilt, lazy children and adults. Indeed, this points to the infant welfare movement’s more general preoccupation with mothers and motherhood. As the interests of the state were seen, increasingly, as being linked to the proper rearing and education of children, the role of the mother grew in importance. King called his centres ‘shrines to motherhood’, for instance.

But the naturally fussy, over-cautious, and credulous mother was not to be trusted to follow her own instincts: authorities and professionals, who tended to be male, were to provide her with rational, scientific advice on raising her baby. It’s difficult to gauge mothers’ response to the information aimed at them. In her study of mothers in the United States in the 1920s and 1930s, Julia Grant concludes that mothers did heed childcare professionals, but modified their advice according to the views and experiences of their peers. Similarly, mothers in New Zealand took what they wanted from King’s pamphlets on childrearing.

Equally, mothercraft clinics and breastfeeding advice days were well attended by mothers and babies. Several mothercraft centres all over the world also included a dietetic wing, where nursing mothers could stay for up to a fortnight, learning how to breastfeed their babies. There, they would be taught how to breastfeed by the clock, and how to cope with mastitis and painful breasts and nipples. Wonderfully, hospital fees were means tested, so poor mothers could attend for free.

Throughout its existence, the Cape Town dietetic hospital never had an empty waiting list, and similar units in Britain, Australia, and New Zealand were as enthusiastically supported by women. Mothercraft seems to have been at its most successful when mothers could choose how and when they wanted to use its advice and services.

While it’s true that the medicalization of breastfeeding transformed this act into a ‘science’ which needed to be re-taught to mothers – that it became possible to inform a mother that she was breastfeeding incorrectly – and that this was underpinned by misogynistic and eugenicist ideas around childhood, motherhood, and the nation, it is as true that mothers did respond positively to the advice provided by mothercraft and other organisations. Clearly, mothers wanted more advice about how to feed their babies – and they altered that advice to suit their conditions and needs.

It’s for this reason that I think that Thomas is doing mothers a disservice. Encouraging more women to breastfeed needs to respect the fact that women’s choices about how to feed their babies are influenced by a variety of factors and considerations. Thomas – and other breastfeeding evangelicals – seems to buy into the same discourse of maternal irresponsibility as childcare professionals did in the early twentieth century: the belief that women somehow don’t really understand what’s best for their babies, and must be properly educated. Even if her – and others’ – motives are progressive and well-meaning, they still fail to take mothers seriously.

Further Reading

Sources cited here:

Rima D. Apple, Mothers and Medicine: A Social History of Infant Feeding, 1890-1950 (Madison: University of Wisconsin Press, 1987).

Linda Bryder, A Voice for Mothers: The Plunket Society and Infant Welfare 1907-2000 (Auckland: Auckland University Press, 2003).

Julia Grant, Raising Baby by the Book: The Education of American Mothers (New Haven and London: Yale University Press, 1998).

Philippa Mein Smith, Mothers and King Baby: Infant Survival and Welfare in an Imperial World: Australia 1880-1950 (Basingstoke: Macmillan, 1997).

Other sources:

Linda M. Blum, At the Breast: Ideologies of Breastfeeding and Motherhood in the Contemporary United States (Boston: Beacon Press, 1999).

Molly Ladd-Taylor, Mother-Work: Women, Child Welfare, and the State, 1890-1930 (Urbana and Chicago: University of Illinois Press, 1994).

Marilyn Yalom, A History of the Breast (New York: Ballantine Books, 1997).


Real Revolutions

When Keenwa opened in Cape Town last year, much was made of the fact that it serves ‘authentic’ Peruvian food. I put ‘authentic’ in quotes partly because I’ve read far too much Derrida and Foucault, but mainly as a result of some scepticism. I doubt that any of the reviewers who’ve eaten at Keenwa have ever been to Peru, and there’s something odd about deciding how a varied and changing cuisine can be made ‘authentic’. The bobotie I cook has grated apple in it, but a friend’s doesn’t: which is more authentic? Neither, obviously.

I was thinking about this a month ago when I had supper with my friends Katherine and Ricardo in London. Ricardo is from Cuba, and cooked us a Cuban-themed dinner. The only Cuban food I’ve ever eaten was at Cuba Libre, a restaurant and tapas bar in Islington. It’s the kind of place which people recommend by saying ‘it’s not authentic, but….’ I haven’t the faintest idea if it’s authentic (whatever that may be), but it was certainly fun.

The food that Ricardo made showed up the problem with the mania for ‘authenticity’ particularly well. We had fried plantain, tortilla, and congrí. This is, to some extent, the kind of food his family would eat in Cuba, although because he and Katherine are vegetarians, we had tortilla instead of the usual, more meaty accompaniment to the meal (hurrah – I love tortilla), and the congrí was pork-free. It was delicious, but was it any less authentic? You tell me.

Christiane Paponnet-Cantat describes the food eaten in Cuba as ‘contact cuisine’, a concept borrowed from Mary Louise Pratt’s conceptualisation of the colonial space as a cultural and social ‘contact zone’ which ‘treats the relations among colonisers and colonised…not in terms of separateness or apartheid, but in terms of co-presence, interaction, interlocking understandings and practice’. She suggests that Cuban, and colonial cooking more generally, is a manifestation of the complex relationships between different groups of people in colonies.

Congrí is an excellent example of this contact cuisine. As in the rest of the Caribbean, Cuba’s indigenous population was eradicated – by disease and conflict – after the arrival of European colonists during the sixteenth century. Slaves were imported from West and Central Africa to work on sugar plantations. In the nineteenth century, indentured labourers from India replaced slaves. Along with foodstuffs introduced by the Spanish – like rice in the 1690s – these groups brought with them a variety of cuisines.

Congrí – at its most basic, a dish of rice and beans – can be found in various forms around the Caribbean. It’s a version of moros y cristianos (Moors and Christians) and rice and peas. And jollof rice, popular in West Africa, is similar too. The term congrí seems to have originated in Haiti and is a combination of ‘Congo’ and ‘riz’ (the French for rice), suggesting its African origins.

The recipe that Ricardo used for his congrí was by Nitza Villapol. To my shame, I’d never heard of her until Ricardo mentioned one of her best-known recipe books, Cocina al minuto. This was published in 1958, four years after Cocina criolla, the Bible of Cuban cuisine. Villapol seems to have been a kind of Cuban Delia Smith or Julia Child: she was as interested in writing about Cuban cuisine as she was in communicating it to people. She had a long-running television series which aired between 1951 and 1997. (She died in 1998.)

Villapol would be interesting simply on these grounds, but she was also an enthusiastic supporter of the Cuban Revolution. Born into a wealthy family in 1923, she was named after the Russian river Nitza by her communism-supporting father. She spent her early childhood in New York, returning to Cuba with her family at the age of nine. During World War Two she trained as a home economist and nutritionist at the University of London.

This experience of wartime rationing proved to be surprisingly useful. In Cuba, Villapol began her career during the 1950s by teaching cookery classes to young, middle-class brides, and her earliest recipe books emerged out of this work. But after Fidel Castro seized power in 1959, she devoted her formidable talents to teaching a kind of revolutionary cuisine. Tellingly, editions of her recipe books published after 1959 no longer included advertisements for American consumer goods.

Under the new communist regime, food distribution was centralised, and rationing was introduced in 1962. People collected their allowances of food – listed in a libreta (ration book) – from the local bodega, or depot. Villapol’s aim was to teach Cubans how to cook when they had little control over the quantity or the nature of the ingredients they would receive at the bodega. She taught a cuisine developed to underpin the goals of the revolution. Unfortunately, I don’t read Spanish and I haven’t been able to track down any substantial scholarship on Villapol. From what I’ve gleaned, though, it seems to me that she was interested in cooking a form of a ‘traditional’ Cuban cooking – but the cooking of ordinary Cubans, rather than those at the top of the social scale who would, presumably, have favoured American or European dishes as a marker of wealth and sophistication. This elevation of ‘every day’ Cuban food would have meshed well with the aims of the revolution.

By writing recipes for favourites like congrí and flan, Villapol created a kind of canon for Cuban cooking. The popularity – and possibly the ubiquity – of her writing and television programmes meant that not only was she seen as the authority on Cuban cuisine, but she also became the source for all that was (or is) ‘authentically’ Cuban. The irony is that this happened during a time of rationing, when what people ate was determined by supplies available to the state.

This system functioned relatively well until the collapse of communism in Europe in 1989. Peter Rosset et al. explain:

When trade relations with the Soviet Bloc crumbled in late 1989 and 1990, and the United States tightened the trade embargo, Cuba was plunged into economic crisis. In 1991 the government declared the Special Period in Peacetime, which basically put the country on a wartime economy-style austerity program. An immediate 53 percent reduction in oil imports not only affected fuel availability for the economy, but also reduced to zero the foreign exchange that Cuba had formerly obtained via the re-export of petroleum. Imports of wheat and other grains for human consumption dropped by more than 50 percent, while other foodstuffs declined even more.

There was simply not enough food to go around. (A similar set of factors caused the famine in North Korea, a country as dependent on trade with the USSR as Cuba.) As a 1998 article from the sympathetic New Internationalist noted:

The monthly rations from the State for a family of four cost around 50 pesos ($2.15), almost a quarter of the average salary of 214 pesos ($9.30). Food from the bodega is not enough to live on and no-one, neither the Government nor the people, pretends it is. It may take you halfway through the month, but no more.

Even though he was shielded – to some extent – from the worst food shortages because he was in school and university accommodation during the Special Period, Ricardo described what it was like to live while permanently hungry – and entirely obsessed with the next meal, even if it was likely to be thin, watery soup or overcooked pasta. In fact, one of the most traumatic features of the Special Period was that staples like rice and coffee – things which most people ate every day – were no longer available. Some people seem to have made congrí from broken-up spaghetti.

Drawing on her experience of wartime cooking in London, Villapol used her cookery series to show her audience how to replicate Cuban favourites with the meagre rations available to the population. Ricardo mentioned one episode during which she fashioned a steak out of orange peel. She received widespread ridicule for doing this, and I think deservedly so.

Cuba managed to pull itself out of its food crisis by radically reorganising its agricultural sector. The state transformed most of its farms into worker-owned co-operatives which

allowed collectives of workers to lease state farmlands rent free, in perpetuity. Property rights would remain in the hands of the state, and [co-operatives] would need to continue to meet production quotas for their key crops, but the collectives were the owners of what they produced. What food crops they produced in excess of their quotas could be freely sold at newly opened farmers’ markets.

In addition to this, urban agriculture helped to provide a supply of vegetables and pork:

The earlier food shortages and resultant increase in food prices suddenly turned urban agriculture into a very profitable activity for Cubans, and, once the government threw its full support behind a nascent urban gardening movement, it exploded to near epic proportions. Formerly vacant lots and backyards in all Cuban cities now sport food crops and farm animals, and fresh produce is sold from stands throughout urban areas at prices substantially below those prevailing in the farmers’ markets.

Food may not be abundant now, and there are still occasional shortages of particular items, but no-one goes hungry anymore. Cuba does offer a model of a sustainable, largely organic and pesticide-free food system, and we can learn a great deal from it.

I’m interested, though, in how Cuban food has changed as a result of the Special Period. From a quick trawl of the internet, it seems to me that Nitza Villapol still exercises a kind of nostalgic appeal for some Cubans living in Miami – there’s even one woman who’s heroically cooking her way through Villapol’s oeuvre. But for those still in Cuba – and those who experienced the deprivations of the Special Period – she seems to be tainted by association. It’s certainly the case that, despite Villapol’s best efforts, Cuban diets are more meat-heavy and vegetable-poor than ever before. I wonder if this is the effect of the hunger of the nineties: a diet once based mainly on rice and fresh produce has become increasingly focussed on red meat, because meat is associated with plenty – and with having a full stomach.

So where does that leave us on ‘authentic’ Cuban cuisine?

Sources cited here:

Mavis Alvarez, Martin Bourque, Fernando Funes, Lucy Martin, Armando Nova, and Peter Rosset, ‘Surviving Crisis in Cuba: The Second Agrarian Reform and Sustainable Agriculture,’ in Promised Land: Competing Visions of Agrarian Reform, ed. Peter Rosset, Raj Patel, and Michael Courville (Food First Books, 2006), pp. 225-248.

Christiane Paponnet-Cantat, ‘The Joy of Eating: Food and Identity in Contemporary Cuba,’ Caribbean Quarterly, vol. 49, no. 3 (Sept., 2003), pp. 11-29.

Jeffrey M. Pilcher, ‘Tamales or Timbales: Cuisine and the Formation of Mexican National Identity, 1821-1911,’ The Americas, vol. 53, no. 2 (Oct., 1996), pp. 193-216.

Mary Louise Pratt, Imperial Eyes: Travel Writing and Transculturation (London: Routledge, 1992).


Which Formula?

So this is my blog’s thirty-sixth post. And, wow, what a year it’s been. Thank you, dear readers, for staying the course, and I promise more for 2012. This, though, is going to be the last essay for 2011. I’ll be spending December eating, cooking, researching, and teasing the cat. Really, it’s going to be wild. But before the fun begins, I’ll be in the UK for ten days, to present a seminar paper and to do a little research at the amazing Wellcome Library.

My real, live academic research pertains to the history of childhood in the British Empire. My PhD thesis traces the ways in which ideas around childhood and youth changed in the Cape Colony during the second half of the nineteenth century. It pays particular attention to the role and impact of Dutch Reformed evangelicalism in this process. But my postdoctoral project – which is being funded by the National Research Foundation (peace be upon it) – looks at the work of the Mothercraft movement within the British Empire between 1907 and 1945.

Mothercraft was pioneered in New Zealand in 1907 in response to concerns about the very high child mortality rates among the country’s Pākehā population. Dr Truby King devised a twelve-point programme to teach specially trained nurses – known as Plunket nurses in New Zealand and Athlone nurses in South Africa – how to encourage mothers to raise healthy babies. The success of Mothercraft was such that King was invited to establish a Mothercraft Training Centre in Britain in 1917. First called the Babies of the Empire League, it sent its nurses around the Empire: to Canada, Australia, India, east Africa, the Caribbean, and South Africa. My project focuses on the work of the South African Mothercraft Centre and League, which were established in the mid-1920s.

But what, I hear you say, does this have to do with food? Well, a surprising amount. One of the main emphases of Mothercraft was on the proper feeding of babies. King was an enthusiastic promoter of breastfeeding.

We have a misconception that most babies were fed by wet nurses during the nineteenth century. It bolsters the view we have of middle-class Victorian ladies who were so terrified of their own bodies that feeding their babies was simply beyond the pale. This wasn’t strictly true, though. To begin with, wet nurses were expensive to hire and only the very wealthiest families could afford them. Most middle-class women fed their own babies, as did many working-class women.

In fact, the majority of women who relied on others to feed their babies were poor. In a time when working hours were yet to be properly defined by law, long days in factories or shops were the norm for female urban workers. Those without relatives paid ‘baby farms’ – houses run by women who would care for babies and young children – to look after their offspring, often for weeks at a time. The quality of the care in these early crèches was variable: some were good, but many neglected the babies kept there. All over the world, baby farms had astonishingly high mortality rates.

Most of the popular childrearing manuals of the 1800s recommended that women breastfeed their babies. Thomas Bull, the author of the very popular The Maternal Management of Children, in Health and Disease (1840), advocated breastfeeding on the grounds that it benefitted both mother and baby.

The period of suckling is generally one of the most healthy of a woman’s life. But there are exceptions to this as a general rule; and nursing, instead of being accompanied by health, may be the same cause of its being materially, and even fatally, impaired. This may arise out of one of two causes, – either, a parent continuing to suckle too long; or, from the original powers or strength not being equal to the continued drain on the system.

If the mother could not breastfeed, then the best alternative was to hire a wet nurse. Only if this was an impossibility should the child be raised ‘by hand’:

To accomplish this with success requires the most careful attention on the part of the parent, and at all times is attended with risk to the life of the child; for although some children, thus reared, live and have sound health, these are exceptions to the general rule, artificial feeding being in most instances unsuccessful.

Bull acknowledged that the various concoctions fed to babies tended often to undermine, rather than fortify, their health. Popular recipes for baby formulas usually included corn or rice flour mixed to a paste with water or milk. This had little or no nutritional value, and would have been very difficult for immature digestive systems to process. Other popular substitutes were cows’ or goats’ milk, tea, and thin gruel.

It’s little wonder, then, that the Mothercraft programme placed such emphasis on breastfeeding. Many Mothercraft Centres provided beds for new mothers, who could spend up to a fortnight there, learning how to feed their babies.

Around the same time, infant formulas were improving in quality, and producers, most notably Nestlé, began to promote them as a clean, healthy – even healthier – alternative to breastfeeding. Nestlé is credited – rightly or wrongly – with the invention of formula milk in 1867, but the popularity of powdered baby milk only began to grow during the 1940s and 1950s. Nestlé promoted Lactogen through recipe books, pamphlets, and free samples. Problematically, these were usually distributed at hospitals and clinics – precisely the places where women would be taught how to breastfeed. By the middle of the twentieth century, it was increasingly the norm for babies in the West to be bottle fed.

I don’t particularly want to address the fraught debate over whether women should breastfeed or not. I am, though, interested in the politics of bottle feeding in the developing world, where big companies – like Nestlé – have promoted formula assiduously since the 1950s. Here, the issue with bottle feeding is not so much the quality of the formula as the fact that it’s mixed with dirty water or fed to babies in unsterilised bottles. Also, many of the women who use formula can’t afford it, so they water it down, meaning that their children don’t receive adequate nutrition.

In 1974, War on Want published a pamphlet accusing Nestlé of profiting from the deaths of millions of children in poor countries. Three years later, an international boycott of Nestlé began, prompting the World Health Organisation to proscribe the promotion of Lactogen and other formulas in its 1981 International Code of Marketing of Breast-milk Substitutes.

But the Code has been poorly policed, and even in developed nations compliance has been slow. In Australia, for instance, the advertising of baby milk powders only ended in the mid-1990s. There is much evidence to suggest that Nestlé and others continue the practice, albeit under different guises. In the United States, the Special Supplemental Nutrition Program for Women, Infants and Children (WIC) distributes more than half the formula sold in the country every year; companies provide this formula to the WIC at a discount.

All over the world, governments are endorsing breastfeeding in the first six months of life as the best – the healthiest and the cheapest – way of feeding a baby. Companies like Nestlé are actively undermining this, despite the best intentions of the WHO. The implications of the continued use of formula in the developing world are devastating:

According to Save the Children… infant mortality in Bangladesh alone could be cut by almost a third – saving the lives of 314 children every day – if breastfeeding rates were improved. Globally, the organisation believes, 3,800 lives could be saved each day. Given that world leaders are committed to cutting infant mortality by two thirds by 2015 as one of the Millennium Development Goals, protecting and promoting breastfeeding is almost certainly the biggest single thing that could be done to better child survival rates.

A few weeks ago I wrote a post which criticised the World Food Programme’s decision to go into partnership with a range of exceptionally dodgy multinationals – Cargill, Vodafone, Unilever, Yum! Brands – to reduce world hunger. I really don’t have anything against public-private partnerships, and am an enthusiastic supporter of corporate social responsibility, when it’s done well. But it’s deeply concerning that the WFP is providing unwitting PR to a group of particularly nasty businesses.

In a recent article for the Guardian, Felicity Lawrence discusses growing concern about big food companies’ decision to shift their focus to developing markets:

As affluent western markets reach saturation point, global food and drink firms have been opening up new frontiers among people living on $2 a day in low- and middle-income countries. The world’s poor have become their vehicle for growth.

SABMiller, Unilever, and Nestlé have developed campaigns to target poorer markets:

The companies say they are finding innovative ways to give isolated people the kind of choices the rich have enjoyed for years and are providing valuable jobs and incomes to some of the most marginalised. But health campaigners are raising the alarm. They fear the arrival of highly processed food and drink is also a vector for the lifestyle diseases, such as obesity, diabetes, heart disease and alcoholism, which are increasing at unprecedented rates in developing countries.

This is Nestlé’s strategy in Brazil:

Nestlé’s floating supermarket took its maiden voyage on the Amazon last year and has been distributing its products to around 800,000 isolated riverside people each month ever since. Christened Nestlé Até Você – Nestlé comes to you – the boat carries around 300 branded processed lines, including ice creams and infant milk, but no other foods. The products are in smaller pack sizes to make them more affordable. The boat also acts as a collection point for the network of door-to-door saleswomen Nestlé has recruited to promote its brands. Targeting consumers from socioeconomic classes C, D and E is part of the company’s strategic plan for growth, it says. Nestlé has also set up a network of more than 7,500 resellers and 220 microdistributors to reach those at the bottom of the pyramid in the slums of Rio and São Paulo and other major Brazilian cities.

Even if Nestlé does respect the terms of the International Code of Marketing of Breast-milk Substitutes – and I hope it does – not only is it selling unhealthy processed non-foods, but it also gains legitimacy via its partnership with…the United Nations. Earlier this year, Nestlé supported the UN’s ‘Every Woman Every Child’ initiative, which aims to improve child and maternal health. So an organisation implicated in contributing to the high rate of child mortality in the developing world, and in facilitating a global obesity epidemic, is working with the UN…to improve child health.

Merry Christmas.

Further Reading

Texts quoted here:

Thomas Bull, The Maternal Management of Children, in Health and Disease (London: Longman, Orme, Brown, Green, and Longmans, 1840).

Christina Hardyment, Dream Babies: Childcare Advice from John Locke to Gina Ford. Revised ed. (London: Frances Lincoln, 2007).

Virginia Thorley, ‘Commercial Interests and Advice on Infant Feeding: Marketing to Mothers in Postwar Queensland,’ Health and History, vol. 5, no. 1 (2003), pp. 65-89.

Other sources:

Linda Bryder, ‘Breastfeeding and Health Professionals in Britain, New Zealand and the United States, 1900-1970,’ Medical History, vol. 49, no. 2 (2005), pp. 179-196.

Linda Bryder, ‘From Breast to Bottle: A History of Modern Infant Feeding,’ Endeavour, vol. 33, no. 2 (June 2009), pp. 54-59.

Linda Bryder, Not Just Weighing Babies: Plunket in Auckland, 1908-1998 (Auckland: Pyramid Press, 1998).

S.E. Duff, ‘What will this child be? Children, Childhood, and the Dutch Reformed Church in the Cape Colony, 1860-1894’ (PhD thesis, Birkbeck, University of London, 2010).

Nancy Rose Hunt, ‘“Le Bebe en Brousse”: European Women, African Birth Spacing and Colonial Intervention in Breast Feeding in the Belgian Congo,’ The International Journal of African Historical Studies, vol. 21, no. 3 (1988), pp. 401-432.


White Food

Public service announcement: The National Assembly is due to vote on the Protection of State Information Bill on Tuesday, 22 November. Please wear black to show your opposition to the Bill, and join the Right2Know Campaign’s protests against this Draconian piece of legislation. (If you’d like to know more about the Secrecy Bill, check out this post I wrote for FeministsSA.)

One of my favourite places in London is Exmouth Market. It was about a five-minute walk from my amazing hall of residence in Bloomsbury, and its street food – some of the best in the UK, apparently – made a pleasingly delicious lunch from time to time. Its book shop, Clerkenwell Tales, is also excellent.

I think, though, that Exmouth Market is best known as the sometime home of Brindisa, the Spanish delicatessen which is also based in Borough Market, and Moro, the restaurant which more or less introduced the cooking of Spain, North Africa, and the eastern Mediterranean to Britain. Having cooked from the first Moro recipe book, and having read a great deal about its founders, Sam and Sam Clark, I was curious about the restaurant itself, but I never went further than a detailed perusal of its menu: the place was simply far too pricey for my student budget.

Like so many of the young chefs who led the revolution in Britain’s eating habits during the 1990s and early 2000s – Jamie Oliver and Hugh Fearnley-Whittingstall among them – the Clarks had worked at the River Cafe. Founded by Ruth Rogers and Rose Gray, the restaurant was never intended to be more than a canteen for Rogers Stirk Harbour + Partners, the famous architectural firm run by Ruth’s husband, Richard Rogers. But it evolved into something more: the first restaurant in Britain to emphasise the heavily regionalised and seasonal nature of Italian cuisine. The River Cafe imported Ligurian olive oil, cavolo nero, and Pecorino Romano to replicate the cooking of Italy in London.

It could be terribly precious and seemed to confuse eating ‘authentic’ Italian cuisine with some kind of food-based morality. The River Cafe recipe books exuded the restaurant’s self-righteousness, as Julian Barnes explains:

When the first River Cafe Cook Book came out – the blue one – it drew high praise followed by a certain raillery. Some felt they were having a lifestyle package thrust at them; some felt the emphasis on just this kind of olive oil and just those kinds of lentils was a little discouraging. As James Fenton put it in the Independent at the time: ‘I’ve been picking it up and putting it down for weeks now. I can’t say I’ve actually cooked anything from it. More, what I’m doing is deciding whether I can live up to its exacting standards.’

As many pointed out, the food served by the River Cafe, Moro, and others, is, essentially, peasant food. There is something deeply – and amusingly – ironic about the lefty middle classes (and the River Cafe had a deserved association with the rise of New Labour) paying through the nose to eat bread and cabbage soup, a range of cheap cuts of meat, and polenta.

Polenta is a staple of northern Italy and, for all its association with the sophisticated eating of the 1990s, it’s really only cornmeal – or maize, or mielie meal, as we’d call it in South Africa. Partly because of the endless variety of the maize plant, cornmeal comes in both yellow and white and can be ground as finely or as coarsely as tastes demand. In fact, the difference between the yellow, medium-ground cornmeal used to produce polenta, the finer-textured yellow flour used for cornbread in the American south, and the fine, white cornmeal favoured for mielie pap in South Africa is minimal.

People’s preferences for yellow or white cornmeal are, then, culturally determined. A recent article published by the magnificent Mail and Guardian explores South Africa’s taste for whiter, finer maize meal:

In the poorest communities a bag of maize meal is often the only way of satisfying a family’s hunger, and the cost factor plays a role too. An 80kg bag of maize meal is about R400: on a 500g portion a person a day, an extended family of 10 people would consume an 80kg bag in about 16 days. The daily total consumption of maize meal in South Africa is about 10 000 tonnes.

But these maize-meal consumers demand a product that is white – stripped of roughage and nutrients – and manufacturers have remodelled their businesses to serve this demand.

South Africa’s best-selling brand of maize meal is White Star, produced by Pioneer Foods. White Star is whiter and finer than other brands. Premier Foods and Tiger Brands, the country’s other two big producers of maize meal, have also invested in technology which produces this whiter maize meal.

In the pursuit of whiteness, the big millers began installing new-generation degerminators about a decade ago. In the grinding process, the degerminator extracts the greyish germ of the maize, which contains oil and other nutrients. The more of the germ extracted, the whiter and blander the end product.

Maize meal that has the least germ extracted is called ‘unsifted’; moving up the scale it becomes ‘sifted’, ‘special’ and ‘super’. Unsifted and sifted maize-meal products have been discontinued by the bigger millers. ‘Super’ is generally defined by millers as having less than 1% oil and it almost exclusively consists of the starchy endosperm. Degerminators were originally expensive technology used only by large mills, but today even relatively small maize millers have them.

The latest development in the quest for greater whiteness is colour-sorting machines, which examine every grain of maize and remove any discoloured (non-white) grain. …

A manager at Premier Foods’ Kroonstad mill, the largest in the world, said there might nevertheless still be some discoloured specks in the final product, which happened when the seed was white on the outside but had discolouration within.

Removing the germ from the maize meal means that it tastes blander and has a longer shelf life (the germ contains oil which goes off quickly). It also means that the meal is considerably less nutritious – even though South African millers do fortify maize meal and wheat flour with vitamins A, B1, B2, and B6, as well as niacin, folic acid, iron, and zinc. And what happens to the discarded germ? It goes into cattle feed, rendering animal feed more nutritious than human food.
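As an aside, the Mail and Guardian’s arithmetic above is easy to check. Here’s a minimal sketch in Python – my own illustration, not part of the article, using only the figures it quotes:

# Check the Mail and Guardian's maize-meal figures: R400 for an 80kg bag,
# a 500g portion per person a day, and an extended family of ten.
bag_kg = 80
bag_price_rand = 400
portion_kg = 0.5
household = 10

daily_use_kg = portion_kg * household             # 5kg of maize meal a day
days_per_bag = bag_kg / daily_use_kg              # 16 days, as the article says
cost_per_person_per_day = bag_price_rand / (days_per_bag * household)

print(f"One bag lasts about {days_per_bag:.0f} days")
print(f"That's roughly R{cost_per_person_per_day:.2f} per person a day")

On those figures, maize meal works out at roughly R2.50 per person a day – which goes some way to explaining why, as the article puts it, a bag of maize meal is so often the only way of satisfying a family’s hunger.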

This demand for white food is neither particular to South Africa – there is a similar trend in Mexico, for instance – nor is it a recent phenomenon. Historically, food that is white – white bread, white sugar, white rice, or white maize meal – has been more expensive to produce because it needs to be processed to rid it of the impurities or elements which make it darker in colour. White food is therefore associated with wealth and luxury.

The coming of industrialised food production caused an increase in the scale of the adulteration of food to make it go further or seem more appealing. As a result of this, whiteness was associated increasingly with purity. Ironically, though, food producers used poisonous additives like caustic lime to make bread and other products whiter.

The production of food in factories also reduced its price, and this was particularly noticeable for highly processed products like white sugar and white flour. Once these were produced on a mass scale, even the very poor could afford to sweeten their tea with white sugar. Indeed, white bread and sugar came to be seen as ‘affordable luxuries’ from the end of the nineteenth century and into the twentieth. These were comforting, ‘special’ items which could make an already meagre diet seem more luxurious. George Orwell wrote in The Road to Wigan Pier (1937):

The miner’s family spend only tenpence a week on green vegetables and tenpence half-penny on milk (remember that one of them is a child less than three years old), and nothing on fruit; but they spend one and nine on sugar (about eight pounds of sugar, that is) and a shilling on tea. The half-crown spent on meat might represent a small joint and the materials for a stew; probably as often as not it would represent four or five tins of bully beef. The basis of their diet, therefore, is white bread and margarine, corned beef, sugared tea, and potatoes – an appalling diet. Would it not be better if they spent more money on wholesome things like oranges and wholemeal bread…? Yes, it would, but the point is that no ordinary human being is ever going to do such a thing. The ordinary human being would sooner starve than live on brown bread and raw carrots. And the peculiar evil is this, that the less money you have, the less inclined you feel to spend it on wholesome food. A millionaire may enjoy breakfasting off orange juice and Ryvita biscuits; an unemployed man doesn’t. … When you are unemployed, which is to say when you are underfed, harassed, bored, and miserable, you don’t want to eat dull wholesome food. You want something a little bit ‘tasty’. There is always some cheaply pleasant thing to tempt you. Let’s have three pennorth of chips! Run out and buy us a twopenny ice-cream! Put the kettle on and we’ll all have a nice cup of tea! … White bread-and-marg and sugared tea don’t nourish you to any extent, but they are nicer (at least most people think so) than brown bread-and-dripping and cold water. Unemployment is an endless misery that has got to be constantly palliated, and especially with tea, the Englishman’s opium. A cup of tea or even an aspirin is much better as a temporary stimulant than a crust of brown bread.

In the same way, in the midst of rising food prices and a stagnating job market, South Africa’s poor buy white, fine maize meal.

However, there does seem to be a surprising shift in bread sales: lower-income consumers appear to be buying more brown bread, as opposed to the white bread they usually favour. This, though, is probably because brown bread costs less, since it’s exempt from value-added tax – a change driven by necessity rather than by a new set of ideas about white or brown bread.

As Orwell points out, it’s the association of comfort with particular kinds of food which renders them more attractive – even if a diet rich in white sugar and white bread is not at all healthy. A combination of education, affluence, and a new set of values – which associate unprocessed, ‘whole’ foods like wholegrain bread, wholewheat flour, brown or wild rice, and sticky brown sugar with health – causes the middle classes to favour products which are overwhelmingly more nutritious.

It is infinitely strange that former peasant food – like polenta – should be sold at a premium to the middle classes in restaurants, while the poor prefer white maize meal because of its association with luxury and wealth. If we are to encourage more people to eat better, it’s clear that we need to lower the prices of ‘whole’ foods. But changing people’s buying habits has more to do with a set of cultural assumptions about whiteness than with cost, or even with knowledge about nutritional value.

Further Reading

Sources cited here:

Julian Barnes, The Pedant in the Kitchen (London: Atlantic, 2003).

Warren Belasco, Meals to Come: A History of the Future of Food (Berkeley: University of California Press, 2006).

Sidney W. Mintz, Sweetness and Power: The Place of Sugar in Modern History (New York: Penguin, 1985).

George Orwell, The Road to Wigan Pier (London: Gollancz, 1937).

Lorine Swainston Goodwin, The Pure Food, Drink, and Drug Crusaders, 1879-1914 (Jefferson: McFarland & Co., 1999).

Other sources:

Joyce Appleby, The Relentless Revolution: A History of Capitalism (New York: WW Norton, [2010] 2011).

Warren Belasco and Philip Scranton (eds.), Food Nations: Selling Taste in Consumer Societies (New York: Routledge, 2002).

Jack Goody, ‘Industrial Food: Towards the Development of a World Cuisine,’ in Cooking, Cuisine, and Class: A Study in Comparative Sociology (Cambridge: Cambridge University Press, 1982), pp. 154-174.

Harvey A. Levenstein, ‘The Rise of the Giant Food Processors,’ Revolution at the Table: The Transformation of the American Diet (New York: Oxford University Press, 1988), pp. 30-87.

Anne E.C. McCants, ‘Poor Consumers as Global Consumers: The Diffusion of Tea and Coffee Drinking in the Eighteenth Century,’ Economic History Review, vol. 61 (2008), pp. 172-200.

Sidney W. Mintz, ‘Sweet, Salt, and the Language of Love,’ MLN, vol. 106, no. 4, French Issue: Cultural Representations of Food (Sep., 1991), pp. 852-860.

Sidney W. Mintz, Tasting Food, Tasting Freedom (Boston: Beacon Press, 1996).

James Walvin, Fruits of Empire: Exotic Produce and British Taste, 1660-1800 (Basingstoke and London: Macmillan, 1997).


Food Processes

A fortnight ago my mother and I devoted a day to our annual chutney making, and we spent the evening recovering from the inhalation of vinegar fumes in front of the television. We watched the first episode of the new series of Nigel Slater’s Simple Suppers. Being fans of Slater’s recipe books, we had high hopes, but these began to crumble when he remarked conspiratorially to the camera that ‘some people buy jars of pesto.’

We groaned. Of course, pesto out of a jar is never going to be quite as amazing as freshly made pesto. (I’m not going to wade into the tiresome debate over whether pesto made in a food processor is better than that made with a pestle and mortar.) But it’s fine. Really: for a quick, warming supper, it’s absolutely delicious. And, as my father pointed out as he walked past to switch the kettle on, it’s great to be able to support businesses which train people and provide employment.

As an antidote to Slater’s preciousness, I read a couple of Calvin Trillin’s essays from Eating with the Pilgrims, a collection published in Penguin’s newish Great Food series (the one with the beautiful covers). Although he’s also a poet and journalist, Trillin is probably best known for his food writing in the New Yorker. His writing is clear, clever, and deeply sympathetic to others who, like him, love eating. Trillin tends not to write about food itself, but, rather, about how people think about it, as he remarked in an interview: ‘I’m not interested in finding the best chilli restaurant in Cincinnati. I’m interested in Cincinnatians fighting about who has the best chilli.’

What I like about Trillin is that he writes about buffalo wings and barbeque with the same seriousness that other writers devote to stilton or cassoulet:

The sort of eating I’ve always been interested in is what I guess you’d call vernacular eating. It has something to do with a place. Buffalo chicken wings have something to do with Buffalo. The fact that people in Cincinnati have something they call authentic Cincinnati chilli, and seem unaware that people in the Southwest eat chilli, let alone Mexicans, and think that chilli is made by Macedonians and served on spaghetti, that’s interesting to me. Whether Skyline chilli is better than Empress chilli I don’t really care about.

This is Trillin on fried chicken:

Because a superior fried-chicken restaurant is often the institutional extension of a single chicken-obsessed woman, I realize that, like a good secondhand bookstore or a bad South American dictatorship, it is not easily passed down intact. Still, in sullen moments I blame these lamentable closings on the agribusiness corporations’ vertical integration of the broiler industry. In fact, in sullen moments I blame almost everything on the vertical integration of the broiler industry – the way some people trace practically any sort of mischief or natural disaster back to the Central Intelligence Agency, and some people, presumably slightly more sophisticated, blame everything on the interstate-highway program. If the civilisation really is about to crumble, everybody is entitled to his own idea of which is the most significant crack. Which brings us to Kentucky Fried Chicken.

I urge you to read Trillin’s excellent cultural history of buffalo wings and his fantastic account of seeking the best barbequed mutton in Kentucky. My favourite essay, other than his celebration of Shopsin’s, the legendary-despite-its-best-efforts New York restaurant, is about boudin, a staple of Cajun cuisine which is, in its purest form, a kind of sausage made out of pork meat, rice, and liver. (I wish I could provide a link, but the New Yorker has an unfriendly unwillingness to open up its archives.)

These are not particularly sophisticated dishes, and they’re often produced with a heavy reliance on processed foods – pre-packaged seasonings, the inevitable Campbell’s mushroom soup – whose flavours become as important to the finished product as those elements which make boudin or buffalo wings unique. In fact, somewhere between Slater’s snobbery and Trillin’s celebration of deliciousness lies a useful way of thinking about what we mean by processed food.

We know that the cheapness and easy availability of processed food have been blamed, rightly, for facilitating a global obesity epidemic. (Even though the increasing prevalence of obesity can’t logically be described as an ‘epidemic’ – obesity isn’t really catching.) High in salt, preservatives, and calories, most processed food provides eaters with meals which are temporarily filling and satisfying, but without much beneficial nutritional content. In food deserts – areas where low incomes, poor transport infrastructure, and weak distribution networks make access to fresh food very difficult – it’s usually only processed food which is available at corner shops and discount supermarkets.

But, technically, most food that we eat – even ‘good’ food – is processed. I know that blogs have been criticised for simply listing the contents of bloggers’ fridges, but I’m doing this for a reason: with the exception of the eggs, lettuce, leeks, herbs, and cherries in my fridge, the rest of it is processed. This includes the milk and cream (nearly all dairy products are pasteurised and homogenised before they’re sold to the public), blackberry jam, sun dried tomatoes (laugh if you must), butter, Colman’s and Pommery mustard, mum’s and Mrs Ball’s chutney, salami, tomato paste, and the tube of sweetened chestnut puree.

By ‘processed food’ we mean food that is prepared in some way before it’s sold: from the most severely limited run of cured hams, to the strangest possible non-food imaginable. So it’s not all bad. In fact, I’m not sure that most of us would cope without processed food of some variety: I can’t buy raw milk in Cape Town, and I rely on tinned tomatoes and frozen peas. I am not about to make my own couscous, or knit my own yogurt, despite being politically left-wing.

We do, though, eat more processed food than ever before. Since the beginning of the nineteenth century, as food production has become increasingly industrialised – first in the United States and then in the rest of the world – our diets have changed. We eat more of those products which are difficult or time-consuming to prepare at home (bread, pasta), and mass production has made formerly expensive, ‘artisan’ items (Parmesan cheese, chocolate) cheaper and more readily available.

I think that I was surprised by Slater’s snobbery partly because of the lengthy and often quite nostalgic descriptions of the processed food of the 1960s in his memoir Toast. We tend to associate the rise of processed food with the post-war boom: with bizarre recipes for Spam fritters, and a hundred and one ways with Angel Delight. In the modernist 1950s, this was the sophisticated food of the future – the food of the newly prosperous middle classes. Michael Pollan remembers:

The general consensus seemed to be that ‘food’ – a word that was already beginning to sound old-fashioned – was destined to break its surly bonds to Nature, float free of agriculture and hitch its future to Technology. If not literally served in a pill, the meal of the future would be fabricated ‘in the laboratory out of a wide variety of materials,’ as one contemporary food historian predicted, including not only algae and soybeans but also petrochemicals. Protein would be extracted directly from fuel oil and then ‘spun and woven into “animal” muscle – long wrist-thick tubes of “fillet steak.”‘

By 1965, we were well on our way to the synthetic food future. Already the eating of readily identifiable plant and animal species was beginning to feel somewhat recherché, as food technologists came forth with one shiny new product after another: Cool Whip, the Pop-Tart, nondairy creamer, Kool-Aid, Carnation Instant Breakfast and a whole slew of eerily indestructible baked goods (Wonder Bread and Twinkies being only the most famous).

The appeal of cake mixes, tinned macaroni cheese, and, later, boil-in-the-bag meals was that these were quick, labour-saving dinners. As middle-class women entered the workforce in ever-increasing numbers, so eating habits adapted to new work patterns.

The backlash against processed food and industrialised agriculture of the 1970s – in the United States, the largely California-based counter-cuisine, for example – associated the mass production of food with environmental destruction and social inequality. (Poorer people tend to eat the worst processed food.) We’ve since begun to associate the idea of processed food with strange non-foods – with turkey twizzlers and cheese strings – rather than think of it as food which has been prepared in some way, and usually in large quantities, before being sold.

I know that this may seem like a fairly nitpicky point, but we need to acknowledge the extent to which we rely on processed food in order to feed ourselves. Most of us eat better and a greater variety of things because of the mass production of food. To my mind, the more pertinent question is not how we should prevent people from eating processed food, but, rather, how we can make this food better and healthier. Obviously, we need to teach people how to cook healthily – and we have to consider the relationship between eating patterns and the hours that people work. Middle-class foodies and other well-meaning nutrition campaigners must realise that their anti-processed-food stance is not only a kind of snobbery, but also entirely impractical.


Eat the Rich

Today’s City Press includes a fantastically interesting article about the increased incidence of obesity in post-1994 South Africa. The piece explores the links between the country’s transition to democracy and the fact that 61% of all South Africans – 70% of women over the age of 35, 55% of white men 15 years and older, and a quarter of all teenagers – are obese or overweight.

The reasons for these incredibly high levels of obesity are, as the article acknowledges, complex. In many ways, South Africa conforms to a pattern emerging throughout the developing world. In a report published a few months ago, the World Health Organisation noted that lifestyle-related diseases – like diabetes, high blood pressure, heart disease, and obesity – are now among the main causes of death and disease in developing nations. These diseases of affluence are no longer limited to the West.

For the new South African middle classes, fast food and branded processed products, like Coke, are markers of sophistication: of having ‘made it’ in this increasingly prosperous society. But, as in the rest of the world, those at the top of the social scale tend not to be overweight:

contrary to popular myth, obesity is not a ‘rich man’s disease’.

Indeed, the most affluent urbanites can get into their SUVs and drive to gym or to Woolies food hall where, for a price, they can load up their trolleys with fresh, top-quality groceries – from free-range chickens to organic lemons.

This means, says [Prof Salome] Kruger, that ‘the highest income earners are thinner’.

For urban dwellers who earn less, fresh food is usually more difficult, and expensive, to buy than processed non-food:

But for your average city dweller – earning money, but not necessarily enough to own a car to get them out to the major supermarket malls – food is where you find it.

Typically, this is in small corner shops selling a limited, and often more expensive, range of fresh foods. Fruit and veg can be hard to find among the toothpaste and toilet paper spaza staples.

‘R15!’ It’s taxi fare from Orlando to the Pick n Pay in Soweto’s Maponya Mall – and it was 25-year-old road worker Lindiwe Xorine’s reply when City Press asked her how far it was to the nearest supermarket.

We call areas like these, where access to fresh food is limited, ‘food deserts’. It’s entirely possible to buy fruit, vegetables, and free-range meat in South African cities, but high prices and bad transport infrastructure limit people’s ability to purchase these products.

We’re dealing, effectively, with the effects of mass urbanisation since the ending of influx control in the mid-1980s and the 1994 elections.

The migration of South Africans from rural to urban areas has been a key factor in the nation’s radical change of lifestyle habits.

Twenty years ago, restricted by apartheid laws, just 10% of black South Africans lived in urban areas. Today, more than 56% do.

Alison Feeley, a scientist at the Medical Research Council, says this massive shift to a fast-paced urban life has resulted in dietary patterns shifting just as dramatically from ‘traditional foods to fast foods’.

But this isn’t the first time that South Africa – or indeed any other country – has had to cope with the impact of urbanisation on people’s diets. During the nineteenth century, industrialisation caused agricultural workers to abandon farming in their droves and move to cities in search of employment, either in factories or in associated industries. In Britain, this caused a drop in the quality of urban diets. Food supplies to cities were inadequate, and the little food that the new proletariat could afford was monotonous, meagre, and lacking in protein and fresh fruit and vegetables.

One of the effects of this inadequate diet was a decrease in average height – one of the best indicators of childhood health and nutrition – among the urban poor in Victorian cities. In fact, British officers fighting the South African War (1899-1902) had to contend with soldiers who were physically incapable of fighting the generally fitter, stronger, and healthier Boer forces, most of whom had been raised on diets rich in animal protein.

This link between industrialisation, urbanisation, and a decline in the quality of city dwellers’ diets is not inevitable. For middle-class Europeans in cities like London, Paris, and Berlin, industrialised transport and food production actually increased the variety of food they could afford. In the United States, from the second half of the nineteenth century onwards, a burgeoning food industry benefitted poorer urbanites as well. Processed food was cheap and readily available. Impoverished (and hungry) immigrants from Eastern Europe, Ireland, and Italy were astonished by the variety and quantity of food they could buy in New York, Detroit, and San Francisco.

It’s difficult to identify similar patterns in South Africa. We know that the sudden growth of Kimberley and Johannesburg after the discovery of diamonds (1867) and gold (1886) stimulated agriculture in Griqualand West and the South African Republic. Farmers in these regions now supplied southern Africa’s fastest growing cities with food. The expansion of Kimberley and Johannesburg as a result of the mineral revolution was different from that of London or New York because their new populations were overwhelmingly male – on the Witwatersrand, there were roughly ninety men for every woman – and highly mobile. These immigrants from the rest of Africa, Europe, Australia, and the United States had little intention of settling in South Africa. As a result, it’s likely that these urban dwellers weren’t as badly affected by poor diets as their counterparts in the industrialised cities of the north Atlantic.

Cape Town’s slums and squatter settlements were, though, populated by a new urban poor who migrated with their families to the city during the final three decades of the nineteenth century. Most factory workers were paid barely enough to cover their rent. Mr W. Dieterle, manager of J.H. Sturk & Co., a manufacturer of snuff and cigars, said of the young women he employed:

It would seem incredible how cheaply and sparsely they live. In the mornings they have a piece of bread with coffee, before work. We have no stop for breakfast, but I allow them to stand up when they wish to eat. Very few avail themselves of this privilege. They stay until one o’clock without anything, and then they have a piece of bread spread with lard, and perhaps with the addition of a piece of fish.

This diet – heavy on carbohydrates and cheap stimulants (like coffee), and relatively poor in protein and fresh produce – was typical of the city’s poor. It wasn’t the case that food was unavailable: it was just that urban workers couldn’t afford it.

In fact, visitors to the Cape during this period commented frequently on the abundance and variety of fruit, vegetables, and meat on the tables of the middle classes. White, middle-class girls at the elite Huguenot Seminary in Wellington – a town about 70km from Cape Town – drank tea and coffee, ate fruit, and smeared sheep fat and moskonfyt (syrupy grape jam) on their bread for breakfast and supper. A typical lunch consisted of soup, roasted, stewed, curried, or fried meat (usually mutton), three or four vegetables, rice, and pudding.

It’s also worth noting that the Seminary served its meals in the morning, in the middle of the day, and in the evening – something which was relatively new. Industrialisation caused urban workers’ mealtimes to change. Breakfast moved earlier in the day, from the middle of the morning to seven or eight o’clock; lunch (or dinner) shifted from the mid-afternoon to midday; and dinner (or tea) emerged as a substantial meal at the end of the day.

Factory workers in Cape Town ate according to this new pattern as well. The difference was the quality of their diet. A fifteen-year-old white, middle-class girl in leafy Claremont who had eaten an ample, varied diet since early childhood was taller and heavier than her black contemporaries in Sturk’s cigar factory. In all likelihood, she would have begun menstruating earlier, and would have recovered from illness and, later, childbirth far more quickly than poorer young women of the same age. She would have lived for longer too.

Urbanisation changes the ways in which we eat: we eat at different times and, crucially, we eat new and different things. By looking at a range of examples from the nineteenth century, we can see that this change isn’t necessarily a bad thing. The industrial revolution contributed to the more varied and cheaper diets of the middle classes. Industrialised food production and transport caused the urban poor in the United States to eat better than many of those left behind in rural areas, for example. But it’s also clear that urbanisation exacerbates social inequality. In the 1800s, the poor had too little to eat, and that which they did have was not particularly nutritious. Children raised on these diets were shorter and more prone to illness than those who ate more varied, plentiful, and protein-rich food. Now, the diets available to the poor in urbanising societies are just as bad, even if the diseases they contribute to are caused by eating too much rather than too little.

Most importantly, we have an abundance of food in our growing cities. Just about everyone can afford to eat. The point is that only a minority can afford good, fresh food, and has the time, knowledge, and equipment to prepare it. Food mass-produced in factories helped Europe and North America’s cities to feed their urban poor a hundred years ago. I’m not sure that it’s the best solution for the twenty-first century.


Buying Power

In between lecturing, glowering at undergraduates, marking, marking some more, doing research, and marking, I help out with the Right2Know Campaign. Launched about a year ago, Right2Know represents a coalition of individuals, civil society organisations, and community groups who are concerned about the Protection of State Information Bill.

We believe that the Secrecy Bill – as R2K prefers to call it – will undermine all South Africans’ right to access government information, something which is guaranteed by Section 32 of our Bill of Rights. The Secrecy Bill will allow government officials in any ‘organ of state’ – an unpleasant image – or, in other words, any department, parastatal, agency, or institution which is associated with the state, to classify information deemed to be sensitive and potentially threatening to national security. In effect, this means that the Natal Shark Board, the Algoa Bus Company, and even the Johannesburg Zoo would be able to classify information.

Also, the Bill doesn’t include a public interest clause, and the penalties which it seeks to introduce for the leaking of classified information are ludicrously high: whistle-blowers face up to twenty-five years’ imprisonment. I don’t object to legislation which controls access to potentially dangerous information – like the plans for Koeberg or Pollsmoor – but Right2Know is deeply concerned that this Bill will make secrecy, rather than openness, the default position within government. This Bill will have a chilling effect on the media, but it’ll also impact on ordinary people’s ability to hold the state to account.

The campaign has had a significant impact on this piece of legislation. The Bill as it stands now isn’t nearly as draconian as its earliest incarnation last year, and the ANC has now withdrawn the Bill from Parliament altogether. (We do worry, though, about the process of ‘public consultation’ which the ANC is about to begin.)

But I think that our greatest achievement has been mobilising popular opinion against a law the implications of which are not immediately obvious. We’ve managed to get people to march against the Bill, and to pack public information sessions and community meetings. I think that this is partly because the campaign has been fairly successful in causing the ruling party to change its mind. Right2Know has shown how the gathering of ordinary people in large numbers around a particular cause can make a difference.

The Occupy movement shows that when people feel strongly enough about an issue, they’ll take part in protests even if they know that the chances of success are pretty slim. Even so, it’s difficult to counter the criticism that there’s no point in being politically engaged because effecting change is so hard. I think it’s partly for this reason that so many campaigning organisations turn to consumer activism as a way of encouraging people to take action on particular issues: it’s easier to shift buying habits in the name of a cause, and it requires less commitment than other forms of protest. It’s also proven relatively successful. Consumer activism hits companies where it hurts: their profits. Last year’s Greenpeace campaign to persuade Nestlé to stop buying palm oil linked to rainforest destruction caused the food giant to announce that it would not engage in ecologically harmful practices in Indonesia.

Consumer activism around food has existed for as long as the idea of the consumer – as distinct from the customer. I’ve written before about the link between the rise of the American food industry, its increasing use of advertising to promote branded products during the late nineteenth century, and the construction of ‘consumers’. Customers bought oats from the grocer’s bin because they ate porridge for breakfast. Consumers chose Quaker Oats from a range of other brands because they identified with the values associated with that particular product.

One of the effects of the industrialisation of food production – indeed, of the food chain – was a heightened incidence of food adulteration. We know that for centuries shopkeepers and grocers had added bulk to their products to make them go further: ground-up chalk to flour, water to milk or vinegar, sand to sugar, and dried leaves to tea. The difference was that as more food was produced in factories, and it became more difficult to monitor this production, adulteration occurred on a mass scale. In both Britain and the United States, concern about the purity of food grew over the course of the nineteenth century, and with very good reason.

In 1820, Frederick Accum, a German chemist living in London, published A Treatise on Adulterations of Food and Culinary Poisons in which he detailed the extent to which British food producers used harmful – and even potentially deadly – substances to increase the volume and weight of their products, and also improve their appearance. Lead, copper, and mercury salts were used to make adulterated tea and coffee darker, bread whiter, and sweets and jellies more colourful. Thirty years later – and after Accum had fled back to Germany after the furore caused by his book – another group of British scientists found that adulteration was the norm, rather than the exception, in food manufacturing.

One of these, Arthur Hill Hassall, worked as the chief analyst for the gloriously titled Analytical Sanitary Commission, and set about methodically analysing the composition of a range of medicines and manufactured food products. Between 1851 and 1854, Hassall identified alum in bread; iron, lead, and mercury compounds in cayenne pepper; copper salts in bottled fruit and pickles; and Venetian red in sauces, potted meats, and fish. He published his findings in The Lancet, and the public outcry that resulted from his work was partly behind the passing of the first Food Adulteration Act in 1860.

In Britain, efforts to curb the adulteration of food were driven largely by scientists and politicians. Consumer outrage was important in that it encouraged food producers to comply with new regulations around additives, but this was not a consumer-driven campaign. It was, though, in the United States, where the pure food movement was the first manifestation of consumer activism on a national scale. Given its size, influence, and political clout, the American food industry would only change in response to a concerted challenge.

Americans had been aware of a drop in the quality of manufactured food since the middle of the nineteenth century – and understood that this was connected to the fact that food was being processed in factories. As one popular rhyme put it:

Mary had a little lamb, / And when she saw it sicken, / She shipped it off to Packingtown, / And now it’s labelled chicken.

The first people to mobilise against food adulteration were middle-class women in the 1870s. Well-off and well-educated white American women were involved in a range of philanthropic and reform movements during the final decades of the nineteenth century – a period known as the Progressive Era in American historiography. The global temperance movement – which campaigned for the tighter regulation of alcohol sales – was run almost entirely by middle-class ladies who justified their engagement with politics on the grounds that this was an issue relevant mainly to women, and particularly to poor women. Similarly, American women agitated for the regulation of the food industry because supplying households with food was the concern of diligent wives and mothers. Even though many women involved with the temperance and other movements eventually became active in women’s franchise organisations, these campaigns were politically and, to some extent, socially conservative. They were also locally driven, and emerged out of existing social clubs, improvement societies, and charities.

As in Britain, studies carried out by health boards and medical societies found that the contamination of processed food was rife: flour contained ground rice, plaster of Paris, grits, and sand; bread contained copper sulphate and ashes; butter contained copper; cheese contained mercury salts; and lard contained caustic lime and alum. Cayenne pepper was adulterated with red lead and iron oxide; mustard with lead chromate and lime sulphate; and vinegar with sulphuric, hydrochloric, and pyroligneous acids, and burnt sugar. Nice.

These campaigns were grounded in a belief that food producers had become so powerful that the American government needed to step in to protect consumers from them. Even though several states did enact food purity legislation, it became clear that the food industry needed to be regulated at a national level, and a campaign led by the Ladies’ Home Journal and Collier’s, and supported by home economists and others, argued for the introduction of a federal law similar to that in the UK.

Surprisingly, food companies were in favour of this legislation. Not only would it simplify the increasingly complex and contradictory rules operating in different states, but it also gave them the chance to lobby the American government for a law which suited their business interests. In fact, Heinz and other companies actually benefitted from the Pure Food and Drug Act of 1906: they advertised their products – which Heinz sold in clear glass bottles to demonstrate their purity – as the safer, healthier, and purer alternative to the unbranded goods sold by small, local grocers. Heinz, regulated by the American government, was the wholesome choice.

I don’t want to detract from the achievement of the pure food campaigners, but, ironically, their efforts to curb the excesses of the American food industry actually had the effect of strengthening these big processors. So I think that this example of consumer activism is instructive. It’s certainly true that as consumers our ability to withhold or redirect our buying power can cause change, and we should exploit this. But this only works in times of plenty. We’ve seen how sales of organic produce have dropped globally during the recession. Eating ethically is an expensive business.

More importantly, though, consumer activism doesn’t cause us to question the fact that we act – and are seen by our governments – primarily as consumers rather than as citizens. Nor does it interrogate why buying things is believed to be so important: it doesn’t consider consumerism itself. There is mounting evidence to indicate that rampant consumerism does not make for happy societies, and that we need to buy and waste less for the good of our planet.

I was struck recently by a comment made by Yvon Chouinard, the founder of the outdoor clothing company Patagonia, in an interview with The Ecologist: ‘There is no doubt that we’re not going to save the world by buying organic food and clothes – it will be by buying less.’ Consumer activism can only go so far in causing change. We need to question consumerism itself.

Further Reading

Texts quoted here:

Lorine Swainston Goodwin, The Pure Food, Drink, and Drug Crusaders, 1879-1914 (Jefferson, NC, and London: McFarland, 1999).

Harvey A. Levenstein, Revolution at the Table: The Transformation of the American Diet (New York: Oxford University Press, 1988).

Susan Strasser, ‘Customer to Consumer: The New Consumption in the Progressive Era,’ OAH Magazine of History, vol. 13, no. 3, The Progressive Era (Spring, 1999), pp. 10-14.

Other sources:

Warren Belasco and Philip Scranton (eds.), Food Nations: Selling Taste in Consumer Societies (New York: Routledge, 2002).

Jack Goody, ‘Industrial Food: Towards the Development of a World Cuisine,’ in Cooking, Cuisine, and Class: A Study in Comparative Sociology (Cambridge: Cambridge University Press, 1982), pp. 154-174.

Roger Horowitz, Meat in America: Technology, Taste, Transformation (Baltimore: Johns Hopkins University Press, 2005).

Tim Jackson, Prosperity without Growth: Economics for a Finite Planet (London: Earthscan, 2009).

Nancy F. Koehn, ‘Henry Heinz and Brand Creation in the Late Nineteenth Century: Making Markets for Processed Food,’ The Business History Review, vol. 73, no. 3 (Autumn, 1999), pp. 349-393.

Peter N. Stearns, ‘Stages of Consumerism: Recent Work on the Issues of Periodisation,’ The Journal of Modern History, vol. 69, no. 1 (Mar., 1997), pp. 102-117.

Susan Strasser, ‘Making Consumption Conspicuous: Transgressive Topics Go Mainstream,’ Technology and Culture, vol. 43, no. 4, Kitchen Technologies (Oct., 2002), pp. 755-770.

Frank Trentmann, ‘Beyond Consumerism: New Historical Perspectives on Consumption,’ Journal of Contemporary History, vol. 39, no. 3 (Jul., 2004), pp. 373-401.

Creative Commons License Tangerine and Cinnamon by Sarah Duff is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.