The TV series which I most want to watch at the moment is Portlandia. Set in Portland, Oregon, it satirises and celebrates the city which originated the ur-hipster. It includes a scene in a restaurant – which I’ve watched only on YouTube, alas – in which a couple questions their waitress about the provenance of the chicken on the menu. Assured that it’s free range, local, and organic – partly because their waitress provides them with its papers and name – they leave the restaurant to have a look at it.
This is hilarious because it so closely mimics reality: the menus which list the provenance of all the produce used in the restaurant; the farmers’ market stalls with photographs of happy animals pre-slaughter; the recipes which insist upon free-range, organic ingredients.
I laugh, but I’m as implicated as anyone in this hyper-sensitivity about where my food comes from and how it was treated before it arrived on my plate. I don’t want to eat animals that suffered so that I can continue being an omnivore. I eat relatively little meat and am prepared to pay for free-range chicken, pork, and beef. (I’m not terribly fussed about it being ‘organic’ – whatever we may mean by that.)
It is a scandal how animals are treated in factory farms, and increasing demand for red meat is environmentally unsustainable. So how should we eat meat without causing harm? If vegetarianism is implicated in the meat economy – veal is a by-product of the dairy industry, for example – and veganism seems far too difficult, then one way out of this impasse is to consider synthetic alternatives.
I’ve been amused by the overwhelming response to reports about the apparent viability of lab-grown meat. ‘Eeew’ and ‘yuk’ seem to sum up how people feel about it. But lab-grown meat is only the most recent panacea for the world’s food crisis to be produced by scientists – and our views on it say a great deal about our changing feelings about the relationship between food and technology.
The meat in question is being grown by Dr Mark Post at Maastricht University. He’s being funded by an anonymous donor who’s concerned about the greenhouse gas emissions produced by cattle farming. Using stem cells from cows, Post’s team have grown sheets of muscle between pieces of Velcro, which are shocked with an electric current to develop their texture and density:
Post said he could theoretically increase the number of burgers made from a single cow from 100 to 100m. ‘That means we could reduce the number of livestock we use by 1m,’ he said.
Meat grown in the laboratory could have several advantages, because its manufacture is controlled at each step. The tissue could be grown to produce high levels of healthy polyunsaturated fatty acids, or to have a particular texture.
…
He believes it will be a relatively simple matter to scale up the operation, since most of the technical obstacles have already been overcome. ‘I’d estimate that we could see mass production in another 10 to 20 years,’ he said.
Post hopes to produce a burger by October.
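A rough back-of-envelope reading of those figures (my arithmetic, not Post’s) suggests that ‘reduce the number of livestock we use by 1m’ means a millionfold reduction: if one cow conventionally yields about 100 burgers, while cultured muscle grown from a single cow’s stem cells could yield 100 million, then meeting any fixed demand for burgers would require a million times fewer cattle:

\[
\frac{100\,000\,000 \ \text{burgers per cow (cultured)}}{100 \ \text{burgers per cow (conventional)}} = 10^{6}
\]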
When I read the earliest reports about Post’s work, I thought immediately of a scene in Margaret Atwood’s Oryx and Crake, where the protagonist visits a lab which grows chicken breasts out of stem cells. This is a dystopian novel which plays on our suspicion of food grown in laboratories. It seems strange, now, for us to consider synthetic, artificial, man-made food to be superior to all that is ‘fresh’, ‘natural’ and ‘authentic’. But this is a relatively new way of thinking about food.
During the 1950s, a decade when science seemed to offer the possibility of a cleaner, healthier, and better organised world, there was a brief, but intense enthusiasm for Chlorella pyrenoidosa, a high-protein algae which grew rapidly and abundantly and was fed by sunlight and carbon dioxide.
The post-war baby boom gave rise to anxieties in the 1950s that the world would be unable to feed its growing population. Of course, we now know that innovations in agriculture during this period – including the wholesale mechanisation of farming, the increased use of pesticides, hormones, and antibiotics, and the breeding of high-yielding livestock – and the Green Revolution of the 1960s and 1970s produced the crops and farming methods which, at enormous environmental cost, still feed seven billion of us. But at the time, politicians worried that hungry nations would create a politically unstable world.
Algae looked like a sensible solution to the problem. Easy and cheap to grow, and apparently highly nutritious, this seemed to be the Brave New World of food production. Warren Belasco writes:
The alluring news came from pilot projects sponsored by the Carnegie Institution and conducted by the Stanford Research Institute in Menlo Park and by Arthur D. Little, Inc. in Cambridge. Initial results suggested that chlorella algae was an astounding photosynthetic superstar. When grown in optimal conditions – sunny, warm, shallow ponds fed by simple carbon dioxide – chlorella converted upwards of 20 per cent of solar energy…into a plant containing 50 per cent protein when dried. Unlike most plants, chlorella’s protein was ‘complete’, for it had the ten amino acids then considered essential, and it was also packed with calories, fat, and vitamins.
In today’s terms, chlorella was a superfood. Scientists fell over themselves in excitement: Scientific American and Science reported on it in glowing terms; the Rockefeller Foundation funded research into it; and some calculated that a plantation the size of Rhode Island would be able to supply half the world’s daily protein requirements.
In the context of a mid-century enthusiasm for all that was efficient, systematic, and man-made, algae’s appeal was immediate: it was entirely usable and produced little or no waste; its farming was not dependent on variable weather and rainfall; it was clean and could be transformed into something that was optimally nutritious.
So why didn’t I have a chlorella burrito for supper?
Unfortunately, chlorella didn’t live up to the hype. Not only did the production of grains and soybeans increase exponentially during the 1950s, meaning that farmers were loath to switch to a new and untested crop, but further research revealed that chlorella production would be more complicated and expensive than initially envisaged. Growing chlorella in the quantities needed to be financially viable required expensive equipment, and it proved to be susceptible to changes in temperature. Harvesting and drying it was even more of a headache.
On top of this, chlorella tasted terrible. There were some hopes that the American food industry might be able to transform bitter green chlorella into an enticing foodstuff – in much the same way they used additives and preservatives to manufacture the range of processed foods which bedecked the groaning supermarket shelves of 1950s America. Edible chlorella was not a world away from Primula cheese.
Those who were less impressed by the food industry suggested that chlorella could be used to fortify bread and pasta – or even transformed into animal feed. But research demonstrated that heating chlorella destroyed most of its nutrients. Even one of its supporters called it ‘a nasty little green vegetable.’ By the 1960s, it was obvious that at $1,000 a ton, and inedible, chlorella was not going to be the food of the future.
All was not lost for chlorella, though. It proved to be surprisingly popular in Japan, where it is still sold as a nutritional supplement. The West’s enthusiasm for algae also hasn’t dimmed:
The discovery in the 1960s of the blue-green algae spirulina in the Saharan Lake Chad and in Mexico’s Lake Texcoco gave another boost to the health food uses of algae. Spirulina has a high-nutrient profile similar to chlorella’s but without…production problems….
Ironically, the food that was supposed to feed the world is now the preserve of the wealthy, health-conscious middle classes – those who suffer most from the diseases of affluence – who can afford to buy small jars of powdered algae.
I hope that Post’s project manages to create a viable product which can be used to supplement people’s diets. I’m not particularly revolted by the idea of lab-grown meat, and if it means a reduction in the number of factory farms, then that can only be a good thing.
What concerns me more are the potential motives of the businesses which would produce lab-grown meat. If it is taken up by the global food industry – which has a patchy record on environmental sustainability and social responsibility – will we be able to trust it to provide us with meat which is healthy for us, and ethically produced?
Source
Warren Belasco, Meals to Come: A History of the Future of Food (Berkeley: University of California Press, 2006).
My friend Elizabeth and I have breakfast together every Friday morning. For the past month or so, we’ve managed to eat at a different cafe each week – our only criteria being that they’re in central Cape Town and open early. This week we went to The Power and the Glory, a restaurant and club now irredeemably associated with the city’s burgeoning population of hipsters. But it serves an excellent breakfast. (More evidence that hipsters can serve breakfast well.) And it is – inadvertently – immensely entertaining. As I sat at a window, waiting for Elizabeth to arrive, a hipster customer came in to buy a take-away coffee.
The scene was almost a parody of hipster-ness: hipster customer was wearing a high-waisted print skirt, brogues, and an elaborate tattoo; hipster waitress behind the serving counter was in a red vintage frock with a tousled pixie hairdo. Both were very pale, and very skinny. (I think we need a term to describe the extreme thinness of hipsters.) Hipster customer removed her hipster shades and asked for a cappuccino.
An awkward silence fell.
Hipster cafes don’t sell cappuccinos. They sell flat whites. Asking for a flat white is as much an indicator of hipster membership as a subscription to The Gentlewoman.
This left hipster waitress in a difficult position. Should she forgo her hipster principles for a moment, ignore the faux pas and order her customer a flat white? Or should she correct her? Was the hipster customer an influential hipster, and not worth insulting? Or was this the time to establish which of the pair was the real hipster?
The barista, a beefy non-hipster who’d been watching this with some amusement, stepped in. ‘I think you mean a flat white,’ he said.
‘I do!’ said hipster customer.
And all was resolved.
Even if this hilarious moment of hipster awkwardness was very much of its time and place – it was at once typically Capetonian and typical of a particular sub-culture – the fact that it happened over a coffee gives it an almost timeless quality.
Coffee is unusual in that it has managed to remain fashionable since its arrival in Europe at the beginning of the seventeenth century. Flat whites are only the most recent manifestation of cool coffee. They seem to have originated in Auckland in the late 80s, and differ from cappuccinos or lattes – the more familiar, Italianate forms of hot coffee-and-milk – in that the milk is heated until it’s thick and warm, rather than only frothy.
Flat whites arrived in London four or five years ago, with the opening of a series of small coffee shops in the cooler parts of east and central London by Kiwi expats. Chains like Costa and Starbucks have since added flat whites to their menus, but – as hipsters know – a flat white is defined as much by the cafe and the person who makes it as by its ratio of coffee to milk.
And that is the issue. Coffee is coffee, but we’ve come to associate particular meanings with the ways in which we prepare it: consider the difference between someone who buys their coffee from Origin or Truth in Cape Town and someone who only drinks instant, chicory-flavoured Ricoffy with UHT milk. (Which is, incidentally, my idea of culinary hell.) Both are forms of coffee, but they are socially and culturally miles apart. Studying shifting patterns in coffee fashion is fascinating in itself, but these patterns become more interesting when we think of them within the complex networks of trade and finance which allow us to buy coffee at restaurants and in supermarkets.
The coffee craze in Europe in the seventeenth and eighteenth centuries contributed to a boom in the coffee trade. Coffee had been available since the early 1600s, having been imported to Europe from Turkey via Venice. Mixed with milk and sugar, it became popular with the new European middle classes. It was associated with exotic sophistication – and also became a marker of intellectual adventurousness. It’s difficult to overstate the extent to which drinking coffee and the culture and politics of the Enlightenment were entangled, as Anne EC McCants writes:
The expression ‘to break bread together’ now has an archaic feel to it. A proximate contemporary substitute, albeit devoid of the powerful religious significance of bread, is to ‘go out for a cup of coffee’, which is at least as much about conversation as it is about nourishment per se. Historians associate this total reorientation of the culture of food and drink with the substitution of coffeehouses for taverns; the wider dissemination of public news; trading on the stock exchange; new table etiquette and table wares; new arrangements of domestic and public space; the ability to sustain new industrial work schedules despite their tedium….
One of the best depictions of the appeal of the new, middle-class coffee culture is JS Bach’s Coffee Cantata (1732-1735), in which a ‘disobedient’ and ‘obstinate’ young woman’s addiction to coffee so annoys her father that he threatens not to allow her to marry unless she gives up coffee. In the end she agrees, but – without her father knowing – resolves to include a clause in her marriage contract which stipulates that she must have a steady supply of coffee.
The first coffee house in Britain opened in 1650, and within a decade there were around 3,000 of them in London. These were places where men could meet to talk in relative freedom. In 1675, Charles II tried to close them, fearing that coffee house patrons were plotting to overthrow him. (Given his father’s sticky end, paranoia about the middle classes was inevitable.) Monarchical and official suspicion of coffee houses never really ended, though. These were places where the free exchange of information allowed for the dissemination of the Enlightenment ideas that transformed the eighteenth-century world.
But trade was also changing this world. When the Dutch managed to get hold of coffee plants from Arab traders in 1690, they established plantations in Java, where they already cultivated a range of spices. The French began to grow coffee in the West Indies at the beginning of the eighteenth century, and over the course of the next hundred years or so, coffee was planted in West Africa and parts of Latin America.
The plantation system – in many ways the origins of modern capitalism – was dependent on slave labour. Europe’s taste for coffee was satisfied by slavery. But even after the abolition of slavery in the early and mid-nineteenth century, European demand for coffee shaped the economies of countries very far away.
The domestication of coffee consumption in the nineteenth century – when women began to drink coffee, and more of it was served at home – caused demand to spike. Improvements in transport meant that coffee could be shipped over longer distances far quicker and in greater quantities than ever before. During the 1820s and 1830s, coffee cultivation became a way of linking the economies of newly independent nations in Latin America to global trade. Coffee production in Guatemala, Nicaragua, Costa Rica, and El Salvador increased exponentially, and governments introduced measures to facilitate the industry: new transport infrastructure, tax breaks for landowners, low or no export duties, and legislation to lower the cost of labour.
Plentiful land and cheap labour were secured by progressively disenfranchising Indian populations, whose right to own property and to work where they pleased was eroded by pro-plantation legislation. Uprisings against governments and landowners were stamped out – usually with the help of the military. The argument for increased coffee production just seemed so compelling. By the end of the nineteenth century, ninety per cent of the world’s coffee came from South America.
Brazil was the largest single Latin American supplier of coffee, and from 1906 onwards it controlled the international coffee trade. The Brazilian government bought up beans, stockpiled them, and then released them into the market, thereby regulating the coffee price. European and North American countries encouraged African countries to begin cultivating coffee on a grander scale too.
African producers tended to grow Robusta coffee varieties, which are generally hardier, but less tasty, than the Arabica coffee produced in Latin America. This meant that when demand for instant coffee grew in the 1950s, coffee production in postcolonial African states, whose governments subsidised coffee farmers and facilitated the free movement of labour, flourished. The entry of African coffee growers into the world market meant that the price began to plummet – and the Kennedy administration in the US realised that this was an ideal opportunity for some Cold War quiet diplomacy.
The 1962 International Coffee Agreement was meant to stabilise Latin American economies and to immunise them against potential Soviet-backed revolutions by introducing production quotas for every major coffee producing nation. Even if the ICA did include African producers, it favoured the US and Brazil, effectively giving them veto rights on any policy decisions.
The collapse of the Agreement in the late eighties – partly as a result of the increased production of non-signatories, like Vietnam – caused a major decline in the price of coffee. For consumers and cafe owners, this was a distinctly good thing: good coffee was cheaper than ever before. Coffee shops in the US, in particular, fuelled a demand for good, ‘real’ coffee.
But for Rwanda, the collapse of the international coffee price and the end of regulation had disastrous implications. In 1986 and 1987, Rwanda’s annual coffee sales more than halved. The government was bankrupted and increasingly dependent on aid from international institutions, including the World Bank, which demanded the privatisation of state enterprises, cuts in government spending, and trade liberalisation. (Hmmm – sound familiar?) The government could no longer fund social services, and schools and hospitals closed. This exacerbated existing political tensions and created a large unemployed population, many of whom became volunteers for the paramilitary groups which carried out the genocide in 1994.
It’s supremely ironic that Rwanda has turned – again – to coffee to pull itself out of the disaster of the nineties. This time, though, coffee is being produced in ways which are meant to be more sustainable – both ecologically and economically. There are, though, problems with this. Isaac A. Kamola writes:
However, widely lauded ‘fair-trade’ coffee is not without its own contradictions. First, fair-trade coffee is an equally volatile market, with much of the additional price paid to growers dependent upon goodwill consumption. Such consumption patterns are highly vulnerable to economic fluctuations, changes in cultural and ethical patterns, education campaigns, and individual commitment. Furthermore, fair-trade coffee also faces an oversupply problem, with more fair-trade coffee being produced than there are consumers of it.
In Mexico, for instance, the current instability in global food prices – caused partly by food speculation – is placing incredible pressure on small farmers who cultivate coffee: the fluctuating coffee price has shrunk their incomes at a time when maize has never been so expensive. And even prosperity brings problems. Kenyan coffee is of particularly good quality, and the increase in the coffee price has benefitted local farmers. It has also brought an increase in crime, as gangs steal coffee berries and smuggle them out of the country.
Demand abroad fuels coffee production in Africa, Latin America, and elsewhere. No other commodity demonstrates the connectedness of global patterns of consumption and production as clearly as coffee. As Kamola points out, we need to make this system fairer, but the fair-trade model still ensures that African farmers are dependent on demand abroad:
This does not mean that fair trade should be discouraged. It should be underscored, however, that reforms in First World consumption patterns are not alone sufficient to ensure the protection of people from the violent whims of neoliberal markets.
As much as coffee is associated with sophistication in the West – as much as it helped to facilitate the Enlightenment – it has also been the cause of incredible deprivation and suffering elsewhere. Invented in New Zealand, popularised in the UK, and made from Rwandan beans certified by the Fairtrade Foundation based in London, a flat white in Cape Town tells a global story.
Further Reading
Sources cited here:
Anne E.C. McCants, ‘Poor consumers as global consumers: the diffusion of tea and coffee drinking in the eighteenth century,’ Economic History Review, vol. 61, no. 1 (2008), pp. 172-200.
Isaac A. Kamola, ‘Coffee and Genocide,’ Transition, no. 99 (2008), pp. 54-72.
Dale Pendell, ‘Goatherds, Smugglers, and Revolutionaries: A History of Coffee,’ Whole Earth (June 2002), pp. 7-9.
Craig S. Revels, ‘Coffee in Nicaragua: Introduction and Expansion in the Nineteenth Century,’ Conference of Latin Americanist Geographers, vol. 26 (2000), pp. 17-28.
Other sources:
Joyce Appleby, The Relentless Revolution: A History of Capitalism (New York: WW Norton, [2010] 2011).
Merid W. Aregay, ‘The Early History of Ethiopia’s Coffee Trade and the Rise of Shawa,’ The Journal of African History, vol. 29, no. 1, Special Issue in Honour of Roland Oliver (1988), pp. 19-25.
Roy Love, ‘Coffee Crunch,’ Review of African Political Economy, vol. 26, no. 82, North Africa in Africa (Dec. 1999), pp. 503-508.
Sidney W. Mintz, Sweetness and Power: The Place of Sugar in Modern History (New York: Penguin, 1985).
Stefano Ponte, ‘Behind the Coffee Crisis,’ Economic and Political Weekly, vol. 36, no. 46/47 (Nov. 24-30, 2001), pp. 4410-4417.
Wolfgang Schivelbusch, Tastes of Paradise: A Social History of Spices, Stimulants, and Intoxicants, trans. David Jacobson (New York: Random House, 1992).
James Walvin, Fruits of Empire: Exotic Produce and British Taste, 1660-1800 (Basingstoke and London: Macmillan, 1997).
Jonathan Gold, the Pulitzer Prize-winning restaurant critic for the L.A. Weekly…is an extreme eater—organs, hallucinogenic heat, and parts still moving—and a devoted one. He visits three to five hundred restaurants every year. It’s a family affair. His children, Isabel and Leon, suckled on tripas de leche (small intestine filled with undigested cow’s milk) and Sichuan food from the San Gabriel Valley, beg to eat at home.
I’m not entirely sure what it says about me, but the first article I read in the Observer is always Jay Rayner’s restaurant review. (In fact, I started reading the Observer in high school because of Jay Rayner’s reviews – it came as a pleasant surprise that there was a really good newspaper organised around them.) Last week’s was on Viajante in Bethnal Green, which seems to specialise in a kind of complicated, miniaturised, sub-Adrià-esque cuisine. Rayner was not impressed:
In its eagerness to be so very now and forward thinking, the food at Viajante manages at times to feel curiously dated; it recalls the first flush of Hestomania, when even he has moved on and is now cooking up big platefuls of heartiness at Dinner.
Modern techniques are great. They’re brilliant. If you want to cook my steak by banging it round the Large Hadron Collider, be my guest. Dehydrate my pig cheeks. Spherify my nuts. But only do so if the result tastes nicer. At Viajante deliciousness is too often forced to give way to cleverness.
Rayner’s point is that the modernist cooking presented by Viajante is beginning to feel old hat. Even if – as he’s admitted – restaurant critics are ‘rampant neophiliacs,’ it does seem that enthusiasm for the molecular gastronomy espoused most famously by Heston Blumenthal and Ferran Adrià has peaked. Or that, rather, it’s become so integrated into the repertoires of high-end chefs that it no longer seems to be so very experimental.
If anything, this should be postmodern cuisine. The purpose of molecular gastronomy is to reconsider the processes which underpin cooking: to understand them, and then reconfigure them. It’s all fairly similar to Derrida’s deconstruction – and Adrià has described his technique in precisely the same terms.
When I was in London at the end of last year, I went with a friend to the V&A’s exhibition, ‘Postmodernism: Style and Subversion, 1970-1990’. It was a strange exhibition: in an attempt to cover all that could be considered postmodern in design and architecture, it took a scattergun approach to what it included. It felt curiously empty – but I’m not sure if that’s the fault of the curator, or of the movement itself.
One of the oddest features of the exhibition was a strange preponderance of teapots. It was a pity that this was as far as the V&A got in thinking about postmodernism and food – because nouvelle cuisine, the food of the postmodern moment, was so design-heavy. Even if the point of nouvelle cuisine was to liberate high-end cuisine from the heavy, meaty, flour-based-sauce cooking of the 1960s and 1970s, it was also characterised by incredibly careful plating and presentation. In many ways, garnishes were as important as the food itself.
There are strong links, I think, between nouvelle cuisine and molecular gastronomy. Both disregard the orthodoxy established by classic French cooking and experiment with ideas and ingredients from other culinary traditions – best exemplified by the late-90s enthusiasm for ‘fusion food’, done well by Peter Gordon, done badly by legions of others – and with the techniques of cooking itself. Apart from being underpinned by the work of the scientists Hervé This and Nicholas Kurti, molecular gastronomy also differs from nouvelle cuisine in its playfulness – its refusal to take itself seriously – something which places it firmly within the postmodern moment. But, as Rayner suggests, it would seem that molecular gastronomy has had its day: Adrià has transformed El Bulli into a foundation, and Blumenthal is serving hearty, historical meals at Dinner.
Two years ago I taught an introduction to historiography at Goldsmiths in London, and was struck by how dated postmodern theory felt. When I studied it a decade ago – crucially, pre-9/11 – it seemed, even then, to be an exciting and useful way of understanding the world, particularly because of its emphasis on the relationship between language and power. I didn’t – and still don’t – agree with the critiques of history offered up by Hayden White and Keith Jenkins, but they were thought-provoking.
After the events of 11 September 2001, the War on Terror, the 2008 economic crash, and the Arab Spring, postmodernism appears even more the product of its time: of the prosperous, confident 1980s and 1990s, when the end of communism seemed to signal Francis Fukuyama’s end of history. I find it easier to take seriously the postmodernism and poststructuralism of the 1970s and earlier – when philosophers, linguists, and theorists were attempting to find a new way of thinking about reality – partly by emphasising the extent to which narratives and discourses are contingent and rooted in their particular contexts. Jean-François Lyotard’s The Postmodern Condition (1979) is still an arrestingly original document.
This act of de-privileging dominant discourses – or indeed any discourse – has also been its undoing, as Edward Docx argues in a recent article for Prospect:
by removing all criteria, we are left with nothing but the market. The opposite of what postmodernism originally intended. … If we de-privilege all positions, we can assert no position, we cannot therefore participate in society or the collective and so, in effect, an aggressive postmodernism becomes, in the real world, indistinguishable from an odd species of inert conservatism.
So what follows postmodernism? Docx suggests that it is something he dubs ‘authenticism’. He explains:
we can detect this growing desire for authenticity all around us. We can see it in the specificity of the local food movement or the repeated use of the word ‘proper’ on gastropub menus. We can hear it in the use of the word ‘legend’ as applied to anyone who has actually achieved something in the real world. … We can identify it in the way brands are trying to hold on to, or take up, an interest in ethics, or in a particular ethos. … Values are important once more…
…we can see a growing reverence and appreciation for the man or woman who can make objects well. We note a new celebration of meticulousness…. We uncover a new emphasis on design through making…. Gradually we hear more and more affirmation for those who can render expertly, the sculptor who can sculpt, the ceramist, the jeweller, even the novelist who can actually write.
It’s telling that the various manifestations of the new, global food movement – from Occupy Food to the hundreds of local campaigns for small-scale agriculture and unadulterated food – tend to refer to themselves as ‘real food’ (as opposed to Big Food – or the plastic, ‘Frankenstein’ food it produces).
This is a good way of understanding the recent trend in food – which Docx identifies – for the artisanal (whatever we may mean by that), the handmade, the local, the ‘old-fashioned’ (again, open to debate and redefinition), and the ethical. It says a great deal that the chef of the moment is René Redzepi, the Danish chef and owner of Noma, who sees himself as a food activist as much as a cook. This demand for ‘authentic’ food is, strange as it may seem, political: it’s a refusal to buy into the advertising and branding of the food industry, even if it’s an act that only a very small proportion of people can afford. But it’s a beginning, and a welcome one.
The World Food Programme spends £50 million on wheat from Glencore – a business which admits that it engages in food speculation – despite the WFP’s commitment to buying its supplies from small farmers. Was Glencore the best option?
This week José Graziano da Silva, the Director General of the UN’s Food and Agriculture Organisation, announced that the famine in Somalia has ended. A combination of good rain, the most successful harvest in seventeen years, and the effective dispersal and deployment of food and agricultural aid means that most Somalis now have adequate access to food. But this is likely to be a temporary reprieve: it’s uncertain if food stocks will last until April, when the next rainy season begins and the main planting is done.
This already fragile situation is compounded by Somalia’s complicated politics: the southern part of the country is still controlled by the Islamist group al-Shabaab, which banned the Red Cross from operating in the area this week, and has disrupted food supplies in the past. Tellingly, around half of the 2.34 million people still in need of humanitarian assistance and seventy per cent of the country’s acutely malnourished children are in southern Somalia.
The end of the famine is no cause for celebration, then. Thirty-one per cent of the Somali population remains reliant on food aid, famine looms in another three months, and there are the after-effects of the famine to cope with: the plight of the refugees scattered around Somalia, Ethiopia, and Kenya; and the generation of malnourished children.
It’s estimated that between 50,000 and 100,000 people died in this famine, half of them children.
Clearly, something isn’t working.
And as one famine comes to an end – or, at least, a halt – in East Africa, another one seems to be developing on the other side of the continent. Niger, and, indeed, its neighbours Chad and Mali, is both drought- and famine-prone. Even in good years, it struggles to feed itself. Fifteen per cent of the world’s malnourished children live in Niger. But poor rainfall at the end of 2011 and a spike in global food prices means that the country’s population faces famine.
Niger’s last famine was in 2010, when the World Food Programme provided food to 4.5 million people. But things seem more hopeful there than in Somalia, largely because Niger has a government which functions relatively well. Realising that it needs to store its food supply properly, provide jobs so that its population can afford to buy food, and limit the growth of its population, the government of Niger is introducing measures to improve people’s access to food. One new piece of legislation will make it compulsory for children to remain in school until the age of sixteen, partly because of the strong link between girls’ education and declining family size.
Somalia’s weak and ineffectual government can’t do anything to prevent famine from occurring there again. With all the will in the world, there is no way that Somalia’s food crisis will end until its political situation stabilises.
The comparison of Niger and Somalia is particularly useful for demonstrating the extent to which responses to famine – from the media, NGOs, charities, and other international organisations – are heavily politicised. Reporting on the Niger famine in 2010 was fairly muted and I’ve only seen a couple of references to its most recent food crisis. Somalia, though, never seems to be out of the news. The reason for this is depressingly simple:
Niger, the large West African country whose name is best known for being just one unfortunate letter away from a pejorative racial insult, has a few terrorists, but not enough to really matter. Elements from al Qaeda in the Islamic Maghreb wander across Niger’s border every now and then, taking advantage of the large desolate areas which characterise most of the country, but for the most part its contribution to the War on Terror is minimal.
Al-Shabaab is loosely affiliated to al Qaeda, and the United States fears that the Horn of Africa could prove to be a useful base for planning future terrorist activities. It probably helps that Somalia has media-friendly pirates, too.
So all famines aren’t equal. All famines are complicated. Indeed, the whole question of ‘hunger’ is complex. I was amused to note that Monday marks the beginning of the WFP’s Free Rice Week. The project encourages individuals to play a game on a website; for every correct answer, Free Rice Week’s sponsors donate ten grains of rice to the WFP. The project’s aims are to ‘provide education to everyone for free’ – hmm… OK, it includes some basic, if vague, information about ‘hunger’ – and to ‘help end world hunger by providing rice to hungry people for free.’
Huh?
So this is going to end world hunger by giving all hungry people rice?
Seriously?
Quite apart from the fact that it would be as effective – or even more so – for the project’s sponsors and participants to skip the cute competition and simply donate rice to the WFP (or, even better, to a local feeding scheme or food bank), this really isn’t going to end world hunger.
I know that this seems like a soft target to shout at, and, really, there’s nothing wrong with donating food or money to the WFP, but my annoyance with projects and competitions like this one stems from the fact that they’re dishonest. There is no way that Free Rice Week is going to end world hunger. It’s a pity that the WFP sees fit to inform people that by taking part in it they’re contributing to solving the food crisis.
In fact, I think that Free Rice Week and other, similar projects actually contribute to the problem.
Firstly, they fudge the meaning of ‘hunger’. Over the past year or so, we’ve become familiar with the FAO’s horrifying statistic that one billion people go hungry every day – that one sixth of the world’s population does not have adequate access to food. But there are problems with this statistic:
it is not the only way to measure food insecurity. Over the years, it has been criticised on many fronts: for the poor quality of underlying data; for the focus on calorie intake, without consideration of proteins, vitamins and minerals; and for the emphasis on availability – rather than affordability, accessibility or actual use – of food. Some say we’d be better off focusing on improving household consumption surveys, opinion polls, and direct measures of height and body weight.
These figures need to be accurate because they ‘are also used to help guide where to send foreign aid, track progress towards international development goals, and hold governments to account for promises made.’
Moreover, it glosses over the fact that there are many kinds of hunger: the extreme events – the famines – which are the products of natural disasters, conflict, and state collapse; the hunger which is the product of poor diets and an inability to buy or access enough food; and the hunger in developed nations. In Britain and the United States, the number of people now reliant on food stamps and food banks has spiked during the recession.
Secondly, these projects ignore the fact that responding to various kinds of hunger requires far, far more than throwing money at the problem. In fact, the WFP’s website even acknowledges this: ‘People can go hungry even when there’s plenty of food around. Often it’s a question of access – they can’t afford food or they can’t get to local markets.’ Famines in the twentieth and twenty-first centuries occur as a result of a collapse of distribution systems, usually caused by conflict or a crisis in government. Famines tend not to happen in stable democracies. The WFP must receive money for food aid – that is absolutely non-negotiable – but long-term change, as we’ve seen in the cases of Somalia and Niger, can only occur once stable, effective governments are in place. No amount of free rice is going to end famine in Somalia.
In other cases of hunger, it’s clear that people are simply too poor to buy food: employment, education, good health systems, and higher wages will go far in remedying this situation. But even then, we have to accommodate the choices that poor people make when spending their money. In an article for Foreign Policy’s special edition on food last year, Abhijit Banerjee and Esther Duflo took a closer look at the lives of the ‘one billion hungry’ and came to some interesting conclusions:
We often see the world of the poor as a land of missed opportunities and wonder why they don’t invest in what would really make their lives better. But the poor may well be more sceptical about supposed opportunities and the possibility of any radical change in their lives. They often behave as if they think that any change that is significant enough to be worth sacrificing for will simply take too long. This could explain why they focus on the here and now, on living their lives as pleasantly as possible and celebrating when occasion demands it.
We asked Oucha Mbarbk [a Moroccan peasant] what he would do if he had more money. He said he would buy more food. Then we asked him what he would do if he had even more money. He said he would buy better-tasting food. We were starting to feel very bad for him and his family, when we noticed the TV and other high-tech gadgets. Why had he bought all these things if he felt the family did not have enough to eat? He laughed, and said, ‘Oh, but television is more important than food!’
We need to take people’s choices about how they spend their limited funds more seriously.
Thirdly, by focussing on raising funds, the WFP transforms itself into a philanthropic organisation. Donations of food and other forms of humanitarian aid are absolutely necessary to alleviating food crises, but they won’t end these crises – or end ‘hunger’ (whatever we may mean by that). In an excellent article for the Guardian, the UN’s Special Rapporteur on the Right to Food, Olivier de Schutter, argues:
our global food system…is in crisis. Last year’s famine in the Horn of Africa, and the current woes in the Sahel, are the surface cracks of a broken system. These regional outbreaks of hunger are not, as such, extreme events.
Beyond semantics, this is a crucial distinction. In viewing these events as extreme and unexpected, we fail to acknowledge the regularity and predictability of hunger. This flaw is fatal, for it means failing to acknowledge that the food system itself is broken. It means failing to build readiness for persistent famine into international development and humanitarian policy. And it means waiting until people starve before doing anything.
Food aid doesn’t address the deeper, structural problems underlying the food crisis. It doesn’t consider bad governance; the impact of food speculation on rising food prices; and agricultural efficiency, particularly in the light of climate change.
By appealing to people to donate money to fund their response to food crises – which could have been avoided – the WFP and others cast hunger as something which can be remedied with old-fashioned philanthropy. It’s certainly true that philanthropic organisations can do immensely good work – like reducing rates of polio and malaria in the developing world. But this doesn’t necessarily solve the problems which give rise to these crises:
the poor are not begging us for charity, they are demanding justice. And when, on the occasion of his birthday, a sultan or emperor reprieved one thousand prisoners sentenced to death, no one ever called those pardons justice. Nor is it justice when a plutocrat decides to reprieve untold thousands from malaria. Human beings should not have to depend upon a rich man’s whim for the right to life.
Precisely. The world’s poor should not be dependent on the goodwill of wealthy people who have the time and inclination to play games on the internet.
This demonstrates nicely how bad writing, poor research, and a slightly odd understanding of food can combine to produce something truly ridiculous:
Inviting a partner to play a game with food can therefore be an erotic experience. It’s easily done. For example, forget cutlery and table manners and sink your teeth into a rib of beef or a shin of veal while gazing into each other’s eyes: meat invites us to abandon our usual obsession with hygiene.
Take a mental note: the more a person is inclined to dirty himself with food, the more passionate is his nature. At a dinner for two, a piece of meat on a bone held firmly in one hand so that it can be gnawed at detracts nothing from civilized man. On the contrary, it jolts him out of that somewhat affected composure which inhibits a healthy expression of his instincts.
So, let this be a homage to meat, for the role it plays in delivering metropolitan man from his excessive aplomb.