Brave New Food
The TV series which I most want to watch at the moment is Portlandia. Set in Portland, Oregon, it satirises and celebrates the city which originated the ur-hipster. It includes a scene in a restaurant – which I’ve watched only on YouTube, alas – in which a couple questions their waitress about the provenance of the chicken on the menu. Assured that it’s free range, local, and organic – partly because their waitress provides them with its papers and name – they leave the restaurant to have a look at it:
This is hilarious because it so closely mimics reality: the menus which list the provenance of all the produce used in the restaurant; the farmers’ market stalls with photographs of happy animals pre-slaughter; the recipes which insist upon free-range, organic ingredients.
I laugh, but I’m just as implicated in this hyper-sensitivity about where my food comes from and how it was treated before it arrived on my plate. I don’t want to eat animals that suffered so that I can continue being an omnivore. I eat relatively little meat and am prepared to pay for free-range chicken, pork, and beef. (I’m not terribly fussed about it being ‘organic’ – whatever we may mean by that.)
It is a scandal how animals are treated in factory farms, and increasing demand for red meat is environmentally unsustainable. So how should we eat meat without causing harm? If vegetarianism is implicated in the meat economy – veal is a by-product of the dairy industry, for example – and veganism seems far too difficult, then one way out of this impasse is to consider synthetic alternatives.
I’ve been amused by the overwhelming response to reports about the apparent viability of lab-grown meat. ‘Eeew’ and ‘yuk’ seem to sum up how people feel about it. But lab-grown meat is only the most recent panacea for the world’s food crisis to be produced by scientists – and our views on it say a great deal about our changing feelings about the relationship between food and technology.
The meat in question is being grown by Dr Mark Post at Maastricht University. He’s being funded by an anonymous donor who’s concerned about the greenhouse gas emissions produced by cattle farming. Using stem cells from cows, Post’s team have grown sheets of muscle between pieces of Velcro, which are shocked with an electric current to develop their texture and density:
Post said he could theoretically increase the number of burgers made from a single cow from 100 to 100m. ‘That means we could reduce the number of livestock we use by 1m,’ he said.
Meat grown in the laboratory could have several advantages, because its manufacture is controlled at each step. The tissue could be grown to produce high levels of healthy polyunsaturated fatty acids, or to have a particular texture.
…
He believes it will be a relatively simple matter to scale up the operation, since most of the technical obstacles have already been overcome. ‘I’d estimate that we could see mass production in another 10 to 20 years,’ he said.
Post hopes to produce a burger by October.
When I read the earliest reports about Post’s work, I thought immediately of a scene in Margaret Atwood’s Oryx and Crake, where the protagonist visits a lab which grows chicken breasts out of stem cells. This is a dystopian novel which plays on our suspicion of food grown in laboratories. It seems strange, now, for us to consider synthetic, artificial, man-made food to be superior to all that is ‘fresh’, ‘natural’ and ‘authentic’. But this is a relatively new way of thinking about food.
During the 1950s, a decade when science seemed to offer the possibility of a cleaner, healthier, and better organised world, there was a brief, but intense enthusiasm for Chlorella pyrenoidosa, a high-protein algae which grew rapidly and abundantly and was fed by sunlight and carbon dioxide.
The post-war baby boom gave rise to anxieties in the 1950s that the world would be unable to feed its growing population. Of course, we now know that innovations in agriculture during this period – including the wholesale mechanisation of farming, the increased use of pesticides, hormones, and antibiotics, and breeding high-yielding livestock – and the Green Revolution of the 1960s and 1970s produced the crops and farming methods which, at enormous environmental cost, still feed seven billion of us. But at the time, politicians worried that hungry nations would create a politically unstable world.
Algae looked like a sensible solution to the problem. Easy and cheap to grow, and apparently highly nutritious, this seemed to be the Brave New World of food production. Warren Belasco writes:
The alluring news came from pilot projects sponsored by the Carnegie Institution and conducted by the Stanford Research Institute in Menlo Park and by Arthur D. Little, Inc. in Cambridge. Initial results suggested that chlorella algae was an astounding photosynthetic superstar. When grown in optimal conditions – sunny, warm, shallow ponds fed by simple carbon dioxide – chlorella converted upwards of 20 per cent of solar energy…into a plant containing 50 per cent protein when dried. Unlike most plants, chlorella’s protein was ‘complete’, for it had the ten amino acids then considered essential, and it was also packed with calories, fat, and vitamins.
In today’s terms, chlorella was a superfood. Scientists fell over themselves in excitement: Scientific American and Science reported on it in glowing terms; the Rockefeller Foundation funded research into it; and some calculated that a plantation the size of Rhode Island would be able to supply half the world’s daily protein requirements.
In the context of a mid-century enthusiasm for all that was efficient, systematic, and man-made, algae’s appeal was immediate: it was entirely usable and produced little or no waste; its farming was not dependent on variable weather and rainfall; it was clean and could be transformed into something that was optimally nutritious.
So why didn’t I have a chlorella burrito for supper?
Unfortunately, chlorella didn’t live up to the hype. Not only did the production of grains and soybeans increase exponentially during the 1950s, meaning that farmers were loath to switch to a new and untested crop, but further research revealed that chlorella production would be more complicated and expensive than initially envisaged. Growing chlorella in the quantities needed to be financially viable required expensive equipment, and it proved to be susceptible to changes in temperature. Harvesting and drying it was even more of a headache.
On top of this, chlorella tasted terrible. There were some hopes that the American food industry might be able to transform bitter green chlorella into an enticing foodstuff – in much the same way they used additives and preservatives to manufacture the range of processed foods which bedecked the groaning supermarket shelves of 1950s America. Edible chlorella was not a world away from primula cheese.
Those who were less impressed by the food industry suggested that chlorella could be used to fortify bread and pasta – or even transformed into animal feed. But research demonstrated that heating chlorella destroyed most of its nutrients. Even one of its supporters called it ‘a nasty little green vegetable.’ By the 1960s, it was obvious that at $1,000 a ton, and inedible, chlorella was not going to be the food of the future.
All was not lost for chlorella, though. It proved to be surprisingly popular in Japan, where it is still sold as a nutritional supplement. The West’s enthusiasm for algae also hasn’t dimmed:
The discovery in the 1960s of the blue-green algae spirulina in the Saharan Lake Chad and in Mexico’s Lake Texcoco gave another boost to the health food uses of algae. Spirulina has a high-nutrient profile similar to chlorella’s but without…production problems….
Ironically, the food that was supposed to feed the world is now the preserve of the wealthy, health-conscious middle classes – those who suffer most from the diseases of affluence – who can afford to buy small jars of powdered algae.
I hope that Post’s project manages to create a viable product which can be used to supplement people’s diets. I’m not particularly revolted by the idea of lab-grown meat, and if it reduces the number of factory farms, then that can only be a good thing.
What concerns me more are the potential motives of the businesses which would produce lab-grown meat. If it is taken up by global food companies – which have patchy records on environmental sustainability and social responsibility – will we be able to trust them to provide us with meat which is healthy for us, and ethically produced?
Source
Warren Belasco, Meals to Come: A History of the Future of Food (Berkeley: University of California Press, 2006).
Tangerine and Cinnamon by Sarah Duff is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.
Feb 4
Eating Like Horses
I spent most of January in the UK, accidentally timing a rather unexpected visit to coincide with the scandal over the presence of horsemeat in some meat products sold in British and Irish supermarkets. For most of my stay I lived near The People’s Supermarket – a co-operative supermarket run on strictly ethical lines – in Lamb’s Conduit Street. Its response to the hysteria that the news seemed to provoke was to write on the sandwich board which stands outside the entrance: ‘Come in! Our meat is completely horse-free.’
Although much of the recent fuss has focussed on the presence of horse meat in some Burger King meals, and in budget burger patties and ready meals at Tesco, Iceland, and a few other supermarkets, as several reports have pointed out, Irish and British inspectors also found traces of pork in the same products:
I’ve been interested in the fact that the furore which followed the announcement of the discovery has focussed on the horse – and not the pork – found in these meat products. Considering that some religions actually ban the consumption of pork, and that, as Tesco and others have pointed out, eating horsemeat poses no threat to human health, this hysteria about horse struck me as misplaced.
I know that a lot has been – and is being – written about the horse meat saga, but I’d like to draw attention to a few trends in this coverage which suggest some interesting things about our attitudes towards what we deem to be acceptable – socially, morally, ethically – to eat, and how we judge others whose habits differ from ours.
Unsurprisingly, a number of columnists pointed out the hypocrisy of happily eating dead cows, sheep, and pigs, but of being too squeamish to eat horses. Not only was horsemeat available in Britain until the 1930s, but it is eaten in France and other parts of the world. Lisa Markwell wrote in the Independent:
I agree: there is something fundamentally illogical about agreeing to eat one kind of animal, but being disgusted by the thought of eating another. But our ideas around what is – and what is not – acceptable to eat are socially and culturally determined. They change over time, and differ from place to place. Whereas swan and heron were considered to be delicacies during the medieval period, we now understand these as birds to be conserved and protected. Even in France, people have fairly mixed feelings about eating horse.
In other words, our definition of what is ‘disgusting’ is flexible. It’s for this reason that I’m relatively sympathetic to those who are appalled by the prospect of horsemeat. Despite having learned to ride as a child, I think I could probably bring myself to eat horse or donkey, but I know that I could never try dog, for instance. In the same way, I wouldn’t try to feed rabbit to my bunny-loving friend Isabelle.
The more important issue is that we should be able to trust the businesses that sell us our food. As Felicity Lawrence commented in the Guardian, the presence of horsemeat and pork in beef products is simply one in a long line of food safety scandals:
The reason for this failure of food regulation is both complex and devastatingly simple. On the one hand, the food chain has become increasingly difficult to regulate. It is now controlled by a handful of big supermarkets and food companies interested in cutting costs during a period of sky-high food prices. It becomes inevitable, then, that the quality of meat and other produce will be compromised:
And on the other hand, regulators themselves are less efficient:
There are also – justified – concerns about the FSA’s closeness to business, which has been lobbying hard for looser regulation. After all, the previous chief executive of the FSA, Tim Smith, is now Tesco’s technical director.
Unsurprisingly, this combination of unscrupulous, cost-cutting business and dysfunctional, light-touch regulation has allowed food safety to be compromised. When the first laws to prevent food adulteration were introduced in Britain and in the United States – Teddy Roosevelt’s famous Pure Food and Drug Act (1906) – they were a response to concerns raised by campaigners, most of them middle-class women, about the safety of food produced by the relatively new, industrialised food producers. As we have seen over the past century or so, any loosening of those regulations has resulted in a decline in the quality of food.
And this brings me to my final point. One of the most striking features of the coverage of the horsemeat scandal has been the number of commentators who’ve asked their readers: ‘what else do you expect?’ Giles Coren was particularly withering in his scorn for consumers of cheap food:
The food products contaminated with horse and pork were in the ‘value’ ranges of cheap supermarkets. As the BBC reported, these contain considerably less meat than more expensive products:
Like Coren, other columnists and food writers argue that ordinary British people have become ‘disconnected’ from the food chain, having little knowledge of how their food travels from farm to supermarket. More interest on the part of the public, they seem to imply, would in some way prevent these kinds of scandals from occurring.
I disagree. Not only does this display an astonishingly naïve understanding of how big food businesses work, but it fails to take into account the fact that the people who tend to be most at risk of consuming adulterated food are those who are poor: those who buy cheap food – the value products – from big supermarkets. There is a vein of snobbery running through much of the argument that consumers of cheap food only have themselves to blame if they end up inadvertently eating horse, or other potentially harmful additives.
What this debate reveals, I think, is an odd attitude towards food, particularly meat, and class. Over the past century, and particularly since the 1950s, the eating of animal protein has been democratised. Whereas before 1900, more or less, only the middle and upper classes could afford to eat meat on any regular basis, from around the end of the Second World War it has become increasingly the norm for all people to be able to buy cheap protein.
But the technologies – the hormone supplements, factory farming, selective breeding, the Green Revolution – which have allowed us all to eat more meat have also proven to be unsustainable, particularly in ecological terms. As a recent report published by the World Wildlife Fund, Prime Cuts: Valuing the Meat we Eat, argues, it’s not simply the case that everyone – all over the world – should eat less meat for the sake of the environment, human health, animal welfare, biodiversity and other reasons, but that we should eat better meat: meat from animals reared sustainably.
If we are committed to the idea that everybody, regardless of wealth, should be able to eat a reasonable amount of meat – and it is true that definitions of sustainable diets do vary – then we should not ask why people are surprised to find that cheap meat is adulterated or contaminated, but, rather, why so many people can’t afford to buy better quality meat.