
Posts from the ‘history’ Category

Ten Books That Shaped the British Empire: Mrs Beeton’s Book of Household Management

Last week WiSER hosted the Johannesburg launch of Antoinette Burton and Isabel Hofmeyr’s new edited collection Ten Books That Shaped the British Empire: Creating an Imperial Commons. Isabel invited three historians to pitch the books that they feel should have been included, and it was the funniest and most entertaining launch I’ve ever attended – and I had the honour of speaking at it. You must, of course, read Ten Books. It is that rare thing: an academically rigorous text which is accessible without losing any of the complexity of its arguments.

I picked Mrs Beeton’s Book of Household Management. This is what I argued:

I nominate a book which has been accused of dooming British cooking to a repertoire which makes a virtue of stewed tea, turnips, and something called toast water – no, me neither – and whose author was labelled by Elizabeth David – no less – a plagiarist. David added: ‘I wonder if I would have ever learned to cook at all if I had been given a routine Mrs Beeton to learn from.’ I argue that Mrs Beeton’s Book of Household Management – or, to give it its full title, The Book of Household Management, comprising information for the Mistress, Housekeeper, Cook, Kitchen-Maid, Butler, Footman, Coachman, Valet, Upper and Under House-Maids, Lady’s-Maid, Maid-of-all-Work, Laundry-Maid, Nurse and Nurse-Maid, Monthly Wet and Sick Nurses, etc. etc. – also Sanitary, Medical, & Legal Memoranda: with a History of the Origin, Properties, and Uses of all Things Connected with Home Life and Comfort – was one of the most important and influential books to circulate around the British Empire. It shaped both the colonial encounter and the postcolonial kitchen.

This is not so much a history of a book as a history of a compendium of advice assembled, edited, and changed over time, originally by a woman and her husband, and then by an assortment of publishers and printers. Isabella Beeton was twenty-one years old and newly married when she began publishing articles on cooking and domestic advice in The Englishwoman’s Domestic Magazine. In 1861, she published what was possibly the world’s first serial recipe book – her guide to household management – and it promptly sold out. Mrs Beeton sold 60,000 copies in its first year, and 2 million by 1868. It is still in print. But by 1868, Isabella Beeton had been dead for three years – probably as a result of complications arising from syphilis, which she had caught from her philandering husband, Samuel.

As death was the most useful thing to happen to John F. Kennedy’s career as the best president the US never had, so Isabella Beeton’s early demise helped to transform her book from a guide into the guide to respectable living for the middle classes. Samuel Beeton was the book’s publisher, and as readers clamoured for yet another updated edition of Mrs Beeton – and he remained deliberately vague as to where the real Mrs Beeton was – the book was corrected and modified to suit the changing circumstances of nineteenth-century middle-class households.

Mrs Beeton was not the first or the best recipe book of the period – Eliza Acton and before her Hannah Glasse were more accomplished cooks – nor was Isabella the first author to compile her book from snippets and cuttings from other sources. Mrs Beeton was always a compendium, a scrapbook. But this book was the first to give cooking times, accurate lists of ingredients, and menus arranged by cost. This was a practical guide to living for Britain’s new middle classes, which demystified table settings, etiquette, laundry, the management of servants, and the everyday rhythms of a respectable household. This book also worked to empower middle-class women, providing them with a range of skills – bookkeeping, nursing, project managing – that their daughters would use as they began gradually to enter the workplace during the early decades of the twentieth century.

Young women packed Mrs Beeton into their luggage and sailed with her around the empire, and so Mrs Beeton also became the foundation on which middle-class British households were made in regions as far-flung as Nigeria and Australia. But Mrs Beeton also became a metaphor for the British Empire during the nineteenth and early twentieth centuries: endlessly mutable, able to change according to circumstance, meaning many things to all people at once. Linked to another saintly, if distant, female figure, this book was both emblematic of a well-run household and a canny business machine.

Gradually, foreign recipes – for mulligatawny soup from India, lamingtons from Australia – made their way into the book. But in the colonies, Mrs Beeton became the basis for local guides to household management. In South Africa, the wildly popular Hilda’s Where Is It? by Hildagonda Duckitt (1919) and even Kook en Geniet (1951) were both obviously modelled on Mrs Beeton. In fact, Kook en Geniet could best be described as the Afrikaans Mrs Beeton. Published as a guide to housekeeping and cooking for young brides in 1951 by Ina de Villiers – and overseen by her daughter Eunice van der Berg since 2010 – it has never been out of print. Like Mrs Beeton, its success lies partly in the fact that it is regularly updated. There is no single version of Kook en Geniet. Each edition retains a core of essential recipes, but methods and ingredients change as new products appear. Dishes are added and, less frequently, subtracted as culinary fashions evolve.

Mrs Beeton was also the model for guides to colonial living. The Kenya Settlers’ Cookery Book and Household Guide, published by the Church of Scotland’s Women’s Guild in 1943, walks an uneasy path between demonstrating to young wives how to maintain the standards of Home and providing practical advice on keeping house in east Africa. This is a guidebook in aid of civilisation: while some concessions are made to Kenyan conditions, its model remains always Mrs Beeton. Its recipes are for macaroni cheese, chicken pie, and shortbread. Gardens are to be planted with poppies, dahlias, roses, carnations, and snapdragons. African servants are to be civilised. Phrases in Swahili and Kikuyu centre around cleanliness, punctuality, and obedience. Even African chickens had to be trained into good behaviour. When local ingredients – like mangoes or maize meal – were used, it was in the context of familiar recipes: green mielies au gratin, boiled banana pudding.

But Mrs Beeton’s influence didn’t vanish with the end of empire. She is present, too, in postcolonial recipe books. Mary Ominde’s African Cookery Book, published by Heinemann in 1975, is intended for housewives in independent Kenya, eager to play their role in raising healthy Kenyan families. But it is based very obviously on earlier colonial guidebooks. It – too – has borrowed or plagiarised from other writers, and while it does include some local recipes, for nyoyo and blood in sour milk, its emphasis is overwhelmingly on cooking dishes familiar to readers of the Kenya Settlers’ Guide. But this – like Kook en Geniet – is a recipe book in service of nationalism.

Mrs Beeton was, then, essential to the shaping of the colonial encounter between white women and children and African and Indian servants. She provided its domestic framework – a model of the ideal home at Home in Britain – and local writers created guidelines for the achievement of this manifestation of order and civilisation in the colonies. She persisted even after the end of empire. Postcolonial recipe books were informed not only by the structure of Mrs Beeton, but also by the book’s recipes and ethos. Building a nation through food, if you will.


Modern Times

A month or so ago, the food writer Todd Kliman was criticised for publishing an article in the Washingtonian titled ‘Can Ethiopian Cuisine become Modern?’ Although much of the response to the headline was, I agree, entirely justified – this is a silly, insulting question which invokes a stereotype about Africans being forever stuck in pre-modernity – Kliman’s article presents a considerably more nuanced argument. He is interested in why the Ethiopian food which he eats enthusiastically in Washington DC – a city famous for its Ethiopian restaurants – has changed relatively little in the past few decades. He writes:

But even though the cuisine’s profile has risen, the food itself hasn’t exactly evolved. Ethiopian restaurants have become markedly more fashionable over the last 20 years – gone are the days of sitting around woven-grass tables in dark, sometimes dank dens – but the cooking is hardly different from what you would have found four decades ago. A meal then is a meal now.

Put another way, Kliman investigates why Ethiopian food – particularly as it is prepared in the US – has not been made cosmopolitan. He acknowledges that what we now define as Ethiopian cuisine has only been so since the 1970s, when refugees fleeing the civil war opened restaurants selling cheap, delicious, and exotic-yet-familiar food to curious eaters in the West:

The educated elite who came to America in the ’70s might not look like culinary pioneers … but in selecting the roughly two dozen dishes they would introduce to American diners, they in effect codified the meaning of Ethiopian food in the West. (Most of these dishes come from the Gondar region … so just as Sicilian and Neapolitan red sauce and pizza came to mean Italian food to most Americans, Gondarean dishes have come to mean Ethiopian.)

These restaurants included special, vegetarian feast dishes on ordinary menus. They prepared puddings, added raw vegetables to salads, and cooked with boneless meat. Ethiopian cooking needed to be made palatable to foreign audiences. A good comparison to Ethiopian food in the US is Indian food in Britain. There, after the Second World War, cooks – most of them Bengali – remade some of the dishes of the region to suit British tastes: not as hot, richer, and with a greater proportion of gravy to meat. The difference between these two cuisines, though, is that while it’s still possible to find old-fashioned curry houses across Britain, restaurants specialising in regional cuisines and in remaking Indian cooking traditions have also proliferated. Kliman suggests that one reason for Ethiopians’ hesitancy to embrace change – both in the US and, interestingly, in Ethiopia – has to do with the country’s fraught politics. One diner in Addis Ababa explained:

He talked about the coup, the war, the decades of suppression and fear. Just as Ethiopians are enormously proud that their country has been called the birthplace of civilization, he explained, they’re proud of the fact that they’re eating the same food as their nomadic, tribal ancestors. (And, not least, eating that food in the exact same way: with their hands.) Continuity can be equated with conservatism, yes. But in a country with a long history of political uncertainty and upheaval, it also signals stability and comfort.

Ethiopian cuisine has long been shaped by nationalism. During the late nineteenth century, at a time when a national identity and the idea of an Ethiopian state were being forged, the Ethiopian court pioneered a kind of cooking which it described as the national cuisine. This was a selective vision of what the majority of Ethiopians ate, but, nonetheless, became the basis of the cooking in cafes and restaurants that began to open in the early 1900s. In the past three decades or so, this national cuisine has been adopted as somehow encapsulating Ethiopia’s national identity – despite the fact that it bears little resemblance to what nomads would have eaten even in the recent past.

Ethiopian tea and coffee at Arts on Main, Johannesburg.


But even if Kliman isn’t really interested in Ethiopian food becoming ‘modern,’ this question about diet and modernity is an important one. The appeal of Ethiopian restaurants to leftwing Americans in the 1970s (ironically in Washington DC, one of the key cities of the Enlightenment) lay precisely in the fact that the food seemed to speak to their anxieties about modernity in an era of oil crises, rising fears of ecological disaster, and the slow emergence of finance capital. This was – they believed – food from a simpler, gentler, pre-modern time.

But American progressives have not always been so enthusiastic about immigrant cooking. In his wonderful book Revolution at the Table: The Transformation of the American Diet (1988), Harvey Levenstein devotes a chapter to the New England Kitchen (NEK), a project established in Boston in 1890 by Edward Atkinson, Wilbur Atwater, and Ellen Richards. Concerned about the growing potential for strikes and other forms of collective action in American industry, Atkinson, a prosperous Boston businessman, was interested in ways of improving the living conditions of his employees without raising their wages. Nutrition seemed to offer one way of solving this conundrum – an impression confirmed by the hugely influential scientist of nutrition, Wilbur Atwater. Ellen Richards, a chemist and the first woman graduate of the Massachusetts Institute of Technology, argued that ways needed to be found to apply scientific research and principles to the improvement – the modernising – of American households.

The result of this collaboration was the NEK, which was intended both as a research institute and as a school where working people could learn to prepare simple, nutritious meals. Initially, it appeared to be a raging success, attracting funding from Andrew Carnegie, and with branches soon opening in New York, Chicago, and Philadelphia. But the NEK model failed quickly, largely because it could not attract adequate numbers of the urban poor to attend classes. This was due in part to the fact that the diet recommended by the NEK was distinctly dull, heavy in refined carbohydrates, and sparingly flavoured. (This was in a time before the discovery of vitamins, so NEK staff were dismissive of the usefulness of fruit and vegetables.)

The ethnically varied working poor – composed mainly of Italians, French Canadians, the Irish, and Jews from eastern and central Europe – apparently served by the NEK were not interested in this bland, heavy ‘American’ cooking. Moreover, as Levenstein points out, the cuisines brought by these immigrants were far more than simply sustenance: they were the basis for new identities in a foreign land, they created social cohesion, and they were closely intertwined with women’s own positions within both families and communities. Although the NEK project failed in some ways, its work was picked up in the early twentieth century by nutritionists who campaigned for the ‘Americanisation’ of immigrant diets, arguing that the strong flavourings of foreign diets served only to overwork digestive systems and encourage drinking. Meals had to be eaten on plates, rather than from bowls, and with knives and forks. Spaghetti was not deemed an appropriate dinner. This was modern eating for modern Americans.

This process was not particular to the US. Missionaries in nineteenth- and twentieth-century Africa taught converts on mission stations to eat with knives and forks, instead of communally, with hands. Home economics classes, the homecraft and Jeanes movements, and other interventions were intended to teach African women how to run modern, civilised homes shortly before and after independence.

But this suspicion of immigrant food and eating as being somehow both anti-modern and unpatriotic is worth considering. American nutritionists in the early decades of the twentieth century were also suspicious of how immigrant women bought their food, choosing to go to small delis owned by other immigrants, instead of larger grocery stores. South Africa is experiencing yet another wave of xenophobic violence – attacks on foreigners, most of them from the rest of the continent, as well as China and south Asia, never really cease, but we’re witnessing a moment of particularly heightened violence – and targets are often small spaza shops in informal settlements. Locals accuse foreigners of buying stock in bulk, thus undercutting South African businesspeople. One of the implications of the closure of these businesses is hunger: they sell food at much lower prices than the big supermarkets, which also tend to be taxi- and bus-rides away.

Apartheid’s project of race classification insisted that the race categories into which the population was divided were culturally defined: Indian people in Durban ate curry, ‘Malay’ people in Cape Town cooked bredie. Apartheid ideologues went out of their way to erase centuries of entangled histories. A refusal to engage with others – a refusal to understand our reliance on others – simply continues that project.

Sources

Timothy Burke, Lifebuoy Men, Lux Women: Commodification, Consumption, and Cleanliness in Modern Zimbabwe (Durham, NC: Duke University Press, 1996).

Nancy Rose Hunt, ‘Colonial Fairy Tales and the Knife and Fork Doctrine in the Heart of Africa,’ in African Encounters with Domesticity, ed. Karen Tranberg Hansen (New Brunswick, NJ: Rutgers University Press, 1992).

Harvey A. Levenstein, Revolution at the Table: The Transformation of the American Diet (Berkeley: University of California Press, 1988).

James C. McCann, Stirring the Pot: A History of African Cuisine (Athens, OH: Ohio University Press, 2009).


My Book

I’ve been finishing a book, which will be out in May this year.

Posts and links to follow soon.

Cows Come Home

Last week, Maharashtra, India’s second-biggest state and home to the country’s commercial capital Mumbai, approved legislation which would ban the sale or possession of beef. The slaughter of cattle – cows, bulls, and calves – is now illegal. The right wing Hindu nationalist Bharatiya Janata Party (BJP), which has been in power both nationally and in Maharashtra since May last year, argued that the Maharashtra Animal Preservation (Amendment) Act saves an animal revered by many Hindus as holy. In their view, this represents a victory for pious Hindus.

As many have pointed out, although some Hindus may be in favour of a ban on the slaughter of a beast which they believe to embody divinity, the consumption and sale of beef in India is a complex and contradictory business. Firstly, the beef trade is controlled by the country’s Muslim minority, and beef is consumed mainly by them and the even smaller Christian portion of the population. Despite the fact that India is supposed to be a secular state, this law is aimed directly at these religious minorities. Vashna Jagarnath writes:

This ban will devastate the beef industry in Maharashtra, an industry that is largely run by the Muslim minority. It is not an isolated act. On the contrary, it is part of a longstanding attempt by the Hindu right, now backed with the power of the state, to make the lives of religious minorities increasingly difficult.

The ban provides the fascist project with two immediate benefits – exerting control over the minorities by sending a clear message about their increasingly precarious position in contemporary India; and dealing an economic blow to Muslims who trade in the bovine industry.

In Gaborone, Botswana.


Secondly, this is not the first time that there have been efforts to control the slaughter of cattle in India. Several states have made the killing of cows illegal, and there are laws which limit the sale of beef in some areas. Indeed, the Maharashtra Animal Preservation (Amendment) Act has taken nineteen years to pass. The Bill was sent to the then-President to sign into law in 1996, but it foundered – only when the BJP returned to power in 2014 was it able to recommit to making the ban real.

And the ban has caused widespread outrage in India – and not only among Muslims and Christians. This is the third point: some Hindus eat beef too. Not all Hindus stick absolutely (religiously?) to vegetarianism. In 2001, the historian DN Jha faced harassment and attempts to prevent the publication of his – by all accounts fairly dry – monograph, The Myth of the Holy Cow. His not particularly fresh thesis was that Hinduism’s ban on beef is a relatively new phenomenon. Pankaj Mishra explains:

the cow wasn’t sacred to the nomads and pastoralists from Central Asia who settled North India in the second millennium BC and created the high Brahminical culture of what we now know as Hinduism.

These Indians slaughtered cattle for both food and the elaborate sacrificial rituals prescribed by the Vedas, the first and the holiest Indian scriptures. After they settled down and turned to agriculture, they put a slightly higher value upon the cow: it produced milk, ghee, yoghurt and manure and could be used for ploughing and transport as well.

Indian religion and philosophy after the Vedas rejected the ritual killing of animals. This may have also served to protect the cow. But beef eating was still not considered a sin. It is often casually referred to in the earliest Buddhist texts.

The cow became holy first for upper-caste Hindus between the seventh and the thirteenth centuries CE. These were the people who could afford not to spend most of their time producing their food. What changed, though, to make vegetarianism so closely identified with Hinduism?

The answer lies in the 19th century, when many newly emergent middle-class Hindus began to see the cow as an important symbol of a glorious tradition defiled by Muslim rule over India. For these Hindus, the cause for banning cow-slaughter became a badge of identity, part of their quest for political power in post-colonial India. Educated Muslims felt excluded from, even scorned by, these Hindu notions of the Indian past; and they developed their own separatist fantasies.

The implications of these nationalist beginnings during the Raj are now playing out in Maharashtra.

My final point is the one that I found most surprising: the effects of the ban on the export of beef. Not only does India export water buffalo – the red meat of choice for many Indians – but twenty per cent of the world’s beef comes from India. The Maharashtra Animal Preservation (Amendment) Act has implications, then, for the global food supply. Beef has been a commodity traded on national and international markets since improvements in transport – railways, shipping – and, more importantly, refrigeration, in the late nineteenth and early twentieth centuries. In the United States, the price of beef dropped in the 1870s and 1880s because of the opening up of huge ranches in the west which were connected by rail to packing centres in large cities, most notably (and notoriously, given the revelations in Upton Sinclair’s The Jungle (1906)) Chicago.

Something similar happened in South Africa, when the politician and wildly successful businessman Sir David de Villiers Graaff, 1st Baronet, pioneered refrigeration, allowing fruit, vegetables, and meat to be transported across the country’s vast interior without spoiling. His Imperial Cold Storage and Supply Company – founded on the eve of the South African War (1899-1902), out of which De Villiers Graaff profited nicely – became one of the biggest meat packing businesses in Africa.

This and large-scale tax avoidance were at the root of the wild success of the Vestey brothers’ beef empire in the early twentieth century. By 1922, Vesteys had, as Ian Phimister writes, ‘interests in South America, China and Russia, and extensive land holdings in South Africa; it gradually extended its operations to embrace Australia, New Zealand and Madagascar.’ The business shipped beef – produced cheaply under appalling conditions for both workers and cattle – around the world with ‘five steamers refrigerated and fitted for the carriage of frozen meat’.


A poster in Williamsburg’s Spoonbill & Sugartown bookshop.

The demand that drove the expansion of ranching and packing in the US, and De Villiers Graaff and the Vestey brothers’ businesses, was a growing middle-class taste for a meat once prohibitively expensive. Beef became – like sugar, chocolate, and tea – an affordable luxury once an industrialised food chain caused prices to fall. A similar process is currently underway in India, as an ever-bigger middle class chooses to add more beef to its diet. Although a small, committedly nationalist middle class was partly responsible for making Hindu diets vegetarian in the nineteenth century, the opposite is happening now. As part of a global circulation of both commodities and ideas – middle classes in other developing nations are also eating more red meat – to what extent will this large middle class be able to negotiate the demands of right wingers keen to protect the lives of holy cows, and the attractions of a more varied and ‘modern’ diet?

Sources

Ebbe Dommisse, Sir David de Villiers Graaff: First Baronet of De Grendel (Cape Town: Tafelberg, 2011).

Harvey A. Levenstein, Revolution at the Table: The Transformation of the American Diet (Berkeley, CA: University of California Press, 2003).

I. R. Phimister, ‘Meat and Monopolies: Beef Cattle in Southern Rhodesia, 1890-1938,’ Journal of African History, vol. 19, no. 3 (1978), pp. 391-414.


Apples and Oranges

One of my favourite scenes in Alice in Wonderland is when the Caterpillar asks Alice ‘Who are YOU?’ Having spent the day being shrunk, telescoped, and grown again, Alice is at a loss: ‘I—I hardly know, sir, just at present—at least I know who I WAS when I got up this morning, but I think I must have been changed several times since then.’ During a period obsessed with lineages, classes, and groups, Alice’s inability to slot herself into the correct category feels profoundly transgressive. Her ontological uncertainty—she remarks to the Caterpillar ‘I can’t explain MYSELF…because I’m not myself’—is more mature than the Caterpillar who will, as Alice argues, turn into a chrysalis and then a butterfly. Nobody is one thing for very long.

The same can be said, of course, for confectionery. Periodically, Britain convulses in a fraught debate over the status of the Jaffa Cake. In their commercial form, Jaffa Cakes are rounds of Genoise sponge topped with orange jelly, and covered with chocolate. Supermarkets sell bright blue packets of McVitie’s Jaffa Cakes in the same aisle as Digestive biscuits, Hobnobs, and shortbread. So to the uninformed, the Jaffa Cake is – despite its name – a biscuit.

But is it really? Legally, the Jaffa Cake qualifies as a cake. A long and complicated court case in 1991 ruled in favour of McVitie’s, confirming that the Jaffa Cake is indeed a cake and should not, then, be subject to VAT. Harry Wallop explains:

In the eyes of the taxman, a cake is a staple food and, accordingly, zero-rated for the purposes of VAT. A chocolate-covered biscuit, however, is a whole other matter—a thing of unspeakable decadence, a luxury on which the full 20pc rate of VAT is levied.

McVitie’s was determined to prove it should be free of the consumer tax. The key turning point was when its QC highlighted how cakes harden when they go stale, biscuits go soggy. A Jaffa goes hard. Case proved.

So this is a Jaffa Cake: something which looks like a biscuit but is really a cake.

Orange trees in Perth, Australia.


But this ontological uncertainty extends beyond its position as cake or biscuit. Jaffa Cakes are named after Jaffa oranges. (McVitie’s never trademarked the name Jaffa Cake, so chocolate-and-citrus flavoured confections are often described as ‘Jaffa.’) The oranges were developed in Palestine – in and near the port city of Jaffa – during the 1840s. Sweet, seedless, and with a thick rind which made them perfect for transporting, Jaffa or Shamouti oranges became Palestine’s most important export in the nineteenth century. The arrival of Jewish immigrants in the 1880s and 1890s revolutionised citrus growing in the region. These new arrivals introduced mechanised, ‘scientific’ forms of agriculture, dramatically increasing yields.

By 1939, Jewish and Palestinian farmers – occasionally working collaboratively – employed altogether 100,000 people, and exported vast numbers of oranges abroad. Britain was a major importer of Jaffa oranges, particularly after Palestine became a Mandated territory under British control in 1923. The Empire Marketing Board – which promoted the sale of imperial produce – urged Britons to buy Jaffa oranges, something picked up by McVitie’s in 1927 with the invention of the Jaffa Cake.

An Empire Marketing Board advertisement for Jaffa oranges.


Jaffa oranges were – and, to some extent, are – held up as an example of successful Palestinian and Israeli co-operation during the interwar period. But after 1948, the same oranges became a symbol of Israel itself. In a campaign similar to the boycott of Outspan oranges during apartheid, organisations like BDS have urged customers not to buy Jaffa oranges as a way of weakening Israel’s economy and demonstrating their commitment to a free Palestine. (Jaffa oranges are no longer, though, a major Israeli export, and are grown in Spain, South Africa, and elsewhere.)

The changing meanings of Jaffa Cakes – cake, biscuit – and their constituent ingredients – symbol of collaboration, symbol of oppression – show how the categories into which we slot food are themselves constructs. (We could, really, compare apples and oranges.) But also, the Jaffa Cake helps to draw our attention to how taxes, trade agreements, and the politics and practicalities of shipping shape the ways in which we eat, buy, and think about food. Last year, the supremely British McVitie’s – producer of the Jaffa Cake, the most widely recognised biscuit (I mean, cake) in Britain – was sold to Yildiz, a food group based in … Turkey.


New Wine

Last week some friends and I had supper at the Cube Tasting Kitchen. I should emphasise at the outset that for all that I write a blog about food, I’m not a huge fan of the mad flights of fancy which characterise fine dining at the moment. I’m not into molecular gastronomy. I think it’s really interesting—and for a number of reasons, not only culinary—but given the choice between that and the sublime comfort food served at The Leopard and Woodlands Eatery, pizza at Stella e Luna, or dim sum at the South China Dim Sum Bar, I’d probably choose one of the latter.

But Cube was, really, entirely wonderful. And fun. It’s a small, box-shaped, white-walled restaurant in Joburg’s Parktown North, in a row of good and unpretentious middle-range restaurants, including Mantra, which is one of my favourite places to eat saag paneer. It was an evening of delights over fifteen courses. We began with six starters, each themed according to a vegetable—tomato, cucumber, cabbage, potato—or a deconstructed—pissaladière—or reconstructed—Parmesan ice cream with balsamic vinegar made to look like vanilla ice cream and chocolate sauce—version of a familiar dish. The cucumber came with a gin cocktail, the cabbage soup was blue and then turned purple, and the Parmesan ice cream didn’t really work.


Blue cabbage soup…


…that turns purple. (Apologies for the grainy photographs.)

That was okay, though. The fact that not every course was an absolute success was part of the fun. The infectious enthusiasm of the young chefs—who cook almost in the middle of the restaurant—and of the serving staff turned this into a game and an adventure. I had vegetarian main courses. The oddest, but most successful, was a combination of asparagus, hummus, and shards of meringue with black pepper. The most delicious was a mushroom soufflé and a curry reduced to its most basic elements. The most beautiful was a Jackson Pollocked plate of beetroot and leek, which was also, paradoxically, the least flavourful.


Beetroot and leek.

And pudding—after baklava and cheese, and a palate cleanser of sherbet, pomegranate jelly, and orange sponge consumed as you would tequila with salt and lime—was a forest floor of pistachio marshmallow, rice crispy and cranberry cookies, chilled chocolate mousse, dried flower and chocolate soil, coffee biscuits, lemon gel, and wheat grass. Then there were chocolate brownies and coconut ice.

Forest floor pudding.


The size of the portions and the length of time it took to eat all of this—we were there for more than three hours—meant that we could digest at leisure. Because this was as much an intellectual and sensory exercise as it was supper. It would be easy to criticise this kind of dining on the grounds that its purpose is not really to feed people: it uses good, expensive food to allow fairly wealthy paying customers to have fun. But it is equally true that food has always been about more than nutrition. Human beings have long consumed—sacrificed—food in the name of status and power, in performing rituals, and marking celebrations.

It is, though, interesting that molecular gastronomy—which has its roots in the nouvelle cuisine of the 1980s—came to prominence before and during the 2008 crash, in a period marked by ever widening social and economic inequality. (On a side note, it’s worth thinking about relative definitions of wealth: our meal at Cube was expensive, but within the realms of financial possibility even for someone on a fairly modest researcher’s salary. I would never be able to afford the same menu at a similar restaurant in London, for instance.) Molecular gastronomy does not—despite the grandiose claims of some of its practitioners—represent the future of food.

It does, though, represent the past. What sets the foams, pearls, and flavoured air of molecular gastronomy apart from other iterations of fine dining is its reliance on technology. Indeed, the twin gurus of this kind of cuisine—academics Nicholas Kurti and Hervé This—were interested in researching the chemical processes which occurred during cooking. Their acolytes—from Heston Blumenthal to Ferran Adrià and René Redzepi—have used this knowledge to disrupt, deconstruct, reconstruct, and undermine what we think of as ‘food.’

This work, though, does not fundamentally challenge our eating habits and choice of things to eat. Noma might serve insects and Blumenthal may have invented snail porridge, but molluscs and insects have been part of human diets for a very long time. I think that a more accurate name for molecular gastronomy is, really, modernist cuisine—the title of Nathan Myhrvold’s 2011 encyclopaedic guide to contemporary cooking. In all of its reliance on, and enthusiasm for, technology, molecular gastronomy is supremely modern: this is the food of industrialisation. It is as heavily processed as cheese strings. Modernist cuisine is the logical extreme of an industrialised food system.


In a Nutshell

On my fridge, I have a collection of business cards from cafes and shops visited on trips abroad. This afternoon—months late—I added another few from a recent month-long stay in Canada and the US, and I was reminded of a fantastic breakfast at the August First bakery in Burlington, Vermont. I was in Burlington for a conference and spent a couple of days beforehand working and wandering around a small university town – I grew up in a small university town so I have a professional interest in them – which has a reputation for extraordinarily progressive and inclusive politics.

There were posters advertising make-your-own banjo classes (out of gourds, apparently), vegan Thanksgiving, and homebrew nights; the local Democratic party was next door to a Tibetan dumpling shop; and I have never been so aware of the plight of the Dalai Lama as I was in the week I spent in Vermont. And there was the most amazing co-operative, which had a wall – a wall! – of granola. Progressive America is, truly, the most amazing place. (In a similar vein, Ann Arbor’s community co-op is opposite a Birkenstock shop.)

I had, then, granola at August First. And it was wonderful granola, with whole walnuts and fat raisins, and with plenty of really good plain yoghurt. Burlington has embraced its granola. But – and I write this as one who makes her own granola – there is a contradiction at the heart of the association of granola with progressive living: a lot of the time, it’s full of sugar. Unlike muesli, which is left raw, granola is baked usually with honey, maple syrup, or (sometimes and) sugar, as well as oil, and, occasionally, egg white. This is not necessarily the healthiest breakfast. So why does granola signify healthy eating?

This isn’t the only food to be linked to left wing politics. Paul Laity notes:

‘Socialism,’ George Orwell famously wrote in The Road to Wigan Pier (1936), draws towards it ‘with magnetic force every fruit-juice drinker, nudist, sandal-wearer, sex-maniac, Quaker, “Nature Cure” quack, pacifist and feminist in England.’ His tirade against such ‘cranks’ is memorably extended in other passages of the book to include ‘vegetarians with wilting beards,’ the ‘outer-suburban creeping Jesus’ eager to begin his yoga exercises, and ‘that dreary tribe of high-minded women and sandal-wearers and bearded fruit-juice drinkers…’

Orwell’s ‘cranks’—a term reclaimed by the London vegetarian restaurant Cranks in 1961—were the free-thinking and free-living British Bohemians of the early twentieth century, who experimented with new forms of comfortable dress, sustainable eating, eastern religions, egalitarian social arrangements, and alternative sexual identities. This early counter culture was strongly influenced by late nineteenth-century dieticians and naturopaths—many of them based in Germany—who advocated raw, simple eating in contrast to the meat- and starch-heavy meals which characterised most middle-class diets.

As Catherine Carstairs remarks in her essay ‘The Granola High: Eating Differently in the Late 1960s and 1970s,’ it was immigrants from central Europe who brought health food shops to North America, stocking vitamin supplements, wholewheat bread, and, inevitably, fruit juice. It was these shops that made widely available the foods eaten at more exclusive sanatoriums in Europe and the United States.

Like muesli and bircher muesli, granola was invented in a health spa. In her excellent and exhaustively detailed history of granola, Karen Hochman argues that James Caleb Jackson—a farmer, journalist, and doctor—invented granula in 1863 for the patients at his spa, Our Home on the Hillside, in upstate New York. Relying heavily on Graham flour—invented by the dour evangelical preacher Sylvester Graham—he baked sheets of biscuits and crumbled them into granules to be soaked in milk and then eaten for breakfast. It’s likely that granula—the predecessor of Grape Nuts—would never have moved beyond the confines of Our Home on the Hillside had it not come to the attention of a rival sanatorium doctor and Seventh-day Adventist, John Harvey Kellogg, who used rolled, toasted oats instead of Graham flour biscuits. He renamed his product granola, and it became for a while a significant money earner for his Sanitarium Food Company (renamed Kellogg’s Food Company in 1908).

But enthusiasm for granola remained—largely—limited to the relatively small numbers of people who shopped in health food stores until the 1960s and 1970s. Then, concern about the effects of pesticides and additives on human, plant, and animal health; suspicion of the food industry; a desire to experiment with diets from elsewhere; and a back-to-the-land movement all coincided to produce an interest in purer, healthier, more ‘natural’ foods. Hippies—another food counter culture—looked back and found granola. So did big food companies, as Hochman writes about the US:

Granola went mainstream in 1972, when the first major commercial granola, Heartland Natural Cereal, was introduced by Pet Incorporated. In rapid succession, Quaker introduced Quaker 100% Natural Granola; Kellogg’s introduced Country Morning granola cereal and General Mills introduced Nature Valley granola.

The sweet, nut- and dried fruit-filled granola we eat today is derived from the granola reinvented in the 1960s and 1970s. Despite having been popularised by Quaker and General Mills—the enemies of the second food counter culture—granola retained its association with progressive, healthy living.

This cultural history of granola tells us three things, I think. Firstly, that the food counter culture has roots in alternative experiments in living stretching as far back as the late eighteenth century, when vegetarianism and lighter diets were picked up as markers of enlightened, rational eating. Secondly, that business has long taken advantage of the experiments done by people working and living on the fringes of respectability.

Finally, it also traces the shifting meanings of what we define as ‘healthy.’ Despite evidence presented to us by nutritionists, what we think of as being healthy food depends on a range of factors, including whether, historically, a product has been associated with health-conscious living.


Presumptuous Insect

A few months ago, I was interviewed on a radio station about changing attitudes towards food and eating. After a caller commented that when he’d lived in rural Limpopo, he’d happily eaten frogs, but preferred McDonald’s having moved to Johannesburg, I managed—somehow—to talk myself into an urgent appeal to the nation to eat insects. I’m still not entirely sure how this happened, but I think it was partly connected to the recent slew of articles on why we need to eat insects to save the planet.

This insect turn in culinary fashion is, of course, nothing new. In 1885, the entomologist Vincent M. Holt published Why not eat insects? To some extent, current arguments for eating insects deviate little from this slim manifesto. Holt remarks, rightly, that there is nothing inherently dirty about insects—in fact, crustaceans, being bottom feeders, are potentially more dangerous to eat—and that they can form part of a balanced diet. He suggests that Western aversion to eating them is linked strongly to culturally specific ideas about what is fine and not fine to eat. He cites the example of a Chinese banquet at an exhibition in London, pointing out that Britons happily sampled a menu which included cuttlefish, sea slugs, and birds’ nests because it was both exotic and, apparently, healthy. Past Europeans ate insects, and, according to Holt, societies in Africa, Asia, and elsewhere happily eat them still:

Beginning with the earliest times, one can produce examples of insect-eating at every period down to our own age. Speaking to the people of Israel, at Lev. xi. 22, Moses directly encourages them to eat clean-feeding insects: ‘These ye may eat, the locust after his kind, and the bald locust after his kind, and the beetle after his kind, and the grasshopper after his kind.’ …

Cooked in many and various ways, locusts are eaten in the Crimea, Arabia, Persia, Madagascar, Africa, and India. … From the time of Homer, the Cicadae formed the theme of every Greek poet, in regard to both tunefulness and delicate flavour. Aristotle tells us that the most polished of the Greeks enjoyed them… Cicadae are eaten at the present day by the American Indians and by the natives of Australia.

He appeals to his readers:

We pride ourselves upon our imitation of the Greeks and Romans in their arts; we treasure their dead languages: why not, then, take a useful hint from their tables? We imitate the savage nations in their use of numberless drugs, spices, and condiments: why not go a step further?

Contemporary interest in eating insects is, though, strongly connected to anxieties about a food chain which seems to be increasingly ecologically unsustainable. Current methods of producing enough protein for the world’s population come at the cost of animal welfare and good labour practice, consume vast quantities of water, and produce methane and other greenhouse gases. Something needs to change, and insect enthusiasts argue that crickets, grasshoppers, and caterpillars are a viable alternative to beef, chicken, and pork. In a 2013 report for the Food and Agriculture Organisation, Dutch entomologist Arnold van Huis—academic and author of The Insect Cookbook: Food for a Sustainable Planet—notes that more than 1,900 species of insects already form part of the diets of ‘at least two billion people.’ A lot of these insects are high in protein—higher, in some cases, than beef—and other nutrients. Many of them consume waste, and farming them is comparatively cheap and requires little labour.


This promotion of what Dana Goodyear calls ‘ethical entomophagy’ in Anything that Moves: Renegade Chefs, Fearless Eaters and the Making of a New American Food Culture has met with some commercial success. There are now—outside of regions where insects are normally part of diets—businesses dedicated to farming insects for human consumption. It’s possible to buy cricket flour; Selfridges sells chocolate-covered giant ants; and pop-up restaurants and Noma have featured insects on their menus. The logic is that these high-end sales of edible insects will gradually influence the middle and bottom of the market. A kind of ‘trickle down’ revolution in diet.

While it is certainly true that we can and have chosen to eat foodstuffs once deemed to be dangerous or socially taboo—potatoes in eighteenth-century France, beef in Japan during the Meiji Restoration—these shifts in attitude take time to achieve. Also, in the case of potatoes and beef, these societies were strongly hierarchical with powerful aristocracies. Thankfully, most of us no longer live in a world where the king’s decision to consume a formerly shunned ingredient changes the way that all of us eat.

As every recent article on entomophagy notes, the main obstacle to the widespread incorporation of insects into, particularly but not exclusively, Western diets is a strong aversion to eating them. If only, the argument goes, picky Westerners would give up their hypocritical dislike of insects—they eat shrimp and prawns, after all—and then we’ll all be fine. But I think it’s worth taking this dislike seriously. As Goodyear points out, a lot of these insects aren’t particularly delicious. She tries embryonic bee drones picked from honeycomb:

the drones, dripping in butter and lightly coated with honey from their cells, were fatty and a little bit sweet, and, like everything chitinous, left me with a disturbing aftertaste of dried shrimp.

I’ve eaten fried, salted grasshoppers at a food festival on London’s south bank, and they were crunchy and salty—improved, like most things, by deep frying—but otherwise memorable only for having been grasshoppers.

Making insects palatable involves processing, something which almost inevitably increases the ecological footprint of the product. Perhaps even more importantly, as the caller I referred to at the beginning of this post said, insects are widely associated with poverty and deprivation. Modernity—life in the city—requires a new diet. While it is true that in many societies, people do eat insects out of choice, it is equally significant that when they can, people stop eating insects as soon as possible.

Our current anxiety about sustainable sources of protein is driven partly by concern that the new middle classes in China and India will demand to eat as much beef, in particular, as their Western counterparts. I wonder to what extent this concern is part of a long tradition of Malthusian yellow peril: that China, in particular, will somehow eat up all the world’s resources. I don’t have any objection to promoting entomophagy—although trickle-down strategies have a fairly low level of success—but I think we should look more carefully at the reasons underpinning our interest in investing in alternative forms of protein, and also be careful not to dismiss the interests and tastes of people clawing their way out of poverty.


Not for all the tea

When I was finishing my PhD, my friend Jane gave me a t-shirt emblazoned with the slogan ‘tea is not a food group.’ She used to shout that into my room—we lived a few doors down from each other in the same student residence—as she passed me on the way to the lift. She had good reason for doing so. When I’m absorbed in writing, I can forget that the world exists: that it’s necessary to brush your hair, dress properly, cook, and not have conversations with yourself out loud. And that it’s unwise to subsist on tea.

Over the past couple of months, I’ve been in the final throes of completing a book manuscript and I’ve tried—probably not always successfully—to maintain at least a semblance of normal, civilised behaviour, but tea has remained a constant. It’s a kind of writing comfort blanket; a small routine in the middle of anxious typing. In some ways, then, it was a misfortune to be in the United States for much of this period. I could drink as much excellent coffee as I could cope with, but tea? Good strong, hot black tea? Until I discovered a branch of TeaHaus in Ann Arbor, not so much.

I know that I’m not the first to complain about the difficulty of finding a decent cup of black tea in the US, and, to some extent, this belief that Americans don’t understand hot tea is something of a misconception. Teavana, Argo, and TeaHaus all attest to an enthusiasm—an apparently growing enthusiasm—for well-made tea. I’ve never encountered so many different kinds of tea in supermarkets. (And, truly, Celestial Seasonings is the best name for a brand of tea.) But it is true, I think, that it’s hard to find really good black tea in the average café. While this is probably linked to the fact that most tea drunk in the US is iced tea, it’s also because tea in these establishments is made with hot—not boiling—water. This is crucial. Tea leaves need to steep in freshly boiled water.

Tea.


This aversion to boiling water can be traced back to a 1994 civil case: Liebeck vs McDonald’s Restaurants. Two years previously, Stella Liebeck, an elderly Albuquerque resident, had spilled a cup of boiling hot coffee over her lap. She sued McDonald’s, and was initially awarded $2.7 million in punitive damages. While for some, the case has become emblematic of the madness of a litigious legal system, the truth is considerably more complex. Not only had Liebeck suffered third-degree burns—resulting in extensive reconstructive work and long stays in hospital—but she and her family only sued McDonald’s as a last resort. When their reasonable request that McDonald’s cover her medical bills was turned down, they decided to go to court. Moreover, in the end, Liebeck received considerably less than $2.7 million: the judge reduced that sum to $480,000, and she was awarded, eventually, between $400,000 and $600,000 in an out-of-court settlement with McDonald’s.

This was not, then, a frivolous lawsuit. But it was interpreted as such, and became one of the examples cited in efforts to reform tort law—the legislation which allows people to sue others in the case of injury to themselves or their property—in the US. As some lawyers argue, the tort reform lobby led by Republicans isn’t really an attempt to reduce the number of lawsuits submitted by greedy people, but, rather, an attempt to protect business from having to pay for its mistakes.

For tea drinkers, though, this misperception (fanned by tort reform campaigners) has resulted in tepid, unpleasant cups of tea. Concerned about similar lawsuits, restaurants now serve hot—rather than boiling—water. But perhaps there is a kind of poetic—or historical—logic to having to search high and low for decent tea in the US. The tipping of chests of tea into Boston’s harbour in 1773 was both a defiance of the Tea Act and a rejection of Britain’s right to tax the thirteen colonies. When patriots switched to coffee—indeed some refused even to eat the fish caught in or near the harbour on the grounds that they could have consumed some of the tea—it was in defiance of British rule. In the land of the free, shouldn’t tea be hard to come by? This association of coffee and freedom wasn’t new, even then. Coffee houses in eighteenth-century Britain and Europe were places where middle-class men could gather to talk and think. The work of the Enlightenment was done, to some extent, over cups of coffee. But coffee was produced on slave plantations and coffee houses—and the freedoms discussed in them—were largely for white men. Coffee represented, then, the freedom of the few.

Like so many people recently, I’ve been thinking about the historical contexts which produced the principles on which liberal democracies are founded. Freedom of expression and of thought, freedom to gather, freedom of religious belief are fundamental to the functioning of liberal democracies. Regardless of the fact that these principles originated during a period in which they applied mainly to white men—and regardless of the fact that they have not prevented injustices from being committed (sometimes in their name) in liberal democracies—these remain the best, albeit imperfect, protection of the greatest number of freedoms for the greatest number of people.

To suggest that they are somehow a western invention inapplicable to other parts of the world would be an enormous insult to Egypt’s cartoonists who continue to criticise successive oppressive governments despite risking potential imprisonment or worse; to Saudi Arabian blogger Raif Badawi, who received the first fifty of a thousand lashes last Friday, for writing in support of free expression; to the Kenyan MPs who last year so strongly opposed a new security bill which will dramatically curb journalists’ ability to report freely. Also, it would be a profound insult to the vast majority of Muslim people in France and elsewhere—members of a diverse and varied faith—who managed to cope with the fact that Charlie Hebdo and other publications ran cartoons which insulted or poked fun at Islam.

Whether you think that the cartoons in Charlie Hebdo were amusing or clever or blasphemous or racist is beside the point. Free speech and free expression were no more responsible for the killings in France last week than they were for the murder of more than two thousand people in Nigeria by Boko Haram. This isn’t to argue that we shouldn’t discuss—loudly, freely, rudely—how right or wrong it was to publish these cartoons in a society which many feel has strongly Islamophobic and racist elements—in the same way we should debate potentially misogynistic, anti-Semitic, racist, homophobic, or transphobic writing, art, or speech too. But to begin to suggest that there are times when we shouldn’t criticise and satirise is to suggest that there should be limits to what we may think and imagine.


Fugitive Knowledge

I never expected to receive an email from the Wayne County Airport Police. I had been so disoriented by the unpleasantness of immigration, crossing from Canada to the United States, that I’d dropped my travel notebook in Detroit airport. I’d only discovered its absence when unpacking in Ann Arbor and, as with most deep, unhappy losses, had only begun to realise how much I missed my small, black Moleskine diary a day or two later. But it was found, and a policewoman emailed to ask if it was mine. It arrived in a Fedex box six times its size within the week.

The diary would mean very little to anyone, I think. It contains addresses and phone numbers; lists of places to visit, things to buy, books to read, what to pack. It also includes recipes and descriptions of food I’ve eaten in Australia, Europe, Canada, and the US. It was these that I was particularly sorry to lose. In Kingston—a few days before arriving in Michigan—I’d written down the recipe for apple pie made by Elva McGaughey, my friend Jane’s mother, and an encyclopaedia of information on the home cooking of Ontario families.

That it was apple pie was significant. A week previously, Jane and Jennifer and Jennifer’s small son Stephen and I had picked apples in Québec’s Eastern Townships. We drove from Montréal, through bright green, softly rolling countryside. The sky was low and it drizzled. At the orchard, as Stephen snored gently in his sling, we filled deep paper bags with McIntosh and Cortland apples.


Several people pointed out to me that the saying should be, really, ‘as Canadian as apple pie’ because—in their view—the best pie is made with Macs, a popular variety developed by John McIntosh, who discovered these tart, crunchy apples on his farm in Ontario in 1811. The Mac now constitutes 28% of the Canadian apple crop, and two thirds of all the apples grown in New England. It is—as I discovered—excellent for eating straight off the tree, and cooks down into a slightly sour, thick mush in pie.

Today, the Mac is one of only a handful of apples grown commercially. Industrialised food chains demand hardy, uniform, easily grown varieties which can withstand long periods of storage and transport without going off or developing bruises. Until comparatively recently, there were thousands of apple varieties to choose from. Writing about the United States, Rowan Jacobsen explains:

By the 1800s, America possessed more varieties of apples than any other country in the world, each adapted to the local climate and needs. Some came ripe in July, some in November. Some could last six months in the root cellar. Some were best for baking or sauce, and many were too tannic to eat fresh but made exceptional hard cider, the default buzz of agrarian America.

Nomenclature of the Apple: A Catalogue of the Known Varieties Referred to in American Publications from 1804 to 1904, by the pomologist WH Ragan, lists 17,000 apple names. I wonder if a small part of the enthusiasm fuelling the current rediscovery of old varieties—even neglected apple trees will continue bearing fruit for decades—is due to the multiple meanings we’ve attached to apples over many, many centuries. They feature prominently in classical and Norse mythology, where they are symbols of fertility, love, youth, and immortality, but also of discord. They are fruit with doubled meanings. The apple in fairytales represents both the victory of the evil stepmother and the beginning of our heroine’s salvation: her prince will kiss her out of the coma induced by the poisoned apple. In her novel The Biographer’s Tale, AS Byatt represents the two wives—one in England, the other Turkish—of the bigamist Victorian explorer Sir Elmer Bole with green and red apples. The fruit in the Garden of Eden—since at least the first century CE described as an apple—bestowed both knowledge and banishment.

If the name McIntosh seems oddly familiar, it may be because of a now-ubiquitous Californian brand: the Apple Macintosh, launched in 1984, was named ‘Apple’ by Steve Jobs—apparently then on a fruitarian diet—and ‘Macintosh’ after the Mac apple, a favourite of one of the company’s top engineers. It is appropriate that these sophisticated machines which offer access to so much knowledge—licit, illicit, open, secret—should be named for apples.

In Berlin Childhood around 1900, Walter Benjamin describes being woken early—at half past six—on winter mornings before school. His nursemaid would light the fire in a small stove by his bed:

When it was ready, she would put an apple in the little oven to bake. Before long, the grating of the burner door was outlined in a red flickering on the floor. And it seemed, to my weariness, that this image was enough for one day. It was always so at this hour; only the voice of my nursemaid disturbed the solemnity with which the winter morning used to give me up into the keeping of the things in my room. The shutters were not yet open as I slid aside the bolt of the oven door for the first time, to examine the apple cooking inside. Sometimes, its aroma would scarcely have changed. And then I would wait patiently until I thought I could detect the fine bubbly fragrance that came from a deeper and more secretive cell of the winter’s day than even the fragrance of the fir tree on Christmas eve. There lay the apple, the dark, warm fruit that—familiar and yet transformed, like a good friend back from a journey—now awaited me. It was the journey through the dark land of the oven’s heat, from which it had extracted the aromas of all the things the day held in store for me. So it was not surprising that, whenever I warmed my hands on its shining cheeks, I would always hesitate to bite in. I sensed that the fugitive knowledge conveyed in its smell could all too easily escape me on the way to my tongue. That knowledge which sometimes was so heartening that it stayed to comfort me on my trek to school.

The baked apple—Proust’s madeleine for twenty-first-century theorists—opens up Benjamin’s memories of childhood during a period of acute homesickness; as a child, though, he found in it the ‘fugitive knowledge’ of what lay ahead. It could fortify—sustain—him on the journey to school, between the dark warmth of home and the noise and brightness of school.

Notebooks contain the same fugitive knowledge: they are both guides for future action, and repositories of information, memory, fact gathered over time and place. They travel in pockets and backpacks and book bags from Drawn and Quarterly, accruing meaning, emotional and intellectual. They belong to time present, as well as time future and past.

Tangerine and Cinnamon by Sarah Duff is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.