
Posts from the ‘comment’ Category

Throwing Light

At the beginning of this year, The Economist published a worried article about the state of South Africa’s electricity supply. Eskom – the parastatal responsible for both generating and transmitting electricity across the nation – is in serious trouble. Rolling blackouts – called load shedding – have increasingly become the norm, used to reduce pressure on a grid under strain largely because of poor maintenance of transmission infrastructure:

South Africans now check electricity reports that read like weather forecasts: ‘There is a medium probability of load shedding today and tomorrow, with a higher probability on Thursday and Friday,’ said a recent Eskom tweet.

The introduction of artificial light – first gas lamps in the late eighteenth and early nineteenth centuries, then gradually electricity – profoundly shaped the ways in which human beings lived and worked. With lamps and bulbs to light the early mornings and nighttime, the workday lengthened, dinner moved much later in the day, and the hours of sleep were limited to the darkest period at night and the very early morning. Experiences of walking the city after dark changed. Sean Cubitt writes:

The history of invention in lighting technologies is extraordinarily brief: for millennia oil, fat and wax candles were the only lighting materials, and flame the only energy; their expense made darkness common for the poor majority. Only in the 1780s did mantles appear. Then the flood: limelight, arc light, gas lighting and incandescent electric light arrived hand in hand with the industrial city, its extension of the working day and its rush to produce new consumer rituals and needs in the illuminated windows of the department store. It is scarcely possible to imagine the megacities’ 24/7 lifestyles, the perpetual-motion machinery of modern manufacture, the constant flow of transport, without neon, incandescent and fluorescent lighting, headlamps and streetlamps.

The gradual electrification of domestic appliances, the slow spread of gas and electric ovens and stoves, and the growing availability of better refrigeration not only changed how people shopped, cooked, and ate, but also freed women from intensive domestic labour. For most of the world’s population, though, access to expensive electricity remains precarious. When Sierra Leone declared a three-day curfew to limit the spread of Ebola, many worried that households would be unable to store fresh produce in a hot country where refrigerator ownership is rare. Going to market there – and in other parts of the continent – doesn’t represent some kind of commitment to seasonal, locally produced, and organic eating; it is, rather, a product of necessity. Even in wealthier countries, irregular power supply is the norm, not the exception. As Alex Christie-Miller, a journalist based in Istanbul, remarked recently on Twitter, ‘people tweeting too much about power cuts’ is really a case of #SecondWorldProblems. Chimamanda Ngozi Adichie has written eloquently about the frequent power outages in Lagos:

Day after day, I awkwardly navigate between my sources of light, the big generator for family gatherings, the inverter for cooler nights, the small generator for daytime work.

Like other privileged Nigerians who can afford to, I have become a reluctant libertarian, providing my own electricity, participating in a precarious frontier spirit.

As we adapted to light – as electricity or power is described in Nigeria – so we have had to adapt to darkness, to hot days in summer, to very cold nights in winter. To periods of internet-lessness, to laptops with better batteries. This disruption of patterns of living shaped by electricity has forced us all into a series of – reluctant – accommodations. Solar panels and lamps, miners’ lamps, generators, inverters, hot boxes, and gas stoves all help to facilitate some form of normality. But we’ve had to change our routines too. Friends with light at night entertain those whose houses are in the dark. If my suburb is due to be load shed at the worst time – between 18:30 and 22:00 – I rush to finish as much work as I can before then, and spend the hours of darkness cooking because I have a gas stove, and then reading by my solar lamp.

Streetlamp and power lines, Gardens, Cape Town

Load shedding cuisine has become a feature of these blackouts too. Food magazines post lists of restaurants which continue to serve food – albeit frequently from a load shedding menu – when the power is cut. More often, we have to plan more carefully what and when we cook and bake. My mother managed to make enough brownies to feed 150 people at my sister’s wedding, all by keeping a beady eye on load shedding schedules. A couple of weekends ago, unsure if the electricity would go off in the afternoon, I made a cheesecake that didn’t require baking, only setting in a fridge – and fridges and freezers remain cold without power as long as their doors aren’t opened too often.

Some tips for coping with an uncertain electricity supply (when schedules fail, and when the inevitable barbecue isn’t possible or desirable):

  • If in possession of a gas hob: buy long matches to light the burners without toasting your fingers; invest in a stovetop whistling kettle; remember that leftovers need to be easily reheated on the stove (and not in the oven or microwave); add cold sauces to hot, almost-cooked pasta in the pan in which the pasta cooked, to avoid having to use more than one burner; couscous and bulgur wheat need only be steeped in boiling water; toast can be made in a frying pan.
  • A hot box, or wrapping a casserole dish in a thick blanket, will cook a pot of rice or a stew once it has been brought to the boil on the stove.
  • Make bread in darkness, allow it to rise overnight, and then bake it the following morning.
  • Open the fridge and freezer sparingly.
  • Keep frequently used ingredients – oils, vinegars, tinned goods, spices, pasta, rice – at the front of cupboards for easy location in the gloom.
  • Buy cooked chicken, fish, salami, and other protein to add to salads.
  • Not all cakes and puddings need to be baked: cheesecake, fridge cake, mousse, trifle, fool.
  • Plan ahead: allow things to defrost slowly instead of relying on the microwave; make dishes which can be eaten at room temperature; prepare sauces for pasta in advance.

More suggestions welcome.

In ‘A Temporary Matter,’ the first story in Jhumpa Lahiri’s collection Interpreter of Maladies (1999), a couple working through the devastating consequences of a stillbirth find a new way of speaking the truth to each other during a series of planned power cuts. The darkness allows them to say all the things they’d kept secret or avoided thinking about during the short period of their marriage.

In the Company’s Gardens, Cape Town

Partly because electricity has shaped our lives to such an extent that it’s unusual to live in complete darkness, we attach all sorts of positive meanings to the dark and to being without light. In disconnection, we find connection. Temporary darkness becomes a space for contemplation, for self-reflection, for re-connection with one another and the natural world. I don’t have much patience for those who suggest that this current round of load shedding will be morally improving for South Africans – that it’ll teach us the virtues of slowness, for instance – as access to electricity remains fairly uneven, despite the post-apartheid state’s success in increasing the proportion of households with electricity from 35% in 1990 to 84% two decades later. Also, as Ngozi Adichie writes:

I cannot help but wonder how many medical catastrophes have occurred in public hospitals because of ‘no light,’ how much agricultural produce has gone to waste, how many students forced to study in stuffy, hot air have failed exams, how many small businesses have foundered.

Hospitals are exempt from load shedding in South Africa, but her point still stands. The country comes to a grinding halt whenever the power is cut. Nonetheless, in a way, the darkness has become a space for a kind of truth-telling in South Africa: about an increasingly discredited state unable to fulfil one of its most basic functions – keeping the lights on.

Neighbourgoods Market, Braamfontein, Johannesburg

I am, though, interested to see how frugal cooking habits shaped by unpredictable electricity will change over the next few years – and not only in South Africa. Increasingly fragile electricity grids are not limited to the developing world. How will food writers rewrite recipes that depend on long periods of braising in expensive-to-heat ovens? How will recipe books make allowance for the difficulties of keeping fresh produce, fresh? How will our shifting relationship with energy produce new ways of cooking and eating?

Further reading

Sean Cubitt, ‘Electric Light and Electricity,’ Theory, Culture & Society, vol. 30, nos. 7/8 (2013), pp. 309–323.

David E. Nye, When the Lights Went Out: A History of Blackouts in America (Cambridge, MA: MIT Press, 2010).

Modern Times

A month or so ago, the food writer Todd Kliman was criticised for publishing an article in the Washingtonian titled ‘Can Ethiopian Cuisine become Modern?’ Although much of the response to the headline was, I agree, entirely justified – this is a silly, insulting question which invokes a stereotype about Africans being forever stuck in pre-modernity – Kliman’s article presents a considerably more nuanced argument. He is interested in why the Ethiopian food which he eats enthusiastically in Washington DC – a city famous for its Ethiopian restaurants – has changed relatively little in the past few decades. He writes:

But even though the cuisine’s profile has risen, the food itself hasn’t exactly evolved. Ethiopian restaurants have become markedly more fashionable over the last 20 years – gone are the days of sitting around woven-grass tables in dark, sometimes dank dens – but the cooking is hardly different from what you would have found four decades ago. A meal then is a meal now.

Put another way, Kliman investigates why Ethiopian food – particularly as it is prepared in the US – has not been made cosmopolitan. He acknowledges that what we now define as Ethiopian cuisine has only been so since the 1970s, when refugees fleeing the civil war opened restaurants selling cheap, delicious, and exotic-yet-familiar food to curious eaters in the West:

The educated elite who came to America in the ’70s might not look like culinary pioneers … but in selecting the roughly two dozen dishes they would introduce to American diners, they in effect codified the meaning of Ethiopian food in the West. (Most of these dishes come from the Gondar region … so just as Sicilian and Neapolitan red sauce and pizza came to mean Italian food to most Americans, Gondarean dishes have come to mean Ethiopian.)

These restaurants included special, vegetarian feast dishes on ordinary menus. They prepared puddings, added raw vegetables to salads, and cooked with boneless meat. Ethiopian cooking needed to be made palatable to foreign audiences. A good comparison to Ethiopian food in the US is Indian food in Britain. There, after the Second World War, largely Bengali cooks remade some of the dishes of the region to suit British tastes: not as hot, richer, and with a greater proportion of gravy to meat. The difference between these two cuisines, though, is that while it’s still possible to find old-fashioned curry houses across Britain, restaurants specialising in regional cuisines and in remaking Indian cooking traditions have also proliferated. Kliman suggests that one reason for Ethiopians’ hesitancy to embrace change – both in the US and, interestingly, in Ethiopia – has to do with the country’s fraught politics. One diner in Addis Ababa explained:

He talked about the coup, the war, the decades of suppression and fear. Just as Ethiopians are enormously proud that their country has been called the birthplace of civilization, he explained, they’re proud of the fact that they’re eating the same food as their nomadic, tribal ancestors. (And, not least, eating that food in the exact same way: with their hands.) Continuity can be equated with conservatism, yes. But in a country with a long history of political uncertainty and upheaval, it also signals stability and comfort.

Ethiopian cuisine has long been shaped by nationalism. During the late nineteenth century, at a time when a national identity and the idea of an Ethiopian state were being forged, the Ethiopian court pioneered a kind of cooking which it described as the national cuisine. This was a selective vision of what the majority of Ethiopians ate, but, nonetheless, became the basis of the cooking in cafes and restaurants that began to open in the early 1900s. In the past three decades or so, this national cuisine has been adopted as somehow encapsulating Ethiopia’s national identity – despite the fact that it bears little resemblance to what nomads would have eaten even in the recent past.

Ethiopian tea and coffee at Arts on Main, Johannesburg.

But even if Kliman isn’t really interested in Ethiopian food becoming ‘modern,’ this question about diet and modernity is an important one. The appeal of Ethiopian restaurants to leftwing Americans in the 1970s (ironically in Washington DC, one of the key cities of the Enlightenment) lay precisely in the fact that the food seemed to speak to their anxieties about modernity in an era of oil crises, growing fears of ecological disaster, and the slow emergence of finance capital. This was – they believed – food from a simpler, gentler, pre-modern time.

But American progressives have not always been so enthusiastic about immigrant cooking. In his wonderful book Revolution at the Table: The Transformation of the American Diet (1988), Harvey Levenstein devotes a chapter to the New England Kitchen (NEK), a project established in Boston in 1890 by Edward Atkinson, Wilbur Atwater, and Ellen Richards. Concerned about the growing potential for strikes and other forms of collective action in American industry, Atkinson, a prosperous Boston businessman, was interested in ways of improving the living conditions of his employees without raising their wages. Nutrition seemed to offer one way of solving this conundrum – an impression confirmed by the hugely influential scientist of nutrition, Wilbur Atwater. Ellen Richards, a chemist and the first woman graduate of the Massachusetts Institute of Technology, argued that ways needed to be found to apply scientific research and principles to the improvement – the modernising – of American households.

The result of this collaboration was the NEK, which was intended both as a research institute and as a school where working people could learn to prepare simple, nutritious meals. Initially, it appeared to be a raging success, attracting funding from Andrew Carnegie, and with branches soon opening in New York, Chicago, and Philadelphia. But the NEK model failed quickly, largely because it could not attract adequate numbers of the urban poor to its classes. This was due in part to the fact that the diet recommended by the NEK was distinctly dull: heavy in refined carbohydrates and sparingly flavoured. (This was in a time before the discovery of vitamins, so NEK staff were dismissive of the usefulness of fruit and vegetables.)

The ethnically varied working poor – composed mainly of Italians, French Canadians, the Irish, and Jews from eastern and central Europe – apparently served by the NEK were not interested in this bland, heavy ‘American’ cooking. Moreover, as Levenstein points out, the cuisines brought by these immigrants were far more than simple sustenance: they were the basis for new identities in a foreign land, they created social cohesion, and they were closely intertwined with women’s positions within both families and communities. Although the NEK project failed in some ways, its work was picked up in the early twentieth century by nutritionists who campaigned for the ‘Americanisation’ of immigrant diets, arguing that the strong flavourings of foreign diets served only to overwork digestive systems and encourage drinking. Meals had to be eaten on plates, rather than from bowls, and with knives and forks. Spaghetti was not deemed an appropriate dinner. This was modern eating for modern Americans.

This process was not particular to the US. Missionaries in nineteenth- and twentieth-century Africa taught converts on mission stations to eat with knives and forks, instead of communally, with hands. Home economics classes, the homecraft and Jeanes movements, and other interventions were intended to teach African women how to run modern, civilised homes shortly before and after independence.

But this suspicion of immigrant food and eating as being somehow both anti-modern and unpatriotic is worth considering. American nutritionists in the early decades of the twentieth century were also suspicious of how immigrant women bought their food, choosing to go to small delis owned by other immigrants instead of to larger grocery stores. South Africa is experiencing yet another wave of xenophobic violence – attacks on foreigners, most of them from the rest of the continent, as well as from China and south Asia, never really cease, but we’re witnessing a moment of particularly heightened violence – and the targets are often small spaza shops in informal settlements. Locals accuse foreigners of buying stock in bulk, thus undercutting South African businesspeople. One of the implications of the closure of these businesses is hunger: they sell food at much lower prices than the big supermarkets, which also tend to be taxi- and bus-rides away.

Apartheid’s project of race classification insisted that the race categories into which the population was divided were culturally defined: Indian people in Durban ate curry, ‘Malay’ people in Cape Town cooked bredie. Apartheid ideologues went out of their way to erase centuries of entangled histories. A refusal to engage with others – a refusal to understand our reliance on others – simply continues that project.

Sources

Timothy Burke, Lifebuoy Men, Lux Women: Commodification, Consumption, and Cleanliness in Modern Zimbabwe (Durham, NC: Duke University Press, 1996).

Nancy Rose Hunt, ‘Colonial Fairy Tales and the Knife and Fork Doctrine in the Heart of Africa,’ in African Encounters with Domesticity, ed. Karen Tranberg Hansen (New Brunswick, NJ: Rutgers University Press, 1992).

Harvey A. Levenstein, Revolution at the Table: The Transformation of the American Diet (New York: Oxford University Press, 1988).

James C. McCann, Stirring the Pot: A History of African Cuisine (Athens, OH: Ohio University Press, 2009).

Cows Come Home

Last week, Maharashtra, India’s second-biggest state and home to the country’s commercial capital Mumbai, approved legislation which would ban the sale or possession of beef. The slaughter of cattle – cows, bulls, and calves – is now illegal. The right wing Hindu nationalist Bharatiya Janata Party (BJP), which has been in power both nationally and in Maharashtra since May last year, argued that the Maharashtra Animal Preservation (Amendment) Act saves an animal revered by many Hindus as holy. In their view, this represents a victory for pious Hindus.

As many have pointed out, although some Hindus may be in favour of a ban on the slaughter of a beast which they believe to embody divinity, the consumption and sale of beef in India is a complex and contradictory business. Firstly, the beef trade is controlled largely by the country’s Muslim minority, and beef is consumed mainly by Muslims and by the even smaller Christian portion of the population. Despite the fact that India is supposed to be a secular state, this law is aimed directly at these religious minorities. Vashna Jagarnath writes:

This ban will devastate the beef industry in Maharashtra, an industry that is largely run by the Muslim minority. It is not an isolated act. On the contrary, it is part of a longstanding attempt by the Hindu right, now backed with the power of the state, to make the lives of religious minorities increasingly difficult.

The ban provides the fascist project with two immediate benefits – exerting control over the minorities by sending a clear message about their increasingly precarious position in contemporary India; and dealing an economic blow to Muslims who trade in the bovine industry.

In Gaborone, Botswana.

Secondly, this is not the first time that there have been efforts to control the slaughter of cattle in India. Several states have made the killing of cows illegal, and there are laws which limit the sale of beef in some areas. Indeed, the Maharashtra Animal Preservation (Amendment) Act took nineteen years to pass. The Bill was sent to the then-President to sign into law in 1996, but it foundered; only when the BJP returned to power in 2014 was it able to recommit to making the ban real.

And the ban has caused widespread outrage in India – and not only among Muslims and Christians. This is the third point: some Hindus eat beef too. Not all Hindus stick absolutely (religiously?) to vegetarianism. In 2001, the historian DN Jha faced harassment and attempts to prevent the publication of his – by all accounts fairly dry – monograph, The Myth of the Holy Cow. His not particularly fresh thesis was that Hinduism’s ban on beef is a relatively new phenomenon. Pankaj Mishra explains:

the cow wasn’t sacred to the nomads and pastoralists from Central Asia who settled North India in the second millennium BC and created the high Brahminical culture of what we now know as Hinduism.

These Indians slaughtered cattle for both food and the elaborate sacrificial rituals prescribed by the Vedas, the first and the holiest Indian scriptures. After they settled down and turned to agriculture, they put a slightly higher value upon the cow: it produced milk, ghee, yoghurt and manure and could be used for ploughing and transport as well.

Indian religion and philosophy after the Vedas rejected the ritual killing of animals. This may have also served to protect the cow. But beef eating was still not considered a sin. It is often casually referred to in the earliest Buddhist texts.

The cow became holy first for upper-caste Hindus between the seventh and the thirteenth centuries CE. These were the people who could afford not to spend most of their time producing their food. What changed, though, to identify vegetarianism with Hinduism?

The answer lies in the 19th century, when many newly emergent middle-class Hindus began to see the cow as an important symbol of a glorious tradition defiled by Muslim rule over India. For these Hindus, the cause for banning cow-slaughter became a badge of identity, part of their quest for political power in post-colonial India. Educated Muslims felt excluded from, even scorned by, these Hindu notions of the Indian past; and they developed their own separatist fantasies.

The implications of these nationalist beginnings during the Raj are now playing out in Maharashtra.

My final point is the one I found the most surprising: the effects of the ban on the export of beef. India not only exports water buffalo meat – the red meat of choice for many Indians – but also accounts for around twenty per cent of the world’s beef exports. The Maharashtra Animal Preservation (Amendment) Act has implications, then, for the global food supply. Beef has been a commodity traded on national and international markets since improvements in transport – railways, shipping – and, more importantly, refrigeration in the late nineteenth and early twentieth centuries. In the United States, the price of beef dropped in the 1870s and 1880s because of the opening up of huge ranches in the west, which were connected by rail to packing centres in large cities, most notably (and notoriously, given the revelations in Upton Sinclair’s The Jungle (1906)) Chicago.

Something similar happened in South Africa, when the politician and wildly successful businessman Sir David de Villiers Graaff, 1st Baronet, pioneered refrigeration, allowing fruit, vegetables, and meat to be transported across the country’s vast interior without spoiling. His Imperial Cold Storage and Supply Company – founded on the eve of the South African War (1899-1902), out of which De Villiers Graaff profited nicely – became one of the biggest meat packing businesses in Africa.

Refrigeration, along with large-scale tax avoidance, was also at the root of the wild success of the Vestey brothers’ beef empire in the early twentieth century. By 1922, Vesteys had, as Ian Phimister writes, ‘interests in South America, China and Russia, and extensive land holdings in South Africa; it gradually extended its operations to embrace Australia, New Zealand and Madagascar.’ The business shipped beef – produced cheaply under appalling conditions for both workers and cattle – around the world with ‘five steamers refrigerated and fitted for the carriage of frozen meat’.

A poster in Williamsburg’s Spoonbill & Sugartown bookshop.

The demand that drove the expansion of ranching and packing in the US, and De Villiers Graaff’s and the Vestey brothers’ businesses, was a growing middle-class taste for a meat once prohibitively expensive. Beef became – like sugar, chocolate, and tea – an affordable luxury once an industrialised food chain caused prices to fall. A similar process is currently underway in India, as an ever-bigger middle class chooses to add more beef to its diet. Although a small, committedly nationalist middle class was partly responsible for making Hindu diets vegetarian in the nineteenth century, the opposite is happening now. As part of a global circulation of both commodities and ideas – middle classes in other developing nations are also eating more red meat – to what extent will this large middle class be able to negotiate the demands of right wingers keen to protect the lives of holy cows, and the attractions of a more varied and ‘modern’ diet?

Sources

Ebbe Dommisse, Sir David de Villiers Graaff: First Baronet of De Grendel (Cape Town: Tafelberg, 2011).

Harvey A. Levenstein, Revolution at the Table: The Transformation of the American Diet (Berkeley, CA: University of California Press, 2003).

I. R. Phimister, ‘Meat and Monopolies: Beef Cattle in Southern Rhodesia, 1890-1938,’ Journal of African History, vol. 19, no. 3 (1978), pp. 391-414.

Apples and Oranges

One of my favourite scenes in Alice in Wonderland is when the Caterpillar asks Alice ‘Who are YOU?’ Having spent the day being shrunk, telescoped, and grown again, Alice is at a loss: ‘I—I hardly know, sir, just at present—at least I know who I WAS when I got up this morning, but I think I must have been changed several times since then.’ During a period obsessed with lineages, classes, and groups, Alice’s inability to slot herself into the correct category feels profoundly transgressive. Her ontological uncertainty—she remarks to the Caterpillar ‘I can’t explain MYSELF…because I’m not myself’—is more mature than the Caterpillar’s certainty: he will, as Alice points out, turn into a chrysalis and then a butterfly. Nobody is one thing for very long.

The same can be said, of course, for confectionery. Periodically, Britain convulses in a fraught debate over the status of the Jaffa Cake. In their commercial form these are rounds of Genoise sponge topped with orange jelly and covered with chocolate. Supermarkets sell bright blue packets of McVitie’s Jaffa Cakes in the same aisle as Digestive biscuits, Hobnobs, and shortbread. So to the uninformed, the Jaffa Cake is – despite its name – a biscuit.

But is it really? Legally, the Jaffa Cake qualifies as a cake. A long and complicated court case in 1991 ruled in favour of McVitie’s, confirming that the Jaffa Cake is indeed a cake and should not, then, be subject to VAT. Harry Wallop explains:

In the eyes of the taxman, a cake is a staple food and, accordingly, zero-rated for the purposes of VAT. A chocolate-covered biscuit, however, is a whole other matter—a thing of unspeakable decadence, a luxury on which the full 20pc rate of VAT is levied.

McVitie’s was determined to prove it should be free of the consumer tax. The key turning point was when its QC highlighted how cakes harden when they go stale, biscuits go soggy. A Jaffa goes hard. Case proved.

So this is a Cake which looks like a biscuit but is, in the eyes of the law, a cake.

Orange trees in Perth, Australia.

But this ontological uncertainty extends beyond its position as cake or biscuit. Jaffa Cakes are named after Jaffa oranges. (McVitie’s never trademarked the name Jaffa Cake, so chocolate-and-citrus flavoured confections are often described as ‘Jaffa.’) These were developed in Palestine – in and near the port city of Jaffa – during the 1840s. Sweet, seedless, and with a thick rind which made them perfect for transporting, Jaffa or Shamouti oranges became Palestine’s most important export in the nineteenth century. The arrival of Jewish immigrants in the 1880s and 1890s revolutionised citrus growing in the region. These new arrivals introduced mechanised, ‘scientific’ forms of agriculture, dramatically increasing yields.

By 1939, Jewish and Palestinian farmers – occasionally working collaboratively – together employed 100,000 people and exported vast quantities of oranges abroad. Britain was a major importer of Jaffa oranges, particularly after Palestine became a Mandated territory under British control in 1923. The Empire Marketing Board – which promoted the sale of imperial produce – urged Britons to buy Jaffa oranges, something picked up by McVitie’s in 1927 with the invention of the Jaffa Cake.

An Empire Marketing Board advertisement for Jaffa oranges.

Jaffa oranges were – and, to some extent, are – held up as an example of successful Palestinian and Israeli co-operation during the interwar period. But after 1948, the same oranges became a symbol of Israel itself. Similar to the boycott of Outspan oranges during apartheid, organisations like BDS have urged customers not to buy Jaffa oranges as a way of weakening Israel’s economy and demonstrating their commitment to a free Palestine. (Jaffa oranges are no longer, though, a major Israeli export, and are grown in Spain, South Africa, and elsewhere.)

The changing meanings of Jaffa Cakes – cake, biscuit – and their constituent ingredients – symbol of collaboration, symbol of oppression – show how the categories into which we slot food are themselves constructs. (We could, really, compare apples and oranges.) But also, the Jaffa Cake helps to draw our attention to how taxes, trade agreements, and the politics and practicalities of shipping shape the ways in which we eat, buy, and think about food. Last year, the supremely British McVitie’s – producer of the Jaffa Cake, the most widely recognised biscuit (I mean, cake) in Britain – was sold to Yildiz, a food group based in … Turkey.

New Wine

Last week some friends and I had supper at the Cube Tasting Kitchen. I should emphasise at the outset that, for all that I write a blog about food, I’m not a huge fan of the mad flights of fancy which characterise fine dining at the moment. I’m not into molecular gastronomy. I think it’s really interesting—and for a number of reasons, not only culinary—but given the choice between that and the sublime comfort food served at The Leopard and Woodlands Eatery, pizza at Stella e Luna, or dim sum at the South China Dim Sum Bar, I’d probably choose one of the latter.

But Cube was, really, entirely wonderful. And fun. It’s a small, box-shaped, white-walled restaurant in Joburg’s Parktown North, in a row of good and unpretentious middle-range restaurants, including Mantra, one of my favourite places to eat saag paneer. It was an evening of delights over fifteen courses. We began with six starters, each themed around a vegetable (tomato, cucumber, cabbage, potato) or a deconstructed (pissaladière) or reconstructed (Parmesan ice cream with balsamic vinegar, made to look like vanilla ice cream and chocolate sauce) version of a familiar dish. The cucumber came with a gin cocktail, the cabbage soup was blue and then turned purple, and the Parmesan ice cream didn’t really work.

Blue cabbage soup…

…that turns purple. (Apologies for the grainy photographs.)

That was okay, though. The fact that not every course was an absolute success was part of the fun. The infectious enthusiasm of the young chefs—who cook almost in the middle of the restaurant—and of the serving staff turned this into a game and an adventure. I had vegetarian main courses. The oddest, but most successful, was a combination of asparagus, hummus, and shards of meringue with black pepper. The most delicious were a mushroom soufflé and a curry reduced to its most basic elements. The most beautiful was a Jackson Pollocked plate of beetroot and leek, which was also, paradoxically, the least flavourful.

Beetroot and leek.

And pudding—after baklava and cheese, and a palate cleanser of sherbet, pomegranate jelly, and orange sponge consumed as you would tequila with salt and lime—was a forest floor of pistachio marshmallow, rice crispy and cranberry cookies, chilled chocolate mousse, dried flower and chocolate soil, coffee biscuits, lemon gel, and wheat grass. Then there were chocolate brownies and coconut ice.

Forest floor pudding.

The size of the portions and the length of time it took to eat all of this—we were there for more than three hours—meant that we could digest at leisure. Because this was as much an intellectual and sensory exercise as it was supper. It would be easy to criticise this kind of dining on the grounds that its purpose is not really to feed people: it uses good, expensive food to allow fairly wealthy paying customers to have fun. But it is equally true that food has always been about more than nutrition. Human beings have long consumed—sacrificed—food in the name of status and power, in performing rituals, and marking celebrations.

It is, though, interesting that molecular gastronomy—which has its roots in the nouvelle cuisine of the 1980s—came to prominence before and during the 2008 crash, in a period marked by ever widening social and economic inequality. (On a side note, it’s worth thinking about relative definitions of wealth: our meal at Cube was expensive, but within the realms of financial possibility even for someone on a fairly modest researcher’s salary. I would never be able to afford the same menu at a similar restaurant in London, for instance.) Molecular gastronomy does not—despite the grandiose claims of some of its practitioners—represent the future of food.

It does, though, represent the past. What sets the foams, pearls, and flavoured air of molecular gastronomy apart from other iterations of fine dining is its reliance on technology. Indeed, the twin gurus of this kind of cuisine—academics Nicholas Kurti and Hervé This—were interested in researching the chemical processes which occurred during cooking. Their acolytes—from Heston Blumenthal to Ferran Adrià and René Redzepi—have used this knowledge to disrupt, deconstruct, reconstruct, and undermine what we think of as ‘food.’

This work, though, does not fundamentally challenge our eating habits and choice of things to eat. Noma might serve insects and Blumenthal may have invented snail porridge, but molluscs and insects have been part of human diets for a very long time. I think that a more accurate name for molecular gastronomy is, really, modernist cuisine—the title of Nathan Myhrvold’s 2011 encyclopaedic guide to contemporary cooking. In all of its reliance on and enthusiasm for technology, molecular gastronomy is supremely modern: this is the food of industrialisation. It is as heavily processed as cheese strings. Modernist cuisine is the logical extreme of an industrialised food system.

In a Nutshell

On my fridge, I have a collection of business cards from cafes and shops visited on trips abroad. This afternoon—months late—I added another few from a recent month-long stay in Canada and the US, and I was reminded of a fantastic breakfast at the August First bakery in Burlington, Vermont. I was in Burlington for a conference and spent a couple of days beforehand working and wandering around a small university town – I grew up in a small university town, so I have a professional interest in them – which has a reputation for extraordinarily progressive and inclusive politics.

There were posters advertising make-your-own banjo classes (out of gourds, apparently), vegan Thanksgiving, and homebrew nights; the local Democratic party was next door to a Tibetan dumpling shop; and I have never been so aware of the plight of the Dalai Lama as I was in the week I spent in Vermont. And there was the most amazing co-operative, which had a wall – a wall! – of granola. Progressive America is, truly, the most amazing place. (In a similar vein, Ann Arbor’s community co-op is opposite a Birkenstock shop.)

I had, then, granola at August First. And it was wonderful granola, with whole walnuts and fat raisins, and with plenty of really good plain yoghurt. Burlington has embraced its granola. But – and I write this as one who makes her own granola – there is a contradiction at the heart of the association of granola with progressive living: a lot of the time, it’s full of sugar. Unlike muesli, which is left raw, granola is baked, usually with honey, maple syrup, or (sometimes and) sugar, as well as oil and, occasionally, egg white. This is not necessarily the healthiest breakfast. So why does granola signify healthy eating?

This isn’t the only food to be linked to left-wing politics. Paul Laity notes:

‘Socialism,’ George Orwell famously wrote in The Road to Wigan Pier (1936), draws towards it ‘with magnetic force every fruit-juice drinker, nudist, sandal-wearer, sex-maniac, Quaker, “Nature Cure” quack, pacifist and feminist in England.’ His tirade against such ‘cranks’ is memorably extended in other passages of the book to include ‘vegetarians with wilting beards,’ the ‘outer-suburban creeping Jesus’ eager to begin his yoga exercises, and ‘that dreary tribe of high-minded women and sandal-wearers and bearded fruit-juice drinkers…’

Orwell’s ‘cranks’—a term reclaimed by the London vegetarian restaurant Cranks in 1961—were the free-thinking and free-living British Bohemians of the early twentieth century, who experimented with new forms of comfortable dress, sustainable eating, eastern religions, egalitarian social arrangements, and alternative sexual identities. This early counter culture was strongly influenced by late nineteenth-century dieticians and naturopaths—many of them based in Germany—who advocated raw, simple eating in contrast to the meat- and starch-heavy meals which characterised most middle-class diets.

As Catherine Carstairs remarks in her essay ‘The Granola High: Eating Differently in the Late 1960s and 1970s,’ it was immigrants from central Europe who brought health food shops to North America, stocking vitamin supplements, wholewheat bread, and, inevitably, fruit juice. It was these shops that made widely available the foods eaten at more exclusive sanatoriums in Europe and the United States.

Like muesli and bircher muesli, granola was invented in a health spa. In her excellent and exhaustively detailed history of granola, Karen Hochman argues that Dr James Caleb Jackson—a farmer, journalist, and doctor—invented granula in 1863 for the patients at his spa, Our Home on the Hillside, in upstate New York. Relying heavily on Graham flour—invented by the dour evangelical preacher Sylvester Graham—he baked sheets of biscuits and crumbled them into granules to be soaked in milk and then eaten for breakfast. It’s likely that granula—the predecessor of Grape Nuts—would never have moved beyond the confines of Our Home on the Hillside had it not come to the attention of a rival sanatorium doctor and Seventh Day Adventist, John Harvey Kellogg, who used rolled, toasted oats instead of Graham flour biscuits. He renamed his product granola, and it became for a while a significant money earner for his Sanitarium Food Company (renamed Kellogg’s Food Company in 1908).

But enthusiasm for granola remained—largely—limited to the relatively small numbers of people who shopped in health food stores until the 1960s and 1970s. Then, concern about the effects of pesticides and additives on human, plant, and animal health; suspicion of the food industry; a desire to experiment with diets from elsewhere; and a back to the land movement all coincided to produce an interest in purer, healthier, more ‘natural’ foods. Hippies—another food counter culture—looked back and found granola. So did big food companies, as Hochman writes about the US:

Granola went mainstream in 1972, when the first major commercial granola, Heartland Natural Cereal, was introduced by Pet Incorporated. In rapid succession, Quaker introduced Quaker 100% Natural Granola; Kellogg’s introduced Country Morning granola cereal and General Mills introduced Nature Valley granola.

The sweet, nut- and dried fruit-filled granola we eat today is derived from the granola reinvented in the 1960s and 1970s. Despite having been popularised by Quaker and General Mills—the enemies of the second food counter culture—granola retained its association with progressive, healthy living.

This cultural history of granola tells us three things, I think. Firstly, that the food counter culture has roots in alternative experiments in living stretching as far back as the late eighteenth century, when vegetarianism and lighter diets were picked up as markers of enlightened, rational eating. Secondly, that business has long taken advantage of the experiments done by people working and living on the fringes of respectability.

Finally, it also traces the shifting meanings of what we define as ‘healthy.’ Despite evidence presented to us by nutritionists, what we think of as being healthy food depends on a range of factors, including whether, historically, a product has been associated with health-conscious living.

Presumptuous Insect

A few months ago, I was interviewed on a radio station about changing attitudes towards food and eating. After a caller commented that when he’d lived in rural Limpopo he’d happily eaten frogs, but that since moving to Johannesburg he preferred McDonald’s, I managed—somehow—to talk myself into an urgent appeal to the nation to eat insects. I’m still not entirely sure how this happened, but I think it was partly connected to the recent slew of articles on why we need to eat insects to save the planet.

This insect turn in culinary fashion is, of course, nothing new. In 1885, the entomologist Vincent M. Holt published Why not eat insects? To some extent, current arguments for eating insects deviate little from this slim manifesto. Holt remarks, rightly, that there is nothing inherently dirty about insects—in fact, crustaceans, being bottom feeders, are potentially more dangerous to eat—and that they can form part of a balanced diet. He suggests that Western aversion to eating them is linked strongly to culturally specific ideas about what is fine and not fine to eat. He cites the example of a Chinese banquet at an exhibition in London, pointing out that Britons happily sampled a menu which included cuttlefish, sea slugs, and birds’ nests because it was both exotic and, apparently, healthy. Europeans ate insects in the past, and, according to Holt, societies in Africa, Asia, and elsewhere happily eat them still:

Beginning with the earliest times, one can produce examples of insect-eating at every period down to our own age. Speaking to the people of Israel, at Lev. xi. 22, Moses directly encourages them to eat clean-feeding insects: ‘These ye may eat, the locust after his kind, and the bald locust after his kind, and the beetle after his kind, and the grasshopper after his kind.’ …

Cooked in many and various ways, locusts are eaten in the Crimea, Arabia, Persia, Madagascar, Africa, and India. … From the time of Homer, the Cicadae formed the theme of every Greek poet, in regard to both tunefulness and delicate flavour. Aristotle tells us that the most polished of the Greeks enjoyed them… Cicadae are eaten at the present day by the American Indians and by the natives of Australia.

He appeals to his readers:

We pride ourselves upon our imitation of the Greeks and Romans in their arts; we treasure their dead languages: why not, then, take a useful hint from their tables? We imitate the savage nations in their use of numberless drugs, spices, and condiments: why not go a step further?

Contemporary interest in eating insects is, though, strongly connected to anxieties about a food chain which seems increasingly ecologically unsustainable. Current methods of producing enough protein for the world’s population come at the cost of animal welfare and good labour practice, consume vast quantities of water, and produce methane and other greenhouse gases. Something needs to change, and insect enthusiasts argue that crickets, grasshoppers, and caterpillars are a viable alternative to beef, chicken, and pork. In a 2013 report for the Food and Agriculture Organisation, the Dutch entomologist Arnold van Huis—academic and author of The Insect Cookbook: Food for a Sustainable Planet—notes that more than 1,900 species of insects already form part of the diets of ‘at least two billion people.’ A lot of these insects are high in protein—higher, in some cases, than beef—and other nutrients. Many of them consume waste, and farming them is comparatively cheap and requires little labour.

This promotion of what Dana Goodyear calls ‘ethical entomophagy’ in Anything That Moves: Renegade Chefs, Fearless Eaters, and the Making of a New American Food Culture has met with some commercial success. There are now—outside of regions where insects are normally part of diets—businesses dedicated to farming insects for human consumption. It’s possible to buy cricket flour; Selfridges sells chocolate-covered giant ants; and pop-up restaurants and Noma have featured insects on their menus. The logic is that these high-end sales of edible insects will gradually influence the middle and bottom of the market. A kind of ‘trickle down’ revolution in diet.

While it is certainly true that we can and have chosen to eat foodstuffs once deemed to be dangerous or socially taboo—potatoes in eighteenth-century France, beef in Japan during the Meiji Restoration—these shifts in attitude take time to achieve. Also, in the case of potatoes and beef, these societies were strongly hierarchical with powerful aristocracies. Thankfully, most of us no longer live in a world where the king’s decision to consume a formerly shunned ingredient changes the way that all of us eat.

As every recent article on entomophagy notes, the main obstacle to the widespread incorporation of insects into, particularly but not exclusively, Western diets is a strong aversion to eating them. If only, the argument goes, picky Westerners would give up their hypocritical dislike of insects—they eat shrimp and prawns, after all—then we’d all be fine. But I think it’s worth taking this dislike seriously. As Goodyear points out, a lot of these insects aren’t particularly delicious. She tries embryonic bee drones picked from honeycomb:

the drones, dripping in butter and lightly coated with honey from their cells, were fatty and a little bit sweet, and, like everything chitinous, left me with a disturbing aftertaste of dried shrimp.

I’ve eaten fried, salted grasshoppers at a food festival on London’s south bank, and they were crunchy and salty—improved, like most things, by deep frying—but otherwise memorable only for having been grasshoppers.

Making insects palatable involves processing, something which almost inevitably increases the ecological footprint of the product. Perhaps even more importantly, as the caller I referred to at the beginning of this post said, insects are widely associated with poverty and deprivation. Modernity—life in the city—requires a new diet. While it is true that in many societies, people do eat insects out of choice, it is equally significant that when they can, people stop eating insects as soon as possible.

Our current anxiety about sustainable sources of protein is driven partly by concern that the new middle classes in China and India will demand to eat as much beef, in particular, as their Western counterparts. I wonder to what extent this concern is part of a long tradition of Malthusian yellow peril: the fear that China, in particular, will somehow eat up all the world’s resources. I don’t have any objection to promoting entomophagy—although trickle-down strategies have a fairly low rate of success—but I think we should look more carefully at the reasons underpinning our interest in investing in alternative forms of protein, and also take care that we don’t dismiss the interests and tastes of people clawing their way out of poverty.

Not for all the tea

When I was finishing my PhD, my friend Jane gave me a t-shirt emblazoned with the slogan ‘tea is not a food group.’ She used to shout that into my room—we lived a few doors down from each other in the same student residence—as she passed me on the way to the lift. She had good reason for doing so. When I’m absorbed in writing, I can forget that the world exists: that it’s necessary to brush your hair, dress properly, cook, and not have conversations with yourself out loud. And that it’s unwise to subsist on tea.

Over the past couple of months, I’ve been in the final throes of completing a book manuscript and I’ve tried—probably not always successfully—to maintain at least a semblance of normal, civilised behaviour, but tea has remained a constant. It’s a kind of writing comfort blanket; a small routine in the middle of anxious typing. In some ways, then, it was a misfortune to be in the United States for much of this period. I could drink as much excellent coffee as I could cope with, but tea? Good strong, hot black tea? Until I discovered a branch of TeaHaus in Ann Arbor, not so much.

I know that I’m not the first to complain about the difficulty of finding a decent cup of black tea in the US, and, to some extent, the belief that Americans don’t understand hot tea is something of a misconception. Teavana, Argo, and TeaHaus all attest to an enthusiasm—an apparently growing enthusiasm—for well-made tea. I’ve never encountered so many different kinds of tea in supermarkets. (And, truly, Celestial Seasonings is the best name for a brand of tea.) But it is true, I think, that it’s hard to find really good black tea in the average café. While this is probably linked to the fact that most tea drunk in the US is iced tea, it’s also because tea in these establishments is made with hot—not boiling—water. This is crucial. Tea leaves need to steep in freshly boiled water.

Tea.

This aversion to boiling water can be traced back to a 1994 civil case: Liebeck v. McDonald’s Restaurants. Two years previously, Stella Liebeck, an elderly Albuquerque resident, had spilled a cup of scalding coffee over her lap. She sued McDonald’s, and was initially awarded $2.7 million in punitive damages. While for some the case has become emblematic of the madness of a litigious legal system, the truth is considerably more complex. Not only had Liebeck suffered third-degree burns—resulting in extensive reconstructive work and long stays in hospital—but she and her family only sued McDonald’s as a last resort. When their reasonable request that McDonald’s cover her medical bills was turned down, they decided to go to court. Moreover, in the end, Liebeck received considerably less than $2.7 million: the judge reduced that sum to $480,000, and she eventually received between $400,000 and $600,000 in an out-of-court settlement with McDonald’s.

This was not, then, a frivolous lawsuit. But it was interpreted as such, and became one of the examples cited in efforts to reform tort law—the legislation which allows people to sue others in the case of injury to themselves or their property—in the US. As some lawyers argue, the Republican-led tort reform lobby isn’t really an attempt to reduce the number of lawsuits submitted by greedy people, but, rather, an attempt to protect business from having to pay for its mistakes.

For tea drinkers, though, this misperception (fanned by tort reform campaigners) has resulted in tepid, unpleasant cups of tea. Concerned about similar lawsuits, restaurants now serve hot—rather than boiling—water. But perhaps there is a kind of poetic—or historical—logic to having to search high and low for decent tea in the US. The tipping of chests of tea into Boston’s harbour in 1773 was both a defiance of the Tea Act and a rejection of Britain’s right to tax the thirteen colonies. When patriots switched to coffee—indeed, some refused even to eat the fish caught in or near the harbour on the grounds that they could have consumed some of the tea—it was in defiance of British rule. In the land of the free, shouldn’t tea be hard to come by? This association of coffee and freedom wasn’t new, even then. Coffee houses in eighteenth-century Britain and Europe were places where middle-class men could gather to talk and think. The work of the Enlightenment was done, to some extent, over cups of coffee. But coffee was produced on slave plantations, and coffee houses—and the freedoms discussed in them—were largely for white men. Coffee represented, then, the freedom of the few.

Like so many people recently, I’ve been thinking about the historical contexts which produced the principles on which liberal democracies are founded. Freedom of expression and of thought, freedom to gather, and freedom of religious belief are fundamental to the functioning of liberal democracies. Regardless of the fact that these principles originated during a period in which they applied mainly to white men—and regardless of the fact that they have not prevented injustices from being committed (sometimes in their name) in liberal democracies—they remain the best, albeit imperfect, protection of the greatest number of freedoms for the greatest number of people.

To suggest that they are somehow a western invention inapplicable to other parts of the world would be an enormous insult to Egypt’s cartoonists who continue to criticise successive oppressive governments despite risking potential imprisonment or worse; to Saudi Arabian blogger Raif Badawi, who received the first fifty of a thousand lashes last Friday, for writing in support of free expression; to the Kenyan MPs who last year so strongly opposed a new security bill which will dramatically curb journalists’ ability to report freely. Also, it would be a profound insult to the vast majority of Muslim people in France and elsewhere—members of a diverse and varied faith—who managed to cope with the fact that Charlie Hebdo and other publications ran cartoons which insulted or poked fun at Islam.

Whether you think that the cartoons in Charlie Hebdo were amusing or clever or blasphemous or racist is beside the point. Free speech and free expression were no more responsible for the killings in France last week than they were for the murder of more than two thousand people in Nigeria by Boko Haram. This isn’t to argue that we shouldn’t discuss—loudly, freely, rudely—how right or wrong it was to publish these cartoons in a society which many feel has strongly Islamophobic and racist elements—in the same way we should debate potentially misogynistic, anti-Semitic, racist, homophobic, or transphobic writing, art, or speech too. But to begin to suggest that there are times when we shouldn’t criticise and satirise is to suggest that there should be limits to what we may think and imagine.


Ironically

I spent much of my time in Ann Arbor in coffee shops, writing. Having conquered my guilt at working in cafes, occupying space which could be filled by more paying customers (truly, a Calvinist education never really leaves you), I embraced America, the land of the free wifi. One of my favourite places for working was Mighty Good Coffee, a relatively new shop and café on North Main Street – about a three-minute walk on the diagonal from Kerrytown – which is bright and airy and friendly, with lovely coffee and a fridge full of yoghurt.

It also sells artisanal toast. Curious, I tried first a slice of ten seed loaf (good), and then returned with friends and ordered sourdough with cherry jam (very good indeed). But what sets artisanal toast apart from ordinary toast? Was it made by elves, as a friend asked acerbically on Facebook? As far as I could see, this was particularly nice bread, toasted in a fairly fancy toaster, served with rather special butter and jam – all for slightly more than $3.

My—and, I think, other people’s—interest in Ann Arbor’s first (possibly?) instance of artisanal toast was piqued by an article published by the Pacific Standard early last year. In it, John Gravois traces the origins of the artisan toast vogue to San Francisco and the Trouble Coffee & Coconut Club and, more specifically, to its owner, Giulietta Carrelli. The café is, as she comments, her way of coping with bouts of recurring mental illness: it provides structure, stability, and a support network, and it serves food which comforts. Gravois explains: ‘She put toast on the menu because it reminded her of home: “I had lived so long with no comfort,” she says.’

What could easily have been a story about hipsters selling the most ordinary of ordinary breakfast foods for outrageous sums of money becomes, then, a quite moving account of a young woman’s strategies for dealing with, at times, debilitating episodes of mania and psychosis. But, as Gravois notes, her decision to include toast on Trouble’s—otherwise eccentric—menu was picked up by other, more typically hipster San Francisco cafes where artisanal toast became another marker—alongside drip coffee, beards, lumberjack shirts—of hipsterdom.

Artisanal toast: ten seed loaf, blueberry jam.

Artisanal toast: ten seed loaf, cherry jam.

At the same time as I tried Mighty Good’s toast, commentators were outraged by the latest artisanal craze: ice. Large, dense, clear cubes of ice for artisanal cocktails—mixed with homemade or small batch bitters, liqueurs, and sodas—which fit better into glasses and melt more slowly. But, as Mother Jones reported, manufacturing, transporting, and storing artisanal ice is hugely energy inefficient. It is done at some cost to the environment.

In these terms, ‘artisanal’ means handmade and small scale—it means paying attention to the production of otherwise mass-produced or mundane items like toast or ice or bread or beer or crisps. There is something innately ridiculous in elevating toasted bread to the status of cult object. The enthusiasm for the artisanal is, to be kind, an attempt to reclaim the ‘authentic’ (whatever that may be) in the face of a wholly industrialised food chain, and, to be less kind, as much of a fashion as brogues, topknots, and foraging.

Artisanal toast is a useful example through which to explore what we understand by irony. Hipsters are routinely accused – and I use ‘accused’ deliberately – of dressing, eating, reading, thinking, and being ironically. In an essay for the New York Times, the philosopher and literary scholar Christy Wampole writes:

Before he makes any choice, he has proceeded through several stages of self-scrutiny. The hipster is a scholar of social forms, a student of cool. He studies relentlessly, foraging for what has yet to be found by the mainstream. He is a walking citation; his clothes refer to much more than themselves. He tries to negotiate the age-old problem of individuality, not with concepts, but with material things.

Hipsters’ knowing adoption of the unfashionable, the old-fashioned, and the obscure is, she argues, a form of irony: an appropriation of a set of markers without any real commitment to what they signify.

I would tend to disagree with Wampole – on this point and on her broader argument about living without irony (and her conflation of hipster and millennial) – because I’m not entirely sure that irony is the defining characteristic of hipsterdom. The embrace of the artisanal, hipsters’ enthusiasm for recovering forgotten recipes and fashions, their opposition to the corporate and the mass-produced (generally – some brands, like Apple, seem to be immune to this), and even the strain of literary seriousness which runs through some iterations of hipsterdom all seem to me to denote seriousness, even earnestness. Occasionally, this tips into twee, as Judy Berman observes:

twee is anti-greed and suspicious of an adult world that revolves around avarice. More importantly, twee is aware of humanity’s capacity for violence and evil, but chooses to be optimistic about human nature nonetheless. This could be a progressive stance—one that not only believes we’re capable of improvement but works toward it. In practice, though, twee politics too often prescribe escapism and isolation, allowing the privileged to respond to crises both global and personal by sticking their fingers in their ears and yelling, ‘Na na na, can’t hear you!’

If being a hipster were predicated only on irony – on not taking any of this seriously – then it would be difficult to establish cafes, shops, literary journals, and other enterprises dedicated to the small scale, the cool, and the exclusive. In fact, what much of the writing on hipsterdom misses is that it is precisely this: exclusive. It is a subculture of the (upper) middle classes. For all that young hipsters have colonised historically poor parts of cities, being a hipster is expensive. Organic vegetable boxes, iPhones, copies of n+1, and fixed-gear bicycles aren’t cheap.

Much of hipsters’ political and social cluelessness stems from their position of privilege. And here it’s worth thinking more about hipsters’ politics. For all that I think most hipsters would label themselves progressives, there is a strangely libertarian strand within hipster attitudes towards food in particular. This connection between some kinds of right-wing politics and a return to the land is by no means unusual or new. Most recently, the locavore movement – in its suspicion of big business and agriculture, which bleeds into a suspicion of big government – has been taken up by libertarians in some red states in the US. But I think that, for some hipsters, the urge to learn the skills of rural living – to learn self-sufficiency – has been produced by the profound economic and social uncertainty of the past decade or so. It is no coincidence that hipsterdom emerged at around the same time as the 2008 crash. Dana Goodyear describes a feast she attends in Anything That Moves: Renegade Chefs, Fearless Eaters, and the Making of a New American Food Culture:

Jonathan, a strawberry-blonde roaster at an artisanal coffee shop in Orange County, espoused a more complex view. Late in history, with America’s institutions crumbling around them, he and his friends felt mistrustful, even paranoid. They had retreated into Home Ec, believing that if the worst were to happen, at least they’d know how to pickle their own vegetables. ‘Our generation feels lost,’ he said. ‘We’re wanting to be self-sufficient.’

The parallels between hipsters and their parents’ generation – the Baby Boomers – are particularly evident here. Hippies’ enthusiasm for homesteading and green living, and their rediscovery of lost crafts and skills, were partly a reaction against the growth of the corporate, but they also signalled a profound lack of faith in mainstream society, something only amplified by the environmental and economic crises of the 1970s.

My point is that if we understand hipster earnestness as a product of both privilege and crisis, it helps, first, to rethink the position of irony within hipsterdom. Irony becomes a means of establishing a line between those who understand it and those who don’t: a boundary marker, but not what constitutes being a hipster. Second, it helps to illuminate the politics of hipsterdom. However seriously meant, a reclaiming of old-fashioned forms of cooking and preserving, an interest in old recipes, and a commitment to organic and free-range food do not necessarily signal progressive politics. If anything, these are the interests and pursuits of the leisured and the moneyed. To what extent are hipsters a manifestation of inequality?


(Mostly) #ReadWomen2014

Earlier this year, the excellent Joanna Walsh inadvertently started a campaign to encourage wider and more extensive reading of women writers. What began life as New Year’s cards featuring a collection of women writers soon transformed into a Twitter hashtag—#ReadWomen2014—and then into book clubs, discussions, and a campaign which seeks, simply, to ‘create a little extra space … in which more women can be heard more loudly, both by women and men.’

Joanna Walsh’s bookmarks for #ReadWomen2014. To order:
They cost £10/$16/€13 for a sheet, including postage anywhere in the world. If you’d like more than one sheet, any number of subsequent sheets posted together cost £5/$8/€7 each.
To buy a set, go to Paypal, and send your payment to readwomen2014@gmail.com. Please leave a message with your payment confirming the number of sheets you’d like, and the address you’d like them sent to.

I think, often, that the key to being happy as an academic is to realise how strange an occupation it is. In my case, it is doubly odd because I work in a research institute and have minimal teaching duties. I am paid to think, to write, to travel, and to read. I spend most of my time reading, and yet am constantly on the edge of panic that I’m not reading enough. Academia is a conversation with other writers. Everything is historiography. Not to read is intellectual failure.

But I need to read beyond work – mostly novels, memoirs, essays, and occasionally short stories. I read to feel the world more intensely; to feel myself in the world more intensely. I read to remind myself that what I feel is felt and shared – and has been felt and shared – by so many others. To some extent, to distinguish between academic and non-academic books is arbitrary. The most moving, thought-provoking, and beautifully written book I’ve read this year was published by a university press. But for the sake of categories, and because I read fiction differently, here is a list of all of 2014’s non-academic books. I’ve not been particularly careful about reading only women writers this year, despite supporting Joanna’s campaign. And out of twenty books read and being read, twelve were by women. Reading two novels and a memoir by Michael Ondaatje – who, for some reason, I think would be a fan of #ReadWomen2014 – rather tipped the balance in favour of male writers.

I like this comment by Alexander Chee in his article about #ReadWomen2014:

I think women writers appealed to me because they acknowledged the struggles of women as well as those of men; as writers, they simply provided a fuller picture of the world.

I think so too. I think this is why I tend to reach, instinctively, for women writers.

Read in 2014: Anne Lamott, Bird by Bird; Bill Buford, Heat; Geoff Dyer, Jeff in Venice, Death in Varanasi; Ann Patchett, Bel Canto; Francesca Marciano, Casa Rossa; Monique Truong, The Book of Salt; Hannah Kent, Burial Rites; Michael Ondaatje, Running in the Family, In the Skin of a Lion, and The Cat’s Table; Robertson Davies, The Rebel Angels; Chimamanda Ngozi Adichie, Americanah; Michael Paterniti, The Telling Room; Joanna Rakoff, My Salinger Year; Jane Bowles, Two Serious Ladies; Rebecca Solnit, Men Explain Things to Me; Jane Smiley, The Greenlanders.

Can’t/won’t finish: Michel Houellebecq, Platform.

Still reading: Eleanor Catton, The Luminaries; Marilynne Robinson, Lila.

To read next: Karen Russell, Vampires in the Lemon Grove; Dana Goodyear, Anything That Moves; WG Sebald, The Emigrants, and Austerlitz.

Tangerine and Cinnamon by Sarah Duff is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.