- ‘Manufacturers can also buy … eggs pre-formed into 300g cylinders or tubes, so that each egg slice is identical and there are no rounded ends.’
- US chefs talk GMO labelling.
- Diets are worsening.
- Updating the USDA’s Dietary Guidelines for Americans.
- ‘But is being fed poorly inhumane? Should criminals be deprived of any pleasure from food? Isn’t that counterproductive if the purpose of imprisonment is rehabilitation?’
- São Paulo is running out of water.
- Some states in the US are considering legalising the sale of raw milk.
- Understanding the gluten-free trend.
- How to spend $300,000 on dinner.
- Cafe Neo in Lagos.
- Schools in Rome diversify their menus.
- Explaining the munchies.
- Manhattan’s best designed coffee shops.
- How to choose and look after knives.
- Eat chocolate cake for breakfast, lose weight.
- A recipe for Grewia occidentalis berries. (Thanks, mum!)
- Sewerage brewerage.
- ‘For the best breakfast, I vote the Socialist era.’
- Why are Kinder Surprise eggs illegal in the US?
- Catering for the fashion industry.
- A short history of the samosa.
- Penguins can only taste the saltiness and sourness of their food.
- When England was the coffee capital of Europe.
- What chefs hate to cook.
- The Carson McCullers diet.
- ‘Cooking … is a process that enables us to increase the calorie density of our food, so it’s almost as if you’re making calories out of nothing.’
- A robot that feeds you tomatoes as you run.
- Kanye West’s favourite restaurant.
- Learning to make La Genovese in Naples.
- If cities were made out of food.
- Grape molasses cake.
- The art of the crisp sandwich.
- Burmese pudding.
- The kitchen of the future.
- Joan Didion’s recipe book.
- Why not drink pig milk?
- A world in a grain of salt.
- Join a chilli club. And a guide to very, very hot chillies.
- An optical illusion placemat.
- Camembert shortbread.
- An obituary for Michele Ferrero.
- Stop motion latte art.
- New York City’s salt mountains.
- A cooking disaster.
- The man who invented Sriracha.
- Protein from sugar beet leaves.
- Food-themed art.
- Betty Crocker’s jelly salad.
- A guide to the English breakfast.
- Unfashionable sauces.
One of my favourite scenes in Alice in Wonderland is when the Caterpillar asks Alice ‘Who are YOU?’ Having spent the day being shrunk, telescoped, and grown again, Alice is at a loss: ‘I—I hardly know, sir, just at present—at least I know who I WAS when I got up this morning, but I think I must have been changed several times since then.’ During a period obsessed with lineages, classes, and groups, Alice’s inability to slot herself into the correct category feels profoundly transgressive. Her ontological uncertainty—she remarks to the Caterpillar ‘I can’t explain MYSELF…because I’m not myself’—is more mature than that of the Caterpillar, who will, as Alice argues, turn into a chrysalis and then a butterfly. Nobody is one thing for very long.
The same can be said, of course, for confectionery. Periodically, Britain convulses in a fraught debate over the status of the Jaffa Cake. In their commercial form these are rounds of Genoise sponge topped with orange jelly, and covered with chocolate. Supermarkets sell bright blue packets of McVitie’s Jaffa Cakes in the same aisle as Digestive biscuits, Hobnobs, and shortbread. So to the uninformed, the Jaffa Cake is – despite its name – a biscuit.
But is it really? Legally, the Jaffa Cake qualifies as a cake. A long and complicated court case ended in 1991 with a ruling in favour of McVitie’s, confirming that the Jaffa Cake is indeed a cake and should not, then, be subject to VAT. Harry Wallop explains:
In the eyes of the taxman, a cake is a staple food and, accordingly, zero-rated for the purposes of VAT. A chocolate-covered biscuit, however, is a whole other matter—a thing of unspeakable decadence, a luxury on which the full 20pc rate of VAT is levied.
McVitie’s was determined to prove it should be free of the consumer tax. The key turning point came when its QC highlighted that cakes harden when they go stale, while biscuits go soggy. A Jaffa goes hard. Case proved.
So the Jaffa Cake is a thing which looks like a biscuit but is, legally at least, a cake.
But this ontological uncertainty extends beyond its position as cake or biscuit. Jaffa Cakes are named after Jaffa oranges. (McVitie’s never trademarked the name Jaffa Cake, so chocolate-and-citrus flavoured confections are often described as ‘Jaffa.’) These were developed in Palestine – in and near the port city of Jaffa – during the 1840s. Sweet, seedless, and with a thick rind which made them perfect for transporting, Jaffa or Shamouti oranges became Palestine’s most important export in the nineteenth century. The arrival of Jewish immigrants in the 1880s and 1890s revolutionised citrus growing in the region. These new arrivals introduced mechanised, ‘scientific’ forms of agriculture, dramatically increasing yields.
By 1939, Jewish and Palestinian farmers – occasionally working collaboratively – employed some 100,000 people between them, and exported vast numbers of oranges abroad. Britain was a major importer of Jaffa oranges, particularly after Palestine became a Mandated territory under British control in 1923. The Empire Marketing Board – which promoted the sale of imperial produce – urged Britons to buy Jaffa oranges, something picked up by McVitie’s in 1927 with the invention of the Jaffa Cake.
Jaffa oranges were – and, to some extent, are – held up as an example of successful Palestinian and Israeli co-operation during the interwar period. But after 1948, the same oranges became a symbol of Israel itself. Similar to the boycott of Outspan oranges during apartheid, organisations like BDS have urged customers not to buy Jaffa oranges as a way of weakening Israel’s economy and demonstrating their commitment to a free Palestine. (Jaffa oranges are no longer, though, a major Israeli export, and are grown in Spain, South Africa, and elsewhere.)
The changing meanings of Jaffa Cakes – cake, biscuit – and their constituent ingredients – symbol of collaboration, symbol of oppression – show how the categories into which we slot food are themselves constructs. (We could, really, compare apples and oranges.) But also, the Jaffa Cake helps to draw our attention to how taxes, trade agreements, and the politics and practicalities of shipping shape the ways in which we eat, buy, and think about food. Last year, the supremely British McVitie’s – producer of the Jaffa Cake, the most widely recognised biscuit (I mean, cake) in Britain – was sold to Yildiz, a food group based in … Turkey.
Tangerine and Cinnamon by Sarah Duff is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.
Last week some friends and I had supper at the Cube Tasting Kitchen. I should emphasise at the outset that, although I write a blog about food, I’m not a huge fan of the mad flights of fancy which characterise fine dining at the moment. I’m not into molecular gastronomy. I think it’s really interesting—and for a number of reasons, not only culinary—but given the choice between that and the sublime comfort food served at The Leopard and Woodlands Eatery, pizza at Stella e Luna, or dim sum at the South China Dim Sum Bar, I’d probably choose one of the latter.
But Cube was, really, entirely wonderful. And fun. It’s a small, box-shaped, white-walled restaurant in Joburg’s Parktown North, in a row of good and unpretentious middle-range restaurants, including Mantra which is one of my favourite places at which to eat saag paneer. It was an evening of delights over fifteen courses. We began with six starters, each themed around a vegetable—tomato, cucumber, cabbage, potato—or a deconstructed—pissaladière—or reconstructed—Parmesan ice cream with balsamic vinegar made to look like vanilla ice cream and chocolate sauce—version of a familiar dish. The cucumber came with a gin cocktail, the cabbage soup was blue and then turned purple, and the Parmesan ice cream didn’t really work.
That was okay, though. The fact that not every course was an absolute success was part of the fun. The infectious enthusiasm of the young chefs—who cook almost in the middle of the restaurant—and of the serving staff turned this into a game and an adventure. I had vegetarian main courses. The oddest, but most successful, was a combination of asparagus, hummus, and shards of meringue with black pepper. The most delicious was a mushroom soufflé and a curry reduced to its most basic elements. The most beautiful was a Jackson Pollocked plate of beetroot and leek, which was also, paradoxically, the least flavourful.
And pudding—after baklava and cheese, and a palate cleanser of sherbet, pomegranate jelly, and orange sponge consumed as you would tequila with salt and lime—was a forest floor of pistachio marshmallow, rice crispy and cranberry cookies, chilled chocolate mousse, dried flower and chocolate soil, coffee biscuits, lemon gel, and wheat grass. Then there were chocolate brownies and coconut ice.
The size of the portions and the length of time it took to eat all of this—we were there for more than three hours—meant that we could digest at leisure. Because this was as much an intellectual and sensory exercise as it was supper. It would be easy to criticise this kind of dining on the grounds that its purpose is not really to feed people: it uses good, expensive food to allow fairly wealthy paying customers to have fun. But it is equally true that food has always been about more than nutrition. Human beings have long consumed—sacrificed—food in the name of status and power, in performing rituals, and marking celebrations.
It is, though, interesting that molecular gastronomy—which has its roots in the nouvelle cuisine of the 1980s—came to prominence before and during the 2008 crash, in a period marked by ever widening social and economic inequality. (On a side note, it’s worth thinking about relative definitions of wealth: our meal at Cube was expensive, but within the realms of financial possibility even for someone on a fairly modest researcher’s salary. I would never be able to afford the same menu at a similar restaurant in London, for instance.) Molecular gastronomy does not—despite the grandiose claims of some of its practitioners—represent the future of food.
It does, though, represent the past. What sets the foams, pearls, and flavoured air of molecular gastronomy apart from other iterations of fine dining is its reliance on technology. Indeed, the twin gurus of this kind of cuisine—academics Nicholas Kurti and Hervé This—were interested in researching the chemical processes which occurred during cooking. Their acolytes—from Heston Blumenthal to Ferran Adrià and René Redzepi—have used this knowledge to disrupt, deconstruct, reconstruct, and undermine what we think of as ‘food.’
This work, though, does not fundamentally challenge our eating habits and choice of things to eat. Noma might serve insects and Blumenthal may have invented snail porridge, but molluscs and insects have been part of human diets for a very long time. I think that a more accurate name for molecular gastronomy is, really, modernist cuisine—the title of Nathan Myhrvold’s 2011 encyclopaedic guide to contemporary cooking. In all its reliance on, and enthusiasm for, technology, molecular gastronomy is supremely modern: this is the food of industrialisation. It is as heavily processed as cheese strings. Modernist cuisine is the logical extreme of an industrialised food system.
On my fridge, I have a collection of business cards from cafes and shops visited on trips abroad. This afternoon—months late—I added another few from a recent month-long stay in Canada and the US, and I was reminded of a fantastic breakfast at the August First bakery in Burlington, Vermont.

I was in Burlington for a conference and spent a couple of days beforehand working and wandering around a small university town – I grew up in a small university town so I have a professional interest in them – which has a reputation for extraordinarily progressive and inclusive politics. There were posters advertising make-your-own banjo classes (out of gourds, apparently), vegan Thanksgiving, and homebrew nights; the local Democratic party was next door to a Tibetan dumpling shop; and I have never been so aware of the plight of the Dalai Lama as I was in the week I spent in Vermont. And there was the most amazing co-operative, which had a wall – a wall! – of granola. Progressive America is, truly, the most amazing place. (In a similar vein, Ann Arbor’s community co-op is opposite a Birkenstock shop.)

I had, then, granola at August First. And it was wonderful granola, with whole walnuts and fat raisins, and with plenty of really good plain yoghurt. Burlington has embraced its granola.

But – and I write this as one who makes her own granola – there is a contradiction at the heart of the association of granola with progressive living: a lot of the time, it’s full of sugar. Unlike muesli, which is left raw, granola is usually baked with honey, maple syrup, or (sometimes and) sugar, as well as oil and, occasionally, egg white. This is not necessarily the healthiest breakfast. So why does granola signify healthy eating? This isn’t the only food to be linked to left-wing politics. Paul Laity notes:
‘Socialism,’ George Orwell famously wrote in The Road to Wigan Pier (1936), draws towards it ‘with magnetic force every fruit-juice drinker, nudist, sandal-wearer, sex-maniac, Quaker, “Nature Cure” quack, pacifist and feminist in England.’ His tirade against such ‘cranks’ is memorably extended in other passages of the book to include ‘vegetarians with wilting beards,’ the ‘outer-suburban creeping Jesus’ eager to begin his yoga exercises, and ‘that dreary tribe of high-minded women and sandal-wearers and bearded fruit-juice drinkers…’
Orwell’s ‘cranks’—a term reclaimed by the London vegetarian restaurant in 1961—were the free-thinking and -living British Bohemians of the early twentieth century, who experimented with new forms of comfortable dress, sustainable eating, eastern religions, egalitarian social arrangements, and alternative sexual identities. This early counter culture was strongly influenced by late nineteenth-century dieticians and naturopaths—many of them based in Germany—who advocated raw, simple eating in contrast to the meat- and starch-heavy meals which characterised most middle-class diets. As Catherine Carstairs remarks in her essay ‘The Granola High: Eating Differently in the Late 1960s and 1970s,’ it was immigrants from central Europe who brought health food shops to North America, stocking vitamin supplements, wholewheat bread, and, inevitably, fruit juice. It was these shops that made widely available the foods eaten at more exclusive sanatoriums in Europe and the United States.
Like muesli and bircher muesli, granola was invented in a health spa. In her excellent and exhaustively detailed history of granola, Karen Hochman argues that Dr James Caleb Jackson—a farmer, journalist, and doctor—invented granula in 1863 for the patients at his spa, Our Home on the Hillside, in upstate New York. Relying heavily on Graham flour—invented by the dour evangelical preacher Sylvester Graham—he baked sheets of biscuits and crumbled them into granules to be soaked in milk and then eaten for breakfast. It’s likely that granula—the predecessor of Grape Nuts—would never have moved beyond the confines of Our Home on the Hillside had it not come to the attention of a rival sanatorium doctor and Seventh Day Adventist, John Harvey Kellogg, who used rolled, toasted oats instead of Graham flour biscuits. He renamed his product granola, and it became for a while a significant money earner for his Sanitarium Food Company (renamed Kellogg’s Food Company in 1908).
But enthusiasm for granola remained—largely—limited to the relatively small numbers of people who shopped in health food stores until the 1960s and 1970s. Then, concern about the effects of pesticides and additives on human, plant, and animal health; suspicion of the food industry; a desire to experiment with diets from elsewhere; and a back to the land movement all coincided to produce an interest in purer, healthier, more ‘natural’ foods. Hippies—another food counter culture—looked back and found granola. So did big food companies, as Hochman writes about the US:
Granola went mainstream in 1972, when the first major commercial granola, Heartland Natural Cereal, was introduced by Pet Incorporated. In rapid succession, Quaker introduced Quaker 100% Natural Granola; Kellogg’s introduced Country Morning granola cereal and General Mills introduced Nature Valley granola.
The sweet, nut- and dried fruit-filled granola we eat today is derived from the granola reinvented in the 1960s and 1970s. Despite having been popularised by Quaker and General Mills—the enemies of the second food counter culture—granola retained its association with progressive, healthy living.
This cultural history of granola tells us three things, I think. Firstly, that the food counter culture has roots in alternative experiments in living stretching as far back as the late eighteenth century, when vegetarianism and lighter diets were picked up as markers of enlightened, rational eating. Secondly, that business has long taken advantage of the experiments done by people working and living on the fringes of respectability.
Finally, it also traces the shifting meanings of what we define as ‘healthy.’ Despite evidence presented to us by nutritionists, what we think of as being healthy food depends on a range of factors, including whether, historically, a product has been associated with health-conscious living.
A few months ago, I was interviewed on a radio station about changing attitudes towards food and eating. After a caller commented that when he’d lived in rural Limpopo, he’d happily eaten frogs, but preferred McDonald’s having moved to Johannesburg, I managed—somehow—to talk myself into an urgent appeal to the nation to eat insects. I’m still not entirely sure how this happened, but I think it was partly connected to the recent slew of articles on why we need to eat insects to save the planet.
This insect turn in culinary fashion is, of course, nothing new. In 1885, the entomologist Vincent M. Holt published Why not eat insects? To some extent, current arguments for eating insects deviate little from this slim manifesto. Holt remarks, rightly, that there is nothing inherently dirty about insects—in fact, crustaceans, being bottom feeders, are potentially more dangerous to eat—and that they can form part of a balanced diet. He suggests that Western aversion to eating them is linked strongly to culturally specific ideas about what is fine and not fine to eat. He cites the example of a Chinese banquet at an exhibition in London, pointing out that Britons happily sampled a menu which included cuttlefish, sea slugs, and birds’ nests because it was both exotic and, apparently, healthy. Past Europeans ate insects, and societies in Africa, Asia, and elsewhere happily, according to Holt, eat insects:
Beginning with the earliest times, one can produce examples of insect-eating at every period down to our own age. Speaking to the people of Israel, at Lev. xi. 22, Moses directly encourages them to eat clean-feeding insects: ‘These ye may eat, the locust after his kind, and the bald locust after his kind, and the beetle after his kind, and the grasshopper after his kind.’ …
Cooked in many and various ways, locusts are eaten in the Crimea, Arabia, Persia, Madagascar, Africa, and India. … From the time of Homer, the Cicadae formed the theme of every Greek poet, in regard to both tunefulness and delicate flavour. Aristotle tells us that the most polished of the Greeks enjoyed them… Cicadae are eaten at the present day by the American Indians and by the natives of Australia.
He appeals to his readers:
We pride ourselves upon our imitation of the Greeks and Romans in their arts; we treasure their dead languages: why not, then, take a useful hint from their tables? We imitate the savage nations in their use of numberless drugs, spices, and condiments: why not go a step further?
Contemporary interest in eating insects is, though, strongly connected to anxieties about a food chain which seems to be increasingly ecologically unsustainable. Current methods of producing enough protein for the world’s population come at the cost of animal welfare and good labour practice, consume vast quantities of water, and produce methane and other greenhouse gases. Something needs to change, and insect enthusiasts argue that crickets, grasshoppers, and caterpillars are a viable alternative to beef, chicken, and pork. In a 2013 report for the Food and Agriculture Organisation, Dutch entomologist Arnold van Huis—academic and author of The Insect Cookbook: Food for a Sustainable Planet (Arts and Traditions of the Table: Perspectives on Culinary History)—notes that more than 1,900 species of insect already form part of the diets of ‘at least two billion people.’ A lot of these insects are high in protein—higher, in some cases, than beef—and other nutrients. Many of them consume waste, and farming them is comparatively cheap and requires little labour.
This promotion of what Dana Goodyear calls ‘ethical entomophagy’ in Anything that Moves: Renegade Chefs, Fearless Eaters and the Making of a New American Food Culture, has met with some commercial success. There are now—outside of regions where insects are normally part of diets—businesses dedicated to farming insects for human consumption. It’s possible to buy cricket flour; Selfridges sells chocolate covered giant ants; and pop up restaurants and Noma have featured insects on their menus. The logic is that these high-end sales of edible insects will gradually influence the middle and bottom of the market. A kind of ‘trickle down’ revolution in diet.
While it is certainly true that we can and have chosen to eat foodstuffs once deemed to be dangerous or socially taboo—potatoes in eighteenth-century France, beef in Japan during the Meiji Restoration—these shifts in attitude take time to achieve. Also, in the case of potatoes and beef, these societies were strongly hierarchical with powerful aristocracies. Thankfully, most of us no longer live in a world where the king’s decision to consume a formerly shunned ingredient changes the way that all of us eat.
As every recent article on entomophagy notes, the main obstacle to the widespread incorporation of insects into, particularly but not exclusively, Western diets is a strong aversion to eating them. If only, the argument goes, picky Westerners would give up their hypocritical dislike of insects—they eat shrimp and prawns, after all—and then we’ll all be fine. But I think it’s worth taking this dislike seriously. As Goodyear points out, a lot of these insects aren’t particularly delicious. She tries embryonic bee drones picked from honeycomb:
the drones, dripping in butter and lightly coated with honey from their cells, were fatty and a little bit sweet, and, like everything chitinous, left me with a disturbing aftertaste of dried shrimp.
I’ve eaten fried, salted grasshoppers at a food festival on London’s south bank, and they were crunchy and salty—improved, like most things, by deep frying—but otherwise memorable only for having been grasshoppers.
Making insects palatable involves processing, something which almost inevitably increases the ecological footprint of the product. Perhaps even more importantly, as the caller I referred to at the beginning of this post said, insects are widely associated with poverty and deprivation. Modernity—life in the city—requires a new diet. While it is true that in many societies, people do eat insects out of choice, it is equally significant that when they can, people stop eating insects as soon as possible.
Our current anxiety about sustainable sources of protein is driven partly by concern that the new middle classes in China and India will demand to eat as much beef, in particular, as their Western counterparts. I wonder to what extent this concern is part of a long tradition of Malthusian yellow peril: that China, in particular, will somehow eat up all the world’s resources. I don’t have any objection to promoting entomophagy—although trickle down strategies have a fairly low level of success—but I think we should look more carefully at the reasons underpinning our interest in investing in alternative forms of protein, and also be careful to take seriously the interests and tastes of people clawing their way out of poverty.
When I was finishing my PhD, my friend Jane gave me a t-shirt emblazoned with the slogan ‘tea is not a food group.’ She used to shout that into my room—we lived a few doors down from each other in the same student residence—as she passed me on the way to the lift. She had good reason for doing so. When I’m absorbed in writing, I can forget that the world exists: that it’s necessary to brush your hair, dress properly, cook, and not have conversations with yourself out loud. And that it’s unwise to subsist on tea.
Over the past couple of months, I’ve been in the final throes of completing a book manuscript and I’ve tried—probably not always successfully—to maintain at least a semblance of normal, civilised behaviour, but tea has remained a constant. It’s a kind of writing comfort blanket; a small routine in the middle of anxious typing. In some ways, then, it was a misfortune to be in the United States for much of this period. I could drink as much excellent coffee as I could cope with, but tea? Good strong, hot black tea? Until I discovered a branch of TeaHaus in Ann Arbor, not so much.
I know that I’m not the first to complain about the difficulty of finding a decent cup of black tea in the US, and, to some extent, this belief that Americans don’t understand hot tea is something of a misconception. Teavana, Argo, and TeaHaus all attest to an enthusiasm—an apparently growing enthusiasm—for well-made tea. I’ve never encountered so many different kinds of tea in supermarkets. (And, truly, Celestial Seasonings is the best name for a brand of tea.) But it is true, I think, that it’s hard to find really good black tea in the average café. While this is probably linked to the fact that most tea drunk in the US is iced tea, it’s also because tea in these establishments is made with hot—not boiling—water. This is crucial. Tea leaves need to steep in freshly boiled water.
This aversion to boiling water can be traced back to a 1994 civil case: Liebeck vs McDonald’s Restaurants. Two years previously, Stella Liebeck, an elderly Albuquerque resident, had spilled a cup of boiling hot coffee over her lap. She sued McDonald’s, and was awarded initially $2.7 million in punitive damages. While for some, the case has become emblematic of the madness of a litigious legal system, the truth is considerably more complex. Not only had Liebeck suffered third degree burns—resulting in extensive reconstructive work and long stays in hospital—but she and her family only sued McDonald’s as a last resort. When their reasonable request that McDonald’s cover her medical bills was turned down, they decided to go to court. Moreover, in the end, Liebeck received considerably less than $2.7 million: the judge reduced that sum to $480,000, and she was awarded, eventually, between $400,000 and $600,000 in an out of court settlement with McDonald’s.
This was not, then, a frivolous lawsuit. But it was interpreted as such, and became one of the examples cited in efforts to reform tort law—the legislation which allows people to sue others in the case of injury to themselves or their property—in the US. As some lawyers argue, the tort reform lobby led by Republicans isn’t really an effort to reduce the number of lawsuits submitted by greedy people but, rather, an attempt to protect business from having to pay for its mistakes.
For tea drinkers, though, this misperception (fanned by tort reform campaigners) has resulted in tepid, unpleasant cups of tea. Concerned about similar lawsuits, restaurants now serve hot—rather than boiling—water. But perhaps there is a kind of poetic—or historical—logic to having to search high and low for decent tea in the US. The tipping of chests of tea into Boston’s harbour in 1773 was both an act of defiance of the Tea Act and a rejection of Britain’s right to tax the thirteen colonies. When patriots switched to coffee—indeed some refused even to eat the fish caught in or near the harbour on the grounds that they could have consumed some of the tea—it was in defiance of British rule. In the land of the free, shouldn’t tea be hard to come by? This association of coffee and freedom wasn’t new, even then. Coffee houses in eighteenth-century Britain and Europe were places where middle-class men could gather to talk and think. The work of the Enlightenment was done, to some extent, over cups of coffee. But coffee was produced on slave plantations, and coffee houses—and the freedoms discussed in them—were largely for white men. Coffee represented, then, freedom for the few.
Like so many people recently, I’ve been thinking about the historical contexts which produced the principles on which liberal democracies are founded. Freedom of expression and of thought, freedom to gather, freedom of religious belief are fundamental to the functioning of liberal democracies. Regardless of the fact that these principles originated during a period in which they applied mainly to white men—and regardless of the fact that they have not prevented injustices from being committed (sometimes in their name) in liberal democracies—these remain the best, albeit imperfect, protection of the greatest number of freedoms for the greatest number of people.
To suggest that they are somehow a western invention inapplicable to other parts of the world would be an enormous insult to Egypt’s cartoonists who continue to criticise successive oppressive governments despite risking imprisonment or worse; to Saudi Arabian blogger Raif Badawi, who received the first fifty of a thousand lashes last Friday, for writing in support of free expression; to the Kenyan MPs who last year so strongly opposed a new security bill which will dramatically curb journalists’ ability to report freely. Also, it would be a profound insult to the vast majority of Muslim people in France and elsewhere—members of a diverse and varied faith—who managed to cope with the fact that Charlie Hebdo and other publications ran cartoons which insulted or poked fun at Islam.
Whether you think that the cartoons in Charlie Hebdo were amusing or clever or blasphemous or racist is beside the point. Free speech and free expression were no more responsible for the killings in France last week than they were for the murder of more than two thousand people in Nigeria by Boko Haram. This isn’t to argue that we shouldn’t discuss—loudly, freely, rudely—how right or wrong it was to publish these cartoons in a society which many feel has strongly Islamophobic and racist elements—in the same way we should debate potentially misogynistic, anti-Semitic, racist, homophobic, or transphobic writing, art, or speech too. But to begin to suggest that there are times when we shouldn’t criticise and satirise is to suggest that there should be limits to what we may think and imagine.