

Bread Lines

Most of my friends went slightly mad as they finished their PhD dissertations: some cried compulsively, others forgot to eat, and I knew a couple who never wore anything other than pyjamas for months on end. My lowest ebb came when I developed a mild addiction to The Archers, a daily, fifteen-minute soap on Radio 4, featuring the activities of a large, extended family in the fictional village of Ambridge.

Described by Sandi Toksvig as ‘a memorable theme tune, followed by fifteen minutes of ambient farm noise and sighing,’ The Archers was created in 1950 as a kind of public information service: the BBC collaborated with the Ministry of Agriculture, Fisheries, and Food to broadcast information about new technologies and methods to farmers during a period when Britain was trying to increase agricultural productivity.

The series still has an agricultural story editor, and there’s at least one fairly awkward moment in each episode when Ruth Archer discusses milking machines, or Adam Macy mulls over the relative benefits of crop rotation. But its appeal now lies in its human drama. It’s been criticised – rightly – for avoiding complex or uncomfortable social issues, but it has recently featured an excellent storyline involving the series’ poorest family, the Grundys.

Struggling with cuts in benefits and reduced wages, Emma Grundy runs out of money and takes refuge in a food bank, where she and her daughter are given a free lunch. In a sense, this thread dramatises the Guardian’s excellent Breadline Britain Project, which tracks the ‘impact and consequences of recession on families and individuals across the UK.’ The project has demonstrated convincingly that British people are eating worse as they become less financially secure.

One of its most arresting reports argues that Britain is in a ‘nutrition recession’:

Detailed data compiled for the Guardian, which analysed the grocery buying habits of thousands of UK citizens, shows that consumption of fat, sugar and saturates has soared since 2010, particularly among the poorest households, despite the overall volume of food bought remaining almost static. Food experts and campaigners called for government action to address concerns the UK faces a sustained nutritional crisis triggered by food poverty, which is in turn storing up public health problems that threaten to widen inequalities between rich and poor households.

The data show consumption of high-fat and processed foods such as instant noodles, coated chicken, meat balls, tinned pies, baked beans, pizza and fried food has grown among households with an income of less than £25,000 a year as hard-pressed consumers increasingly choose products perceived to be cheaper and more ‘filling’.

Over the same period, fruit and vegetable consumption has dropped in all but the most well-off UK households, and most starkly among the poorest consumers, according to the data.

It’s no wonder that so many columnists have evoked George Orwell’s description of the very poor eating habits of Wigan’s most impoverished residents during the Great Depression in The Road to Wigan Pier (1937). But the use of the term ‘breadline’ harks back to an earlier, and arguably more influential, study: Seebohm Rowntree’s Poverty: A Study of Town Life (1901). Rowntree (1871-1954), the son of the philanthropist and chocolate tycoon Joseph (1836-1925), had studied chemistry in Manchester before beginning work as a scientist in the family business in York.

Benjamin Seebohm Rowntree*

But, like his father – whose awareness of poverty had apparently been awakened by a trip to Ireland during the potato famine – Rowntree was struck by his encounters with York’s poor, and these led to the first of the three studies he undertook into poverty in the city. Inspired partly by Charles Booth’s The Life and Labour of the People (1886), which analysed the lives of London’s poor, Rowntree conducted a survey of the working-class population of York in 1899. His findings caused a national outcry, as Ian Packer explains:

Poverty: A Study of Town Life (1901)…became an important subject of debate because of its assertion that not only were 28 percent of the total households in York in poverty but nearly 10 percent had incomes so low that they could not keep the members of the family in what Seebohm termed ‘physical efficiency,’ that is, provided with sufficient nutritional food to maintain health.

Rowntree used access to food as a means of gauging poverty, and it is here that he originated the idea of the ‘breadline’. Diana Wylie writes:

Rowntree latched on to food, or, more precisely, its lack, as a convenient and revealing means of measuring socially unacceptable levels of deprivation. He drew an absolute poverty line; below it, people did not earn enough to buy the ‘minimum necessities for the maintenance of merely physical efficiency.’ If working men did not consume 3,500 calories of food energy daily, and women four-fifths that amount, their intelligence became dulled and their stature stunted. This quite pragmatic definition of hunger, the ‘underfeeding’ that would destroy a person’s stamina, served for Rowntree as the index for judging Britain’s social progress.
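To make the arithmetic explicit – this gloss is mine, not Wylie’s – the corresponding threshold for women works out as

\[
\tfrac{4}{5} \times 3500 = 2800 \text{ kcal per day.}
\]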

This and Rowntree’s two subsequent studies of poverty in York, published in 1936 and 1951, became some of the most significant evidence on which arguments for the creation of a British welfare state were based. Rowntree’s point was that unemployment and low wages – and not bad eating or spending habits – were at the root of working-class poverty. It became, then, the ethical duty of the state to provide the means of freeing the population from the threat of hunger.

There is a direct line between Poverty: A Study of Town Life and the 1942 Beveridge Report, one of the most important documents of the twentieth century, which provided the foundation for Britain’s welfare state. But the influence of Rowntree’s work was felt beyond Yorkshire and the UK. In Starving on a Full Stomach (2001), Diana Wylie demonstrates the impact of the idea of the breadline on social scientists in South Africa during the early twentieth century.

In 1935, Edward Batson, a graduate of the London School of Economics, Beveridge enthusiast, and professor of social science at the University of Cape Town, arrived in South Africa and began work on ‘the first systematic survey of black urban poverty in sub-Saharan Africa.’

By 1938, Batson had surveyed 808 Cape Town households to discover how much they spent on six essential food groups, and compared their diet with the…minimum daily standard recommended in 1933 by the British Medical Association. His figures revealed that half of Cape Town’s Coloured people lived below the poverty datum line.

And, like Rowntree, Batson blamed low incomes, not ignorance, for poor diets:

Batson refuted some common social scientific assumptions such as that ignorance determined the poor diets of poor Capetonians, a perspective that, he said, had recently become ‘fashionable.’ … On the contrary, Batson wrote, most people simply could not afford to eat better.

Batson’s research was undertaken in the midst of widespread debates around the founding of a South African welfare state, the underpinnings of which were put in place during the 1920s and 1930s with legislation such as the 1928 Old Age Pensions Act, and the 1937 Children’s Act. But although his work concentrated on black people, the South African welfare state was established largely to benefit whites. Indeed, Jeremy Seekings makes the point that pensions legislation in the 1920s emerged out of concerns about protecting the white (and, to some extent, coloured) ‘deserving’ poor from a perceived black ‘threat.’ This meant that evidence of significant hunger among black people was not a force in the formulation of South African welfare policy, at least before the Second World War.

So whereas Rowntree’s research contributed to the creation of a universal welfare state in Britain – where all people qualified for assistance from the state through social security payments, free healthcare, and education – in South Africa, welfare was raced: the welfare state was created to protect and maintain white power, and to entrench racial segregation.

Understanding the origins of the term ‘breadline’ helps us to see the extent to which attitudes towards, and efforts to eradicate, hunger have changed over time, and the ways in which they’re influenced by thinking about race as well as class: being hungry and white meant – and means – something different to being hungry and black.

* This photograph is from the National Portrait Gallery’s collection.

Sources

William Beinart, Twentieth-Century South Africa, new ed. (Oxford: Oxford University Press, 2001).

Timothy J. Hatton and Roy E. Bailey, ‘Seebohm Rowntree and the Postwar Poverty Puzzle,’ The Economic History Review, vol. 53, no. 2 (Aug. 2000), pp. 517-543.

Ian Packer, ‘Religion and the New Liberalism: The Rowntree Family, Quakerism, and Social Reform,’ Journal of British Studies, vol. 42, no. 2 (April 2003), pp. 236-257.

Jeremy Seekings, ‘“Not a Single White Person Should be Allowed to Go Under”: Swartgevaar and the Origins of South Africa’s Welfare State, 1924-1929,’ Journal of African History, vol. 48, no. 3 (Nov. 2007), pp. 375-394.

Diana Wylie, Starving on a Full Stomach: Hunger and the Triumph of Cultural Racism in Modern South Africa (Charlottesville and London: University Press of Virginia, 2001).


A Sporting Chance

My expectations of the London Olympics’ opening ceremony were so low that, I suppose, I would have been impressed if it had featured Boris as Boudicca, driving a chariot over the prostrate figures of the Locog committee. (Actually, now that I think about it, that would have been fairly entertaining.)

Appalled by the organising committee’s slavishly sycophantic attitude towards its sponsors and their ‘rights’ – which caused them to ban home-knitted cushions from being distributed to the Olympic athletes, and to require shops and restaurants to remove Olympic-themed decorations and products – as well as the rule that online articles and blog posts may not link to the official 2012 site if they’re critical of the games, the decision to make the official entrance of the Olympic site a shopping mall, and the creation of special lanes for VIP traffic, I wasn’t terribly impressed by the London Olympics.

But watching the opening ceremony last night, I was reduced to a pile of NHS-adoring, Tim Berners-Lee-worshipping, British children’s literature-loving goo. Although a reference to the British Empire – other than the arrival of the Windrush – would have been nice, I think that Danny Boyle’s narrative of British history – which emphasised the nation’s industrial heritage, its protest and trade union movements, and its pop culture – was fantastic.

As some commentators have noted, this was the opposite of the kind of kings-and-queens-and-great-men history curriculum which Michael Gove wishes schools would teach. Oh and the parachuting Queen and Daniel Craig were pretty damn amazing too.

There was even a fleeting, joking reference to the dire quality of British food during the third part of the ceremony. There was something both apt and deeply ironic about this. On the one hand, there has been extensive coverage of Locog’s ludicrous decision to allow manufacturers of junk food – Coke, Cadbury’s, McDonald’s – not only to be official sponsors of a sporting event, but to provide much of the catering. (McDonald’s even tried to ban other suppliers from selling chips on the Olympic site.)

But, on the other, Britain’s food scene has never been in better shape. It has excellent restaurants – and not only at the top end of the scale – and thriving and wonderful farmers’ markets and street food.

It’s this which makes the decision not to open up the catering of the event to London’s food trucks, restaurants, and caterers so tragic. It is true that meals for the athletes and officials staying in the Village have been locally sourced and made from ethically-produced ingredients, and this is really great. But why the rules and regulations which actually make it more difficult for fans and spectators to buy – or bring their own – healthy food?

Of course, the athletes themselves will all be eating carefully calibrated, optimally nutritious food. There’s been a lot of coverage of the difficulties of catering for so many people who eat such a variety of different things. The idea that athletes’ performance is enhanced by what they consume – supplements, food, and drugs (unfortunately) – has become commonplace.

Even my local gym’s café – an outpost of the Kauai health food chain – serves meals which are, apparently, suited for physically active people. I’ve never tried them, partly because the thought of me as an athlete is so utterly nuts. (I’m an enthusiastic, yet deeply appalling, swimmer.)

The notion that food and performance are linked in some way has a long pedigree. In Ancient Greece, where diets were largely vegetarian but supplemented occasionally with (usually goat) meat, evidence suggests that athletes at the early Olympics consumed more meat than usual to improve their performance. Ann C. Grandjean explains:

Perhaps the best accounts of athletic diet to survive from antiquity, however, relate to Milo of Croton, a wrestler whose feats of strength became legendary. He was an outstanding figure in the history of Greek athletics and won the wrestling event at five successive Olympics from 532 to 516 B.C. According to Athenaeus and Pausanius, his diet was 9 kg (20 pounds) of meat, 9 kg (20 pounds) of bread and 8.5 L (18 pints) of wine a day. The validity of these reports from antiquity, however, must be suspect. Although Milo was clearly a powerful, large man who possessed a prodigious appetite, basic estimations reveal that if he trained on such a volume of food, Milo would have consumed approximately 57,000 kcal (238,500 kJ) per day.

Eating more protein – although perhaps not quite as much as reported by Milo of Croton’s fans – helps to build muscle, and would have given athletes an advantage over other, leaner competitors.
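It’s easy to see why Grandjean treats these reports as suspect. As a rough back-of-envelope check – a sketch using energy densities I’ve assumed myself (about 2,500 kcal per kg of meat, 2,600 kcal per kg of bread, and 850 kcal per litre of wine), not figures from her paper – Milo’s reported daily intake comes to

\[
9 \times 2500 + 9 \times 2600 + 8.5 \times 850 \approx 53{,}000 \text{ kcal},
\]

which lands in the same wildly implausible region as her estimate of 57,000 kcal – several times what even the largest athlete in the heaviest training could plausibly burn in a day.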

Another ancient dietary supplement seems to have been alcohol. Trainers provided their athletes with alcoholic drinks before and after training – in much the same way that contemporary athletes may consume sports drinks. But some more recent sportsmen seem to have gone a little overboard, as Grandjean notes:

as recently as the 1908 Olympics, marathon runners drank cognac to enhance performance, and at least one German 100-km walker reportedly consumed 22 glasses of beer and half a bottle of wine during competition.

Drunken German walker: I salute you and your ability to walk in a straight line after that much beer.

The London Olympic Village is, though, dry. Even its pub only serves soft drinks. With the coming of the modern games – which coincided with the development of sport and exercise science in the early twentieth century – diets became the subject of scientific enquiry. The professionalization of sport – with athletes more reliant on doing well in order to make a living – only served to increase the significance of this research.

One of the first studies on the link between nutrition and the performance of Olympic athletes was conducted at the 1952 games in Helsinki. The scientist E. Jokl (about whom I know nothing – any help gratefully received) demonstrated that those athletes who consumed fewer carbohydrates tended to do worse than those who ate more. Grandjean comments:

His findings may have been the genesis of the oft-repeated statement that the only nutritional difference between athletes and nonathletes is the need for increased energy intake. Current knowledge of sports nutrition, however, would indicate a more complex relationship.

As research into athletes’ diets has progressed, so fashions for particular supplements and foods have emerged over the course of the twentieth century. Increasing consumption of protein and carbohydrates has become a common way of improving performance. Whereas during the 1950s and 1960s, athletes simply ate more meat, milk, bread, and pasta, since the 1970s, a growing selection of supplements has allowed sportsmen and -women to add more carefully calibrated and targeted forms of protein and carbohydrates to their diets.

Similarly, vitamin supplements have been part of athletes’ diets since the 1930s. Evidence from athletes competing at the 1972 games in Munich demonstrated widespread use of multivitamins, although now, participants tend to choose more carefully those vitamins which produce specific outcomes.

But this history of shifting ideas around athletes’ diets cannot be understood separately from the altogether more shadowy history of doping – of using illicit means of improving one’s performance. Even the ancient Greeks and Romans used stimulants – ranging from dried figs to animal testes – to suppress fatigue and boost performance.

More recently, some of the first examples of doping during the nineteenth century come from cycling (nice to see that some things don’t change), and, more specifically, from long-distance, week-long bicycle races which depended on cyclists’ reserves of strength and stamina. Richard IG Holt, Ioulietta Erotokritou-Mulligan, and Peter H. Sönksen explain:

A variety of performance enhancing mixtures were tried; there are reports of the French using mixtures with caffeine bases, the Belgians using sugar cubes dripped in ether, and others using alcohol-containing cordials, while the sprinters specialised in the use of nitroglycerine. As the race progressed, the athletes increased the amounts of strychnine and cocaine added to their caffeine mixtures. It is perhaps unsurprising that the first doping fatality occurred during such an event, when Arthur Linton, an English cyclist who is alleged to have overdosed on ‘tri-methyl’ (thought to be a compound containing either caffeine or ether), died in 1886 during a 600 km race between Bordeaux and Paris.

Before the introduction of doping regulations, the use of performance enhancing drugs was rife at the modern Olympics:

In 1904, Thomas Hicks, winner of the marathon, took strychnine and brandy several times during the race. At the Los Angeles Olympic Games in 1932, Japanese swimmers were said to be ‘pumped full of oxygen’. Anabolic steroids were referred to by the then editor of Track and Field News in 1969 as the ‘breakfast of champions’.

But regulation – the first anti-drugs tests were undertaken at the 1968 Mexico games – didn’t stop athletes from doping: the practice simply went underground. The USSR and East Germany allowed their representatives to take performance enhancing drugs, and an investigation undertaken after Ben Johnson was disqualified for doping at the Seoul games revealed that at least half of the athletes who competed at the 1988 Olympics had taken anabolic steroids. In 1996, some athletes called the summer Olympics in Atlanta the ‘Growth Hormone Games’, and the 2000 Olympics were dubbed the ‘Dirty Games’ after the disqualification of Marion Jones for doping.

At the heart of the issue of doping and the use of supplements is distinguishing between legitimate and illegitimate means of enhancing performance. The idea that taking drugs to make athletes run, swim, or cycle faster, or jump further and higher, is unfair, is a relatively recent one. It’s worth noting that the World Anti-Doping Agency, which is responsible for establishing and maintaining standards for anti-doping work, was formed only in 1999.

What makes anabolic steroids different from consuming high doses of protein, amino acids, or vitamins? Why, indeed, was Caster Semenya deemed to have an unfair advantage at the 2009 IAAF World Championships, while the blade-running Oscar Pistorius was not?

I’m really pleased that both Semenya and Pistorius are participating in the 2012 games – I’m immensely proud that Semenya carried South Africa’s flag into the Olympic stadium – but their experiences, as well as the closely intertwined histories of food supplements and doping in sport, demonstrate that the idea of an ‘unfair advantage’ is a fairly nebulous one.

Further Reading

Elizabeth A. Applegate and Louis E. Grivetti, ‘Search for the Competitive Edge: A History of Dietary Fads and Supplements,’ The Journal of Nutrition, vol. 127, no. 5 (1997), pp. 869S-873S.

Ann C. Grandjean, ‘Diets of Elite Athletes: Has the Discipline of Sports Nutrition Made an Impact?’ The Journal of Nutrition, vol. 127, no. 5 (1997), pp. 874S-877S.

Richard IG Holt, Ioulietta Erotokritou-Mulligan, and Peter H. Sönksen, ‘The History of Doping and Growth Hormone Abuse in Sport,’ Growth Hormone & IGF Research, vol. 19 (2009), pp. 320-326.


Milking It

This week the committee organising the 2012 Olympics in London caused widespread anger when it announced that breastfeeding mothers would have to buy an extra ticket to bring their babies into sports venues. Some venues have a few discounted tickets for children, but others don’t. One commentator posted on Mumsnet

that while she and her husband were lucky enough to get tickets to an equestrian event in August, organisers had told her there are no children’s tickets so she will have to pay £95 for a three-month old in a sling.

Those who can’t afford an extra ticket, or who lose out in the next round of ticket allocation, are advised to stay away. Unsurprisingly, Britain’s Equality and Human Rights Commission has suggested that this is potentially a case of ‘indirect sex discrimination’ because it will affect considerably more women than men.

This situation is ridiculous in so many ways. What angers me the most is that the Olympic committee took this decision in a country where the National Health Service advises that babies be breastfed exclusively for the first six months of life. The members of the committee seem either to think that women shouldn’t breastfeed in public – an irritating view about which I am going to be extraordinarily rude at some stage – or that mothers with babies have no desire to attend public events.

In the midst of the uproar, The Ecologist tweeted an article which it had published six years ago about the debate over whether women should breast- or bottle-feed their babies. It’s an argument that parents, doctors, and policy makers have been holding since at least the beginning of the twentieth century, and it’s to the credit of Pat Thomas that her piece provides a good overview of shifting attitudes towards infant feeding over the course of the past hundred years or so.

But it’s also a problematic piece of writing, and one which demonstrates particularly well why so many mothers feel bullied about how they decide to feed their babies. Thomas makes no attempt to hide her view that all mothers should breastfeed their children. She begins with a terrifying list of statistics:

The health consequences – twice the risk of dying in the first six weeks of life, five times the risk of gastroenteritis, twice the risk of developing eczema and diabetes and up to eight times the risk of developing lymphatic cancer – are staggering. With UK formula manufacturers spending around £20 per baby promoting this ‘baby junk food’, compared to the paltry 14 pence per baby the government spends promoting breastfeeding, can we ever hope to reverse the trend?

I’d love to know where she found these figures – particularly given her opening statement that women have breastfed for ‘nearly half a million years’. (How does she know this? Why the coy, qualifying ‘nearly’?) Thomas is, though, correct to point to the compelling evidence that breastfed babies tend to be healthier than those who are fed on formula, and that breastfed children may do better at school and have stronger immune systems. Also, there is a direct and proven link between the use of baby formula and high child mortality rates in the developing world.

She blames the slow decline of breastfeeding over the course of the twentieth century on the medicalization of childcare, and on the advertising strategies employed by formula companies – most notoriously Nestlé. I have little to add to her second point, other than that, broadly, I agree with her. The International Code of Marketing of Breast-milk Substitutes, a response to the Nestlé Boycott of the late seventies, needs to be properly implemented. But her argument about the medicalization of women’s experiences of childbirth and childrearing is not entirely correct. She quotes Mary Renfrew from the Mother and Infant Research Unit at the University of York:

‘If you look at medical textbooks from the early part of the 20th century, you’ll find many quotes about making breastfeeding scientific and exact, and it’s out of these that you can see things beginning to fall apart.’ This falling apart, says Renfrew, is largely due to the fear and mistrust that science had of the natural process of breastfeeding.

In particular, the fact that a mother can put a baby on the breast and do something else while breastfeeding, and have the baby naturally come off the breast when it’s had enough, was seen as disorderly and inexact. The medical/scientific model replaced this natural situation with precise measurements – for instance, how many millilitres of milk a baby should ideally have at each sitting – which skewed the natural balance between mother and baby, and established bottlefeeding as a biological norm.

During the early years of the twentieth century, global concern about high rates of child mortality animated a child welfare movement which aimed to improve the conditions in which children were raised. In Europe, North America, Australia, New Zealand, and parts of Africa and Latin America, medical professionals held up rational and scientific methods of feeding and caring for babies as the best means of eradicating the ‘ignorant’ practices which, many believed, caused babies to die. This new emphasis on hygiene, speedy medical intervention, and regular monitoring of babies’ development and health at clinics and hospitals did lower rates of mortality – as did declining fertility rates, the control of infectious disease, economic prosperity, and increased school attendance.

Doctors and specialists in the relatively new field of paediatrics were particularly interested in how babies were fed. Contrary to what Thomas suggests, the nineteenth-century orthodoxy that breastfeeding was the healthiest and best option for both mothers and babies lasted well into the 1940s. Innovations in artificial formulas provided mothers who couldn’t breastfeed – for whatever reason – with good alternatives, and doctors did recommend them. There were anxieties that malnourished mothers’ milk would not feed babies sufficiently, and doctors recommended ‘top ups’ with formula or other liquid.

The real difference between nineteenth- and twentieth-century attitudes towards breastfeeding was that it was increasingly controlled and patrolled by trained professionals. As Renfrew notes, mothers were told how much milk their babies needed at each feed, and there was a lot of debate in medical journals and in other professional forums about how and when babies should be fed.

The set of guidelines formulated by the incredibly influential, New Zealand-based Dr Truby King emphasised the importance of routine in feeding. King’s mothercraft movement – which established clinics and training centres around the British Empire during the first half of the twentieth century – taught mothers to feed ‘by the clock’. At five months, a baby was to be fed only five times per day – and at the same time every day – while one-month-old babies had an extra, sixth feed.

Like many childcare professionals of the period, King believed that feeding on demand was not only unhealthy – it placed babies at risk of under- or overfeeding – but it was morally and intellectually damaging too. Babies who understood that crying would cause them to be fed would become spoilt, lazy children and adults. Indeed, this points to the infant welfare movement’s more general preoccupation with mothers and motherhood. As the interests of the state were seen, increasingly, as being linked to the proper rearing and education of children, the role of the mother grew in importance. King called his centres ‘shrines to motherhood’, for instance.

But the naturally fussy, over-cautious, and credulous mother was not to be trusted to follow her own instincts: authorities and professionals, who tended to be male, were to provide her with rational, scientific advice on raising her baby. It’s difficult to gauge mothers’ response to the information aimed at them. In her study of mothers in the United States in the 1920s and 1930s, Julia Grant concludes that mothers did heed childcare professionals, but modified their advice according to the views and experiences of their peers. Similarly, mothers in New Zealand took what they wanted from King’s pamphlets on childrearing.

Equally, mothercraft clinics and breastfeeding advice days were well attended by mothers and babies. Several mothercraft centres all over the world also included a dietetic wing, where nursing mothers could stay for up to a fortnight, learning how to breastfeed their babies. There, they would be taught how to breastfeed by the clock, and how to cope with mastitis and painful breasts and nipples. Wonderfully, hospital fees were means-tested, so poor mothers could attend for free.

Throughout its existence, the Cape Town dietetic hospital never had an empty waiting list, and similar units in Britain, Australia, and New Zealand were as enthusiastically supported by women. Mothercraft seems to have been at its most successful when mothers could choose how and when they wanted to use its advice and services.

While it’s true that the medicalization of breastfeeding transformed this act into a ‘science’ which needed to be re-taught to mothers – that it became possible to inform a mother that she was breastfeeding incorrectly – and that this was underpinned by misogynistic and eugenicist ideas around childhood, motherhood, and the nation, it is equally true that mothers responded positively to the advice provided by mothercraft and other organisations. Clearly, mothers wanted more advice about how to feed their babies – and, just as clearly, they altered it to suit their conditions and needs.

It’s for this reason that I think Thomas is doing mothers a disservice. Efforts to encourage more women to breastfeed need to respect the fact that women’s choices about how to feed their babies are influenced by a variety of factors and considerations. Thomas – like other breastfeeding evangelicals – seems to buy into the same discourse of maternal irresponsibility as childcare professionals did in the early twentieth century: the belief that women somehow don’t really understand what’s best for their babies, and must be properly educated. Even if her – and others’ – motives are progressive and well-meaning, they still fail to take mothers seriously.

Further Reading

Sources cited here:

Rima D. Apple, Mothers and Medicine: A Social History of Infant Feeding, 1890-1950 (Madison: University of Wisconsin Press, 1987).

Linda Bryder, A Voice for Mothers: The Plunket Society and Infant Welfare 1907-2000 (Auckland: Auckland University Press, 2003).

Julia Grant, Raising Baby by the Book: The Education of American Mothers (New Haven and London: Yale University Press, 1998).

Philippa Mein Smith, Mothers and King Baby: Infant Survival and Welfare in an Imperial World: Australia 1880-1950 (Basingstoke: Macmillan, 1997).

Other sources:

Linda M. Blum, At the Breast: Ideologies of Breastfeeding and Motherhood in the Contemporary United States (Boston: Beacon Press, 1999).

Molly Ladd-Taylor, Mother-Work: Women, Child Welfare, and the State, 1890-1930 (Urbana and Chicago: University of Illinois Press, 1994).

Marilyn Yalom, A History of the Breast (New York: Ballantine Books, 1997).

Creative Commons License
Tangerine and Cinnamon by Sarah Duff is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.