Monday 20 June 2011

Chapter 1: Introduction

“Wherefore do ye spend money for that which is not bread?”

(Isaiah 55:2)

Bread has been regarded as a staple food in many societies ever since its ‘invention’ several millennia ago, inspiring numerous metaphors still in regular use today.

Fig. 1.1. Supermarket bread aisle (Domokos 2009)

As of the early 21st century, the inhabitants of the island of Britain consume approximately nine million loaves of bread every day. Baking is a highly industrialised and efficient sector of the British economy, turning out some of Europe’s cheapest bread, yet visitors from other European countries regularly cite the lack of good bread as one of the main disadvantages of living in Britain (Whitley 2006, 3). This dissertation will examine the story of British bread over the last five centuries, to some degree from the perspective of an ‘outsider’ (with a Central European background) frequently mystified by the popularity of the sliced and wrapped article known as ‘bread’ in this part of the world (fig. 1.1.).

1. 1. Aims and objectives

The study of bread is beset with a difficulty inherent in the study of any kind of food: its subject is by definition a continually diminishing resource (Fenton 2007, 3). Archaeology has focussed on the structural remains of grain processing, such as milling, brewing and baking, often in the context of industrial landscapes (Palmer and Neaverson 2001, 55), and various other disciplines, including history, economics and ethnology, similarly tend to concentrate on one or two specific stages in the journey from field to table. This dissertation aims to present a more comprehensive analysis of the ‘life’ of bread, from the cultivation of the soil through the reaping, threshing and milling of bread crops to the baking and eating of various forms of bread. It argues that developments in these processes over the last 500 years have combined to create the picture we see today, and also addresses some more specific points such as regional differences, the rise of factory baking and potential reasons for the dominance of sliced wrapped bread in the Britain of today. It will investigate these issues using a multidisciplinary approach which is both appropriate to the study of the time period in question and required by the topic itself. It will not, however, deal in much detail with the transport and marketing of either bread crops or bread itself, two topics which are too complex and wide-ranging to be included in a paper of this scale without sacrificing too much specific investigation.

1. 2. What is bread? A point of definition

Surprisingly, very few of the written sources consulted for this dissertation deem it necessary to define ‘bread’. The only clear definition encountered refers to prehistoric bread, described as consisting mainly of ground food plants, usually cereals, with the addition of a liquid (and sometimes leaven), made into a dough and baked in an oven or similar structure, or roasted at or on an open fire, on embers or in ashes (Hansson 2002, 186-187). The Collins English Dictionary (UK edition, 2007) defines ‘bread’ as follows: “a food made from a dough of flour or meal mixed with water or milk, usually raised with yeast or baking powder and then baked” (Collins English Dictionary 2007). This dissertation uses a definition which includes unleavened breads as well as those utilising legumes such as peas or beans, and focuses on what was considered everyday ‘bread’ in various time periods and regions.

1. 3. Literature review

Various aspects of bread are discussed in books written between the 16th and 19th centuries, often with the aim of instruction, thus reflecting ideals as much as the actual circumstances of the periods in question. Andrewe Boord’s 16th century ‘Dietary of helth’, treating of the health implications of housing, clothing and diet, reflects the author’s medical background as well as his intended readership’s social standing (Boord 1576). Gervais Markham’s ‘The English housewife’, with a strong moral message apparent from the title and itself a transcription of medical cures and recipes from various uncertain sources, is similarly indicative of the upper-class status of both author and readership, e.g. with regard to bread types and social standing (Markham 1615). In the 19th century, Isabella Beeton’s ‘Household management’, covering subjects as diverse as cookery, childcare and taxes, was aimed at the young middle-class wife managing a household with servants (Beeton 1861), while Henry Stephens’ ‘Book of the farm’ is an extremely detailed manual for young farmers, dealing with every aspect of mixed agriculture including ploughing, cereal cultivation, harvesting and threshing (Stephens 1860).

Few if any works published to date attempt to cover the whole story of bread from cultivation to consumption. Chris Petersen’s ‘Bread and the British economy’, a modified D.Phil. thesis in economic and social history, has one of the widest approaches of the secondary sources considered for this dissertation, dealing with the importance of bread in 18th and 19th century diets as well as the increasing dominance of wheat bread, technical developments in milling and baking, and relevant legislation, supported by the statistics and calculations characteristic of an economic history text (Petersen 1995). A significant proportion of the literature relevant to this topic comes from the field of folk-life studies/rural studies/ethnology, and while not archaeological publications themselves, these works engage extensively with material culture; examples include several works by Alexander Fenton relating to different aspects of Scottish agriculture and rural life (Fenton 1976; 1985; 2007) as well as S. Minwel Tibbott’s seminal book on Welsh domestic life (Tibbott 2002). As in the case of some specialised historical works (Blandford 1976), such volumes can have a tendency to get lost in the minutiae of their subjects (e.g. types of ploughs, ovens etc.) at the expense of the wider context, and can be frustratingly vague with regard to timescales, although this is to some degree due to the nature of the field.

Recent television programmes aimed at a more general audience have dealt with agriculture, including cereal cultivation, in the 17th, 19th and 20th centuries. These include two BBC miniseries of the reality TV/costumed interpretation/experimental archaeology genre, ‘Tales from the green valley’ and ‘Victorian farm’ (including a tie-in book), which involved some of the same archaeologist and historian participants, as well as the more traditional BBC4 documentary series ‘Mud, sweat and tractors’ (Tales from the green valley: complete BBC series 2004; Victorian farm: complete BBC series 2008; Mud, sweat and tractors: the story of agriculture 2010). The latter, while providing a good overview of 20th century developments in cereal cultivation, including the effects of agricultural policies and changing economic systems, suffers from a considerable bias in favour of modern, large-scale, ‘conventional’ agriculture, with organic farming for instance reduced to little more than a footnote (Mud, sweat and tractors: the story of agriculture 2010).

Relevant archaeological works such as Martin Watts’ ‘The archaeology of mills and milling’ (Watts 2002) and Palmer and Neaverson’s volume on industrial archaeology (Palmer and Neaverson 2001) focus on the milling stage of the story, which tends to leave the largest amount of structural remains. Renowned chef and cookery writer Elizabeth David’s book on ‘English bread and yeast cookery’, written against the background of the nascent revival of home baking in the 1970s, combines a historical overview of milling and baking with recipes (David 1977), and Felicity Lawrence’s investigative publication ‘Not on the label’ provides a disturbing insight into the modern baking industry and its use of various additives (Lawrence 2004).

Chapter 2: Cultivation

2. 1. Field systems

From the Middle Ages to around the middle of the 18th century, the open-field system with a three-year rotation of crops predominated in English agriculture. Typically wheat was followed by barley and then by fallowing (Langlands et al. 2008, 23). By the 16th century the staple crops were peas/beans, barley and wheat, though their proportions could vary significantly between individual farms (Hoskins 1951, 10-11). The introduction of clover, lucerne and sainfoin (fig. 2.1.1.) into arable rotations around the mid-17th century contributed to an increase in grain yields (Thirsk 1997, 25).

Fig. 2.1.1. Sainfoin (‘Wild flowers of Magog Down -
the pea family (leguminosae)’ 2008a)

However, the open-field system with a three-year rotation never reached Scotland (Fenton 1976, 11), whose characteristic pattern of subsistence agriculture remained fundamentally unchanged from around the third or fourth century until the 18th century, and even longer in the Highlands and Islands (Fenton 1976, 3). Open-field farming with mixed husbandry was still the typical system in the English Midlands in the early 17th century (Hoskins 1951, 10), while a four-part rotation of bere (fig. 2.1.2.), oats, wheat and peas was being introduced in the main wheat-growing areas of Scotland (Fenton 1976, 11-12).

Fig. 2.1.2. Bere
(‘Agronomy Institute research on bere’ nd.[m])


The 18th century, however, witnessed a dramatic transformation of the agricultural landscape of Britain. Enclosure, which had begun in the late 17th century in areas such as southern Scotland (Fenton 1976, 14), led to a gradual disappearance of the open-field system (Langlands et al. 2008, 23). Between 1700 and 1845 some 4000 acts of enclosure were passed by Parliament (Mazoyer and Roudart 2006, 340).

From around the mid-18th century onwards the Norfolk four-course rotation, using crops such as wheat, turnips, barley and clover, began to gain popularity, and its development and adaptation to local conditions continued into the 19th century (Langlands et al. 2008, 23). The Scottish Highlands and Islands saw a gradual conversion of the run-rig system into small consolidated holdings from around the mid-18th century onwards, although it survived in isolated pockets well into the second half of the 20th century (Fenton 1976, 24-25).

In Britain as a whole, however, the Agricultural Revolution of the 20th century had consequences every bit as far-reaching as those of the Industrial Revolution of the 18th and 19th centuries (Mud, sweat and tractors: the story of agriculture 2010). In the 1930s most grain was produced on small farms by large labour forces of “men and horses”, as the Mud, sweat and tractors TV programme put it (apparently blithely ignoring the huge contributions of women and children in all aspects of farming) (Mud, sweat and tractors: the story of agriculture 2010). Subsequently, developments in all aspects of technology, combined with a policy of government subsidies from World War 2 onwards (more on this below), led among other consequences to the draining of wetlands and the elimination of hedgerows in order to facilitate the consolidation into fewer, bigger and more specialised farms which characterised the second half of the 20th century in particular (Mud, sweat and tractors: the story of agriculture 2010).

2. 2. Population pressure and politics

Following a prolonged decline after the Black Death, population numbers began to rise again from around the year 1500. The resulting increase in the demand for grain and the looming spectre of shortages and civil disturbances led to renewed government emphasis on cereal production. Farmers were encouraged to plough up grassland and conversely punished for putting arable land down to pasture. Together with more intensive rotations and the application of more fertilisers these measures resulted in a significant increase in cereal harvests (Thirsk 1997, 23). In the 1640s productivity and specialisation increased even further in order to feed the Parliamentary army at war in England, Scotland and Ireland (Thirsk 1997, 25).

The second half of the 18th century saw a repeat of the same pattern, with another increase in population once again putting the emphasis on grain production and more grass- and heath-lands being brought into cultivation, as well as small farms being amalgamated into larger units (Thirsk 1997, 147). In the Highlands and Islands of Scotland agrarian capitalism in the form of increased rents, enclosure and the ‘Clearances’ erased many old villages and hamlets and depopulated entire areas (Fenton 1976; Symonds 1999).

Both cultivation and consumption of grain in the 19th century were significantly influenced by the passing of the Corn Law in 1815 and its subsequent repeal in 1846. This law was designed to protect British agriculture by banning the import of wheat when the average price of home-grown wheat dropped below a certain value. It therefore found favour with many farmers, but met with violent opposition from urban populations of all classes who objected to bread prices being kept artificially high, although Sheppard and Newton (1957, 60) argue that it in fact had less of an effect on the price of bread than expected. After the repeal of the Corn Law trade in Britain was essentially free, and cereal imports rose from around 1870. This in turn led to a steady and significant reduction in domestic cultivation (Borchert 1948; Fenton 2007) and to a long period of agricultural depression which lasted, it has been argued, until the beginning of World War 2, interrupted only by a temporary upturn between 1916 and the early 1920s resulting from World War 1 emergency policies whose effects proved short-lived (Blair 1941; Borchert 1948; Holderness 1985).

The 1929 crisis led to a collapse in the prices of all commodities, including cereals (Mud, sweat and tractors: the story of agriculture 2010). During World War 2, however, the acreage under cultivation rose once more thanks to the application of new scientific methods and technologies and the encouragement of ploughing-up of grassland through subsidies (fig. 2.2.1.); not surprisingly, during this period the emphasis was on increasing the quantity rather than the quality of grain (Blair 1941; Borchert 1948; Holderness 1985; Mud, sweat and tractors: the story of agriculture 2010).

Fig. 2.2.1. Land girl learning to plough
(Guardian 2009)

Facing a severe food shortage after the end of World War 2, the British government decided to continue its agricultural subsidy policy, passing the Agriculture Act of 1947 which guaranteed arable farmers minimum payments for their produce to protect them against the fluctuations of the open market (Mud, sweat and tractors: the story of agriculture 2010). However, taxpayers from other sectors of the economy soon became disgruntled with the amount of subsidies paid to farmers, and when a Labour government came into power in 1964, political pressure to reduce them mounted, and tensions and conflict between government and farmers ensued (Mud, sweat and tractors: the story of agriculture 2010).

In 1973 the UK joined the EEC, whose system of tariff barriers was designed to keep out cheap wheat from other parts of the world, thus securing a huge guaranteed market for British produce, and demand for British bread wheat increased (Mud, sweat and tractors: the story of agriculture 2010). Furthermore the Chorleywood Bread Process, which had been developed in the 1960s (more on this below), allowed for the use of a substantial proportion of home-grown wheat in the white loaf. The result of both of these developments was a huge increase in output, which led to the infamous ‘grain mountains’ of the early 1980s (Mud, sweat and tractors: the story of agriculture 2010). This in turn prompted the introduction of set-aside in 1992, a system by which farmers were paid to take land out of production as a means of regulating supply and demand (Mud, sweat and tractors: the story of agriculture 2010). Some farmers took advantage of government subsidies offered for conversion to organic production from the 1990s onwards, many doing so with the objective of making relatively small holdings profitable (Mud, sweat and tractors: the story of agriculture 2010). The reduction of EU tariffs on imported wheat in the same period meant that producers found themselves once more at the mercy of the global market (Mud, sweat and tractors: the story of agriculture 2010). Notwithstanding this, by the end of the 20th century the UK was almost self-sufficient in bread wheat (Mud, sweat and tractors: the story of agriculture 2010). Agriculture remains a high-risk business in the 21st century, with some associations of wheat growers attempting to mitigate some of the uncertainties by striking supply deals with major supermarkets (Mud, sweat and tractors: the story of agriculture 2010).

2. 3. Soils and climate

Arable agriculture is dependent to a significant degree on soil and climatic conditions. Rye, for instance, needs light sandy soils, whereas barley and legumes do well on clay (Hoskins 1951, 14). Accordingly, in the 16th century the East and the South were the main pulse-growing areas of Scotland (Fenton 2007, 201), and there is documentary evidence from the 17th to the 20th century for the cultivation of rye on a number of Hebridean islands with suitable soils such as those of the machairs (Fenton 2007, 257). From the Agricultural Revolution onwards new developments have extended the cultivation ranges of many food crops. The Norfolk four-course rotation, for example, made light soils suitable for wheat, resulting in a decline in rye-growing (Fussell 1943; Sheppard and Newton 1957), and many of the new varieties of hard wheat developed in the second half of the 20th century are suitable for less than ideal growing conditions (Holderness 1985, 48).

2. 4. Choice of crops

Until the 19th century it was common for farmers in various parts of Britain to cultivate different cereal and legume crops together in the same field in order to turn them into bread (Fussell 1943, 213). In Scotland peas were commonly combined with beans, oats or barley, and rye was grown with oats or bere with barley; the result was referred to as ‘masloch’, ‘mastillion’ or ‘mashlum’ (Fenton 2007, 201; 257). From the 18th century onwards, as mentioned, more and more land was used for wheat, with the result that rye had almost disappeared by 1750 (Sheppard and Newton 1957, 71). In the early 19th century a rising demand for higher-gluten wheat, coupled with an improving water transport system, led to the extension of commercial wheat-growing in southern and eastern England into areas previously used for pasture or the cultivation of other crops (Petersen 1995, 163).

The introduction of scientific plant breeding can arguably be regarded as one of the major advances in 20th century agriculture (Mud, sweat and tractors: the story of agriculture 2010). In the 1970s the Cambridge-based Plant Breeding Institute developed a dwarf variety of wheat with larger yields and a decreased likelihood of being blown over by the wind (Mud, sweat and tractors: the story of agriculture 2010). Since the institute was disbanded in 1987, however, plant breeding has largely been carried out by commercial companies; one of the major objectives today is the development of crop strains adapted to an increasing variety of climatic conditions (Mud, sweat and tractors: the story of agriculture 2010).

The steep decline in the use of horses in agriculture between around 1940 and 1970 was reflected in a comparable decline in the acreage under oats, with the exception of the Scottish Highlands, where oats have long held a greater significance as human food than in many other parts of Britain (Holderness 1985, 46-47). By the late 20th century wheat and barley had become the chief crops on British tillage farms, with both rye and maize playing an insignificant role (Holderness 1985, 46-47).

2. 5. Drainage and ploughing

Much of the land used for arable farming in Britain is wet and in need of drainage in order to make it suitable for ploughing and eventually for growing cereal crops. For many centuries drains were dug by hand, though simple draining ploughs are mentioned in 17th century sources (Blandford 1976, 64). Major advances in drainage technology did not take place until the 19th century, with the development of several types of mole plough designed to cut underground drainage ditches (Blandford 1976, 64-65), the invention of cylindrical drain-pipe manufacturing machines in the 1840s (Langlands et al. 2008, 24) and the introduction of systematic underground tile drainage, which helped to improve both the quantity and quality of yields as well as enabling the use of new machinery (Fenton 1976, 23). Between 1846 and 1870 ca. two million acres of land were drained at public expense (Langlands et al. 2008, 24). The availability of drainage grants in the 1960s and 1970s resulted in another peak in land drainage in the second half of the 20th century (Brassley 2000, 68).

Fig. 2.5.1. Egyptian plough, New Kingdom
(‘Wooden plough from Egypt New Kingdom
(1550-1070 BC)’ nd.[l])

The earliest known ploughs were simple scratching tools which separated the soil to both sides (Blandford 1976, 44). Both the ancient Egyptians and ancient Greeks used wooden ploughs (fig. 2.5.1.) with iron shares, and there are records of wheeled ploughs from the Roman Empire (Blandford 1976, 16; 44).


Fig. 2.5.2. Rotherham Swing Plough
(‘The Rotherham Plough’ nd.[k])

While ploughs used in Britain remained fundamentally unchanged in design from before the Norman Conquest until the 18th century, there was significant regional variation in response to differing soil and climatic conditions (Blandford 1976, 50-51). 16th and 17th century ploughs, such as the Kent Plough used on the chalk and marsh soils of southeast England, were generally of heavy construction, but lighter ploughs began to be developed along the East Coast in the 18th century (Blandford 1976, 52-53). During this period the main structural components and points of wear of ploughs were increasingly made from iron (Blandford 1976, 50), first wrought and, following Robert Ransome’s patent in 1785, also cast (Fussell 1948, 90). The increasing use of horses rather than oxen for ploughing during the 18th and 19th centuries stimulated many advances in plough construction, with the Rotherham Swing Plough (patented 1730) representing the first commercially successful iron plough as well as one of the first to be produced on a factory scale (fig. 2.5.2.); its design remained standard until the advent of steam power (Blandford 1976; Langlands et al. 2008).

Fig. 2.5.3. Modern tractor ploughing
(‘The farm today’ nd.[d])

The manufacture of ploughs became highly industrialised during the course of the 19th century, with mass-produced standardised components and scientifically designed ploughs (Blandford 1976; Langlands et al. 2008). When steam tractors came into use in the latter part of the 19th century, followed not too long afterwards by those with internal combustion engines, plough attachments were developed which coupled to the tractor hitch (Blandford 1976, 70-71). The first officially sponsored tractor trials in England, organised by the Royal Agricultural Society in 1910, featured both steam and oil-powered models (Blandford 1976, 33). It was the huge demand for food production coupled with a lack of manual labourers during World War 1, however, which gave major momentum to the development and use of tractors in British agriculture (Blandford 1976, 34). Modern high-powered tractors of the late 20th and early 21st century are able to cut a large number of furrows at one pass (Blandford 1976, 71) (fig. 2.5.3.).

Fig. 2.6.1. Broadcast sowing
(‘Holy propagation: scatter, layer, cut’ 2009b)

2. 6. Sowing

Before the 18th century broadcasting was the common way of sowing grain in Britain. Sowers followed the plough, carrying seed in an apron or a specially designed container called a seedlip and scattering handfuls of it as evenly as possible across the field, a method which demanded a steady hand and a good sense of rhythm (Dorrington 1998a; Langlands et al. 2008) (fig.2.6.1.). Although this technique left a sizable proportion of the seed exposed to birds and other pests, it continued in use well into the 19th century (Fussell 1952; Langlands et al. 2008). Dibbling began to be used as an alternative to broadcasting in the 17th century but did not become widespread until the following century (Fussell 1952; Blandford 1976; Dorrington 1998a), around the same time as it was itself being supplanted by the introduction of mechanical seed drills (Blandford 1976, 77).

While there is evidence for some form of seed drill at least as far back as 2800 BC in China, this device was only re-invented in Britain in the early part of the 18th century, when numerous models were designed, most of them not very practical (Anderson 1936; Fussell 1952). After Jethro Tull introduced the first successful design in the 1730s (fig. 2.6.2.), the use of seed drills slowly spread through the main cereal-growing areas, not coming into widespread use until the mid-19th century (Anderson 1936; Langlands et al. 2008). While travelling seed drills could be hired by farms which did not have their own (Anderson 1936, 189), and the seed fiddle was introduced from the U.S. after ca. 1850 (Fenton 1976; Dorrington 1998a), much corn was still sown by hand in the middle of the 19th century, especially on smaller farms, due largely to the cost of machinery (Stephens 1860, 31-32).


Fig. 2.6.2. Reconstruction of Jethro Tull’s seed drill
(Anderson 1936, 170)

One of the advantages of the mechanical seed drill lay in its sowing of the grain in even rows, allowing for the use of horse-drawn hoes between the rows and in turn greatly increasing yields (Langlands et al. 2008, 46). Today’s modern tractor-drawn drills cover a wide area of ground at the same time while functioning on the same principle as their horse-drawn predecessors (Blandford 1976, 79).

2. 7. Fertilising

Until the mid-19th century the only available fertilisers were organic waste materials. As well as dung produced on the farm or collected from town stables, early farming manuals refer to seaweed, oyster shells, fish, bones, horns, blood, rags, hair, ashes, soot and malt dust among others, many of them by-products of urban industries (Fussell 1948, 87). The mixing of different types of soil, a process known as marling or claying, was another means of improving yields by enhancing soil structure (Fussell 1948, 87). Artificial fertilisers began making an impact in the 1840s, with nitrate of soda being imported from Chile (Fussell 1948, 87) and the systematic exploitation of phosphate materials such as bones, certain types of sand and chalk beginning in the same period (Mazoyer and Roudart 2006, 367). Guano was imported into Britain from 1840 onwards. Experiments with chemical fertilising substances at the Rothamsted experimental station in Hertfordshire led to the commercial production of superphosphate manure from 1842. This, along with potash salts becoming cheaply available from 1861, allowed many farmers to free themselves from the commitments of long-term crop rotations and specialise in cash crops, with cereal growers no longer being forced to keep livestock solely for manure (Fussell 1948; Langlands et al. 2008).

World War 2 saw a doubling of the usage of chemical fertilisers in an effort to raise agricultural yields (Holderness 1985, 7). An increasing variety of chemical herbicides and insecticides became available from the late 1940s onwards, though it took another 20 years or so before their toxic effects on humans, animals and the wider environment became a matter of general awareness (Mud, sweat and tractors: the story of agriculture 2010).


Fig. 2.7.1. Miraculous new chemicals:
DDT advertisement, LIFE magazine, 1945
(‘Rachel Carson’ 2007)

The infamous insecticide DDT (fig. 2.7.1.) was banned in Britain in 1984, and general legislation regulating the use of pesticides was introduced in 1986 (Mud, sweat and tractors: the story of agriculture 2010).

Chapter 3: Harvesting

3. 1. Shearing and binding

From the Middle Ages until the early 19th century the main tool for shearing grain was the sickle, also known as the hook (Fenton 1985, 114). In 16th century Scotland the saw-toothed sickle was commonly used in the Lowlands (Fenton 1985, 115-116), and there is evidence for a small semi-circular smooth-bladed sickle being used in the Northern Isles between 1633 and 1900 (Fenton 1985, 116), while a broader smooth-bladed version was the standard tool in southern Scotland from the end of the 18th century (Fenton 1985, 116) (fig. 3.1.1.).

Fig. 3.1.1. Smooth-bladed sickle
(Dorrington 1998b)

Experiments carried out in the late 18th and early 19th century on the use of scythes for cutting cereal crops were initially unsuccessful, especially in the south of Scotland, due to the absence of smooth, stone-free ground prior to the development of horse-drawn rollers, as well as to many farmers’ belief that scythes were inclined to shake the grains off the stalks (Fenton 1976, 120). However, since they speeded up the work, scythes steadily gained popularity for grain harvesting in northeast Scotland, where, unlike in the south, there was a shortage of manpower; they remained the main grain-cutting tool there until their replacement by horse-drawn reapers in the second half of the 19th century (Fenton 1985, 122).

Fig. 3.1.2. McCormick’s reaper (‘McCormick reaper’ nd.[j])

The development of horse-operated reaping machines has been regarded by many as the revolution in harvesting technology (Fenton 1985, 125). While there are references to mechanised reaping-implements as far back as the works of Pliny and Palladius, modern experiments began in the late 18th century (Langlands et al. 2008, 256).

Numerous early versions failed; those with scythes attached to furiously rotating wheels tore up and trampled the crop, and others were too complex and fragile to be practical (Langlands et al. 2008, 256-257). The turning point was the development of Cyrus McCormick’s reaper, patented in 1834 in the U.S. and exhibited at the Great International Exhibition in London in 1851 (Fenton 1985, 128) (fig. 3.1.2.). One of the first effective designs, it cut the grain with a knife going back and forth and was drawn by a team of horses walking in front of the machine and along the side of the grain (Fenton 1985, 128).

Fig. 3.1.3. Hornsby reaper binder
(Langlands et al. 2008, 257)

By the 1860s horse-drawn reapers were becoming popular in most parts of Britain, and the development in the 1870s of the ‘reaper binder’, which had an integrated mechanism for binding the crop into sheaves, further increased their usefulness and appeal (Langlands et al. 2008, 257) (fig. 3.1.3.). In the 1930s tractor-drawn reaper binders were in use alongside horse-drawn ones (Mud, sweat and tractors: the story of agriculture 2010).

Fig. 3.1.4. Tractor-drawn combine harvester
(Rucker 2007)

Fig. 3.1.5. Self-propelled combine harvester
(Chisholm 2007)

Arguably the most significant development in 20th century arable farming, however, was the combine harvester, which threshes the grain as it cuts (Fenton 1976, 89). American combine harvesters were exhibited at the 1934 Royal Show, but in those early days only very few were used in Britain, usually on large estates (Mud, sweat and tractors: the story of agriculture 2010) (fig. 3.1.4.). However, as part of the World War 2 drive to increase food production, self-propelled combines were leased from America (Mud, sweat and tractors: the story of agriculture 2010). Since the introduction of combine harvesters better suited to British conditions after the war, the number in use has increased steadily (Blandford 1976, 133) (fig. 3.1.5.).

3. 2. Threshing

Fig. 3.2.1. Flail threshing (Butterworth 1892, 23)

In the 16th and 17th centuries grain was threshed by various manual methods, generally throughout the winter months (Langlands et al. 2008, 258). There are references to flail-threshing in Scotland from 1375 onwards (Fenton 1985, 138) (fig. 3.2.1.); other methods commonly used in various parts of Britain included rubbing the ears between bare feet (to preserve the straw for thatching) (Fenton 1985, 36-37), people treading the sheaves, lashing the ears against hard or toothed surfaces (Fenton 1985, 134-137) and plucking the grains by hand (Fenton 1976, 52).

Fig. 3.2.2. Swing rioters (‘Captain Swing’ nd.[i])

The introduction of threshing machines caused much resentment in the major grain-growing areas of Britain, since it deprived many rural labourers of what was often their only source of employment in winter (Langlands et al. 2008, 258). Their anger found expression in protests and uprisings such as the infamous Swing Riots of the early 1830s, when threshing machines were destroyed and wheat ricks set on fire (Langlands et al. 2008, 261) (fig. 3.2.2.; fig. 3.2.3.).

Fig. 3.2.3. ‘Wanted’ poster, 1831
(‘The Swing Riots’ nd.[h])

However, the spread of threshing machines continued apace. The first ‘threshing mills’ were fixed installations in barns (Langlands et al. 2008, 258). Early models were water-powered, based on mechanised flails and rather dangerous (Fenton 1976, 83). The first fully successful design was patented by the Scotsman Andrew Meikle in 1788; powered by a horse, it fed the sheaves through a pair of rollers into a revolving drum, where the grain was knocked out of the ears by the speed of rotation (Fenton 1976; Langlands et al. 2008).


Before the 1830s threshing machines were powered by water, horse or wind (Fenton 1976, 87). On farms where water was used as the motive power, mill-dams and mill-lades became part of the layout, along with water-wheels alongside barn walls (Fenton 1976, 87). Where horses were used as the power source, circular horse-walks, either covered with overhead gearing or open with underground gearing, were built against the barns (Stephens 1860; Fenton 1976) (fig. 3.2.4.; fig. 3.2.5.).

Fig. 3.2.4. Covered horse walk attached to barn at
Little Powgavie, Perthshire
(Fenton 1976, 88)
Fig. 3.2.5. Covered horse walk attached to barn at
Cossans, Angus
(Fenton 1976, 88)

In ‘The Book of the Farm’ Henry Stephens describes how the threshing power source influenced the most advantageous siting of a farmstead (Stephens 1860, 79). Threshing machines quickly replaced flails on many larger farms (Fenton 1985, 133), though Stephens (1860, 495) mentions that flail-threshing was still common practice at the time of writing.

From the 1830s onwards steam power replaced horse-power for threshing machines, in part because the machine’s juddering motion was detrimental to horses (Langlands et al. 2008, 258), and tall chimney-stacks for the coal-fired boilers appeared on farms (Fenton 1976, 87). A further important development came in the form of moveable threshing machines, which contractors hired out to farmers (Fenton 1976; Langlands et al. 2008) (fig. 3.2.6.). On small Scottish crofts flails were gradually replaced by small hand- or pedal-operated threshers from around the 1830s, but flail-threshing was practised well into the 20th century (Fenton 1976, 89).

Fig. 3.2.6. Threshing with movable threshing machine
powered by steam engine (Eastwood 2009)




When tractors came into widespread use from around the time of World War 1, the tractor pulley was used to drive the threshing machine (Fenton 1976, 89), and finally the advent of combine harvesters, first tractor-drawn and then self-propelled, which thresh the grain as they cut, eliminated the need for barn threshing-mills (Fenton 1976, 89). However, many smaller farms retained older, proven technologies which did not require great expenditure; in Scotland, for instance, horse-driven threshing mills did not completely disappear until the second half of the 20th century (Fenton 1976, 89).

3. 3. Winnowing

Fig. 3.3.1. Winnowing with barn fanners (Fenton 1976, 92)

Until the early part of the 18th century winnowing was carried out by hand, either in the open air or in a through-draught in the barn, using a skin stretched over a wooden rim (Fenton 1976, 93). Winnowing machines, also known as corn fanners, were introduced from the 1730s onwards, becoming components of threshing mills (Fenton 1976, 91) and eventually of combine harvesters (Blandford 1976, 131) (fig. 3.3.1.).

Chapter 4: Milling

4. 1. Politics and economics

Once feudal and ecclesiastical landlords had lost their manorial rights, which typically included the milling of all grain on their lands, independent mills multiplied; in grain-growing areas there were frequently several in a village and water mills in every riverside town (David 1977, 14). However, the development of grain mill structures and milling technology in the immediate post-medieval period is somewhat poorly represented in the archaeological record, on the one hand due to a lack of identifiable remains, and on the other hand due to the scale of alterations to existing mill buildings from the period and the fact that there is little if any surviving contemporary machinery (Watts 2002, 117).

While bakers had previously sent their own grain to the mill to be turned into flour, in the 18th century millers began buying grain themselves and selling flour and by-products (e. g. bran) (Sheppard and Newton 1957, 31). Larger institutions such as prisons, workhouses and asylums often had their own mills, as well as bakeries, at this time (Petersen 1995, 49). During the time of the Napoleonic Wars of the late 18th and early 19th centuries associations of working class people founded cooperative mills, often named Union Mills, in cities such as Birmingham, Halifax, Leeds and Sheffield, some of which had considerable impact on the local and regional milling trade. The opening of a Union Mill in Wolverhampton, for example, soon led to a significant reduction in the output of another (privately-owned) local steam-mill (Trinder 1993, 124) (fig. 4.1.1.).

Fig. 4.1.1. Union Mill, Wolverhampton
(‘Industry and the Canal - Union Mill’ nd.[g])

Until the 1830s virtually all wheat flour consumed in Britain was produced domestically, with imports only rarely exceeding 1% of the total, and while imports increased considerably after 1831 the great majority of flour and meal consumed continued to be home-produced until the onset of the agricultural depression of the 1870s (Borchert 1948; Petersen 1995; Fenton 2007). From the late 19th century onwards there was a progressive consolidation of the milling industry, characterised by the merger of small rural mills, trustification and the establishment of large-scale, highly mechanised mills in major ports, with the result that by 1939 the ‘Big Three’ milling concerns (Joseph Rank Ltd., Spillers Ltd. and the Co-operative Wholesale Society) controlled ca. 66% of total flour production in Britain (Burnett 1995, 72).

4. 2. Querns and hand-mills

The use of querns for grinding grain has a long history in Britain. Saddle and trough querns had been in use since prehistoric times and rotary querns since the Roman period (Fenton 1976, 101) (fig. 4.2.1.). Notwithstanding this long period of usage, there are very few intact medieval querns, since people circumventing the landlord’s right to have all the grain grown on his land milled at his mill (and to receive a fee in kind for this service) were routinely punished by having their querns confiscated and destroyed (David 1977, 40); this still took place as recently as the 19th century in the Hebrides (Dodgshon 1992; MacLellan 1997). In Scotland as a whole querns had gone out of widespread use by the mid-19th century (Fenton 1976, 102), though they continued to be used into the 20th century in more isolated areas, especially where other types of mills were scarce (Fenton 1976; David 1977) (fig. 4.2.2.).

Fig. 4.2.1. Neolithic saddle quern from Wales
(‘Saddle quern and grain rubber
used by Wales' first farmers
c. 5,000 years ago’ nd.[f])

Today a variety of small hand- or electrically-powered grain mills made by a number of international manufacturers are marketed towards health-conscious consumers who want to grind their own grain for baking (BeSmart website 2004a; ‘Manual grain mill basics or Grinder 101’ 2009a) (fig. 4.2.3.).

Fig. 4.2.2. Rotary quern in use
in Shetland, ca. 1910-1920
(Watts 2002, 43)
Fig. 4.2.3. Modern hand grain mill
(‘Grain mills – hand mills’ 2004b)
Fig. 4.3.1. Illustration of 14th century post mill from the Smithfield Decretals (Langdon 2004, 289)




4. 3. Windmills

While the origin of the windmill has been a matter of much debate, its modern form is believed to have been developed in 12th century Europe (Palmer and Neaverson 2001, 55). There is evidence for windmills in Britain from the end of the same century (David 1977, 21), with their earliest incarnation being the post mill, as seen in medieval manuscript illustrations (fig. 4.3.1.). Its wooden body contained the millstones and turned on a vertical post mounted on a timber frame; the whole body of the mill had to be manually turned into the wind using a tail pole (Sheppard and Newton 1957; Palmer and Neaverson 2001) (fig. 4.3.2.). Since these wooden structures were, due to their very purpose, subject to frequent wind damage, very few survive.


Fig. 4.3.2. Internal workings of a typical post mill
(‘Windmill machinery’ nd.[e])

The tower mill, invented around the mid-14th century, was somewhat less laborious to operate in that its sails were carried by a cap on top of the structure, which was turned to face the wind, the main body being fixed (Sheppard and Newton 1957, 9). While windmills soon became numerous in various parts of England and Wales, there is no record of them in Scotland until the middle of the 16th century, and they were never widespread there (Watts 2002, 132). This might be due to environmental conditions being less suited to the operation of windmills, or perhaps simply to neither physical remains nor documentary evidence surviving. Some examples were, however, erected on Shetland and on Orkney in the 18th and 19th centuries, due to a shortage of suitable streams for water mills (Fenton 1976, 105).

Written sources indicate that by the 17th century London was surrounded by at least ten windmills supplying the city with flour and meal (David 1977, 14). The 18th century saw significant developments in the technology of wind milling, including remote-controlled sails and the invention of an automatic winding device called a fantail (patented 1745) (Watts 2002, 147), as well as a marked increase in the use of cast and wrought iron in working parts of the mill machinery in an effort to increase their durability (Watts 2002, 117). From the end of the century onwards wooden post mills were increasingly replaced with more durable brick or stone tower mills, of which far more examples are extant (Palmer and Neaverson 2001, 57). Tower mills arguably reached the pinnacle of their development in the first half of the 19th century, with many of the most impressive structures, such as Wilton Mill, Wiltshire (erected in 1821), being built around the time of the Napoleonic Wars (fig. 4.3.3.). Many other examples from this period survive, particularly along the North Sea coast in Lincolnshire and East Anglia, and also in Holland (Trinder 1993; Palmer and Neaverson 2001) (fig. 4.3.4.).

Fig. 4.3.3. Wilton Mill, Wiltshire
(Wilton Windmill website nd.[c])
Fig. 4.3.4. Stembridge tower mill, Somerset,
dated 1822 (Goodey 2009)

4. 4. Water mills

Fig. 4.4.1. Workings of a
horizontal water mill
(Watts 2002, 64)
Fig. 4.4.2. Horizontal water mill
with dry stone walls and turf roof,
Shetland, photographed ca. 1900-1910
(Watts 2002, 17)

The origin of water mills, too, is far from clear, but they probably date from pre-Roman times (Palmer and Neaverson 2001, 55), though documentary evidence for Britain only goes back as far as AD 762 (Watts 2002, 72). The oldest surviving type is the horizontal-wheeled mill, in which the upper stone is mounted directly on a vertical shaft with a set of paddles (known as the tirl) attached at its lower end. Water was channelled by a leat into the chute and on to the paddles in a chamber below the lower stone (bedstone) (Palmer and Neaverson 2001, 55) (fig. 4.4.1.).



Early water mills in northern Scotland typically consisted of small rubble-stone buildings, often oval in shape, containing the millstones. They were constructed in such a way as to require only small amounts of running water and allow many different mills to use the same stream. Catering for single farms or hamlets, they were characteristic of areas of dispersed settlement (fig. 4.4.2.). In areas of predominantly nucleated settlement, such as Lowland Scotland and large parts of England, village or manorial mills usually served an entire community (Palmer and Neaverson 2001, 56).

The development of gearing in 16th century corn mills made it possible for more than one pair of millstones to be driven from a single wheel, either simultaneously if enough water power was present or alternately (Watts 2002, 119). The early 17th century mill at Norbury (Derbyshire), for example, had two waterwheels housed in opposite ends of the mill building (Watts 2002, 120). Written sources from this period indicate the use of under-, over- and breastshot water wheels (Watts 2002, 129). Both size and power of waterwheels in grain mills increased in the course of the 17th century (Watts 2002, 122), and the 18th century saw a significant increase in the use of iron in milling machinery, as in the case of windmills (Watts 2002, 117).

Fig. 4.4.3. Woodbridge tide mill, Suffolk (restored)
(Tandy 2003)

Tidal mills were another type of watermill, constructed where both a suitable tidal range and a long coastal inlet were present. Their most characteristic feature was a gated dam built across the inlet to contain the incoming tide and subsequently release the water through a wheel on the outgoing tide. A house usually adjoined the mill building, since the mill’s dependency on the tides necessitated operation at varying times of day or night. While few tidal mills survive, largely due to storm damage, their former existence is often indicated by remnants of the dam, which also provided access to the mill. Working tide mills remain at Woodbridge, Suffolk (fig. 4.4.3.), and Eling, Hampshire (Palmer and Neaverson 2001, 56).

4. 5. Steam mills and roller milling

The application of steam power to grain milling in the latter part of the 18th century significantly increased output. Around 57 steam engines were installed in British flour mills before the turn of the century, either in new purpose-built steam mills or in converted water mills (Trinder 1993, 122). However, the most famous of the early steam mills, Albion Mill at Blackfriars in London, had a short and not very successful life; opened in 1784, it was a commercial failure, never operating at full capacity and finally burning down in 1791 (Trinder 1993, 122) (fig. 4.5.1.).

Fig. 4.5.1. Albion Mill burning down
(‘The “fireproof” mill’ nd.[r])




While the harnessing of steam power doubtlessly had an impact on the milling industry, some experts caution against exaggerating the speed or extent of the transformation, noting, for example, that two-thirds of the horsepower used for grain milling still came from water in 1870 (Petersen 1995, 66). Nevertheless the increase in grain imports in the second half of the 19th century encouraged the construction of steam mills in ports and along major waterways (Palmer and Neaverson 2001, 58).

Fig. 4.5.2. Transverse section of modern roller mill
(‘Welcome to the KMEC Company FMFQ Roller Mill,
pneumatic wheat flour mill auto
or manual control’ nd.[q])

Arguably the most important development in grain milling technology in the 19th century, and still the basis of modern machinery today, was roller milling (fig. 4.5.2.). Invented in Switzerland in 1834, it was adopted on a large scale in the US before becoming popular in Britain in the latter part of the century and, being significantly more efficient (Trinder 1993, 124), gradually superseded the ancient method of grinding grain between millstones.

The first roller mill in Britain opened in Glasgow in 1872, and eventually this new technology, promoted to no small degree by the International Exhibition of Flour Mill Machinery in London in 1881 (Trinder 1993, 125), rather than steam milling alone, led to the decline of both water- and windmills (David 1977, 25), many of which lacked the money, the power or the space necessary to install roller milling plants (Palmer and Neaverson 2001, 55). One of the reasons the new technology became so popular so quickly was the fact that it removed the germ from the grain before grinding, which allowed for the production of uniformly white flour with significantly increased storage potential (Sheppard and Newton 1957; David 1977). It was largely due to the impact of roller milling that milling became a large-scale industry in Britain towards the end of the 19th century (Burnett 1966, 104-105).

4. 6. Bolting

Fig. 4.7.1. 19th century French
burr millstone with local
sandstone centre,
Livingston,
West Lothian, Scotland
(Watts 2002, 64-65)

Before the late 17th century flour had to be sifted (bolted) by hand after milling, a laborious and time-consuming process. The invention of an automatic bolter, agitated by the milling machinery and requiring only a small capital investment, eliminated this requirement and cut down on flour wastage (Petersen 1995, 53). In the 18th century advances in weaving techniques had a knock-on effect on milling technology in the form of fine silk replacing canvas, linen or wool in meal sieves, leading in turn to finer and whiter flour being produced (Sheppard and Newton 1957; David 1977). It was only in the latter part of the 20th century that a newly developed type of nylon replaced silk for this purpose (David 1977, 31).



4. 7. Millstones and buildings

As discussed above, until the later 19th century grain was ground between circular stones (Sheppard and Newton 1957, 9). 16th century mills used millstones made from various kinds of rock and sourced from different parts of Europe, such as German lava stones, French millstones and those made of native ‘Millstone Grit’ (Watts 2002, 142). The term ‘burrs’ for shaped blocks of French stone, considered to possess superior grinding power, first appears in early 17th century sources (Watts 2002, 143) (fig. 4.7.1.).

With regard to mill buildings a significant increase in size can be demonstrated in the immediate post-medieval period, due to the introduction of additional equipment such as grain-drying kilns and dressing machines; furthermore granaries for the storage of grain as well as domestic accommodation for the miller were frequently erected alongside the mill itself (Palmer and Neaverson 2001, 56). Technological developments from the time of the Industrial Revolution onwards, designed to meet the demands of a rapidly expanding population by increasing productivity, instigated another series of changes to mill buildings and equipment (Watts 2002, 147). Extra floors for grain storage led to an increase in the height of buildings, while new machinery was introduced, for example, to clean the grain before milling and to refine flour more efficiently. Water wheels became larger and more efficient and, as described above, mechanical components used in shafting and gearing mechanisms were increasingly made of iron (Watts 2002, 147). The late 19th and early 20th centuries saw the construction of huge steel and reinforced concrete milling plants in major ports, such as the Homepride complex in Wallasey, Merseyside, built from 1898 onwards (Trinder 1993, 125).

Chapter 5: Baking

5. 1. Laws, politics and economics

Having been instituted in the 13th century and amended at various points in the following centuries, the assize of bread regulated the types of bread that could be sold by bakers as well as the prices which could be charged for them. It was based on bakers being paid a fixed fee to cover their operating costs and living expenses, which was added to the current price of raw materials (Petersen 1995, 99). The main text of the statute having remained unchanged for almost 500 years, much of its language had become so obscure as to be almost indecipherable by the early 18th century, and in various regions its meaning was either being guessed at, or the assize had been abandoned altogether (Petersen 1995, 99). A new and comprehensive Bread Act was passed in 1709, specifying in simple and updated language the permissible types of bread and allowing local magistrates to set the assize either by weight or by size: loaves of fixed weight (e. g. the quarter-peck or quartern loaf) could be sold at varying prices according to the market price of ingredients, or loaves of varying weight (e. g. the penny loaf) at fixed prices (Petersen 1995, 99-100). From 1815 to 1836, however, the assize of bread was gradually abolished, first in London and subsequently in the rest of the country (Petersen 1995, 115). While this led to an increase in the number of bakeries, they remained small businesses, and widespread undercutting lowered the standard of bread being produced. It was only following the Sale of Food and Drugs Act of 1875 that the number of small, often cramped and unsanitary cellar bakeries dropped and there was an upswing in the number of larger enterprises which had the necessary capital to invest in powered machinery (Trinder 1993, 125).

5. 2. Home baking

5. 2. 1. Girdles and bakestones


Fig. 5.2.1.1. Welsh bakestone
(‘Kitchenalia collection’ 2008b)

Baking bread on some type of flat surface over a fire was a well-established method by the 16th century. Girdles for baking flat breads were known in Scotland from at least the 14th century, being commonly used for oatcakes and bere bannocks (Fenton 1976; 2007), and circular iron plates for baking had been introduced into Wales in the Middle Ages (Tibbot 2002, 81) (fig. 5.2.1.1.). In England it was apparently a common enough method to be condemned by the physician Andrewe Boord as “not laudable” (Boord 1576). With regard to Wales various forms of evidence, including 17th and 18th century inventories and oral evidence collected in the 20th century, show that portable bakestones of varying materials and designs were in regular use across the social strata into the 19th and 20th centuries (Tibbot 2002, 81-82).

5. 2. 2. Ovens

Fig. 5.2.2.1. Devon gravel-tempered clay oven
found in Tudor part of Welsh farmhouse
(David 1977, 263)

There is evidence for built-in wall ovens in domestic buildings from the early 17th century. Some preserved specimens exist of Devon gravel-tempered clay ovens of this period; built into the side walls of open hearths, they were the typical bread ovens of West Country cottages as well as being exported to Wales, Ireland and as far as America, and continued to be made in Barnstaple potteries until 1890 (David 1977; Tibbot 2002) (fig. 5.2.2.1.).


Looking at Wales, we can see a development of ovens and baking technologies characterised by strong regional differences and the very gradual spread of new inventions. While wall ovens, made of clay (as described above), stone or fire-brick and fired with wood or peat, became common in southeast Wales from the beginning of the 17th century, they spread very slowly through the rest of the country, not being used in most large farmhouses until the late 19th century and never in some parts of the Northeast (Tibbot 2002, 94-96). Early examples of stone or brick ovens were either built into the large fireplace itself or to one side of it (Tibbot 2002, 95). ‘Outside kitchens’ or ‘outhouses’ present another case of slow diffusion: recorded as extant in southeast Wales in the 17th century, these separate buildings, which typically contained a copper for boiling water as well as a baking oven and were used for baking, washing and preparing animal food, did not become widespread until the 19th century (Tibbot 2002, 95). In many parts of Wales the pot-oven was the only available type of oven until the latter part of the century (Tibbot 2002, 91). The open range, too, while developed in the mid-18th century, took 100-150 years or longer to come into widespread use (Tibbot 2002, 115). In urban areas of Wales communal brick ovens were constructed in the 19th and 20th centuries to serve a number of private dwellings and can be regarded as precursors of larger custom-built bakehouses such as Derwen Bakery, established in Aberystwyth in 1900 and now part of the Museum of Welsh Life (Tibbot 2002, 101-102) (fig. 5.2.2.2.).

Fig. 5.2.2.2. Derwen Bakehouse,
Museum of Welsh Life
(Smith 2005)

Interestingly it appears that, unlike in England, Wales or other parts of Europe, built-in ovens never played a significant role in Scotland, except in castles and other noble households where wheat bread was baked and consumed regularly (Fenton 1976, 199; 2007, 81). This raises a chicken-and-egg question: did the majority of the population consume no wheat bread because they did not have ovens to bake it in, or did they not have ovens because they did not eat raised wheat bread and thus did not need them? The existence of pot-ovens in Scotland (Fenton 1976, 199) would suggest that raised breads could be baked if desired, but evidence suggests a general preference for flat breads such as bannocks and oatcakes. More research may help to shed light on this point.

By the early 20th century gas and electric cookers were rapidly replacing coal ranges in urban areas (David 1977, 163). In Wales built-in cast iron wall ovens gradually superseded pot-ovens, although initially there was some reluctance to give up the pot-oven, which was considered by many to produce better bread (Tibbot 2002, 91-93). Built-in wall ovens continued to be used in rural Wales until the spread of electricity in the 1950s (Tibbot 2002, 100). However, even then the cost of the new power source meant that priority was often given to electric lighting (Tibbot 2002, 116-117), and some preferred cooking and baking over an open fire or on a range for other reasons (Tibbot 2002, 120). The development of smaller and cheaper electric cookers such as the Jackson or Baby Belling models (fig. 5.2.2.3.) contributed to a significant degree to the eventually widespread switch to electric cooking and baking in rural Wales in the second half of the 20th century (Tibbot 2002, 125).

The invention of the fan oven, introduced by Belling in 1971, represents an adaptation of an efficient system of hot-air circulation to domestic baking, largely eliminating the need to move loaves of bread around the oven for even baking (David 1977, 166) (fig. 5.2.2.4.).

5. 2. 3. The decline of home baking

Fig. 5.2.2.4. White wheat bread baked
by the author in a
modern electric fan oven
(photograph by David Byrne)
Fig. 5.2.2.3. 1951 Belling Baby model 51
table-top cooker
(‘Electric cooking gadgets –
1951 Belling Baby model 51
table-top cooker’ nd.[p])

The decline of home baking in the 19th century can be attributed to various factors, most of which were related to the ongoing industrialisation and urbanisation of British society (Burnett 1966, 3). Many of the growing number of urban dwellers found themselves in tenements without ovens, and those lucky enough to live in artisans’ houses equipped with metal ranges found their so-called ‘bread ovens’ unsatisfactory for their needs, being too small to bake an adequate supply of bread for a large family as well as having a tendency to produce dry or burnt loaves (Petersen 1995; Tibbot 2002). Furthermore commercially baked bread was cheaper unless one had access to free or virtually free flour or grain, cheap fuel and clean water (Petersen 1995, 45); it was also more convenient to buy bread from a baker and thus eliminate the hard work and mess (which tended to attract pests) that home baking entailed (Petersen 1995, 48).

It was for reasons such as these that the 19th century witnessed a steady disappearance of both suitable ovens and skilled home bakers (Petersen 1995, 49). Domestic bread baking persisted longer in the Midlands and North of England due to the easier availability of fuel (Burnett 1966, 3), and in some areas a type of compromise was popular whereby bread dough was prepared at home and then taken to the bakery to be baked for a small fee (Trinder 1993; Petersen 1995). In rural Wales home baking remained the norm well into the 20th century and bakery bread unpopular, being commonly referred to as bara starfo (‘bread for starving’) (Tibbot 2002, 99-100). It was only when bakers’ vans began delivering fresh bread to village shops and isolated farms after World War 2 that domestic bread-making declined here, too, and many of the old wall-ovens were abandoned or closed off (Tibbot 2002, 110).

In the late 20th and early 21st centuries home baking of bread and other articles has undergone something of a revival in Britain, inspired to no small degree by TV celebrity chefs such as Delia Smith and Nigella Lawson (Wallop 2007). This can be witnessed in the popularity of baking courses as well as in the greatly increased sales of baking products and bread machines (O’ Brien 2002; Wallop 2007). However, a look around any supermarket would suggest that commercially-produced sliced bread, so far, is still king.

5. 3. Yeast and other leavens

Traditional bread fermentation techniques varied significantly between different parts of Britain (Samuel 2002, 171). The 16th century English physician Andrewe Boord regarded leaven of any kind with a degree of suspicion, calling it ‘heavy and ponderous’ (Boord 1576). Ale yeast or ale barm was commonly used for bread-making in 16th and 17th century England, though the above-described ‘brown bread’ for servants was made with a sourdough leaven (Markham 1615; David 1977). Yeast became a subject of scientific investigation in the 17th century, though specialised baking yeast did not become available until the mid-19th century (David 1977, 91-93), and brewers’ or distillers’ yeast was used almost exclusively until the late 1800s (McLaren and Evans 2002, 171). In the countryside, where yeast of any kind was often hard to procure, unleavened or soda breads were commonly baked (Petersen 1995, 45).

Fig. 5.3.1. Proofing dried yeast
the 21st century way,
on an overheating laptop
(photograph by David Byrne)

While self-raising flours and bread mixes containing dried yeast came onto the market in the 1870s (David 1977, 75), early 20th century rural Welsh home bakers generally used their own homemade liquid yeast, and their urban counterparts purchased liquid brewers’ yeast from inns (Tibbot 2002, 87). Some English bakers continued making bread with ale barm into the 1920s, citing its superior taste as the main reason (David 1977, 101).

Nowadays bakers’ yeast is made in specialised factories (David 1977, 93), and dried (including instant) yeast is commonly used by home bakers (fig. 5.3.1.), partly because fresh compressed yeast appears to be very difficult to obtain in Britain, with only very few bakers grudgingly ‘obliging’ customers by selling some of theirs, in contrast with other European countries such as Denmark (David 1977, 115) or Germany, where fresh yeast can be found in the chilled cabinets of almost every food shop. The situation in Ireland, incidentally, seems to be comparable to that in Britain; this author, when trying to purchase some fresh yeast from various supermarket bakeries in the 1990s and 2000s, was left with the distinct impression that it would have been an easier and less furtive undertaking to obtain a quantity of class A drugs. Modern bread mixes as sold in shops are mostly of the Irish soda bread type, containing chemical raising agents (David 1977, 74).

5. 4. Commercial baking

Historically, commercial baking in Britain has tended to be on a smaller scale than in other European countries, particularly with regard to the range of breads, rolls and other products (McLaren and Evans 2002, 172). It has been argued that English consumers’ demand for warm bread, as well as for the cooking service provided by neighbourhood bakers, was the principal obstacle to the establishment of factory-scale bakeries before the late 19th century (Trinder 1993; Petersen 1995). Furthermore, home-baking was very popular in certain areas, such as northern England, as evidenced by the fact that in 1804 Manchester, with a population of nearly 100,000, did not have a single commercial bakery (Sheppard and Newton 1957, 31). The situation was somewhat different in Scotland, where bakers used a different process better suited to factory-scale production and consumers preferred cold bread; as a result, bread was being mass-produced in Glasgow before the end of the Napoleonic Wars (Petersen 1995, 76).

From around 1800 the design of commercial ovens was improved by the introduction of a separate coal-burning chamber, from which hot air was carried around the back of the main baking oven by means of a flue, while wood faggots were burnt simultaneously inside the main oven for immediate heat. Special quick- and hot-burning coals, known colloquially as ‘bakers’ nuts’, were sold for use in these ovens (Petersen 1995, 46).

By the early 19th century bakeries made three classes of bread: white, which contained no bran; wheaten, which contained only the finer bran; and household, which was wholegrain (Trinder 1993, 125). Bakers in London and other large cities were regularly accused of adulterating their bread with such substances as bone-meal, chalk, lime, white lead and alum. Of these only alum, which was used to whiten bread and which is nowadays considered harmless for human consumption (being used as a preservative and in some baking powders), was confirmed by chemical analysis at the time (Sheppard and Newton 1957, 73). The above-mentioned Sale of Food and Drugs Act of 1875 outlawed these and other adulterations (Burnett 1966, 207).

Fig. 5.4.1. Initial suspicion regarding
the effects of Aerated Bread
(Punch 1860)

The mechanisation and industrialisation of the baking trade accelerated during the second half of the 19th century. The steam tube oven, patented in 1851, worked on the principle of a series of sealed tubes containing distilled water, the ends of which were heated in a furnace; the steam generated inside the tubes made their walls extremely hot. This type of oven became increasingly popular in England in the early part of the 20th century (Sheppard and Newton 1957, 112).

In 1860 Dr. John Dauglish invented Aerated Bread, made by a totally mechanised process using carbonic acid gas (carbon dioxide) instead of yeast to leaven the bread. While this reduced production time and increased output, the resulting lack of taste proved unpopular with the masses, though it found favour with some members of the middle class, who liked the hygienic production process as well as the bland flavour, believed to be beneficial for weak digestions (Petersen 1995, 77). Aerated Bread did in fact continue to be made until the 1980s (Aerated Bread Company (ABC) nd.[a]) (fig. 5.4.1.).

The International Exhibition of Flour Mill Machinery in London in 1881 gave further impetus to the development of factory-scale bakeries through the exhibition of such devices as mechanical mixers and dough-kneaders as well as the already mentioned steam tube ovens (Trinder 1993, 125-126). Gas ovens began to appear in bakeries from the 1880s (Sheppard and Newton 1957, 113), and by the end of the century most larger bakeries were using flour-sifters and draw-plate ovens as well as temperature-controlled ovens fuelled by gas or oil (Burnett 1995, 69). Legislation to improve working and sanitary conditions in bakeries contributed further to bringing the British baking industry closer to the factory system by the end of the 19th century (Burnett 1966, 106). By 1900 a few so-called plant bakeries, such as the famous J. Lyons & Co., were making bread in London and selling it both through grocers’ shops and through their own chains of retail bread shops and tea shops (Burnett 1995, 69).

In Wales, on the other hand, commercial bakeries did not become common in villages until the early part of the 20th century (Tibbot 2002, 101), and many bakers in both rural and urban areas continued to use the old brick ovens well into the first half of the century (Tibbot 2002, 110).

Factory baking experienced a particularly rapid expansion in the period 1945-1955, when bread prices were still controlled and thus only large-scale operations could make a profit (Sheppard and Newton 1957; Burnett 1995). During the 1950s the three big new bakery companies, J. Rank Ltd., Spillers Ltd. and the Canadian-based Allied Bakeries, secured a large share of the market by selling their products through grocery shops and dairies across the country, making it unnecessary for housewives to go to the bakery for bread (David 1977, 37). The British baking industry continued to consolidate rapidly following the abolition of bread price control in 1956 (Burnett 1995, 75).

In a development echoing the invention of Aerated Bread a century earlier, the Chorleywood Bread Process once again sped up baking in the 1960s, this time while using yeast (in fact large quantities of it): the maturing of the dough was replaced by a few minutes of intense mechanical mixing in high-speed machines (David 1977, 37). Today ca. 80% of bread sold in Britain is made by this process (Lawrence 2004; Morris 2010). Since the late 20th century many supermarkets have installed ‘live’ bakeries to lure customers with ‘crusty’ bread hot from the oven. This bread, however, is frequently delivered to the store par-baked and frozen and only heated up on site, and its taste and texture are generally much the same as those of the pre-sliced wrapped variety sold for considerably less (David 1977; Lawrence 2004). Most in-store bakeries which do not use pre-baked bread make theirs from ‘premixes’ measured out in a factory, thus avoiding the need to employ specially-skilled staff (Lawrence 2004, 105). Even most of the independent bakers in Britain use the same ‘premixes’, complete with additives, to make their bread (Lawrence 2004, 105); others survive by offering speciality breads at premium prices (Burnett 1995, 75).

Additives have been used in factory bread since the 1920s (Lawrence 2004, 106). Soya flour whitens bread, as alum did in the 19th century, and chemical oxidants, emulsifiers and hydrogenated fats are required to make the Chorleywood Bread Process work (Lawrence 2004, 107-108). Some of the more dangerous additives used in the past, such as chlorine compounds and potassium bromate, are now banned by law, but the major baking concerns are continually looking for new ones in their endeavour to create bread which can be quickly and cheaply produced and which keeps fresh for longer (Lawrence 2004, 108-109; 113). Current law also requires some of the nutrients lost through roller milling to be put back into white flour by millers; calcium is restored at almost four times the original level, but only some of the iron, vitamin B1 and vitamin B3 naturally present in wholemeal is added back (Lawrence 2004, 117).

Saturday 18 June 2011

Chapter 6: Bread consumption and diet

6. 1. Warm, fresh or stale

In the 16th century the physician Andrewe Boord advised his readers that hot bread was unwholesome, lying in the stomach “like a sponge”, and that bread should be kept for a day and a night before being eaten in order to be nutritious, while also cautioning against the consumption of old, musty or mouldy bread (Boord 1576). At the same time he appears to have been aware of the appetising smell of a freshly-baked loaf (Boord 1576). By the 18th century aroma and flavour had apparently won out over concerns about digestion and nutrition, with a large proportion of the bread sold in towns and cities being consumed while still warm, often in fact representing a family’s only warm weekday food (Petersen 1995, 67). This predilection for warm bread continued into the 19th century, despite writers such as Isabella Beeton echoing Boord’s advice from three centuries earlier (Beeton 1861, 784).

6. 2. The role of bread in the diet

The importance of bread in the everyday diet of British people over the last five centuries has varied according to a number of factors, including time, geography and social status. In some poor and remote areas, such as some of the Scottish islands, the cereal harvest up to the 19th century was not sufficient to last all year; on North Ronaldsay, Orkney, for instance, the amount of barley harvested was only enough to bake bread during the winter months, and people had to rely on fish and milk to get them through the rest of the year (Fenton 2007, 256). In most areas of Britain, however, as indeed in many other parts of Europe, a diet of ‘bread and relishes’, based on some type of bread as the staple with small amounts of other foods to enhance flavour and nutrition, was the norm for many centuries up to the 19th (Prentice 1950, 66). By 1770 wheat bread had become the main food of most British people (Petersen 1995, 4), and the poor in both urban and many rural areas were being accused (by the rich) of indulging in luxury and wastefulness by insisting on white bread over cheaper, coarser bread (Petersen 1995, 26). There were, however, a number of potential reasons for this apparent pickiness which would not necessarily have been obvious to those on the higher rungs of the social ladder. White wheat bread was more attractive in both appearance and taste to someone who lived on little else (Petersen 1995, 26), and unlike modern roller mills the stone-milling of the time did not break the bran into small particles, making it quite difficult to digest (Petersen 1995, 24). Furthermore, fibre carries a net nutritional cost: a proportion of the energy provided by wholemeal flour is used up in the digestion of the bran. Unlike modern dieters, people relying on bread as their main source of calories could ill afford such a loss (Petersen 1995, 25-26).

In the period 1770-1870 the majority of the British population spent the greatest part of their income on food, with bread constituting the largest single item in most families’ food budget (Petersen 1995, xiii-xiv). When the price of bread went up, as it periodically did, poorer people would continue to eat the same amount of it, cutting out other items in the budget to compensate (Petersen 1995, 5). Wheat remained the staple of the British up to the 1870s (Petersen 1995, 40), when factors such as the decline of domestic arable agriculture and the parallel growth of horticulture, the beginning of large-scale imports of cheap meat and wheat, and a gradual reduction in food taxes increased purchasing power across social classes, resulting in a lower cost of staples and enabling most people to afford a somewhat more diverse diet (Clark 2001, 94). Some earlier authors have suggested that the “new 19th century diet” miraculously enhanced people’s physical and mental abilities, eliminated premature old age and doubled their life expectancy (Prentice 1950, 69), and that increasing urbanisation encouraged social competition, leading to “more sophisticated” tastes and eating habits (Burnett 1966, 2) – a rather loaded term. Contemporary evidence is likely to at least slightly dampen such enthusiasm: records of the diet of Lancashire labourers in 1864 indicate that they lived mostly on bread, oatmeal, bacon, treacle, tea, coffee and very small quantities of butter (James 1997, 76), a diet neither very varied nor very healthy from a modern nutritional perspective. The situation did not improve when cheap jams came on the market in the 1880s; they were immediately popular, especially with low-income families, and many poor children began to live mostly on bread and jam.
Most of these ‘jams’ unfortunately contained very little of whatever fruit they were supposedly made from, instead consisting of cheap vegetable or fruit pulp mixed with colourings and copious amounts of sugar (James 1997, 76-77).

To whatever degree the British diet diversified in the latter part of the 19th century, average bread consumption still stood at 1 lb per head per day by the early 20th century (David 1977, 4). However, flour and bread consumption fell between 1918 and 1938 as an increase in real wages combined with a growth in more sedentary occupations, resulting in reduced calorie requirements as well as a more varied diet (Burnett 1995, 71). This trend was reversed temporarily during World War 2, when bread was not rationed but many other foods were, prompting a return to a more ‘bread and relishes’ type diet (Prentice 1950; Burnett 1966; Burnett 1995). During this period the wheatmeal content of bread was increased by law in order to provide the population with more iron and vitamin B in their diet (Keane 1997, 173). While bread was rationed between 1946 and 1948, this was largely observed only on paper (Sheppard and Newton 1957, 68) (fig. 6.2.1.).

Fig. 6.2.1. Protesters against
bread rationing
(Rohrer 2010)

The second half of the 20th century saw a marked decline in the consumption of bread and flour, due to a number of reasons including changing meal patterns, a rising standard of living and a movement away from cheap carbohydrate foods to more expensive proteins (Burnett 1966; Holderness 1985; Burnett 1995). During the 1950s and 1960s bread increasingly became a branded product; aided by the spread of supermarkets, self-service stores and television, Mother’s Pride, Sunblest and Wonderloaf became household names (David 1977; Hardyment 1995) (fig. 6.2.2.).

Fig. 6.2.2. Sunblest bread advertisement,
1950s
(Hardyment 1995, 132-133)

First sold in Britain in the early 1950s, sliced wrapped bread was not very popular initially, but soon caught on (Hardyment 1995, 36; 49). Market research carried out in the later 20th century revealed that people bought pre-sliced factory bread for its availability, convenience and hygiene; interestingly, but perhaps not surprisingly, there was no mention of taste (David 1977, 38). There was an increase in the consumption of brown breads, particularly in the 1980s and 1990s, due largely to their higher fibre content being promoted as healthier (Burnett 1995, 74), and traditional regional staples such as Scottish oatcakes, pre-packaged and sold in supermarkets and health food shops, have enjoyed something of a renaissance in recent years (Fenton 2007, 263). Nevertheless, a nationwide survey of 2000 adults commissioned by the Federation of Bakers (FoB) and the Flour Advisory Bureau (FAB) in 2007 shows that bread consumption in Britain is currently still dominated by white sliced bread (FoB and FAB nd., 2-3) (fig. 6.2.3.).

Fig. 6.2.3. Bread consumption
in Britain 2007
(FoB and FAB nd., 3)

6. 3. Bread as a social indicator

As early as the 16th century, and probably much earlier, there appears to have been a strong correlation between social status and the kind of bread baked and eaten in the home. White bread had played a significant role in urban areas of various parts of Europe since the Middle Ages (Petersen 1995, 31), and Andrewe Boord praised wheat bread for ‘making a man fat’ – a desirable outcome in the 16th century – and ‘setting him in temperance’ (Boord 1576). Wheat bread would seem to have been the most highly-regarded and sought-after bread. In 16th century Scotland wheat bread (home-baked or bought) was eaten by the higher classes, but rarely by the poor, wheat being largely a cash crop at the time (Fenton 1976, 163).

Gervase Markham in the 17th century argued that manchet bread, made from the finest white flour available at the time and frequently enriched with butter, eggs or milk, was the best and principal bread for “simple meals” – simple, that is, for his upper-class readership – but he also gave instructions on making rye bread, as well as a recipe for a “brown bread” described as the “coarsest bread for man’s use” and the kind of bread to be given to hind servants. Made from a mixture of barley, pease, malt and wheat or rye, it sounds particularly appetising when Markham advises his readers to use water as hot as possible to minimise the smell or rankness of the pease (Markham 1615, 269-270). In 17th century Wales oat- and barley bread were considered indicative of a lower standard of living than wheat bread (Tibbot 2002, xiii), while in Scotland by the same time the exclusive consumption of pease- or bean-bread had become limited to the poorer classes (Fenton 1976, 166). From around the middle of the 18th century more and more Scots began eating wheaten and white bread, including the ‘lower orders’, who consumed it alongside barley and oat cakes, and regular consumption of wheat bread soon became a status symbol (Fenton 2007, 214-216; 256). However, unleavened bread made with a mixture of pease- or bean meal and bere meal remained common in Scotland until the first half of the 19th century (Fenton 2007, 201), while oven-baked leavened wheat bread was one of the perks regularly provided to harvest labourers in the 18th century (Fenton 2007, 214). By the 19th century wheat bread had descended the social scale far enough that bakers’ shops became common in towns (Fenton 1976, 164).

Fig. 6.3.1. Home baking of
Scottish oatcakes, 1960s
(Fenton 2007, 7)

There was considerable regional variation in the types of bread baked and eaten. In areas such as northern England, northern Wales and parts of Scotland, where little wheat was grown or available, oatcakes were the staple well into the 20th century (Trinder 1993; Tibbot 2002) (fig. 6.3.1.). In other areas of Wales leavened barley bread was commonly baked under an inverted iron pot on a bakestone or griddle (Tibbot 2002, 90). 18th century eyewitnesses and historians commented on the general prevalence of oats and barley in the rural Welsh diet, with wheat bread not regularly consumed by the lower classes until the latter part of the 19th century (Tibbot 2002, 79-80). It was only following a dramatic drop in the price of wheat in the 1880s that the consumption of oat and barley bread in Wales declined significantly (Tibbot 2002, 80). Bread’s significance as a status delineator, however, endured for somewhat longer: in the late 19th and early 20th century it was usual for a Welsh farmer and his family to eat white bread every day, while farm servants were fed mostly on barley- or mixed bread, being given one slice each of white bread as a Sunday treat (Tibbot 2002, 2).

While white bread was being strongly associated with higher social standing, a trend in the opposite direction took place from the 18th century onwards, when the digestive benefits of brown bread began to be appreciated by some members of the middle classes (Petersen 1995, 36). Its health benefits were promoted by 19th century food writers (Beeton 1861, 782), and by the interwar period the wealthiest classes consumed far more brown bread than the working classes, in a complete reversal of the pattern of previous centuries (Burnett 1995, 71). Regional traditions endured in some areas into the second half of the 20th century; Scottish oatcakes, for example, remained the staple, at least in rural areas, until the 1950s or 1960s, being called breid in some parts, while (usually white) wheat bread from the baker’s was referred to as loaf (Fenton 2007, 7). Generally speaking, however, by the 1970s genuine wholemeal bread had become as inaccessible to most people, due to availability and/or price, as fine white bread had been in the 16th century (David 1977, 35). By the end of the 20th century overall bread consumption was lowest among the wealthiest section of British society and highest among Old Age Pensioners, due, it has been argued, to their relative poverty and tendency to have more meals at home (Burnett 1995, 74). Most brown and wholemeal bread was eaten by the wealthiest, followed, perhaps surprisingly, by Old Age Pensioners – possibly motivated by its health benefits (Burnett 1995, 74). Overall, bread accounted for an average 1/18th of household food expenditure and 1/10th of calories consumed, with both figures higher in low-income households (Burnett 1995, 75). It appears that while bread may no longer be as overtly linked to social status as in previous centuries, there is still a strong correlation between how much and what kind of bread one eats and one’s place in modern British society.