Foodborne Illness: The History of an Invisible Enemy

The history of foodborne illness is as complex and tortuous as the history of eating. Since the very beginning, foodborne illness has been a perpetual hitchhiker in our journey with food. With every human advancement in eating and acquiring food, foodborne illness has been ready with a challenge, finding new ways to survive in changing environments. Bacteria’s tenacious flagella have withstood numerous developments in our diet and continue to plague our lives today.


Humans discover fire.


The food rules for early man were simple: eat what you can get. Lacking discerning palates, these opportunistic hunters likely consumed contaminated meat, poisonous mushrooms and indigestible grains. The meat from abandoned carcasses? On the menu. The sickest, weakest animals? A quick and easy appetizer. The variety of microbes they frequently ingested has been preserved for posterity in the form of coprolites (fossilized feces). These generous deposits give us a glimpse of their diet as well as the pathogenic organisms therein.[1]

The discovery of fire sparked a series of advancements in the human diet. As history progressed, new benefits presented new challenges, creating a game of microbial tug-o-war that continues in the present day.


Ancient Egypt grain harvesting.


Challenge: The quiet magic of crop cultivation meant humans could stop moving with the herd and put down roots – with consequences. The newfound proximity to neighbors enabled disease to ravage an entire community with speed and precision. Curious animals attracted to human settlements enriched life with milk, cheese and animal fat for cooking.[2] However, they also introduced humans to a host of previously unencountered pathogens, rupturing the symbiotic bliss between people and their animal companions. Irrigation was brilliant for growing crops, but it was also brilliant for spreading human and animal feces. Communicable disease, often transmitted through food, flourished in these new environments.


Challenge: It was the pinnacle of luxury to have enough food that storage was necessary. People were dazzled by the concept of extra food, but they had zero knowledge of how to store it properly to avoid spoilage, mold and contamination. Novice food hoarders were exposed to mysterious foodborne illnesses like ergotism, most famously associated with the Salem Witch Trials.



Ergot (Claviceps purpurea) is a fungus common on grasses, cereal crops and rye.[3] It’s also the source of lysergic acid, the precursor of lysergic acid diethylamide (LSD).[4] When ingested, ergot causes ergotism, with symptoms like spasms, fits, hallucinations and prickly sensations in the limbs. Researchers Linnda Caporael and Mary K. Matossian have linked ergotism to the Salem Witch Trials, hypothesizing that it could have been transmitted unknowingly through contaminated rye that the young girls, the original accusers, worked with to make bread.[5] So it turns out the culprit in Salem may have been the devil inside…inside the intestines.


Challenge: Culinary creativity was born not of necessity but of abundance. With food becoming more plentiful, people were able to get creative with food preparation, experimenting with different cooking techniques and exotic spices introduced by the budding international trade system. Spices were a delight to medieval eaters, who had never before worried about food being delicious so much as existent; they could even make spoiled meat palatable again! Spices masked the unpleasant tastes that could betray a food’s quality, and people feasted away, allowing illness to spread. The legacy of those flavor sensations lives on without the need to conceal spoiled meat. Nutmeg was so popular that it still enhances the flavor of meat today – especially hot dogs.[6]


Excrement, waste, dung, poop. Whatever you call it, human waste is a fact of life. It’s also a primary carrier of foodborne illness. It’s no surprise that it has played a critical role in the development, study and control of foodborne illness and other disease.

Roman toilets. Part of the Roman sewage system.


The Romans had a sophisticated sewage system that unfortunately degenerated along with the empire. At this low point, people flung their waste out of chamber pots and into the streets, a habit that persisted for hundreds of years across cultures. The consequences manifested in 19th-century London. The city was filthy; human and animal feces festered in its every crevice. Newly popular flush toilets made waste disappear like magic…and reappear in the River Thames. Matters escalated until the “Great Stink” of 1858 made the need for a proper sewage system undeniable, and the Metropolitan Board of Works responded with a network of sewers. It was apparent that more work was needed in the art of human waste disposal to avoid fecal pollution of the city’s main water source.[7] Today we’re lucky to pretend our waste really does disappear, but many people have worked very hard to inspire that illusion.


Challenge: Discovering the causes of disease was as illuminating for public health as the first telescope was for the stars. Making connections between diseases and their causes paved the way for preventative measures and treatments, changing the way the world viewed disease. We have these early pioneers to thank for our current microbiological knowledge:

  • In 1847, Ignaz Semmelweis showed that the mortality rate of women in childbirth was tied to a lack of physician hand hygiene.[8]
  • In 1848, Dr. John Snow traced the transmission of cholera to contaminated water.[9]
  • In 1870, Louis Pasteur demonstrated a solid connection between microorganisms, spoilage and disease.[10]

These discoveries revealed an invisible realm of bacteria more complicated than anyone had ever imagined. The first steps in the field of microbiology only scratched the surface. The science of understanding, tracking and controlling disease remains an ever present challenge today.


The GE Monitor Top Refrigerator of the 1930s.


Challenge: Scientific studies in the 1800s confirmed that refrigeration inhibits the growth of microbes. People had been using primitive means to chill their food for years; it was time for an upgrade in technology. By World War I, home refrigerators existed, but they were exorbitantly priced and often used toxic chemicals as coolants. By World War II, home refrigerators and freezers were widely available.[11] With them, our eating habits boomed right along with the economy.

Industrial refrigeration was game-changing for the global food trade. In 1882, William Soltau Davidson pioneered a method to refrigerate an entire ship, allowing perishable food to travel long distances without spoiling.[12] Along with sophisticated new transportation systems, this development led to an enormously complex international food supply. Mitigating foodborne illness in today’s global food system remains a struggle because of the system’s size and intricacy. Look at the label of the next thing you eat and imagine that every ingredient was sourced from a different location.

Moving Forward

We’ve come a long way from scavenging the scraps of other predators for food. Modern technology has rendered seasonality a non-issue, ensured we have plenty of food and increased the variety of foods we enjoy from all corners of the globe.[13] However, the bountiful sources and touch points of our food supply create ample opportunities for the contamination of food and the proliferation of foodborne illness, despite our extensive knowledge of disease. If we’ve learned one thing from the history of foodborne illness, it’s that microbes are survivors as steadfast as we are. In Fundamental Food Microbiology, Bibek Ray puts it plainly:

History suggests that there will probably always be new pathogens, and thus, as we develop methods to control existing pathogens, we have to remain alert for new ones. [14]

Fortunately, our past successes, challenges and failures with foodborne illness have prepared us to meet the obstacles that arise. Standing on a strong foundation built in the past, we’ll welcome future food challenges with determination.

Author: Ashley Bell is a full-time nonprofit outreach and program manager and part-time history detective. She likes to look to the past to explain where we are today.

[1] M. Satin. (2014). History of Foodborne Disease – Part 1 – Ancient History. Encyclopedia of Food Safety (Internet). Retrieved 2014 May 6 from:

[2] Gerald T. Keusch. (Sept 2013). Perspectives in Foodborne Illness. Foodborne Illness: Latest Threats and Emerging Issues (Internet). Retrieved 2014 May 4 from:

[3] R. Early. (2009). Pathogen Control in Primary Production: crop foods. Foodborne Pathogens: Hazard, Risk Analysis, and Control, Second Edition (Internet). Retrieved 2014 May 4 from:

[4] Cynthia A. Roberts. (2001). The Food Safety Information Handbook. Oryx Press (Internet). Retrieved 2014 May 6 from:

[5] Ibid.

[6] M. Satin.

[7] Gerald T. Keusch.

[8] Cynthia A. Roberts.

[9] Ibid.

[10] Ibid.

[11] Gerald T. Keusch.

[12] Ibid.

[13] Ibid.

[14] Bibek Ray. (2005). Fundamental Food Microbiology (Third Edition). CRC Press (Internet). Retrieved 2014 May 7 from:

