Is Food Fortification Necessary? A Historical Perspective

More than two billion people worldwide suffer from micronutrient deficiencies because they do not meet their daily dietary requirements for essential vitamins and minerals. These deficiencies not only affect an individual's long-term health but can also raise societal and public health care costs and potentially depress a nation's economic productivity (WHO/FAO). It may be difficult for those living in developed countries to remember that diseases such as goiter, rickets, beriberi, and pellagra were common health problems in the early 20th century.

Today, these diseases are rarely seen, thanks to a series of food fortification programs that helped stave off a multitude of nutrient deficiencies. According to the World Health Organization and the Food and Agriculture Organization of the United Nations, food fortification is the practice of deliberately increasing the content of an essential micronutrient in a food so as to improve the nutritional quality of the food supply and provide a public health benefit with minimal risk to health.

This article, the first in a series on food fortification to be published in the Food Insight newsletter over the next several months, delves into the historical origins of food fortification in the U.S. and provides some unique insights into its contribution to improved public health.

Salt of the Earth
During the 1921 American Medical Association (AMA) convention, two Ohio doctors presented findings from their clinical trial demonstrating the effectiveness of sodium iodide treatments for the prevention of goiter in Akron schoolgirls. Prior to their study, research from Europe had also suggested an association between iodine deficiency and goiter (enlarged thyroid gland). It was found that without iodine, the body could not properly synthesize thyroid hormones, which often resulted in an unsightly goiter or, in more serious cases, mental retardation. Iodine deficiency generally occurs in areas where the soil has been depleted of iodine because of flooding, heavy rainfall, or glaciation.

Shortly after the publication of these results, Michigan became the first state to institute a public campaign to provide dietary iodine via salt. An extensive educational campaign involving schoolteachers, industry, and the medical and public health communities helped increase consumer awareness of and demand for iodized salt, so that by 1924 iodized salt was commonplace, even though iodization was never mandatory. Epidemiological studies following implementation documented a significant decline in the incidence of goiter, confirming the success of the program (Bishai and Nalubola, 2002). Most table salt continues to be fortified with iodine today.

Milk: Delivering Your Daily Vitamin D
In the early 20th century, rickets (soft bones and skeletal malformation from incomplete bone growth) was common among underprivileged children living in industrialized, northern U.S. cities. Inadequate diet, poor hygiene, and lack of exercise were among the factors believed to play a role in the development of the disease. The relationship between diet and rickets was not clearly understood until an English physician conducted the first experimental study of rickets in dogs. His observations of specific "anti-rachitic" factors found in cod liver oil, butter, and whole milk eventually led to the identification, purification, and synthesis of vitamin D (Rajakumar, 2003). Prior to this discovery, irradiated milk, cod liver oil, and milk from cattle fed irradiated yeast were common treatments to combat rickets (Bishai and Nalubola, 2002). Once synthetic vitamin D became available, these treatments were no longer necessary.

Today we know that our bodies can produce vitamin D when skin cells are exposed to sunlight, making the sun our major source of vitamin D. However, because of concerns about skin cancer, there is a need for dietary sources of vitamin D as well. The Food and Drug Administration (FDA) established a standard of identity (SOI) for milk, which includes the optional addition of vitamins A and D (with the exception of evaporated milk). (Note: Certain foods have SOIs, which are legal definitions of a food or food product that specify exactly what can and cannot be added to the food and in what amounts.) Today, the majority of our milk is fortified with vitamin D. However, other food sources of vitamin D are limited, so obtaining vitamin D solely through the diet can be challenging; in fact, many Americans fall short of the daily requirement for vitamin D. Naturally occurring sources are limited mostly to oily fish (e.g., salmon, mackerel, and sardines) and cod liver oil. Besides milk, select foods such as cereals and orange juice may be fortified with vitamin D. Supplements may also be necessary and are readily available. Because fortification may vary by brand rather than applying to all items within a food category, it may be helpful to check the Nutrition Facts Panel.

Grains:  Serving Our Nation
By 1938, the AMA Council on Foods and Nutrition endorsed the addition of nutrients to foods if there was sufficient scientific justification that doing so would improve public health. During this time, the American diet relied heavily on refined flours, but processing flour decreased the availability of essential B vitamins. As a result of niacin deficiency, pellagra (described by "the four D's": diarrhea, dermatitis, dementia, and death) was widespread in southern states. To a lesser degree, beriberi (a neurological disorder) from thiamin deficiency and riboflavin deficiency (presenting as redness, swelling, and cracking of the mouth and tongue) also persisted. In response, bakers began to add high-vitamin yeasts to their breads, followed by synthetic B vitamins as they became available. By the end of 1942, 75 percent of white bread was fortified with thiamin, niacin, iron, and riboflavin, helping to alleviate these conditions. In 1942, the U.S. Army agreed to purchase only enriched flour for its soldiers in an effort to improve the health of recruits, which created additional demand for enrichment of the nation's white bread (Bishai and Nalubola, 2002).

After World War II, the FDA decided not to mandate enrichment during the development of the SOI for bread. Rather, it established two different SOIs, one for enriched and one for non-enriched products. In the ensuing years, SOIs for other grain products such as corn meal, rice, noodles, and macaroni have also been established.

In 1998, based on recommendations from the U.S. Public Health Service, the Centers for Disease Control and Prevention (CDC), and others, folic acid fortification became mandatory for enriched grain products after several studies found folic acid supplementation to be beneficial in reducing the occurrence of neural tube defects in newborns (Honein et al., 2001). According to the CDC, folic acid fortification of grain products reduced the incidence of neural tube defects by one-third between 1995 and 2002 (http://www.marchofdimes.com/22663_25314.asp). Still, there is ongoing debate about whether current folic acid fortification levels are too high or whether fortification is even necessary.

Food fortification has evolved tremendously over the years and has expanded to include a wide variety of vitamins, minerals, essential amino acids, and proteins intended to maintain and improve health beyond basic nutritional needs. The next article in this series will explore the issues associated with current fortification programs, including their impact on public health.