January 22, 2003
Food Fortification Spurred By Military Purchases
Researchers Conduct Comprehensive Review of Food Fortification History in the U.S.
Food fortification with vitamins and minerals is one of the most effective methods to improve health and prevent nutritional deficiencies. It is largely responsible for the virtual eradication of diseases such as goiter, rickets, beriberi, and pellagra in the United States. New research from the Johns Hopkins Bloomberg School of Public Health suggests that developing nations could implement successful food fortification programs by requiring fortified foods for their military personnel. The conclusions are based on a detailed review of the history of food fortification programs in the United States, published in the January 22, 2003, edition of the journal Economic Development and Cultural Change.
“Food fortification in the U.S. was accomplished with a great deal of cooperation between the food industry and public health forces. Historically, food producers have been eager to supply fortified food once it was shown to be profitable,” said David Bishai, PhD, co-author of the review and assistant professor of population and family health sciences at the Johns Hopkins Bloomberg School of Public Health. “Many of the industrial and market forces in the U.S. do not apply to developing nations, but our research shows that governments can take steps to encourage manufacturers to fortify food for the public. One way may be to have military purchasers demand only fortified products.”
Dr. Bishai and co-author Ritu Nalubola examined the major waves of food fortification in the United States, including the iodization of salt in the 1920s, the fortification of milk with vitamin D in the 1930s, the enrichment of flour and bread in the 1940s, and the widespread addition of calcium to a variety of products beginning in the 1980s. For salt, milk, and bread, fortification was accomplished by establishing the health benefits through scientific research and enlisting the support of food manufacturers. In most cases, manufacturers found the measures profitable after appealing to consumers with advertising and public service campaigns, and widespread compliance was often achieved through market demand rather than government mandates. Governments, however, are often large food purchasers and can use that position to influence industry.
However, the researchers noted that efforts to enrich bread and flour were particularly slow because there was little public interest or economic incentive. The situation did not change significantly until World War II, when Britain began to manufacture only enriched flour and started a public campaign to improve the nation’s health during wartime. Despite similar patriotic campaigns in the United States, only 40 percent of the nation’s manufactured flour was enriched, because smaller companies continued to produce cheaper unenriched flour to compete with larger manufacturers. In 1942, the U.S. Army decided it would purchase only enriched flour. The move encouraged many more manufacturers to produce enriched flour, but 100 percent compliance was not reached until 1943, when the War Foods Administration temporarily required enriched bread. Today, the Food and Drug Administration does not restrict the sale of unenriched products as long as they are properly labeled, but most flour remains enriched with B vitamins, iron, and folate because of consumer demand.
“The flour enrichment efforts during World War II can be an important model for developing nations attempting to build successful food fortification programs,” said Dr. Bishai.
“History of Food Fortification in the United States: Its Relevance for Current Fortification Efforts in Developing Countries” was written by David Bishai and Ritu Nalubola.
The research was funded by the Hopkins Population Center at the Johns Hopkins Bloomberg School of Public Health.

Public Affairs Media Contacts for the Johns Hopkins Bloomberg School of Public Health: Kenna Brigham or Tim Parsons @ 410-955-6878 or email@example.com