Why nutritional iron deficiency persists as a worldwide problem

J Nutr. 2011 Apr 1;141(4):763S-768S. doi: 10.3945/jn.110.130609. Epub 2011 Mar 2.

Abstract

The earliest studies of food iron absorption employing biosynthetically incorporated radioisotopes were published in the 1950s. Wheat flour has been fortified with iron in Canada, the United Kingdom, and the United States since the 1940s. However, half a century later, nutritional iron deficiency (ID) is estimated to affect 1.5-2 billion people worldwide. The reasons for the apparently limited impact of health and nutrition policies aimed at reducing the prevalence of ID in developing countries are complex. They include uncertainty about the actual prevalence of ID, particularly in regions where malaria and other infections are endemic; failure of policy makers to recognize the relationships between ID and both impaired productivity and increased morbidity; concerns about safety and the risks to iron-sufficient individuals if mass fortification is introduced; and technical obstacles that make it difficult to add bioavailable iron to the diets of those at greatest risk. It is, however, likely that the next decade will see a marked reduction in the prevalence of ID worldwide. More specific assessment tools are being standardized and applied to population surveys. The importance of preventing ID during critical periods of the life cycle is receiving increased attention. Innovative approaches to the delivery of bioavailable iron have been shown to be efficacious. The importance of integrating strategies to improve iron nutrition with other health measures, and with economic and social policies addressing poverty as well as trade and agriculture, is receiving increasing consideration.

Publication types

  • Review

MeSH terms

  • Dietary Supplements
  • Ferritins / blood
  • Food, Formulated
  • Humans
  • Iron / administration & dosage
  • Iron Deficiencies*
  • Receptors, Transferrin / blood

Substances

  • Receptors, Transferrin
  • Ferritins
  • Iron