Maaike J Bruins, Gladys Mugambi, Janneke Verkaik-Kloosterman, Jeljer Hoekstra, Klaus Kraemer, Saskia Osendarp, Alida Melse-Boonstra, Alison M Gallagher, Hans Verhagen.
Abstract
Fortification of foods consumed by the general population, or of specific food products or supplements designed for vulnerable target groups, is among the strategies used in developing countries to address micronutrient deficiencies. Any strategy aimed at dietary change needs careful consideration, ensuring that the needs of at-risk subgroups are met whilst ensuring safety within the general population. This paper reviews the key principles of two main assessment approaches that may assist developing countries in deciding on effective and safe micronutrient levels in foods or special products designed to address micronutrient deficiencies: the cut-point method and the stepwise approach to risk-benefit assessment. In the first approach, the goal is to shift population intake distributions such that the prevalence of intakes below the Estimated Average Requirement (EAR) and the prevalence of intakes above the Tolerable Upper Intake Level (UL) are both minimized. However, for some micronutrients, such as vitamin A and zinc, the margin between the EAR and the UL is narrow. Increasing their intakes through mass fortification may therefore pose a dilemma: not permitting the UL to be exceeded provides assurance of safety within the population but can leave a proportion of the target population with unmet needs, or vice versa. Risk-benefit approaches assist decision making across different micronutrient intake scenarios by balancing the magnitude of the potential health benefits of reducing inadequate intakes against the health risks of excessive intakes. Risk-benefit approaches consider different aspects of health risk, including severity and the number of people affected. This approach reduces the uncertainty for policy makers compared with classic cut-point methods.
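The cut-point logic described above can be sketched numerically: estimate the fraction of a usual-intake distribution below the EAR and above the UL, before and after a fortification-driven shift. All numbers here (EAR, UL, distribution parameters, added amount) are hypothetical illustrations, not values from the paper.

```python
import numpy as np

# Hypothetical reference values (arbitrary units) for illustration only.
EAR = 45.0   # Estimated Average Requirement
UL = 150.0   # Tolerable Upper Intake Level

def cutpoint_prevalences(intakes, ear=EAR, ul=UL):
    """Return (fraction below EAR, fraction above UL) for an intake sample.

    This is the EAR cut-point method in its simplest form: prevalence of
    inadequacy is approximated by the proportion of usual intakes below
    the EAR; the proportion above the UL flags potential excess.
    """
    intakes = np.asarray(intakes, dtype=float)
    return float(np.mean(intakes < ear)), float(np.mean(intakes > ul))

rng = np.random.default_rng(0)
# Hypothetical right-skewed usual-intake distribution for a population.
usual = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=100_000)
# Mass fortification modeled crudely as a uniform upward shift in intake.
fortified = usual + 30.0

for label, dist in [("baseline", usual), ("fortified", fortified)]:
    p_low, p_high = cutpoint_prevalences(dist)
    print(f"{label}: {p_low:.1%} below EAR, {p_high:.1%} above UL")
```

The dilemma for narrow-margin nutrients shows up directly: the shift that lowers the below-EAR prevalence also raises the above-UL prevalence, and a cut-point rule gives no guidance on how to trade one against the other.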
Keywords: cut-point method; food fortification; nutrient reference values; public health; requirements; risk–benefit assessment
Year: 2015 PMID: 25630617 PMCID: PMC4309831 DOI: 10.3402/fnr.v59.26020
Source DB: PubMed Journal: Food Nutr Res ISSN: 1654-661X Impact factor: 3.894
Fig. 1. The risks of adverse health effects from decreasing intakes and the risks of adverse health effects with increasing intakes. The Estimated Average Requirement (EAR) reflects the intake at which 50% of a population group is at risk of inadequacy, whereas the Tolerable Upper Intake Level (UL) is set an uncertainty factor lower than the No Observed Adverse Effect Level (NOAEL) or Lowest Observed Adverse Effect Level (LOAEL). The Recommended Nutrient Intake (RNI) is set at two standard deviations above the EAR and reflects the intake level at which 2.5% of a population group is at risk of inadequacy.
Fig. 2. Individual micronutrient intakes from one food A fortified at 100% relative level (left) or micronutrient intakes from four foods A, B, C, and D fortified at 25% relative level (right). Scenarios include consumption of no fortified food A (top), one portion of fortified food A (middle), or two portions of fortified food A (bottom). Top: when no food A is consumed, the EAR would not be met with only food A fortified, but would be met with foods A, B, C, and D fortified even at 25% level. Bottom: when two portions of food A are consumed, the UL would be exceeded with only food A fortified, but the UL would not be exceeded with foods A, B, C, and D fortified.
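The arithmetic behind the Fig. 2 scenarios can be sketched as follows. The same total fortificant amount is carried either by food A alone (100%) or spread equally over four foods A-D (25% each), and a consumer is assumed to eat one portion each of the other carrier foods regardless of how much food A they eat. All quantities are invented for illustration and are not taken from the paper.

```python
# Hypothetical quantities (arbitrary units), chosen only to reproduce the
# qualitative pattern of Fig. 2.
BASE_INTAKE = 40.0    # habitual intake from the unfortified diet
ADDED_TOTAL = 100.0   # total fortificant delivered across all carrier foods
EAR, UL = 60.0, 180.0 # hypothetical reference values

def total_intake(portions_of_A, n_carriers):
    """Intake when the fortificant is split equally over n_carriers foods.

    Assumes one portion each of the other (n_carriers - 1) carrier foods
    is always consumed, while consumption of food A varies (0, 1, or 2
    portions).
    """
    per_food = ADDED_TOTAL / n_carriers
    other_portions = n_carriers - 1
    return BASE_INTAKE + per_food * portions_of_A + per_food * other_portions

for portions in (0, 1, 2):
    single = total_intake(portions, n_carriers=1)
    spread = total_intake(portions, n_carriers=4)
    print(f"{portions} portion(s) of A: single-food {single:.0f}, "
          f"spread over four foods {spread:.0f}")
```

With these toy numbers the pattern of the figure emerges: a non-consumer of food A misses the EAR under single-food fortification but meets it when four foods carry 25% each, while a heavy consumer of food A exceeds the UL only under single-food fortification.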