Study reveals 80% of pregnant women affected by iron deficiency in third trimester

Researchers find that over 80% of women are iron deficient by the third trimester despite showing no anaemia in early pregnancy, even in a high-resource setting, highlighting the need for improved screening and treatment protocols.

Iron deficiency during pregnancy has long been a concern for healthcare providers, but a new study published 26 September 2024 in The American Journal of Clinical Nutrition [1] reveals the problem may be far more widespread than previously thought. The research, conducted in Ireland, found that despite the absence of anaemia in early pregnancy, an overwhelming majority of women become iron deficient by their third trimester.

The study, led by Elaine K. McCarthy and colleagues, followed 629 primiparous women with low-risk, singleton pregnancies throughout their gestational period. The researchers measured iron biomarkers at 15, 20, and 33 weeks of pregnancy. Their findings paint a stark picture of iron status deterioration as pregnancy progresses.

Using a serum ferritin threshold of <30 μg/L to define iron deficiency, the researchers found that prevalence rose dramatically from 20.7% at 15 weeks to 43.7% at 20 weeks, and to a staggering 83.8% by 33 weeks. Even when using a more conservative ferritin threshold of <15 μg/L, which is associated with compromised foetal iron accretion, over half of the women (51.2%) were iron deficient by their third trimester.

Dr McCarthy noted: “In this high-resource setting, iron deficiency defined by a variety of biomarkers and thresholds, was very common during pregnancy, despite the cohort profile as generally healthy.”

Implications for maternal and infant health

The consequences of maternal iron deficiency extend beyond the mother’s health. Previous research has linked iron deficiency during pregnancy to adverse outcomes for both mother and child, including postpartum depression, postpartum haemorrhage, preterm birth, low birth weight, and small-for-gestational-age birth. Moreover, even without anaemia, maternal iron deficiency can compromise foetal iron accretion in utero, potentially leading to long-term neurodevelopmental challenges for the child.

The study’s findings underscore the inadequacy of current screening practices, which often rely solely on haemoglobin levels to detect anaemia. Notably, none of the study participants were anaemic in the first trimester, yet the vast majority developed iron deficiency as their pregnancies progressed.

Need for improved screening protocols

The researchers argue for a more comprehensive approach to iron status assessment during pregnancy. They propose a threshold of ferritin <60 μg/L at 15 weeks of pregnancy as a predictor of iron deficiency (defined as ferritin <15 μg/L) at 33 weeks. This early indicator could help identify women at risk of developing iron deficiency later in pregnancy, allowing for timely intervention.

“Women should be screened early in pregnancy for iron status, with a suggested target ferritin concentration of >60 μg/L,” the authors recommend.

Factors influencing iron status

The study also examined various factors that might influence iron status during pregnancy. Interestingly, maternal obesity did not significantly affect iron deficiency rates. However, smoking in early pregnancy was associated with a trend towards lower ferritin levels.

One promising finding was the protective effect of iron-containing supplements. Women who took such supplements either before or during early pregnancy had significantly lower rates of iron deficiency throughout their pregnancies. This suggests a potential prophylactic role for multivitamin supplements containing iron, even at lower doses than typically prescribed for iron deficiency treatment.

Inflammation and iron status

The researchers also investigated the impact of inflammation on iron biomarkers during pregnancy. They found that traditional thresholds used to identify inflammation in non-pregnant populations may not be suitable during pregnancy. The most pronounced effects on iron biomarkers were observed in women with C-reactive protein (CRP) levels >10 mg/L, suggesting that current inflammation thresholds may need to be re-evaluated for pregnant women.

Call for action

In an accompanying editorial, Drs Michael Auerbach and Helain Landy describe the study’s findings as “overwhelming” and call for a change in approach to diagnosing and treating iron deficiency during pregnancy. They argue that the current guidelines from organisations such as the United States Preventive Services Task Force, which do not recommend routine screening for iron deficiency in the absence of anaemia, are inadequate in light of these new findings.

The editorial authors state: “We call for American College of Obstetricians and Gynecologists and the USPSTF to screen all pregnant females for iron deficiency, irrespective of the presence or absence of anaemia, and recommend supplementation when present.”

Conclusion

This groundbreaking study highlights the urgent need for improved iron deficiency screening and treatment protocols during pregnancy, even in high-resource settings. By identifying at-risk women early and providing appropriate interventions, healthcare providers may be able to mitigate the potentially serious consequences of iron deficiency for both mothers and their infants.

Reference:
  1. McCarthy, E. K., Schneck, D., Basu, S., et al. (2024). Longitudinal evaluation of iron status during pregnancy: a prospective cohort study in a high-resource setting. The American Journal of Clinical Nutrition. https://doi.org/10.1016/j.ajcnut.2024.08.010