Iron deficiency ‘very common’ among healthy, low-risk pregnant women

September 27, 2024

3 min read


Key takeaways:

  • More than 80% of pregnant women in Ireland were iron deficient by their third trimester.
  • The findings highlight the importance of screening for iron deficiency, even in high-resource settings.

Approximately four in five healthy pregnant women in Ireland were iron deficient by their third trimester, despite none of the women presenting as anemic during the first trimester, according to a large longitudinal analysis.

International guidelines vary on how to screen for, prevent and treat iron deficiency during pregnancy, and a lack of consensus on the biomarkers and thresholds used to define iron deficiency complicates comparing findings across studies, according to Elaine K. McCarthy, PhD, lecturer in nutrition at University College Cork, Ireland, and colleagues. Iron deficiency during pregnancy is associated with neurodevelopmental challenges for the infant, as well as complications for the mother.




More than 80% of pregnant women in Ireland were iron deficient by their third trimester. Image: Adobe Stock.

“Iron deficiency was very common among our cohort of generally healthy, low-risk first-time mothers, with four out of five women iron deficient in the third trimester,” McCarthy told Healio. “Iron deficiency during pregnancy can have lasting health consequences for both a mother and her baby, therefore prevention should be prioritized.”

Researchers analyzed data on iron levels and inflammatory markers for 629 primiparous women with low-risk, singleton pregnancies living in Ireland (98.2% white). Iron biomarkers (ferritin and soluble transferrin receptor) and the inflammatory markers C-reactive protein and alpha-1-acid glycoprotein were measured at 15, 20 and 33 weeks’ gestation. Researchers excluded women with anemia, defined as a hemoglobin level less than 110 g/L, at their first routine antenatal visit. They then assessed longitudinal changes in iron biomarkers across pregnancy and the prevalence of iron deficiency.

The findings, conducted in collaboration with the University of Minnesota and the Masonic Institute for the Developing Brain, were published in The American Journal of Clinical Nutrition.

Within the cohort, 73.6% of women reported taking an iron-containing supplement prepregnancy, during their first trimester or both; most were multivitamins containing iron at the European recommended daily allowance of 15 mg to 17 mg.

Researchers found that the prevalence of iron deficiency, defined as a ferritin level less than 15 µg/L, increased throughout pregnancy, from 4.5% at 15 weeks’ gestation, to 13.7% at 20 weeks’ gestation, to more than half of pregnant women (51.2%) by 33 weeks’ gestation.

When defining iron deficiency as a ferritin level of less than 30 µg/L, rates of deficiency rose to 20.7%, 43.7% and 83.8% at 15, 20 and 33 weeks’ gestation, respectively.

Using a soluble transferrin receptor level of more than 4.4 mg/L as a cutoff, the prevalence of iron deficiency was 7.2% at 15 weeks, 12.6% at 20 weeks and 60.9% at 33 weeks, rates that were similar to the prevalence data for a ferritin cutoff of less than 15 µg/L, according to the researchers.

“Given the challenges with ferritin, the search continues for other biomarkers that reflect the early stages of iron deficiency,” the researchers wrote. “We documented some agreement between soluble transferrin receptor level of > 4.4 mg/L and ferritin of < 15 µg/L as deficiency definitions. Previous investigations have shown the usefulness of soluble transferrin receptor level in the pregnant population, but further consideration of the appropriate threshold in this population is still needed.”
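The article does not quantify the “some agreement” between the two deficiency definitions. As a rough illustration only, the sketch below classifies simulated third-trimester measurements by both cutoffs and computes percent agreement and Cohen’s kappa; everything except the two published thresholds (ferritin < 15 µg/L, soluble transferrin receptor > 4.4 mg/L) is an assumption, and the data are simulated rather than the study’s.

```python
# Sketch: comparing two binary definitions of iron deficiency on simulated data.
# The thresholds come from the article; the data and relationship are invented.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
n = 500
ferritin = rng.lognormal(mean=2.8, sigma=0.7, size=n)          # µg/L at 33 weeks (simulated)
stfr = 6.0 - 0.6 * np.log(ferritin) + rng.normal(0, 0.8, n)    # mg/L, loosely anti-correlated (simulated)

deficient_by_ferritin = ferritin < 15     # ferritin cutoff from the article
deficient_by_stfr = stfr > 4.4            # soluble transferrin receptor cutoff from the article

agreement = np.mean(deficient_by_ferritin == deficient_by_stfr)
kappa = cohen_kappa_score(deficient_by_ferritin, deficient_by_stfr)
print(f"Percent agreement: {agreement:.1%}, Cohen's kappa: {kappa:.2f}")
```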

Researchers also sought to propose an early pregnancy iron status cutoff that could predict iron deficiency in the third trimester. Using a cut point analysis with an area under the curve of 0.75, a ferritin level of less than 60 µg/L at 15 weeks’ gestation was the best predictor of iron deficiency at 33 weeks, the researchers wrote.
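The article describes the cut point analysis only at a high level. The sketch below shows one common way such a threshold can be derived, using ROC analysis and Youden’s J statistic on simulated data with scikit-learn; the variable names, simulated values and choice of Youden’s J are assumptions for illustration, not the authors’ actual method or data.

```python
# Sketch: choosing an early-pregnancy ferritin cutoff that predicts third-trimester deficiency.
# Data are simulated; Youden's J is one common cut-point criterion, assumed here for illustration.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
ferritin_15wk = rng.lognormal(mean=3.6, sigma=0.6, size=500)                  # µg/L at 15 weeks (simulated)
deficient_33wk = (ferritin_15wk + rng.normal(0, 20, 500) < 45).astype(int)    # 1 if deficient at 33 weeks (simulated)

# Lower ferritin predicts deficiency, so use the negated value as the ROC score.
auc = roc_auc_score(deficient_33wk, -ferritin_15wk)
fpr, tpr, thresholds = roc_curve(deficient_33wk, -ferritin_15wk)

# Youden's J = sensitivity + specificity - 1; pick the threshold that maximizes it.
j = tpr - fpr
best_cutoff = -thresholds[np.argmax(j)]   # convert back to a ferritin value in µg/L
print(f"AUC = {auc:.2f}, best early-pregnancy ferritin cutoff ≈ {best_cutoff:.0f} µg/L")
```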

Compared with women who did not take iron-containing supplements prepregnancy and/or in early pregnancy, those who did had a lower prevalence of deficiency at 33 weeks’ gestation using the ferritin cutoff of less than 15 µg/L (OR = 0.55; 95% CI, 0.38-0.81; P < .05).
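The reported OR of 0.55 presumably comes from the authors’ own analysis and may be covariate-adjusted. As a generic illustration of how a crude odds ratio and its Wald 95% CI are computed from a 2 × 2 table, the sketch below uses hypothetical counts chosen only to land near the reported magnitude; the counts and the unadjusted calculation are assumptions, not the study’s data or analysis.

```python
# Generic odds ratio and Wald 95% CI from a 2x2 table.
# Counts are hypothetical, chosen only to roughly echo the reported OR of 0.55;
# they are NOT the study's data, and the published OR may be covariate-adjusted.
import math

# rows: iron-supplement users vs. non-users; cols: deficient vs. not deficient at 33 weeks
a, b = 219, 244   # users: deficient, not deficient (hypothetical)
c, d = 103, 63    # non-users: deficient, not deficient (hypothetical)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI, {low:.2f}-{high:.2f})")   # ≈ 0.55 (0.38-0.79)
```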

“This research highlights the importance of screening for iron deficiency in early pregnancy to identify those at the greatest risk to prevent adverse health consequences for both mother and baby,” McCarthy told Healio. “Further large-scale studies that measure iron status across multiple timepoints in pregnancy are needed, with comprehensive investigations in other populations and global settings.”

As Healio previously reported, the U.S. Preventive Services Task Force published a final statement in August concluding that evidence is insufficient to recommend for or against iron supplementation in pregnant persons and that, in the absence of such evidence, clinicians should use their judgment.

For more information:

Elaine K. McCarthy, PhD, can be reached at [email protected]; X (Twitter): @ElaineMcCarthy_.
