Recent developments in first-trimester testing promise great improvements in predicting adverse pregnancy outcomes
Traditionally, expectant mothers have had their pregnancies predictively categorised as low risk or high risk, depending on the perceived probability of an adverse maternal or neonatal outcome. Although appealing in its dichotomous simplicity, such a categorisation does not reflect the spectrum of risk that exists for all pregnant women, nor does it acknowledge significant limitations that have, until recently, precluded the accurate prediction of obstetric risk, particularly among women who have never previously given birth. For example, an algorithm for the prediction of pre-eclampsia among women in their first pregnancy, based on maternal risk factors alone, yields only a 37% detection rate for a 10% false-positive rate.1 However, just as screening for fetal aneuploidy has evolved from using maternal age alone to using non-invasive prenatal testing of cell-free fetal DNA in maternal serum, so too has first-trimester testing been refined to permit, with significantly improved efficacy, the early prediction of other important obstetric concerns, such as fetal growth restriction and pre-eclampsia. With these developments, we are on the cusp of a new era in antenatal care, in which common and important pregnancy outcomes can be more reliably predicted from an early gestation, thereby promising improved triaging of patients, the institution of targeted surveillance and prophylactic therapies, and recruitment of a truly high-risk population to clinical research trials.
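The quoted figure — a 37% detection rate at a 10% false-positive rate — reflects the standard way screening performance is reported: the risk-score cut-off is set so that a fixed fraction of unaffected pregnancies screen positive, and the detection rate is the fraction of affected pregnancies above that cut-off. The sketch below illustrates this calculation on synthetic data; it is not the published algorithm, and the score distributions are invented for demonstration only.

```python
import random

def detection_rate_at_fpr(scores_affected, scores_unaffected, target_fpr=0.10):
    """Return the fraction of affected pregnancies flagged when the
    risk-score cut-off is chosen so that `target_fpr` of unaffected
    pregnancies screen positive."""
    # The cut-off is the (1 - target_fpr) quantile of the unaffected
    # population's risk scores.
    ranked = sorted(scores_unaffected)
    idx = int(len(ranked) * (1 - target_fpr))
    cutoff = ranked[min(idx, len(ranked) - 1)]
    detected = sum(1 for s in scores_affected if s > cutoff)
    return detected / len(scores_affected)

# Synthetic demonstration: unaffected and affected pregnancies drawn
# from overlapping normal risk-score distributions (entirely invented).
random.seed(42)
unaffected = [random.gauss(0.0, 1.0) for _ in range(2000)]
affected = [random.gauss(1.0, 1.0) for _ in range(400)]
print(round(detection_rate_at_fpr(affected, unaffected), 2))
```

With this degree of overlap between the two score distributions, the detection rate at a 10% false-positive rate lands well below 100%, which is why risk models based on maternal factors alone perform modestly and why multimarker first-trimester algorithms are an advance.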
No relevant disclosures.