Current statistical methods that women can use to assess their breast cancer risk are woefully inadequate. They consider factors such as age, family history, and whether a woman has ever had a breast biopsy to calculate whether she’s at average or increased risk of breast cancer. But the models are only about 55 to 60 percent accurate at predicting who’s at increased risk, which isn’t much better than a roll of the dice.
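One plausible reading of that 55-to-60-percent figure (the article does not name the metric, so this is an assumption) is a concordance statistic: the probability that a model assigns a higher risk score to a randomly chosen woman who later develops breast cancer than to one who does not, where 50 percent equals pure chance. A minimal sketch, with entirely hypothetical risk scores, of how such a statistic is computed:

```python
import random

def concordance(case_scores, control_scores):
    """Probability that a random case outranks a random control.
    Ties count as half a win. 0.5 means the model is no better than chance."""
    wins = 0.0
    for case in case_scores:
        for control in control_scores:
            if case > control:
                wins += 1.0
            elif case == control:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical scores from a weak model: the two groups barely separate.
random.seed(0)
cases = [random.gauss(0.55, 0.2) for _ in range(200)]     # later developed cancer
controls = [random.gauss(0.45, 0.2) for _ in range(200)]  # did not

# Result lands modestly above 0.5 (chance) and far below 1.0 (perfect).
print(round(concordance(cases, controls), 2))
```

With heavily overlapping score distributions like these, the statistic sits only modestly above the 0.5 chance level, which is the sense in which a model can be "not much better than a roll of the dice."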
Measuring levels of sex hormones such as estrogen and testosterone via a blood test could help improve the predictive accuracy of these models, according to research results presented this week by Boston researchers at the American Association for Cancer Research conference. The researchers analyzed hormone levels in 473 blood samples obtained from women who had participated in the Harvard Nurses’ Health Study two decades earlier and who later went on to develop breast cancer.
They then compared these hormone levels to those of 770 women who also participated in the study but who did not develop breast cancer. They found that a form of estrogen, testosterone, and a third hormone, prolactin, were all more likely to be elevated early on in those who went on to develop breast cancer.
“We confirmed what we and others have previously seen—that these hormones are associated with breast cancer risk,” said study co-author Shelley Tworoger, an epidemiologist at Brigham and Women’s Hospital. Adding such a measurement to the current risk models could improve their predictive accuracy by 4 to 6 percentage points, she added.
Next up: a study to evaluate whether factoring in certain genetic markers and breast density as measured on a mammogram might improve these risk assessment methods even more. This could help pinpoint the women at highest risk, Tworoger said, and help them determine whether they might benefit from a prevention drug such as tamoxifen or from a more sensitive screening tool such as MRI, which can detect the tiniest abnormalities.