Eating fish high in omega-3 fatty acids has been linked to a lower risk of stroke. A new study suggests it’s not only how much fish you eat that matters, but how it’s prepared.
Dr. Fadi Nahab of Emory University led a team that studied the role of race and geography in stroke incidence, with a particular emphasis on the “Stroke Belt” in the southeastern United States, where stroke death rates are higher than in the rest of the country. African-Americans also have a higher rate of stroke than whites.
For the study, more than 21,000 people answered a telephone survey about their fish consumption. African-Americans ate more fish per week than whites, but they were also 3 1/2 times more likely than whites to eat at least two servings of fried fish per week. Fried fish consumption was 30 percent higher in the Stroke Belt than in the rest of the country.
Frying may undercut the health benefits of fish in two ways, the researchers said. First, lean fish such as cod or haddock are more likely to be fried than omega-3-rich salmon, herring, or mackerel. Second, frying is believed to reduce a fish’s natural omega-3s and replace them with cooking oils.
BOTTOM LINE: African-Americans and people living in the Stroke Belt ate more fried fish than whites or people in the rest of the country, a factor that could be related to the higher incidence of stroke in those groups.
CAUTIONS: The food surveys were a snapshot, so they did not account for dietary changes over time that might be important in stroke risk. More research is needed to establish whether people who eat fried fish actually are more likely to have strokes.
WHERE TO FIND IT: Neurology, online
Facial recognition may peak later than believed
Maybe it’s not all downhill, mentally speaking, after age 25. That’s when we’re best at remembering names, but a new study from Harvard says facial recognition doesn’t peak until we’re in our early 30s.
Using an online test, Laura Germine, a PhD student in psychology at Harvard University, and her colleagues tested about 44,000 people from 10 to 70 years old to see how well they recognized computer-generated faces. The participants also performed other mental tasks, such as remembering names. As other research has shown, that skill plateaus at ages 23 to 24. But across three different experiments, the researchers found that facial recognition improved up to age 34 and then declined.
“Research on cognition has tended to focus on development, to age 20, and aging, after age 55,” Germine said. “Our work shows that the 35 years in between, previously thought to be fairly static, may in fact be more dynamic than many scientists had expected.”
BOTTOM LINE: People are best at recognizing faces when they are in their early 30s, later than previously thought.
CAUTIONS: Study participants who found their way to an online memory test may have a greater interest in memory, and their results may differ from people in general.
WHERE TO FIND IT: Cognition, online