Thousands of medical students will submit their final preferred list of postgraduate residency programs to the National Resident Matching Program (NRMP) on Feb. 26 after a year of interviews, research, and plenty of hand-wringing to determine where they might like to start their careers in medicine.
Stirring up the high-pressure match season a little more, US News & World Report this week published a ranking from the physician social network Doximity of U.S. medical residencies, based on thousands of surveys from physicians who completed internal medicine residencies in the United States. The surveys were sent through web notifications and emails to 18,695 members of the social network.
Massachusetts General Hospital and Brigham and Women’s Hospital in Boston took the top spots along with Johns Hopkins University in Baltimore. All three received 600 nominations from the physicians surveyed, with the University of California, San Francisco Medical Center following close behind.
Each of the physicians Doximity surveyed had completed a U.S. residency in internal medicine. Roughly 3,400 responded (18 percent), submitting about 2.7 nominations apiece for an internal medicine residency. The survey ran from mid-December to Feb. 10, and the responses represented around 1,300 hospitals and every state except Alaska and Wyoming.
Medical students wringing their hands about their futures might use this survey to inform their selection process. But the question remains: should they?
A list like this could easily be mistaken for other well-known US News rankings, including the Best Colleges list that many high school students consult annually. US News’ Best Hospitals ranking also brings huge marketing power to the hospitals and health systems crowned at the top of the list.
But Ben Harder, general manager of health and science at US News, said this list is “definitely not a U.S. News ranking” in an email exchange Friday.
“U.S. News builds nearly all of its rankings on objective quality data,” said Harder. “Unfortunately, objective clinical data on residencies is nonexistent.”
In the explainer piece Harder wrote online for US News, he notes a few problems with the objectivity of the data Doximity collected while also citing its potential to influence students.
In the surveys, physicians tended to nominate medical residencies near their own practices, while the South and West were underrepresented. Respondents were also more likely to be subspecialists than internists (so it’s questionable whether their opinions of internal medicine residencies are the best source for a ranking).
Doximity also surveyed residency and fellowship directors, whose responses were pooled with the physicians’ at equal weight, despite the potential for bias.
“A few institutions did have three or four who responded, while most had no more than one, so it’s reasonable to take the nominations from that subgroup with a grain of salt,” said Harder. “At the same time, a large majority of nominations for all of the programs came from doctors who didn’t have a personal affiliation with that program.”
So with these data limitations, why is this survey even relevant for medical students?
Harder wrote online that the list’s “insight into programs’ reputations could inform their preferences, particularly if they are concerned about how future colleagues might judge their medical pedigree.”
“It certainly responds to the desire of medical students for information about the training programs,” wrote Dr. Joanne Conroy, the chief healthcare officer at the Association of American Medical Colleges, in an email exchange. “This is especially relevant as they are making their priority lists for submission to the NRMP.”
Dr. Conroy does not expect this list to affect this year’s match season since most applicants have already submitted their final selections.
What do you think of this list? Would you rely on it based on the data?