Consumer Reports this morning reignited a familiar debate when it released a new hospital ratings report that uses Medicare data to show differences in how hospitals perform on 27 kinds of elective surgery.
The rating system found that many community hospitals in Massachusetts perform better than Boston’s renowned academic medical centers. Hospital leaders have cried foul, calling the data misleading and a disservice to patients.
The consumer group said the ratings were its best attempt at assessing hospital performance using billing data, since it does not have access to more precise clinical data. Who does? Hospitals, and so far, in most cases, they have not made it widely available to the public.
“We’re trying to stir things up here,” said Dr. John Santa, medical director of Consumer Reports Health. “If folks feel there’s better information ... let’s have at it. Put the information out there for everybody to see.”
Santa said there are hundreds of privately held databases with information taken from medical records and used by hospital consortiums or specialty groups to assess quality, and by individual facilities to compare their performance against industry averages. “They’re not public,” he said.
What is public is detailed data collected by Medicare to issue payments to doctors and hospitals. That data is supposed to accurately reflect the tests and treatments provided to patients, along with their specific diagnoses, but everyone agrees it has flaws.
Using that data, the authors of the Consumer Reports rating “do a disservice if they put information out there that misclassifies hospitals,” said Dr. Elizabeth Mort, chief quality officer at Massachusetts General Hospital, which was rated poorly.
Mort said she has concerns about whether the rating accurately accounts for patient volume or the severity of patients’ illnesses, something that is more accurately captured in medical records than in billing data. She also said the rankings may have grouped surgical procedures together that have varying degrees of expected complications, reflecting poorly on those hospitals that do the more complex treatments.
Santa said his group, which worked with consulting firm MPA, used dozens of factors to adjust for differences in patient populations and weighted procedures based on the likelihood of complications.
The rating analyzes Medicare claims data from 2009 to 2011 in categories including back surgery, vascular surgery, hip and knee replacements, hysterectomies, and colon surgery. The authors evaluated the hospitals based on the percentage of patients who died in the hospital or had prolonged hospital stays.
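The outcome measure described above — the share of a hospital's surgical patients who died in the hospital or had a prolonged stay — can be sketched as a simple aggregation over claims-like records. This is an illustrative sketch only: the field names, the 1.5× length-of-stay cutoff, and the record structure are all assumptions for demonstration, not Consumer Reports' actual methodology, which also risk-adjusts for patient mix and weights procedures by complication likelihood.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    # All fields are hypothetical stand-ins for what billing data might contain.
    hospital: str
    died_in_hospital: bool
    length_of_stay_days: int
    expected_stay_days: int  # assumed risk-adjusted benchmark for the procedure

def adverse_event_rate(claims: list[Claim], hospital: str) -> float:
    """Fraction of a hospital's patients who died in the hospital or stayed
    well beyond the benchmark (here, 50% over it -- an assumed cutoff)."""
    cohort = [c for c in claims if c.hospital == hospital]
    if not cohort:
        return 0.0
    adverse = sum(
        1 for c in cohort
        if c.died_in_hospital
        or c.length_of_stay_days > 1.5 * c.expected_stay_days
    )
    return adverse / len(cohort)

claims = [
    Claim("General", died_in_hospital=False, length_of_stay_days=4, expected_stay_days=4),
    Claim("General", died_in_hospital=True, length_of_stay_days=10, expected_stay_days=5),
    Claim("General", died_in_hospital=False, length_of_stay_days=9, expected_stay_days=5),
    Claim("General", died_in_hospital=False, length_of_stay_days=5, expected_stay_days=5),
]
print(adverse_event_rate(claims, "General"))  # 0.5 (one death, one prolonged stay)
```

Hospitals would then be binned into rating tiers by comparing each rate against the distribution across all rated facilities.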
They did not include emergency surgery or patients who were transferred from one hospital to another. Here's a look at how a few Massachusetts hospitals were rated:
- Baystate Medical Center, Springfield: Better than average
- Beth Israel Deaconess Medical Center, Boston: Average
- Brigham and Women’s Hospital, Boston: Worst
- Cape Cod Hospital, Hyannis: Average
- Lahey Hospital & Medical Center, Burlington: Worst
- Massachusetts General Hospital, Boston: Worst
- New England Baptist Hospital, Boston: Average
- Newton-Wellesley Hospital, Newton Lower Falls: Better than average
- North Shore Medical Center, Salem: Average
- Southcoast Hospitals Group, Fall River: Worst
- UMass Memorial Medical Center, Worcester: Better than average
The Massachusetts Hospital Association, in a written statement, said the ratings “result in greater confusion rather than clarity” and oversimplify the issue:
In publishing these rankings, CR does not adequately address serious questions about the use of unsupported methodologies and the stark differences between these ratings and other recent quality and safety reports. Several rankings published in this latest CR article are in direct conflict with the latest mortality data released by the Centers for Medicare and Medicaid Services (CMS).
The public will be far better served if CR joins other responsible parties to develop a common national framework for quality measurement and reporting – one based on scientifically-validated and broadly-endorsed methods. Until this happens, the conflicting and confusing messages sent to the public by numerous sources will continue to be a barrier to truly informing and engaging health care consumers.
As a consumer, I can’t help but ask: When will this “national framework” be ready for prime time? Meanwhile, what am I supposed to do the next time one of my loved ones needs a procedure?
Consumer Reports notes that as many as three in 10 patients suffer from infection or some other complication after surgery. And, as imperfect as it is, the Medicare data suggest that there may be real differences between how patients fare from one hospital to another.
Hospitals and doctors are working to get more clinical information out to the public. Mass. General and other Boston hospitals participate in the National Surgical Quality Improvement Program, reporting outcomes for certain procedures. And Mass. General lists limited surgical data on its own website.
Eventually, clinical data will be more accessible to the public. Until then, billing data does provide valuable information, said Dr. Ashish Jha, professor of health policy and management at Harvard School of Public Health.
If the industry put the energy it spends on debating the value of billing data into further developing the most accurate means of assessing health care quality, Jha said, “we’d all be better off.”