New research suggests that data submitted to federal regulators for approval of certain heart devices often differs greatly from what is published in peer-reviewed medical journals.
In an analysis published this month in the medical journal The BMJ (formerly the British Medical Journal), researchers at the University of California, San Francisco report numerous discrepancies between data submitted to the FDA by medical device manufacturers seeking premarket approval for certain cardiovascular devices and the data provided in subsequently published studies. The researchers also found that the data used to obtain approval for more than half of the devices examined were never published in a peer-reviewed journal.
Researchers focused on clinical trials involving high-risk heart devices, comparing data in the premarket approval summaries submitted to the FDA with data published in peer-reviewed journals.
During the premarket approval process, manufacturers report safety and effectiveness results to the FDA to establish that a device is safe and effective. The researchers searched the publicly available FDA database for all cardiovascular devices that received premarket approval between January 1, 2000, and December 31, 2010, and conducted a MEDLINE search to find corresponding published studies.
Of the 177 studies identified, only 49% had been published in a peer-reviewed journal, and the average time from FDA approval to publication was about 6.5 months.
The 86 published studies pertained to 60 cardiovascular devices, although 106 devices had received FDA premarket approval during that period.
In 26% of the published studies, the number of participants enrolled differed between the FDA summary and the corresponding journal article.
Of 152 primary endpoints (the main questions a study is designed to answer) reported in the published studies, three were labeled as secondary endpoints (additional outcomes of lesser priority) in the FDA summaries, 43 were unlabeled, and 15 could not be found at all.
Results in 35 of the published studies were similar to those in the FDA summaries, while 17 differed substantially and 31 could not be compared.
Sixty-six of the studies were considered pivotal in their corresponding journals, but only one was explicitly noted as pivotal in the FDA summary.
Researchers found that 33% of the studies were published before FDA approval of the devices, and nearly all of the studies were industry funded. Moreover, about half of the clinical trials for high-risk cardiovascular devices approved by the FDA remain unpublished, and even when trials are published, the study population, primary endpoints, and results can differ substantially from the data submitted to the FDA.
“Clinicians might not be aware of the FDA device summaries and so might not critically examine these data,” said Rita F. Redberg, MD, MSc, lead author of the study. “Thus for many high risk devices, clinical trial evidence might never be made readily available to the medical community or might be made available only after a long delay.”