Abstract
Background/Aims When new medications are introduced, safety data are often sparse. Health plan data may be useful for active surveillance for adverse events. We conducted a 12-month pilot study to demonstrate the feasibility of prospective active surveillance using data from several large health insurers. This presentation will summarize logistical challenges and lessons learned.
Methods We studied the safety of generic divalproex sodium, an antiepileptic drug introduced in July 2008, compared with the branded drug. We performed monthly surveillance using the maximized sequential probability ratio test, comparing observed outcomes in new users of the generic drug to expected counts based on new users of the branded drug from January 2002 – June 2008. Programmers at four HMO Research Network sites extracted data monthly, ran quality control (QC) programs, and returned summary tables to the lead site. Medication exposure was identified from outpatient pharmacy dispensings. Outcomes identified from health plan utilization data included potential adverse effects (e.g., pancreatitis and liver disease) and efficacy outcomes (e.g., hospital or emergency department visits for seizures).
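To make the monthly surveillance step concrete, the following is a minimal sketch of the Poisson-based maximized sequential probability ratio test statistic (Kulldorff's maxSPRT), comparing cumulative observed counts in the surveilled group to expected counts derived from the historical comparator. The function names (`max_sprt_llr`, `surveil`) and the critical value shown are illustrative assumptions, not part of the study's actual analytic program; in practice the critical value is computed exactly to hold the overall Type I error at a chosen level over the planned surveillance period.

```python
import math

def max_sprt_llr(observed: int, expected: float) -> float:
    """Poisson maxSPRT log-likelihood ratio: nonzero only when the
    observed count exceeds the expected count (one-sided test for
    elevated risk in the surveilled group)."""
    if observed <= expected:
        return 0.0
    return observed * math.log(observed / expected) - (observed - expected)

def surveil(monthly_obs, monthly_exp, critical_value=3.0):
    """Run monthly sequential surveillance on cumulative counts.
    Returns the month (1-indexed) in which the statistic first crosses
    the critical value (a signal), or None if no signal occurs.
    critical_value=3.0 is a placeholder, not a calibrated threshold."""
    cum_obs, cum_exp = 0, 0.0
    for month, (o, e) in enumerate(zip(monthly_obs, monthly_exp), start=1):
        cum_obs += o
        cum_exp += e
        if max_sprt_llr(cum_obs, cum_exp) >= critical_value:
            return month  # signal: evidence of elevated risk
    return None  # surveillance ends with no signal
```

Because the test is applied to cumulative counts each month, a site extract that cannot be used (as occurred for 14% of extracts in this pilot) leaves a gap that the analysis must explicitly accommodate.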
Results We will discuss lessons learned related to data sharing and HIPAA compliance, timing and synchronization of data pulls, and QC activities. Obtaining IRB approval took 3.5 months, and protocol development took 7 months. Developing the analytic program required two rounds of corrections, one of which led to meaningful changes in results. QC programs identified substantial differences in monthly extracts across sites. Some QC flags reflected only explainable differences among sites, but others revealed data problems requiring resolution at the site level. Overall, 14% (4/28) of monthly extracts across the four sites could not be used, due either to underlying data problems or to lack of analyst availability for timely extracts.
Conclusions Prospective drug safety surveillance is feasible, but future projects should anticipate the need for longer timelines and multiple revisions to data pulls and analytic programs. They should develop approaches to handling missing data, because some data pulls will not be usable. QC activities should continue throughout study implementation. A knowledgeable, engaged lead at each site is vital to ensure accuracy and appropriate interpretation of each site’s data.