Historical Data Analysis of Hospital Discharges Related to the Amerithrax Attack in Florida

by Lauralyn K. Burke, DrPH, RHIA, CHES, CHTS-IM; C. Perry Brown, DrPH, MSPH; and Tammie M. Johnson, DrPH

Abstract

Interrupted time-series analysis (ITSA) can be used to identify, quantify, and evaluate the magnitude and direction of an event's effect on the basis of time-series data. This study evaluates the impact of the bioterrorist anthrax attacks (“Amerithrax”) on hospital inpatient discharges in the metropolitan statistical area of Palm Beach, Broward, and Miami-Dade counties in the fourth quarter of 2001.

Three statistical methods were used to determine whether Amerithrax influenced inpatient utilization: the standardized incidence ratio (SIR), segmented regression, and an autoregressive integrated moving average (ARIMA) model. The SIR showed a non–statistically significant 2 percent decrease in hospital discharges. Although the segmented regression found a slight increase in the discharge rate during the fourth quarter, it was also not statistically significant and therefore could not be attributed to Amerithrax. Diagnostics run in preparation for ARIMA indicated that the quarterly data were not serially correlated, violating one of the assumptions of the ARIMA method, so ARIMA could not properly evaluate the impact on the time series. The lack of granularity in the time frames hindered the evaluation of the impact by all three analytic methods. This study demonstrates that the granularity of the data points is as important as the number of data points in a time series.

ITSA is important for its ability to evaluate the impact that any hazard may have on inpatient utilization. Knowledge of hospital utilization patterns during disasters offers healthcare and civic professionals valuable information to plan for, respond to, mitigate, and evaluate any outcomes stemming from biothreats.

Keywords: Amerithrax, time series, bioterrorism, hospital utilization, disaster preparedness, standardized incidence ratio (SIR), segmented regression, autoregressive integrated moving average (ARIMA), data granularity

Introduction

This study evaluates the impact of the 2001 bioterrorist anthrax attack on hospital discharge rate patterns in the metropolitan statistical area (MSA) of Palm Beach, Broward, and Miami-Dade counties in Florida. Knowledge of these inpatient utilization patterns during and after disasters will offer civic, healthcare, and emergency management professionals valuable information to plan for, respond to, mitigate, and evaluate any outcomes stemming from disasters, bioterrorism, epidemics, or pandemics. Interrupted time-series analysis (ITSA) was used to identify, quantify, and evaluate the magnitude and direction of this event’s effect in a specific period.

With the release of a 2015 report from the Bipartisan Blue Ribbon Study Panel on Biodefense, planning for biothreats, including pandemics and bioterrorism, is now considered an integral part of national security. Four of the 33 urgent recommendations address specific issues faced by hospitals during a biothreat. The panel also identified four “pillars of biodefense”: threat awareness, prevention and protection, surveillance and detection, and response and recovery.1 Adequate community emergency preparedness must address all four. Had these recommendations been implemented prior to the fall 2014 Ebola outbreak in the United States, they might have prevented the nosocomial transmission to the two healthcare workers who treated the index case at a Texas medical center.2, 3

This anthrax attack, labeled “Amerithrax” by the Federal Bureau of Investigation (FBI), was the first large-scale bioterrorist event in the United States. It occurred in the wake of the terrorist attacks on the World Trade Center and the Pentagon on September 11, 2001.4 Although community emergency management for disaster response and hospital emergency departments have been studied extensively, there is a paucity of data analyzing and evaluating the impact of such events on hospital inpatient utilization.

Background

A few published studies have assessed disaster impact using hospital inpatient data, stemming from Hurricane Hugo and the Severe Acute Respiratory Syndrome (SARS) pandemic.5–7 Only one study was found that evaluated the impact of a natural disaster on a hospital. Using ITSA, Fox (1996) evaluated the impact of Hurricane Hugo on hospital utilization at a public hospital in South Carolina. He found an immediate and sustained effect on hospital utilization from this category 4 storm. ITSA revealed that the average number of admissions increased by two patients per day for approximately two years, outpatient activity increased by four patients per day, and the emergency department showed the largest increase, 10 percent, which lasted throughout the two-year study period.8

Of particular relevance to this research are the studies of the 2003 SARS pandemic. SARS and anthrax have many similarities: (a) both are new or re-emerging diseases; (b) their infectivity, pathogenicity, and virulence were initially unknown; (c) patients who contracted SARS or inhalation anthrax required inpatient hospitalization; and (d) the healthcare system had never experienced these kinds of threats in modern history, so it was unknown how the system would react and how patients would behave in the face of this new danger. In April 2003, the World Health Organization issued travel advisories for Toronto and Beijing. Beijing had the most severe outbreak, and Toronto experienced a bimodal epidemic curve. Between November 2002 and July 2003, 8,450 people became ill and 850 people died. The pandemic spread across 26 countries on five continents.9

Boutis et al. (2004) evaluated a Toronto pediatric hospital and found that emergency department visits decreased by 18 to 36 percent, inpatient admission rates increased 5 percent, and emergency department lengths of stay declined by 17 percent. Unlike in the 1996 Fox study, the impact of SARS on hospital utilization was short-lived, returning to expected levels in the subsequent months.10 Earnest et al. (2005) also used ITSA to make real-time predictions of the number of isolation beds needed during the SARS outbreak in Singapore.11 Like Fox (1996) and Boutis et al. (2004), they evaluated the number and rate of hospital admissions and found that ITSA accurately predicted bed occupancy rates in real time. These studies show that various ITSA methods will be important tools for hospital and public health management to plan and anticipate the hospital surge capacity needed for infectious disease outbreaks in the future.

Juster et al. (2007) used ITSA to analyze hospital admission rates for myocardial infarctions after a statewide smoking ban in public buildings was instituted in New York. The research of Juster et al. confirmed similar results of previous studies in two other states, giving much broader support for the legislation of smoking bans across the country.12 This Amerithrax hospital utilization study was patterned after the methods used in the Juster et al. (2007) study and the SARS pandemic research evaluating the impact on healthcare utilization.

Bioterrorism and Amerithrax

Bioterrorism is defined as the “deliberate release of pathogenic microorganisms—bacteria, viruses, fungi, or toxins—into a community.”13 Hawley and Eitzen (2001) state that bioterrorism is a “weapon of mass casualty” rather than “weapon of mass destruction” because it affects people rather than structures.14

Bacillus anthracis (B. anthracis) is a Gram-positive, encapsulated, rod-shaped, nonmotile, aerobic, spore-forming, soil-borne bacterium that causes cutaneous, gastrointestinal, and/or inhalation anthrax.15–18 Its ability to produce toxins during metabolism increases its propensity for use as a biological weapon. Dixon et al. (1999) consider anthrax a “toxigenic condition” and argue that antitoxin therapies must be investigated further.19 Animal studies show that even if all the bacteria are eliminated by antibiotics, once the critical threshold of exotoxins is reached, the resulting toxemia will kill the animal.20

The established time frame of Amerithrax is October 2 to November 21, 2001. By the end of the outbreak, there were 22 confirmed anthrax cases: 11 inhalation and 11 cutaneous. The first two cases of inhalation anthrax, including the first death (of the index patient), occurred in Florida. Five deaths occurred, all due to inhalation anthrax.21 There were two clusters of patients resulting from Amerithrax. The first cluster affected the populations of south Florida, New York City, and New Jersey and had nine victims, seven with cutaneous anthrax and two with inhalation anthrax.22, 23 The second cluster was located in Washington, DC, New York, New Jersey, Pennsylvania, Connecticut, Maryland, and Virginia; this cluster had 13 cases, 9 of which were inhalation anthrax. An additional 31 people tested positive for anthrax and were treated with prophylactic antibiotics. By Thanksgiving of 2001, approximately 35,000 at-risk people had been started on antibiotic prophylaxis, and 10,300 of those people were advised to complete the recommended 60-day course of narrow-spectrum antibiotics.24, 25 The total economic impact of this relatively small outbreak was estimated at $6 billion.26–29

The index case was a 63-year-old male photo editor for a national tabloid newspaper who was admitted to a Palm Beach County hospital with severe respiratory distress. Although it was unknown at the time, he was in the fulminant phase of inhalation anthrax (pulmonary/inhalation anthrax is a biphasic condition). A work colleague remembered the patient opening a letter that contained a powdery substance at his desk in late September 2001. The patient threw away the letter and the envelope. Laboratory tests confirmed that B. anthracis was present in the patient’s cerebrospinal fluid. Because of the advanced nature of this case, the patient died three days after admission.30

The second case/patient was a 73-year-old coworker of the first patient, who processed mail at the media company. He was exposed when he handled the contaminated envelope. Even though he was admitted one day before the index patient was, his diagnosis was not confirmed until several days after the first patient died. This patient responded to therapy and was discharged alive from the hospital.31

Even though tens of thousands of people were evaluated for anthrax, the 22 confirmed cases did not significantly stress inpatient surge capacity because the outbreak was relatively small. Many government officials and health professionals assert that investments made in the public health infrastructure will benefit bioterrorism preparedness and response efforts, including the response to emerging infections, such as SARS or pandemic influenza, that threaten the public’s health.32

Methods

Three data analytic methods were used in this study: the standardized incidence ratio (SIR), segmented regression, and interrupted time-series analysis using an autoregressive integrated moving average model (ITSA-ARIMA).

An interrupted time series is a set of observations obtained by measuring a single variable at regular intervals over a specified length of time, with an identified event occurring in the midst of the period.33 The interruption can be an event (a hurricane or a bioterrorist attack) or an intervention (initiation of a new drug or legislation). The ITSA-ARIMA method has frequently been used to evaluate the impact of an intervention or treatment on an identified health condition.34–36 This impact can have several characteristics. The effect can be sudden or can build gradually to its peak. The impact may be permanent or may dissipate within an identified time frame, which may also be considered the recovery period.37 Several terms are used interchangeably with ITSA-ARIMA, including quasi-experimental time-series analysis, impact analysis, and intervention analysis.38

ITSA research is classified as quasi-experimental because it has elements of both experimental and observational designs. With or without controls, it is considered the strongest quasi-experimental design.39 Ansari et al. (2003) state that the pre-event segment of the time series can act as the control for the post-event segment when a control group is not available. The design of an ITSA helps control for confounding, secular trends, and regression to the mean (a type of selection bias).40 The format is similar to a pre-post design, but it has the added benefit of multiple data points both before and after the event.41 The increased number of data points before and after the event provides a baseline and a response timeline regarding the impact of the event on the variable of interest.42 In time-series data, the series can be divided into two or more intervals or segments, which can be assessed separately or even discarded if the research design indicates.43 Change points are the exact positions in the time series where an event or intervention occurred. ITSA also measures whether factors other than the intervention could explain the change.44

Like Wagner et al. (2002)45 and Yaffee (2000),46 England (2005) recommends that at least 50 data points be collected, preferably with at least 20 before and 20 after the event or intervention. As in segmented regression, the longer the time frame evaluated and the closer together the data points are, the higher the statistical validity. Following the population over a long period of time reduces the variability in the time series.47

Several issues must be considered when using time-series data. One is trend, the long-term upward or downward direction of the observations.48 Second, the time series must be stationary; stationarity means that the mean value of the series remains constant over the entire series.49 Third, the influence of seasonality must be evaluated, particularly when studying diseases that are influenced by the time of year, such as influenza and pneumonia during the winter months.50 Another assumption of the ITSA-ARIMA model is that the observations must be serially dependent. Serial dependency, or serial correlation, is the extent to which each observation influences subsequent observations in the time series; it must be accounted for in order to avoid spurious conclusions. The time-series data must also be of sufficient length to test for significance and to account for seasonal rotations.51
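
These assumptions can be checked before model fitting. The following minimal sketch assumes Python with the statsmodels library and a simulated quarterly rate series standing in for the FHDDS data (the study itself used SPSS); it applies an augmented Dickey-Fuller test for stationarity and examines autocorrelations at quarterly lags for serial dependence and seasonality.

    import numpy as np
    from statsmodels.tsa.stattools import adfuller, acf

    # Simulated quarterly age-adjusted discharge rates (84 quarters);
    # substitute the real series when it is available.
    rng = np.random.default_rng(1)
    rates = 17.6 + 0.02 * np.arange(84) + rng.normal(0, 0.4, 84)

    # Trend/stationarity: augmented Dickey-Fuller test (small p-value suggests stationarity)
    adf_stat, p_value, *_ = adfuller(rates)
    print(f"ADF statistic = {adf_stat:.2f}, p-value = {p_value:.3f}")

    # Serial dependence and seasonality: autocorrelations at lags 1-4 (one year of quarters)
    print("ACF at lags 1-4:", np.round(acf(rates, nlags=4)[1:], 2))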

The time-series data for this study consisted of county-level, age-adjusted quarterly inpatient discharge rates for the Florida MSA over a 21-year period (1988–2008), totaling almost 14.5 million discharges, with 317,694 anthrax/differential-diagnosis cases obtained from the Florida Hospital Discharge Data Set (FHDDS). Cases were identified by the principal diagnosis using ICD-9-CM (International Classification of Diseases, Ninth Revision, Clinical Modification) codes. See Table 1 for the list of differential diagnoses and the respective ICD-9-CM codes. The 21-year time frame comprised 84 quarters. Because of patient privacy concerns, aggregate data could be released from the FHDDS only in quarterly format. The 56th quarter of the time series, which includes October 2001, is the quarter of interest. The MSA encompassed 52 acute care hospitals with a total of 17,095 beds. Age-adjusted discharge rates for each quarter were calculated using census data for the MSA.
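
For illustration, the age adjustment of a single quarter's discharge rate by direct standardization can be sketched as follows; the age strata, discharge counts, MSA population counts, and standard weights shown here are hypothetical stand-ins rather than the census figures actually used in the study.

    # Direct age adjustment of one quarter's discharge rate (hypothetical inputs).
    age_groups = ["0-17", "18-44", "45-64", "65+"]
    discharges = [400, 900, 1200, 1500]                        # discharges per age stratum
    msa_population = [900_000, 1_800_000, 1_200_000, 800_000]  # MSA census counts per stratum
    standard_weights = [0.24, 0.36, 0.25, 0.15]                # standard population proportions

    # Weight each age-specific rate by the standard population and express per 100,000.
    adjusted_rate = sum(
        (d / p) * w for d, p, w in zip(discharges, msa_population, standard_weights)
    ) * 100_000
    print(f"Age-adjusted discharge rate: {adjusted_rate:.2f} per 100,000")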

There are 52 data elements within the FHDDS, which is compiled by the Agency for Health Care Administration (AHCA) from the required reporting of all hospitals in Florida.52 Hospitals are legislatively mandated to supply specific patient and facility data within a quantified time frame. Categories of data include patient and provider demographic information; inpatient and/or encounter data that include all ICD-9-CM diagnostic/procedural and CPT procedure codes; and financial and administrative data for each episode of care. Length-of-stay (LOS) totals are included, but admission and discharge dates for each admission during the quarter are not included in the public data set. Quarterly data is the only format that is released by the AHCA for public health researchers.

The standardized incidence ratio (SIR) was the first analytic method used. It is commonly used for evaluating disease clusters. A disease cluster is defined as an unusual grouping, real or perceived, of cases of a particular disease or condition that occur together in time and place and are reported to a public health department.53 By this definition, Amerithrax can be considered a cluster. The SIR compares the observed number of cases to the expected number of cases to determine whether the incidence of the specified disease in an identified population is similar to that observed in the reference or standard population.54
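
As a sketch of the calculation, the SIR is the observed count divided by the expected count, with a confidence interval obtainable under a Poisson assumption; the counts below are hypothetical, and the log-transform interval is one common choice rather than necessarily the method used in the study.

    import math

    def sir_with_ci(observed, expected, z=1.96):
        """Standardized incidence ratio with an approximate 95% CI
        (log-transform method, treating the observed count as Poisson)."""
        sir = observed / expected
        se_log = 1.0 / math.sqrt(observed)  # approximate SE of log(SIR)
        return sir, (sir * math.exp(-z * se_log), sir * math.exp(z * se_log))

    # Hypothetical observed and expected discharge counts for the quarter of interest
    sir, (lower, upper) = sir_with_ci(observed=3920, expected=4000)
    print(f"SIR = {sir:.2f}, 95% CI ({lower:.2f}, {upper:.2f})")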

Segmented regression was the second analytic method used. This method evaluates the impact of an event on a time series by examining changes in segments of time before, during, and after the event; it can estimate the impact (magnitude and direction) at various time points as well as changes in trend throughout the period of study. Segmented regression was used to test the null hypothesis of no difference in the age-adjusted hospital discharge rates across the segments of the time series before, during, and after Amerithrax in the fourth quarter of 2001. Another advantage of segmented regression is that it can measure the size of the effect at varying points or periods after the event.55

The basic formula for segmented regression, described by McDowall (1980),56 is

Yt = b0 + b1T + b2D + b3P + et

where T, D, and P are the independent variables: T is the time since the start of the series; D is a dummy variable for the pre- and post-event periods, coded 0 for the pre-event period and 1 for the post-event period; P is the time since the event; and et is the random variation at time t not explained by the model. The coefficient b0 is the level at the start of the time series where it crosses the y-axis, b1 is the slope of the first segment (the segment prior to the event), b2 is the immediate change in level at the point in the time series where the event occurs, and b3 is the difference between the slopes before and after the event.
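
A minimal sketch of fitting this model, assuming Python with statsmodels and a simulated quarterly series in place of the FHDDS data (the study used SPSS), is shown below; T, D, and P are constructed exactly as defined above, with the event placed at the 56th quarter.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Simulated quarterly age-adjusted discharge rates (84 quarters); event at quarter 56
    n_quarters, event_quarter = 84, 56
    rng = np.random.default_rng(0)
    rate = 17.6 + 0.03 * np.arange(n_quarters) + rng.normal(0, 0.5, n_quarters)

    T = np.arange(1, n_quarters + 1)                            # time since start of series
    D = (T >= event_quarter).astype(int)                        # 0 pre-event, 1 post-event
    P = np.where(T >= event_quarter, T - event_quarter + 1, 0)  # time since the event

    X = sm.add_constant(pd.DataFrame({"T": T, "D": D, "P": P}))
    model = sm.OLS(rate, X).fit()
    print(model.params)  # const = b0, T = b1, D = b2, P = b3

Under this parameterization, the post-event slope is the sum b1 + b3, which is how the pre- and post-event slopes are compared in the Results section.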

The ARIMA statistical method performs an impact analysis to measure the effect and magnitude, if any, of an event or intervention on a collective phenomenon or trend over time (in this case, hospital discharges). This is particularly important in a time series where there is serial correlation.57 Procedures within ARIMA can remove any systematic seasonal deviations. It is also a robust method for predicting future observations or forecasting.58–61 An ARIMA model transforms the time series by the processes of identification, estimation, and diagnosis in order to predict future observations.62, 63

As in segmented regression, the impact is characterized in two ways: by its onset and by its duration. The first characteristic, onset, can be abrupt or gradual. An example of an abrupt onset is a disaster such as Amerithrax; this satisfies an element of the definition of impact analysis, in that the specific time or date of the event must be known a priori. The second characteristic, duration, is identified as either temporary or permanent. The duration can also be construed as the recovery phase of a disaster. An abrupt and temporary pattern, as postulated in this research, is modeled as a pulse function rather than a step function; a step function would indicate that the duration was permanent.64 The previously cited SARS studies indicated that a pulse function was seen in the SARS pandemic of 2003.
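
A hedged sketch of such an intervention model, using simulated data and an assumed ARIMA(1,0,0) order rather than the order the SPSS Expert Modeler would have selected, shows how a pulse regressor (abrupt, temporary) differs from a step regressor (abrupt, permanent):

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Simulated quarterly series; the event falls at quarter 56 (index 55)
    n, event_idx = 84, 55
    rng = np.random.default_rng(2)
    y = pd.Series(17.6 + 0.03 * np.arange(n) + rng.normal(0, 0.5, n))

    pulse = np.zeros(n)
    pulse[event_idx] = 1.0                             # abrupt onset, temporary duration
    step = (np.arange(n) >= event_idx).astype(float)   # abrupt onset, permanent duration

    # Intervention analysis with the pulse regressor (swap in `step` to model a permanent effect)
    fit = ARIMA(y, exog=pulse, order=(1, 0, 0)).fit()
    print(fit.params)  # the exogenous coefficient estimates the size of the pulse impact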

Serial dependency, or serial correlation, is one of the assumptions that must be met in order for ITSA-ARIMA to be calculated. The Durbin-Watson test, calculated as part of the segmented regression, evaluates the serial correlation of the time series. Values of the Durbin-Watson statistic range from 0.0 to 4.0, with values near 0.0 denoting positive autocorrelation and values near 4.0 denoting negative autocorrelation; values close to 2.0 indicate no serial autocorrelation.65–68 This is an important step in preparing for the ITSA-ARIMA calculations.
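
The statistic itself is computed directly from the regression residuals; the following minimal sketch uses statsmodels on hypothetical residuals (in the study, the value was produced as part of the SPSS segmented regression output).

    import numpy as np
    from statsmodels.stats.stattools import durbin_watson

    # Hypothetical residuals from the segmented regression (84 quarters)
    rng = np.random.default_rng(3)
    residuals = rng.normal(0, 0.4, 84)

    dw = durbin_watson(residuals)  # ranges from 0 to 4; values near 2 indicate no serial autocorrelation
    print(f"Durbin-Watson = {dw:.3f}")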

Results

The age-adjusted hospital discharge rate for the MSA at the beginning of the time series was 17.61 per 100,000. The SIR for Q4 2001, the 56th quarter in the time series, was 0.98, or 2 percent lower than expected, with a 95 percent confidence interval of 0.95–1.01, indicating that the decrease was not statistically significant. Therefore, there is insufficient evidence to infer that the decrease in hospital discharges in October 2001 was due to Amerithrax.

Table 2 illustrates the results of the segmented regression model using the MSA age-adjusted discharge rate. With the estimated coefficients entered, the model becomes Yt = 17.61 + 0.11T + 1.06D − 0.25P. The intercept b0, 17.61 per 100,000, is the average inpatient discharge rate of the MSA at the beginning of the time series in 1988. The slope b1 is the rate of change prior to the event; in this analysis, there was a slight upward slope of 0.11 percent that was statistically significant but relatively flat. The coefficient b2 is the change in level during the quarter of the event; in this study, there was a 1.06 percent increase in the discharge rate in the fourth quarter of 2001, but it was not statistically significant. The last coefficient, b3, is the change in slope from before to after the event; for this analysis, there was a very slight downward change of −0.246 percent through December 2008, which was statistically significant. The post-event slope is obtained by summing b1 (0.11 percent) and b3 (−0.246 percent), which equates to approximately −0.14 percent, or a 0.14 percent decrease. These two methods indicate that the change in the hospital discharge rate in the MSA cannot be attributed to Amerithrax.

The Durbin-Watson value calculated for this model was 1.794, as seen in Table 3, indicating very little autocorrelation. The segmented regression diagnostics thus showed that the data, in their current format of yearly quarters, violate one of the assumptions of ITSA-ARIMA: that the data must be serially correlated. Therefore, ITSA-ARIMA could not be performed and no results were generated; the yearly-quarter format was too large a time frame and lacked sufficient granularity to capture and adequately assess the changes that may have occurred. Had there been serial correlation, the data would have been transformed by the SPSS Expert Modeler using maximum likelihood estimation with first-order differencing to correct the problem.69
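
For illustration, the effect of first-order differencing on an autocorrelated series can be sketched as follows; the series is simulated rather than drawn from the study data, and the Durbin-Watson statistic is simply recomputed before and after differencing.

    import numpy as np
    from statsmodels.stats.stattools import durbin_watson

    # Simulated, strongly autocorrelated quarterly series (a slow random walk)
    rng = np.random.default_rng(4)
    rates = 17.6 + np.cumsum(rng.normal(0, 0.3, 84))

    differenced = np.diff(rates)  # first-order differencing removes the stochastic trend

    print(f"DW before differencing: {durbin_watson(rates - rates.mean()):.2f}")  # well below 2
    print(f"DW after differencing:  {durbin_watson(differenced):.2f}")           # near 2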

In summary, the standardized incidence ratio showed no statistically significant change. The segmented regression model found a non–statistically significant increase in the inpatient discharge rate during the fourth quarter of 2001 and a very slight, statistically significant decrease through the remainder of the time series to 2008. Therefore, these changes cannot be attributed to the anthrax attack in Florida. ITSA-ARIMA could not be calculated because of a lack of serial dependency, which resulted from the lack of data granularity.

Discussion

This study shows that data granularity is an important concept to address when determining the data points for an ITSA. LaTour, Maki, and Oachs (2013) define data granularity as the “individual data elements that cannot be further subdivided.” Data granularity is also known as “atomicity.”70 The Data Quality Model developed by the American Health Information Management Association (AHIMA) details the ten specific characteristics of healthcare data and clinical documentation that are required for the data or documentation to be effective for administrative and clinical research, evaluation, and application.71 While this study had sufficient data points before and after the event, the granularity, or detail, of the inpatient activity was not at the same level as the data in the previously cited SARS, hurricane, and smoking ban legislation studies. This study demonstrates that the granularity of the time frames (daily, weekly, or monthly, but not quarterly) is as important as the number of data points in the time series. This study may help to begin establishing the level of data granularity necessary for this type of research, which may depend on the type or size of the event. Small-scale or shorter-duration events (hurricanes, bioterrorism, tornadoes) may require the smallest level of granularity, such as daily rates. Studies of larger-scale, longer-lasting events (influenza or Zika virus epidemics) may be able to use weekly or monthly rates and still detect statistically significant changes in time-series data.

This study has implications for future research and policy analysis. It is important because public health and community emergency management professionals must address capacity, casualties, and countermeasures for their populations during any disaster. Analysis of SARS, influenza pandemics, Amerithrax, and the Ebola outbreak can be applied to prevention and protection, to lessen and mitigate the impact of future biothreats. Real-time data collection and public/population health reporting from electronic health records (EHRs) will facilitate surveillance and detection in the inpatient and outpatient domains. Finally, examination of surge capacity, inventory projections, personnel readiness, and medications and countermeasures would assist in response and recovery activities. ITSA-ARIMA and other data analytics of healthcare data can therefore address three of the four pillars of biodefense specified by the Blue Ribbon Study Panel on Biodefense, essentially all of the pillars except threat awareness.

The ability to predict how long and what type of impact an event will have on a population will allow policy makers and healthcare professionals to optimize the limited resources of each community. After-the-fact evaluation reports can also utilize the data generated from ITSA-ARIMA and segmented regression to identify quality improvement opportunities. SARS pandemic research indicates that ITSA is a robust tool for predicting bed capacity in real time during disasters or outbreaks provided that the appropriate time intervals of daily, weekly, and/or monthly data are used in the time-series studies. Reliable forecasting is a strong point of ITSA models and a necessity for community planners to predict staffing requirements and resource utilization. Various communities will experience different reactions to disasters. By incorporating ITSA and other data analytics tools, the recommendations of the Blue Ribbon Study Panel on Biodefense can be effectively and efficiently implemented for the most appropriate response.

The index case (one of the two Florida cases) of inhalation anthrax did not initially appear in the differential-diagnosis subset (see Table 1) because of incorrect ICD-9-CM coding and sequencing. In the FHDDS, the inhalation anthrax pneumonia and anthrax bacteria codes were not sequenced as principal diagnoses per official coding guidelines. Although specific medical records were not reviewed and this study does not evaluate the quality of the ICD-9-CM coding within the hospital discharge data set, data quality and the integrity of subsequent research are paramount. The correct identification of the principal diagnosis, the condition determined after diagnostic and therapeutic study to be chiefly responsible for the admission, is of utmost importance for reimbursement and data quality.72 Subsequent diagnosis codes identify comorbidities and complications that indicate severity of illness and intensity of service. The ICD-9-CM code for pneumonia in anthrax (484.5) specifies the medical condition. However, according to Schraffenberger (2010) and official ICD-9-CM coding guidelines, the bacterial code for pulmonary anthrax (022.1) is required to precede the code for pneumonia in anthrax to identify the cause of the pneumonia; the bacterial code for pulmonary anthrax is therefore considered the principal diagnosis.73

The record of the index case did include the 484.5 pneumonia-in-anthrax code, but it was sequenced as the third diagnosis. Neither the 022.1 anthrax code nor any other bacterial code was present among the remaining nine reported diagnosis codes. Until 2006, ten diagnosis codes were the maximum that could be reported to the FHDDS. In the index case, meningitis due to other specified bacteria (320.89) was listed as the principal diagnosis (B. anthracis was found in the cerebrospinal fluid). Again, the required anthrax bacteria code (022.8) should have preceded the meningitis code as the causative-organism code and therefore the principal diagnosis. Bacterial meningitis due to anthrax would not have occurred unless the index patient had inhalation/pulmonary anthrax. Given that the index patient was admitted in the fulminant (second) phase of inhalation anthrax and the initial laboratory tests confirmed the anthrax bacteria in the cerebrospinal fluid, the attending physician may have considered the meningitis to be the more important or significant diagnosis. Autopsy results would have indicated inhalation anthrax; however, the timing of those results is unknown. It must be remembered that, at the time, inhalation anthrax had never been seen in modern US healthcare.

Neither of the two Florida cases included external cause codes (E-codes) to identify the external cause of the disease (terrorism by bioagent, E979.6), even though the event was known to be a bioterrorist attack during these two hospitalizations. While E-codes were not required in the FHDDS by AHCA until 2006, they have always been highly recommended, and in some cases mandatory, to give a complete diagnostic description of the inpatient stay.74 Because of the possibility of additional, undetected incorrect sequencing and identification of principal diagnoses, there may have been more differential-diagnosis cases that were not included in the analysis data set. Once identified, the index case was added to the data subset.

Now that the new diagnostic coding classification system, ICD-10-CM/PCS, has been implemented, clinical documentation improvement (CDI) activities are even more important because of the increased level of specificity now required. The ICD-10-CM/PCS coding system provides more extensive data granularity to better identify specific disease mechanisms. Combined with the use of EHRs, this will trigger more efficient syndromic surveillance reporting to public health authorities in both the inpatient and outpatient arenas. ICD-10-CM/PCS is substantially larger than ICD-9-CM and therefore warrants the increased use of computer-assisted coding. The best practice of using encoder software to assign diagnostic and procedural codes would achieve many of the CDI objectives, and these features may reduce the incorrect application of coding rules and improve diagnosis and procedure reporting.

The new meaningful use (MU) requirements for EHR systems identify several public and population health reporting objectives. Syndromic surveillance notifications feed directly into ITSA forecasting and support many of the biodefense report recommendations. The specific MU objectives for this health data reporting and analysis would support “timely and effective prevention and response,” facilitate “public health action,” and assist in “event detection, situation awareness, and response management.”75

Perspectives for Continued Study

Given the right political and legislative initiatives, the statewide data can and should be reformatted into smaller time frames and made available to all legitimate public health researchers. This level of data granularity includes allowing the use of admission and discharge dates for all inpatient stays. Currently, the FHDDS is available to researchers only in quarterly format. Clarification among public health researchers and agencies regarding how privacy and security regulations for patient data are applied, weighed against the need for biothreat awareness and analysis, is vital. The authors of this study feel that HIPAA privacy requirements are applied too stringently to public health data requests (including those from the state health department) and prevent an appropriate level of analysis. US researchers have access to the same data mining techniques used in other countries (as seen in the studies from Canada, China, Taiwan, and Singapore) and should have comparable access to all levels of data. Infectious disease experts focus on highly pathogenic organisms that may initially have a limited impact on the public. Small outbreaks must be evaluated properly to determine best practices. When, not if, larger disease outbreaks occur, lessons learned from the smaller ones will be invaluable.

There may have been significant differences between inpatient and outpatient activity, given the large number of individuals treated prophylactically because of exposure to the anthrax bacteria. No individual-level or severity-of-illness data were available because only the principal diagnosis code was used. In addition, the 52 hospitals in the MSA may have had considerable differences in physicians, available services, and treatment protocols.

Recommendations

This study needs to be repeated using smaller time intervals, preferably daily, but weekly and/or monthly time intervals may suffice. These smaller time frames were used in previously cited literature. The change in data granularity to smaller time frames may indicate statistically significant changes in hospitalizations related to Amerithrax.

This study should be repeated using the smaller time intervals of daily, weekly, or monthly discharge rates with hospital data from the other two geographic clusters affected by Amerithrax, the Washington, DC/Maryland/Virginia MSA and the New York/New Jersey MSA. These two clusters could then be compared to learn additional information regarding hospital utilization during disasters.

Consistency in medical coding practices, requirements, and application guidelines is paramount. Having 50 percent of the inhalation anthrax cases not appear in the initial subset because of improper ICD-9-CM sequencing could drastically alter any research. It was only one of two cases, but because of the intense public scrutiny this incident received, all efforts should have been expended by health informatics and information management (HIIM) personnel to ensure scrupulous application of diagnostic coding guidelines. Computer-assisted coding and the best practice of using encoder software to assign diagnostic and procedural codes would achieve many of the CDI objectives.

Conclusions

This study is important because of the ability of ITSA to evaluate the magnitude and direction of the impact of biothreats and disasters on hospital utilization patterns. Knowledge of these trends and patterns during disasters offers hospital, healthcare, and emergency management professionals valuable planning information to respond to and mitigate any possible negative outcomes stemming from bioterrorist incidents or epidemic/pandemic diseases. Analysis of patient care patterns, practice expenditures, and personnel management during disasters will provide a road map for evaluating opportunities for improvement. ITSA can be an effective method to forecast healthcare utilization, both inpatient and outpatient.

This study demonstrates that the granularity of the data is as important as the number of data points in the time series. The large quarterly time intervals obscured the details of changes, in either direction or magnitude, in inpatient hospital utilization patterns. The lack of time/data granularity in this study, resulting from spurious confidentiality concerns, prevented the various ITSA methods from identifying potentially statistically significant results that would accurately measure the impact of Amerithrax. HIPAA regulations allow public health research. After the data granularity issues are rectified, the repeated analyses may yield much different results. Autocorrelated data are required in order to use an ITSA-ARIMA method, particularly if forecasting healthcare utilization and hospital resource management is a goal of community emergency management professionals. Data quality, particularly data granularity and the correct application of appropriate ICD-9-CM and ICD-10-CM/PCS codes, is a major hurdle and must be addressed. This research opens the door to additional applications of time-series analysis of hospital inpatient and outpatient utilization stemming from disasters.

Given their progressive role in ensuring data quality in all phases of healthcare, HIIM professionals are perfectly positioned to be key participants in this analysis and evaluation of hospital utilization. Elevating and strengthening the data analytics knowledge and skill sets of HIIM professionals, particularly new graduates, would allow for more opportunities for employment and advancement and allow the HIIM professional to become a more valuable member of the healthcare team. A bright future awaits HIIM professionals with strong data analytics skills.

It is essential that civic leaders, public health, and emergency management professionals have tools to plan for, mitigate, and evaluate the impact of all hazards and disasters on a specified population. Astute information governance demands emphasis on CDI and appropriate data analytics to support the mission of healthcare organizations, public health, and national security. ITSA can be one of the many tools in the arsenal to achieve these aims.

 

Lauralyn Kavanaugh-Burke, DrPH, RHIA, CHES, CHTS-IM, is an assistant professor in the Division of Health Informatics and Information Management at Florida A&M University in Tallahassee, FL.

Perry Brown, DrPH, MSPH, is a professor of public health in the Institute of Public Health at the College of Pharmacy and Pharmaceutical Sciences at Florida A&M University in Tallahassee, FL.

Tammie M. Johnson, DrPH, is an assistant professor in the Department of Public Health at the University of North Florida in Jacksonville, FL.


Notes

  1. Blue Ribbon Study Panel on Biodefense. “A National Blueprint for Biodefense: Leadership and Major Reforms Needed to Optimize Efforts.” Washington, DC, October 2015.
  2. Centers for Disease Control and Prevention. “Cases of Ebola Diagnosed in the United States.” 2014. Available at http://www.cdc.gov/vhf/ebola/outbreaks/2014-west-africa/united-states-imported-case.html.
  3. Perez-Pena, R. “Nurse Who Contracted Ebola in the U.S. Sues Her Hospital Employer.” New York Times, March 2, 2015. Available at http://www.nytimes.com/2015/03/03/us/nurse-who-contracted-ebola-in-the-us-sues-her-hospital-employer.html?_r=0.
  4. US Department of Justice. Amerithrax Investigative Summary. 2010. Available at https://www.justice.gov/archive/amerithrax/docs/amx-investigative-summary.pdf.
  5. Chang, H. J., et al. “The Impact of the SARS Epidemic on the Utilization of Medical Services: SARS and the Fear of SARS.” American Journal of Public Health 94, no. 4 (2004): 562–64.
  6. Svoboda, T., et al. “Public Health Measures to Control the Spread of the Severe Acute Respiratory Syndrome during the Outbreak in Toronto.” New England Journal of Medicine 350, no. 23 (2004): 2352–61.
  7. Fox, R. “Using Intervention Analysis to Assess Catastrophic Events on Business Environment.” International Advances in Economic Research 2, no. 3 (1996): 341–49.
  8. Ibid.
  9. Nelson, K. E., and C. F. Masters-Williams. “Epidemiology of Infectious Diseases: General Principles.” In K. E. Nelson and C. F. Masters-Williams (Editors), Infectious Disease Epidemiology, Theory and Practice, 2nd ed. Sudbury, MA: Jones and Bartlett, 2007, 25–118.
  10. Boutis, K., D. Stephens, K. Lam, W. J. Ungar, and S. Schuh. “The Impact of SARS on a Tertiary Care Pediatric Emergency Department.” CMAJ 171, no. 11 (2004): 1353–58.
  11. Earnest, A., M. Chen, D. Ng, and L. Sin. “Using Autoregressive Integrated Moving Average to Predict and Monitor the Number of Beds Occupied during a SARS Outbreak in a Tertiary Hospital in Singapore.” BMC Health Services Research 5 (2005): 36–39.
  12. Juster, H. R., B. R. Loomis, T. M. Hinman, M. C. Farrelly, A. Hyland, U. E. Bauer, and G. S. Birkhead. “Declines in Hospital Admissions for Acute Myocardial Infarction in New York State after Implementation of a Comprehensive Smoking Ban.” American Journal of Public Health 97, no. 11 (2007): 2035–39.
  13. Steinhauer, R. “A Readied Response—Bioterrorism.” RN 3 (2002): 48. Available at http://www.modernmedicine.com/modern-medicine/content/readied-response-bioterrorism?page=full.
  14. Hawley, R. J., and E. M. Eitzen Jr. “Biological Weapons—a Primer for Microbiologists.” Annual Review of Microbiology 55 (2001): 235–53.
  15. Henderson, D. “Bioterrorism as a Public Health Threat.” Emerging Infectious Diseases 4, no. 3 (1998): 488–92.
  16. Cieslak, T., and E. M. Eitzen Jr. “Clinical and Epidemiologic Principles of Anthrax.” Emerging Infectious Diseases 5, no. 4 (1999): 552–55.
  17. Dixon, T. C., et al. “Anthrax.” New England Journal of Medicine 341, no. 11 (1999): 815–26.
  18. Cieslak, T., and E. M. Eitzen Jr. “Clinical and Epidemiologic Principles of Anthrax.”
  19. Dixon, T. C., et al. “Anthrax.”
  20. Inglesby, T. V., et al. “Anthrax as a Biological Weapon, 2002: Updated Recommendations for Management.” JAMA 287, no. 17 (2002): 2236–52.
  21. Hawley, R. J., and E. M. Eitzen Jr. “Biological Weapons—a Primer for Microbiologists.”
  22. Jernigan, D. B., et al. “Investigation of Bioterrorism-related Anthrax, United States, 2001: Epidemiologic Findings.” Emerging Infectious Diseases 8, no. 10 (2002): 1019–28.
  23. Nelson, K. E., and C. F. Masters-Williams. “Epidemiology of Infectious Diseases: General Principles.”
  24. Jernigan, D. B., et al. “Investigation of Bioterrorism-related Anthrax, United States, 2001: Epidemiologic Findings.”
  25. Smolinski, M., M. A. Hamburg, and J. Lederberg, eds. Microbial Threats to Health: Emergence, Detection, and Response. Washington, DC: National Academies Press, 2003.
  26. Inglesby, T. V., et al. “Anthrax as a Biological Weapon, 2002: Updated Recommendations for Management.”
  27. Centers for Disease Control and Prevention. “Update: Investigation of Bioterrorism-related Anthrax and Interim Guidelines for Clinical Evaluation of Persons with Possible Anthrax.” Morbidity and Mortality Weekly Report 50, no. 43 (2001): 941–48.
  28. Nelson, K. E., and C. F. Masters-Williams. “Epidemiology of Infectious Diseases: General Principles.”
  29. Graham, B. World at Risk: The Report of the Commission on the Prevention of Weapons of Mass Destruction Proliferation and Terrorism. New York: Random House, 2008.
  30. Traeger, M. S., S. T. Wiersma, N. E. Rosenstein, J. M. Malecki, C. W. Shepard, L. Raghunathan, et al. “First Case of Bioterrorism-related Inhalational Anthrax in the United States, Palm Beach County, Florida, 2001.” Emerging Infectious Diseases 8, no. 10 (2002): 1029–34.
  31. Ibid.
  32. Hughes, J., and J. L. Gerberding. “Anthrax Bioterrorism: Lessons Learned and Future Directions.” Emerging Infectious Diseases 8, no. 10 (2002): 1013–14.
  33. McDowall, D., et al. Interrupted Time Series Analysis. Beverly Hills, CA: Sage Publications, 1980.
  34. Morgan, O., C. Griffiths, and A. Majeed. “Interrupted Time-Series Analysis of Regulations to Reduce Paracetamol (Acetaminophen) Poisoning.” PLoS Medicine 4, no. 4 (2007): e105.
  35. Juster, H. R., B. R. Loomis, T. M. Hinman, M. C. Farrelly, A. Hyland, U. E. Bauer, and G. S. Birkhead. “Declines in Hospital Admissions for Acute Myocardial Infarction in New York State after Implementation of a Comprehensive Smoking Ban.”
  36. Garey, K., et al. “Interrupted Time Series Analysis of Vancomycin Compared to Cefuroxime for Surgical Prophylaxis in Patients Undergoing Cardiac Surgery.” Antimicrobial Agents and Chemotherapy 52, no. 2 (2008): 446–51.
  37. Fox, R. “Using Intervention Analysis to Assess Catastrophic Events on Business Environment.”
  38. Ibid.
  39. Hartung, D. M., and D. Touchette. “Overview of Clinical Research Design.” American Journal of Health System Pharmacy 66, no. 4 (2009): 398–408.
  40. Ansari, F., K. Gray, D. Nathwani, G. Phillips, S. Ogston, C. Ramsay, and P. Davey. “Outcomes of an Intervention to Improve Hospital Antibiotic Prescribing: Interrupted Time Series with Segmented Regression Analysis.” Journal of Antimicrobial Chemotherapy 52, no. 5 (2003): 842–48.
  41. Hartung, D. M., and D. Touchette. “Overview of Clinical Research Design.”
  42. Chaberny, I. F., F. Schwab, S. Ziesing, S. Suerbaum, and P. Gastmeier. “Impact of Routine Surgical Ward and Intensive Care Unit Admission Surveillance Cultures on Hospital-wide Nosocomial Methicillin-Resistant Staphylococcus Aureus Infections in a University Hospital: An Interrupted Time-Series Analysis.” Journal of Antimicrobial Chemotherapy 62, no. 6 (2008): 1422–29.
  43. Yaffee, R. A. An Introduction to Time Series Analysis and Forecasting with Applications of SAS and SPSS. San Diego, CA: Academic Press, 2000.
  44. Wagner, A. K., S. B. Soumerai, F. Zhang, and D. Ross-Degnan. “Segmented Regression Analysis of Interrupted Time Series Studies in Medication Use Research.” Journal of Clinical Pharmacy and Therapeutics 27, no. 4 (2002): 299–309.
  45. Ibid.
  46. Yaffee, R. A. An Introduction to Time Series Analysis and Forecasting with Applications of SAS and SPSS. San Diego, CA: Academic Press, 2000.
  47. England, E. “How Interrupted Time Series Analysis Can Evaluate Guideline Implementation.” Pharmaceutical Journal 275 (2005): 344–47.
  48. Linden, A., J. Adams, and N. Roberts. “Evaluating Disease Management Program Effectiveness: An Introduction to Time-Series Analysis.” Disease Management 6, no. 4 (2003): 243–55.
  49. Yaffee, R. A. An Introduction to Time Series Analysis and Forecasting with Applications of SAS and SPSS. San Diego, CA: Academic Press, 2000.
  50. Fry, A., et al. “Trends in Hospitalizations for Pneumonia among Persons Aged 65 Years or Older in the United States, 1988–2002.” JAMA 294, no. 21 (2005): 2712–19.
  51. Yaffee, R. A. An Introduction to Time Series Analysis and Forecasting with Applications of SAS and SPSS.
  52. Agency for Health Care Administration. “A General Overview.” 2009. http://www.fdhc.state.fl.us/index.shtml# (accessed October 4, 2009).
  53. Agency for Toxic Substances and Disease Registry. Disease Clusters: An Overview. Washington, DC: US Department of Health and Human Services. Available at http://www.atsdr.cdc.gov/hec/csem/cluster/docs/clusters.pdf.
  54. Katz, M., and S. S. Fernyak. San Francisco Communicable Disease Report 1986–2003. San Francisco, CA: San Francisco Department of Public Health, Community Health Epidemiology & Disease Control Section, 2005, 1–21. Available at https://www.sfdph.org/dph/files/reports/StudiesData/CDReport_1986_2003.pdf.
  55. Ibid.
  56. McDowall, D., et al. Interrupted Time Series Analysis. Beverly Hills, CA: Sage Publications, 1980.
  57. Ibid.
  58. Ibid.
  59. England, E. “How Interrupted Time Series Analysis Can Evaluate Guideline Implementation.”
  60. Fox, R. “Using Intervention Analysis to Assess Catastrophic Events on Business Environment.”
  61. McDowall, D., et al. Interrupted Time Series Analysis.
  62. SPSS. PASW Forecasting 17.0 (User’s Manual). Chicago: SPSS, 2008.
  63. Garson, G. “Time Series Analysis: Key Concepts and Terms.” December 18, 2008. http://www.statisticalassociates.com/longitudinalanalysis.htm.
  64. McDowall, D., et al. Interrupted Time Series Analysis.
  65. Ibid.
  66. Garson, G. “Time Series Analysis: Key Concepts and Terms.”
  67. McDowall, D., et al. Interrupted Time Series Analysis. Beverly Hills, CA: Sage Publications, 1980.
  68. Wagner, A. K., S. B. Soumerai, F. Zhang, and D. Ross-Degnan. “Segmented Regression Analysis of Interrupted Time Series Studies in Medication Use Research.”
  69. Bosso, J., and P. D. Mauldin. “Using Interrupted Time Series Analysis to Assess Associations of Fluoroquinolone Formulary Changes with Susceptibility of Gram-Negative Pathogens and Isolation Rates of Methicillin-Resistant Staphylococcus Aureus.” Antimicrobial Agents and Chemotherapy 50, no. 6 (2006): 2106–12.
  70. LaTour, K., S. E. Maki, and P. K. Oachs. “Data Capture, Maintenance, and Quality.” In K. LaTour, S. E. Maki, and P. K. Oachs (Editors), Health Information Management: Concepts, Principles, and Practice. 4th ed. Chicago: AHIMA, 2013, 175–185.
  71. Ibid.
  72. Ibid.
  73. Schraffenberger, L. A. Basic ICD-9-CM Coding. Chicago: AHIMA, 2010.
  74. Ibid.
  75. Healthit.gov. “Syndromic Surveillance Data Submission.” 2015. Available at https://www.healthit.gov/providers-professionals/achieve-meaningful-use/menu-measures/syndromic-surveillance-data-submission.


Lauralyn K. Burke, DrPH, RHIA, CHES, CHTS-IM; C. Perry Brown, DrPH, MSPH; and Tammie M. Johnson, DrPH. “Historical Data Analysis of Hospital Discharges Related to the Amerithrax Attack in Florida.” Perspectives in Health Information Management (Fall 2016): 1-16.
