Prediabetes Case Identification: Accuracy of an Automated Electronic Health Record Algorithm

by Ana M. Palacio, MD, MPH; Denisse Pareja, MD, MPH; Willy Valencia-Rodrigo, MD; and Jason R. Dahn, PhD

Abstract

The prevalence of prediabetes continues to grow in the United States. Our study aimed to determine the accuracy of a population health approach that applies an algorithm to electronic health data to identify patients with undiagnosed prediabetes. We conducted a cross-sectional study among patients receiving care at the Miami VA Healthcare System. The algorithm first classified subjects into three groups using HbA1c level. We then used ICD-9-CM or ICD-10 codes for diabetes and prediabetes and medication use data to classify subjects into one of five groups: known diabetes with an HbA1c level of 6.5 percent or greater in the past; new diabetes without evidence of an abnormal HbA1c level in the past or diabetes medication use; known prediabetes with an HbA1c level of 5.7 to 6.4 percent in the past or metformin therapy; new prediabetes; and normal. We identified 1,190 subjects and then selected 25 percent of each group to obtain an overall validation sample of 300 patients. Of the sample of 68 patients with prediabetes, 60 percent had a prior HbA1c value in the prediabetic range but no diagnosis or recommended treatment. The positive predictive values of the algorithm were 80 percent for new prediabetes, 100 percent for known prediabetes, 75 percent for new diabetes, and 100 percent for known diabetes. We concluded that this electronic health record–based automatic algorithm is accurate and effective at identifying diabetes and prediabetes status in large at-risk populations receiving care in health systems such as the VA. Future studies should evaluate strategies that health systems can use to deploy sustainable evidence-based interventions known to delay the progression to diabetes.

Keywords: algorithms; electronic health record; diabetes mellitus; prediabetes; obesity

Introduction

The prevalence of type 2 diabetes mellitus (T2DM) among adults in the United States is increasing. In 2017, 30.3 million Americans had a diagnosis of T2DM, but this number is bound to increase substantially given the growing prevalence of prediabetes, reported at 84.1 million American adults.1 Prediabetes increases the risk of developing diabetes in the near future.2 In Florida, 11.4 percent of people are reported to have T2DM, and 8.6 percent of people older than 20 years are reported to have documented prediabetes.3 However, these statistics probably underestimate the true magnitude of the problem in certain populations. The Veterans Health Administration (VHA) cares for a population at particular risk of diabetes. The 2014 US Department of Veterans Affairs (VA) report Screening and Management of Overweight and Obesity estimated that 78 percent of veterans are overweight or obese.4

Through a local quality improvement initiative, the Miami VA Healthcare System (MVAHS) found that from October 1, 2015, to June 16, 2016, an average of 650 patients per month had an HbA1c level within the prediabetes range. Among the 5,554 patients who met criteria for prediabetes, 88 percent (4,897) had not received a diagnosis. Given the evidence that interventions such as lifestyle changes and the use of metformin can reduce the prevalence of metabolic syndrome and insulin resistance and ultimately delay the progression to diabetes, the prompt identification and management of patients at risk of diabetes are pivotal.5, 6 Early identification will decrease the burden of the disease and eventually its overall cost, estimated at $327 billion in 2017.7

Setting up an effective strategy to identify those with prediabetes (i.e., at risk of diabetes) within a large health system is challenging. Most initiatives rely on individual providers using electronic health record (EHR) tools to identify patients at risk.8 This approach can be time-consuming and ineffectual given the number of alerts, reminders, and other preventive measures that need to be completed by individual providers.9

Large databases (e.g., EHRs) have been used to detect populations at risk, guide policymaking, and track healthcare system outcomes.10, 11 EHRs contain general information that has been used to improve treatment, adopt preventive methods, and review the epidemiology of diseases.12–14 EHRs may also offer an inexpensive and scalable opportunity to conduct hypothesis-driven cross-sectional or longitudinal research as well as to employ large-scale quality improvement strategies.15, 16 The use of clinical databases for these purposes remains controversial, which limits their utilization.17 Underutilization is also related to differences in quality and content across databases and to lack of knowledge of their availability.18

Nevertheless, recent studies have started to validate algorithms that identify specific outcomes from EHRs by comparing results to a gold standard.19, 20 Such an approach can facilitate the identification of patients at risk or serve as a preventive screening tool.21 We propose using an automatic approach to systematically identify veteran patients with prediabetes. This approach would allow the health system to identify and engage these patients and offer preventive programs known to delay or prevent the onset of diabetes.22, 23

The aim of this study was to develop a reliable, automatic EHR-based algorithm to identify patients with prediabetes who are at risk of T2DM. We validated the algorithm against the gold standard of manual chart review to determine its accuracy. This EHR tool could help healthcare systems translate evidence-based interventions for the prevention of diabetes into a population health strategy that could have a large impact on the quality of care provided.

Methods

Study Design

We conducted a cross-sectional study using data drawn from the EHR system used at MVAHS. This data registry contains information on all MVAHS patients, including laboratory, pharmacy, clinical, and administrative data.

Study Setting

The MVAHS consists of 372 hospital beds; a community living center attached to the main facility; two major satellite outpatient clinics, located in Broward County and Key West; and five community-based outpatient clinics, located in Homestead, Key Largo, Pembroke Pines, Hollywood, and Deerfield Beach.24

The VA Healthcare System 2010 diabetes mellitus guidelines recommended that primary care providers obtain hemoglobin A1c (HbA1c) levels annually among patients who are 45 years of age or older, have a body mass index (BMI) greater than or equal to 25 kg/m2, or have specific diabetes mellitus risk factors.25

Study Population

We included all patients from MVAHS 18 years of age and older who had an HbA1c value reported between September 18 and September 30, 2016. Our sample included 1,665 patients from all MVAHS facilities. We then selected patients from the two large clinics, Miami and Broward County (more than 14 primary care teams at each site), and applied an algorithm to identify veterans with undiagnosed prediabetes.

Definition of Algorithm

A team consisting of three clinicians, an epidemiologist, and an IT specialist developed the algorithm for the purpose of identifying prediabetic patients and patients with poorly controlled diabetes with and without microvascular and macrovascular complications. The algorithm was developed for quality improvement purposes. It was tested and retested for reliability during May 2016. Since then, the Patient Aligned Care Team has used it to diagnose prediabetes in patients. Clinic personnel receive the algorithm output and verify that the patient meets the prediabetes criteria before entering a prediabetes diagnosis in the EHR. The purpose of this analysis was to rigorously evaluate the accuracy of the algorithm to expand its use to other quality improvement population health initiatives.

The algorithm collected the following parameters: HbA1c level during the selected period, presence of HbA1c level greater than or equal to 5.7 percent at any time in the past, prior diagnosis of prediabetes or diabetes using ICD-9-CM/ICD-10 codes, use of antidiabetic medication including oral hypoglycemic medications and injectable medications, and use of metformin. The algorithm assessed all these parameters simultaneously. (See Table 1.)

Using these parameters, we classified each patient into one of five different diabetes or prediabetes status categories: newly diagnosed prediabetes, newly diagnosed diabetes, known diabetes, known prediabetes, and normal. We applied the algorithm as follows:

Using the most recent HbA1c level, we first classified the patient into three groups according to their HbA1c laboratory values: HbA1c level less than 5.7 percent, HbA1c level 5.7–6.4 percent, and HbA1c level greater than or equal to 6.5 percent. We then used ICD-9-CM or ICD-10 codes and medication use data to classify subjects into one of the following categories:

  1. known diabetes with HbA1c level greater than or equal to 6.5 percent in the past;
  2. new diabetes without evidence of abnormal HbA1c level in the past or diabetes medication use;
  3. known prediabetes with HbA1c of 5.7 to 6.4 percent in the past or use of metformin therapy;
  4. new prediabetes; or
  5. normal. (See Figure 1 and Figure 2.)

Those with a new diagnosis based on the result of the recent test were further categorized into those who had a prior abnormal HbA1c value and those who did not.

Diabetes was defined as follows: presence of an HbA1c level greater than or equal to 6.5 percent, an ICD-9-CM code of 250.xx or ICD-10 code of E11, or use of oral hypoglycemic agents or injectable antidiabetic medication.

Prediabetes was defined as follows: patients without a diagnosis of diabetes who had a prediabetes code (ICD-9-CM 790.29 or ICD-10 R73.03) or an HbA1c level between 5.7 and 6.4 percent.

See Table 2 for a list of the criteria for each category.
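To make these decision rules concrete, the following is a minimal sketch in Python (chosen only for illustration; the study's algorithm was implemented as an automated report against VA EHR data, not as the code shown here). The parameter names, such as recent_hba1c, prior_hba1c_max, has_diabetes_code, has_prediabetes_code, on_antidiabetic_med, and on_metformin, are hypothetical stand-ins for the data elements in Table 1, and the handling of patients whose current HbA1c is normal but who have prior codes or medication use follows our reading of the normal-category definition used for the negative predictive value analysis.

```python
def classify_glycemic_status(recent_hba1c,
                             prior_hba1c_max=None,
                             has_diabetes_code=False,
                             has_prediabetes_code=False,
                             on_antidiabetic_med=False,
                             on_metformin=False):
    """Assign one of the five study categories from EHR-derived parameters.

    Thresholds follow Table 2: HbA1c >= 6.5 percent for diabetes and
    5.7-6.4 percent for prediabetes. prior_hba1c_max is the highest HbA1c
    recorded before the index test (None if no prior test is available).
    """
    prior_dm_evidence = (
        has_diabetes_code
        or on_antidiabetic_med
        or (prior_hba1c_max is not None and prior_hba1c_max >= 6.5)
    )
    prior_predm_evidence = (
        has_prediabetes_code
        or on_metformin
        or (prior_hba1c_max is not None and 5.7 <= prior_hba1c_max < 6.5)
    )

    # Step 1: bin the most recent HbA1c value. Step 2: use diagnosis codes,
    # medications, and prior laboratory values to separate known cases from
    # newly identified cases.
    if recent_hba1c >= 6.5:
        return "known diabetes" if prior_dm_evidence else "new diabetes"
    if recent_hba1c >= 5.7:
        return "known prediabetes" if prior_predm_evidence else "new prediabetes"
    # Currently normal HbA1c: prior diagnoses or medication use are treated
    # as known disease; otherwise the patient is classified as normal.
    if prior_dm_evidence:
        return "known diabetes"
    if prior_predm_evidence:
        return "known prediabetes"
    return "normal"


# Example: a new HbA1c of 6.0 percent with no prior codes, medications, or
# abnormal values is flagged as newly identified prediabetes.
print(classify_glycemic_status(6.0))  # -> new prediabetes
```

In practice, the same rules would be applied to every patient in a periodic HbA1c extract, as in the weekly reports described in the Discussion.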

Gold Standard

Our gold standard was chart review of a random sample of 25 percent of the patients identified by the algorithm from September 18 to September 30, 2016. We selected a random sample from each of the five categories to have a representative sample from each. Three independent qualified reviewers with clinical experience (D.P., W.V., and J.D.) reviewed the charts of the random sample of patients. The reviewers were blinded to the results of the algorithm. For the review, we defined the patient as having prediabetes if at least one of the following criteria was met: prediabetes was listed as a problem in the assessment plan of at least one primary care clinical note; metformin was prescribed; the patient had a fasting plasma glucose level of 100 to 125 mg/dL; or the patient had an HbA1c laboratory value of 5.7 to 6.4 percent.26, 27 We considered a patient to have diabetes if the patient met one of the following criteria: diabetes was documented in the assessment plan of at least one clinical note; the patient used oral hypoglycemic agents, insulin, or other injectable medications; the patient had a fasting plasma glucose level of 126 mg/dL or greater; or the patient had an HbA1c laboratory value of 6.5 percent or greater.28 All definitions are listed in Table 2. We classified the sample according to the normal, prediabetes, or diabetes status of the patients. When completing the chart review, we gathered the same information that was collected by the algorithm in a new collection form.

Validation of Algorithm

The algorithm identified 1,665 patients. From the 1,196 patients who received care in the Broward and Miami clinics, we selected 25.21 percent of each group for a sample of 300 patients. One reviewer (D.P.) reviewed the EHR charts for this sample using the gold standard to classify patients into the five described categories. Using the same methodology, two independent reviewers (J.D. and W.V.) conducted a second review of the charts of a random sample of 150 patients. Once the second review was completed, we compared the group assignments between the two reviews, resolved any differences by consensus, and then compared the gold standard assignments to the results of the algorithm.
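For illustration, the stratified draw used to build the validation sample (roughly 25 percent of each algorithm category) can be sketched as follows. This is not the study's actual extraction code; it assumes a hypothetical pandas data frame with one row per patient and an algorithm_category column.

```python
import pandas as pd

def draw_validation_sample(patients: pd.DataFrame,
                           category_col: str = "algorithm_category",
                           fraction: float = 0.25,
                           seed: int = 42) -> pd.DataFrame:
    """Sample the same fraction of patients from each algorithm category.

    Stratifying by category keeps the smaller groups (e.g., new diabetes)
    represented in the chart-review sample.
    """
    return (
        patients.groupby(category_col, group_keys=False)
                .apply(lambda group: group.sample(frac=fraction, random_state=seed))
    )
```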

Statistical Analysis

We calculated the positive predictive value (PPV) of the algorithm by comparing each algorithm-based category (newly diagnosed prediabetes, newly diagnosed diabetes, known prediabetes, known diabetes) to the results of the gold standard. We also calculated the negative predictive value (NPV) by comparing the results of the algorithm-based “normal” category to the gold standard. To do this, we evaluated the random sample of patients without a diabetes or prediabetes diagnosis based on ICD-9-CM/ICD-10 codes, without a new HbA1c level in the prediabetes or diabetes range, and without medication use (metformin, insulin, hypoglycemic agents) to establish the NPV of the algorithm. We used SAS version 9.4 (SAS Institute Inc.) to calculate the PPV and NPV with the corresponding 95 percent confidence intervals. We then used the confirmed categories to calculate the prevalence of normal, prediabetes, and diabetes status as well as the prevalence of previously undiagnosed prediabetes, previously undiagnosed diabetes, and poorly controlled diabetes.
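Each PPV and NPV is a proportion of confirmed classifications, so its 95 percent confidence interval can be reproduced with a standard binomial method. The sketch below is illustrative only: it uses plain Python and a Wilson score interval rather than the SAS procedure used in the study, and the counts in the example are hypothetical (chosen to match the reported 80 percent PPV for new prediabetes).

```python
from math import sqrt

def proportion_with_wilson_ci(confirmed: int, classified: int, z: float = 1.96):
    """Return a proportion (e.g., a PPV or NPV) and its 95 percent Wilson interval.

    For a PPV, `classified` is the number of patients the algorithm placed in a
    category and `confirmed` is how many of them the chart review confirmed.
    """
    p_hat = confirmed / classified
    denom = 1 + z**2 / classified
    center = (p_hat + z**2 / (2 * classified)) / denom
    half_width = z * sqrt(p_hat * (1 - p_hat) / classified
                          + z**2 / (4 * classified**2)) / denom
    return p_hat, (center - half_width, center + half_width)


# Hypothetical example: if 32 of 40 patients flagged as new prediabetes were
# confirmed on chart review, the PPV would be 80 percent.
ppv, (low, high) = proportion_with_wilson_ci(32, 40)
print(f"PPV = {ppv:.0%}, 95% CI = ({low:.0%}, {high:.0%})")
```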

Ethics Approval

The study was approved by the MVAHS Institutional Review Board.

Results

The algorithm yielded a total sample of 1,665 patients. We selected patients only from the large clinics (Broward and Miami) for a total of 1,190 patients (excluding six duplicates). Of those, 88.4 percent (1,053 patients) were men. Using the algorithm, we classified patients as follows: 13 percent new prediabetes (160 patients), 10 percent known prediabetes (113 patients), 1 percent new diabetes (16 patients), 35 percent known diabetes (423 patients), and 40 percent normal (478 patients). Prediabetes was detected in 23 percent of patients who had an HbA1c test during the selected two-week period. We randomly selected 25 percent of each diagnosis group to reach an overall sample of 300 patients. The demographic characteristics and number of patients in each categorization group are listed in Table 3. In the sample of 68 patients with prediabetes, 60 percent had a prior HbA1c value in the prediabetes range but did not have a corresponding diagnosis or treatment recommendations.

In the sample of eight new diabetes cases, none of the patients had a prior HbA1c value in the diabetes range. The PPVs for the algorithm were 80 percent for new prediabetes, 100 percent for known prediabetes, 75 percent for new diabetes, and 100 percent for known diabetes. The NPV of the algorithm was 90 percent for each of the four categories (known prediabetes, new prediabetes, known diabetes, and new diabetes). The PPV for all cases of diabetes was 98 percent, and the PPV for all cases of prediabetes was 88 percent. See Table 4.

Discussion

Our study found that an EHR-based algorithm is accurate in the mass identification of patients with undiagnosed prediabetes. Our algorithm showed high PPV and NPV, confirming its reliability as a tool. We used different variables, such as ICD-9/ICD-10 diagnostic codes, HbA1c laboratory value, and medication use (metformin, oral, and non-oral hypoglycemic agents), to accurately classify a patient as having prediabetes or diabetes. According to Wilke et al., the use of more than one parameter for patient identification when using an algorithm increases the PPV.29

The strengths of our study are the large sample size, the multicomponent algorithm that was tested, the access to complete medical records for validation, and the rigorous validation protocol. Nevertheless, our study had several limitations that need to be mentioned. Our sample included mostly male veterans receiving care at MVAHS. We did not include comorbidities other than those related to diabetes, such as macrovascular and microvascular complications. The generalizability of the algorithm to other sites is unknown; however, any practice with an at-risk population should have a similar amount of data and a captive population. We included patients who had an HbA1c level obtained during a two-week period; however, given our weekly reports from this algorithm, the prevalence of abnormal values during this period was likely to be similar to that found during any other period. We validated the results in only a sample of all the patients; however, we randomly selected 25 percent of a large population for this sample.

Prevention programs, such as the Diabetes Prevention Program (DPP), have shown that the development of T2DM can be delayed or prevented with weight reduction and metformin use.30 However, health systems have no systemwide mechanisms to identify patients with prediabetes in order to deploy such strategies. A study conducted among National Health and Nutrition Examination Survey (NHANES) respondents revealed that only 11 percent of individuals with HbA1c values in the prediabetes range were aware of having this condition.31 To date, we have relied on each individual provider to test patients and adhere to diabetes prevention guidelines by referring patients for weight loss and nutritional interventions or by prescribing metformin.32 Although such interventions have been found effective in clinical trials,33 the identification, referral, uptake, and effectiveness of the clinic-based approach have been limited in real-world settings. The reasons are multifactorial but can be summarized as a mismatch between the time and resources needed to identify at-risk patients and engage them in lifestyle changes to modify their health behaviors, and the time and resources that are available during the clinic visit.34

In the VA Healthcare System, many options are available for patients at risk of diabetes, such as education, nutrition counseling, medication, and prevention programs such as the weight-management health promotion program MOVE!35–37 However, only 30 percent of patients at risk received one of these interventions. The VHA has previously reported that participation in lifestyle modification programs such as MOVE! is low; the program is used by only 2 percent of the VHA outpatient population (more than 5.5 million patients).38 Older age, physical comorbidities, low literacy levels, a higher prevalence of homelessness, and mental health disorders are common among veterans and make it harder to implement weight-loss programs in this population. MOVE! was used less frequently in medical centers with higher rates of housing instability and lower rates of obesogenic psychiatric drug prescriptions, and veterans living farther from a VHA hospital are less likely to attend the program because of proximity-related barriers.39 A recent evaluation of a VHA adaptation of a DPP intervention revealed that the main barrier to dissemination was reaching those who could benefit.40 In that adaptation, prediabetes screening consisted of offering an HbA1c test to those who attended a MOVE! orientation session, which limited the reach of the intervention.41 In that population of MOVE! participants, the prevalence of prediabetes ranged from 22 percent to 31 percent, which is consistent with our finding of 23 percent among patients who had an HbA1c test ordered by a provider in response to the VA recommendation to check HbA1c levels among veterans aged 45 years or older with a BMI of 25 kg/m2 or higher.42 The advantage of the population-based approach is that a computer-driven algorithm can assess the glycemic health of hundreds of patients per month, classifying each patient as having normal, prediabetes, or diabetes glycemic status.

Researchers, health plan administrators, and policy makers have used EHRs and databases, such as the Medicare database, to identify patients at risk of chronic conditions.43–46 The advantage of claims-based algorithms is that they can be used for a variety of different outcomes.47 Specifically for diabetes, strategies using EHR data have been successfully used for patient identification and stratification.48–52 Extending these strategies to patients at higher risk of diabetes may help providers refer patients to nutritional or other existing services and, in the case of the VA, may increase the reach of programs such as MOVE! or community DPP-based intervention programs.53 Gopalan et al. found that prediabetes-aware individuals in the NHANES database were 50 percent more likely to engage in physical activity and a healthy diet than those who were unaware of their prediabetic status.54 This finding suggests that a population health strategy aimed at increasing identification and awareness of the condition may lead to greater adherence to lifestyle change recommendations. Healthcare providers in capitated systems that are evaluated and reimbursed based on their quality metrics should be particularly interested in testing such an approach.55

In conclusion, our algorithm accurately mass-identifies veterans with undiagnosed prediabetes as well as those with diagnosed but poorly controlled diabetes. Identifying patients at risk is the first step to implement prevention interventions on a larger scale. Future studies should provide evidence that using a technology-based population health approach can both identify patients at risk of diabetes and effectively engage them in preventive care strategies to delay the onset of diabetes.

Ana M. Palacio, MD, MPH, is an assistant professor of medicine in the Division of Population Health and Computational Medicine at the University of Miami Miller School of Medicine in Miami, FL.

Denisse Pareja, MD, MSPH, is a research associate in the Geriatric Research Education and Clinical Center at the Miami Veterans Affairs Medical Center in Miami, FL.

Willy Valencia-Rodrigo, MD, is an endocrinology and geriatrics physician affiliated with the Miami VA Healthcare System in Miami, FL.

Jason R. Dahn, PhD, is a staff psychologist and health behavior coordinator at the Miami VA Healthcare System in Miami, FL.

Notes

  1. Centers for Disease Control and Prevention. “National Diabetes Statistics Report, 2017.” Available at https://www.cdc.gov/features/diabetes-statistic-report/index.html.
  2. American Diabetes Association. “Statistics about Diabetes.” Available at http://www.diabetes.org/diabetes-basics/statistics/.
  3. Florida Health Care Coalition. Florida Type 2 Diabetes Report 2015. Available at http://www.flhcc.org/uploads/Resources/FL%20T2%20Diabetes%202015%20Report.pdf.
  4. US Department of Veterans Affairs and Department of Defense. VA/DoD Clinical Practice Guideline: Screening and Management of Overweight and Obesity: Guideline Summary 2014. Available at https://www.healthquality.va.gov/guidelines/CD/obesity/OBESUMC20150106.pdf.
  5. National Institutes of Health. “A Decade Later, Lifestyle Changes or Metformin Still Lower Type 2 Diabetes Risk.” Available at https://www.nih.gov/news-events/news-releases/decade-later-lifestyle-changes-or-metformin-still-lower-type-2-diabetes-risk. Accessed May 27, 2018.
  6. Tuso, P. “Prediabetes and Lifestyle Modification: Time to Prevent a Preventable Disease.” Permanente Journal 18, no. 3 (2014): 88–93.
  7. American Diabetes Association. “Economic Costs of Diabetes in the U.S. in 2017.” Diabetes Care 41, no. 5 (2018): 917–28. doi: 10.2337/dci18-0007.
  8. National Institute of Diabetes and Digestive and Kidney Diseases. “Game Plan for Preventing Type 2 Diabetes: Prediabetes Screening: How and Why.” Available at https://www.niddk.nih.gov/health-information/health-communication-programs/ndep/health-care-professionals/game-plan/Pages/index.aspx.
  9. Poissant, L., J. Pereira, R. Tamblyn, and Y. Kawasumi. “The Impact of Electronic Health Records on Time Efficiency of Physicians and Nurses: A Systematic Review.” Journal of the American Medical Informatics Association 12, no. 5 (2005): 505–16.
  10. Lee, J. Y. “Uses of Clinical Databases.” American Journal of the Medical Sciences 308, no. 1 (1994): 58–62.
  11. Mazzali, C., and P. Duca. “Use of Administrative Data in Healthcare Research.” Internal and Emergency Medicine 10 (2015): 517.
  12. Ibid.
  13. Biltaji, E., C. Tak, J. Ma, N. Ruiz-Negron, and B. K. Bellows. “Using Electronic Medical Records to Assess the Effectiveness of Pharmacotherapy in Pain: A Review of Recent Observational Studies.” Journal of Pain & Palliative Care Pharmacotherapy 30, no. 3 (2016): 210–17.
  14. Rimland, J. M., I. Abraha, M. L. Luchetta, et al. “Validation of Chronic Obstructive Pulmonary Disease (COPD) Diagnoses in Healthcare Databases: A Systematic Review Protocol.” BMJ Open 6 (2016): e011777.
  15. Mazzali, C., and P. Duca. “Use of Administrative Data in Healthcare Research.”
  16. Goldacre, M., L. Kurina, D. Yeates, V. Seagroatt, and L. Gill. “Use of Large Medical Databases to Study Associations between Diseases.” QJM 93, no. 10 (2000): 669–75.
  17. Black, N., and M. Payne. “Directory of Clinical Databases: Improving and Promoting Their Use.” Quality & Safety in Health Care 12, no. 5 (2003): 348–52.
  18. Ibid.
  19. McPheeters, M. L., N. A. Sathe, R. N. Jerome, and R. M. Carnahan. “Methods for Systematic Reviews of Administrative Database Studies Capturing Health Outcomes of Interest.” Vaccine 31, suppl. 10 (2013): K2–K6.
  20. Klompas, M., E. Eggleston, J. McVetta, R. Lazarus, L. Li, and R. Platt. “Automated Detection and Classification of Type 1 Versus Type 2 Diabetes Using Electronic Health Record Data.” Diabetes Care 36, no. 4 (2013): 914–21.
  21. Makam, A. N., O. K. Nguyen, B. Moore, Y. Ma, and R. Amarasingham. “Identifying Patients with Diabetes and the Earliest Date of Diagnosis in Real Time: An Electronic Health Record Case-finding Algorithm.” BMC Medical Informatics and Decision Making 13 (2013): 81.
  22. American Diabetes Association. Standards of Medical Care in Diabetes 2016. Available at http://care.diabetesjournals.org/content/suppl/2015/12/21/39.Supplement_1.DC2/2016-Standards-of-Care.pdf.
  23. Del Re, A. C., M. L. Maciejewski, and A. H. Harris. “MOVE: Weight Management Program across the Veterans Health Administration: Patient- and Facility-level Predictors of Utilization.” BMC Health Services Research 13 (2013): 511.
  24. US Department of Veterans Affairs. “Miami VA Healthcare System: About the Miami VA Healthcare System.” Available at https://www.miami.va.gov/about/index.asp.
  25. US Department of Veterans Affairs and Department of Defense. VA/DoD Clinical Practice Guideline: Management of Diabetes Mellitus (DM). 2010. Available at http://www.healthquality.va.gov/guidelines/CD/diabetes/DM2010_FUL-v4e.pdf.
  26. National Institute of Diabetes and Digestive and Kidney Diseases. “Diabetes Tests & Diagnosis.” Available at https://www.niddk.nih.gov/health-information/health-topics/Diabetes/diagnosis-diabetes-prediabetes/Pages/index.aspx.
  27. Diabetes Prevention Program Research Group. “Reduction in the Incidence of Type 2 Diabetes with Lifestyle Intervention or Metformin.” New England Journal of Medicine 346, no. 6 (2002): 393–403.
  28. Sacks, D. B., M. Arnold, G. L. Bakris, D. E. Bruns, A. R. Horvath, M. S. Kirkman, A. Lernmark, B. E. Metzger, and D. M. Nathan. “Guidelines and Recommendations for Laboratory Analysis in the Diagnosis and Management of Diabetes Mellitus.” Clinical Chemistry 57, no. 6 (2011): e1–e47.
  29. Wilke, R. A., R. L. Berg, P. Peissig, et al. “Use of an Electronic Medical Record for the Identification of Research Subjects with Diabetes Mellitus.” Clinical Medicine and Research 5, no. 1 (2007): 1–7.
  30. National Institute of Diabetes and Digestive and Kidney Diseases. “Diabetes Prevention Program (DPP).” Available at https://www.niddk.nih.gov/about-niddk/research-areas/diabetes/diabetes-prevention-program-dpp/Pages/default.aspx.
  31. Gopalan, A., I. S. Lorincz, C. Wirtalla, S. C. Marcus, and J. A. Long. “Awareness of Prediabetes and Engagement in Diabetes Risk-reducing Behaviors.” American Journal of Preventive Medicine 49, no. 4 (2015): 512–59.
  32. Damschroder, L. J., D. C. Aron, R. E. Keith, S. R. Kirsh, J. A. Alexander, and J. C. Lowery. “Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science.” Implementation Science 4 (2009): 50.
  33. National Institute of Diabetes and Digestive and Kidney Diseases. “Diabetes Prevention Program (DPP).”
  34. Damschroder, L. J., D. C. Aron, R. E. Keith, S. R. Kirsh, J. A. Alexander, and J. C. Lowery. “Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science.”
  35. Noël, P. H., C. P. Wang, M. J. Bollinger, M. J. Pugh, L. A. Copeland, J. Tsevat, et al. “Intensity and Duration of Obesity-related Counseling: Association with 5-Year BMI Trends among Obese Primary Care Patients.” Obesity 20, no. 4 (2012): 773–82.
  36. US Department of Veterans Affairs. “MOVE! Weight Management Program.” Available at https://www.move.va.gov/.
  37. Moin, T., L. J. Damschroder, M. AuYoung, M. L. Maciejewski, S. K. Datta, J. E. Weinreb, et al. “Diabetes Prevention Program Translation in the Veterans Health Administration.” American Journal of Preventive Medicine 53, no. 1 (2017): 70–77.
  38. Del Re, A. C., M. L. Maciejewski, and A. H. Harris. “MOVE: Weight Management Program across the Veterans Health Administration: Patient- and Facility-level Predictors of Utilization.”
  39. Ibid.
  40. Damschroder, L. J., D. C. Aron, R. E. Keith, S. R. Kirsh, J. A. Alexander, and J. C. Lowery. “Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science.”
  41. Moin, T., L. J. Damschroder, B. Youles, F. Makki, C. Billington, W. Yancy, et al. “Implementation of a Prediabetes Identification Algorithm for Overweight and Obese Veterans.” Journal of Rehabilitation Research and Development 53, no. 6 (2016): 853–62.
  42. US Department of Veterans Affairs and Department of Defense. VA/DoD Clinical Practice Guideline: Management of Diabetes Mellitus (DM).
  43. Wilke, R. A., R. L. Berg, P. Peissig, et al. “Use of an Electronic Medical Record for the Identification of Research Subjects with Diabetes Mellitus.”
  44. Thacker, E. L., P. Muntner, H. Zhao, et al. “Claims-based Algorithms for Identifying Medicare Beneficiaries at High Estimated Risk for Coronary Heart Disease Events: A Cross-sectional Study.” BMC Health Services Research 14 (2014): 195.
  45. Ludvigsson, J. F., J. Pathak, S. Murphy, et al. “Use of Computerized Algorithm to Identify Individuals in Need of Testing for Celiac Disease.” Journal of the American Medical Informatics Association 20, no. e2 (2013): e306–e310.
  46. Manaktala, S., and S. R. Claypool. “Evaluating the Impact of a Computerized Surveillance Algorithm and Decision Support System on Sepsis Mortality.” Journal of the American Medical Informatics Association 24, no. 1 (2017): 88–95.
  47. Mazzali, C., and P. Duca. “Use of Administrative Data in Healthcare Research.”
  48. Klompas, M., E. Eggleston, J. McVetta, R. Lazarus, L. Li, and R. Platt. “Automated Detection and Classification of Type 1 Versus Type 2 Diabetes Using Electronic Health Record Data.”
  49. Makam, A. N., O. K. Nguyen, B. Moore, Y. Ma, and R. Amarasingham. “Identifying Patients with Diabetes and the Earliest Date of Diagnosis in Real Time: An Electronic Health Record Case-finding Algorithm.”
  50. Wilke, R. A., R. L. Berg, P. Peissig, et al. “Use of an Electronic Medical Record for the Identification of Research Subjects with Diabetes Mellitus.”
  51. Zgibor, J. C., T. J. Orchard, M. Saul, G. Piatt, K. Ruppert, A. Stewart, and L. M. Siminerio. “Developing and Validating a Diabetes Database in a Large Health System.” Diabetes Research and Clinical Practice 75, no. 3 (2007): 313–39.
  52. Sharma, M., I. Petersen, I. Nazareth, and S. J. Coton. “An Algorithm for Identification and Classification of Individuals with Type 1 and Type 2 Diabetes Mellitus in a Large Primary Care Database.” Clinical Epidemiology 8 (2016): 373–80.
  53. Moin, T., L. J. Damschroder, M. AuYoung, M. L. Maciejewski, S. K. Datta, J. E. Weinreb, et al. “Diabetes Prevention Program Translation in the Veterans Health Administration.”
  54. Gopalan, A., I. S. Lorincz, C. Wirtalla, S. C. Marcus, and J. A. Long. “Awareness of Prediabetes and Engagement in Diabetes Risk-reducing Behaviors.”
  55. Hanchak, N. A., N. Schlackman, and S. Harmon-Weiss. “U.S. Healthcare’s Quality-based Compensation Model.” Health Care Financing Review 17, no. 3 (1996): 143–59.


Ana M. Palacio, MD, MPH; Denisse Pareja, MD, MPH; Willy Valencia-Rodrigo, MD; and Jason R. Dahn, PhD. “Prediabetes Case Identification: Accuracy of an Automated Electronic Health Record Algorithm.” Perspectives in Health Information Management (Summer 2018): 1-15.
