Abstract
This study explored possible success factors for health information management certification exams. Based on the American Health Information Management Association (AHIMA) website, in 2018 and 2019, only 70 percent of first-time test-takers passed the Registered Health Information Administrator (RHIA) exam; 26 percent passed the Certified Health Data Analyst (CHDA) exam in 2018; and only 10 percent passed the CHDA exam in 2019. A quantitative systematic review and meta-analysis offered insight into factors related to passing certification exams. Sources included existing, relevant peer-reviewed literature published since 1990, identified within 87 educational and health/medicine databases and 62 other article and journal databases available at the University of South Dakota library. Outcomes from the systematic review include identification of factors for passing health information management, healthcare, and education certification exams. Ultimately, this new information may help improve pass rates on certification exams.
Keywords: certification exam success factors, registered health information administrator, certified health data analyst, health information management
Introduction
During the past 10 years, implementation of electronic health records (EHRs) has dramatically changed how patients experience care, how healthcare professionals document and exchange patient information, and how health information is managed and maintained. Further, challenges such as data quality, interoperability, and usability continue to provide opportunities for improvement in healthcare work and decision-making.1-5 According to Cyganek et al.,6 “electronic health records should be considered as one of the most complex data objects in the information processing industry.” In 2018, only 63 percent of new graduates eligible for the Registered Health Information Administrator (RHIA) entry-level exam sat for the exam and passed, and only 26 percent of first-time test-takers passed the Certified Health Data Analyst (CHDA) exam.7,8
Considering the educational risk and cost, along with the expected workforce shortage, it is concerning that few people attempt the certification exams and that college graduates are frequently unable to pass competency exams on the first attempt. This motivated the present investigation. The following systematic review covers empirically demonstrated factors associated with successful completion of certification exams in healthcare and education professions. These candidate factors are consistent with suggestions for further study found in the literature, with the goal of applying the findings to additional, related health information management (HIM) certification exams in future research.
Certification exams are utilized to assess knowledge and proficiency in HIM after completion of relevant education.9 Many careers require certification and ongoing maintenance to ensure the currency of practitioner skills. A credential is commonly awarded to an individual who has passed a certification exam, with a requirement to complete continuing education on an ongoing basis. The credential signifies valuable knowledge in the medical workplace,10 and credentials are routinely held by professionals in healthcare settings.
Background
Maintaining a sufficient number of well-trained workers is problematic in the healthcare information industry because professionals are expected to frequently learn new technologies and to synthesize larger amounts of data.11 Jobs are evolving as workers develop those skills in preparation for new roles, which commonly require additional credentials. Factors that predict passing scores on certification exams in the HIM profession are not well understood at present, as evidenced by a lack of research studies.
First-time test-taker pass rates for the RHIA exam averaged 72.4 percent for the five-year period of 2014-2018,12,13 with pass rates ranging between 69 percent and 75.8 percent (Table 1). The same time period shows an average pass rate of 48.4 percent for the CHDA exam, with a pass rate range of 26 percent to 61.2 percent (Table 2).14,15 In 2018, first-time test-taker pass rates for the RHIA and CHDA certification exams were 71 percent and 26 percent, respectively.16,17 That year, 63 percent of new graduates eligible for the RHIA entry-level exam attempted and passed the exam, 71 percent of 1,129 first-time test-takers passed the exam, and only 26 percent of first-time test-takers passed the CHDA exam.18,19 Students, professionals, and faculty may benefit from understanding factors that improve pass rates on the exams and from addressing any concerns identified. The study described here identifies and synthesizes current, relevant theoretical and empirical research to determine which factors predicting success on healthcare and education certification exams have thus far been identified.
Literature Review
This section includes an exploration of published literature regarding success factors for completion of HIM, healthcare, and education professional certification exams. The University of South Dakota (USD) Library was utilized to access results from education, healthcare/medicine, and ProQuest databases. Keywords used for searches were “certification,” “exam,” and “success factors.” Existing theoretical models, frameworks, and success factors related to students and educational programs were identified in the literature, along with critical gaps.
The theoretical model informing this study was the Community of Inquiry (COI) model (Figure 1), in which students, faculty, and content intersect to reflect the conceptual space defining the highest level of educational experience.20 Certification exam scores indicate student levels of learning on the topics being assessed. The next section discusses important predictors of success on HIM certification exams described in the literature.
Dolezel and McLeod21 found that a student's cumulative grade point average (GPA), HIM course grades, and online course delivery predicted success in passing the RHIA credentialing exam. More than 80 percent of online students passed the certification exam on the first attempt, compared to less than 60 percent of campus-based students. McNeill and Brockmeier22 attempted to predict passing using several educational program features, including total program expenditures; student-to-faculty ratio; faculty degrees earned; teaching experience; course didactic, laboratory, and professional practice hours; comprehensive examination requirement; and student mean cumulative college GPA on admission. However, the authors included only two years of data and reported no statistically significant relationships between any of the hypothesized predictors and the pass rate. Condon and Barefield23 found that a post-baccalaureate certificate program is an alternate method to address the workforce shortage.
Key insights on barriers to passing the Registered Health Information Technician (RHIT) credentialing exam may also apply to the RHIA credentialing exam. Ellis24 found that a lack of confidence in test-taking and in knowledge of exam content may play a part in eligible students not attempting certification exams. Farroll25 noted that grit and deliberate practice play significant parts in predicting which students persevere on credentialing exams. Preparation for certification exams included repeated experience completing comprehensive exams using several types of exam prep tools, as well as other factors noted in the literature from related healthcare and education professions.
The RHIT exam is similar in type and format to the RHIA exam. With the integration of technology, and considering that credentialing exams are administered online, a prototype for a mobile application to better prepare students was developed.26 However, no studies of student results utilizing the mobile application appear to have been reported. Only a few related, completed studies were specific to the RHIA exam, and none were identified for the CHDA credentialing exam. The Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM) requires educational program directors to monitor and report student success rates and pass rates on the corresponding credentialing exams as a part of the CAHIIM accreditation process. In 2018, 63 percent of new graduates eligible for the RHIA entry-level exam attempted and passed, 71 percent of first-time test-takers passed, and only 26 percent of first-time test-takers passed the CHDA exam.27,28
Noblin, Walden, and Safian29 reported mixed results, as only half of the examinees passed the RHIA exam after waiting an average of seven months between the completion of a prep course and taking the exam. The prep course workshops were sponsored by the American Health Information Management Association (AHIMA) and were offered four different times throughout the study. Interestingly, Noblin, Walden, and Safian30 reported that some of the test-takers passing the exam had earned at least one additional credential prior to passing the RHIA exam. McNeill31 reported that students graduating from associate degree HIM programs requiring a comprehensive exam at the end of coursework were more likely to pass the certification exam.
Other healthcare and education certification examination literature provided insight into additional variables that could be studied to determine their applicability to the HIM profession. Highlights from the studies are described below in the following order: medical licensing/residency program; physical therapy assistant; nurse anesthetist; athletic trainer; registered nursing; internal medicine; medical technologists; prosthetics; nursing wound, ostomy and continence certification; paramedic; and National Asthma Education Certification Board. Findings from the medical licensing/residency program studies are presented below.
According to Gohara, Shapiro, Jacob, Khuder, Gandy, Metting, Gold, and Kleshinski,32 and Gullo, McCarthy, Shapiro, and Miller,33 performance in required medical school courses predicted success on medical licensing exams better than pre-admission models using the Medical College Admission Test (MCAT). An extensive review of the literature shows that it is critical to understand factors that help students on first-time exam attempts, such as practice exams or other high-stakes exams. The in-service examination was used by most residency programs as a measure of progress during residency training. Bedno, Soltis, Mancuso, Burnett, and Mallon34 found that “the in-service examination was a way to assess the program and resident likelihood of success on the board certification exam” (p. 641). There was a moderate correlation between student performance on the in-service exam and performance on the board certification exam. Student GPAs also were important factors, with a significant correlation to passing the certification exam. Adding to the body of research, Schmitz and Bailey35 recommend a larger study to validate findings that periodic in-training exams predict certification results. Insights from the physical therapy assistant literature are highlighted next, followed by findings for the Certified Registered Nurse Anesthetist (CRNA) exam and the Certified Athletic Trainer exam.
Completion of prior high-stakes exams improved licensure exam scores. Findings were consistent with the National Physical Therapy Exam for Physical Therapist Assistants, for which Schengel36 found correlations between grades received in anatomy and physiology, as well as overall GPA, and the score received on the national exam. Hoversten37 tested four variables to predict success on the CRNA exam. Only age reliably predicted passing scores, showing that the younger the student, the higher the score on the first attempt of the exam. Other variables that failed to predict success on the certification exam included the setting for prior work experience and the number of years of experience working in critical care. According to Bruce,38 variables that predicted success on the Certified Athletic Trainer exam included GPA, GRE performance, and completion of a calculus course. Other predictors studied by Bruce included the following: undergraduate GPA; percentile rank on GRE verbal, quantitative, and analytic writing scores; Biderman's formula; the Basic Carnegie Classification; undergraduate institutional setting (private or public); the academic profile of undergraduate institutions; whether the student completed higher-level science, math, and advanced athletic training coursework during undergraduate education; and the student residency class. One very strong predictor of success identified by Bruce39 was the Biderman's Formula Score, which incorporated the GPA and GRE scores. Individual predictors, including GPA, GRE performance, and completion of an undergraduate calculus class, were positive indicators for successfully passing the certification exams. According to Krieger, Thomas, Banaszak, and Schlabach,40 a student's cumulative athletic training program GPA and self-efficacy were strong positive predictors of success on the credentialing exam for athletic trainers. The cognitive development variable did not show a correlation with passing the exam. The test anxiety variable showed an inverse relationship, making test anxiety important for further study. In a later study, Farashah, Thomas, and Blomquist41 found self-efficacy to be a strong predictor of success on project management certification exams. Registered nursing certification exam findings and results from the American Board of Internal Medicine certification exam literature are described below.
Beeman and Waterhouse42 successfully identified variables that predicted passing the National Council Licensure Examination for Registered Nurses (NCLEX-RN). Seven significant predictors were identified, including the number of grades of C+ or lower in nursing foundation, theory, and application courses. Atsawarungruangkit43 studied pass rates for the American Board of Internal Medicine certification exam using 69 residency program factors. This exam is high stakes and is offered only once a year, so being prepared to take the exam is important. Most of the residency program characteristics studied were specific to residency training and do not merit inclusion here for comparison. The characteristics that may be beneficial to the present comparisons included program size and type, percentage of full-time female faculty, faculty-to-student ratio, percentage of females, weekly number of hours worked, average hours per week of lectures, formal mentoring program availability, a formal program in place to foster teamwork, continuous quality improvement training, additional training beyond the accreditation-required length, the student evaluation system, and having a process in place to assess graduation rates and performance scores. One region reported statistically significantly higher pass rates than the others. However, the program location variable was discarded because no data other than pass rates were reported. Three additional significant predictors were the faculty-to-student ratio, availability of a mentoring program, and competitiveness among students. The significance of GPAs in the medical technologist exam and significant results in prosthetic certification exams are described next.
Predictors for passing the MT(ASCP) Board of Registry Examination and the CLS (NCA) examination for medical technologists were the final GPA and the program comprehensive exam score.44 Miro sought to determine whether relationships existed between ABC prosthetics certification pass or fail rates and seven variables, including written multiple-choice exams, written simulation exams, clinical patient management exams, gender, Carnegie ranking of the educational institution, and use of an extending credential. Credential extension occurs when the certified orthotist credential has been obtained and the certified prosthetist (CP) credential is then added to the professional credentials.45 Credential extension was the only variable in the study that was significantly associated with passing the ABC prosthetics certification. For the nursing wound, ostomy, and continence certification exam, Beitz46 studied GPA, four course scores, three self-assessment scores, and a comprehensive exam score and found the results consistent with other certification exams, for which the entry-level GPA and grades in coursework are the strongest predictors of successfully passing the certification exam. One successful method for exam preparation found in online environments was the use of virtual, facilitated study groups.47 Erickson48 identified an important point: Students struggle to pass certification exams when requirements change and course material fails to cover the content of the new requirements in a timely manner. With the importance of high-quality content comes the discussion of the importance of assessing the quality of program faculty.
The National Asthma Educator Certification Board Inc. (NAECB) evaluates the professional competence of asthma educators and may be a useful model to benchmark in the future when assessing improvement in pass rates in the HIM profession.49 Shaw, Gordon, Howard, Maldonado-Daniels, McClain, and Pehrsson50 evaluated differences between respiratory therapy education programs with the best and worst outcomes in order to identify factors that predict success on certification exams. Weak program outcomes were found to indicate a misalignment of curriculum requirements with exam content. According to the COI model, program requirements for curriculum content should be aligned with exam content, and faculty should be competent to teach the required content. Theoretically speaking, the COI model also requires that students be fully engaged in learning the content. The paramedic certification exam study and one additional study note the importance of optimal preparation of students for certification exams.
For the paramedic certification exam, Fernandez, Studnek, and Cone51 found that programs at least 1.6 years long better prepared students to pass the certification exam. Jenkins, Greene, Moore, and Putnam52 identified a correlation between student scores on a mock exam and the national certification exam at one university in the southern United States. Because the time gap between taking the mock exam and the certification exam was negatively correlated with performance, students were encouraged to attempt certification exams soon after completing coursework and the mock exams. Not all students, however, may have the opportunity to complete a mock exam prior to taking the certification exam.
This literature review provided valuable information about factors related to passing HIM and related healthcare and education credentialing exams. GPA commonly predicted success on summative certification exams across several disciplines.53-62 In addition, other important factors have been tested for related healthcare certifications. Success factors identified in the HIM literature for the RHIA exam were similar to those for credentialing exams in other disciplines, but the list is much smaller and included the following student factors: cumulative GPA and HIM course grades. Education program factors for success were online course delivery and use of mock exams. Based on comparisons of the healthcare success factors with the HIM factors that have been tested, several variables have not yet been tested as predictors of success on the RHIA and CHDA credentialing exams being studied.
Methodology
A multi-step systematic review process was used to identify empirical and theoretical studies in peer-reviewed publications. Prior to searching the literature, a strategy and methods protocol for the review was created to increase the validity and merit of the research process, to reduce the risk of bias, to promote completion of a systematic process, and to improve the reliability and usefulness of the review by others.63 The key objective was to conduct a comprehensive search of the literature to identify sources that contained success factors for passing certification exams that might be generalized to HIM certification exam pass rates. Qualitative peer-reviewed literature that contained evidence-based education or healthcare certification exam success factors was included. The initial search of the literature included key education and health/medicine databases: Education Research Complete, the Education Resources Information Center (ERIC), MAS Ultra School Edition, PsycINFO, SocINDEX, CINAHL Complete, Health Source, Medline, PubMed Central, and SAGE Journals. ProQuest theses and dissertations were also included, as were hand-searched HIM peer-reviewed professional journals and internet publications. A quantitative meta-analysis approach was used. Quantitative research designs using logistic regression to identify factors that predict passing a certification exam were used in many of the studies referenced here. Results were synthesized in a detailed matrix that identified gaps in the literature.
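As context for readers less familiar with this analytic approach, the short sketch below illustrates the general form of logistic regression used in many of the cited studies: modeling a pass/fail outcome as a function of candidate predictors. It is a minimal illustration only, not the analysis from any cited study; the predictors (cumulative GPA, online delivery, mock exam completion), the data values, and the use of the scikit-learn library are assumptions made for the example.

```python
# Minimal, hypothetical sketch of logistic regression on pass/fail outcomes.
# Values and predictors are illustrative only and do not come from any study.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical predictors: cumulative GPA, online delivery (1/0), mock exam taken (1/0).
X = np.array([
    [3.8, 1, 1],
    [2.9, 0, 0],
    [3.4, 1, 1],
    [2.5, 0, 1],
    [3.9, 1, 0],
    [3.1, 0, 1],
    [3.5, 1, 0],
    [2.8, 0, 1],
])
y = np.array([1, 0, 1, 0, 1, 1, 0, 1])  # 1 = passed on first attempt

model = LogisticRegression().fit(X, y)

# Exponentiated coefficients give approximate odds ratios for each factor
# (approximate because scikit-learn applies regularization by default).
for name, coef in zip(["cumulative GPA", "online delivery", "mock exam"], model.coef_[0]):
    print(f"{name}: odds ratio approx. {np.exp(coef):.2f}")
```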
Data Collection
A comprehensive search strategy was used. This systematic process searched keywords such as “credential exam success factors” within the subject headings of education and health/medicine. Literature was excluded if it was not related to healthcare or education disciplines, was not available in English, was published prior to 1990, or was a duplicate. Initially, the title of each item was screened for relevance, and non-relevant articles were excluded from further review. Next, the abstracts were briefly reviewed to assess whether each study addressed factors for success on a certification exam. Articles not meeting the criteria were excluded from further review without being tracked further. A flowchart of the selection process was created utilizing the PRISMA flowchart method (Figure 2) to enable process replication.64,65
Abstracts that aligned with the study intent were entered into a matrix66 with a chronological number assigned to each item. The matrix contained the article number, citation, abstract, positive or negative factors identified, certification exam category, and a ranking for each item. A ranking of 1 indicated that the item aligned well with this review, while items with rankings of 2 or 3 were not utilized here. Finally, the matrix was sorted using the ranking column. Items with a ranking of 1 were used to create a detailed matrix. Columns in the detailed matrix included the author-assigned article number, discipline or profession, positive variables, negative variables, certification exam, and categorization of entry-level or advanced practice exam type for data analysis.
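A brief sketch of this screening-matrix step is shown below for illustration. The article numbers, citations, categories, and rankings are invented, and the use of the pandas library is simply a tooling choice assumed for the example; the point is only to show abstracts being ranked and rank-1 items carried forward into the detailed matrix.

```python
# Hypothetical screening matrix; all entries and rankings are illustrative only.
import pandas as pd

screening = pd.DataFrame([
    {"article_no": 1, "citation": "Author A (2015)", "exam_category": "RHIA", "ranking": 1},
    {"article_no": 2, "citation": "Author B (2012)", "exam_category": "Nursing", "ranking": 3},
    {"article_no": 3, "citation": "Author C (2016)", "exam_category": "Athletic training", "ranking": 1},
    {"article_no": 4, "citation": "Author D (2009)", "exam_category": "Paramedic", "ranking": 2},
])

# Sort by the ranking column, then keep only rank-1 items for the detailed matrix.
ranked = screening.sort_values("ranking")
detailed = ranked[ranked["ranking"] == 1]
print(detailed[["article_no", "citation", "exam_category"]])
```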
Data Analysis
After the 42 articles included for review were separated into categories by discipline, a listing of positive and negative statistically significant findings was compiled. Each variable was then further categorized as a student-level or program-level variable.
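A minimal sketch of this categorization step appears below. The handful of findings shown are drawn from variables named elsewhere in this review and are illustrative rather than the complete listing compiled for the study.

```python
# Illustrative subset of significant findings grouped into student-level and
# program-level variables, mirroring the structure of Tables 3 and 4.
findings = [
    ("cumulative GPA", "positive", "student"),
    ("test anxiety", "negative", "student"),
    ("mock or comprehensive exams", "positive", "program"),
    ("online program delivery", "positive", "program"),
]

by_level = {}
for variable, direction, level in findings:
    by_level.setdefault(level, []).append((variable, direction))

for level, items in sorted(by_level.items()):
    print(f"{level}-level variables: {items}")
```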
Results
The listing below contains student-level (Table 3) and program-level (Table 4) variables that have been found to predict passing credentialing exams in several other healthcare professions, most commonly related to clinician credentials. Student factors for passing certification exams included admission GPA, cumulative GPA, overall GPA, course grades, completion of preparatory courses, age, grit, deliberate practice, participation in a mentoring program, and GRE performance. The following education program factors have been recognized as strongly associated with certification success: online program delivery, periodic testing, mock or comprehensive exams, availability of preparatory courses, availability of formal student mentoring programs, and credential extension.
Many studies have been completed in healthcare and education disciplines to identify factors for improving pass rates on high-stakes certification exams. These studies focus primarily on student, faculty, or program variables and identify an almost equal number of positive and negative factors for success. Few published HIM studies were found during the literature review to inform strategies for increasing pass rates. The HIM profession publishes practice briefs created from HIM professional consensus in addition to research articles. Advancement of the profession depends on developing a larger number of researchers contributing to the growth of the HIM body of knowledge and on developing professionals willing to undertake education to acquire new skill sets to meet evolving workforce needs. The timing for development of doctoral degrees in the HIM profession is both optimal and critical to success.
This systematic review provides other healthcare and education disciplines an opportunity to review results from several fields holistically. It identifies factors for further study and suggests strategies for improving certification pass rates. The review is informed by many individual studies, few of which are systematic reviews.
The Health Information Management Education Conceptual Framework67 was identified during the systematic review screening process. The framework describes four components for achieving success on the RHIA certification exam, including curriculum, students, faculty, and resources.
This study identifies several important factors for successfully passing certification exams that have not been studied in the HIM field. HIM study results indicate positive relationships of student cumulative GPA, overall GPA, online course delivery, and completion of a mock exam with passing the RHIA exam.68,69 The prosthetics certification70,71 and National Asthma Educator Certification72 studies show positive results for having a prior credential. The RHIA study that follows will analyze whether having a prior credential in HIM is a positive factor in helping students pass the exam.
Future research should explore the gaps in the HIM literature that indicate potential for empirical investigation, including periodic testing; comprehensive exam scores; completion of prep courses; impact of failing one key major course; and student variables, including grit, age, credentialing extension, participation in formal mentoring programs, and the impact of test anxiety.
Research showing the number of academic programs utilizing a mock exam or prep course and supporting materials, and the correlation with corresponding pass rates, could inform strategies for all academic programs in HIM. Students in the academic setting are allowed to ask for Americans with Disabilities Act (ADA) accommodations, such as additional test-taking time, to better support student needs. Students who need the additional time or breaks may not be asking for the needed accommodation when sitting for the credentialing exam, even though the testing organization does allow testing accommodations. A study evaluating student awareness and use of ADA accommodations would help academic programs determine whether students need to be better informed of the availability of ADA accommodations during the exam.
Annual year-end exams could be utilized and studied to determine whether they increase student content retention and affect passing HIM credentialing exams or levels of test anxiety. Another important student support mechanism in the academic setting is the advisor. The advisor is an influential formal mentor while the student is taking classes, but that level of one-on-one mentorship may decline after a student graduates. Assessing how and whether students remain in contact with the academic program after graduation may reveal influences on the student's decision about when and whether to take the certification exam.
Suggested future research opportunities include the following: student skills and comfort with computers; financial impact of earning the credential versus not passing the exam; demographic and academic variables; comprehensive exams; effect of prior healthcare or related work experience; effect of having other certifications, such as the RHIT; time students wait to take the exam after graduation; longitudinal study to measure perceptions and validation of the profession; number of professional practice hours; perceived value of learning using asynchronous discussion posts; impact of working in an HIM-related function while attending school; demands on full-time faculty other than teaching; pre-admission standards; analysis of performance on individual questions on the credentialing exam to assess alignment with workforce needs and curriculum requirements; comparison between RHIT and RHIA program resources, faculty, and curriculum; analysis of decline in RHIA first-attempt numbers; meaningful analysis and comparisons between more variables; motivation for testing (e.g., job requirement); exam preparation methods; overall high school GPA; student perception of competency; assessment instruments used by programs to measure student learning; and mobile applications for test preparation.
With the vast amount of knowledge gained from other healthcare and education studies, it would be timely to re-evaluate pass rates for students attending school part-time, whose program completion time is extended, to understand how long pertinent content can reasonably be retained without sacrificing success on the exam. The combination of online delivery methodology and part-time student status should be studied to assess additional factors at play in passing certification exams (e.g., student engagement). Evaluating the types of professionals undertaking certification exams and their corresponding highest level of academic education, credentials held, years of experience in the field, and jobs performed would provide insight into first-time test-takers. Thus, the present study can begin to fill gaps in the literature related to successfully passing HIM, healthcare, and education certification exams.
Limitations
This study is limited to identifying success factors for passing certification exams in healthcare and education using peer-reviewed publications published between 1990 and the present and available through 87 education and health/medicine databases and 62 other article and journal databases at the University of South Dakota Libraries. There may be additional success factors that have been identified by other professions. Future strategies may include using a broader base of professions in the systematic review to identify additional success factors. Results of the systematic review can only be generalized to healthcare and education-related professions that utilize certification exams.
Notes
1. Gardner, Rebekah L., Emily Cooper, Jacqueline Haskell, Daniel A. Harris, Sara Poplau, Philip J. Kroth, and Mark Linzer. “Physician stress and burnout: the impact of health information technology.” Journal of the American Medical Informatics Association 26, no. 2 (2019): 106-114.
2. Liu, Caihua, Amir Talaei-Khoei, Didar Zowghi, and Jay Daniel. “Data completeness in healthcare: a literature survey.” Pacific Asia Journal of the Association for Information Systems 9, no. 2 (2017): 5.
3. Lowry, Svetlana Z., David Brick, Michael C. Gibbons, Paul Latkany, Svetlana Z. Lowry, Emily S. Patteron, Sandra Spickard Prettyman, Mala Ramaiah, Debora Simmons, and Sheryl Taylor. Technical evaluation, testing, and validation of the usability of electronic health records: empirically based use cases for validating safety-enhanced usability and guidelines for standardization. US Department of Commerce, National Institute of Standards and Technology, 2015.
4. Orlova, A. “Addressing Data, Information, and Record Quality Challenges Through Standards.” Journal of AHIMA 87, no. 10 (2016): 64.
5. Ratwani, Raj M., Erica Savage, Amy Will, Ryan Arnold, Saif Khairat, Kristen Miller, Rollin J. Fairbanks, Michael Hodgkins, and A. Zachary Hettinger. “A usability and safety analysis of electronic health records: a multi-center study.” Journal of the American Medical Informatics Association 25, no. 9 (2018): 1197-1201.
6. Cyganek, Bogusław, Manuel Graña, Bartosz Krawczyk, Andrzej Kasprzak, Piotr Porwik, Krzysztof Walkowiak, and Michał Woźniak. “A survey of big data issues in electronic health record analysis.” Applied Artificial Intelligence 30, no. 6 (2016): 497-520.
7. AHIMA. RHIA® Certification. Chicago: AHIMA, 2019. Accessed July 31, 2019. http://www.ahima.org/certification/RHIA.
8. AHIMA. CHDA® Certification. Chicago: AHIMA, 2019. Accessed July 31, 2019. http://www.ahima.org/certification/CHDA.
9. Ibid.
10. Ellis, Renita P. “Perceived value of certification and the Registered Health Information Technician credentialing examination.” PhD diss., Capella University, 2015.
11. Schwab, Klaus, and R. Samans. “The future of jobs report 2018.” In World Economic Forum, Geneva. 2018.
12. AHIMA. Certification Exam Activity Pass Rates 2013-2015. Chicago: AHIMA, 2016. Accessed April 20, 2016. http://www.ahima.org/certification/cchiim.
13. AHIMA. Certification Exam Activity Pass Rates 2016-2018. Chicago: AHIMA, 2019. Accessed July 15, 2019. http://www.ahima.org/certification/cchiim.
14. Ibid.
15. Ibid.
16. Ibid.
17. Ibid.
18. Ibid.
19. Ibid.
20. Garrison, D. Randy, Terry Anderson, and Walter Archer. “Critical inquiry in a text-based environment: Computer conferencing in higher education.” The internet and higher education 2, no. 2-3 (1999): 87-105.
21. Dolezel, Diane, and Alexander McLeod. “The Odds of Success: Predicting Registered Health Information Administrator Exam Success.” Perspectives in Health Information Management 14, no. Winter (2017).
22. McNeill, Marjorie H., and Lantry L. Brockmeier. “Relationships between academic program variables and success on the registered health information administrator certification examination.” Perspectives in Health Information Management/AHIMA, American Health Information Management Association 2 (2005).
23. Condon, Jim, and Amanda Barefield. “Assessment of success on the RHIA certification examination: a comparison of baccalaureate program graduates and postbaccalaureate certificate program graduates.” Perspectives in health information management/AHIMA, American Health Information Management Association 9, no. Fall (2012).
24. Ibid.
25. Farroll, Jennifer C. “Grit, deliberate practice, and athletic training education: Factors that determine board of certification exam success.” PhD diss., Union University, 2016.
26. Smith, Martin J. “Developing a Prototype for an RHIT Exam Preparation Tool.” PhD diss., The College of St. Scholastica, 2012.
27. Ibid.
28. Ibid.
29. Noblin, Alice, Amanda Walden, and Shelly C. Safian. “Value of In-Person Exam Preparation Workshops in Obtaining an AHIMA Credential.” Educational Perspectives in Health Informatics and Information Management Fall (2015).
30. Ibid.
31. McNeill, Marjorie H. “Does administering a comprehensive examination affect pass rates on the Registered Health Information Administrator certification examination?.” Journal of allied health 38, no. 4 (2009): 208-214.
32. Gohara, Sabry, Joseph I. Shapiro, Adam N. Jacob, Sadik A. Khuder, Robyn A. Gandy, Patricia J. Metting, Jeffrey Gold, and James Kleshinski. “Joining the conversation: Predictors of success on the United States Medical Licensing Examinations (USMLE).” Learning Assistance Review 16, no. 1 (2011): 11-20.
33. Gullo, Charles A., Michael J. McCarthy, Joseph I. Shapiro, and Bobby L. Miller. “Predicting medical student success on licensure exams.” Medical Science Educator 25, no. 4 (2015): 447-453.
34. Ibid.
35. Schmitz, Travis W., and Jessica H. Bailey. “Determining Predictors of Success on the American Board of Anesthesiology Written Certification Exam.” Medical Science Educator 24, no. 2 (2014): 195-200.
36. Schengel, Jonna K. Predicting performance on the Physical Therapist Assistant licensure examination. University of the Pacific, 2014.
37. Hoversten, Mary. Predictors of success on the national certification examination for graduate nurse anesthetists. University of South Dakota, 2011.
38. Ibid.
39. Ibid.
40. Ibid.
41. Farashah, Ali Dehghanpour, Janice Thomas, and Tomas Blomquist. “Exploring the value of project management certification in selection and recruiting.” International Journal of Project Management 37, no. 1 (2019): 14-26.
42. Beeman, Pamela Butler, and Julie Keith Waterhouse. “NCLEX-RN performance: Predicting success on the computerized examination.” Journal of Professional Nursing 17, no. 4 (2001): 158-165.
43. Atsawarungruangkit, Amporn. “Relationship of residency program characteristics with pass rate of the American Board of Internal Medicine certifying exam.” Medical education online 20, no. 1 (2015): 28631.
44. Faubion, Debra A. “Predicting success on national certification examinations in medical technology.” (1994): 0217-0217.
45. Miro, Rebecca M. “Predictors of Success on the Prosthetics Certification Examination.” (2014).
46. Beitz, Janice M. “Predictors of success on wound ostomy continence nursing certification board examinations: A regression study of academic factors.” Journal of Wound Ostomy & Continence Nursing 39, no. 4 (2012): 377-381.
47. Chaplock, Sharon Kayne. An exploration of virtual study groups used to prepare candidates for a professional certification exam. Marquette University, 2011.
48. Erickson, Mary Ann. “Contributors to first-time success on the National Athletic Trainers Association Board of Certification Exam as perceived by candidate sponsors.” (1999): 2417-2417.
49. Kain, Karen. “Determining characteristics that increase success on the national asthma educator certification exam.” (2010).
50. Shaw, Karen Lightbody. “Credentialing Success in Respiratory Therapy Education: Revisiting Bourdieu’s Concepts of Field and Capital.” (2012).
51. Fernandez, Antonio R., Jonathan R. Studnek, and David C. Cone. “The Association Between Emergency Medical Technician‐Basic (EMT‐B) Exam Score, Length of EMT‐B Certification, and Success on the National Paramedic Certification Exam.” Academic Emergency Medicine 16, no. 9 (2009): 881-886.
52. Jenkins, Neisa R. “Mock and National Examinations Correlations in a Health Information Associate Degree Program.” PhD diss., Walden University, 2013.
53. Bedno, Sheryl A., Michele A. Soltis, James D. Mancuso, Daniel G. Burnett, and Timothy M. Mallon. “The in-service examination score as a predictor of success on the American Board of Preventive Medicine certification examination.” American Journal of Preventive Medicine 41, no. 6 (2011): 641-644.
54. Bruce, Scott L. “Prediction modeling for graduate athletic training education programs.” (2014).
55. Bruce, Scott L., Elizabeth Crawford, Gary B. Wilkerson, David Rausch, R. Barry Dale, and Martina Harris. “Prediction modeling for academic success in professional master’s athletic training programs.” Athletic Training Education Journal 11, no. 4 (2016): 194-207.
56. Novalis, Sharon D., Jill M. Cyranowski, and Cathy D. Dolhi. “Passing the NBCOT examination: Preadmission, academic, and fieldwork factors.” The Open Journal of Occupational Therapy 5, no. 4 (2017): 9.
57. Desmarais, Linda, Margaret A. Woble-Valenski, and Eric Oestmann. “Factors influencing physical therapist assistant licensure examination success.” Journal of Physical Therapy Education 25, no. 2 (2011): 36-41.
58. Ibid.
59. Esparza, Shandra Dawn. Predictors of success on professional credentialing examinations of athletic training undergraduates. Walden University, 2012.
60. Krieger, Oscar H. Cognitive Development, Motivation, and Grade Point Average as Predictors of Success on the Board of Certification Exam for Athletic Trainers. Aurora University, 2014.
61. Middlemas, David A., James M. Manning, Linda M. Gazzillo, and John Young. “Predicting performance on the National Athletic Trainers’ Association Board of Certification examination from grade point average and number of clinical hours.” Journal of Athletic Training 36, no. 2 (2001): 136.
62. Ibid.
63. PLoS Medicine Editors. “Best practice in systematic reviews: the importance of protocols and registration.” PLoS Med 8, no. 2 (2011): e1001009.
64. Moher, David, Alessandro Liberati, and Douglas G Altman. “Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement.” Annals of Internal Medicine 151, no. 4 (2009): 264
65. Nagendrababu, V., H. F. Duncan, I. Tsesis, C. Sathorn, S. J. Pulikkotil, L. Dharmarajan, and P. M. H. Dummer. “PRISMA for abstracts: best practice for reporting abstracts of systematic reviews in Endodontology.” International endodontic journal 52, no. 8 (2019): 1096-1107.
66. Garrard, Judith. Health sciences literature review made easy. Jones & Bartlett Learning, 2016.
67. Ibid.
68. Ibid.
69. Ibid.
70. Ibid.
71. Ibid.
72. Ibid.
Author Biographies
Renae Spohn, PhD, MBA, RHIA, CPHI, CPHQ, FAHIMA, FNAHQ, is the director of HIM programs and coordinator of the MSHIIM Program at Dakota State University.
William Schweinle III, PhD, is a biostatistician professor at the University of South Dakota.
Patti Berg-Poppe, PhD, is a chair and physical therapy professor at the University of South Dakota.
Carole South-Winter, EdD, is an assistant professor at the University of South Dakota.
David DeJong, EdD, is the division chair of educational leadership at the University of South Dakota.