Predictors of Success on the RHIA Exam

By Jennifer L. Peterson, PhD, RHIA, CTR, and James F. Turley, MS

Abstract

The ultimate goal for most health information management (HIM) program graduates is successful passage of the Registered Health Information Administrator (RHIA) exam. As educators, it is our goal to successfully prepare our students for this endeavor. Past studies in this area have resulted in many recommendations for further research. The current study builds on this past research to provide further insight into predictors of graduate success on the RHIA exam. This study assessed variables impacting student success on the RHIA exam using data from students from one HIM academic program who graduated between 2014 and 2019. Variables included in the study were the dependent variable of the first-time RHIA exam score/pass or fail, and the independent variables of student mock exam score, time between graduation and examination, student self-report of English as a second language (ESL) status, introductory HIM course grade, introductory coding and intermediate coding course grades, overall GPA, and major GPA.

The study found that introductory HIM course grade, mock exam score, and time between graduation and examination were significant predictive factors in HIM graduate success on the RHIA exam. The study also revealed some interesting findings regarding student ESL status and exam success that merit further study. The results of this study provide educators with further insight into predictors of student success on the RHIA exam, as well as provide information that can be used by educators to aid in student RHIA exam success.

Keywords: RHIA exam, student success, RHIA exam passage rates, mock RHIA exam, English as a Second Language

Introduction

One of the goals for any health information management (HIM) educational program is for its students to successfully pass the Registered Health Information Administrator (RHIA) certification examination. The accrediting agency for HIM educational programs, the Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM), requires that programs track RHIA exam pass rates. In addition, many HIM jobs require RHIA certification, students seek certification, and schools pride themselves on preparing students to pass this exam. Because there is currently a shortage of credentialed allied health employees, improving students' ability to pass the RHIA exam will also serve the healthcare industry.

Program directors, faculty, and students are constantly working to find ways to adequately prepare students to ensure they can successfully pass the RHIA exam. Are there ways to predict which students will pass and which will not? Are there strategies that faculty can use to help students who are not predicted to pass? Which metrics point to likely success or difficulty? Answering these questions can enable students and faculty to work together so that students are prepared to successfully pass the RHIA exam.

While research in this area is limited, a variety of studies have been carried out. In an attempt to predict what factors lead to student success, researchers have looked at grades, GPA, programmatic elements, socioeconomic issues, program format, and program offering of a mock RHIA exam. These studies have produced a variety of findings, some of them inconsistent with one another.

Background

An early study in this area was completed by McNeill and Brockmeier in 2005. This study looked at the scores of students who took the exam between 2000 and 2002 in relation to a variety of programmatic elements, including program resources, expenditures, faculty characteristics, student-to-faculty ratios, curriculum, laboratory facilities, and professional practice hours.1 Although "it was expected that significant relationships and differences between an HIA program’s percentage of graduates passing the RHIA certification examination and program components would be found," the authors reported that this conclusion "was not reached in this study."2

Other earlier studies focused on the type of health information educational program. With the advent of increasing numbers of online programs, Russell et al. conducted a study comparing the performance of students in online versus face-to-face programs. While the study was designed to evaluate overall student performance in the two formats, the researchers found a "statistically significant correlation between overall admission GPA and the RHIA certification examination score" for the combined groups.3 Interestingly, however, when the online and on-campus groups were analyzed separately, this correlation did not hold for online students. This led to recommendations for further study of the differences between online and on-campus students.

Based on the growth of another type of HIM educational program offering, Condon and Barefield conducted a study comparing RHIA certification exam success between graduates of traditional baccalaureate programs and graduates of newer post-baccalaureate certificate programs. They found no difference between the two groups in terms of RHIA certification exam success. They also found that the "amount of time between graduation and completion of the RHIA certification examination did not significantly impact graduates’ scores on the examination."4 They noted, however, that conventional wisdom suggests that the longer the time between graduation and examination, the lower the scores, and they therefore recommended that "graduates of both programs … still be encouraged to take the RHIA certification examination as soon as possible after graduation."5 Their study led to recommendations for future research, including review of other variables that could affect student exam success as well as the relationship between mock exam and actual exam scores.

McNeill addressed the relationship between administration of a mock exam and student exam success in her study. She analyzed the RHIA exam pass rates of students from 46 schools, some who administered a mock examination and some who did not. She found that “the administration of a comprehensive examination before a student’s graduation did not make a significant difference on the HIA program’s pass rate.”6 Her recommendations for future study included a deeper look into the development of mock examinations as well as other student variables and their effect on exam pass rates.

To examine these issues further, Condon completed a study in 2013 that evaluated a large number of variables and their effect on student success on the RHIA exam. Many of these variables, rather than being program variables, focused on student performance in the HIM program. Condon’s goal was to create a prediction model for RHIA exam success. He evaluated a number of variables, including HIM course grades, major GPA, demographics, mother tongue, and age at the start of the program. Condon found that certain course grades, major GPA, and mother tongue were significantly associated with exam scores. His predictive model for RHIA exam success was therefore based on age at start of program, core curriculum GPA, introductory HIM and coding course grades, and mother tongue:

RHIA certification examination score = 0.274(age at start) + 6.103(core curriculum score) + 7.875(Intro to HIM grade) + 8.152(coding grade) − 9.893(mother tongue) + 21.650 (p. 76)

Based on Condon’s findings that student variables had an impact on the certification pass rate, he recommended future research focused on further student-related variables, including evaluation of the time between graduation and examination.7

Combining a focus on student variables as well as type of program, Dolezel and McLeod reviewed the success of students taking the exam between 2011 and 2013. They looked specifically at HIM course grades and cumulative GPA, a variety of demographic factors, and online versus on-campus formats.8 In their evaluation of grades in specific courses and cumulative GPA as potential predictors of first-time exam success, they found that cumulative GPA and health information technology course grade were associated with higher pass rates on the RHIA exam, while "other variables did not add to the model’s predictive ability."9 Dolezel and McLeod also found that online students had a much higher pass rate on the RHIA exam than on-campus students (81 percent compared to 57 percent).10 Their explanation included the fact that many online students are older, working students who may be more experienced and more driven to achieve RHIA certification. Based on their results, Dolezel and McLeod recommended that future studies evaluate the time from graduation to exam and delve deeper into the online versus on-campus issue. They also recommended including prior healthcare work experience as a variable.11

While a number of studies have identified elements related to RHIA exam success, including some predictive elements, further study is clearly needed to refine these findings. A study is needed that combines the elements already found to be predictive with other variables that have been suggested for study. The present study was therefore designed to focus on select predictive elements as well as variables such as the time between graduation and examination and the role that mock exams play in student RHIA exam success.

Methodology

To further assess variables impacting student success on the RHIA exam, a study was completed using a convenience sample of students from one school who graduated between 2014 and 2019. Because the study assessed the effect of these variables on students’ passage of the RHIA exam, only graduates who took the RHIA exam were included, resulting in a total of 83 students.

Variables included in the study were the dependent variable of first-time RHIA exam score/pass or fail and independent variables: student mock exam score, time between graduation and examination, student self-report of English as a second language, introductory HIM course grade, introductory coding and intermediate coding course grades, overall GPA, and major GPA. Online versus on-campus setting was not evaluated, as all students included in this school’s program were enrolled in an on-campus program.

Following IRB review and approval, existing data was compiled for the students who took the RHIA exam. The data was analyzed using descriptive and inferential statistics to determine the correlation between the various independent variables and first-time RHIA exam score/pass or fail status.

Data Analysis

Data analysis was completed based on 83 total students from the school’s HIM program graduating between 2014 and 2019 who had taken the RHIA exam at the time of the study. The data was based on first-attempt results unless otherwise noted. Descriptive and inferential statistical analyses were used to determine relationships between the dependent and independent variables.

Mock Exam Score vs. First-Attempt Score

All final-semester senior students were given a 180-question, four-hour mock RHIA exam during their final week of classes. Their scores on this mock exam were correlated with their scores on the actual exam. The correlation coefficient (R) for mock exam score versus actual exam score was 0.50, indicating a positive relationship between mock exam score and first-attempt RHIA exam score. The scatter plot for this relationship can be seen in Figure 1.
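
For readers who wish to reproduce this type of analysis, a minimal Python sketch of the correlation computation is shown below. The data file and column names (mock_score, rhia_score) are hypothetical placeholders, not the study's actual data set.

```python
# Minimal sketch: correlating mock exam scores with first-attempt RHIA exam scores.
# The CSV file and column names are hypothetical placeholders for the study data.
import pandas as pd
from scipy import stats

df = pd.read_csv("graduates.csv")  # one row per graduate (hypothetical file)

r, p_value = stats.pearsonr(df["mock_score"], df["rhia_score"])
print(f"Pearson R = {r:.2f}, p = {p_value:.3g}")  # the study reports R = 0.50
```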

A review of the mock exam scores indicates a divide between scores of 92 (51 percent) or below and scores of 93 or above. Most students who scored 92 or below on the mock exam did not pass the RHIA exam on the first attempt (a 23.5 percent first-attempt pass rate for this group), while most students (87.9 percent) who scored 93 or above did pass the RHIA exam on the first attempt.

An interesting subgroup within this cohort was four English as a second language (ESL) students. These students were among the top performers on the mock exam but did not pass the RHIA exam on their first attempt. Of these four, two subsequently passed the RHIA exam on later attempts and two did not retake the exam. ESL student RHIA exam success is discussed further below.

Time from Graduation to Exam Date and Exam Score

While students are encouraged to take the RHIA exam as soon as possible after graduation, some delay taking the exam for various reasons, including financial constraints, job searches, fear of the exam, or waiting until the exam is required for employment.

Table 1 shows the number of days graduates waited to take the exam, the average scores for each period, the number of graduates taking the exam during each period, and the pass rate for each period. The correlation coefficient (R) between the number of days from graduation to exam and exam score was -0.32, indicating that a longer wait is correlated with lower scores. This can also be seen in Figure 2. Figures 3, 4, and 5 further illustrate the relationship between time from graduation to exam date and average scores, pass rates, and number of graduates taking the exam. As can be seen, average RHIA exam scores and pass rates decline from the zero-to-three-month interval to the 12-to-15-month interval.
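
The interval summaries reported in Table 1 and Figures 3 through 5 could be produced along the lines of the sketch below. The column names (days_to_exam, rhia_score, passed) and the three-month bin edges are illustrative assumptions rather than the study's actual code.

```python
# Sketch: average score and pass rate by time-from-graduation-to-exam interval.
# Column names and bin edges are assumptions used for illustration only.
import pandas as pd

df = pd.read_csv("graduates.csv")  # hypothetical data file

bins = [0, 91, 182, 273, 365, 456]                # approximately three-month intervals, in days
labels = ["0-3 mo", "3-6 mo", "6-9 mo", "9-12 mo", "12-15 mo"]
df["interval"] = pd.cut(df["days_to_exam"], bins=bins, labels=labels, include_lowest=True)

summary = df.groupby("interval", observed=True).agg(
    graduates=("rhia_score", "size"),
    mean_score=("rhia_score", "mean"),
    pass_rate=("passed", "mean"),                 # 'passed' assumed coded as 0/1
)
print(summary)

# Correlation between waiting time and exam score; the study reports R = -0.32.
print(df["days_to_exam"].corr(df["rhia_score"]))
```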

Self-Report of ESL and Exam Score

Nine of the 83 students (10.8 percent) self-reported that English was their second (or later) language. These students’ pass rate was analyzed to determine whether ESL status has an effect on RHIA exam pass rate. All of these students failed the RHIA exam on their first attempt; five of the nine failed in spite of scoring quite well on the mock exam. Of these nine students, three (33.3 percent) passed the RHIA exam on a subsequent attempt, one (11.1 percent) failed on subsequent attempts, and five (55.6 percent) did not retake the exam. To compare the overall RHIA exam performance of native English-speaking and ESL students, a two-sample t-test assuming equal variances was applied to the two groups. ESL students had significantly lower RHIA exam scores (M = 277, SD = 25.8) than native English-speaking students (M = 313, SD = 20.1; t(81) = -5.03, p = 2.92E-6).
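
The equal-variance t-test reported above could be run as in the following sketch; the esl indicator and other column names are hypothetical stand-ins for the study's variables.

```python
# Sketch: two-sample t-test (equal variances) comparing ESL and non-ESL exam scores.
# Column names are assumptions; 'esl' is treated as a 0/1 self-report flag.
import pandas as pd
from scipy import stats

df = pd.read_csv("graduates.csv")  # hypothetical data file

esl_scores = df.loc[df["esl"] == 1, "rhia_score"]
non_esl_scores = df.loc[df["esl"] == 0, "rhia_score"]

t_stat, p_value = stats.ttest_ind(esl_scores, non_esl_scores, equal_var=True)
print(f"t({len(df) - 2}) = {t_stat:.2f}, p = {p_value:.3g}")  # study: t(81) = -5.03, p = 2.92E-6
```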

Course Grades

In past studies, course grades in the introductory course as well as in coding courses have been found to be strongly positively correlated with RHIA exam scores.12 In the current study, a review of student grades in the Introduction to Health Information Management course, the introductory coding course, and the intermediate coding course also found positive correlations between course grades and RHIA exam scores.

There was a strong positive correlation (0.51) between the Introduction to HIM course grade and the RHIA exam score: higher grades in the introductory HIM course were correlated with higher scores on the RHIA exam. As seen in other studies, higher student grades in the introductory coding course were also correlated with higher RHIA exam scores, with a correlation coefficient of 0.49.

For the intermediate coding course, there was a more moderate positive correlation between student grades and RHIA exam scores (0.35). This may be related to the fact that grades in this course tend to be higher overall than in the introductory coding course: because coding is a skill, once that skill is learned in the initial course, students tend to do better in subsequent courses. Even so, higher grades in this course were related to higher pass rates on the RHIA exam.

Student GPA

It was found that there was a relatively strong positive correlation between overall GPA and student score on the RHIA exam (0.49). As can be seen in Figure 6, higher student GPA was correlated with higher RHIA exam scores.

Multiple Regression Analysis

After the individual variables were analyzed, multiple regression analysis was completed to examine the combined relationships between the variables and RHIA exam scores and to provide a more fine-tuned understanding of those relationships.

Because the variables representing course performance and cumulative GPA were suspected of being collinear, multicollinearity among the predictors was assessed first. Tolerances and variance inflation factors for these variables demonstrated that collinearity between them was not an issue.
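
A collinearity check of this kind can be performed with variance inflation factors, as in the sketch below; the predictor column names are assumptions standing in for the study's seven independent variables.

```python
# Sketch: tolerance and variance inflation factor (VIF) check for suspected collinearity.
# Predictor column names are hypothetical stand-ins for the study's variables.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("graduates.csv")  # hypothetical data file
predictors = ["interval_days", "mock_score", "intro_him_grade",
              "intro_coding_grade", "inter_coding_grade", "gpa", "esl"]
X = sm.add_constant(df[predictors].dropna())

for i, name in enumerate(X.columns):
    if name == "const":
        continue
    vif = variance_inflation_factor(X.values, i)
    # VIF below roughly 5-10 (tolerance above 0.1-0.2) is generally taken as acceptable.
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")
```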

Multiple regression analysis was then performed with the knowledge that collinearity among the course grade variables was not a concern. The results of this analysis indicated that 65.35 percent of the variance in exam scores could be explained by a regression on seven explanatory variables (R = 0.808, R² = 0.654, adjusted R² = 14.427, N = 64, p = 6.778E-11). The p value for this regression is well below the 0.01 significance level.

The overall regression equation is written as:

Score = 176.153 – 0.001 (Interval) + 0.496 (Mock) + 8.679 (Intro to HIM Grade) + 4.336 (Intro Coding Grade) + 5.562 (Intermediate Coding Grade) + 7.104 (GPA) – 31.417 (ESL status), with each variable being supplied in its proper unit of measure.
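
A regression of this form could be estimated, and the published coefficients applied to a hypothetical graduate, as sketched below. The variable names, the 4.0 grade scale, and the illustrative input values are assumptions; the fitted coefficients would match the equation above only when the model is run on the study's own data.

```python
# Sketch: estimating a regression of the reported form and applying the published
# coefficients to a hypothetical graduate. Variable names and inputs are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("graduates.csv")  # hypothetical data file
model = smf.ols(
    "rhia_score ~ interval_days + mock_score + intro_him_grade"
    " + intro_coding_grade + inter_coding_grade + gpa + esl",
    data=df,
).fit()
print(model.summary())  # coefficients, R-squared, and p-values for the fitted model

# Predicted score for a hypothetical graduate (30-day wait, mock score of 100,
# straight-A course grades on a 4.0 scale, 3.8 GPA, non-ESL) using the published equation:
predicted = (176.153 - 0.001 * 30 + 0.496 * 100 + 8.679 * 4.0
             + 4.336 * 4.0 + 5.562 * 4.0 + 7.104 * 3.8 - 31.417 * 0)
print(round(predicted, 1))  # roughly 327 for these illustrative inputs
```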

The impacts and significance of the individual course grades and overall GPA were small in comparison to other variables (Intro Coding Grade, Coefficient = 4.336, p = 0.100; Intermediate Coding Grade, Coefficient = 5.561, p = 0.175; GPA, Coefficient = 7.104, p = 0.328), with the exception of students’ Introduction to HIM grade (Intro Grade, Coefficient = 8.679, p = 0.016). It was unexpected that overall GPA was the least significant and least influential of these variables, given that it is often used to gauge general academic performance.

The effect of the mock exam score was both significant and impactful. While the coefficient appears small given the range of possible scores, its meaning is clear when interpreted directly: every additional point earned on the mock exam was associated with an additional 0.496 points on the actual exam (Mock, Coefficient = 0.496, p = 1.73E-4).

The effect of student ESL status, according to the multiple regression analysis, was significant and impactful (ESL, Coefficient = -31.417, p = 4.434). Unlike the other variables, ESL status is binary (non-ESL = 0, ESL = 1). While ESL students can have varying proficiency in English, in this analysis the variable was treated only as ESL/non-ESL status.

Discussion

Analysis of the various metrics used in this study points to some predictors of success on the RHIA exam. It is clear that a higher score on the mock exam was correlated with a higher score on the actual exam, and there was a significant difference in RHIA exam pass rates between students who scored 92 or below on the mock exam and those who scored above 92. This was further reinforced by the multiple regression analysis, which found that each additional point earned on the mock exam corresponded to an increase of approximately 0.5 points on the actual exam. This is a particularly meaningful finding because it allows faculty to discuss each student’s mock exam results with them, along with best practices for RHIA exam preparation. Students who score 92 or below on the mock exam should be encouraged to plan for extra preparation in order to succeed on the RHIA exam.

It was also noted that the pass rate was much higher for students who took the exam within the first few months after graduation; students who waited 12-15 months after graduation to take the RHIA exam had a significantly lower pass rate. Again, this is helpful information for faculty, as it provides statistical evidence that can be shared with students to encourage them to take the exam sooner rather than later after graduation. In this study, the majority of new graduates took the RHIA exam within one year of graduation.

In other studies, certain course grades were found to be predictive of success on the RHIA exam. This was partially confirmed in the current study. Grades in the Introduction to HIM course as well as the introductory and intermediate coding courses were found to be correlated with higher scores on the RHIA exam. However, multiple regression analysis showed that the impact of the two coding course grades was less significant than that of other variables, while students’ Introduction to HIM grades were a significant predictor of RHIA exam success.

While student overall GPA was expected to be a significant predictor, it proved otherwise. Although there was a positive correlation between overall GPA and RHIA exam score, the multiple regression analysis found overall GPA to be the least significant and least influential variable.

Self-reported ESL students were an interesting subgroup in this study, and analysis of this group led to one of the most significant findings. All of the ESL students included in this study failed the RHIA exam on their first attempt, even though more than half of them scored quite well on the mock exam. In addition, the multiple regression analysis identified student ESL status as a significant negative predictor of RHIA exam success.

This finding suggests that ESL status is a strong indicator of pass/fail outcomes on the exam. In addition, the fact that more than half of these students did not make further attempts to pass the RHIA exam may indicate that they lacked the confidence, financial resources, or motivation to further pursue the RHIA credential. Many issues affect ESL student success on standardized tests. Faulkner-Bond and Sireci state that "tests in any subject inevitably end up being partial measures of [English] language proficiency."13 Beyond basic translation barriers, other issues include familiarity with the test format and understanding of the questions. Multiple-choice questions, such as those used on the RHIA exam, are not used in many parts of the world, and many ESL students are unfamiliar with, and perform more poorly on, this type of exam.14 The findings in this area are concerning, and this is an area that needs to be studied further. More widespread data should be collected to ensure there is no bias against ESL students on the RHIA exam.

While this study was limited to students in one academic program, the fact that six years of student data were analyzed increases the validity of the study. Further research utilizing student data from other schools could provide additional insight into determinants of RHIA exam success and could strengthen the generalizability of these findings.

Conclusions

Based on the data collected and analyzed in this study, it can be concluded that performing well in the Introduction to HIM course, scoring well on the mock exam, and taking the RHIA exam within the first three to six months following graduation are predictors of success on the RHIA exam.

However, more research is needed to delve deeper into these success predictors. What factors help determine success on the mock exam? What factors enter into a student’s decision to take the exam earlier or later following graduation? How can we, as educators, help our students succeed in the predictive areas, from the introductory course through the mock exam?

In addition, this study raised significant concerns about ESL student success on the RHIA exam. This is an important area that should be studied further: ESL student success in an HIM educational program would be expected to correlate more closely with success on the RHIA exam, and further research into this inconsistency should be completed.

Finally, this study identified some predictive elements that can be utilized by health information program faculty to aid graduate success on the RHIA exam. By reviewing such data and metrics, program faculty are better prepared to meet their program goals for student success on the RHIA certification exam, to sustain high student pass rates, and to produce graduates prepared to meet the needs of the healthcare industry.

Author Biographies

Jennifer Peterson, PhD, RHIA, CTR, is an associate professor and the program director of the Health Informatics and Management program in the Department of Health Sciences at Illinois State University.

James Turley, MS, is a master tutor in Tutoring Services at Heartland Community College.

References

1. McNeill, Marjorie H. & Lantry L. Brockmeier. “Relationships between Academic Certification Examination.” Perspectives in Health Information Management 3 (2006): para. 4.

2. McNeill, Marjorie H. & Lantry L. Brockmeier. “Relationships between Academic Certification Examination.” Perspectives in Health Information Management 3 (2006): para. 25.

3. Russell, Barbara L., et al. “Evaluating Distance Learning in Health Informatics Education.” Perspectives in Health Information Management 5 (2008): 6.

4. Condon, Jim & Amanda Barefield. “Assessment of Success on the RHIA Certification Examination: A Comparison of Baccalaureate Program Graduates and Postbaccalaureate Certificate Program Graduates.” Perspectives in Health Information Management 9 (2012).

5. Condon, Jim & Amanda Barefield. “Assessment of Success on the RHIA Certification Examination: A Comparison of Baccalaureate Program Graduates and Postbaccalaureate Certificate Program Graduates.” Perspectives in Health Information Management 9 (2012): 6.

6. McNeill, Marjorie H. “Does Administering a Comprehensive Examination Affect Pass Rates on the Registered Health Information Administrator Certification Examination?” Journal of Allied Health 38, no. 4 (2009): 31.

7. Condon, Jim. “Predicting Registered Health Information Administrator Examination Scores.” Electronic Theses and Dissertations 807 (2013): 96-97.

8. Dolezel, Diane & Alexander McLeod. “The Odds of Success: Predicting Registered Health Information Administrator Exam Success.” Perspectives in Health Information Management (2017): para. 13.

9. Dolezel, Diane & Alexander McLeod. “The Odds of Success: Predicting Registered Health Information Administrator Exam Success.” Perspectives in Health Information Management (2017): para. 33.

10. Dolezel, Diane & Alexander McLeod. “The Odds of Success: Predicting Registered Health Information Administrator Exam Success.” Perspectives in Health Information Management (2017): para. 36.

11. Dolezel, Diane & Alexander McLeod. “The Odds of Success: Predicting Registered Health Information Administrator Exam Success.” Perspectives in Health Information Management (2017): para. 41-42.

12. Condon, Jim. “Predicting Registered Health Information Administrator Examination Scores.” Electronic Theses and Dissertations 807 (2013): 96-97.

13. Faulkner-Bond, Molly & Stephen G. Sireci. “Validity Issues in Assessing Linguistic Minorities.” International Journal of Testing 15 (2015): 115.

14. Oliveri, Maria Elena, Kadriye Ercikan, & Bruno Zumbo. “Analysis of Sources of Latent Class Differential Item Functioning in International Assessments.” International Journal of Testing 13 (2013): 273.
