The Self-Assessment Process and Impacts on the Health Information Management Program Performance: A Case Study

by Renae Spohn, MBA, RHIA, CPHQ, FAHIMA, FNAHQ

Abstract

This study examined how health information management (HIM) educational programs can use the Malcolm Baldrige National Quality Award Model (MBNQAM) educational criteria to meet the self-assessment requirement for Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM) accreditation. An existing instrument, Quantum Performance Group’s Organizational Assessment Survey authored by Dr. Mark Blazey, was used in this study. The instrument was designed to self-assess the entire organization. Results of the study demonstrate how the MBNQAM can be used to successfully self-assess HIM programs. This research adds to the body of literature surrounding the application of the MBNQAM for HIM programs and provides new information to deans, administrators, and educators that may be useful, as an added component, when self-assessing HIM programs. The results of this study will help to establish a foundation for HIM programs to strengthen the self-assessment process, providing a strong starting point for strategic planning prioritization for HIM program improvement initiatives. The improved process will help in maturing the HIM program while fulfilling accreditation requirements for self-assessment. As additional HIM programs formalize the self-assessment process, benchmarking opportunities with other HIM programs will be created.

Keywords: self-assessment in universities, self-assessment in education, performance excellence in education, education performance, quality models in education

Introduction

Ongoing assessment of health information management (HIM) program customer requirements and outcomes is critical to ensure that graduates are well prepared for both the current and the new workforce roles in HIM and health information technology (IT) that are evolving as healthcare organizations implement and re-implement electronic health records.1, 2 As accrediting agencies and employers continue to raise the bar and place more accountability on higher education institutions,3 HIM programs must ensure that graduates can quickly perform in both new and traditional HIM professional roles.

While HIM programs accredited by the Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM) are currently required to self-assess the program,4 this study tested how a quality improvement framework can be integrated into the self-assessment process. In particular, this study examined how HIM educational programs can use the Malcolm Baldrige National Quality Award Model (MBNQAM) educational criteria5 to meet the self-assessment requirement for CAHIIM accreditation.6 An existing instrument, Quantum Performance Group’s Organizational Assessment Survey authored by Dr. Mark Blazey, was used in this study.7 The instrument was designed to self-assess the entire organization.

The results of this case study indicate that the self-assessment process can be enhanced by incorporating input from additional internal and external stakeholders through a standardized survey instrument, in addition to tracking and reporting important program outcomes, which is the usual method of meeting the self-assessment requirement. Benchmarking against other HIM programs could serve as a driver to strengthen program outcomes and create a platform for quality recognition of outstanding program performance in the future. Tracking outcome measures on an ongoing basis is already a key part of the self-assessment process in place at the university included in the case study.

Background Information and Literature Review

Many HIM program graduates are being hired into roles outside of the traditional walls of the HIM department and work in an environment that utilizes their critical thinking skills.8

HIM programs must assess and adapt to the evolving market needs for HIM professionals on a continuous basis by monitoring both the local and national market needs.9 While conducting internal program self-assessments is a requirement of CAHIIM accreditation, HIM program directors must also monitor and understand the needs of employers to identify improvements that will strengthen program outcomes. Once these needs are identified, HIM program directors and faculty must serve as transformational leaders who make curriculum changes to affect program outcomes.

Traits and characteristics of transformational leaders in higher education include the following: competency in knowledge; leadership skills; technical expertise; authenticity; communication of vision, purpose, and values to stakeholders; establishment of an environment of trust that energizes the organization; the ability to push back on historical culture; commitment to quality; passion; intensity; and persistence.10

Challenges for higher education include meeting customer demands for a high-quality education at a low cost. Obstacles to overcome include inadequate assessment of whether the promise to educate is being fulfilled.11 While these challenges are ongoing, the self-assessment process offers a mechanism for HIM programs to implement continuous improvement. Recommended good practices include conducting business process reengineering, disclosing learning metrics through dashboards, and continuing course redesign internally on an ongoing basis.12

Organizational improvement models were compared through a literature review for potential use in this study. The models reviewed included the following: ISO 9000:2000, Deming Prize, the European Foundation for Quality Management (EFQM) model, and the MBNQAM.13 The models historically used in higher education include ISO 9000:2000, the EFQM model (a process-based model), and MBNQAM (a product-based model).

The goal of the MBNQAM is to deliver ever-improving value to students and other stakeholders,14 which is consistent with the goals of the self-assessment process. In one study, the MBNQAM education criteria for performance excellence were tested and validated in 15 universities and colleges in the United Arab Emirates; results of regression analysis and confirmatory structural equation modeling showed that all of the hypothesized causal relationships in the Baldrige model are statistically significant.15 The EFQM model is commonly used in Europe, while the MBNQAM is the common standard in the United States.16 Both models were found to produce similar results.17

One surprising limitation identified during the literature review was that educational institutions implement quality management systems without knowing how to fully manage them.18 Additional challenges of traditional performance assessment in higher education include the following: slow and outdated standards, lack of strategic links, complex implementation and performance, inflexibility, contradiction in accepting content improvement, neglect of customer needs and expectations, and leaders' excessive concern with increasing profits and decreasing costs.19 Newer measurement systems in use are self-assessments and balanced scorecards.20

According to the EFQM, quoted in a study published in the Quality Management Journal, self-assessment is defined as a “cyclic, comprehensive, systematic, and regular review of an organization’s activities and results against a model of business excellence . . . culminating in planned improvement actions.”21

Self-assessment has been linked to better performing organizations. It takes training and time, but its benefits include the identification of strategic direction, alignment of quality processes and activities across an organization, development of short-term targets, linkage of quality projects and measures to the strategic plan, and better focus to actually produce improved organizational results.22 Some organizations using self-assessment have also reported greater returns on sales than organizations not using self-assessment.23

One study identified in the literature review described how Northwest Missouri State University elevated the organizational balanced scorecard concept to a higher level and applied it deeper into the organization at the individual faculty level by implementing a personal dashboard for faculty. Northwest Missouri State University is a two-time state award winner and two-time finalist for the Malcolm Baldrige National Quality Award (MBNQA). The university used 100 performance indicators that focused employees on improving individual performance continuously.24

The MBNQA process is managed by the National Institute for Standards and Technology (NIST).25 The NIST website provided three self-assessment instruments26 and a link to the state alliance where additional contacts for self-assessment processes were located.27 The state alliance applicable for the university under study was the Performance Excellence Network (also known as PEN or the Minnesota Council for Quality)28 serving the tristate area of Minnesota, South Dakota, and North Dakota. The usual use of the self-assessment instrument is to prepare for future application for a state and/or national MBNQA or to incorporate the results into an application for a state and/or national MBNQA. The self-assessment measures the agreement between leaders and subordinates on whether the organization is progressing toward organizational performance excellence.29 The instrument used by PEN is delivered by the National Council for Performance Excellence and is copyrighted by Quantum Performance Group, Inc. The instrument was developed and validated by Mark Blazey, PhD.30 Self-assessment survey samples for education are available on the website of Quantum Performance Group, Inc.31

One important finding from the literature review was the need to incorporate self-assessment findings into the strategic planning process to ensure that alignment of priorities can be achieved. If results are not incorporated into an existing process, it is likely that they will not be acted on and that opportunities for improved outcomes will be missed.32

In addition, qualitative findings can be enhanced by allowing collaboration among survey participants, by allowing participants to provide more detail in the respondent comments, and by allowing participants to share the rationale for their thinking on the survey.33

One concerning finding was that high-level results might indicate a need for further drill-down, and thus for further study to test the lessons learned.34 Another significant finding was the need for external professional coaching in quality management in educational institutions to enhance internal organizational improvement.35

Methods

The university HIM program utilized in this single holistic case study offers a coding certificate, an associate degree, a bachelor's degree, a master's degree in health informatics, and a healthcare track in the doctor of science degree. The HIM programs are blended, giving students the option of attending classes in person or via the Internet.

Participants

The case study budget for the self-assessment was $900, limiting the number of survey participants to a maximum of 25. Expanding the number of survey participants would have likely yielded unanswered questions because of the lack of sufficient familiarity with the HIM program and processes. The invited participants were divided into five categories so that the data could be analyzed and reported in meaningful combinations. The participant pool makeup was unique because it included the following five categories: university HIM program leaders, HIM staff and faculty, other important internal partners of the HIM program, external partners consisting of HIM program advisory board members, and external partners consisting of the highest-ranking HIM professional in organizations employing HIM program graduates. The participants from four of the five categories were selected on the basis of their role or frequency of contact with the HIM program faculty and students, while the HIM program advisory board members were selected by random sample using randomizer.com. The five participant groups for the survey were equal in size. The goal for survey participation was 100 percent given the small number of survey participants. It is unusual to collect data from external partners while conducting a self-assessment using this instrument.
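The random draw of advisory board members can be reproduced with a standard library routine. The sketch below is purely illustrative: the member names, pool size, and seed are placeholders, and the study itself used randomizer.com rather than Python.

```python
import random

# Hypothetical pool of advisory board members; the names and pool size are
# placeholders, not the study's actual roster.
advisory_board = [f"member_{i}" for i in range(1, 13)]

# The five participant groups were equal in size (25 invitees / 5 groups = 5),
# so five advisory board members are drawn without replacement.
random.seed(42)  # fixed seed only to make this illustration reproducible
selected = random.sample(advisory_board, k=5)

print(selected)
```

Sampling without replacement ensures that no board member is invited twice, mirroring a single random draw from the full advisory board list.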

Instrument

The copyrighted survey instrument includes 44 questions divided into seven categories. Each question was cross-referenced by Dr. Mark Blazey to the Malcolm Baldrige Quality Award criteria.36 A six-point Likert maturity scale was used for each question, with the following possible choices: not evident, beginning, basically effective, mature, advanced, and role model. Participants were also required to select, from each category of the survey, the one question indicating the most important opportunity for improvement.

Procedures

The author worked with the third-party survey administrator, the National Council for Performance Excellence, to plan the online survey procedures. The survey administrator communicated with respondents directly through e-mail. All preparatory survey information and training aids were distributed by the administrator.

A plan for reporting and analyzing the data was designed by the author and the survey administrator. It was determined that given the small number of survey participants, the report would include the following breakdowns: overall results, segmentation by the five respondent categories (provided there were at least two respondents from each category), segmentation including internal employees vs. external partners, and a final segmentation by HIM program graduate or non–HIM program graduate.

The category for benchmarking with other organizations within the database was selected. Since no other HIM programs were represented in the database, the best benchmark that could be selected was other universities in the database that completed a self-assessment within the same year.

University IRB approval was obtained, and verbiage to include in the survey respondent communication was developed. The third-party survey administrator distributed a confidential user ID and password to the respondents with instructions for the survey and a handout including terms and definitions for use while completing the survey. Survey participants were informed of the approximate completion time of the survey, the fact that the survey was geared toward the HIM program and not the entire institution, and their right to opt out of the research study at any time.

Several e-mail reminders were sent to participants throughout the survey period by the survey administrator and by the author. Lists of e-mail addresses of participants who had not opened the survey were sent to the author on a daily basis for e-mail follow-up. Although the author monitored the respondents that had not completed the survey, survey responses could not be accessed by the author. The survey remained open for 16 days.

The author concurrently completed state-level Malcolm Baldrige Award evaluator training and served as a Performance Excellence Network quality award evaluator. The training and evaluator team participation experience was necessary to advance the author’s principal investigator skills in evaluating the self-assessment results and for future use when working with the HIM program faculty in selecting program improvement priorities.

Results and Discussion

The 44-question self-assessment maturity survey was completed online by 23 of the 25 survey recipients, resulting in a survey response rate of 92 percent.

Using a scatter diagram (see Figure 1), a bivariate analysis of the relationship of the maturity and priorities for improvement for all survey items was conducted to determine strengths and opportunities for improvement. After survey data were tabulated, each survey question number was plotted on the scatter diagram on the basis of the two variables—maturity and priority for improvement. The item numbers placed in the upper right quadrant of the diagram identify the greatest opportunities for improvement. The areas that show the greatest strengths are placed in the lower left quadrant of the diagram.

The mean score was placed at the center of the scatter diagram. The standard deviation was then calculated, and three deviations were plotted both vertically and horizontally on the diagram. The standard deviation calculations created a square in the middle of the diagram; items inside the square carry no statistically significant meaning. Items outside of the square represent statistically significant findings, with the greatest strengths and opportunities located near the outside borders of the graph in the lower left and upper right quadrants of the diagram, respectively.
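A minimal sketch of this mean-and-standard-deviation quadrant test follows. The item scores are invented placeholders, and the cutoff of one standard deviation is an assumption made for illustration (the study plotted up to three deviations on the diagram).

```python
from statistics import mean, stdev

# Illustrative (maturity, priority-for-improvement) scores per survey item;
# these values are placeholders, not the study's data.
items = {
    "Q1": (4.8, 0.10), "Q2": (4.5, 0.15), "Q3": (3.1, 0.40),
    "Q4": (2.4, 0.55), "Q5": (4.1, 0.20), "Q6": (2.9, 0.48),
    "Q7": (3.6, 0.30), "Q8": (4.4, 0.12),
}

maturity = [m for m, _ in items.values()]
priority = [p for _, p in items.values()]
m_mean, m_sd = mean(maturity), stdev(maturity)
p_mean, p_sd = mean(priority), stdev(priority)
K = 1  # number of standard deviations defining the central square (assumed)

strengths, opportunities = [], []
for q, (m, p) in items.items():
    if m > m_mean + K * m_sd and p < p_mean - K * p_sd:
        strengths.append(q)       # high maturity, low improvement priority
    elif m < m_mean - K * m_sd and p > p_mean + K * p_sd:
        opportunities.append(q)   # low maturity, high improvement priority

print("strengths:", strengths)
print("opportunities:", opportunities)
```

Items that fall inside the central square on both axes are left unclassified, matching the report's treatment of the square's interior as not statistically significant.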

The top five strengths that were identified are as follows:

  1. Identifying needed skills, competencies, and staffing levels and building a workforce to accomplish the organization’s work
  2. Listening to students and other customers to obtain feedback and actionable information
  3. Senior leaders promoting legal and ethical behavior
  4. Controlling operational costs and managing innovation
  5. Developing strategic plans

The top five opportunities for improvement are as follows:

  1. Knowledge management and learning
  2. Effectively communicating, engaging, and encouraging workers to take action to improve performance and create student and other customer value
  3. Aligning work, tracking progress, and making changes to action plans quickly
  4. Managing and resolving student and other customer complaints effectively and promptly
  5. Designing key educational programs, services, and work processes to meet requirements and deliver student and other customer value

The greatest opportunities for improvement in each of the seven survey categories are summarized in Table 1. Twenty-six percent of the respondents identified promoting a high-performance work culture and a motivated workforce as an opportunity for improvement, while 35 percent responded that effectively communicating, engaging, and encouraging workers to take action to improve performance and create student and other customer value was an opportunity. Both of these opportunities are thought to be related to the strong need to ensure that students are well trained, with the skills to perform traditional HIM roles as well as the new roles.

Pareto charts were created for each category showing the highest opportunities for improvement.

Each chart displays along the x-axis the letters that correspond to the survey questions. Along the y-axis, both counts and percentages are displayed. The bar graph displays the count of votes (hits) for each area, arranged from the greatest to the least number of hits; the items with the most hits indicate the greatest need for improvement.
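The Pareto ordering described above (counts sorted from most to fewest hits, reported alongside cumulative percentages) can be sketched as follows. The vote counts and item letters are hypothetical, not the study's data.

```python
from collections import Counter

# Hypothetical vote counts ("hits") for the items respondents flagged as the
# most important opportunity in one category; the letters stand in for the
# x-axis labels on the published charts.
votes = Counter({"A": 3, "B": 8, "C": 1, "D": 5, "E": 2, "F": 4})

total = sum(votes.values())
running = 0
# Pareto ordering: most_common() sorts from the greatest to the least number
# of hits; the running total yields the cumulative-percentage line.
for item, count in votes.most_common():
    running += count
    print(f"{item}: {count:2d} hits  {count / total:5.1%}  cumulative {running / total:5.1%}")
```

The first one or two bars in such a chart typically account for the majority of the votes, which is what makes the Pareto layout useful for prioritizing improvements.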

In the leadership category (see Figure 2), effectively communicating, engaging, and encouraging workers to take action to improve performance and create student and other customer value was perceived by respondents to be the greatest opportunity for improvement.

In the strategic planning category (see Figure 3), aligning work, tracking progress, and making changes to action plans quickly was identified as the greatest opportunity for improvement.

The respondents identified managing and resolving student and other customer complaints effectively and promptly as the greatest opportunity for improvement in the customer focus category (see Figure 4).

Knowledge management and learning were identified as the greatest opportunity for improvement in the measurement, analysis, and knowledge management category (see Figure 5). The interpretation of this item is that sharing among co-workers to learn from each other can be improved at the university.

In the workforce focus category, the greatest opportunity was identified as the need to determine factors that affect workforce engagement, promoting a high-performance work culture and a motivated workforce (see Figure 6).

Determining key educational programs, services, and work processes to meet requirements and deliver student and other customer value was the greatest opportunity identified in the operations category (see Figure 7).

The results category indicated that student learning and process results, trends, levels, and comparisons were the greatest opportunities for improvement (see Figure 8). In the MBNQA criteria, "results, trends, levels, and comparisons" refers to benchmarking on a routine basis and to monitoring and plotting the results of HIM program data over time. In addition, ensuring positive program results is important.

Each of the seven categories also received a score, but that information was not utilized for the case study analysis because the HIM program did not complete an application for a state-level quality award, where that information is used to determine the award level.

The final report also included a bar chart for each survey item, additional bar charts that display segmented results, and comments from respondents for each item.

The self-assessment survey results were benchmarked (see Figure 9) against data from eight similar universities and colleges that were available in the database for comparison. Benchmarking utilizes every item in every category to calculate the percentage of possible points for each category. The results indicate that the HIM program participating in the self-assessment exceeded the maximum scores of the benchmark sites in six of the seven categories. All categories are listed across the bottom of the graph on the x-axis, while the y-axis contains the possible percentage of points—100 percent for each category. The triangle indicates the maximum percentage of points achieved by the organizations being utilized for benchmarking, the circle represents the minimum percentage of points, the square represents the mean percentage of points, and the actual bar itself represents the university’s percentage of points.
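The percentage-of-possible-points calculation used in the benchmarking comparison can be sketched as follows. The point values, the category subset, and the benchmark maxima are invented placeholders, not the study's published figures.

```python
# Illustrative category point totals; the values below are placeholders,
# not the study's scores or the actual Baldrige category weights.
possible = {"Leadership": 120, "Strategic Planning": 85, "Customer Focus": 85}
earned   = {"Leadership": 96,  "Strategic Planning": 51, "Customer Focus": 68}

# Assumed maximum percentages achieved by the comparison universities
# in the benchmarking database (hypothetical numbers).
benchmark_max = {"Leadership": 70.0, "Strategic Planning": 65.0, "Customer Focus": 75.0}

# Every item in every category feeds the earned total; each category is then
# expressed as a percentage of its possible points and compared to the
# benchmark maximum, as in Figure 9.
for cat in possible:
    pct = 100 * earned[cat] / possible[cat]
    flag = "exceeds benchmark max" if pct > benchmark_max[cat] else "within benchmark range"
    print(f"{cat}: {pct:.1f}% of possible points ({flag})")
```

Normalizing each category to a percentage of its possible points is what allows categories with different point weights to be plotted on a single 0 to 100 percent axis.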

The survey process was completed as planned and demonstrated how the MBNQAM can be used as a part of the self-assessment process by HIM programs.

Prior to and throughout the self-assessment process, the principal investigator participated in training to become a volunteer state evaluator in order to learn to apply the criteria and to participate in a more in-depth analysis of self-assessment results at another site. This experience is important because it prepares the individual analyzing the results of the self-assessment to apply the methodology of the MBNQA criteria at a deeper level when selecting a specific opportunity for improvement.

Limitations

With the required timeline to complete the case study in one semester, planning and implementation of actual improvements could not be carried out during the study.

The funding for the project allowed for only 25 participants in the survey, limiting the qualitative feedback from survey participants. While the number of participants was small, few additional potential participants had sufficient depth of knowledge of the HIM program and processes to provide meaningful input on the survey. Additional funding would allow more survey participants.

The terminology used on the survey is very specific and not easily understood without additional reading or training. Handouts to explain the terminology used in the survey were provided to survey participants prior to completion of the survey. Additional education about the MBNQAM in advance of completing the survey might have helped participants to better understand the terminology of the survey.

The expertise of external partners was limited on portions of the survey where questions related specifically to internal university processes or where the entire university’s process could not be separated from that of the HIM program.

The priorities identified in the survey results need to be further assessed to determine whether they call for an organization-wide improvement or for an improvement within the HIM program itself.

It is unclear whether the other universities used as benchmarks in this study represent adequate benchmarks for the HIM program. Not having additional HIM programs’ self-assessment results within the database limits the confidence in the benchmarking process.

While the author learned a great deal through training, additional training will need to be completed and/or external resources will need to be utilized by HIM program leaders to advance the overall understanding of the MBNQA criteria prior to applying for a state or national quality award. Specific action plans for the priorities identified in this case study will be identified and aligned with HIM program strategic priorities in the next year. Action plans will be implemented utilizing the university’s continuous quality improvement (CQI) model.

Further Areas for Research

The opportunities for improvement identified in this case study were not compared to those identified during the university’s own self-assessment of the HIM program. Further research could be conducted to evaluate the consistency of the results of the two processes.

External evaluators with expertise in the MBNQAM are utilized when organizations have applied for recognition of excellence. Expert evaluators were not utilized to verify and clarify strengths and opportunities identified through the survey process. Further research could be conducted to determine whether the validation of results utilizing external evaluators would yield results consistent with survey findings.

Conclusions

The case study results indicate a mature HIM program, scoring above benchmarks. While the participants' comments yielded a small amount of feedback identifying opportunities for improvement in the HIM program, the top five strengths and opportunities were identified, thus demonstrating how the MBNQA self-assessment process can be utilized by HIM programs as a supplement to the current process.

Most of the opportunities identified were organization-wide, so it is imperative to test a process improvement that can be utilized by the entire organization. The university's model for improvement will be utilized to plan and implement the improvements selected by the HIM program faculty and leadership over the next few years using results from the case study. Once the improvements have been implemented, the HIM program self-assessment results can be compared by repeating the process used in this case study, if funding is available. Improvement in outcomes will be demonstrated if the HIM program's maturity level increases to an advanced or role model level in the self-assessment survey process.

In addition, continuing to serve as an evaluator creates opportunities for the author to continue to learn how other organizations apply the MBNQAM and to understand the criteria for excellence in greater depth. It also provides opportunities for collaboration and sharing with other organizations and learning best practices from them.

 

Renae Spohn, MBA, RHIA, CPHQ, FAHIMA, FNAHQ, is an assistant professor of health information management at Dakota State University in Madison, SD.

 

Notes

  1. HealthIT.gov. “Workforce Development Programs.” 2013. Available at http://www.healthit.gov/policy-researchers-implementers/workforce-development-program.
  2. US Department of Health and Human Services, Health Resources and Services Administration. “Health IT Safety Net Employers.” 2013. Available at http://www.hrsa.gov/healthit/workforce/safetyemployers.html.
  3. Fenton, S., E. Joost, J. Gongora, D. G. Patterson, C. Holly, A. Andrilla, and S. M. Skillman. “Health Information Technology Employer Needs Survey: An Assessment Instrument for HIT Workforce Planning.” Educational Perspectives in Health Informatics and Information Management (Winter 2013): 1–36.
  4. Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM). CAHIIM Accreditation Manual 2014. Available at http://cahiim.org/Accreditation_Manual.html#selfassess.
  5. National Institute for Standards and Technology. “Malcolm Baldrige National Quality Award Framework 2013–2014 Education Criteria.” Available at http://www.nist.gov/baldrige/images/2013_2014_Education_Criteria_Framework.jpg.
  6. Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM). CAHIIM Accreditation Manual 2014. Available at http://cahiim.org/Accreditation_Manual.html#selfassess.
  7. Blazey, M. “Quantum Performance Group Organizational Assessment Survey.” Available at http://quantumperformance.com/assessments.htm.
  8. Sharp, M., R. Reynolds and K. Brooks. “Critical Thinking Skills of Allied Health Science Students: A Structured Inquiry.” Educational Perspectives in Health Informatics and Information Management (Summer 2013). Available at http://eduperspectives.ahima.org/critical-thinking-skills-of-allied-health-science-students-a-structured-inquiry.
  9. College of Healthcare Information Management Executives. Demand Persists for Experienced Health IT Staff. 2012. Available at http://www.cio-chime.org/chime/press/surveys/pdf/CHIME_Workforce%20_survey_report.pdf. Ricketts, T. C., and E. P. Fraher. “Reconfiguring Health Workforce Policy So That Education, Training and Actual Delivery of Care Are Connected.” Health Affairs 32, no. 11: 1874–80.

10. Basham, L. “Transformational Leadership Characteristics Necessary for Today’s Leaders in Higher Education.” Journal of International Education Research 8, no. 4 (2012): 343–48.

11. Broadhead, R. “Rebuilding the Public’s Confidence in Higher Ed.” Presented at the March 21, 2013, meeting of the Academic Council. Duke Today, March 21, 2013. Available at http://today.duke.edu/2013/03/rhbhighered.

12. Massy, W. Stretching the Higher Education Dollar: Initiatives for Containing the Cost of Higher Education. American Enterprise Institute Special Report 1, April 2013.

13. Dodangeh, J., N. Y. Rosnah, N. Ismail, Y. M. Ismail, M. R. Biekzadeh, and J. Jassbi. “A Review on Major Business Excellence Frameworks.” Technics Technologies Education Management 7, no. 3 (2012): 1386–93.

14. Grewal, D. “Theories and Models for Managing Excellence in Higher Technical Education among Disparities in India.” Presented at International Conference of Technology and Business Management, UOWD, Dubai, March 26–28, 2012.

15. Badri, M., H. Selim, K. Alshare, E. Grandon, H. Younis, and M. Abdulla. “The Baldrige Education Criteria for Performance Excellence Framework: Empirical Test and Validation.” International Journal of Quality and Reliability Management 23, no. 9 (2006): 1118–57.

16. Dodangeh, J., N. Y. Rosnah, N. Ismail, Y. M. Ismail, M. R. Biekzadeh, and J. Jassbi. “A Review on Major Business Excellence Frameworks.”

17. Tari, J., and I. Saizarbitoria. “The Self-Assessment Process and Impacts on Performance: A Case Study.” International Journal for Quality Research 6, no. 4 (2012): 343–54.

18. Nigsch, S., and A. Schenker-Wicki. “Shaping Performance: Do International Accreditations and Quality Management Really Help?” University of Zurich Business Working Paper Series, November 2012.

19. Jalaliyoon, N., N. Abu Bakar, and H. Taherdoost. “Implementation of Balanced Score Card and European Foundation for Quality Management Using TOPSIS Method.” International Journal of Academic Research in Management 1, no. 1 (2012): 26–35.

20. Ibid.

21. Van Der Wiele, T., A. Brown, R. Millen, and D. Whelan. “Improvement in Organizational Performance and Self-Assessment Practices by Selected American Firms.” Quality Management Journal 7, no. 4 (2000): 8–22.

22. Jalaliyoon, N., N. Abu Bakar, and H. Taherdoost. “Implementation of Balanced Score Card and European Foundation for Quality Management Using TOPSIS Method.”

23. Van Der Wiele, T., A. Brown, R. Millen, and D. Whelan. “Improvement in Organizational Performance and Self-Assessment Practices by Selected American Firms.”

24. White, J. “Personal Dashboards: A Cutting Edge Faculty Performance Evaluation System.” Journal of College Teaching & Learning 2, no. 10 (2005): 73–77.

25. National Institute for Standards and Technology. “Baldrige Performance Excellence Program.” Available at http://www.nist.gov/baldrige.

26. National Institute for Standards and Technology. “Baldrige Performance Excellence Program: Self-Assessing Your Organization.” Available at http://www.nist.gov/baldrige/enter/self.cfm.

27. The Alliance for Performance Excellence. Available at http://www.baldrigepe.org/alliance.

28. Performance Excellence Network. Available at http://performanceexcellencenetwork.org.

29. Javier, F. “Assessing an Asian University’s Organizational Effectiveness Using the Malcolm Baldrige Model.” Asian Journal of Business and Governance 2, no. 1 (2012).

30. Quantum Performance Group, Inc. “About Us.” Available at http://www.quantumperformance.com/aboutus.htm.

31. Blazey, M. “Baldrige Education: Sample Organizational Self-Assessment.” Quantum Performance Group, Inc. Available at http://www.quantumperformance.com/educationSA.htm.

32. Tari, J., and I. Saizarbitoria. “The Self-Assessment Process and Impacts on Performance: A Case Study.”

33. Lee, R., and J. DePue. “Using Baldrige Method Frameworks, Excellence in Higher Education Standards, and the Sakai CLE for the Self Assessment Process.” In Proceedings of the 38th Annual ACM SIGUCCS Fall Conference: Navigation and Discovery (SIGUCCS ’10). New York, NY: ACM, 2010, 165–70.

34. Tari, J., and I. Saizarbitoria. “The Self-Assessment Process and Impacts on Performance: A Case Study.”

35. Nigsch, S., and A. Schenker-Wicki. “Shaping Performance: Do International Accreditations and Quality Management Really Help?” University of Zurich Business Working Paper No. 326. November 2012. Available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2183699.

36. Blazey, M. “Baldrige Education: Sample Organizational Self-Assessment.”

 


Renae Spohn, MBA, RHIA, CPHQ, FAHIMA, FNAHQ. “The Self-Assessment Process and Impacts on the Health Information Management Program Performance: A Case Study.” Perspectives in Health Information Management (Spring 2015): 1-21.
