Formative Evaluation Using Checklists to Improve Research Proposals

Abstract

Developing research proposals that protect human participants and understanding the institutional review board (IRB) approval process require high-level application of many skills.

The purpose of this article is to describe how faculty can use formative evaluation techniques and checklists to guide students to build skills in writing research proposals for studies that involve human participants or their data. Formative evaluation, the process of critically reviewing work to improve it, is emphasized, and checklists that summarize IRB criteria and standards and present the critical content of research proposals for studies involving human participants are provided. Teaching principles that can guide faculty in using the checklists to give feedback and help students develop high-quality research proposals are discussed.

Key Words: Health information management; Compliance; Privacy; Confidentiality; IRB checklist; Human participant protection; Institutional review board; Clinical research

 

Introduction and Background

Research involving human participants or their data is frequently conducted by a variety of faculty, practitioners, and students in the health science disciplines. The research process begins with the development of a proposal, or a plan, to investigate a problem, question, or hypothesis. If the investigation requires the use of human participants or their data, then the proposal must comply with federal regulations designed to protect the welfare of the participants and the privacy of their data. Federal regulations require that an institutional review board (IRB) evaluate the research proposal and approve the use of participants. The IRB functions as an organizational committee charged with upholding human participant protections. It may also serve as a source of information and provide educational assistance to researchers regarding the use of human participants or their data. Knowledge of research proposal writing and IRB requirements is part of the required curriculum in many baccalaureate health science programs, including health information administration.1 Development of skills in these areas is customary in health science graduate programs as well. As practitioners, health information professionals assist researchers in many disciplines by extracting the patient care data needed for research.

All of these contexts require not only knowledge of proposal writing and IRB regulations but also the ability to continue learning and to update one’s knowledge as regulatory changes occur. Applying research findings that suggest that adults frequently learn on their own, Brookfield argued that teachers should be “facilitators of learning” rather than “transmitters of knowledge.”2 In a discussion of how to design significant learning experiences, Fink emphasized that assessment is an attempt to measure the quality of something.3 Assessment of this kind requires three features. First, assessment should be forward-looking: students are given information early and look ahead to what they will be expected to do over a semester-long course. Second, clear and appropriate criteria and standards are needed to measure quality. Third, to create an environment of formative evaluation, self-assessment of performance is needed. Students benefit from many opportunities to assess their own performance and rate the quality of their work. This evaluative process improves both the researcher’s skills and the research proposal itself.

The purpose of this article is to describe how faculty can use formative evaluation techniques and checklists to guide students to build skills in writing research proposals for studies that involve human participants. A high-quality research proposal for a study that involves human participants or their data includes substantive detail. Formative evaluation, the process of critically reviewing work to assess and improve it, is important for learning to write high-quality proposals. Formative evaluation can be contrasted with summative evaluation, which is review for the purpose of assigning a grade or an approved/not approved decision, such as that of an IRB. Formative evaluation feedback using checklists empowers students to enhance the quality of their research proposals. Checklists outline necessary content for research proposals involving human participants, and learning activities are intended to build application skills. Specifically, this article focuses on the use of human participants or their data in research and presents (1) an overview of how faculty at one institution developed research proposal preparation checklists to prepare novice researchers for presentation of their proposals to the IRB, and (2) information on how to use the checklists and other formative evaluation techniques to enhance learning.

Development and Content of IRB Checklists: Formulation Process and Model Description

Development

The IRB preparation checklists (see table 1, table 2, and table 3) were developed by health science faculty during a two-year collaborative project at a regional comprehensive university. Table 1 presents the essential content of the research protocol narrative. Table 2 includes the content of an informed consent letter. Table 3 presents the content of a survey cover letter. The process included applying teaching principles to create a culture of formative assessment in which the checklists were used to guide evaluative feedback.4 Classroom assessment techniques to improve teaching and learning were used to pilot test the checklists.5, 6

The checklists were:

  1. developed from the research literature and IRB requirements (15 health science graduate students collaborated to provide input)
  2. revised by content experts
  3. pilot tested by health science faculty working with student researchers and by faculty reviewing proposals for the university IRB
  4. rated by faculty and students during pilot testing, with 90 percent satisfaction on comprehensiveness, accuracy, readability, and formative usefulness, and
  5. revised with feedback from faculty and content experts.

The checklists address the most common forms of research by faculty and students in education, health sciences, allied health, nursing, and psychology.

Content

The IRB preparation checklists include what the literature emphasizes to be required content in research proposals according to federal guidelines used in IRB review. Documents and events frequently cited in the literature include the opening of the Clinical Center of the National Institutes of Health (1953), the World Medical Association’s adoption of the Declaration of Helsinki (1964), the U.S. Public Health Service’s review requirements for grant applications (1966), the public disclosure of the U.S. government’s 40-year syphilis study and the resulting expansion of institutional review board requirements (1970s), and the Belmont Report, Ethical Principles and Guidelines for the Protection of Human Subjects of Research (1979).7 More recently, the Health Insurance Portability and Accountability Act of 1996 (HIPAA) has placed stringent consent requirements on almost any use or disclosure of health information for research, whether for generalizable knowledge or for clinical trials.8

Included in the checklists are the primary items required in a proposal, with participant protection as first priority.9 The primary items for review of the proposal by students and instructor include narrative descriptions of participants, setting, instrumentation, testing procedures, intervention, and study administration. Procedures for data collection, data storage, and data analysis to protect human participants are carefully reviewed. Special attention is given to written informed consent, assent, and letters to participants. Even though regulatory compliance to protect human participants is the first priority of IRB members, they may offer suggestions on alternate ways to conduct a part of a study. The checklists are intended to provide formative evaluation so that student researchers can be successful in the summative evaluation process of IRB review.

Using the Checklists: Validation through Examples

Health service administration faculty and students use the checklists in the undergraduate Health Services Research Methods course at a regional comprehensive university. This course meets the accreditation standards of the Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM) to prepare health information administration professionals. The health service administration degree program includes options in health information management, health care administration, and ancillary health management. Each student in the research methods course must complete a research proposal. Examples of research studies that health information management students have proposed include studies that address the impact of HIPAA privacy regulations on professional practice, coding-productivity studies, studies that aim to identify factors that influence the selection or use of information technologies, analyses of the components of effective contingency plans for information services, and investigations of the relationships between managerial characteristics and successful careers. The course is more than 50 percent Web-based and relies on the IRB checklists to communicate important course content. Most student research proposals involve the use of participants or their data, so student proposals need to comply with federal regulations. Students gain initial knowledge of the IRB process and human participant protection by completing the online National Cancer Institute tutorial, Human Participant Protections Education for Research Teams.10 Knowledge of the IRB review process and human participant protections is required in baccalaureate-level programs in health information administration as well as in other health science majors.

In addition to the use of the checklists in the research methods course, other faculty use the checklists to assist graduate students in the health sciences to develop research proposals. In these cases, IRB training frequently begins with reading assignments and Web-based instructional tutorials. For example, students in the Master of Public Health degree program complete the online Human Participant Protections Education for Research Teams tutorial and use the IRB checklists while preparing their research proposals.11 Students in the Master of Science program in occupational therapy use the checklists to evaluate other student proposals. Using the checklists as a guide to evaluate proposals written by previous classes produces a clear understanding of the criteria and standards. The graduate students thus see a clear connection between IRB requirements, such as those listed in the Belmont Report, and the specific items included in the checklists. Novice faculty researchers also benefit from use of the checklists. While they may have expertise in some research methods, they may be developing research proposals with methods that are new to them. Health information management professionals work with graduate students and faculty to assist them in clinical research. Specifically, they assist with data collection and analysis that must be in compliance with IRB standards.

Overall, the checklists provide a basis for feedback when instructors and students assess rough drafts of required research proposals. If the assessment is conducted in person, then the checklists, with comments, are discussed and applied to improve the student’s work. If draft assessments are provided online or via e-mail attachments, the checklist items and relevant student work can be color-coded or numbered for reference. Peer assessments may also be used. Course evaluations have shown that using the checklists for formative evaluation has worked well for both faculty and students.

Discussion on Giving Feedback and Overcoming Resistance to Learning

Brookfield, when discussing skillful teaching, suggested five principles that can be adapted to help faculty overcome student resistance to learning.12 Special attention to guiding student learning may be needed with research-related education. Research proposal writing is usually a new experience for students, and they may be predisposed to find it disconnected from the clinical emphasis of their chosen major in a health science field. Using IRB preparation checklists with student researchers to provide formative evaluation of their research proposals can yield positive results. Table 1 is used with all proposals and addresses the content of the research protocol narrative. Faculty help students identify which items are applicable to their specific proposals. Research may also require informed consent; for example, when treatment is provided in a prospective study, informed consent is typically required (see Table 2). Mailed questionnaires with routine items are often used to collect data and should be accompanied by a survey letter of transmittal (see Table 3). Faculty need to emphasize that because each research proposal is unique, the student researcher should be thorough and address every checklist item that applies to the proposed study. Brookfield provides the following teaching principles that faculty can use to ease student anxiety and help students improve their proposals.13

Principle 1: Explain Your Intentions Clearly

“Explain your intentions clearly” is a guiding principle to keep in mind when the checklists are distributed to students.14 The checklists clearly communicate performance expectations in terms of criteria and standards. To learn essential content, new researchers might watch a video of the research proposal review process, read about the history of IRB regulation development, take a computer-assisted instructional tutorial on IRB requirements, and review IRB checklists. The checklists also provide a mechanism for feedback in the formative evaluation process. Researchers, excited to begin their work, may simply be unaware of the complex federal regulations that must be considered prior to conducting research, so performance expectations must be communicated clearly.

Principle 2: Improve Clarity in Instructions

“Improve clarity in teachers’ instructions” addresses the perception that ambiguity is one of the most demoralizing factors for students.15 Consequently, students benefit from knowing what is expected of them and by what criteria their efforts will be judged. The checklists specifically identify what needs to be included in a proposal or what parts of a proposal may need revision. Providing specific feedback and examples can help in rewriting, and the checklists depersonalize feedback by focusing on the content of the research proposal. Guided practice helps a student develop the ability to recognize a high-quality proposal. Students can rate and note revisions to a proposal as independent reviewers and then compare their responses to those of an expert reviewer. This helps build knowledge of essential content in the narrative section of a proposal, a survey cover letter, or a written informed consent letter.

Principle 3: Sort Out Causes of Resistance

“Try to sort out causes of resistance” is a principle that can increase efficiency in the revision process.16 Resistance is a complex phenomenon, and reasons for it are many. Often, students are concerned with meeting graduation deadlines, and faculty are concerned with getting their portfolios ready for tenure review. Be willing to discuss feedback with students. Many times, students are simply seeking reassurance and clarification. Review of draft proposals can take place together with discussion of the next phase of writing. Use of the checklists at intervals can help new researchers build confidence in their ability to develop a high-quality proposal that meets criteria and standards.

Principle 4: Conduct Regular Formative Evaluation

“Regular formative evaluation sessions” and “peer learning and peer teaching” can each provide a structure for regular review.17, 18 Development of proposals can proceed most effectively when time is taken to ask students deliberately and explicitly about problems they are having. Troubleshooting for a few minutes at the beginning of class can be an effective teaching method. Additional techniques that can be less intimidating for students than in-class discussion include private feedback, a buddy system for students, and small group feedback sessions. Students can practice using the criteria on their own work and make improvements based on that assessment.

Principle 5: Overcome the Fear of Looking Foolish in Public

Helping students “overcome the fear of looking foolish in public” can be accomplished via formative evaluation.19 Development of a research proposal and its presentation to and review by an IRB are “public” displays of work, and most students are not accustomed to them. The formative evaluation process and the use of checklists build student confidence and help eliminate errors before the student presents a proposal to the review board. This creates a “success” experience for students and may positively dispose them to develop proposals for other research. Faculty can often create these “success” experiences by providing careful review prior to the formal IRB meeting. High-quality feedback that is frequent, immediate, based on criteria and standards, and delivered in a constructive way can help the student develop a good proposal and avoid the embarrassment of not attaining IRB approval.

Mock IRB Meetings

All five learning principles can be utilized by having students conduct and observe mock IRB meetings. Students can apply the checklists to each other’s proposals and practice giving and receiving clear, depersonalized feedback in a public setting. After each mock meeting, students should be given a chance to discuss their feelings of resistance or confusion. The student observers are typically adept at noticing interactive and communicative nuances. In the observer role, they can provide further insight into the IRB process by describing what they have seen and heard. This exercise not only provides valuable information to improve student research proposals but also enhances students’ appreciation for the work of the IRB. In addition, Fink suggests that students perform better with the support of a friendly audience; specifically, “situations with a scoreboard and applause” facilitate higher performance levels.20

Conclusion

IRBs should be responsive to change and uphold evolving standards to protect human participants. Further, standards for human participation in research apply uniformly across a full range of healthcare settings, and students should be familiar with the outcome measures and quality improvement approach of an IRB. Research proposals must be of high quality and in regulatory compliance to protect human participants from harm or violations of basic rights such as privacy and confidentiality.21 Faculty should create an expectation in students that responsibility is shared by the local IRB and the investigator, guided by federal and state law as well as local organizational policy.22 This notion of shared responsibility suggests that research proposals often need to be revised to bring proposed research into compliance with regulations. For example, the AHIMA Code of Ethics sets forth high standards for patient privacy and confidentiality.23 Students must understand how to conduct research that respects the patient’s right to privacy. As HIM practitioners, they must apply their knowledge of research-related issues when other investigators want access to patient data. HIM professionals may be members of IRBs or work with IRBs to interpret patient privacy policies and regulations.

The use of checklists and comfort with evaluative review methods will improve both the process and the product of research using human participants. Development of these skills creates important links among practice, education, and research. Checklists that incorporate criteria and standards into a formative evaluation process can also be used to evaluate and improve other written papers, projects, and interdisciplinary work. In addition to the emphasis on formative evaluation using checklists, students’ research proposals are graded at the end of the course. The research proposal is a major assignment in many research courses. Additionally, examinations on research methods and IRB certificate examinations are used. External measures might include the online National Cancer Institute tutorial or the University of Miami Collaborative IRB Training Initiative (CITI) course as ways to measure student success or practitioner continuing education.24, 25 Faculty and practitioners in health sciences play an important role in developing student researchers and in assisting all researchers to conduct research that is in compliance with IRB requirements, and the use of checklists in a formative evaluation process can help to accomplish these important goals.

Lynnda J. Emery, EdD, OTR/L, is a professor of occupational therapy in the college of health sciences at Eastern Kentucky University in Richmond, Kentucky.

Carolyn Harvey, PhD, CIH, is an associate professor of environmental/occupational health in the college of health sciences at Eastern Kentucky University in Richmond, Kentucky.

Catherine M. Andersen, MPH, RHIA, CPHIMS, is an associate professor of health services administration in the college of health sciences at Eastern Kentucky University in Richmond, Kentucky.

 

Notes

 

1. AHIMA Education Strategy Committee (2004). “HIM Baccalaureate Degree Knowledge Cluster Content and Competency Levels 2005.” Available at http://library.ahima.org/xpedio/groups/public/documents/ahima/bok1_026323.pdf.
2. Brookfield, S. D. (Editor). Self-Directed Learning: From Theory to Practice. New Directions for Adult and Continuing Education 25. San Francisco: Jossey-Bass, 1985, p. 53.
3. Fink, L. D. Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. San Francisco: Jossey-Bass, 2003.
4. Ross, G. R., A. Schwaller, and J. Helmin. “Creating a Culture of Formative Assessment: The Teaching Excellence and Assessment Partnership Project.” In M. Kaplan and D. Lieberman (Editors), To Improve the Academy, no. 18. Bolton, MA: Anker Publishing, 2000, pp. 195–214.
5. Angelo, T. A., and K. P. Cross. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed. San Francisco: Jossey-Bass, 1993.
6. Cross, K. P., and M. H. Steadman. Classroom Research: Implementing the Scholarship of Teaching. San Francisco: Jossey-Bass, 1996.
7. Chadwick, G. L., and C. M. Dunn. “Institutional Review Boards: Changing with the Times?” Journal of Public Health Management and Practice 6, no. 6 (2000): 19–27.
8. Kouzoukas, D. L. “HIPAA’s Privacy Rule on Research: Insight into the Tension between Privacy and the Value of Knowledge.” Topics in Health Information Management 22, no. 4 (2002): 13–20.
9. McDaniel, D., M. Baker, and J. Lansink. “IRB Accreditation and Human Subject Protection.” Applied Clinical Trials, January 2002, 32–38.
10. National Cancer Institute. Human Participant Protections Education for Research Teams. Available at http://cme.cancer.gov/clinicaltrials/learning/humanparticipant-protections.asp.
11. Ibid.
12. Brookfield, S. D. The Skillful Teacher: On Technique, Trust, and Responsiveness in the Classroom. San Francisco: Jossey-Bass, 1990.
13. Ibid.
14. Ibid., p. 156.
15. Ibid., p. 151.
16. Ibid., p. 155.
17. Ibid., p. 156.
18. Ibid., p. 158.
19. Ibid., p. 152.
20. Fink, L. D. Creating Significant Learning Experiences, p. 93.
21. Chadwick G. L., and C. M. Dunn. “Institutional Review Boards.”
22. Stineman, M. G., and D. W. Musick. “Protection of Human Subjects with Disability: Guidelines for Research.” Archives of Physical Medicine and Rehabilitation 82, no. 12 (2001): Suppl. no. 2, S9–14.
23. AHIMA House of Delegates (2004). “American Health Information Management Association Code of Ethics, Revised and Adapted.” Journal of AHIMA 75, no. 10 (2004): 80A–80D.
24. National Cancer Institute, Human Participant Protections Education for Research Teams.
25. University of Miami. CITI Collaborative IRB Training Initiative Registration. Available at www.miami.edu/citireg/.

 

Article citation: Perspectives in Health Information Management  3;2, Winter 2006
