The Development and Testing of APTA Clinical Performance Instruments. Task Force for the Development of Student Clinical Performance Instruments


Research Report

The Development and Testing of APTA Clinical Performance Instruments

Background and Purpose. The purposes of this article are to describe the process of developing the physical therapist (PT) and physical therapist assistant (PTA) Clinical Performance Instruments (CPIs) and to present the available information on the psychometric properties of each instrument.

Subjects. Two hundred seventeen PTA students and 282 PT students participated in the pilot studies of the CPIs, and 181 PTA students and 319 PT students participated in field studies.

Methods. To construct each instrument, content was first gathered from a variety of instruments and American Physical Therapy Association documents related to PT and PTA practice and education. Data compiled during the pilot and field study phases of the project led to the construction of the fourth (final) versions of the CPIs, which, although not studied, are currently in use.

Results. Intraclass correlation coefficients (ICC [2,1]) measuring the interrater reliability of the CPI total score were good (ICC = .87) for the PT total score and moderate (ICC = .77) for the PTA total score. Construct validity was supported by the substantial differences in mean CPI score for students completing first as compared with final clinical experiences, by the correlation between CPI item scores and total days of clinical experience, and by the lack of correlation with the Social Skills Inventory score.

Discussion and Conclusion. Sale of the fourth (final) versions of the PT CPI occurred in November 1997 and of the PTA CPI in March 1998. Data based on psychometric evaluation of the final versions have not yet been collected and reported. In the task force's opinion, the third drafts can provide reliable and valid measurements of PT or PTA student clinical performance. The fourth versions were based on this iteration. [Task Force for the Development of Student Clinical Performance Instruments. The development and testing of APTA Clinical Performance Instruments. Phys Ther. 2002;82:329-353.]

Key Words: Clinical education; Education, physical therapy; Evaluation; Student outcomes assessment; Student performance.

Task Force for the Development of Student Clinical Performance Instruments

Physical Therapy. Volume 82. Number 4. April 2002

Physical therapist (PT) and physical therapist assistant (PTA) academic programs establish systems for evaluating students during their clinical experiences. Reasons for evaluation include, but are not limited to, determining whether the student's progress is satisfactory, assessing the student's readiness to enter practice, providing the student with feedback, and obtaining feedback on the education program relative to the currency, relevance, and application of content.1,2 Various systems and instruments have been developed for these purposes. These are theoretically based on the principles of competency-based education.3-5 Personnel at academic programs and clinical education sites have developed instruments, apparently with a goal of evaluating overall competence to practice as well as behavior specific to certain patient populations or clinical sites.5-12 Although many instruments had unique features, shared characteristics led some educators to seek consistency across educational programs (and students). Emergence of consortia, with personnel from multiple educational programs often working with a common group of clinical faculty, played a role in the development of uniform processes and instruments for assessing student clinical performance.6,7,9,13 These trends, the task force believes, set the stage for developing the American Physical Therapy Association (APTA) PT and PTA Clinical Performance Instruments (CPIs).

The purposes of this article are to describe the development of 2 CPIs (one for PTs and one for PTAs) and to provide information about the psychometric properties of the third drafts of these instruments (field study versions). This article does not report evidence about the psychometric properties of the final, published versions. No data are reported relating to the versions of the CPIs currently in use.
Background

In November 1993, a 10-person task force was charged by the APTA Board of Directors to develop clinical education evaluation instruments to measure student performance in PT and PTA clinical education.14 Task force members were appointed by the Board of Directors from a group of 75 individuals with a variety of backgrounds, experiences, and expertise who were nominated by personnel from PT and PTA academic programs. The task force began its work by agreeing on a context in which they believe clinical education is provided. This was done in an effort to ensure that the resulting instruments could be used with minimal training, were reflective of current practice expectations, and met the needs of academic institutions to comply with accreditation criteria.1,2

The task force began its work in 1994 by agreeing on 3 foundation assumptions to use as a guide for the development of the CPIs: (1) that clinical competence is based on multiple behaviors deemed essential to the role of the PT or PTA, (2) that the CPIs should be constructed to measure performance along a continuum from novice to at least entry level, and (3) that the instruments must be responsive to the needs of both academic and clinical communities.12,15 The task force believed that the design of the CPIs should allow measurement of behaviors, with multiple practitioners serving as educators. To achieve this goal, the task force attempted to develop instruments that could yield reliable scores as students provided patient/client care in a format appropriate for the environment to which they were assigned. Thus, the instruments would need to be psychometrically sound and provide useful information about student performance during clinical education.16

Members of the American Physical Therapy Association's Task Force for the Development of Student Clinical Performance Instruments:

Kathryn Roach, PT, PhD, is Associate Professor, Department of Physical Therapy and Department of Orthopaedics and Rehabilitation, and Associate Director for Research, University of Miami School of Medicine, Coral Gables, Fla. She was Assistant Professor, Department of Physical Therapy and Department of Orthopaedics and Rehabilitation, University of Miami School of Medicine, at the time of the study. She was primarily involved in the conceptual development, design, and ongoing revision of the instruments, design of pilot and field studies, management and coordination of data entry with the assistance of graduate students from the University of Miami, and resultant data analysis. She wrote portions of this article and reviewed and edited the final manuscript prior to submission.

Jody Gandy, PT, PhD, is Director, Department of Physical Therapy Education, American Physical Therapy Association, Alexandria, Va. She was Director of Clinical Education, American Physical Therapy Association, at the time of the study. She served as staff liaison to this group and was responsible for project management by ensuring that the Task Force met its charge in accordance with American Physical Therapy Association policies and expected timelines, budgeted and procured the necessary funds to complete the project, and assumed responsibility for data collection. She wrote portions of this article and reviewed, coordinated, and edited the final manuscript prior to submission.

Susan S Deusinger, PT, PhD, is Director, Program in Physical Therapy, and Associate Professor of Neurology, Washington University School of Medicine, St Louis, Mo. She was primarily involved in the conceptual development, design, ongoing revisions, design of pilot and field studies, and resultant data analysis, and she reviewed and edited the final manuscript prior to submission.

Sherry Clark, PT, MS, is Associate Professor, School of Pharmacy and Allied Health Professions, Creighton University, Omaha, Neb. She was Center Coordinator of Clinical Education, Shepherd Spinal Center, Department of Physical Therapy, Atlanta, Ga, and Associate Professor and Co-Academic Coordinator of Clinical Education, Department of Physical Therapy, School of Allied Health Sciences and Graduate Studies, Medical College of Georgia, Augusta, Ga, at the time of the study. She served as co-chair of this group and was primarily involved in the conceptual development, design, and ongoing revision of the instruments.

Pamela Gramet, PT, PhD, is Chairperson and Associate Director, Department of Physical Therapy Education, State University of New York Upstate Medical University, Syracuse, NY. She was Associate Professor and Academic Coordinator of Clinical Education, Department of Physical Therapy Education, College of Health Professions, SUNY Health Sciences Center at Syracuse, State University of New York at Syracuse, Syracuse, NY, at the time of the study. She was primarily involved in the conceptual development, design, and ongoing revision of the instruments.

Barbara Gresham, PT, MS, is Instructor, Physical Therapist Assistant Program, McLennan Community College, Waco, Tex. She was Program Director, Physical Therapist Assistant Program, McLennan Community College, Waco, Tex, at the time of the study. She served as co-chair of this group. She was primarily involved in the conceptual development, design, ongoing revision of the instruments, and review of the manuscript before submission.

Method

The work of the task force and the evolution of the CPIs occurred in 4 phases that culminated in the development of instruments currently in use.
These phases were: (1) development of the first drafts, including initial selection of target behaviors and a scoring/instructional protocol, (2) conduct of pilot studies using the second drafts of the PT and PTA CPIs created in response to feedback from a group of 50 people who the task force believed were experts in clinical education and research based on information provided on the nomination forms and in individual résumés, (3) testing of the third drafts to determine reliability, validity, and feasibility via field studies, and (4) modification of the third drafts in response to the field studies and preparation of the final versions for adoption by the APTA Board of Directors and for sale by APTA.

Phase I: First Drafts of the CPIs

The first drafts of the CPIs contained 23 PT and 20 PTA performance criteria and sample behaviors describing observable indicators for each performance criterion. Criteria were developed in an effort to be consistent with documents such as the first draft of A Normative Model for Physical Therapist Education and the Guide to Physical Therapist Practice, Volume I: A Description of Patient Management,17 used by academic programs or clinical sites. A Normative Model for Physical Therapist Education describes practice expectations, educational outcomes, and content for the preferred PT curriculum. The Guide to Physical Therapist Practice describes the breadth and depth of PT practice, including patient or client management.

Sample behaviors describing observable indicators for each performance criterion item were then identified. For example, in the first drafts of the CPIs, the performance criterion "Performs physical therapy treatment that achieves desired outcomes" included sample behaviors such as "performs treatment consistent with the plan of care," "provides treatment in a manner minimizing risk to the patient and others involved in the delivery of the patient's care," "adapts physical therapy treatment to meet the individual needs and responses of the patient," and "provides treatment in a manner minimizing risk to self."

A visual analog scale (VAS) was selected for educators to record the quality of observed behavior for each item on the CPIs. A horizontal line, 100 mm in length, was used to represent the continuum of points between the lowest level and the highest level of student performance that could be observed by a clinical instructor (CI). The line was anchored on the far left with the words "Novice Student Clinician" and on the far right with the words "Expert Clinician." A mid-range anchor was placed at 60 mm and was labeled with the words "Entry-Level Clinician." Use of the VAS as a recording format has been suggested to be appropriate when evaluating complex human performance that cannot (and perhaps should not) be divided into the type of discrete units of behavior easily recorded using other formats.18,19

In addition, because continuous scales such as the VAS can reflect degree of change20 better than categorical scales,21 the task force members believed that the VAS met the goals of clinical education assessment more effectively than would other approaches. Because of the large number of possible ratings available with a VAS, it may also decrease problems associated with end aversion bias,21 which causes raters to avoid extreme rating categories. Because of end aversion bias, some authors21 argue that a 5-point Likert scale might actually be used as a 3-point scale. There is a belief that loss of response categories tends to decrease both efficiency and reliability.21 Members of the task force also felt that the use of a VAS would address the problem of raters adding plus or minus designations or decimals to categorical scales. This decision was based on the experience of task force members in using other instruments. Finally, the task force members feared that respondents may attach meaning to the numbers on a rating scale that is distinct from the verbal descriptors attached to those numbers.21 The task force felt that this phenomenon was a likely occurrence in an academic setting, where numbers are often associated with grades and with student success or failure.

The first drafts of the CPIs included performance criteria that the task force considered essential, minimum elements for clinical practice. Four performance items related to safety and professional behavior were identified. The task force agreed that a problem with one of these items would be a warning, or red flag, for serious problems with student performance. The first drafts also included a preamble to provide users with a rationale for developing the CPIs, the basic assumptions upon which the instruments were designed, and reasons for considering their use. Directions for use of the CPIs also were included with the drafts.

Paul Hagler, PhD, is Associate Dean, Graduate Studies and Research, Faculty of Rehabilitation Medicine, University of Alberta, Edmonton, Canada. He was Professor and Director, Faculty of Rehabilitation Medicine, Centre for Studies in Clinical Education, University of Alberta, at the time of the study. He was primarily involved in the conceptual development, design, ongoing revision of the instruments, design of pilot and field studies, resultant data analysis, and review of the manuscript before submission.

Rebecca Lewthwaite, PhD, is Director of Research and Education in Physical Therapy, Rancho Los Amigos National Rehabilitation Center, Rancho Los Amigos Medical Center, Downey, Calif. She was Director, Center for Research in Biokinesiology, and Associate Director of Physical Therapy for Research and Education, Rancho Los Amigos Medical Center, at the time of the study. She was involved in the conceptual development, design, ongoing revision of the instruments, design of pilot and field studies, and resultant data analysis.

Bella J May, PT, EdD, FAPTA, is President, BJM Enterprises, PC, Dublin, Calif, and Professor Emerita, Medical College of Georgia, Augusta, Ga. She was Professor and Co-Academic Coordinator of Clinical Education, Department of Physical Therapy, School of Allied Health Sciences and Graduate Studies, Medical College of Georgia, at the time of the study. She was primarily involved in the conceptual development, design, and ongoing revision of the instruments.

Babette Sanders, PT, MS, is Assistant Professor, Department of Physical Therapy and Human Movement Science, Northwestern University Medical School, Chicago, Ill. She was Instructor, Programs in Physical Therapy, Northwestern University Medical School, and Part-time Clinician, Department of Physical Therapy, Evanston Hospital, Evanston, Ill, at the time of the study. She was primarily involved in the conceptual development, design, and ongoing revision of the instruments.

Michael J Strube, PhD, is Professor, Department of Psychology, Washington University, St Louis, Mo. He served as consultant psychometrician from July 1996 to April 1997. He provided assistance in field study design and construction, data analysis, and enhancement of psychometric properties of the instruments using a visual analog scale, and he reviewed and commented on the final manuscript before submission.

Yolanda Rainey, PT, MS, is Area Supervisor, Rehab Outreach of St Louis, Mo, at Oakwood Nursing and Rehabilitation Center, Virginia Beach, Va. She was Director of Rehabilitation and Center Coordinator of Clinical Education, NovaCare Geriatric Center, Portsmouth, Va, at the time of the study. She served on this group from 1994 to 1995 and was primarily involved in the initial development of the first draft versions of the instruments.

Funding from the American Physical Therapy Association supported the work of the Task Force for the Development of Student Clinical Performance Instruments.
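Scoring a CPI item amounts to measuring, in millimeters, the position of the CI's mark along the 100-mm line, with the entry-level anchor at 60 mm. The sketch below illustrates that arithmetic under stated assumptions: the class and function names, the example item wordings, and the 30-mm alert threshold are all hypothetical, invented for illustration; the CPIs themselves prescribe no numeric cutoff for red-flag concern.

```python
from dataclasses import dataclass

LINE_LENGTH_MM = 100.0   # left anchor "Novice Student Clinician", right anchor "Expert Clinician"
ENTRY_LEVEL_MM = 60.0    # mid-range anchor labeled "Entry-Level Clinician"

@dataclass
class CPIItemRating:
    criterion: str    # abbreviated performance-criterion wording (illustrative)
    red_flag: bool    # True for the safety/professional-behavior items the task force flagged
    mark_mm: float    # distance of the CI's mark from the left anchor, in mm

    def as_fraction_of_entry_level(self) -> float:
        """Express the mark relative to the 60-mm entry-level anchor."""
        return self.mark_mm / ENTRY_LEVEL_MM

def at_risk_items(ratings, alert_mm=30.0):
    """Return red-flag criteria marked below a purely hypothetical alert threshold."""
    return [r.criterion for r in ratings if r.red_flag and r.mark_mm < alert_mm]

ratings = [
    CPIItemRating("Practices in a safe manner", red_flag=True, mark_mm=24.0),
    CPIItemRating("Performs a physical therapy examination", red_flag=False, mark_mm=55.0),
]
print(at_risk_items(ratings))  # ['Practices in a safe manner']
```

Because the mark is continuous rather than categorical, no precision is lost to category boundaries, which is the end-aversion rationale the task force describes; any conversion of millimeter scores into course grades remained a separate, program-level decision.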
The first draft versions of the CPIs were reviewed by 50 people within and external to physical therapy who, in the task force's opinion, possessed expertise in academic and clinical education, outcome assessment, evaluation, and psychometric tests and measurements. Individuals who had been nominated but not appointed to the initial task force by APTA's Board of Directors formed the group. Feedback was gathered from this group on the structure of the instruments, clarity of directions, relevance and number of performance criteria, consistency between PT and PTA instruments, and the mechanism for identifying problematic student performance on any single criterion. The group recommended that the task force clarify the directions in an effort to eliminate what the group perceived as ambiguity. The group suggested that the task force describe how assignment of grades would be made by an academic program. In addition, they suggested that the task force identify the purpose of sample behaviors, clarify that the list of sample behaviors was not exhaustive, and provide a mechanism for assessing performance during midterm and final clinical education experiences. They also suggested that the PT and PTA instruments, where appropriate, be consistent to make it easier for the CI rater who may evaluate PT and PTA students at the same time in the same clinical facility, using the same instrument.

Phase II: Second Drafts of the CPIs (Pilot Study Versions)

Modifications From the Previous Versions

The format of the CPIs was modified according to suggestions made by the group of 50 experts. The second drafts, or pilot study versions, of the CPIs included 23 PT and 20 PTA criteria (Tabs. 1 and 2). Two additional items (items 24 and 25 of the PT CPI and items 21 and 22 of the PTA CPI) were added to allow rating of the student's overall performance relative to academic and clinical expectations and entry-level performance. Particularly important performance criteria were labeled as red-flag items (Figure), and a symbol of a flag was added to the left of those performance criteria that were identified as such. A Significant Concerns/At-Risk check box also was added to allow the CI to indicate at midterm or final evaluation when, in the estimation of the CI, a student's performance placed him or her at risk for failing the clinical experience (Figure). Additional CPI components included tables of contents to make it easier for the user to locate information within the CPI, glossaries to define terminology, and bibliographies to assist in user training that often occurs only through on-site materials reviewed by the CI. Members of the task force also added vignettes that depicted a hypothetical new graduate PT and a new graduate PTA who demonstrated competencies and deficiencies that might be observed in real life.

Table 1. Second Draft (Pilot Study Version) of the Physical Therapist Clinical Performance Instrument: Interrater Reliability of Scaled Items (ICC [2,1]; N = 28 students rated for each item)

1. Practices in a safe manner that minimizes risk to patients, self, and others. ICC = .44
2. Demonstrates professional behavior during interactions with others. ICC = .42
3. Adheres to ethical practice standards. ICC = .38
4. Adheres to legal practice standards. ICC = .35
5. Communicates in ways that are congruent with situational needs. ICC = .42
6. Produces documentation to support the delivery of physical therapy services. ICC = .42
7. Incorporates cultural considerations into the delivery of physical therapy care. ICC = .11
8. Applies principles of logic and scientific method to the practice of physical therapy. ICC = .60
9. Screens patients using procedures to determine the effectiveness of and need for physical therapy services. ICC = .41
10. Performs a physical therapy examination. ICC = .57
11. Evaluates clinical findings to arrive at a physical therapy diagnosis. ICC = .53
12. Designs a physical therapy plan of care that integrates goals, treatments, and discharge plan. ICC = .59
13. Performs physical therapy treatment that achieves desired patient outcomes. ICC = .18
14. Educates others (patients, family, caregivers, staff, students, other health care professionals) using relevant and effective teaching methods. ICC = .38
15. Participates in activities assuring quality of service delivery. ICC = .21
16. Participates in physical therapy consultation services. ICC = .62
17. Addresses patient needs for services other than physical therapy. ICC = .04
18. Manages resources to achieve goals of the practice setting. ICC = .22
19. Participates in fiscal management of the physical therapy practice setting. ICC = .02
20. Utilizes support personnel according to legal and ethical guidelines. ICC = .18
21. Recognizes that a PT has professional/social responsibilities not definable in terms of work hours and job description. ICC = .32
22. Formulates and implements a self-directed plan for career development. ICC = .54
23. Addresses prevention, wellness, and health promotion needs of individuals, groups, and communities. ICC = .14
24. Rate this student's overall performance relative to academic and clinical expectations. ICC = .59
25. Rate this student's overall performance relative to entry level. ICC = .17

Rating scale anchors: Novice Student Clinician (far left), Entry-Level Clinician (mid-range), Expert Clinician (far right). PT = physical therapist, ICC = intraclass correlation coefficient, N = number of students rated.

Table 2. Second Draft (Pilot Study Version) of the Physical Therapist Assistant Clinical Performance Instrument: Interrater Reliability of Scaled Items (ICC [2,1]; N = 10 students rated for each item)

1. Practices in a safe manner that minimizes risk to patients, self, and others. ICC = .70
2. Demonstrates professional behavior during interactions with others. ICC = .74
3. Adheres to ethical practice standards. ICC = .66
4. Adheres to legal practice standards. ICC = .54
5. Communicates in ways that are congruent with situational needs. ICC = .73
6. Produces documentation to support the delivery of physical therapy services. ICC = .72
7. Incorporates cultural considerations into the delivery of physical therapy care. ICC = .35
8. Makes clinical decisions within the scope of PTA practice. ICC = .72
9. Obtains accurate information by performing selected tests and measurements consistent with the plan of care. ICC = .80
10. Participates in modifying the plan of care. ICC = .90
11. Performs physical therapy treatment based on a plan of care established by a PT. ICC = .90
12. Educates others (patients, family, caregivers, staff, students, other health care professionals) using relevant and effective teaching methods. ICC = .70
13. Participates in activities assuring quality of service delivery. ICC = .87
14. Participates in addressing patient needs for services other than physical therapy. ICC = .82
15. Manages resources to achieve goals of the practice setting. ICC = .68
16. Participates in fiscal management of the physical therapy practice setting. ICC = .78
17. Utilizes support personnel according to legal and ethical guidelines. ICC = .87
18. Recognizes that a PTA has responsibilities not definable in terms of work hours and job description. ICC = .70
19. Formulates and implements a self-directed plan for career development. ICC = .68
20. Assists the PT in addressing prevention, wellness, and health promotion needs of individuals, groups, and communities. ICC = .89
21. Rate this student's overall performance relative to academic and clinical expectations. ICC = .79
22. Rate this student's overall performance relative to entry level. ICC = .89

Rating scale anchors: Novice Student Clinician (far left), Entry-Level Clinician (mid-range), Expert Clinician (far right). PT = physical therapist, PTA = physical therapist assistant, ICC = intraclass correlation coefficient, N = number of students rated.

Information Collected From Physical Therapy Community

In preparation for conducting the pilot studies of the PT and PTA CPIs, feedback was sought from potential users on the second drafts of the PT and PTA CPIs through a survey and the conduct of regional forums. Beginning in October 1995, second drafts of the CPIs were disseminated to 434 physical therapy academic program directors, 454 academic coordinators of clinical education (ACCEs), and their respective clinical education sites and CIs in the United States and Canada. Notification of the opportunity to review these draft instruments was provided through the Education Division newsletter, R.E.A.D. (December 1995), and PT Magazine (October 1995). Approximately 50 people requested copies of the CPIs. Through the survey, the task force requested feedback on issues such as content, format, and process. Examples of issues raised included: how to distinguish performance requirements of the PT and PTA CPIs, whether the instruments were sufficiently comprehensive and user-friendly, how to mark the VAS to indicate student performance levels, how VAS scores can be converted into grades for use by academic programs, and clarity of the directions.

Ten APTA-sponsored regional forums were held between February 1996 and June 1996. These forums were presented by members of the task force for members of academic programs and consortia composed of groups of academic and clinical educators throughout the United States and in Victoria, British Columbia, Canada. During each forum, participants were asked semistructured questions to obtain verbal feedback, and they were surveyed in writing to obtain further comments and opinions. The survey instruments were distributed during the forum and were collected from participants at the completion of the forum. More than 700 people attended these forums. An estimated 350 additional people provided feedback and comments to APTA staff by mail on the written survey that accompanied the evaluation instruments, resulting in feedback from approximately 1,050 people.

Pilot Studies

Pilot studies were conducted on the second drafts of the PT and PTA CPIs between October 1995 and April 1996 in an effort to provide information on the internal consistency, construct validity, and interrater reliability of these versions of the instruments as well as on the factor structure of the PT CPI. The task force considered such analyses necessary to refine the CPIs and to test the use of the CPIs in the clinical setting.

Procedure

Protection of human subjects. The University of Miami Medical Subcommittee for the Protection of Human Subjects reviewed the pilot study protocol. Because the researchers did not know the identity of the students or the CIs and the students were required to mail the data to APTA, thereby consenting to participate, this protocol was exempted from obtaining written informed consent from the students and CIs.

Instruments. In addition to the PT and PTA CPIs, several instruments were used in this study to examine characteristics and satisfaction of participants in the study. The Student Survey requested information about demographics, academic and clinical preparation, and satisfaction with the use of the CPIs. The CI Survey requested information on demographics, clinical setting, and satisfaction with the use of the CPIs.
The ACCE Survey requested information on demographics and satisfaction with the use of the CPIs. User satisfaction was examined by having students, CIs, and ACCEs rate their level of satisfaction with various aspects of the CPIs, including time to complete, ease of use, and clarity of instructions. User satisfaction was measured using a 7-point Likert scale. A rating of 1 indicated that the respondent was very dissatisfied, and a rating of 7 indicated that the respondent was very satisfied. Subjects. The sample size identified by members of the task force for conducting the pilot studies was limited to 350 PT students and 350 PTA students. This sample size was set by the task force because the group believed it was a reasonable target sample that could be obtained without knowing in advance how many programs would consent to participate and would have students on 334. Task Force for the Development of Student Clinical Performance Instruments Physical Therapy. Volume 82. Number 4. April 2002

Figure. Sample performance criterion from the fourth draft (final version) of the physical therapist Clinical Performance Instrument (CPI). This sample performance criterion is the same for the physical therapist assistant CPI, with the exception of a sample behavior bulleted under "c" that states "does not provide interventions and consults the physical therapist g/supervisor." Superscript g indicates that the term is defined in the glossary.

clinical education experiences during the time that the study was to be conducted. The sample size also was set in an effort to ensure a sufficient sample for statistical analyses that would result in data that could be used to make decisions regarding the next draft versions of the CPIs. To obtain this sample, a 2-page questionnaire was mailed to all accredited and developing PT and PTA education programs to determine: (1) the willingness of the programs to participate in a pilot study of the PT or PTA CPI and (2) when students would be completing each of their clinical education experiences during the 1995-1996 academic year. Representatives of PT and PTA programs that could participate in the pilot studies and who had students completing their clinical education experiences between October 1995 and February 1996 were then contacted by telephone and asked to participate. When multiple programs from the same state were able to participate, the task force considered obtaining a sample that included both public and private institutions, degrees of different levels, and varied levels of clinical education experience (eg, first, intermediate, and final). In addition, the length of clinical education experiences was considered. Personnel from academic programs that agreed to participate in the pilot study identified, in the aggregate, 350 PT students and 350 PTA students willing to participate in the study. The pilot studies actually included 282 PT students from 31 accredited US PT professional education programs (9 baccalaureate and 21 master's degree) representing 26 states and 24 students from 2 PT education programs in Canada (Quebec and Ontario). Two hundred seventeen PTA students from 23 accredited US PTA education programs representing 17 states also participated in the pilot studies.
Physical therapist students were first-year, second-year, and, in some cases, third-year students engaged in all levels of clinical education, including first, intermediate, and culminating clinical experiences. Likewise, PTA students were both first- and second-year students engaged in all levels of clinical education. Lengths of clinical experiences for PT and PTA students ranged between 1 and 9 weeks for both part-time (<35 hours/week and >1 week) and full-time (≥35 hours/week and >1 week) experiences. A subset of pilot study participants, 70 pairs of PT CIs and 42 pairs of PTA CIs, each pair supervising one student, volunteered to be involved in the interrater reliability phase of the study. On behalf of the academic programs, the ACCEs agreed in writing to identify possible student and CI subjects and to obtain verbal consent from those subjects to participate in the pilot study. The ACCE at each participating academic institution then obtained verbal agreement from students enrolled in the physical therapy program to participate in the study. When a student agreed to participate, the ACCE contacted the center coordinator of clinical education (CCCE) at the student's assigned clinical education site to determine whether the CCCE was willing to participate. If personnel at the clinical site were willing to participate, the CCCE obtained verbal agreement from the CI assigned to the student. If the CI agreed to participate, APTA's Department of Clinical Education was notified that a study pair (one supervising CI and one student) was available at that clinical education site during a specified time period. For the interrater portion of the study, clinical sites were identified where a second CI was available and willing to evaluate the student's performance without consulting with the PT who supervised the student.
The CIs and students were instructed to first complete the instrument that was typically used by the academic program for midterm and final evaluations to assess the students' performance. Once this was done, the CIs and students also completed the CPI for both the midterm and final evaluations. The students and CIs were instructed to compare their midterm and final evaluations on the CPI. In the interrater reliability portion of the study, both CIs were instructed to independently conduct midterm and final evaluations using the CPI and not to discuss or compare results. Neither the students nor the CIs were instructed to attach numbers to the vertical marks on the VAS. After the final evaluation was completed, the students and the CIs independently completed their respective survey questionnaires and placed both copies of the CPI and their survey questionnaires in stamped envelopes addressed to the researchers. The students then sealed the envelopes and had both the prerogative and the responsibility to place the completed CPIs and survey questionnaires in the mail. The students and CIs were linked as a pair through their identification numbers, but neither the students nor the CIs could be individually identified. Upon receiving the completed CPIs, staff in APTA's Department of Clinical Education copied the CPIs completed by the CIs and used the code to identify the clinical education sites and academic institutions. Staff then forwarded copies of the completed CPIs from the supervising CIs to the ACCEs at the students' academic institutions. The ACCE Survey was included with this mailing. Once the ACCEs had received copies of the CPIs from all of their academic program's participating students, they were to complete and return the ACCE Survey. Once all CPIs had been copied and forwarded to the ACCEs, the master list was destroyed.

Analysis. For purposes of our study, the marks on the VAS were measured.
The zero on the ruler was aligned with the lower anchor of the VAS such that scores ranged

from a possible 0 to 100 mm for each scored CPI item. A total score was generated by taking an average of the item scores. Data entry was performed at the Division of Physical Therapy, University of Miami. All data were managed and analyzed using a VAX mainframe computer and SAS version 6.1* statistical software.22 Analyses of internal consistency, interrater reliability, validity, and user satisfaction were conducted.

Internal consistency. Internal consistency, in the view of the task force, represents a psychometric characteristic of instruments that are designed to measure one overall construct. In the case of the CPIs, that construct is performance as a PT or PTA. If an instrument is designed to measure one construct, then all items of the instrument should be related to that construct, and internal consistency measures the degree to which the items are related.23,24 The purpose of the CPIs is to measure quality of behavior, namely the ability of a student to perform as a PT or PTA. Conceptually, these levels of performance range from a novice level to an entry level. If an instrument is designed to measure various aspects of a single behavior (performance as a PT or PTA), then it is desirable that all the items included in the instrument measure different features of that behavior rather than different parts of dissimilar behaviors.21 Internal consistency of the PT and PTA CPIs was examined by calculating Cronbach alphas.

Interrater reliability. Reliability can be demonstrated for a measurement only when the measurement is applied to a specific population.21 Interrater reliability was selected by the task force for analysis in this study because the task force was most interested in whether 2 raters could agree about the performance of students at a given time.
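The internal-consistency computation described above (a Cronbach alpha over the 0-to-100-mm item scores, with the total score formed as the average of the item scores) can be sketched as follows. This is an illustrative Python/NumPy sketch with invented ratings; the study itself used SAS on a VAX mainframe.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach alpha for an (n_students x n_items) array of item scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(summed scores)),
    with variances computed across students.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented data: 6 students x 4 VAS items, each scored 0-100 mm.
ratings = np.array([
    [90, 85, 88, 92],
    [70, 65, 72, 68],
    [95, 97, 93, 96],
    [60, 58, 64, 61],
    [80, 82, 79, 85],
    [88, 90, 86, 91],
])
alpha = cronbach_alpha(ratings)

# Total score per student = average of that student's item scores, as in the study.
total_scores = ratings.mean(axis=1)
```

Items that track one another closely across students, as in these invented ratings, yield an alpha near 1, which is the pattern the task force reported for both CPIs.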
Interrater reliability of the CPIs was examined by calculating type 2,1 intraclass correlation coefficients (ICCs)23 for each of the items and for a total score generated by taking an average of the item scores.

Construct validity. Two hypotheses were generated to determine construct validity. The first was based on the belief that performance should differ between students on first clinical experiences and students on final clinical experiences. Evidence for construct validity was examined by calculating a Student t test to compare total CPI scores for students at the end of their first clinical experience with total CPI scores for students at the end of their last clinical experience. In the second hypothesis, the task force assumed that clinical performance of PT and PTA students should be related to the amount of prior clinical experience. Therefore, construct validity was examined by calculating Pearson correlation coefficients to determine the relationship between the CPI item scores and total days of clinical experience. Criteria for statistical significance were set at .002 to correct for multiple tests.

User satisfaction. User satisfaction with the CPIs was measured on a 7-point Likert scale, applied to 17 items, with 1 indicating that the respondent was very dissatisfied and 7 indicating that the respondent was very satisfied. The median level of satisfaction with all 17 aspects of using the instruments was calculated separately for PT and PTA students, for PT and PTA CIs, and for ACCEs. Medians rather than means were calculated because medians are appropriate for ordinal data.

Principal components analysis. A principal components analysis was performed to explore the number of distinct constructs represented in the CPIs. Because of the sample sizes, only data from the PT CPI were used for the analysis; the PTA CPI was not studied in this way.

* SAS Institute Inc, PO Box 8000, Cary, NC 27511.
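The type 2,1 ICC is the two-way random-effects, single-rater form of the intraclass correlation. A minimal NumPy sketch of that computation, using invented ratings from 2 CIs each scoring the same 5 students (the study's actual rater pairs scored every CPI item):

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects, single-rater intraclass correlation
    for an (n_subjects x k_raters) array of scores."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)   # per-subject means
    col_means = Y.mean(axis=0)   # per-rater means
    # Two-way ANOVA sums of squares
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_error = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-raters mean square
    mse = ss_error / ((n - 1) * (k - 1))    # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented VAS scores (mm): 5 students rated independently by 2 CIs.
pair_ratings = np.array([
    [90, 88],
    [75, 70],
    [95, 96],
    [60, 66],
    [82, 80],
])
icc = icc_2_1(pair_ratings)
```

When the 2 raters agree closely, as here, the coefficient approaches 1; raters who agree perfectly yield exactly 1.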
Results of the Pilot Studies
PT CPI internal consistency. The Cronbach alpha for the PT CPI was .97, indicating a high level of internal consistency.

PT CPI interrater reliability. Reliability estimates for the items on the PT CPI ranged from .02 to .62, with the majority of items demonstrating what the task force would consider moderate reliability (Tab. 1).

PT CPI validity. Pearson correlations between length of prior clinical experience and several items, including "Designs a physical therapy plan of care that integrates goals, treatments, and discharge plan," were ≥.49. The lowest correlation between length of prior clinical experience and a CPI item was .05, for the global rating item "Rate this student's overall performance relative to academic and clinical expectations." This result was anticipated by the task force because this item essentially asks the rater to adjust for prior clinical experience. This same item failed to demonstrate a difference in score between students on their first and final clinical experiences. All other items demonstrated differences between students on the first and final clinical experiences.

PTA CPI internal consistency. The Cronbach alpha for the PTA CPI was .96, indicating a high level of internal consistency.

PTA CPI interrater reliability. Reliability estimates for the items on the PTA CPI ranged from .35 to .89 (Tab. 2). This level of reliability was much higher than the task force anticipated and led the group to have some concerns about the independence of the 2 raters.
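The two construct-validity analyses described in the preceding section (a Student t test comparing total scores at the end of first versus final clinical experiences, and Pearson correlations between item scores and days of prior clinical experience) can be sketched as follows. The scores are invented for illustration, and only the test statistics are computed, not the decision against the .002 significance criterion.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xd, yd = x - x.mean(), y - y.mean()
    return (xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum())

def two_sample_t(a, b):
    """Student t statistic (pooled variance) comparing two independent groups."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    na, nb = a.size, b.size
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb))

# Invented total CPI scores: end of first vs end of final clinical experience.
first_totals = np.array([62.0, 70.0, 58.0, 66.0, 73.0])
final_totals = np.array([88.0, 92.0, 85.0, 90.0, 95.0])
t_stat = two_sample_t(final_totals, first_totals)

# Invented pairing of prior clinical experience (days) with one item's scores.
days = np.array([0, 10, 20, 40, 60, 80])
item_scores = np.array([55.0, 60.0, 68.0, 75.0, 83.0, 90.0])
r = pearson_r(days, item_scores)
```

A large positive t statistic is consistent with the hypothesized first-versus-final difference, and a strong positive r with the hypothesized relationship to prior experience.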

Table 3. Principal Component Analysis With Varimax Rotation of the Second Draft (Pilot Study Version) of the Physical Therapist Clinical Performance Instrument

Item (abbreviated wording): Component 1 loading, Component 2 loading, Final communality estimate
1. Practices in a safe manner: .39, .80, .80
2. Professional behavior: .37, .84, .85
3. Ethical practice standards: .34, .87, .89
4. Legal practice standards: .28, .85, .80
5. Communicates: .46, .82, .89
6. Documentation: .80, .43, .82
7. Cultural considerations: .41, .85, .90
8. Logic and scientific method: .79, .40, .79
9. Screens patients: .75, .56, .88
10. Physical therapy examination: .84, .43, .89
11. Physical therapy diagnosis: .87, .42, .94
12. Physical therapy plan of care: .85, .42, .91
13. Physical therapy treatment: .70, .62, .89
14. Educates others: .85, .38, .87
15. Assuring quality of service delivery: .87, .30, .85
16. Physical therapy consultation services: .85, .33, .84
17. Need for other services: .85, .30, .82
18. Manages resources: .69, .61, .86
19. Fiscal management: .86, .36, .88
20. Utilizes support personnel: .79, .45, .84
21. Professional/social responsibilities: .61, .66, .82
22. Career development: .52, .73, .82
23. Prevention, wellness, and health promotion: .78, .41, .78
24. Performance relative to expectations: .19, .44, .23
25. Performance relative to entry level: .72, .52, .79

PTA CPI validity. As was true for the PT CPI, the correlation between length of prior clinical experience and the global rating item "Rate this student's overall performance relative to academic and clinical expectations" was very small for the PTA CPI (.15). The correlation between length of prior clinical experience and a PTA CPI item score was much higher (.39) for items such as "Makes clinical decisions within the scope of PTA practice" and "Participates in modifying the plan of care." There were differences in average CPI scores between students on first and final clinical experiences for all the PTA CPI items.

Principal components analysis.
The unrotated principal components analysis of the PT CPI produced 2 components, with the first component accounting for 19% of the variance and the second component accounting for only 2% of the variance. All 25 items of the CPI loaded more strongly on the first component. A Varimax rotation was then performed. After rotation, the first component accounted for 12.1% of the variance, and the second component accounted for 8.7% of the variance, suggesting that the instrument might contain 2 constructs. Component 1 was defined by items 6, 8 through 20, 23, and 25, and component 2 was defined by items 1 through 5, 7, 21, 22, and 24. Items associated with the first construct represented physical therapy-specific clinical skills required for practice, such as the physical therapy examination, diagnosis, plan of care, and treatment. The items associated with the second construct were less specific to physical therapy in that they represented behaviors required in all areas of clinical practice and applicable across all patient types. These items included such general clinical behaviors as safe practice, ethical behavior, professional behavior, and cultural considerations (Tab. 3). However, all items showed moderate to strong loadings on both components, with correspondingly high communality estimates, suggesting that the PT CPI is probably measuring one underlying construct. The only exception to this finding was the global rating item "Rate this student's overall performance relative to academic and clinical expectations." This item had a communality estimate of only .23, probably reflecting the fact that it is the only item that rates overall or global performance relative to academic and clinical expectations rather than a specific performance criterion associated with a component of entry-level practice.
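The mechanics of a varimax rotation, and the communality property behind the one-construct interpretation (for an orthogonal rotation such as varimax, an item's communality equals the sum of its squared loadings), can be sketched as follows. The rotation routine below is the widely used SVD-based varimax algorithm, not necessarily the SAS procedure the task force ran; the 4x2 loading matrix is invented, and the three checked rows are taken from Table 3.

```python
import numpy as np

def varimax(loadings, n_iter=100, tol=1e-8):
    """Varimax rotation of a (p items x k components) loading matrix.
    Returns the orthogonally rotated loadings (standard SVD-based algorithm)."""
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)
    d_old = 0.0
    for _ in range(n_iter):
        Lr = L @ R
        # SVD of the gradient of the (unnormalized) varimax criterion
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p)
        )
        R = u @ vt  # orthogonal rotation matrix
        d_new = s.sum()
        if d_old != 0 and d_new < d_old * (1 + tol):
            break
        d_old = d_new
    return L @ R

# Invented unrotated loadings for 4 items on 2 components.
unrotated = np.array([
    [0.80, 0.30],
    [0.75, 0.35],
    [0.40, 0.70],
    [0.35, 0.75],
])
rotated = varimax(unrotated)

# Varimax is orthogonal, so each item's communality (sum of squared
# loadings) is unchanged by the rotation.
communality_before = (unrotated ** 2).sum(axis=1)
communality_after = (rotated ** 2).sum(axis=1)

# Spot-check of Table 3: communality ~ loading1^2 + loading2^2
# for items 1, 11, and 24 (reported estimates .80, .94, .23).
table3 = {1: (0.39, 0.80, 0.80), 11: (0.87, 0.42, 0.94), 24: (0.19, 0.44, 0.23)}
errors = {item: abs(a * a + b * b - h2) for item, (a, b, h2) in table3.items()}
```

The spot-check reproduces the reported communality estimates to within rounding, including the low .23 estimate for the global rating item.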
Limitations of the Pilot Study
The students, CIs, and ACCEs who participated in the pilot study were not trained in the use of the CPI or in the study protocol, other than being given written instructions included in the study protocol and the

Table 4. Physical Therapist Clinical Performance Instrument Field Study: Descriptive Statistics of the Final Score for Each Item

Item No.: N, Mean, SD, Range, % Not Observed
1: 328, 93.0, 14.3, 5-100, 0.0
2: 327, 95.8, 11.1, 6-100, 0.4
3: 328, 96.2, 10.6, 16-100, 0.0
4: 322, 94.9, 14.1, 16-100, 1.8
5: 283, 94.4, 15.4, 11-100, 13.7
6: 324, 92.0, 14.5, 23-100, 1.2
7: 327, 89.3, 18.9, 0-100, 0.4
8: 318, 93.7, 14.8, 12-100, 3.1
9: 301, 87.1, 19.9, 10-100, 8.2
10: 258, 87.7, 21.5, 4-100, 21.4
11: 320, 87.5, 20.9, 4-100, 2.4
12: 315, 85.3, 22.2, 5-100, 4.0
13: 323, 87.2, 21.4, 0-100, 1.5
14: 322, 88.8, 20.1, 3-100, 1.8
15: 306, 90.1, 19.2, 3-100, 6.7
16: 225, 86.7, 20.5, 0-100, 31.4
17: 121, 87.5, 19.4, 0-100, 63.1
18: 201, 85.2, 20.2, 4-100, 38.7
19: 314, 87.2, 21.5, 2-100, 4.3
20: 262, 90.1, 17.6, 5-100, 20.1
21: 246, 88.8, 20.9, 0-100, 25.0
22: 285, 90.6, 19.2, 0-100, 13.1
23: 317, 91.0, 18.9, 9-100, 3.4
24: 144, 85.5, 22.1, 5-100, 56.1
25: 313, 97.4, 7.7, 50-100, 4.6
26: 298, 89.1, 19.6, 6-100, 14.0

Table 5. Physical Therapist Assistant Clinical Performance Instrument Field Study: Descriptive Statistics of the Final Score for Each Item

Item No.: N, Mean, SD, Range, % Not Observed
1: 192, 90.9, 14.7, 18-100, 0.5
2: 191, 94.1, 11.6, 29-100, 1.0
3: 192, 94.0, 11.5, 15-100, 0.5
4: 190, 95.1, 11.2, 6-100, 1.5
5: 190, 94.0, 12.5, 34-100, 1.5
6: 193, 89.7, 15.3, 7-100, 0.0
7: 189, 87.8, 17.1, 3-100, 2.1
8: 184, 93.7, 13.5, 3-100, 4.7
9: 182, 85.9, 18.5, 0-100, 5.7
10: 183, 86.7, 19.1, 0-100, 5.2
11: 181, 85.2, 19.6, 4-100, 6.2
12: 189, 88.2, 16.8, 7-100, 2.1
13: 173, 85.2, 19.8, 0-100, 10.4
14: 150, 82.8, 22.2, 1-100, 22.3
15: 116, 81.4, 22.8, 1-100, 39.9
16: 184, 87.1, 18.8, 8-100, 4.7
17: 139, 88.6, 18.4, 3-100, 28.0
18: 146, 84.7, 21.1, 17-100, 24.3
19: 183, 89.9, 15.5, 23-100, 5.2
20: 87, 82.7, 23.8, 5-100, 54.9
21: 188, 95.3, 9.6, 40-100, 2.6
22: 170, 90.3, 16.7, 0-100, 11.9

directions included with the CPI.
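Tables 4 and 5 report, for each item, the number of observed ratings (N), the mean and standard deviation of the final scores, the range, and the percentage of ratings marked Not Observed. A minimal sketch of how one such row could be computed, with invented scores and NaN standing in for a "Not Observed" rating:

```python
import numpy as np

def item_summary(scores):
    """Summarize one CPI item's final scores (0-100 mm VAS).
    np.nan marks a 'Not Observed' rating. Returns the quantities shown in
    Tables 4 and 5: N, mean, SD, min, max, and % Not Observed."""
    scores = np.asarray(scores, dtype=float)
    observed = scores[~np.isnan(scores)]
    n = observed.size
    return {
        "N": n,
        "mean": round(observed.mean(), 1),
        "SD": round(observed.std(ddof=1), 1),
        "min": observed.min(),
        "max": observed.max(),
        "% not observed": round(100 * (scores.size - n) / scores.size, 1),
    }

# Invented raw final scores for one item rated for 8 students,
# two of whom did not have the criterion observed.
raw = [93, 88, np.nan, 100, 76, np.nan, 95, 90]
row = item_summary(raw)
```

The Not Observed ratings are excluded from N, the mean, the SD, and the range, but counted in the final percentage, matching the structure of the two tables.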
Although these written instructions and directions described the use of the VAS, explained the distinction between criteria and behavioral criteria, and cautioned the evaluator about the risks of rater bias and inconsistency, reliability estimates may have been adversely affected by the lack of training and control. Although training is now available on the CPI, one cannot assume that all CIs using the CPI have received such training. Therefore, in those situations where training is not available or has not been provided, the situation under which the CPIs were tested may mimic the real world. Because of the design of the study, the researchers were unable to determine whether the subjects read, understood, or followed the study protocol or directions to complete the CPI.25 The task force acknowledges that wording and terminology problems could cause some items to be less reliable, but also that large-scale personal training of CI evaluators is not feasible. Data generated from this study were used to compare the relative performance of various items based on our assumption that any methodological difficulties would affect all items equally. Therefore, items with lower reliability coefficients were examined for possible problems with wording and terminology. Results of the pilot studies, along with input from regional forums and other focus group meetings, were used to revise the second versions of the CPIs. In addition, the task force's experience in working with raters, feedback from a community of interest (ie, academic faculty and clinical educators), and analyses from the pilot studies were used to design and conduct the field studies.

Phase III: Third Drafts of the CPIs (Field Study Versions)
Modifications From the Previous Versions
Information from PT and PTA academic faculty and researchers, PT and PTA clinical educators, PT and PTA

students; data from the pilot study; and consultation provided by a psychometrician were used by the task force to make changes to the PT and PTA CPIs. In the third drafts of the CPIs, the performance criteria were expanded, with the PT CPI increasing to 24 items and with the 20 PTA performance criteria retained (Tabs. 4 and 5). This draft introduced a modified VAS that eliminated the "Expert Clinician" anchor at the far right as used in the first and second versions. The result was the inclusion of 2 anchors on the VAS; the far left anchor was labeled "Novice Clinical Performance," and the far right anchor was labeled "Entry-Level Performance." The task force chose to make this modification based on (1) feedback offered by PT and PTA academic faculty and clinical educators about the likelihood that students would (or should) achieve entry-level performance status and (2) advice from the consultant that multiple VAS anchors may complicate or confuse rater behavior. Some physical therapy educators argued that a mechanism was needed to enable acknowledgment of excellence in student performance. Thus, a "With Distinction" box was added to recognize student performance that exceeded entry level on any criterion (Figure). Wording changes in the performance criteria and instructions were made in an effort to enhance clarity and consistency with APTA documents, including the Evaluative Criteria for the Accreditation of Education Programs for the Preparation of Physical Therapists,1 the Evaluative Criteria for the Accreditation of Education Programs for the Preparation of Physical Therapist Assistants,2 the third draft version of A Normative Model for Physical Therapist Professional Education (to evolve into A Normative Model of Physical Therapist Professional Education: Version 97 26), and preliminary findings from a consensus conference convened to develop A Normative Model for Physical Therapist Assistant Education: Version 98.
27 Further modifications included adding new terms and their definitions to the glossary, clarification and expansion of the sample behaviors, clarification of the performance criteria and user instructions, and refinement of the case vignettes. New additions to this version included a box to indicate that an entire performance criterion was "Not Observed" (Figure) and an area for student and evaluator signatures.

Information Collected From the Physical Therapy Community
Availability of the third draft versions of the CPIs for review and comment by request was announced in the December 1996 issue of PT Magazine; the November 8, 1996, issue of PT Bulletin; the November 1996 Education Division newsletter, R.E.A.D.; and the Fall 1996 Education Section newsletter. Approximately 80 copies of these instruments were requested for review. In addition, third draft versions of the CPIs were mailed to 493 physical therapy academic program directors, 516 ACCEs, and their respective clinical education sites and clinical educators throughout the United States and Canada. Task force members also presented the third draft versions of the CPIs at several national forums, which allowed nearly 500 people to provide feedback. For academic faculty and clinical educators who wanted to discuss the instruments at the local or regional level, the task force developed a questionnaire that was distributed with the instruments.
This questionnaire posed closed-ended questions designed to obtain information about respondent demographics, opinions about whether the performance criteria reflected entry-level expectations, whether the sample behaviors accurately described each performance criterion, the usefulness of the VAS in documenting student performance, the ability to document excellence in student performance beyond entry-level expectations, the ability to identify student problems in clinical performance, and the management of student performance when the "Significant Concerns/At-Risk" box is checked or when the performance criterion is red-flagged or non-red-flagged. In addition, several open-ended questions were posed to: (1) request that respondents identify criteria that need to be revised, added, or deleted, (2) identify major advantages and disadvantages of the instruments, (3) identify questions or concerns, and (4) solicit any comments that would assist in refinement of the instruments. People reviewing the instruments were encouraged to complete and submit the questionnaire for use in development of the fourth (final) versions of the CPIs.

Field Studies
Field studies were conducted between October 1996 and July 1997 to assess the psychometric properties of the third versions of the instruments. The field studies examined: (1) internal consistency, (2) interrater reliability, and (3) construct validity of the PT and PTA CPIs. The University of Miami Medical Subcommittee for the Protection of Human Subjects reviewed the field study protocol. Again, because of the subject anonymity guaranteed by the design, this protocol was exempted from informed consent.

Procedure
Instruments. In the field study, as in the pilot study, the research was designed to test the internal consistency, interrater reliability, construct validity, and student, CI, and ACCE satisfaction with the third drafts of the PT and PTA CPIs.
The field study also examined the discriminant validity of the CPIs by examining the relationship between CPI item scores and social competence as measured by the Social Skills Inventory (SSI).28 The SSI was administered to a subset of 50 PT students and 50 PTA students to determine whether the CPI was measuring social skills rather than entry-level clinical perfor-