Mini-PAT in Psychiatry
Dr V M Aziz, Dr Natalie Thomas

Practice Points:

- Learning that occurs in the context of the daily workplace is more likely to be relevant and reinforced, leading to better practice.
- Mini-PAT aims to increase productivity, bring about positive behaviour change and provide trainees with an opportunity to reflect on their conduct and attitudes.
- Mini-PAT highlights the areas that need to be worked on, through producing an agreed action development plan.
- Credible, systematic feedback from the assessors helps trainees steer their learning towards desired outcomes.
- WPBA, including the mini-PAT, should be part of the educational agreement that aims to help trainees identify their personal goals and how they will achieve them.

Assessment is an attempt to understand a person through obtaining and interpreting information about the knowledge, understanding, abilities and attitudes of that person (Rowntree, 1994). Assessments serve multiple purposes, including determining whether set learning objectives have been met, assessing competency, evaluating teaching programmes and predicting future performance (Amin and Khoo, 2003). Assessment methods include written examination questions, portfolio-based assessment, the Clinical Assessment of Skills and Competencies (CASC) and multisource assessments. Multiple observations and several different assessments used over time can partially compensate for flaws in any one method (Wass et al., 2004).

Workplace-based assessments (WPBA) were proposed by the Postgraduate Medical Education and Training Board (PMETB) as an overarching assessment strategy. Davis and colleagues (1995) reported that learning that occurs in the context of the daily workplace is more likely to be relevant and reinforced, leading to better practice. Wass and colleagues (2004) argued that for assessment to drive learning it should be educational and formative; however, with the increasing focus on the performance of doctors and the public demand for assurance that doctors are competent, assessment also needs a summative function. WPBA is the assessment of how doctors practise in their working environment (PMETB, 2005). Its advantages include the fact that it is trainee-led, is formative, and is able to evaluate performance in context (Miller, 1990). WPBAs include (GMC, 2010): observation of clinical activities, such as the mini-clinical evaluation exercise and direct observation of procedural skills; discussion of clinical cases, such as the case-based discussion; and multisource feedback from peers, co-workers and patients, such as the mini-PAT and the patient satisfaction questionnaire.

The Royal College of Psychiatrists (the College) introduced WPBA in 2007, with a focus on trainees' personal development, learning needs, appraisal and assessment as integral to the process of educational supervision (Everett Julyan, 2009). WPBAs motivate trainees to meet regularly with their supervisors to focus on knowledge, skills and attitudes (PMETB, 2008) and can reliably and consistently improve and broaden psychiatrists' skills (Brown and Doshi, 2006). The reliability, validity and acceptability of WPBAs have been scrutinised in several studies (Norcini and Burch, 2007; Norcini, 2010). In a feasibility study and generalisability analysis based on the application of three WPBAs, Wilkinson and colleagues (2008) concluded that WPBAs are feasible to conduct and can make reliable distinctions between doctors' performances. The Academy of Medical Royal Colleges (2010), reporting the outcome of a survey of trainees and trainers, found that WPBA is fit for purpose, especially when feedback with reflection is used as the primary source, followed by scoring. However, Wilkinson and colleagues (2008) suggested that WPBA is not sufficiently reliable to stand alone and that, for it to be valid and useful, trainees and assessors need to understand and value their role in the educational process.

Mini-PAT is a form of multisource feedback (MSF) that involves the assessment of aspects of trainees' competence and behaviour from multiple viewpoints (Fitch et al., 2008). It was derived from the Sheffield Peer Review Assessment Tool (SPRAT) (Archer et al., 2005), in which trainees seek feedback about their work performance from a variety of healthcare professionals, highlighting areas of strength and those in need of improvement (King, 2002; Lipner et al., 2002). It can also incorporate self-assessment, where trainees rate their own competence and behaviour for comparison with assessments from other sources. The aims of the mini-PAT are to increase productivity and bring about positive behaviour change, while its intended outcomes are to offer trainees insight into how others perceive their performance in the workplace and to provide an opportunity to reflect on their conduct and attitudes (Abdulla, 2008). It is useful in assessing communication, reliability, team working and teaching skills (Hays et al., 2002). It can also help trainees develop a holistic approach to patient care by taking part in multi-professional ward reviews and care programme approach (CPA) meetings (Bhugra & Holsgrove, 2005), and it supports inter-professional team development (McLellan et al., 2005).

Because psychiatry differs from other medical specialities, with more emphasis on communication, interpersonal skills and relationship building, the College revised the MSF to reflect these differences (Fitch et al., 2008). The College uses the mini-PAT to assess the trainee's competencies as a communicator with patients, families and colleagues; a collaborator with other professionals and agencies; a manager who organises sustainable practices and contributes to the effectiveness of the service; a scholar who contributes to the dissemination and application of medical knowledge; and a professional who demonstrates best practice, high ethical standards and exemplary personal behaviour.

Trainees should complete one mini-PAT every six months, involving eight to twelve raters. Different studies indicate that between eight and twelve raters can generate generalisability coefficients between 0.7 and 0.81 (Lockyer and Violato, 2004; Ramsey et al., 1996).
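The way reliability improves as raters are added can be sketched with the Spearman-Brown prophecy formula. This formula is an illustrative assumption on our part rather than the method used in the cited studies (which relied on generalisability analysis), and the single-rater reliability used below is hypothetical:

\[ R_k = \frac{k\,r}{1 + (k-1)\,r} \]

where r is the reliability of a single rater's judgement and R_k is the reliability of the mean of k raters. Taking a purely hypothetical single-rater reliability of r = 0.25 gives R_8 = 2.00/2.75 ≈ 0.73 and R_12 = 3.00/3.75 = 0.80, broadly consistent with the generalisability coefficients of 0.7-0.81 quoted above for eight to twelve raters.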

Van der Vleuten (1996) described five criteria for determining the usefulness of a particular method of assessment: reliability (the degree to which the measurement is accurate and reproducible), validity (whether the assessment measures what it claims to measure), impact on future learning and practice, acceptability to learners and faculty, and cost (to the individual trainee, the institution and society at large). Ramsey et al. (1993) found that it is feasible and valid to obtain assessments from professional colleagues in the areas of clinical practice, humanistic qualities and communication skills, and Whitehouse and colleagues (2002) found the approach practical and acceptable to senior house officers in hospital settings. The reliability of multisource feedback and peer rating across different settings has been validated in other specialties, including surgery (Violato et al., 2003), obstetrics and gynaecology (Joshi et al., 2004) and intensive care (Johnson and Cujec, 1998). Abdulla (2008) quoted high content validity and reliability (correlation coefficients in the range of 0.7-0.8 for tools assessing communication skills, working relationships, team development and stress-coping strategies). Archer and colleagues (2008) concluded that the mini-PAT appears to provide a valid way of collating colleague opinions to help reliably assess Foundation trainees. Violato and colleagues (2008) showed evidence of construct validity (that the tool measures the construct it claims to measure) and of stability over time. As the questionnaire was mapped directly to the standards of the GMC's Good Medical Practice, content validity had been established (Archer et al., 2005). Nevertheless, it is important to note that each study developed its own assessment tool and that the number of raters providing feedback varied widely across the studies.

Allowing trainees to select their own raters does not bias assessment (Durning et al., 2002; Violato et al., 1997), and Ramsey and colleagues (1993) demonstrated that the relationship between the person being rated and the rater did not substantially affect the results. However, the profession and seniority of assessors have been found to influence assessment significantly (Archer et al., 2006), and Johnson and Cujec (1998) were concerned that doctors rate their colleagues more favourably than nurses do, which could affect inter-rater reliability. This selection bias may be reduced if the list of raters is discussed and agreed by both supervisor and trainee beforehand; education about the process may enhance the credibility of the tool (Bracken et al., 2001) and reduce errors of measurement such as the halo effect and central tendency (Norcini, 2003; Johnson and Cujec, 1998). Also, self-assessments do not correlate with peer ratings, and differences in ratings have been found between peers with differing levels of experience (Hall et al., 1999; Thomas et al., 1999). This disagreement can be seen as reflecting different aspects of performance viewed from the position of each rater (Bozeman, 1997). Consequently, the College has constrained psychiatry specialty registrars' discretion over the selection of assessors for the mini-PAT, requiring them to nominate their assessors from a broad range of co-workers, not just medical staff.

On the other hand, Schuwirth and Van der Vleuten (2006) cautioned against over-reliance on these instruments, as they are mainly summative rather than formative tools. Concern has also been expressed about the danger of compromising the construct validity of assessment processes for the sake of increasing reliability (ten Cate, 2006). It is therefore important to keep these principles in mind so that the mini-PAT does not become a mere competence checklist.

Mini-PAT plays an important role in helping trainees appreciate how their performance is perceived by a range of stakeholders, in addressing weaknesses in their competence and in bringing about changes in practice, particularly when clinical competence is being assessed (Violato et al., 1997; Lipner et al., 2002; Higgins et al., 2004). Considerable emphasis has therefore been placed on the process and quality of the feedback given by the educator (Sargeant et al., 2005). Moreover, credible, systematic feedback from the assessors helps trainees steer their learning towards desired outcomes (Norcini and Burch, 2007; Veloski et al., 2006). Overeem and colleagues (2009) concluded that skilled facilitators should be available to encourage reflection and that consultant groups should be aware of the existing lack of openness and absence of constructive feedback. Miller and Archer (2010) examined the impact of workplace-based assessment on doctors' education and performance and concluded that performance changes were more likely to occur when feedback was credible and accurate, or when coaching was provided to help subjects identify their strengths and weaknesses. In their view, individual factors, the context of the feedback and the presence of facilitation have a profound effect on performance improvement. Higgins and colleagues (2004) showed that highly structured feedback is important if feedback from non-clinical sources is to be appreciated. Atwater and colleagues (2000) demonstrated a 50% improvement in the performance of supervisors who received 360-degree feedback. However, other studies have shown that individuals receiving negative feedback may become discouraged, with a consequent decrease in their performance (Kluger and DeNisi, 1996; Brett and Atwater, 2001). These authors suggested that this may relate to the way feedback is provided and how the process is implemented, rather than to a flaw in the concept itself. The educational supervisor should therefore be appropriately trained for the task, adopt a constructive approach when discussing the results of the report, and allow learner interaction in putting together an action plan for development (Sargeant et al., 2007). In our view, the tool becomes irrelevant when fewer than half of the raters complete the feedback because they do not want to mark the trainee negatively.

Finally, feedback may not lead to changes in practice (Lockyer et al., 2007), and the perceived effectiveness of the tools has been found to be low (Burford et al., 2010). The use of shorter instruments can improve assessment and counter the view that MSF involves too much paperwork (Lockyer et al., 2006). Also, as the UK is a multi-ethnic country, an additional difficulty is finding a way in which non-English speakers can be included.

One method of achieving this has been to conduct interviews with patients using interpreters (Mason et al., 2003), but the cost needs to be considered. It may also be worthwhile linking patient satisfaction measures to the mini-PAT, to provide validation evidence that the mini-PAT is a useful tool for improving patient care. WPBA should be part of the educational agreement that aims to help trainees identify their personal goals and how they will achieve them. We recommend that the College review its e-portfolio to modify the minimum number required for the mini-PAT. Finally, the mini-PAT lacks sufficient field evaluation; a large study with more raters and a bigger sample size is needed to validate it and to reduce measurement error.

In conclusion, WPBA tools provide structured, standardised methods of monitoring doctors' progress against a broad range of core clinical and non-clinical competencies (Brown & Bhugra, 2005). Mini-PAT gives trainees an overview of how others see them at work, offers an opportunity to compare self-perception with peer perception, and allows comparison with the average peer group. It also highlights those areas which need to be worked on, through producing an agreed action development plan following formal feedback.

Dr Victor Aziz, Consultant Psychiatrist, victoraziz@doctors.org.uk
Dr Natalie Thomas, ST4 Older Persons Mental Health, pm04njt@doctors.org.uk

References:

Abdulla A. A critical analysis of mini peer assessment tool (mini-PAT). J R Soc Med 2008; 101: 22-26.

Academy of Medical Royal Colleges 2010. Workplace based assessment forum: outcome. http://www.aomrc.org.uk/publications/reports-guidance.html

Amin Z, Khoo HE. Overview of assessment and evaluation. In: Basics in Medical Education. World Scientific Publishing Company, Singapore 2003.

Archer J, et al. Mini-PAT (Peer Assessment Tool): a valid component of a national assessment programme in the UK. Advances in Health Sciences Education 2006, online content. http://springerlink.metapress.com/content/v8hth3h06507ph56/?p=a3e3005a289e47e386a81bb976abef58&pi=2

Archer J, Norcini J, Southgate L. Mini-PAT (Peer Assessment Tool): a valid component of a national assessment programme in the UK? Advances in Health Sciences Education 2008; 13(2): 181-192.

Archer JC, Norcini J, Davies HA. Use of SPRAT for peer review of paediatricians in training. BMJ 2005; 330: 1251-3.

Atwater LA, et al. An upward feedback field experiment: supervisors' cynicism, follow-up and commitment to subordinates. Personnel Psychology 2000; 53: 275-97.

Bhugra D, Holsgrove G. Patient-centred psychiatry. Training and assessment: the way forward? Psychiatric Bulletin 2005; 29: 49-52.

Bozeman D. Interrater agreement in multisource performance appraisal: a commentary. Journal of Organizational Behavior 1997; 18: 313-316.

Bracken DW, Timmreck CW, Church AH. The Handbook of Multisource Feedback: the comprehensive resource for designing and implementing MSF processes. Jossey-Bass, San Francisco, CA 2001.

Brett J, Atwater L. 360-degree feedback: accuracy, reactions and perceptions of usefulness. J Appl Psychol 2001; 86: 930-42.

Brown N, Bhugra D. Modernising medical careers, the Foundation Programme and psychiatry. Psychiatric Bulletin 2005; 29: 204-206.

Brown N, Doshi M. Assessing professional and clinical competence: the way forward. Advances in Psychiatric Treatment 2006; 12: 81-89.

Burford B, et al. User perceptions of multi-source feedback tools for junior doctors. Medical Education 2010; 44: 165-176.

Davis DA, et al. Changing physician performance: a systematic review of the effect of continuing medical education strategies. Journal of the American Medical Association 1995; 274: 700-705.

Durning SJ, et al. Assessing the reliability and validity of the mini-clinical evaluation exercise for internal medicine residency training. Academic Medicine 2002; 77: 900-904.

Everett Julyan T. Educational supervision and the impact of workplace-based assessments: a survey of psychiatry trainees and their supervisors. BMC Medical Education 2009; 9: 51.

Fitch C, et al. Assessing psychiatric competencies: what does the literature tell us about methods of workplace-based assessment? Advances in Psychiatric Treatment 2008; 14: 122-130.

General Medical Council 2010. Workplace Based Assessment: a guide for implementation. www.gmc-uk.org

Hall W, et al. Assessment of physician performance in Alberta: the Physician Achievement Review. CMAJ 1999; 161: 52-7.

Hays RB, et al. Selecting performance assessment methods for experienced physicians. Medical Education 2002; 36: 910-917.

Higgins RSD, et al. Implementing the ACGME general competencies in a cardiothoracic surgery residency program using 360-degree feedback. Annals of Thoracic Surgery 2004; 77: 12-17.

Johnson D, Cujec B. Comparison of self, nurse, and physician assessment of residents rotating through an intensive care unit. Crit Care Med 1998; 26: 1811-6.

Joshi R, Ling F, Jaeger J. Assessment of a 360-degree instrument to evaluate residents' competency in interpersonal and communication skills. Acad Med 2004; 79: 458-63.

King J. 360-degree appraisal. BMJ 2002; 324: S195-6.

Kluger AN, DeNisi A. The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol Bull 1996; 119: 254-84.

Lipner RS, et al. The value of patient and peer ratings in recertification. Academic Medicine 2002; 77: S64-S66.

Lockyer JM, Violato C. An examination of the appropriateness of using a common peer assessment instrument to assess physician skills across specialties. Academic Medicine 2004; 79: S5-S8.

Lockyer J, et al. A study of a multisource feedback system for international medical graduates holding defined licenses. Medical Education 2006; 40: 340-347.

Lockyer J, Violato C, Fidler H. Likelihood of change: a study assessing surgeon use of multisource feedback data. Teach Learn Med 2003; 15: 168-74.

Mason R, et al. Developing an effective system of 360-degree appraisal for consultants: results of a pilot study. Clinical Governance Bulletin 2003; 4: 11-12.

McLellan H, Bateman H, Bailey P. The place of 360-degree appraisal within a team approach to professional development. Journal of Interprofessional Care 2005; 19: 137-148.

Miller A, Archer J. Impact of workplace based assessment on doctors' education and performance: a systematic review. BMJ 2010; 341: c5064.

Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No 31. Med Teach 2007; 29: 855-71.

Norcini JJ. Workplace-based assessment in clinical training. In: Swanwick T (ed). Understanding Medical Education: Evidence, Theory and Practice. Wiley-Blackwell, UK 2010.

Overeem K, et al. Doctors' perceptions of why 360-degree feedback does not work: a qualitative study. Medical Education 2009; 43(9): 874-882.

Postgraduate Medical Education and Training Board 2005. Workplace Based Assessment Subcommittee. Workplace based assessment.

Postgraduate Medical Education and Training Board 2008. Standards for curricula and assessment systems.

Ramsey PG, et al. Use of peer ratings to evaluate physician performance. JAMA 1993; 269(13): 1655-60.

Ramsey PG, et al. Feasibility of hospital-based use of peer ratings to evaluate the performance of practising physicians. Acad Med 1996; 71: 364-70.

Rowntree D. Assessing Students: How shall we know them? 2nd ed. Kogan Page, London 1994.

Sargeant J, et al. Challenges in multisource feedback: intended and unintended outcomes. Med Educ 2007; 41: 583-91.

Sargeant J, Mann K, Ferrier S. Exploring family physicians' reactions to multisource feedback: perceptions of credibility and usefulness. Med Educ 2005; 39: 497-504.

Schuwirth LWT, van der Vleuten CPM. A plea for new psychometric models in educational assessment. Medical Education 2006; 40: 296-300.

ten Cate O. Trust, competence, and the supervisor's role in postgraduate training. BMJ 2006; 333: 748-751.

Thomas PA, et al. A pilot study of peer review in residency training. Journal of General Internal Medicine 1999; 14: 551-554.

Van der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ 1996; 1: 41-67.

Veloski J, et al. Systematic review of the literature on assessment, feedback and physicians' clinical performance: BEME Guide No 7. Med Teach 2006; 28: 117-28.

Violato C, et al. Feasibility and psychometric properties of using peers, consulting physicians, co-workers, and patients to assess physicians. Academic Medicine 1997; 72: S82-S84.

Violato C, Lockyer J, Fidler H. Multisource feedback: a method of assessing surgical practice. BMJ 2003; 326: 546-8.

Violato C, Lockyer JM, Fidler H. Changes in performance: a 5-year longitudinal study of participants in a multi-source feedback programme. Medical Education 2008; 42(10): 1007-1013.

Wass V, et al. Assessment of clinical competence. Lancet 2004; 357: 945-9.

Whitehouse A, Walzman M, Wall D. Pilot study of 360-degree appraisal of personal skills to inform record of in-training assessments for senior house officers. Hospital Medicine 2002; 63: 172-175.

Wilkinson JR, et al. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Medical Education 2008; 42: 364-373.