Running head: CPPS REVIEW 1


Please use the following citation when referencing this work: McGill, R. J. (2013). Test review: Children's Psychological Processing Scale (CPPS). Journal of Psychoeducational Assessment, 31, 423-427. doi:10.1177/0734282912463513

Test Review: Children's Psychological Processing Scale (CPPS)

Ryan J. McGill
Chapman University

Test Citation: Dehn, M. J. (2012). Children's Psychological Processing Scale (CPPS). Stoddard, WI: Schoolhouse Educational Services.

Word Count: 2,019 (including title and references)

Author Note
Ryan J. McGill, College of Educational Studies, Chapman University. The author declares no conflict of interest and has not received any funding for the research and publication of this review. Correspondence concerning this article should be addressed to Ryan J. McGill, College of Educational Studies, Chapman University, Orange, CA 92603. E-mail: mcgil104@mail.chapman.edu

Test Review: Children's Psychological Processing Scale (CPPS)

Test Description

General Description

The Children's Psychological Processing Scale (CPPS), authored by Milton J. Dehn and published by Schoolhouse Educational Services in 2012, is a third-party rating scale that can be administered to teachers who are familiar with children ages 5-12. The measure is designed to identify psychological processing deficits in children who are referred for a learning disability evaluation. It can be administered in clinical and research settings as a screening measure to determine whether there is a need for more comprehensive cognitive testing, as a progress-monitoring tool for evaluating the impact of psychoeducational interventions, and as part of a comprehensive assessment battery. Although the CPPS is not based upon a specific theory of cognitive processing, the author makes several important references to the Cattell-Horn-Carroll theory of cognitive abilities (CHC; Carroll, 1993). Additionally, all of the broad abilities proposed within the CHC model are represented in the CPPS.

The CPPS is a web-based rating scale that is administered as a teacher rating form. Teachers are asked to complete 121 items that rate the frequency of academic-related behaviors. The CPPS should only be completed by a teacher who has known the student for a minimum of 6 weeks and who can provide accurate ratings of the student's daily functioning. In situations where a student is instructed by more than one teacher, users are encouraged to administer a separate CPPS form to each teacher in order to evaluate consistency across settings and raters. Individual items are evaluated according to a 4-point checklist-type scale, with possible ratings of Never, Sometimes, Often, and Almost Always.

Most classroom teachers will be able to complete the CPPS in approximately 12-15 minutes. The CPPS should only be administered by competent professionals who have the necessary training and experience to administer and interpret advanced measures of achievement and cognition.

Specific Description

Construct subscales. The CPPS reports 11 construct subscales that can be used to assess individual strengths and weaknesses in specific areas of cognitive processing.

Attention: The self-inhibitory abilities that allow one to control, sustain, divide, and focus attention.

Auditory Processing: The ability to perceive, analyze, synthesize, and discriminate auditory stimuli.

Executive Functions: An array of mental processes responsible for regulating cognitive functions.

Fine-Motor: The coordination of small muscle movements that occur in the fingers.

Fluid Reasoning: The ability to reason deductively, inductively, and quantitatively, especially when solving novel problems.

Long-Term Recall: Delayed recall of new learning and the long-term memory processes of encoding, consolidation, storage, and fluent retrieval.

Oral Language: Expressive language and the development of vocabulary, grammar, and functional communication.

Phonological Processing: The manipulation of phonemes, the smallest units of speech that are used to form syllables and words.

Processing Speed: How quickly information is processed and how efficiently simple cognitive tasks are executed over a sustained period of time.

Visual-Spatial Processing: The ability to perceive, analyze, synthesize, manipulate, and transform visual patterns and images, including those generated internally.

Working Memory: The limited capacity to retain information while simultaneously processing the same or other information for a short period of time.

Composite score. The 11 construct subscales combine to yield a differentially weighted composite, which is referred to as the General Processing Ability (GPA) scale. The GPA is defined as an estimate of overall processing ability and cognitive fluency.

Test Materials

In order to use the CPPS, users must register and purchase an individual license online at www.psychprocesses.com. Once approved, they are provided with access to a password-encrypted portal within the website. The portal provides direct access to a digital version of the test manual and allows rating forms to be sent to teachers via e-mail. Users also have the option of submitting a traditional hardcopy of the rating form if needed. The introductory license package includes 50 online CPPS administrations, and users have the option to purchase additional administrations and forms in packages of 25 or 100 online.

Individual items are easy to understand and ask raters to assess only observable academic skills. The consistent use of negatively phrased items prevents confusion and provides for better discrimination between children who present with processing deficits and those who do not. The test manual is well written, clear, and organized into sections that cover administration, interpretation, standardization, and technical properties. Case examples are also provided to help users with clinical use and intervention planning.
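The GPA described above is a differentially weighted combination of the subscale scores. The sketch below shows, at a minimal level, how such a composite is formed; the subscale selection, T-scores, and weights are hypothetical illustrations only, not the actual CPPS weighting scheme, which is implemented in the publisher's online scoring system.

```python
def weighted_composite(scores, weights):
    """Combine subscale scores into a single weighted composite score.

    scores  -- dict mapping subscale name to its T-score
    weights -- dict mapping subscale name to its weight (hypothetical
               values here; the real CPPS weights are proprietary)
    """
    total_weight = sum(weights.values())
    return sum(scores[name] * weights[name] for name in scores) / total_weight

# Hypothetical illustration using three of the eleven subscales:
scores = {"Attention": 62, "Working Memory": 58, "Processing Speed": 55}
weights = {"Attention": 1.5, "Working Memory": 1.0, "Processing Speed": 1.0}
print(round(weighted_composite(scores, weights), 1))  # -> 58.9
```

In a differentially weighted composite of this kind, subscales with larger weights (here, Attention) pull the composite toward their own scores more strongly than equally weighted subscales would.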

Scoring System

In order to score the CPPS, rating form information has to be entered online; manual scoring options are not available. If utilizing the e-mail option, score reports can be generated by the user within the portal as soon as each rating form is completed. If using the print-copy option, users have to enter individual item ratings directly into the portal before a score report can be generated. Score reports provide conversions of raw scores to T-scores, confidence intervals, and percentile ranks for each scale.

Interpretation

There are several steps to interpreting the CPPS. First, each subscale can be interpreted in isolation, with higher T-scores (60 and above) indicating potential deficiencies in that specific area of processing. Higher scores on the GPA are reflective of more global deficits across multiple processing areas. Scores falling below elevated ranges are interpreted as average or within the expected range of functioning. Finally, subscales can also be combined into one of two secondary factors for qualitative analysis of performance. These groupings include the Self-Regulatory Processes factor (a combination of the Attention and Executive Processes subscales) and the Visual-Motor Processes factor (the Fine-Motor and Visual-Spatial Processing subscales).

Although primarily designed to be interpreted using T-scores, users are provided with several additional standardized score options to suit their needs. Each score report also contains W-scores and traditional standard score (M = 100, SD = 15) conversions. W-scores are based on item response theory and are useful for measuring change (e.g., response to intervention).

Technical Adequacy

Test Construction and Item Analysis

An item pool was developed for a pilot version of the CPPS using task analysis of construct definitions, a review of the scientific processing literature, an examination of items from similar measures, and input from an expert panel of school neuropsychologists. This process resulted in the creation of an initial set of 75 items, which were divided into 10 subscales. The pilot version was then administered to the teachers of 111 subjects across 3 states. As a result of the pilot data, revisions were made in order to extend construct representation and performance ranges for each of the subscales, culminating in a 147-item, 10-subscale version that was administered to the teachers of 96 subjects. Subsequent analysis resulted in the dropping of 24 items and the addition of an 11th scale. The final norming edition consisted of 138 total items.

Normative Sample

The normative sample consisted of 1,121 subjects, ages 5-12, from 128 different communities in 30 states. The sample was constructed to be nationally representative using census data from 2008-2009 and stratified according to demographic variables such as age, gender, grade, region, race, and parent education level. A weighting procedure was utilized to minimize differences between the normative sample and expected frequencies across several demographic variables. Age brackets were adequately represented, with the exception of age 12 (n = 88).

Reliability

Internal consistency was evaluated using Cronbach's alpha for the subscales and Mosier's formula for weighted composites (Mosier, 1943) for the GPA. Average Cronbach's alpha coefficients for the subscales are strong, ranging from .90 to .97. The alpha coefficient for the GPA was also strong, with an average of .99 across age groups. Inter-rater reliability was estimated utilizing a sample of 22 subjects and 7 teachers. Each student was rated by a minimum of two different teachers. A Pearson product-moment correlation (r) was then calculated between the W-scores obtained from the raters in each case. The obtained median coefficient of agreement between raters (r = .77) was also strong. Test-retest reliability was not assessed.

Validity

Content validity. Content validity was estimated by having two nationally recognized experts in school neuropsychology sort items on the norming edition. Expert agreement with final item placement on the CPPS ranged from 67% to 75%. The manual notes that disagreement most often occurred on items that were closely related (i.e., auditory processing and oral language).

Developmental evidence. An important consideration with assessments of cognitive abilities is the degree to which they accurately model expected increases and decreases in skills over time. W-score plots demonstrated the expected distribution of ability levels for most cognitive domains.

Construct validity. Principal components analysis revealed the presence of a strong general factor, which accounted for an average of 79% of total variance across age groups. All subscales demonstrated an average loading of .71 or better on the general factor. Follow-up exploratory factor analysis inconsistently revealed the presence of 1-2 additional secondary factors across age groups, raising questions about the proposed structure of the CPPS.

Relationships with external measures were evaluated by examining correlations between the CPPS and established measures of cognitive abilities and achievement. Correlations between the GPA and individual clusters of the WJ-COG ranged from -.04 to -.74. The GPA also demonstrated statistically significant relationships with several achievement measures on the WJ-ACH. In another study, the Executive Processes scale demonstrated a strong correlation (r = .86) with the GEC composite of the BRIEF. All correlations with external measures demonstrated theoretically expected directionality.

To establish discriminant validity, Dehn examined the ability of the CPPS to differentiate between students with and without LD. Analysis found that the LD group demonstrated significantly elevated scores on all of the CPPS scales when compared to the control group.

Commentary and Recommendations

The technical documentation and delivery package of the CPPS is quite impressive for an assessment measure at its price point. Its coverage of the full spectrum of processing abilities through rating-scale technology is a welcome addition to the cognitive assessment field. The author is also transparent in his appraisal of the strengths and weaknesses of the instrument.

Significant departures on several demographic variables raise questions about the representativeness of the normative sample. In some cases these differences were as much as 16%. Although the use of a weighting procedure can correct for statistical shortcomings, it does little to improve the underlying weaknesses in the sample itself.

The results of the construct validity studies also raise questions about the structure and interpretation of the measure. The CPPS is principally organized and interpreted according to a hierarchical framework with a single second-order summary factor of overall cognitive processing. Yet exploratory factor analysis inconsistently revealed the presence of 1-2 additional mid-range factors that accounted for a significant amount of additional variance, suggesting that the proposed model of interpretation may not be invariant across age groups.
The author speculates about the nature of these secondary factors and provides users with potential interpretive strategies for them; however, these factors were not incorporated into the formal structure of the CPPS and its scoring system. Furthermore, weak statistical relationships between the Processing Speed and Long-Term Recall subscales and their counterparts on the WJ-COG were also observed. It is worth noting that such observations are not uncommon in validity studies of cognitive tests and are not unexpected given the variation in the measurement technologies being compared.

Despite these shortcomings, the CPPS has potential utility as a screening tool for more selective testing, as an adjunct measure within a multi-method psychoeducational assessment composed of more established measures, and as a tool for conducting empirical research on cognitive abilities. When compared to existing processing checklists such as the PPC-R (Swerdlick, Swerdlick, Kahn, & Thomas, 2008), the CPPS offers an expanded age range and coverage of more processing domains at the same price point. The author accomplishes his primary goal of providing practitioners with a summary measure of processing strengths and weaknesses in a format that is efficient and cost-effective.
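As a closing technical note, the Mosier (1943) procedure cited in the Reliability section estimates the reliability of a weighted composite from the component weights, standard deviations, reliabilities, and intercorrelations. A minimal sketch follows; all numeric values are hypothetical illustrations and are not CPPS parameters.

```python
def mosier_composite_reliability(weights, sds, reliabilities, correlations):
    """Reliability of a weighted composite (Mosier, 1943).

    weights        -- component weights w_i
    sds            -- component standard deviations sigma_i
    reliabilities  -- component reliability estimates r_i (e.g., alphas)
    correlations   -- full inter-component correlation matrix rho_ij
    """
    k = len(weights)
    # Composite variance: sum over i, j of w_i * w_j * sigma_i * sigma_j * rho_ij
    var_c = sum(
        weights[i] * weights[j] * sds[i] * sds[j] * correlations[i][j]
        for i in range(k)
        for j in range(k)
    )
    # Error variance carried into the composite: w_i^2 * sigma_i^2 * (1 - r_i)
    var_e = sum(
        weights[i] ** 2 * sds[i] ** 2 * (1 - reliabilities[i]) for i in range(k)
    )
    return 1 - var_e / var_c

# Hypothetical three-component illustration (not CPPS values):
w = [1.0, 1.0, 2.0]
sd = [10.0, 10.0, 10.0]
rel = [0.90, 0.92, 0.95]
rho = [[1.0, 0.6, 0.7],
       [0.6, 1.0, 0.65],
       [0.7, 0.65, 1.0]]
print(round(mosier_composite_reliability(w, sd, rel, rho), 2))  # -> 0.97
```

Because error variance accumulates more slowly than true-score variance when positively correlated components are summed, the composite reliability exceeds that of any single component; this is consistent with the pattern reported above (subscale alphas of .90 to .97 against a GPA alpha of .99).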

References

Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. New York: Cambridge University Press.

Mosier, C. I. (1943). On the reliability of a weighted composite. Psychometrika, 8, 161-168. doi:10.1007/BF02288700

Swerdlick, M. E., Swerdlick, P., Kahn, J. H., & Thomas, T. (2008). Psychological Processing Checklist-Revised. North Tonawanda, NY: Multi-Health Systems.