
FINAL REPORT AND RECOMMENDATIONS
CSv2 RELIABILITY STUDY
September 2012

Report Authors: CSv2 Field Study Team
Editors: Jerri Linn Phillips, MA, CTR; Patricia Murison, CHIM, RHIT

TABLE OF CONTENTS

Background
Study Objective
Introduction
Preliminary Steps
Study Participants
Data Items
Reconciliation
Major, Minor and Unknown to Known Errors
Analysis
Conclusion
Recommendations

Protocol, Appendices and Tables (located on the CSv2 website):
Appendix A: Timeline
Appendix B: Invitation to Participate
Appendix C: Users Guide
Appendix D: Demographics
Appendix E: Site Specific Factors by Site
Appendix F: Functional Requirements Document
Appendix G: Post Participant Questions
Appendix H: Call for Cases
Appendix I: Case Assignments
Appendix J: Team Members
Appendix K: Case Review Worksheets
Appendix L: Team Responsibilities and Detailed Timeline
Appendix M: Request for CEUs
Appendix N: Data Analysis Plan
Appendix O: Frequency Reports
Appendix P: Error Definitions

BACKGROUND

The Collaborative Staging (CS) Task Force was formed in 1998 to address discrepancies in staging guidelines among the three major cancer staging systems used in the United States. The project was sponsored by the American Joint Committee on Cancer (AJCC) in collaboration with the American College of Surgeons Commission on Cancer (CoC); the Canadian Council of Cancer Registries (CCCR); the Centers for Disease Control and Prevention National Program of Cancer Registries (CDC/NPCR); the National Cancer Institute's Surveillance, Epidemiology and End Results Program (NCI SEER); the National Cancer Registrars Association (NCRA); and the North American Association of Central Cancer Registries (NAACCR). The CS Task Force's assignment was to develop a coding system bridging the AJCC TNM staging system, SEER Extent of Disease (EOD) and the SEER Summary Staging system that would eliminate duplicate data collection by registrars.

Collaborative Stage version 1 was released August 11, 2004 and was effective for all new cancers diagnosed January 1, 2004 and after. The system comprised 25 data items that described the extent of disease at the time of diagnosis as well as site-specific factors (prognostic indicators). Most of these data items are similar to the Extent of Disease (EOD) data items that SEER registrars had collected for many years. New data items were added: evaluation fields that described the basis and sources for the collected data (i.e., clinical vs. surgical) and site-specific factors. These fields were added to assist in deriving the final stage grouping and to support the computer algorithms that classify each case in multiple staging systems.

The CSv2 Work Group was formed to make changes to the Collaborative Stage System based on new staging criteria from the AJCC Cancer Staging Manual, 7th Edition. This included the addition of prognostic indicators and tumor markers that were felt to be clinically significant. Major changes included several new schemas and revisions to most existing schemas, including the addition of up to 19 new site-specific factors. Changes to the schemas included new codes, changes to the hierarchy of codes, note revisions, and the rewriting of Part I, including an expanded section on the Site Specific Factors (SSFs).

In 2010, the CSv2 Field Study Team developed coding practice studies to evaluate cancer registrars' ability to code the CSv2 items consistently and to evaluate inconsistencies so that improvements could be made to codes, coding structures, notes, and instructions. The primary sites targeted were bladder, breast, colon, corpus, esophagus GE junction, GIST stomach, kidney parenchyma, lung, melanoma skin, Merkel cell, NET colon, thyroid, testis and tonsil. As of September 2012, the coding practice studies have been completed for all schemas.

Due to the major changes made in CS for the AJCC 7th edition, the CSv2 Field Study Group's goal was to have 2,000 registrars participate in this study; there were a total of 1,040 participants. Reaching this number of participants required the cooperation of all the organizations involved in cancer registration, including, but not limited to:

American Joint Committee on Cancer (AJCC)/Commission on Cancer (CoC)
Canadian Partnership Against Cancer (CPAC)
Canadian Council of Cancer Registries (CCCR)

Centers for Disease Control (CDC) National Program of Cancer Registries (NPCR)
National Cancer Institute (NCI) Surveillance, Epidemiology, and End Results (SEER)
National Cancer Registrars Association (NCRA)
North American Association of Central Cancer Registries (NAACCR)

The purpose of involving all these organizations was to encourage the registrars associated with them to participate. The high number of participating registrars in 2011 provides a greater opportunity to assess how the registry community is handling CSv2.

STUDY OBJECTIVE

The 2011 CS Reliability Study was conducted to:
1. Assess the accuracy and consistency with which registrars are able to use CSv2.
2. Coordinate efforts among the organizations to use this information to refine CSv2 rules, documentation and training for all organizations.
3. Identify clarifications and changes for future versions of CSv2.
4. Identify future quality improvement opportunities.

INTRODUCTION

A reliability study is a quality improvement process designed to assess the proficiency of central and hospital cancer registrars and to measure consistency in the use of codes and coding rules across a program. In a reliability study, all participants code information under similar conditions from the same medical reports using the same references. A quality improvement process uses reliability studies to understand variability in participants' coding compared to the expected or preferred answers (based on the expert panel's consensus answers). Unexpected answers in this study were reviewed using the rationale that participants entered as comments on the website and discussion with the expert panel. Rationales for all answers were provided.

The 2011 CSv2 Reliability Study is an assessment of two major factors:
1. The effect of the revision of rules and clarifications made as a result of the 2005 CS Study.
2. Whether registrars' comprehension and knowledge of CS staging has improved with two additional years of use.

Based on the findings of this study, recommendations will be made for changes and/or clarifications to the Collaborative Staging Manual Data Collection System Version 2. Priority topics for educational interventions will also be developed.

PRELIMINARY STEPS

In preparation for the CSv2 Reliability Study, the results of previous NCI SEER Re-evaluation Reliability Studies were reviewed to determine the areas that were identified as needing documentation changes or educational interventions. Criteria were developed for each cancer site based on problems found in previous reliability studies as well as on the new and complex schemas introduced in CSv2. Based on these criteria, a call for cases was issued using the CoC listserv.

The Site Group Teams (Appendix J) reviewed the cases. Each of the four Site Group Teams was assigned specific cancer sites. The Site Group Teams prepared the cases for transcription for the web and determined the preferred answer for each case. The forty-four cases were converted into a standardized Microsoft Word format by a transcriptionist who contracts with NCI SEER. Information Management Services Inc. (IMS), a biomedical computing contractor with NCI SEER, converted the cases to HTML format for placement on the study website. The preferred answers with rationales were posted by IMS to the study website.

STUDY PARTICIPANTS

Each participating organization was asked to provide a representative to coordinate participation from their members and to answer questions. Notices of the study were sent to the cancer registry community via the CoC Flash, CDC NPCR notices, the NCRA Connection, the NCI SEER listserv, and the NAACCR listserv. The contact for Canada sent notices to each of the provinces. The study was open for seven weeks to allow ample time for participation from the registry community. The result was the highest number of participants (1,040) of the three reliability studies (Table 2).

The majority of the U.S. participants were from CoC hospitals (66%), reflecting the fact that most of the coding of cancer data in the U.S. occurs in hospitals. The participants were well educated and experienced, with 88% having Certified Tumor Registrar (CTR) status and the majority having more than six years of experience with CS and more than one year with CSv2. Since Collaborative Stage was initiated in 2004 and CSv2 became effective for 2010 diagnoses, this was the maximum experience possible.

Most study registrants (81%) completed at least one of the cases that they had been assigned to code, as shown in Figure 1. Each participant received an initial assignment of 10 cases and, upon request, was assigned 10 additional cases, for a maximum of 20 cases. Twenty-eight percent of participants completed the full 20 cases, and an additional 50% of participants completed 10 cases or more. Just 3% of participants did not fully complete the initial 10 cases.

The normal work responsibilities of participants included case finding, abstracting and follow-up (Figure 2). Over 40% of participants had also been involved in data analysis, data editing, and other QC activities. Figure 3 presents the distribution of study participants by years of experience in the cancer registration field. Close to one-third of participants reported 15 years of experience or more, with another 46% reporting at least five years of experience. Just 12% of participants reported less than five years of cancer registry experience.

Table 1: Characteristics of 2011 Study Participants

                                   N        %
Total                              1,040
Type of Organization
  Canada                           42       4%
  CoC Hospitals
  Central Registries
  Other*                           94       9%
Credential
  CTR
  Non-CTR
Years of Experience, CS
  Less than 1 year                 62       6%
  2 to 6 years
  More than 6 years
Years of Experience, CSv2
  Less than 6 months               83       8%
  6 months to 1 year
  More than 1 year

* Non-CoC hospitals, contractors, registry services companies and national standard setters

FIGURE 1. DISTRIBUTION OF STUDY PARTICIPANTS BY THE NUMBER OF CASES COMPLETED

FIGURE 2. MOST COMMON WORK RESPONSIBILITIES OF STUDY PARTICIPANTS (top answers: abstracting, casefinding, follow-up, quality control, editing/case consolidation, and data analysis)

FIGURE 3. DISTRIBUTION OF PARTICIPANTS BY YEARS OF EXPERIENCE IN CANCER REGISTRATION (N=1,040): 0-4 yrs. 12%, 5-9 yrs. 38%, 10-14 yrs. 8%, 15+ yrs. 32%, Unknown 10%

DATA ITEMS

The data items in the study are a combination of long-standing core data items, such as tumor size and extension (Table 3), and newer fields, the site-specific factors (SSFs), which vary by schema. Examples of SSFs include Prostate Specific Antigen (PSA) for prostate cancer and Estrogen Receptor (ER) status for breast cancer; they are listed in Appendix E. The data items included in the study are those required by SEER and CoC for CSv2. Some SSFs were not required by the other organizations (mostly NPCR registries and non-CoC hospitals); however, all participants completed the same set of data items for each case. Participants had the option of completing non-required SSF questions for this study, and those results are not included in the final analysis. The number of SSFs varies by schema, with breast having 24 SSFs compared to only 2 for lung.

Table 2: Core Data and CS Data Items

Core Data: Primary site; Histology; Behavior code; Laterality; Age (Thyroid only); LVI (Testis only); Grade (Esophagus only)

CS Data Items: CS Tumor Size; CS Extension; CS TS/Ext Eval; CS Lymph Nodes; CS Nodes Eval; Reg LN Pos; Reg LN Exam; CS Mets at Dx; CS Mets at Dx Bone; CS Mets at Dx Brain; CS Mets at Dx Liver; CS Mets at Dx Lung; CS Mets Eval
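The split between schema-independent core items and schema-dependent SSFs is easiest to see in a concrete record layout. The sketch below is illustrative only; the field names, schema labels, and code values are assumptions made for this example, not the study's actual data structures.

```python
# Minimal sketch of how one coded case might be represented: core CS data
# items shared by every schema, plus a variable set of schema-specific SSFs.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class CodedCase:
    schema: str                                            # e.g. "Breast", "Lung"
    core: Dict[str, str] = field(default_factory=dict)     # CS Tumor Size, CS Extension, ...
    ssfs: Dict[int, str] = field(default_factory=dict)     # SSF number -> code

# A breast case carries many SSFs (up to 24); a lung case carries only a couple.
breast_case = CodedCase(
    schema="Breast",
    core={"CS Tumor Size": "015", "CS Extension": "100", "CS Mets at Dx": "00"},
    ssfs={1: "010", 2: "010", 16: "100"},   # illustrative SSF numbers and codes
)
lung_case = CodedCase(
    schema="Lung",
    core={"CS Tumor Size": "032", "CS Extension": "400", "CS Mets at Dx": "00"},
    ssfs={1: "000", 2: "000"},              # e.g. separate tumor nodules, pleural layer
)
```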

RECONCILIATION

Reconciliation provides an opportunity for registrars to provide their coding rationale, especially when they disagree with the preferred answer. This gives the study developers a chance to see where some of the problems within CS exist and provides educational opportunities to address those problems.

Due to the number of sites/schemas involved in this study, the traditional format for reconciliation was changed. In the past, SEER distributed the data items eligible for reconciliation and then set up conference calls with its registries and other interested parties to discuss the data items; changes were made based on these interactions. During the planning process for this study, it was determined that this approach to reconciliation was not feasible, and an alternate method was developed. The reconciliation process used for this study was an online web application open 24 hours a day for two weeks. This process gave more people the opportunity to participate. Of the 1,035 data items collected in the study, 439 were eligible for reconciliation. Table 3 shows the distribution of the eligible data items for each of the groups.

Table 3. Distribution of data items eligible for reconciliation, by group: total data items, core data items, SSFs, and percent of total.

Reconciliation was open for two weeks, and over 7,200 responses were entered into the online reconciliation by 284 registrars (approximately 27% of participants). The comments were grouped by the type of data item. The non-CS data items include: primary site, histology, laterality, age (thyroid only) and LVI (testis only). The CS core data items include: Tumor Size, Extension, TS/Ext Eval, Lymph Nodes, LN Eval, Reg Nodes Positive, Reg Nodes Examined, Mets at Dx and Mets Eval. The site-specific factors are listed in Appendix E. The optional SSFs, which are not required by any of the standard setters, are not included in the analysis.

Table 4. Types of Comments Received: comments received and percent of total comments, for non-CS data items, CS core data items, required SSFs, and optional SSFs.

After the responses were received, the four teams reviewed the comments from the registrars. The cases were reviewed once again together with the registrars' comments. Based on this review, 15 answers were changed, rationale was updated, and/or multiple answers were allowed. The experts identified at the beginning of the study were contacted and asked to help with many of these answers. Table 5 summarizes the answers changed based on reconciliation.

Table 5. Reconciliation Results (group, case, data item, and reason for change)

Group 1, Breast case 1, SSFs 12, 13, 14, 17, 18, 22: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 1, Breast case 2, SSFs 10, 11, 12, 13, 14, 17, 18, 22, 23: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 1, Corpus, Reg Nodes Examined: Preferred answer 16; final answers 16 and 18. Due to differences between the path report and the CAP report, both node counts accepted.
Group 1, Lung case 1, Reg Nodes Positive and Reg Nodes Examined: Further evaluation determined that these nodes were fine needle aspirations and did not qualify for nodes examined.
Group 1, Lung case 1, Mets at Dx Lung: Preferred answer 1; final answer 0. Pleural effusion in the contralateral lung had been coded as mets; according to Part I, Section 1, this should not be coded as lung mets.
Group 1, Merkel Cell Skin, SSF: No mention of matted or fixed lymph nodes, which is needed to code.
Group 1, NET Colon, SSF: No regional nodes seen on CT.
Group 1, NET Colon, SSFs 16 and following: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 1, Testis, SSF: Code 001 added due to minimal information.
Group 1, Testis, SSFs (two items): Need to code values after orchiectomy and before additional treatment.
Group 1, Thyroid, Age: Both formats accepted (final answers include 52).
Group 2, Breast case 1, SSFs 10, 11, 12, 13, 14, 17, 18, 22, 23: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 2, Breast case 2, SSFs 10, 11, 14, 17, 18, 22, 23: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 2, Esophagus EG Junction, CS Mets at Dx Liver: Preferred answer 0; final answer 1. Positive mets on peritoneal biopsy.
Group 2, Lung case 2, Extension: Aorta not involved, para-aorta is; code to mediastinal involvement.
Group 2, Lung case 2, SSF: Elastic layer not evaluated.
Group 2, Merkel Cell Skin, Extension: Final answers include 600. Due to minimal information, two codes accepted.
Group 2, Merkel Cell Skin, SSF: Cannot determine size based on a percentage given.
Group 2, NET Colon, SSFs 16 and following: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 2, NET Colon, Mets at Dx Bone: Preferred answer 00; final answers 00 and 99. No mention of bone mets.
Group 3, Colon case 1, SSFs 1, 3, 7, 9: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 3, Colon case 1, SSF: Negative CT scan; enough to code no clinical LN involvement.
Group 3, Colon case 2, SSFs 7, 9: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 3, GIST Stomach, SSFs 8 and following: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 3, Kidney, SSF: Adrenal gland involvement not mentioned; can code none.
Group 3, Ovary, Mets Eval.
Group 4, Colon case 1, SSFs 1, 3: No documentation. Preferred answer unknown (999); test not done (998) also accepted.

Group 4, Colon case 2, SSFs 1, 3, 9: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 4, Colon case 2, SSF: Only clinical workup is a chest x-ray; not enough to code negative clinical lymph nodes.
Group 4, GIST Stomach, SSFs 8 and following: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 4, Kidney, Lymph Nodes Eval: Preferred answer 1; final answer 0.
Group 4, Melanoma Skin, SSFs 4, 5: No documentation. Preferred answer unknown (999); test not done (998) also accepted.
Group 4, Ovary, Mets Eval: Preferred answer 1; final answer 0.

MAJOR, MINOR AND UNKNOWN TO KNOWN (Unk2k) ERRORS

There are three error classifications: major, minor, and unknown to known (see Appendix C: Definition of Errors). Major errors are those that affect data analysis (staging) or incidence; specifically, for Collaborative Staging items, major errors resulted in changes within Tumor, Nodes or Metastasis (TNM). Minor errors show a lack of specificity that would have no important effect on data analysis and no change within TNM. Unknown to known errors occur when the participant uses a code meaning unknown although more specific information is available in the chart, or when a default code should have been used. Unknown to known errors may or may not affect analysis; if they do affect analysis, they are counted as major errors.

The graphs below show the distribution of the major errors by CS core data items (Tumor Size, Extension, TS/Ext Eval, Lymph Nodes, Regional Nodes Positive, Regional Nodes Examined, Lymph Node Eval, Mets at Dx and Mets Eval) and stage-related SSFs. (For a site-by-site, item-by-item, case-by-case analysis of frequencies, please see Appendix E: Site, Data, Case response listing.)
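As a rough illustration of these three error classes, the sketch below restates the definitions above as a small function. The report does not publish its scoring code; the code values, helper names, and the affects_stage flag are assumptions made for this example.

```python
# Hedged sketch of the major / minor / unknown-to-known classification.
UNKNOWN_CODES = {"999", "99", "9"}   # assumption: schema-dependent "unknown" codes

def classify_answer(participant_code, preferred_code, affects_stage):
    """Return 'correct', 'major', 'minor', or 'unk2k' for one data item.

    affects_stage: True if the disagreement would change the derived
    T, N, or M component (and therefore the stage).
    """
    if participant_code == preferred_code:
        return "correct"
    if participant_code in UNKNOWN_CODES and preferred_code not in UNKNOWN_CODES:
        # Participant coded "unknown" although specific information was available.
        # If the unknown code also changes the derived stage, it counts as major.
        return "major" if affects_stage else "unk2k"
    return "major" if affects_stage else "minor"

print(classify_answer("999", "015", affects_stage=False))  # -> 'unk2k'
print(classify_answer("400", "100", affects_stage=True))   # -> 'major'
```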

Table 6. Accuracy Rate Goals for the 2011 Study (based on the percentage of cases without major errors): goal and actual accuracy for stage-related core items and stage-related SSFs, by schema (bladder, brain, breast, colon, corpus, EGJ, GIST stomach, lung, kidney, melanoma skin, Merkel cell skin, NET colon, pharyngeal tonsil, ovary, prostate, testis, thyroid). Highlighted cells met the accuracy goal.

Below is a distribution of the types of answers for each of the major groups of cases in the study. Brain is not included since it is not staged.

Figure 4: Distribution of Correct, Major, Minor and Unk2k answers for Breast, Colon, Lung and Prostate for CS Core Data Items

Figure 5: Distribution of Correct, Major, Minor and Unk2k answers for Corpus, GIST Stomach, Merkel Cell Skin, NET Colon and Testis for CS Core Data Items

Figure 6: Distribution of Correct, Major, Minor and Unk2k answers for Corpus, GIST Stomach, Merkel Cell Skin, NET Colon, Thyroid and Testis for CS Core Data Items

Figure 7: Distribution of Correct, Major, Minor and Unk2k answers for Bladder, Esophagus GE Junction, Kidney, Melanoma Skin, Ovary and Pharyngeal Tonsil for CS Core Data Items

Figure 8: Distribution of Correct, Major, Minor and Unk2k answers for Breast SSFs 9, 11, 13, 15 and 16 (the HER2/neu interpretation and summary SSFs). Major errors are defined by registrars using code 988 (not applicable). For these SSFs, two answers, 998 (test not done) and 999 (unknown), were accepted where the answer was not documented in the record. SSF 16 is the only data item in the entire study that had 100% agreement with the final answer.

Figure 9: Distribution of Correct, Major, Minor and Unk2k answers for Prostate SSFs 3, 7, 8, 9, 10, and 11 (pathologic extension and the Gleason SSFs). Major errors for SSFs 3, 8 and 10 are due to these factors being used for stage. Major errors for SSFs 7 and 9 are defined by registrars coding 988 (not applicable); since these are required SSFs, code 988 is not allowed.

For a complete listing of schemas and SSFs by type of answer (correct, major, minor and Unk2k), see Appendix E.

ANALYSIS OF FACTORS AFFECTING THE ACCURACY RATE

As shown in the Major, Minor, and Unknown to Known Errors chapter, the accuracy of coding CSv2 data elements varied significantly by CS schema and data element. Numerous factors could explain these results. Some of the factors were related to the characteristics of the cases presented for coding, whereas others were related to the attributes of study participants. This section of the report focuses on the observed accuracy rate and its association with the characteristics of study participants.

As with any study involving medical coding, it was expected that accuracy would be correlated with the complexity of the coding rules and the number of valid values that the system accepts for each data element. For example, data elements such as tumor behavior and laterality are generally coded more accurately than histology, because there are fewer values for a registrar to select from compared with the multitude of values available for coding primary site or histology. Meanwhile, primary site and histology are more important for CS coding because miscoding these data elements can result in the selection of an incorrect CS schema, with severe consequences for the accuracy of all CS data elements coded for that case.

Table 7 presents the proportion of accurate coding for primary site, histology, behavior, and laterality for all CS schemas included in the study. The results are aggregated by CS schema, in the sense that the proportions shown for a given schema were calculated from the answers of all participants who attempted to code a case in that schema (e.g., 1,399 answers for breast cases, 712 for bladder cases, etc.). As expected, behavior and laterality were coded correctly for most sites, with the exception of laterality for thyroid and pharyngeal tonsil cases. Meanwhile, there is wide variation in the accuracy of primary site, ranging from perfect results for thyroid, kidney, ovary, and pharyngeal tonsil to very poor results for bladder, breast, testis and GIST stomach. The perfect results are explained by the fact that there is a unique valid value for coding thyroid, ovarian, kidney parenchyma and pharyngeal tonsil tumors. Similarly, histologic accuracy was perfect for the Merkel cell CS schema, which is defined by a single value for histology. However, the results showed low accuracy for the coding of the histology of thyroid, breast, and testicular tumors. This finding is particularly important because histology errors for these sites can result in assigning a stage to tumors that are not covered by AJCC staging.
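A minimal sketch of this aggregation is shown below, assuming a hypothetical answer-level table with one row per participant, case, and data item; the column names and sample rows are invented for illustration, not taken from the study dataset.

```python
# Pool every answer submitted for a schema/data item and take the share that
# matches the preferred answer, as in Tables 7 through 10.
import pandas as pd

answers = pd.DataFrame({
    "schema":    ["Breast", "Breast", "Lung", "Lung"],
    "data_item": ["Primary site", "Primary site", "CS Extension", "CS Extension"],
    "code":      ["C509", "C504", "400", "400"],      # participant's code
    "preferred": ["C504", "C504", "400", "100"],      # study team's preferred code
})

answers["correct"] = answers["code"] == answers["preferred"]
accuracy = (answers.groupby(["schema", "data_item"])["correct"]
            .agg(n="size", pct_correct="mean"))
accuracy["pct_correct"] *= 100    # express as a percentage
print(accuracy)
```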

Table 7. Total participants and proportion of correct answers for primary site, histology, behavior and laterality, by CS schema (bladder, breast, corpus carcinoma, brain, colon, NET colon, esophagus GE, lung, thyroid, testis, prostate, melanoma, Merkel cell, kidney parenchyma, GIST stomach, ovary, pharyngeal tonsil).

The next three tables present the accuracy of selecting the preferred code value for all core CS data elements. These tables include the values for three data elements that are crucial for the determination of both SEER Summary Stage and AJCC Stage: CS Extension, CS Lymph Nodes and CS Mets at Diagnosis.

Table 8 shows the accuracy of coding tumor size, the direct extension of the tumor, and the method for evaluating the size and the extension. In general, tumor size was coded more accurately than tumor extension. Except for skin melanomas, the accuracy of coding tumor extension was poor. Very low accuracy of coding tumor extension was observed for some of the most frequent tumor sites, such as bladder, lung and colon. While the accuracy of coding the extension of skin melanomas was almost perfect, the accuracy of coding their tumor size was very low (48.5%), second worst after the accuracy of coding the size of brain tumors (47.9%). In general, the accuracy of the size/extension evaluation was somewhat better than the accuracy for size or extension. Nevertheless, there was no general pattern. For example, brain has low accuracy for tumor size (47.9%) and extension (82.9%) but perfect accuracy for evaluation, while for the pharyngeal tonsil schema the accuracy of coding size was very good (95.3%) and the accuracy of coding the evaluation method was poor (42.4%).

Table 9 presents the accuracy of coding data elements that describe the dissemination of tumors to regional lymph nodes. This group of data elements includes CS Lymph Nodes, Lymph Nodes Evaluation, Regional

Lymph Nodes Examined, and Regional Lymph Nodes Positive. With few exceptions, CS Lymph Nodes was coded more accurately than CS Extension and CS Tumor Size; the exceptions are breast, skin melanoma, thyroid and pharyngeal tonsil tumors. The perfect accuracy recorded for brain tumors is explained by the absence of regional lymphatic dissemination in brain tumors. Except for brain, the CS schema with the best accuracy for lymph nodes was prostate. At the other end of the accuracy spectrum, low accuracy was noted for pharyngeal tonsil, colon tumors, and skin melanomas. Accuracy was also generally better for the two data elements that pre-dated the CS schema: regional nodes examined and regional nodes positive.

The accuracy of coding the CS data elements that describe distant metastases is shown in Table 10. In general, the accuracy of coding metastases is better than the accuracy of coding regional lymph node involvement, local extension or tumor size. Noticeable exceptions were observed for coding the metastases of lung tumors (43.9%) and of NET colon tumors (20.5%). The CS schemas for the tumors with the highest incidence in the population, breast and prostate, both showed very good accuracy for all CS data elements. The four data elements that have been required for reporting more recently (Mets Lung, Mets Brain, Mets Liver and Mets Bone), while apparently easier to code, were not coded more accurately than CS Mets at Dx. The least accurate coding in this group of data elements was observed for the Mets at Dx Evaluation, with the accuracy for corpus carcinoma metastases below 20% and the accuracy for esophagus GE tumors at just 38.2%.

Table 8. Total participants and proportion of correct answers for tumor size, extension, and size/extension evaluation, by CS schema (bladder, breast, corpus carcinoma, brain, colon, NET colon, esophagus GE, lung, thyroid, testis, prostate, melanoma, Merkel cell, kidney parenchyma, GIST stomach, ovary, pharyngeal tonsil).

Table 9. Total participants and proportion of accurate answers for Lymph Nodes, Lymph Nodes Evaluation, Lymph Nodes Examined, and Lymph Nodes Positive, by CS schema (bladder, breast, corpus carcinoma, brain, colon, NET colon, esophagus GE, lung, thyroid, testis, prostate, melanoma, Merkel cell, kidney parenchyma, GIST stomach, ovary, pharyngeal tonsil).

Table 10. Total participants and proportion of correct answers for the CS metastasis-related data elements (Mets at Dx, Mets at Dx Evaluation, Mets at Dx Bone, Mets at Dx Brain, Mets at Dx Liver, Mets at Dx Lung), by CS schema.

Core CS data elements, and certain site-specific data elements, are used to derive the AJCC Stage Group and the Summary Stage. Table 11 presents the accuracy of the stage value that was derived for each case based on the set of answers that the participant returned for the case. For the purpose of calculating the proportion of accurate stage, the stage derived for each case completed by a participant was compared with the preferred stage derived from the set of final preferred answers adopted by the study team. Table 11 presents the accuracy calculated independently for Summary Stage 1977, Summary Stage 2000, AJCC Stage Group 6th edition and AJCC Stage Group 7th edition. According to the table, the accuracy of the AJCC 7th edition stage for breast cases was 83.1%; that is, for 83.1 percent of the breast cases coded by study participants, the AJCC stage calculated (derived) from the participants' answers was correct. For the remaining 16.9% of the cases, the combination of CS codes selected by the participant resulted in an inaccurate stage for the case.
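The calculation behind Table 11 can be sketched as follows. Here derive_stage stands in for the CS derivation algorithm, which the report does not reproduce; the function names, toy inputs, and the precomputed stage field are assumptions made for illustration.

```python
# Compare the stage derived from each participant's answer set against the
# stage derived from the preferred answers, and report the percent matching.
def proportion_accurate_stage(participant_answer_sets, preferred_answers, derive_stage):
    preferred_stage = derive_stage(preferred_answers)
    derived = [derive_stage(answers) for answers in participant_answer_sets]
    correct = sum(stage == preferred_stage for stage in derived)
    return 100.0 * correct / len(derived)

# Toy example: a "derivation" that simply reads a precomputed stage field.
toy_derive = lambda answers: answers["stage_group"]
participants = [{"stage_group": "IIA"}, {"stage_group": "IIA"}, {"stage_group": "IIB"}]
print(round(proportion_accurate_stage(participants, {"stage_group": "IIA"}, toy_derive), 1))
# -> 66.7 (two of the three derived stages match the preferred stage)
```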

Some of the most accurate answers across the four staging systems were obtained for kidney parenchyma, ovary, breast, brain, lung and GIST tumors of the stomach. High proportions of inaccurate stage were observed for skin melanomas, neuroendocrine tumors of the colon, pharyngeal tonsils, esophagus and prostate tumors. In general there were fewer errors for the SEER Summary Stage systems than for the AJCC system. Also, AJCC Stage Group 7th edition performed better than 6th edition, although the differences are minimal. As one would expect, the more data elements considered in the derivation of stage, the higher the likelihood of disagreement with the preferred stage. Based on Table 11, it can be inferred that Summary Stage 2000 and AJCC Stage Group 7th edition are probably the best global indicators of coding accuracy. Accordingly, the analytic team focused the rest of the analysis on finding predictors of these two global coding outcome indicators.

Table 11. Participant count and proportion of correct answers, by CS schema and staging system (AJCC 6 Stage, AJCC 7 Stage, Summary Stage 1977, Summary Stage 2000). AJCC stage is not applicable for brain, and AJCC 6th edition stage is not applicable for GIST stomach.

Table 12. Comparison of the proportion of accurate stage by certification status (CTR vs. non-CTR), for Summary Stage 2000 and AJCC Stage Group 7th edition, by CS schema.

a = number of distinct sets of answers collected from participants and used for stage derivation.
b = proportion of correct derived stage (at the sub-stage level).
c = p-value associated with the Pearson chi-square statistic (for the association between correct stage and CTR/non-CTR status).
d = statistically significant p-value.
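The Pearson chi-square comparisons in Tables 12 through 15 amount to a test on a 2x2 table of correct versus incorrect derived stages for the two participant groups being compared. The sketch below uses invented counts purely to show the mechanics; it is not the study's actual data or code.

```python
# 2x2 test of correct/incorrect derived stage by certification status.
from scipy.stats import chi2_contingency

#               correct  incorrect
table = [[850, 150],     # CTR participants
         [100,  40]]     # non-CTR participants
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.4f}")  # p < 0.05 would be flagged as significant
```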

Table 13. Comparison of the proportion of accurate stage by CS coding experience (5 or more years vs. less than 5 years), for Summary Stage 2000 and AJCC Stage Group 7th edition, by CS schema.

a = experience with the Collaborative Staging system, in two categories: 5 or more years of experience versus less than 5 years of experience.
b = number of distinct sets of answers collected from participants and used for stage derivation.
c = proportion of correct derived stage (at the sub-stage level).
d = p-value associated with the Pearson chi-square statistic (for the association between correct stage and years of Collaborative Staging experience).
e = statistically significant p-value.

Table 14. Comparison of the proportion of accurate stage by CSv2 coding experience (12 or more months vs. less than 12 months), for Summary Stage 2000 and AJCC Stage Group 7th edition, by CS schema.

a = experience with Collaborative Staging version 2, in two categories: 12 or more months of experience versus less than 12 months of experience.
b = number of distinct sets of answers collected from participants and used for stage derivation.
c = proportion of correct derived stage (at the sub-stage level).
d = p-value associated with the Pearson chi-square statistic (for the association between correct stage and CS version 2 experience).
e = statistically significant p-value.

Table 15. Comparison of the proportion of accurate stage by exposure to organized CSv2 training (some organized training vs. self-taught/on-the-job only), for Summary Stage 2000 and AJCC Stage Group 7th edition, by CS schema.

a = the "some organized training" category includes any respondent who attended webinars or national or local in-person training; the "self-taught/on-the-job only" category includes respondents who did not receive organized training.
b = number of distinct sets of answers collected from participants and used for stage derivation.
c = proportion of correct derived stage (at the sub-stage level).
d = p-value associated with the Pearson chi-square statistic (for the association between correct stage and attending organized training).
e = statistically significant p-value.

Figure 10. Risk of Inaccurate Staging by Education Achievement Category, adjusted for Certification Status, CS Schema, Abstracting Case Volume, and Experience with CSv2. (Chart: risk of staging errors by education achievement, for SS2000 and AJCC 7; reference category = "some college education". The reference level is the average risk of inaccurate staging observed among study participants in the some-college category. Education categories: high school, some college, associate degree, bachelor degree, graduate education.)

Figure 11. Risk of Inaccurate Staging by Certification Status, adjusted for Education Achievement, CS Schema, Abstracting Case Volume, and Experience with CSv2. (Chart: risk of staging errors by experience with CSv2, for SS2000 and AJCC 7; reference category = "no experience". Experience categories: no experience, less than 6 months, 6 months to 1 year, 1 year or more. ** = statistically significantly different from the reference category.)
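The adjusted risks shown in these figures imply a multivariable model of staging error on several participant characteristics at once. The report does not state the exact model or software; the sketch below shows one way such an adjusted analysis could look, as a logistic regression on a tiny synthetic dataset with invented variable names (a fuller model would also include education, CS schema, and abstracting case volume).

```python
# Hedged sketch of an adjusted-risk model: logistic regression of staging
# error on certification status and CSv2 experience, with synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "stage_error":     [0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 1],
    "ctr":             [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0],
    "csv2_experience": [1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 0, 0, 1],
})
model = smf.logit("stage_error ~ ctr + csv2_experience", data=df).fit(disp=False)
# Exponentiated coefficients approximate the change in odds of a staging error
# relative to the reference rows (non-CTR, no CSv2 experience).
print(np.exp(model.params))
```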

Figure 12. Risk of Inaccurate Staging by Experience with CSv2, adjusted for Education Achievement, Certification Status, CS Schema, and Abstracting Case Volume. (Chart: risk of staging errors by certification status, for SS2000 and AJCC 7; reference category = certified tumor registrar (CTR). Relative to CTRs, non-CTRs showed +16% risk for SS2000 and +30% for AJCC 7, the latter statistically significantly different from the reference category.)

CONCLUSION

This was the largest cancer registry reliability study ever done. The participants were experienced registrars and the database is robust. The primary site of lung had the most inaccuracies, and the most problematic variable was CS Extension. SEER Reliability Studies and CDC/NPCR audits over the past five years have shown the same results for this primary site and variable. The mean accuracy for CS Extension was 60.0%, with a range from 25.9% to 96.1%. Another area of concern is Lymph Nodes, with a mean of 76.6% and a range of 46.0% to 100%. The variable Behavior had the highest accuracy, with a mean of 99.7% and a range from 96.1% to 100%.

Registrars with some organized training in CSv2 fared slightly better for Summary Stage 2000 than registrars who were self-taught: the mean for those with training was 88.7% and the mean for those who were self-taught was 84.8%. Accuracy drops for AJCC Stage Group (7th edition), where registrars with some CSv2 training had a mean of 69.6% and self-taught registrars had a mean of 64.2%. Higher education and experience with CSv2 both reduce the risk of staging errors.

The transition from CSv1 to CSv2 has added complexity to staging that may not provide the level of quality for staging data that is needed. Consideration should be given to reducing the complexity by collapsing codes where possible (where stage will not be affected). The CSv2 Field Study Team hopes the data provided in this report assists the CSv2 Governance Committee in determining the best way to increase the quality of cancer stage data.

Tables, the protocol, and the appendices will be available on the CSv2 website. Standard setters involved in this reliability study have been given their specific data to perform additional reports as desired.

RECOMMENDATIONS

Recommendations for Educational Topics

- Retain the study documentation so that a comparison reliability study can be run on the same cases when CS is changed in the future, allowing a true comparison for evaluation.
- Simplify how to code when documentation is not in the record.
  o This problem is most prominent in coding the SSFs.
  o Consider combining all the codes indicating the test was not done, unknown if done, not applicable, etc. into a single code.
- Direct the AJCC Lung Chapter Team for the 8th Edition AJCC Staging Manual to provide more detail for lung staging.
- Develop a short presentation on how to read the online manual and how to find answers.
  o Education should not refer registrars to the CAnswer Forum but to the appropriate CSv2 Manual page.
  o Reinforce that registrars MUST read or consult the CSv2 Manual for coding.
- Collapse codes 0 and 1 in the CS Eval fields to 0 for clinical.
- Clarify documentation in the CS Manual and provide further education on the following specific areas:
  o Part I, Section I, Tumor Size: coding when multiple sizes are available, and clinical vs. pathologic.
  o Part I, Sections I and II, Lymph Nodes (includes CS Lymph Nodes and related SSFs): coding clinical lymph nodes when there is no documentation.
  o Part I, Section II, SSFs: clarification of "no histologic examination."
  o Breast Lymph Nodes: difference between codes 250 and 600.
  o Breast SSF #6 (Invasive/In Situ Components): clarification and updated examples.
  o Breast SSF #7 (Nottingham Score): clarification and examples.
  o Colon Extension and Lymph Nodes: anatomy of the colon in regard to coding the appropriate extension and lymph nodes.
  o Colon SSF #7 (CRM): how to code when there is no mention of CRM or documentation of margins, NOS.
  o Lung Extension: understanding the anatomy of the lung.
  o Lung Mets at Dx: clarification of the Mets at Dx codes, specifically pleural effusion.
  o Lung SSF 1 (Separate Tumor Nodules): examples of how to code.
  o Lung SSF 2 (Pleura/Layer): examples of how to code.
  o Prostate Extension (Clinical): coding inapparent vs. apparent.
  o Prostate SSF 3 (Pathologic Extension): emphasizing margin-positive codes.
  o Prostate SSFs 12 and 13 (Cores Positive and Examined): examples of how to code.
  o Bladder Extension: how to interpret and code "no muscle fragments available."

  o Bladder TS/Ext Eval: emphasizing that TURB is an Eval 1.
  o Kidney SSF 2 (Vein Involvement): examples of how to code.
  o Kidney SSF 3 (Adrenal Gland Involvement): examples of how to code.
  o Kidney SSF 4 (Sarcomatoid Features): examples of how to code.
  o Melanoma Skin Tumor Size and SSF 1 (Breslow Depth): examples of the differences between the two and how to code.
  o Melanoma Skin Lymph Nodes: examples of in-transit mets and satellite nodules.
  o Ovary Extension: examples of how to code peritoneal implants.
  o Ovary SSF 3 (Residual Tumor After Cytoreduction Surgery): examples of how to code.
  o GIST Stomach SSF 6 (Mitotic Rate): examples of how to code, specifically the format.
  o Corpus Carcinoma Mets at Dx: definition of omental sampling and how it relates to Mets at Dx.
  o Esophagus GE Junction SSF 25 (Schema Discriminator): examples of how to code.

APPRECIATION

This three-year project required the assistance of many cancer data professionals. We would like to acknowledge their work and dedication to the project.

CSv2 Field Study Team
Lynda Douglas (Co-Leader), CDC/NPCR
Jennifer Ruhl (Co-Leader), NCI SEER
Dave Annett, IMS
Connie Bura, CoC
Elaine Collins, NPCR
Jean Cyr, IMS
Brenda Edwards, NCI SEER
Donna Gress, AJCC
Jim Hofferkamp, NAACCR
Gemma Lee, Canada
Marty Madera, CoC/AJCC
Zachary Myles, CDC/NPCR
Jerri Linn Phillips, CoC
Karen Pollitt, AJCC
Dave Roney, IMS
Shannon Vann, NAACCR

CS Data Analysis Team
Serban Negoita (Leader), Westat
Jean Cyr, IMS
Lynda Douglas, CDC/NPCR
Stacey Fedewa, ACS

Missy Jamison, NCI SEER
Carol Kosary, NCI SEER
Gemma Lee, Canada
Zachary Myles, CDC/NPCR
Jerri Linn Phillips, CoC
Jennifer Ruhl, NCI SEER

Site Group 1
Jennifer Ruhl (Leader), NCI SEER
Cindi Dryer, SEER
Terry Dawson, NPCR
Bryan Palis, CoC
Cyndy Russell, Canada
Shannon Vann, NAACCR

Site Group 2
Gemma Lee (Leader), Canada
Iris Chilton, Canada
Kathy Malin, CoC
Shawky Matta, SEER
Nancy Santos, SEER
Katheryne Vance, NPCR

Site Group 3
Lynda Douglas (Leader), CDC/NPCR
Elaine Collins, NPCR
Joann Janzen, Canada
Erica McNamara, CoC
Bobbi Matt, SEER
Jeanne Whitlock, SEER

Site Group 4
Jerri Linn Phillips (Leader), CoC
Sheena Batts, SEER
Leah Driscoll, SEER
Betty Gentry, NPCR
Jim Hofferkamp, NAACCR
Katharine Pearson, Canada
Theola Rarick, SEER
Tony Robbins, ACS

Technical Expert Consultants
Lynn Ries, Contractor
April Fritz, Contractor
Annette Hurlbut, Contractor (Transcription)
Mary Mesnard, Westat


More information

ICD-O-3 UPDATES - PENDING

ICD-O-3 UPDATES - PENDING ICD-O-3 UPDATES - PENDING FCDS Annual Meeting July 26, 2013 Sunrise, Florida Steven Peace, CTR ICD-O-3 Work Group ICD-O-3 WORK GROUP Name April Fritz, CTR Lynn Ries, MS Lois Dickie, CTR Linda Mulvihill,

More information

Explaining Blanks and X, Ambiguous Terminology and Support for AJCC Staging

Explaining Blanks and X, Ambiguous Terminology and Support for AJCC Staging Explaining Blanks and X, Ambiguous Terminology and Support for AJCC Staging Donna M. Gress, RHIT, CTR Validating science. Improving patient care. This presentation was supported by the Cooperative Agreement

More information

NAACCR Webinar Series 1

NAACCR Webinar Series 1 Collecting Cancer Data: Skin Malignancies 2/4/2010 NAACCR 2009 2010 Webinar Series Questions Please use the Q&A panel to submit your questions Send questions to All Panelist Collecting Cancer Data: Skin

More information

Introduction & Descriptors

Introduction & Descriptors AJCC 8 th Edition Staging Introduction & Descriptors Donna M. Gress, RHIT, CTR Technical Editor, AJCC Cancer Staging Manual First Author, Chapter 1: Principles of Cancer Staging Validating science. Improving

More information

NAACCR Webinar Series 1

NAACCR Webinar Series 1 NAACCR 2009 2010 Webinar Series Collecting Cancer Data: Kidney 1 Questions Please use the Q&A panel to submit your questions Send questions to All Panelist 2 Fabulous Prizes 3 NAACCR 2009 2010 Webinar

More information

Q&A Session NAACCR Webinar Series Collecting Cancer Data: Pancreas January 05, 2012

Q&A Session NAACCR Webinar Series Collecting Cancer Data: Pancreas January 05, 2012 Q&A Session NAACCR Webinar Series Collecting Cancer Data: Pancreas January 05, 2012 Q: Will sticky notes be transferrable from the previous electronic version of CS to the updated version? A: It is our

More information

North American Association of Central Cancer Registries, Inc. (NAACCR)

North American Association of Central Cancer Registries, Inc. (NAACCR) North American Association of Central Cancer Registries, Inc. (NAACCR) 2018 Implementation Guidelines and Recommendations (For NAACCR Standards Volume II, Data Standards and Data Dictionary, Version 18,

More information

Commission on Cancer Updates

Commission on Cancer Updates Commission on Cancer Updates OBJECTIVES PROVIDE CANCER REGISTRARS WITH INFORMATION ABOUT CURRENT COC 2018 CHANGES DISCUSS CHANGES RELATED TO CANCER REGISTRY DATA COLLECTION DISCUSS CHANGES RELATED TO CANCER

More information

North American Association of Central Cancer Registries, Inc. (NAACCR)

North American Association of Central Cancer Registries, Inc. (NAACCR) North American Association of Central Cancer Registries, Inc. (NAACCR) 2016 Implementation Guidelines and Recommendations (For NAACCR Standards Volume II, Data Standards and Data Dictionary, Version 16,

More information

Q&A for Collecting Cancer Data: Unusual Sites and Histologies Thursday, October 1, 2015

Q&A for Collecting Cancer Data: Unusual Sites and Histologies Thursday, October 1, 2015 Q&A for Collecting Cancer Data: Unusual Sites and Histologies Thursday, October 1, 2015 Q1: why can t we use pos pleural effusion to stage t value? A: Pleural effusion in Pleural Mesothelioma does not

More information

AJCC 8 th Edition Staging. Introduction & Descriptors. Learning Objectives. This webinar is sponsored by

AJCC 8 th Edition Staging. Introduction & Descriptors. Learning Objectives. This webinar is sponsored by AJCC 8 th Edition Staging Introduction & Descriptors Donna M. Gress, RHIT, CTR Technical Editor, AJCC Cancer Staging Manual First Author, Chapter 1: Principles of Cancer Staging Validating science. Improving

More information

A CENTRAL REGISTRY RELIABILITY STUDY

A CENTRAL REGISTRY RELIABILITY STUDY A CENTRAL REGISTRY RELIABILITY STUDY Visual Editor TNM & Summary Stage Staging Skill Assessment Donna M. Hansen, CTR Auditor & Education Training Coordinator California Cancer Registry NAACCR June 16,

More information

Data Quality Analysis of Prostate Cancer Site Specific Factors in Metropolitan Detroit SEER Data,

Data Quality Analysis of Prostate Cancer Site Specific Factors in Metropolitan Detroit SEER Data, Data Quality Analysis of Prostate Cancer Site Specific Factors in Metropolitan Detroit SEER Data, 2004-2012 Jeanne Whitlock, MSLS, CTR Julie George, MS Ron Shore, MPH Fawn D. Vigneau, JD, MPH Metropolitan

More information

2018 Implementation: SEER Summary Stage 2018

2018 Implementation: SEER Summary Stage 2018 2018 Implementation: SEER Summary Stage 2018 PRESENTED BY JENNIFER RUHL OCTOBER 24, 2018 10/23/2018 1 Q&A Please submit all questions concerning the content of the webinar through the Q&A panel Submit

More information

Evaluation of Abstracting: Cancers Diagnosed in MCSS Quality Control Report 2005:2. Elaine N. Collins, M.A., R.H.I.A., C.T.R

Evaluation of Abstracting: Cancers Diagnosed in MCSS Quality Control Report 2005:2. Elaine N. Collins, M.A., R.H.I.A., C.T.R Evaluation of Abstracting: Cancers Diagnosed in 2001 MCSS Quality Control Report 2005:2 Elaine N. Collins, M.A., R.H.I.A., C.T.R Jane E. Braun, M.S., C.T.R John Soler, M.P.H September 2005 Minnesota Department

More information

Setting the stage for change: upgrading the physician cancer case reporting application in New York

Setting the stage for change: upgrading the physician cancer case reporting application in New York Setting the stage for change: upgrading the physician cancer case reporting application in New York April Austin New York State Cancer Registry (NYSCR) July 12, 2018 June 13, 2018 Aerial view of Thousand

More information

GUIDELINES FOR ICD O 3 HISTOLOGY CODE AND BEHAVIOR UPDATE IMPLEMENTATION Effective January 1, 2018

GUIDELINES FOR ICD O 3 HISTOLOGY CODE AND BEHAVIOR UPDATE IMPLEMENTATION Effective January 1, 2018 North American Association of Central Registries, Inc GUIDELINES FOR ICD O 3 HISTOLOGY CODE AND BEHAVIOR UPDATE IMPLEMENTATION Effective January 1, 2018 Prepared by: NAACCR ICD O 3 Update Implementation

More information

Boot Camp /5/15

Boot Camp /5/15 Abstracting & Coding Boot Camp: Cancer Case Scenarios 2014-2015 NAACCR Webinar Series March 5, 2015 Q&A Please submit all questions concerning webinar content through the Q&A panel. Reminder: If you have

More information

CDC & Florida DOH Attribution

CDC & Florida DOH Attribution FCDS Annual Educational Conference Tampa, Florida July 19, 2018 Steven Peace, CTR 1 CDC & Florida DOH Attribution We acknowledge the Centers for Disease Control and Prevention, for its support of the Florida

More information

Collaborative Stage for TNM 7 - Revised 12/02/2009 [ Schema ]

Collaborative Stage for TNM 7 - Revised 12/02/2009 [ Schema ] CS Tumor Size Collaborative Stage for TNM 7 - Revised 12/02/2009 [ Schema ] Note: the specific tumor size as documented in the medical record. If the ONLY information regarding tumor size is the physician's

More information

Abstracting and Coding Boot Camp: Cancer Case Scenarios

Abstracting and Coding Boot Camp: Cancer Case Scenarios NAACC R 2015-2016 Webinar Series Abstracting and Coding Boot Camp: Cancer Case Scenarios NAACCR 2015 2016 Webinar Series Presented by: Angela Martin amartin@naaccr.org Jim Hofferkamp jhofferkamp@naaccr.org

More information

Collaborative Stage Data Collection System (CSv2) Reporting Requirements Commission on Cancer (CoC) (Updated 4/8/ changes in red print)

Collaborative Stage Data Collection System (CSv2) Reporting Requirements Commission on Cancer (CoC) (Updated 4/8/ changes in red print) Collaborative Stage Data Collection System (CSv2) Reporting Requirements Commission on Cancer (CoC) (Updated 4/8/2010 - changes in red print) Timing. Collaborative Stage version 2 must be used for all

More information

Seventh Edition Staging 2017 Breast

Seventh Edition Staging 2017 Breast Seventh Edition Staging 2017 Breast Donna M. Gress, RHIT, CTR Validating science. Improving patient care. No materials in this presentation may be repurposed in print or online without the express written

More information

MCR MINI UPDATE DECEMBER 2017

MCR MINI UPDATE DECEMBER 2017 Fellow Registrars, MCR staff are very busy in November double checking our data before sending it to the national level, but we still have some important news, tips and resources to share with you. DUE

More information

Stage Data Capture in Ontario

Stage Data Capture in Ontario Stage Data Capture in Ontario February 23, 2010 Agenda Refresher: Ontario s Stage Capture Project Collaborative Staging and Population Stage Reporting in Ontario Use of Stage Data in System Performance

More information

Comparative Analysis of Stage and Other Prognostic Factors Among Urethral, Ureteral, and Renal Pelvis Malignant Tumors

Comparative Analysis of Stage and Other Prognostic Factors Among Urethral, Ureteral, and Renal Pelvis Malignant Tumors Comparative Analysis of Stage and Other Prognostic Factors Among Urethral, Ureteral, and Renal Pelvis Malignant Tumors Presented to NAACCR Annual Conference 2012 Serban Negoita, MD, DrPH; Marsha Dunn,

More information

Summary Stage 2018 (SS2018)

Summary Stage 2018 (SS2018) Summary Stage 2018 (SS2018) NAACCR October Webinar October 24, 2018 General Information 2 Summary Stage 2018 1 General Summary Stage is ANATOMICALLY based Unlike AJCC, it does not use the following in

More information

Exercise 15: CSv2 Data Item Coding Instructions ANSWERS

Exercise 15: CSv2 Data Item Coding Instructions ANSWERS Exercise 15: CSv2 Data Item Coding Instructions ANSWERS CS Tumor Size Tumor size is the diameter of the tumor, not the depth or thickness of the tumor. Chest x-ray shows 3.5 cm mass; the pathology report

More information

What s New in 2012 FCDS Annual Meeting Review

What s New in 2012 FCDS Annual Meeting Review What s New in 2012 FCDS Annual Meeting Review FCDS Educational Webcast Series August 16, 2012 FCDS Staff Steven Peace, CTR 2012 FORDS Cancer Program Standards CSv02.04 1 What is Cancer / What is Reportable

More information

AJCC 8 th Edition Staging. Introduction & Descriptors. Learning Objectives. This webinar is sponsored by

AJCC 8 th Edition Staging. Introduction & Descriptors. Learning Objectives. This webinar is sponsored by AJCC 8 th Edition Staging Introduction & Descriptors Donna M. Gress, RHIT, CTR Technical Editor, AJCC Cancer Staging Manual First Author, Chapter 1: Principles of Cancer Staging Validating science. Improving

More information

Collaborative Stage for TNM 7 - Revised 07/14/2009 [ Schema ]

Collaborative Stage for TNM 7 - Revised 07/14/2009 [ Schema ] MelanomaSkin CS Tumor Size Collaborative Stage for TNM 7 - Revised 07/14/2009 [ Schema ] Code 000 No mass/tumor found Description 001-988 001-988 millimeters (code exact size in millimeters) 989 989 millimeters

More information

Take Home Quiz 1 Please complete the quiz below prior to the session. Use the Multiple Primary and Histology Rules

Take Home Quiz 1 Please complete the quiz below prior to the session. Use the Multiple Primary and Histology Rules Take Home Quiz 1 Please complete the quiz below prior to the session. Use the Multiple Primary and Histology Rules Case 1 72 year old white female presents with a nodular thyroid. This was biopsied in

More information

North American Association of Central Cancer Registries, Inc. (NAACCR)

North American Association of Central Cancer Registries, Inc. (NAACCR) North American Association of Central Cancer Registries, Inc. (NAACCR) 2015 Implementation Guidelines and Recommendations (For NAACCR Standards Volume II, Data Standards and Data Dictionary, Version 15,

More information

DIAL-IN INFORMATION FOR ALL THE TELECONFERENCES: Dial In Number: Participant Code:

DIAL-IN INFORMATION FOR ALL THE TELECONFERENCES: Dial In Number: Participant Code: FCDS DAM 2006 SUPPLEMENT A HEMATOLOGIC MALIGNANCIES SOUTH FLORIDA NBC 6 VIDEO REPORT ON WOMEN AND CANCER (Includes comments from FCDS's Deputy Project Director, Dr. Jill MacKinnon) FCDS REGISTER VOL. 33

More information

FORDS to STORE: The Evolution of Cancer Registry Coding Frederick L. Greene, MD FACS Medical Director, Cancer Data Services Levine Cancer Institute

FORDS to STORE: The Evolution of Cancer Registry Coding Frederick L. Greene, MD FACS Medical Director, Cancer Data Services Levine Cancer Institute FORDS to STORE: The Evolution of Cancer Registry Coding Frederick L. Greene, MD FACS Medical Director, Cancer Data Services Levine Cancer Institute Charlotte, NC National Accreditation Program for Breast

More information

Interactive Staging Bee

Interactive Staging Bee Interactive Staging Bee ROBIN BILLET, MA, CTR GA/SC REGIONAL CONFERENCE NOVEMBER 6, 2018? Clinical Staging includes any information obtained about the extent of cancer obtained before initiation of treatment

More information

Registrar s Guide to Chapter 1, AJCC Seventh Edition. Overview. Learning Objectives. Describe intent and purpose of AJCC staging

Registrar s Guide to Chapter 1, AJCC Seventh Edition. Overview. Learning Objectives. Describe intent and purpose of AJCC staging Registrar s Guide to Donna M. Gress, RHIT, CTR Validating science. Improving patient care. This presentation was supported by the Cooperative Agreement Number DP13-1310 from The Centers for Disease Control

More information

Education & Training Plan

Education & Training Plan 2018-2019 Education & Training Plan 1 F C D S A N N U A L C O N F E R E N C E T A M P A, F L O R I D A 7 / 1 8 / 2 0 1 8 S T E V E N P E A C E, C T R CDC & Florida DOH Attribution 2 Funding for this conference

More information

Outline. How to Use the AJCC Cancer Staging Manual, 7 th ed. 7/9/2015 FCDS ANNUAL CONFERENCE ST PETERSBURG, FLORIDA JULY 30, 2015.

Outline. How to Use the AJCC Cancer Staging Manual, 7 th ed. 7/9/2015 FCDS ANNUAL CONFERENCE ST PETERSBURG, FLORIDA JULY 30, 2015. 1 How to Use the AJCC Cancer Staging Manual, 7 th ed. FCDS ANNUAL CONFERENCE ST PETERSBURG, FLORIDA JULY 30, 2015 Steven Peace, CTR Outline 2 History, Purpose and Background Purchase and Ordering Information

More information

Collaborative Stage Data Collection System (CSv xx) Reporting Requirements Commission on Cancer (CoC)

Collaborative Stage Data Collection System (CSv xx) Reporting Requirements Commission on Cancer (CoC) Collaborative Stage Data Collection System (CSv 02.03.xx) Reporting Requirements Commission on Cancer (CoC) Timing. Collaborative Stage version 02.03.xx must be used for all cases diagnosed on or after

More information

6. Cervical Lymph Nodes and Unknown Primary Tumors of the Head and Neck

6. Cervical Lymph Nodes and Unknown Primary Tumors of the Head and Neck 1 Terms of Use The cancer staging form is a specific document in the patient record; it is not a substitute for documentation of history, physical examination, and staging evaluation, or for documenting

More information

You Want ME to Stage that Case???

You Want ME to Stage that Case??? You Want ME to Stage that Case??? Jayne Holubowsky, CTR, Director, Virginia Cancer Registry 2 nd DelMarVa-DC Regional Conference October 11, 2018 What s New in the AJCC 8 th Edition Objectives Explain

More information

Collaborative Staging Manual and Coding Instructions Part II: Primary Site Schema

Collaborative Staging Manual and Coding Instructions Part II: Primary Site Schema C44.0-C44.9, C51.0-C51.2, C51.8-C51.9, C60.0-C60.2, C60.8-C60.9, C63.2 (M-8720-8790) C44.0 Skin of lip, NOS C44.1 Eyelid C44.2 External ear C44.3 Skin of ear and unspecified parts of face C44.4 Skin of

More information

Collaborative Staging

Collaborative Staging Slide 1 Collaborative Staging Site-Specific Instructions Prostate 1 In this presentation, we are going to take a closer look at the collaborative staging data items for the prostate primary site. Because

More information

Physician On-line Staging Application. Darlene Dale Head, PMH Cancer Registry

Physician On-line Staging Application. Darlene Dale Head, PMH Cancer Registry Physician On-line Staging Application Darlene Dale Head, PMH Cancer Registry NAACCR 2004 Outline Overview of Princess Margaret Hospital History of Staging at PMH Steps to On-Line Physician Staging at PMH

More information

Thursday, August 16, :30 AM - 4:30 PM and Friday, August 17, :30 AM 12:00 PM Crowne Plaza 830 Phillips Lane Louisville, KY 40209

Thursday, August 16, :30 AM - 4:30 PM and Friday, August 17, :30 AM 12:00 PM Crowne Plaza 830 Phillips Lane Louisville, KY 40209 KCR newsletter March 2018 KCR 2018 Fall Workshop/Regional Meeting 2018 Tri-State Regional Cancer Registrars Meeting Presented by: Kentucky Cancer Registry, Indiana Cancer Consortium, and Ohio Cancer Incidence

More information

FCDS is pleased to announce the rescheduled

FCDS is pleased to announce the rescheduled OCTOBER/ NOVEMBER 2010 MONTHLY JOURNAL OF UPDATES AND INFORMATION FCDS 2010 EDUCATIONAL WEBCAST SERIES RECORDINGS: *CSV2 LUNG, *CSV2 BREAST, *CSV2 PROSTATE, *CSV2 COLON, *HEME/LYMPH PART I, *HEME/LYMPH

More information

NCDB Vendor Webinar: NCDB Call for Data January 2018 and Upcoming RQRS Revisions

NCDB Vendor Webinar: NCDB Call for Data January 2018 and Upcoming RQRS Revisions NCDB Vendor Webinar: NCDB Call for Data January 2018 and Upcoming RQRS Revisions American American College College of of Surgeons 2013 Content 2014 Content cannot be be reproduced or or repurposed without

More information

SEPTEMBER 2010 GETTING REGISTERED

SEPTEMBER 2010 GETTING REGISTERED SEPTEMBER 2010 MONTHLY JOURNAL OF UPDATES AND INFORMATION FCDS 2010 EDUCATIONAL WEBCAST SERIES RECORDINGS: CSV2 LUNG, CSV2 BREAST, CSV2 PROSTATE, CSV2 COLON, HEME/LYMPH PART I, AND HEME/LYMPH PART II GETTING

More information

FINALIZED SEER SINQ QUESTIONS April July, 2017

FINALIZED SEER SINQ QUESTIONS April July, 2017 20170040 Source 1: 2016 SEER Manual pgs: 91 Source 2: 2007 MP/H Rules Notes: Lung MP/H Rules/Histology--Lung: What is the histology code for lung cancer case identified pathologically from a metastatic

More information

SEER Summary Stage Still Here!

SEER Summary Stage Still Here! SEER Summary Stage Still Here! CCRA NORTHERN REGION STAGING SYMPOSIUM SEPTEMBER 20, 2017 SEER Summary Stage Timeframe: includes all information available through completion of surgery(ies) in the first

More information

AJCC-NCRA Education Needs Assessment Results

AJCC-NCRA Education Needs Assessment Results AJCC-NCRA Education Needs Assessment Results Donna M. Gress, RHIT, CTR Survey Tool 1 Survey Development, Delivery, Analysis THANKS to NCRA for the following work Developed survey with input from partners

More information

Quiz Adenocarcinoma of the distal stomach has been increasing in the last 20 years. a. True b. False

Quiz Adenocarcinoma of the distal stomach has been increasing in the last 20 years. a. True b. False Quiz 1 1. Which of the following are risk factors for esophagus cancer. a. Obesity b. Gastroesophageal reflux c. Smoking and Alcohol d. All of the above 2. Adenocarcinoma of the distal stomach has been

More information

Collecting Cancer Data: Breast. Prizes! Collecting Cancer Data: Breast 8/4/ NAACCR Webinar Series 1. NAACCR Webinar Series

Collecting Cancer Data: Breast. Prizes! Collecting Cancer Data: Breast 8/4/ NAACCR Webinar Series 1. NAACCR Webinar Series Collecting Cancer Data: Breast NAACCR 2008 2009 Webinar Series Prizes! Question of the Month! The participant that submits the best question of the session will receive a fbl fabulous Pi Prize! Shannon

More information

Coding Pitfalls 9/11/14

Coding Pitfalls 9/11/14 Coding Pitfalls 2013 2014 NAACCR Webinar Series September 11, 2014 Q&A Please submit all questions concerning webinar content through the Q&A panel. Reminder: If you have participants watching this webinar

More information