2016 Workplace and Gender Relations Survey of Active Duty Members. Nonresponse Bias Analysis Report
Additional copies of this report may be obtained from:

Defense Technical Information Center
ATTN: DTIC-BRR
8725 John J. Kingman Rd., Suite #0944
Ft. Belvoir, VA

Or from:
OPA Report No.
May

WORKPLACE AND GENDER RELATIONS SURVEY OF ACTIVE DUTY MEMBERS: NONRESPONSE BIAS ANALYSIS REPORT

Office of People Analytics (OPA)
Defense Research, Surveys, and Statistics Center
4800 Mark Center Drive, Suite 06E22, Alexandria, VA
Contributors

Many members of the Office of People Analytics (OPA) contributed to the analyses and writing of this report assessing the level and direction of potential nonresponse bias in estimates from the 2016 Workplace and Gender Relations Survey of Active Duty Members (2016 WGRA). Sections of the report were written by the following authors:

- Eric Falk wrote Section 1: Comparison of Known Population Values With Weighted Survey Estimates.
- Jeff Schneider wrote Section 2: Analysis of OPA's Survey of Nonrespondents, based on the 2016 Workplace and Gender Relations Survey of Active Duty Nonresponse Bias Survey.
- Jeff Schneider wrote Section 3: Evaluate the Sensitivity of Different Post-Survey Adjustments (Weighting Methods) on Survey Estimates.

Reviews and comments were provided by David McGrath and Lisa Davis. Eric Falk and David McGrath guided the studies and served as primary editors.
Table of Contents

- Summary of Findings
- Section 1: Comparison of Known Population Values With Weighted Survey Estimates
  - Summary
- Section 2: Analysis of OPA's Survey of Nonrespondents
  - Weighting the 2016 WGRA-N
  - Summary
- Section 3: Evaluate the Sensitivity of Different Post-Survey Adjustments (Weighting Methods) on Survey Estimates
  - OPA Weighting Methodology
  - Comparison of Adjustment Stages and Final Weights
  - Comparison of Key Estimates
  - Summary
- References

List of Tables

- Table 1. 2016 WGRA Reporting Questions
- Table 2. Summary of Sexual Assault Reports in DSAID, by Type of Report and Service
- Table 3. 2016 WGRA Estimates vs. Actual Number of Reported Sexual Assaults
- Table 4. 2016 WGRA-N Comparison Questions: Control Questions
- Table 5. 2016 WGRA-N Comparison Questions: MEO Questions
- Table 6. Sample Disposition Codes for 2016 WGRA-N
- Table 7. Comparison of 2016 WGRA Sample With Nonresponse Sample (2016 WGRA-N)
- Table 8. WGRA-N Final Weight Moments
- Table 9. Comparison of WGRA Survey With Nonresponse Study Control Questions
- Table 10. Comparison of WGRA Survey With Nonresponse Study MEO Questions
- Table 11. Comparison Between Adjustment Factors: Standard OPA vs. 2-Stage Boosted Method for Eligibility, Completion, and Poststratification Adjustments
- Table 12. Comparison Between Standard OPA and 2-Stage Boosted Final Weights
- Table 13. Comparison of OPA and Westat Key Survey Estimates (Males)
- Table 14. Comparison of OPA and Westat Key Survey Estimates (Females)
2016 WORKPLACE AND GENDER RELATIONS SURVEY OF ACTIVE DUTY MEMBERS: NONRESPONSE BIAS ANALYSIS REPORT

Survey nonresponse has the potential to introduce bias in the estimates of key outcomes. To the extent that nonrespondents and respondents differ on observed characteristics, the Office of People Analytics (OPA) can use weights to adjust the sample so the weighted respondents match the full population on the most critical characteristics. This eliminates the portion of nonresponse bias (NRB) associated with those observed variables, provided these variables are strongly associated with the behaviors of interest. When all NRB can be eliminated in this manner, the missingness is called ignorable or missing at random (Little & Rubin, 2002). The more observable demographic variables incorporated into the weights, the more plausible it is to assume that the weights eliminate any NRB.

The objective of this research was to assess the extent of NRB for the estimated percentage of members who indicated experiencing a sexual assault in the prior 12 months (henceforth referred to as the sexual assault rate) in the active duty military. The level of nonresponse bias can vary for every question on the survey, but OPA focused on the sexual assault rate because it is the most important survey topic.

Nonresponse bias occurs when survey respondents are systematically different from nonrespondents. Statistically, the bias in a respondent mean (e.g., the sexual assault rate) is a function of the response rate and the relationship (covariance) between response propensities and the survey outcome, and takes the following form:

    Bias(y_r) = cov(p, y) / p_bar,

where y_r is the respondent mean, cov(p, y) is the covariance between each member's response propensity p and the outcome y, and p_bar is the mean response propensity. NRB can occur with high or low survey response rates, but the decrease in overall survey response rates within the Department, as well as in civilian studies, over the past decade has resulted in a greater focus on potential NRB.
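As a concrete check of this identity, the following sketch (hypothetical data, not OPA code; the 5% prevalence and the 15%/25% propensities are invented for illustration) simulates a population whose response propensity is lower for members who experienced the outcome, and compares the analytic bias cov(p, y) / p_bar with the realized bias of the respondent mean.

```python
# Hypothetical illustration of Bias(y_r) = cov(p, y) / p_bar; not OPA code.
import random

random.seed(0)

n = 200_000
# Outcome indicator y (e.g., experienced the behavior) with 5% prevalence.
y = [1 if random.random() < 0.05 else 0 for _ in range(n)]
# Response propensity p: 15% for members with the outcome, 25% otherwise.
p = [0.15 if yi else 0.25 for yi in y]

p_bar = sum(p) / n
y_bar = sum(y) / n
cov_py = sum((pi - p_bar) * (yi - y_bar) for pi, yi in zip(p, y)) / n

expected_bias = cov_py / p_bar        # analytic bias of the respondent mean

# Simulate who actually responds and measure the realized bias directly.
respondents = [yi for pi, yi in zip(p, y) if random.random() < pi]
observed_bias = sum(respondents) / len(respondents) - y_bar

print(round(expected_bias, 3), round(observed_bias, 3))
```

Because members who experienced the outcome respond less often in this toy setup, the unweighted respondent mean understates the true rate by roughly two percentage points; weighting on variables correlated with p removes exactly the portion of this bias those variables explain.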
OPA investigated the presence of NRB using several different methods, which are summarized in this report as follows:

1. Comparison of known population values with weighted survey estimates,
2. Analysis of OPA's survey of nonrespondents, and
3. Evaluation of the sensitivity of different post-survey adjustments (weighting methods) on survey estimates.

Summary of Findings

NRB is difficult to assess, and most authors therefore recommend averaging across several different studies to measure it (Montaquila & Olson, 2012). OPA has taken this approach here and conducted three studies to assess NRB in sexual assault estimates. Based on these three studies, OPA did not find evidence of significant NRB in sexual assault estimates from the 2016 WGRA.
The results from each study are summarized below:

1. Comparison of known population values with weighted survey estimates. OPA compared weighted estimates of reported sexual assaults from the survey with actual sexual assaults reported to SAPRO. From this analysis, OPA concludes there is no evidence of NRB in sexual assault estimates from the 2016 WGRA.

2. Analysis of OPA's survey of nonrespondents. Results from the 2016 WGRA Non-Responders Study (2016 WGRA-N) were mixed. OPA determined five of the eight estimates from overlapping questions were significantly different between the 2016 WGRA and 2016 WGRA-N, which may be an indication of NRB. However, the estimates from one particular survey were not consistently more positive or negative in comparison to the other survey. Additionally, given the low response rate to the nonresponse study (under 5%, an extremely low number for a standard OPA survey) and the fact that the NRB study was a non-probability sample, the results should be used cautiously. From this analysis, OPA concludes there is little evidence of NRB in sexual assault estimates from the 2016 WGRA.

3. Evaluation of the sensitivity of different post-survey adjustments (weighting methods) on survey estimates. Analysis of estimates using four different weighting methods shows that both the weights and key survey estimates are robust to the choice of weighting method. From this analysis, OPA concludes there is little evidence of NRB in sexual assault estimates from the 2016 WGRA.
Section 1: Comparison of Known Population Values With Weighted Survey Estimates

To assess total survey error, one common method is to compare a known parameter to a weighted estimate from the survey.[1] If OPA's sampling, measurement, weighting, and analysis methods performed well, confidence intervals of estimates should frequently contain the true parameters. In this investigation, OPA examined the number of reported sexual assaults in the active duty. A similar type of analysis was performed by the RAND Corporation for the 2014 RAND Military Workplace Study (2014 RMWS; Morral, 2015) and by OPA for the 2015 Workplace and Gender Relations Survey of Reserve Component Members (2015 WGRR; DMDC, 2016a).

It is important to point out that OPA does not know the true number of sexual assaults in the active duty military. However, OPA compared the number of sexual assault reports filed by active duty members to weighted estimates from survey respondents to assess NRB (and overall total survey error). The Sexual Assault Prevention and Response Office (SAPRO) provided OPA with summary information on the number of reported sexual assaults (unrestricted and restricted), which was collected via the Defense Sexual Assault Incident Database (DSAID). For a record to be entered into DSAID, the survivor needed to complete a Victim Reporting Preference Statement (DD Form 2910) indicating whether the survivor would like to make either a restricted or unrestricted report.

To match all possible 12-month reference periods for indicating a sexual assault on the 2016 WGRA, OPA requested the sexual assaults reported from July 1, 2015 through October 31, 2016 (16 months of data). For example, if a survey respondent completed the survey on July 22 and indicated they reported a sexual assault within the last year, DSAID would contain the incident in the July 23, 2015 through July 22, 2016 timeframe.
Rather than create all time frames that mirror the survey period, OPA created five yearly timeframes as a basis for comparison to the estimates (see list below). In total, there were 4,079 cases reported during the 16-month period provided by SAPRO. The five 12-month periods that mirror the survey fielding period of July 22 to October 14 are:

- July 1, 2015 through June 30, 2016,
- August 1, 2015 through July 31, 2016,
- September 1, 2015 through August 31, 2016,
- October 1, 2015 through September 30, 2016, and
- November 1, 2015 through October 31, 2016.

The summary files provided by SAPRO contained the number of sexual assaults by Service and whether the report type was restricted or unrestricted. On the 2016 WGRA survey, sexual assault survivors were asked follow-up questions to determine 1) whether they filed a formal report of sexual assault, 2) the type of report filed, and 3) whether the sexual assault occurred within the last 12 months. The 2016 WGRA survey questions regarding these behaviors are displayed in Table 1.

Table 1. 2016 WGRA Reporting Questions
Variables: SAMILREPT, SA12MOS, SAREPTYPE

To ensure comparability between the DSAID file and the survey results, OPA selected cases with the following characteristics:

- Survivor service affiliation was Army, Navy, Marine Corps, or Air Force,
- Survivor duty status was active,
- Survivor was in the military at the time of the incident, and
- Date of incident occurred in the July 1, 2015 through October 31, 2016 timeframe.

[1] For more information regarding the sampling and weighting of the 2016 WGRA, please refer to 2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report (Report No. ).
Based on these definitions, the DSAID file provided by SAPRO contained a 12-month average of 3,097 restricted and unrestricted reports of sexual assault within the DoD Services. Table 2 summarizes sexual assault reports, by Service and type of report.

Table 2. Summary of Sexual Assault Reports in DSAID, by Type of Report and Service

Service        Restricted   Unrestricted   Total
Army           181          1,025          1,206
Navy           ...          ...            ...
Marine Corps   ...          ...            ...
Air Force      ...          ...            ...
Total          608          2,489          3,097

Note: DSAID reports are 12-month averages from July 1, 2015 to October 31, 2016.

OPA used the following criteria from the survey to compare the information provided by respondents on the 2016 WGRA to actual reports in DSAID:

- Respondent indicated "Yes" to having reported the sexual assault (Q129),
- Respondent indicated the report was restricted or unrestricted (Q131),[2] and
- Respondent confirmed the sexual assault occurred in the past 12 months (Q167).

There were 448 DoD complete eligible respondents from the survey who indicated in Q129 that they had reported a sexual assault (SAMILREPT = Yes). Of the 448 respondents, 389 indicated the sexual assault occurred in the last 12 months (Q167) and that they filed either a restricted or unrestricted report.[3] The weighted estimate based on these 389 respondents is 3,145, compared with the 3,097 average cases from DSAID during a 12-month period. The confidence interval for the survey estimate ranged from a lower bound of 2,847 to an upper bound of 3,443 and contains the average of 3,097 cases reported in DSAID. The estimate and the actual average are very close, differing by only 2%.

Table 3 shows the 2016 WGRA number of reported sexual assaults, weighted estimates with 95 percent confidence intervals, and DSAID reported cases. For each Service, the 2016 WGRA confidence intervals contain the number of DSAID cases.
For example, OPA estimates 1,259 Army reported cases of sexual assault, with a 95 percent confidence interval ranging from 1,056 to 1,462, whereas the number in DSAID for Army is 1,206.

[2] For purposes of this analysis, those who indicated they were unsure were grouped with the unrestricted reports.

[3] This included not only those who indicated the event definitely occurred within the last year (SA12MOS = "Definitely occurred after X date") but also those who were not sure whether it was within the last year (SA12MOS = "Not sure if it occurred before or after X date"). In addition, it includes not only those who indicated that they filed a restricted or unrestricted report (SAREPTYPE = "A restricted report" or "An unrestricted report") but also those who were unsure of the type of report filed (SAREPTYPE = "Unsure what type of report was made").
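The containment check used in this section can be expressed compactly. The sketch below (illustrative only, not OPA code) takes the published DoD-level numbers, backs an implied standard error out of the interval under a normal approximation, and verifies that the interval contains the DSAID average.

```python
# Containment check for the DoD-total estimate quoted above (Table 3 values).
estimate = 3145                 # weighted survey estimate of filed reports
lower, upper = 2847, 3443       # published 95% confidence interval
dsaid_average = 3097            # 12-month average of DSAID reports

# Implied standard error under a normal approximation (for reference only;
# the production SE would come from the survey design, not from the CI).
se = (upper - lower) / (2 * 1.96)

contains = lower <= dsaid_average <= upper
relative_diff = abs(estimate - dsaid_average) / dsaid_average

print(contains, f"{relative_diff:.1%}", round(se))   # → True 1.5% 152
```

The same check, repeated per Service, reproduces the conclusion in Table 3 that every Service-level interval covers its DSAID count.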
RAND performed a similar analysis for the active duty in the 2014 RMWS and found that the survey estimates were systematically lower than the true DSAID number. This led RAND to conclude that sexual assault survivors who report are less likely to respond to the survey, and that survey estimates may consequently be too low. It is unclear why RAND and OPA found different results when comparing weighted survey estimates to DSAID administrative data.

Table 3. 2016 WGRA Estimates vs. Actual Number of Reported Sexual Assaults

Service        Respondents Who   Weighted Estimate   95% CI Lower   95% CI Upper   Average Reports
               Filed a Report    of WGRA Reports     Bound          Bound          in DSAID
Army           127               1,259               1,056          1,462          1,206
Navy           ...               ...                 ...            ...            ...
Marine Corps   ...               ...                 ...            ...            ...
Air Force      ...               ...                 ...            ...            ...
Total          389               3,145               2,847          3,443          3,097

Note: The number of reports in DSAID is the average of the five 12-month totals within DSAID that correspond with the 2016 WGRA reference period for reporting sexual assaults, as described previously.

Summary

This NRB section assessed whether the 2016 WGRA survey estimates were similar to administrative data (cases in SAPRO's DSAID database). Differences between survey estimates and administrative data could be an indication of survey error, perhaps caused by nonresponse bias.[4] However, the survey estimates and the actual number of sexual assault survivors who made an unrestricted or restricted report in DSAID were very similar for each Service, and the confidence intervals contained the DSAID number in all cases. From this analysis, OPA concludes there is no evidence of NRB in sexual assault estimates.

[4] It is important to note that, while likely accurate, administrative data do contain errors.
Section 2: Analysis of OPA's Survey of Nonrespondents

If survey respondents and nonrespondents have different sexual assault propensities that cannot be accounted for during survey weighting, the result is biased survey estimates of sexual assault. OPA conducted a nonresponse study (2016 WGRA-N) based on a sample of nonrespondents from the original survey as another method to assess WGRA NRB. The 2016 WGRA-N was a much shorter web survey with only email notifications (the 2016 WGRA had one postal notification, 10 reminders, and a paper form).

The 2016 WGRA-N served two purposes. First, the 2016 WGRA-N assessed the 2016 WGRA and generated constructive feedback for future iterations of the survey. Second, the survey evaluated NRB by comparing the responses from the follow-up survey to the original survey. In particular, eight questions were asked on both the 2016 WGRA and 2016 WGRA-N to assess NRB. These eight questions can be split into two groups: control questions and Military Equal Opportunity (MEO) questions. The control questions cover topics such as retention, health, and assessment of DoD policy toward gender-related topics. Table 4 shows the four control questions.

Table 4. 2016 WGRA-N Comparison Questions: Control Questions

The four MEO questions are displayed in Table 5.
Table 5. 2016 WGRA-N Comparison Questions: MEO Questions

If estimates from either set of these matching questions (control or MEO) were significantly different, that could be indicative of the presence of NRB in these questions (and potentially in other correlated questions, such as the sexual assault rate).

For the 2016 WGRA, OPA selected a sample of 696,329 from the 1,291,357 DoD active duty members. There were 378,... 2016 WGRA DoD nonrespondents (SAMP_DC = 11), and OPA selected a sample of 100,811 for the 2016 WGRA-N. The sample was selected using the same sampling strata as the 2016 WGRA. As with the 2016 WGRA survey, OPA created sample disposition codes using the criteria shown in Table 6. The table shows that 2,864 of the 2016 WGRA-N sample were considered complete eligible respondents based on standard OPA criteria.
Table 6. Sample Disposition Codes for 2016 WGRA-N

Code   Condition                                                            Sample Size
1      Record ineligibles: OPA identified members who separated from
       active duty using the April Defense Enrollment Eligibility
       Reporting System (DEERS) Medical Point-in-Time Extract (PITE)        8,936 (8.9%)
2      The sampled member or a proxy reported that the member was
       ineligible for reasons such as "Separated," "Retired," etc.          2 (0.002%)
3      The sampled member was determined to be ineligible based on their
       response to Q1 of the survey questionnaire ("Were you on active
       duty on [OPEN DATE]?")                                               1 (0.001%)
4      Complete eligibles: respondent was eligible and completed 50% or
       more of the questions                                                2,864 (2.8%)
5      Incomplete eligibles: respondent was eligible but failed to
       complete 50% or more of the questions                                265 (0.263%)
8      Survey refused for reasons such as "too long,"
       "inappropriate/intrusive," "refused additional emails," etc.         238 (0.236%)
10     Postal non-deliverable or original address non-locatable             5,215 (5.2%)
11     Nonrespondents: all others                                           83,290 (82.6%)
       Total                                                                100,811 (100%)

Table 7 shows population, sample size, respondents, and response rates by key domains (e.g., gender) for both the 2016 WGRA and 2016 WGRA-N. The weighted response rates for both surveys follow similar patterns, although rates are much lower for the 2016 WGRA-N. For example, Air Force and female members were the highest responders for both the 2016 WGRA (34.9% and 28.4%) and the 2016 WGRA-N (6.9% and 4.9%). It is important to highlight the difficulty of getting initial nonresponders to respond, and the lower response rates in the WGRA-N were expected. Nevertheless, OPA cautions against drawing strong conclusions from a survey with a 4% response rate.
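The weighted response rates reported in Table 7 follow the usual construction: weighted completes over weighted eligibles. A minimal sketch (hypothetical weights and dispositions using the Table 6 codes; not OPA code):

```python
# Weighted response rate sketch using Table 6-style disposition codes.
cases = [
    # (sampling_weight, disposition): 4 = complete eligible,
    # 5 = incomplete eligible, 11 = nonrespondent, 1 = record ineligible
    (10.0, 4), (12.0, 11), (8.0, 5), (10.0, 11), (9.0, 1), (11.0, 4),
]

ELIGIBLE = {4, 5, 11}   # simplification: record ineligibles (code 1) excluded
COMPLETE = {4}

complete_wt = sum(w for w, dc in cases if dc in COMPLETE)
eligible_wt = sum(w for w, dc in cases if dc in ELIGIBLE)

response_rate = complete_wt / eligible_wt
print(f"{response_rate:.1%}")   # → 41.2%
```

How refusals and non-locatable addresses (codes 8 and 10) enter the denominator depends on the response-rate standard applied, which this sketch does not attempt to reproduce.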
Table 7. Comparison of 2016 WGRA Sample With Nonresponse Sample (2016 WGRA-N)

                         2016 WGRA                                2016 WGRA-N
Domain         Population    Sample     Eligible    Response     Sample     Eligible    Response
                             Size       Responses   Rate         Size       Responses   Rate
Full Sample    1,291,357     696,329    ...         ...%         100,811    2,864       ...%
Army           474,...       ...,584    44,...      ...%         45,856     1,...       ...%
Navy           324,...       ...,326    28,...      ...%         23,...     ...         ...%
Marine Corps   183,...       ...,936    14,...      ...%         17,...     ...         ...%
Air Force      308,...       ...,483    44,...      ...%         13,...     ...         ...%
Male           1,089,...     ...,197    93,...      ...%         81,217     2,...       ...%
Female         202,...       ...,132    39,...      ...%         19,...     ...         ...%
E1-E4          565,...       ...,995    38,...      ...%         67,703     1,...       ...%
E5-E9          498,...       ...,162    60,...      ...%         23,870     1,...       ...%
W1-O3          145,830       60,069     19,...      ...%         6,...      ...         ...%
O4-O6          81,122        27,103     12,...      ...%         2,...      ...         ...%

Note: Both sets of response rates are weighted by sampling weights.

Weighting the 2016 WGRA-N

Because 2016 WGRA respondents were not eligible for the 2016 WGRA-N, it is not a probability sample of all active duty DoD members, and there is no method to create base weights that represent the full active duty population. However, weights that approximately represent the active duty population can be constructed by raking.[5] OPA weighted the 2016 WGRA-N by raking complete eligibles to population totals using the following seven dimensions:

- Gender (male, female),
- Service (Army, Navy, Marine Corps, Air Force),
- Three different paygrade groupings: a 2-level grouping (enlisted, officer); a 5-level grouping (E1-E4, E5-E9, W1-W5, O1-O3, O4-O6); and a 7-level grouping (E1-E3, E4, E5-E6, E7-E9, W1-W5, O1-O3, O4-O6),
- Deployed in the last 12 months (yes, no), and
- Family status (single with children, single without children, married with children, married without children).

[5] Raking, otherwise known as iterative proportional fitting, is a method for adjusting weights to match known population characteristics.
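Raking itself is a small algorithm: cycle through the dimensions, scaling weights so each dimension's weighted totals match its population margins, until the adjustments stabilize. A self-contained sketch with two hypothetical dimensions (not the seven used for the WGRA-N, and not OPA's program):

```python
# Raking (iterative proportional fitting) sketch; margins are hypothetical.
def rake(records, weights, margins, n_iter=50):
    """records: dicts of category labels; margins: {dimension: {level: total}}."""
    w = list(weights)
    for _ in range(n_iter):
        for dim, targets in margins.items():
            # Current weighted total within each level of this dimension.
            current = {level: 0.0 for level in targets}
            for rec, wi in zip(records, w):
                current[rec[dim]] += wi
            # Rescale so this dimension's totals match the targets.
            w = [wi * targets[rec[dim]] / current[rec[dim]]
                 for rec, wi in zip(records, w)]
    return w

records = [
    {"gender": "M", "service": "Army"},
    {"gender": "F", "service": "Army"},
    {"gender": "M", "service": "Navy"},
    {"gender": "F", "service": "Navy"},
]
margins = {
    "gender": {"M": 84.0, "F": 16.0},
    "service": {"Army": 60.0, "Navy": 40.0},
}
weights = rake(records, [1.0] * len(records), margins)

army_total = sum(w for r, w in zip(records, weights) if r["service"] == "Army")
print(round(sum(weights), 6), round(army_total, 6))
```

After convergence, every margin matches its target simultaneously, which is what lets the raked WGRA-N weights approximately represent the active duty population despite the absence of true base weights.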
Population totals for these raking dimensions were determined from the total 2016 WGRA sample population. OPA computed final weights based on the raking adjustment. Table 8 shows standard moments for the final rake-adjusted weights.

Table 8. WGRA-N Final Weight Moments

Moment               Final Weight
Mean                 ...
Standard Deviation   ...
Max                  1,...
Percentiles          ...
Min                  99.6

Table 9 shows the estimates and corresponding margins of error for the four control questions that overlapped between the two surveys. The margins of error for the 2016 WGRA are small, whereas they are larger for the WGRA-N because there are fewer respondents. The 2016 WGRA estimated that 62.8% of active duty members indicated they were likely or very likely to stay on active duty, compared with 64.4% on the 2016 WGRA-N. OPA judged these estimates to be similar but did not compute statistical significance, because the WGRA-N has a very low response rate and is itself subject to NRB (e.g., WGRA-N respondents may not represent a random sample of 2016 WGRA nonrespondents). Estimates for the health question and the sexual harassment questions show larger differences. WGRA-N respondents (58.3%) were less likely than WGRA respondents (67.6%) to report their health was good/excellent (Q23); however, for Q24 and Q25, WGRA-N respondents were less likely to indicate that the DoD climate was more of a problem for sexual harassment (WGRA-N, 6.6%; WGRA, 8.1%) and assault (WGRA-N, 7.5%; WGRA, 8.3%).
Table 9. Comparison of WGRA Survey With Nonresponse Study Control Questions

NRB Q   WGRA Q   Question                                                     2016 WGRA    2016 WGRA-N
Q22     Q194     RETENTION: Assuming you could stay [in the Active Duty],
                 how likely is it you would choose to do so?
                 (% saying Likely or Very Likely)                             62.8% ± ...  64.4% ± ...
Q23     Q195     HEALTH: In general, would you say your health is?
                 (% saying Good or Excellent)                                 67.6% ± ...  58.3% ± ...
Q24     Q209     HOW ARE WE DOING: In your opinion, has sexual harassment
                 in the military become more or less of a problem over the
                 last 2 years? (% saying More of a Problem)                    8.1% ± ...   6.6% ± ...
Q25     Q210     HOW ARE WE DOING: In your opinion, has sexual assault in
                 the military become more or less of a problem over the
                 last 2 years? (% saying More of a Problem)                    8.3% ± ...   7.5% ± 1.2

Table 10 shows that the estimates for the MEO questions on the WGRA-N differed from the 2016 WGRA. For example, WGRA-N estimates of hostile work environment (Q29) and gender discrimination (Q31) are both several percentage points higher than the 2016 WGRA: 5.7% of 2016 WGRA respondents indicated they heard someone make a gender-discriminatory comment, compared with 10.6% on the WGRA-N. In contrast, Q30 and Q32 estimates are similar across the two studies.

Table 10. Comparison of WGRA Survey With Nonresponse Study MEO Questions

NRB Q   WGRA Q   Question                                                     2016 WGRA    2016 WGRA-N
Q29     Q23      Since [X Date], did you hear someone from work say that
                 [men] [women] are not as good as [women] [men] at your
                 particular job, or that [men] [women] should be prevented
                 from having your job?                                         5.7% ± ...  10.6% ± ...
Q30     Q46      Do you think their beliefs about [men] [women] ever harmed
                 or limited your career? For example, did they hurt your
                 evaluation/fitness report, affect your chances of
                 promotion or your next assignment?                            ... ± ...    ... ± ...
Q31     Q24      Since [X Date], do you think someone from work mistreated,
                 ignored, excluded, or insulted you because you are a [man]
                 [woman]?                                                      ... ± ...    ... ± ...
Q32     Q47      Do you think this treatment ever harmed or limited your
                 career? For example, did it hurt your evaluation/fitness
                 report, affect your chances of promotion or your next
                 assignment?                                                   ... ± ...    ... ± ...
Summary

The purpose of this NRB analysis was to compare estimates from eight questions (four control questions and four MEO questions) asked identically on the 2016 WGRA-N and 2016 WGRA in order to assess NRB. If estimates were substantively and statistically different, this could be evidence of NRB in the 2016 WGRA estimates. Comparing the WGRA-N and 2016 WGRA is difficult because there is no clear pattern showing that 2016 WGRA respondents are more positive or negative. To summarize, three questions show WGRA-N respondents as more negative (Q23, Q29, Q31), three questions are similar (Q22, Q30, Q32), and for two questions WGRA-N respondents are more positive (Q24 and Q25). It appears WGRA-N respondents report more gender issues in Q29 and Q31, but at the same time hold a more positive view that the military is improving on these issues. Thus, these results could indicate NRB, but it is difficult to determine its direction (e.g., whether the 2016 WGRA may under- or overestimate correlates of the sexual harassment and sexual assault rates). While OPA would have hoped that all differences in estimates were within margins of error, this did not occur. NRB can vary question by question, and the combination of MEO and control questions might not effectively represent NRB on sexual assault questions. Finally, the WGRA-N survey was a non-probability sample with a very low response rate (under 5%), and OPA recommends cautious interpretation of this study.
Section 3: Evaluate the Sensitivity of Different Post-Survey Adjustments (Weighting Methods) on Survey Estimates

Production weights for the 2016 WGRA were produced by first developing models that account for each member's propensity of experiencing unwanted sexual behaviors, and then using those estimated propensities throughout the weighting process. This method is consistent with RAND's approach for the 2014 RMWS and the OPA-Westat approach for the 2015 WGRR (hereafter called "2-Stage Boosted") but differs from how OPA conducts survey weighting for non-gender-related surveys. For this study, OPA independently developed a set of weights using standard OPA methods to assess the effects of different weighting approaches on survey estimates. This section uses the OPA weights as a validity check to determine whether large differences in the weights exist, and whether any such differences lead to more or less NRB in survey estimates.

OPA Weighting Methodology

OPA's standard weighting procedures have many similarities to the methods recently used by RAND (2014) and OPA-Westat (2015). Both approaches estimate response propensities and make weighting adjustments based on the inverse of those propensities. However, there are two key differences. First, RAND and OPA-Westat used gradient-boosted decision trees (GBM and xgboost) to estimate the propensities, while standard OPA weighting uses single Chi-squared Automatic Interaction Detection classification trees (CHAID). Second, RAND and OPA-Westat first estimate propensities for several sexual assault characteristics, and then use those estimated propensities to predict survey response. Standard OPA weighting skips this step and directly models survey eligibility and response propensities.
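Both families of methods reduce, at each stage, to the same operation: estimate a response (or eligibility) propensity for each sample member and multiply respondent weights by its inverse. With a tree, the fitted propensity is constant within each terminal node, so the adjustment is a weighting-class adjustment. A toy sketch of that operation (hypothetical cells and weights; the tree fitting itself is omitted, and this is not OPA's code):

```python
# Weighting-class (inverse-propensity) nonresponse adjustment sketch.
cases = [
    # (weighting_cell, base_weight, responded) -- cells stand in for the
    # terminal nodes of a propensity tree (e.g., CHAID).
    ("jr_enlisted", 2.0, False), ("jr_enlisted", 2.0, True),
    ("jr_enlisted", 2.0, False), ("jr_enlisted", 2.0, True),
    ("officer", 1.5, True), ("officer", 1.5, True), ("officer", 1.5, False),
]

# Estimated propensity per cell = weighted response rate within the cell.
cells = {c for c, _, _ in cases}
propensity = {
    c: sum(w for cc, w, r in cases if cc == c and r)
       / sum(w for cc, w, _ in cases if cc == c)
    for c in cells
}

# Respondents inherit base_weight / propensity; nonrespondents get zero.
adjusted = [w / propensity[c] if r else 0.0 for c, w, r in cases]

# The adjustment redistributes, rather than creates, weight: each cell's
# weighted total is preserved.
print(round(sum(adjusted), 9), sum(w for _, w, _ in cases))
```

The boosted and CHAID variants compared in this section differ in how the propensities are estimated, not in how the resulting adjustment factors are applied.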
In addition to the production weights described in 2016 Department of Defense Workplace and Gender Relations Survey of Active Duty Members: Statistical Methods Report, OPA created a second set of weights for the 2016 WGRA using the following three steps:

Step 1: Adjust weights for nonresponse based on eligibility. Transfer the weight of the 568,191 nonrespondents (SAMP_DC = 8, 9, 10, 11) to the 157,891 cases with known eligibility (SAMP_DC = 2, 3, 4, 5). A decision tree technique based on Chi-square tests was used to determine the probability of eligibility for the survey (known eligibility vs. unknown eligibility). Weighting adjustment factors for eligibility were computed as the inverse of the logistic model-predicted probabilities. The model was weighted using the sampling weight (base weight). Predictors in the eligibility model were the same variables as in the 2-Stage production weights, with the exception of the model-predicted probabilities for unwanted sexual behaviors.

Step 2: Adjust weights for survey completion. Transfer the eligibility weight (created in Step 1) of the 5,603 incomplete survey responses (SAMP_DC = 5) to the 151,010 complete eligible respondents (SAMP_DC = 4). Weighting adjustments for completion use the same methodology as Step 1 (CHAID and logistic model).
Step 3: Create final weights. The weights were raked to match population totals and to reduce variance and bias unaccounted for by the previous weighting adjustments. OPA calculated the final weight as the product of the adjustment factors in Steps 1, 2, and 3. The raking process followed the exact same steps and used the same program as the 2016 WGRA production weights.

Comparison of Adjustment Stages and Final Weights

Table 11 compares the standard OPA methodology and the 2-Stage Boosted method (2014 RAND, 2015 OPA-Westat) for each of the weight adjustments discussed in Steps 1 through 3: eligibility, completion, and raking. The comparison shows the univariate distribution of each weighting adjustment factor. The results indicate that, in aggregate, both methods have very similar univariate distributions across adjustment Steps 1 through 3. Although the OPA method carries slightly more variance, all adjustments outside of the tails convey the same adjustments in both cases.

Table 11. Comparison Between Adjustment Factors: Standard OPA vs. 2-Stage Boosted Method for Eligibility, Completion, and Poststratification Adjustments

                     Standard OPA (Direct Method)       2-Stage Boosted Method (Final Weights)
Statistic            Eligibility  Completion  Raking    Eligibility  Completion  Raking
Mean                 ...          ...         ...       ...          ...         ...
Standard Deviation   ...          ...         ...       ...          ...         ...
Max                  ...          ...         ...       ...          ...         ...
Q3                   ...          ...         ...       ...          ...         ...
Median               ...          ...         ...       ...          ...         ...
Q1                   ...          ...         ...       ...          ...         ...
Min                  ...          ...         ...       ...          ...         ...

Table 12 extends the comparison of the OPA standard and 2-Stage Boosted methods by showing the distribution of final weights. The final weight takes into account all of the previous weighting adjustments. OPA sees somewhat erratic behavior at the tails for the maximum weight value, but the distributions are very close in most of the other quantiles. OPA concludes that overall both methods produce similar distributions of survey weights.

Table 12. Comparison Between Standard OPA and 2-Stage Boosted Final Weights

Moment               Standard OPA   2-Stage Boosted
Mean                 ...            ...
Standard Deviation   ...            ...
Max                  ...            ...
Q3                   ...            ...
Median               ...            ...
Q1                   ...            ...
Min                  ...            ...

Comparison of Key Estimates

Finally, differences in weighted survey estimates for sexual assault based on the 2-Stage Boosted and OPA standard weighting methods are compared in Table 13 and Table 14. Each table shows seven estimates associated with sexual assault and sexual harassment, for males and females respectively. For example, the estimate of any sexual assault occurring among females was 5.3% using OPA standard methods and 5.4% using the 2-Stage Boosted weighting approach (Table 14). All comparisons are nearly identical for both weighting approaches.
Table 13. Comparison of OPA and Westat Key Survey Estimates (Males)

Question                                       Variable    Standard OPA    2-Stage Boosted
Sexual Quid Pro Quo                            QPQ         0.3% ± …        … ± 0.04
Sexual Assault-Penetrative                     SA_PEN      0.3% ± …        … ± 0.04
Sexual Assault-Any Type                        SA_RATE     0.8% ± …        … ± 0.1
Sexual Assault-Attempted Touch                 SA_TOUCH    0.5% ± …        … ± 0.1
Gender Discrimination                          SDISC       2.0% ± …        … ± 0.1
Sexual Harassment                              SEXHAR      5.7% ± …        … ± 0.2
Sexual Assault Rate Adjusted for Telescoping   SA_R_ADJ    0.8% ± …        … ± 0.1

Overall, the other estimates are nearly identical as well, and all of the confidence intervals for the two weighting methods overlap.

Table 14. Comparison of OPA and Westat Key Survey Estimates (Females)

Question                                       Variable    Standard OPA    2-Stage Boosted
Sexual Quid Pro Quo                            QPQ         2.0% ± …        … ± 0.2
Sexual Assault-Penetrative                     SA_PEN      2.2% ± …        … ± 0.2
Sexual Assault-Any Type                        SA_RATE     5.3% ± …        5.4% ± 0.3
Sexual Assault-Attempted Touch                 SA_TOUCH    3.0% ± …        … ± 0.2
Gender Discrimination                          SDISC       13.6% ± …       … ± 0.4
Sexual Harassment                              SEXHAR      20.8% ± …       … ± 0.5
Sexual Assault Rate Adjusted for Telescoping   SA_R_ADJ    4.9% ± …        … ± 0.3

Finally, although the results are not displayed here, OPA also replaced the CHAID decision tree methodology used for the direct estimates with two alternative methods: (1) recursive partitioning trees (rpart) in R and (2) xgboost in R. Both alternatives produced results similar to those of the 2-Stage Boosted and direct-estimate CHAID approaches.

Summary

The direct OPA and 2-Stage Boosted weighting methods were carried out independently, with different software (the OPA standard weights used SAS and SPSS; the 2-Stage Boosted weights used R and SAS) and different methodology, yet the results are strikingly similar across all of the intermediate weighting steps, the final weights, and the key estimates. In addition, all of the confidence intervals for the two weighting methods overlap, and none of the differences between estimates are statistically significant. This consistency is particularly notable given the differences between the methods. As mentioned earlier, the 2-Stage Boosted method was designed to reduce variance in the estimates, and Table 13 and Table 14 show that this goal was achieved. In conclusion, the 2016 WGRA estimates of sexual assault and sexual harassment are very robust to the choice of weighting method. OPA conducted the same comparisons for the 2015 WGRR and found similar results (DMDC, 2016a). Note that this study does not assess the level of NRB in 2016 WGRA estimates; rather, it assesses whether different weighting methods differentially alter the level of NRB. OPA concludes that the choice of weighting method does not substantially alter the level of NRB in 2016 WGRA estimates.
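The weighting-class nonresponse adjustment at the heart of these comparisons works the same way regardless of which tree algorithm (CHAID, rpart, or xgboost) defines the adjustment cells: each respondent's weight is inflated by the inverse of the weighted response rate within their cell, so respondents carry the weight of their cell's nonrespondents. The sketch below is illustrative only, with made-up cells and weights rather than OPA's actual implementation.

```python
from collections import defaultdict

# Hypothetical sampled members: (cell_id, base_weight, responded).
# cell_id stands in for a leaf of the CHAID/rpart/xgboost tree.
sample = [
    ("cell_A", 10.0, True), ("cell_A", 10.0, False), ("cell_A", 12.0, True),
    ("cell_B", 8.0, True), ("cell_B", 8.0, True), ("cell_B", 8.0, False),
]

# Accumulate weighted response rates within each adjustment cell
totals = defaultdict(lambda: [0.0, 0.0])  # cell -> [responding weight, total weight]
for cell, w, resp in sample:
    totals[cell][1] += w
    if resp:
        totals[cell][0] += w

rates = {cell: r / t for cell, (r, t) in totals.items()}

# Respondents' weights are divided by the cell response rate, so the
# adjusted respondent weights sum to the cell's total sampled weight
adjusted = [(cell, w / rates[cell]) for cell, w, resp in sample if resp]
for cell, w in adjusted:
    print(cell, round(w, 2))
```

The invariant worth checking in any implementation is that, within each cell, the adjusted respondent weights sum to the total weight of all sampled members; that is what makes the comparison of alternative cell-forming algorithms (CHAID vs. rpart vs. xgboost) a comparison of cell definitions only, not of the adjustment mechanics.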
References

DMDC. (2016a). Workplace and gender relations survey of reserve component members: Statistical methodology report (Report No. ). Alexandria, VA: Author.

Little, R. J., & Rubin, D. B. (2002). Statistical analysis with missing data (2nd ed.). New York: John Wiley & Sons, Inc. doi: /

Montaquila, J. M., & Olson, K. M. (2012). Practical tools for nonresponse bias studies. Retrieved from

Morral, A. R., Gore, K. L., & Schell, T. L. (Eds.). (2015). Sexual assault and sexual harassment in the U.S. military: Volume 4. Investigations of potential bias in estimates from the 2014 RAND Military Workplace Study (No. RR-870/2-OSD). Santa Monica, CA: RAND Corporation.

OPA. (2016a). Workplace and gender relations survey of active duty members: Statistical methodology report (Report No. ). Alexandria, VA: Author.