mailed at the beginning of the school year.5 Next, schools were telephoned using a computer-assisted telephone interviewing (CATI) instrument to verify school information, establish a survey coordinator (who became the main contact person at the school for subsequent communication), and follow up on the Teacher Listing Form. Teacher questionnaires were mailed to schools on a flow basis as teachers were sampled from the data provided on the Teacher Listing Form. The field follow-up period was preceded by phone calls from the telephone centers reminding the survey coordinator to have staff complete and return all forms. Individual survey respondents (the principal, the librarian, and teachers) were also called from the telephone centers and asked to complete their questionnaires by phone. Data collection ended in June 2008.

Data Processing and Imputation

The Census Bureau used both central processing and headquarters staff to check returned questionnaires, capture data, and implement quality control procedures. Questionnaires that had a preliminary classification of a complete interview were submitted to a series of computer edits consisting of a range check, a consistency edit, and a blanking edit.6 After these edits were run and reviewed by analysts, the records were put through another edit to make a final determination as to whether the case was eligible for the survey and whether sufficient data had been collected for the case to be classified as a complete interview.
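To make the three edit types concrete, the following is a minimal Python sketch of how such edits operate on a questionnaire record. The item names, acceptable ranges, and consistency rules here are hypothetical illustrations only; the report does not give the actual SASS edit specifications.

```python
# Hypothetical sketch of the three computer edits described above.
# Item names, ranges, and rules are illustrative, not the SASS specifications.

NOT_ANSWERED = None  # stand-in for a "not-answered" value

def range_check(record):
    """Blank any value that falls outside the acceptable range for an item."""
    yt = record.get("years_teaching")
    if yt is not None and not 0 <= yt <= 60:
        record["years_teaching"] = NOT_ANSWERED

def consistency_edit(record):
    """Blank items that contradict other items in the same questionnaire."""
    yt, age = record.get("years_teaching"), record.get("age")
    if yt is not None and age is not None and yt > age - 18:
        record["years_teaching"] = NOT_ANSWERED

def blanking_edit(record):
    """Delete answers to items a skip pattern should have left blank."""
    if record.get("has_library") == "no":
        record["library_volumes"] = NOT_ANSWERED

record = {"years_teaching": 70, "age": 40,
          "has_library": "no", "library_volumes": 5000}
for edit in (range_check, consistency_edit, blanking_edit):
    edit(record)
print(record)
# {'years_teaching': None, 'age': 40, 'has_library': 'no', 'library_volumes': None}
```

The point of the sketch is the effect, not the specific rules: a value that fails any edit is blanked to "not-answered," leaving it to be handled by the imputation process described next.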

After the final edits were run, some items still had “not-answered” values. These values were imputed using a two-stage process. In the first stage, items were imputed with a valid response using data from the sampling frame, from other items in the same SASS questionnaire, or from another questionnaire associated with the same school or school district. In addition, some data were ratio adjusted so that related items were consistent with one another. In the second stage, donor-respondent methods, such as hot-deck imputation, were used. If no suitable donor case could be matched, the few remaining items were imputed with a mean or mode from groups of similar cases. After each stage of imputation, the computer edits were run again to verify that the imputed data were consistent with the existing questionnaire data. If an imputed value was inconsistent with other data in the same questionnaire or fell outside the range of acceptable values, the edits blanked it out, and Census Bureau analysts then examined the item and tried to determine an appropriate value. Imputation flags, indicating which imputation method was used for each item, were included in the data files.
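The second-stage donor imputation can be sketched in the same spirit. Matching donor cases on school level and an enrollment band is an assumption made here for illustration; the report does not list the actual SASS matching variables, and real hot-deck procedures select donors more carefully than this.

```python
# Hypothetical sketch of second-stage donor (hot-deck) imputation with the
# mean fallback described above. Matching variables are illustrative only.
from statistics import mean

def cell(rec):
    """Matching cell used to pair an incomplete record with similar donors."""
    return (rec["school_level"], rec["enrollment_band"])

def hot_deck_impute(records, item):
    # Collect reported values from complete cases (potential donors), by cell.
    donors = {}
    for rec in records:
        if rec.get(item) is not None:
            donors.setdefault(cell(rec), []).append(rec[item])
    overall = [v for vals in donors.values() for v in vals]
    for rec in records:
        if rec.get(item) is None:
            matched = donors.get(cell(rec))
            if matched:
                rec[item] = matched[-1]    # take a value from a matched donor
            elif overall:
                rec[item] = mean(overall)  # fallback: mean of similar cases
            # After imputation, the computer edits would be re-run to verify
            # that imputed values are consistent with the rest of the record.

records = [
    {"school_level": "elementary", "enrollment_band": "small", "ft_teachers": 12},
    {"school_level": "elementary", "enrollment_band": "small", "ft_teachers": 16},
    {"school_level": "elementary", "enrollment_band": "small", "ft_teachers": None},
    {"school_level": "secondary", "enrollment_band": "large", "ft_teachers": None},
]
hot_deck_impute(records, "ft_teachers")
print([r["ft_teachers"] for r in records])  # [12, 16, 16, 14]
```

The fallback to a group mean mirrors the report's statement that the few items with no suitable donor were imputed with a mean or mode from groups of similar cases.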

5 The SASS school package contained a cover letter to the principal, a cover letter to the survey coordinator, the Teacher Listing Form, the Public School Principal Questionnaire/Private School Principal Questionnaire, the Public School Questionnaire/Public School Questionnaire (With District Items)/Private School Questionnaire, the School Library Media Center Questionnaire (for public and BIE-funded schools only), postage-paid return envelopes, an NCES pamphlet detailing general information about SASS, an NCES brochure detailing some of the findings from the 2003–04 SASS, and the Statistical Abstract of the United States: 2007 CD.

6 Blanking edits delete answers to questions that should not have been filled in (e.g., if a respondent followed a wrong skip pattern).

