
Spelling measurement procedures. In Years 1 and 2, teachers administered two types of standard CBM spelling tests. One test (i.e., the graded test) incorporated words randomly selected from the student's instructional-level list of the Harris-Jacobson (1972) word lists.[2] The second test (i.e., the common test) included words randomly sampled from the Harris-Jacobson word lists across all six grade levels. On this common test, all students' tests were identical regardless of the students' grade level. For both graded and common tests, students took an alternate form of the test each week.
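As a rough illustration, weekly alternate forms of this kind can be produced by random sampling without replacement from the relevant word list. The sketch below shows only that idea and is not the authors' actual test-generation software; the word_pool argument, the 20-word default, and the use of Python's random module are assumptions.

import random

def make_alternate_form(word_pool, n_words=20, seed=None):
    # Draw n_words distinct words from the pool without replacement;
    # varying (or omitting) the seed each week yields a new alternate form.
    rng = random.Random(seed)
    return rng.sample(word_pool, n_words)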

On each standard CBM test, the teacher dictated 20 words at 7-second intervals (total test duration = 2 min, 20 sec); students wrote responses on lined, numbered paper. A coordinator periodically observed test administrations to monitor fidelity to standard procedures. A data-entry clerk entered every student's response to every test item into software. This software automatically scored performance as the number of correct letter sequences (LS) and words spelled correctly, and then stored and organized the test information. Agreement between the computerized and human scoring has been documented at 99% (see Fuchs, Fuchs, Hamlett, & Allinder, 1991).
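The letter-sequence scoring algorithm itself is not described in this paper. The sketch below assumes the common CBM definition of a correct letter sequence as a pair of adjacent letters, including the word boundaries, written in the correct order, and it uses a simple position-by-position comparison; the actual software presumably handles insertions and deletions more flexibly. The function name and boundary markers are illustrative assumptions.

def correct_letter_sequences(response: str, target: str) -> int:
    # Pad both spellings with boundary markers so the first and last
    # letters each anchor a sequence, then count adjacent pairs in the
    # response that match the target pair at the same position.
    r = "^" + response.strip().lower() + "$"
    t = "^" + target.strip().lower() + "$"
    return sum(1 for i in range(len(t) - 1) if r[i:i + 2] == t[i:i + 2])

Under this definition, a correctly spelled five-letter word such as "train" earns six correct letter sequences (^t, tr, ra, ai, in, n$).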

Math measurement procedures. During Years 1 and 2, teachers used a standard CBM math procedure; each test was an alternate form representing the type and proportion of problems in the grade-appropriate curriculum. Each CBM test comprised 25 problems, displayed in random order and incorporating randomly generated numerals. Within grade level, test time was held constant over the year: 45 sec at Grade 1, 1 min at Grade 2, 1.5 min at Grade 3, 3 min at Grade 4, 5 min at Grade 5, and 6 min at Grade 6. These times increased across grades because the maximum score, or the required number of operations performed and digits written, increased with grade. Teachers administered each test to the class according to standard directions. A coordinator periodically observed test administrations to monitor fidelity to standard procedures. A data-entry clerk entered every student's response to every test item into software, which automatically scored performance as the number of correct digits and correct problems in the answers, and stored and organized the test information. Perfect agreement between the computerized and human scoring has been documented (see Fuchs, Fuchs, Hamlett, & Stecker, 1991).
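The digit-scoring convention is likewise not spelled out here. A minimal sketch follows, assuming the common convention of crediting each digit of the student's answer that matches the correct answer in the same place-value position (i.e., aligned from the rightmost digit); the function name and the alignment rule are assumptions for illustration.

def correct_digits(response: str, answer: str) -> int:
    # Align the two answers from the rightmost character so digits are
    # credited only in the correct place-value position.
    r, a = response.strip()[::-1], answer.strip()[::-1]
    return sum(1 for x, y in zip(r, a) if x == y and x.isdigit())

For example, under this assumed rule a response of 128 to a problem whose correct answer is 148 would earn 2 correct digits.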

Data Analysis

Calculation of slope. The primary analysis involved the calculation of slope, or weekly rate of academic progress, for each participant on each measure: in reading, correct words on the oral passage measure and correct replacements on the maze measure; in spelling, correct letter sequences (LS) and correct words on both the graded and common measures; in math, correct digits and correct problems. Slope was calculated as in previous CBM research to permit comparison with earlier studies. Following standard CBM procedures, a least-squares regression was run for each student between scores and calendar days (i.e., a real-time analysis). The slope from this regression indicates the average increase in score for each subsequent calendar day. To conform to methods employed in previous CBM research, this calendar-day slope was converted to a weekly slope of improvement by multiplying by 7. Consequently, slope represents the average weekly increase in a student's CBM score across the school year.
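To make this computation concrete, the following minimal sketch fits the real-time regression and converts the per-day slope to a per-week slope. The function and variable names, the illustrative data, and the use of NumPy's polyfit are assumptions; only the regression of score on calendar day and the multiplication by 7 come from the procedure described above.

import numpy as np

def weekly_slope(calendar_days, scores):
    # Ordinary least-squares fit of score on calendar day (real time);
    # polyfit returns [slope, intercept] for deg=1.
    daily_slope, _intercept = np.polyfit(calendar_days, scores, deg=1)
    return 7 * daily_slope

# Hypothetical student with six probes on unevenly spaced calendar days
days = [0, 7, 14, 28, 35, 49]
scores = [34, 36, 39, 44, 46, 51]
print(weekly_slope(days, scores))  # average weekly gain in score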

Analysis of adequacy of linear relationship in modeling progress within 1 academic year. For a subset of up to 56[3] randomly selected students at each grade level (all of whom
