As indicated by these preliminary norms, therefore, an increase of .30 digits per week seems to represent a realistic CBM rate of improvement for Grades 1, 2, and 3; .45 digits for Grade 6; and .70 to .75 digits for Grades 4 and 5. Adding one standard deviation to the average slopes shown in Table 6, CBM practitioners may use the following targets to establish more ambitious weekly rates of growth for their students: at Grades 1, 2, and 3, .50; at Grade 6, 1.00; at Grades 4 and 5, 1.15 to 1.20.
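
To illustrate how a practitioner might apply these preliminary norms, the following minimal sketch projects a year-end goal as baseline score plus (weeks remaining x weekly slope). It is an illustration only, not a procedure described in the study: the baseline of 20 digits correct, the 30-week timeline, and the assignment of the .70/1.15 and .75/1.20 endpoints to Grade 4 and Grade 5, respectively, are all assumptions, since the text reports those values only as ranges for Grades 4 and 5 combined.

```python
# A minimal sketch (not from the original study) of projecting a CBM year-end goal
# from the preliminary weekly-slope norms reported above.
# Realistic slopes: average weekly gain in digits correct; ambitious slopes: average + 1 SD.
# Splitting the Grade 4-5 ranges (.70/.75 and 1.15/1.20) across the two grades is an
# assumption made here for illustration.

REALISTIC_SLOPE = {1: 0.30, 2: 0.30, 3: 0.30, 4: 0.70, 5: 0.75, 6: 0.45}
AMBITIOUS_SLOPE = {1: 0.50, 2: 0.50, 3: 0.50, 4: 1.15, 5: 1.20, 6: 1.00}


def year_end_goal(baseline_digits, grade, weeks, ambitious=False):
    """Project a goal as baseline + (weeks remaining * weekly slope)."""
    slope = (AMBITIOUS_SLOPE if ambitious else REALISTIC_SLOPE)[grade]
    return baseline_digits + weeks * slope


# Hypothetical Grade 4 student: baseline of 20 digits correct, 30 weeks remaining.
print(year_end_goal(20, 4, 30))                   # realistic: 20 + 30 * 0.70 = 41.0
print(year_end_goal(20, 4, 30, ambitious=True))   # ambitious: 20 + 30 * 1.15 = 54.5
```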

Although the number of problems correct typically is not employed in the ongoing measurement of progress within CBM methodology, some comment on this secondary score appears warranted. It is interesting to note that, except at Grade 1, slopes consistently were higher for digits than for problems. Comparable slopes for digits and problems at Grade 1 are expected because Grade 1 problems typically require only 1-digit answers. At higher grades, however, especially after Grade 3, the number of digits required per problem increases; hence the increasing discrepancies between slopes for digit and problem scores. These larger slopes for digits support the use of digits as the primary CBM datum: As Deno (1985) noted, higher slopes are a desirable feature of ongoing measurement systems because they make student growth easier to detect. When, due to higher slopes, student change (or lack thereof) becomes evident sooner, ongoing measurement systems are more useful for instructional decision making and more satisfying for pupils and teachers.
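
The slopes discussed throughout are weekly rates of change estimated from repeated measurements. As a rough sketch of how an observed slope might be computed for comparison with the norms above, the following code fits an ordinary least-squares line to a series of weekly digits-correct scores. The OLS approach and the example scores are assumptions for illustration; this excerpt does not state the study's exact slope-estimation procedure.

```python
# A minimal sketch (an assumption, not the study's stated procedure) of estimating a
# student's observed weekly CBM slope from successive weekly digits-correct scores,
# using an ordinary least-squares fit of score on week number.

def weekly_slope(scores):
    """Return the least-squares slope (digits gained per week) for weekly scores."""
    n = len(scores)
    weeks = range(1, n + 1)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den


# Hypothetical weekly digits-correct scores for one student (illustration only).
scores = [18, 19, 21, 20, 23, 24, 26, 27]
print(round(weekly_slope(scores), 2))  # observed weekly slope (about 1.31 here), to compare with the norms above
```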

Conclusions

Results must be interpreted in light of two serious limitations. First, because data were collected in one rural region, performance may be unrepresentative of more suburban or urban areas, of other regions of the country, or even of other rural districts in the same region. Future studies addressing the extent of variability across (and even within) districts should clarify whether a need exists to develop district- (or school-) specific norms for weekly rates of growth or whether norms based on large samples, representative of the greater population, will suffice. Second, because alternative materials from which to sample testing stimuli were not contrasted in this study, we do not know whether the findings would generalize to other curricula. Clearly, additional research investigating such comparisons is required. Pending future studies with other types of samples and other curricula, current norms for weekly rates of student improvement can be considered only preliminary.

Within the constraints of these limitations, the current study does provide important preliminary information (a) to help establish guidelines for formatively evaluating student progress when implementing CBM, (b) to provide a methodology to develop similar guidelines for other formative teaching systems and to measure and study student change, and (c) to increase our understanding of theoretical models of academic growth.

Establishing Guidelines for Formatively Evaluating Progress with CBM

Results provide estimates for designating realistic and appropriately ambitious weekly rates of growth when formatively evaluating student progress with CBM. These estimates, derived primarily from general education samples (but including small numbers of mainstreamed students with disabilities), can be used within both general and special education settings.
