A Curriculum Management Audit

  • “All teachers don’t feel responsible for the WASL—only core teachers.” (teacher)

  • “We need to be further along with classroom assessments.” (central office administrator)

  • “We can’t connect the results from the WASL with teacher grades or make comparisons with

knowledge mastery.” (administrator)

  • “Because we don’t have test scores, we have to rely on running records and observation. (We have) formal structures to communicate and collaborate, but informal procedures (for assessment).” (principal)

  • “I am not concerned about test scores. Kids are successful. We teach kids, not test-takers.”

  • “So far, the exams that the 8th graders take, we don’t get the results at the high school. I’m hoping that we will soon get the results.” (teacher)

  • “I don’t understand why we would have improvement models and no assessment piece to see if kids are benefiting from the model.” (parent)

  • “I get very little information on the (special education) students that I work with, which makes it difficult for me to do my job.” (therapist)

Auditors reviewed a variety of documents that, according to policy and regulatory expectations and audit principles, ought to reflect the use of data in decision-making. Auditors were provided plans for 22 of 26 schools. Each lists as an initial goal the achievement of “target performance standards as measured by state assessments.” Auditors looked for implementation strategies demonstrating use of assessment data for curriculum management. Many of the plans included general descriptions of test data analysis to determine student needs. Some were more specific. For example, one school’s staff planned to “analyze areas of the test where (students) had the most trouble...focus attention on those areas and correlate the district curriculum.” Most school-site plans do not go into this detail. Auditors found no district plans describing program assessment or a means of continuing, modifying, or terminating programs based on assessment data.

Next, auditors reviewed a variety of recent reports, studies, program plans, and issues that reflect planning by the school system. Included below is a sample of some of the more recent planning activities and, in each case, a discussion of how assessment data are used effectively or ineffectively to guide decision-making:

  • A Business and Operations Plan (February 2000) gives no indication of use of data to set goals or means of evaluation of those goals.

  • The School Programs Department Plan 2000 - 2001 (revised 8/18/00) sets as a department goal “to influence school efforts to increase student achievement by implementing/accomplishing percentage [sic] of our strategies.” To do this, personnel will “collect and analyze data related to identified data point,” “develop a plan for sharing findings with schools,” and “work with schools to develop a response that prioritizes and addresses their most important findings.” However, neither a target percentage nor the means of evaluation were specified.

  • An Information Services Department Plan 2000 - 2001 (January 2000) sets as a training goal (goal 4) to “provide workshops to improve staff personal utility use, classroom instructional use, and Internet research use” of computers. Appropriately, the final action step is to “assess feedback on effectiveness and carry-through to classroom implementation.” There is no indication of use of hard data either to set the goals or to evaluate them.

  • A new Technology Plan 2001 - 2004 (January 2001) states “Clover Park School District is committed to ongoing evaluation of its programs in general, and to specific assessment strategies and technologies that provide timely data.” The authors identified the following four basic “questions in every significant educational initiative: what is working and should be continued?,

Clover Park School District Audit Report Page 95
