Wednesday, September 14, 2016

The State Report Card & Their Flawed Assessments

Tomorrow, September 15th, is grade card day for Ohio schools.  The release day for the State Report Card is a high-anxiety day for nearly every school district across the state.  Even the "best" of districts have at least one area on their report card that they wish were better.  When we take into account the dismal results that Ohio experienced on last year's State Assessments, the anxiety is even greater.  As early as July, I've been attempting to spread the word to our learning community about just how flawed these assessments were.  Allow me to explain in greater detail.

New London Local Schools belongs to North Point Educational Service Center.  This ESC provides services to 27 districts in our region.  It is a great organization that, in addition to other valuable services, provides an avenue for regional districts to share information.  In July, when preliminary state assessment results were released, North Point created a spreadsheet to allow districts to self-report their results.  Eighteen (18) districts provided their data, and here is where my eyes started to open.  The results were shocking.  Of the 18 districts that reported, only 1 district met the state indicator in each of the following tested areas:  4th Grade ELA, 7th Grade ELA, 7th Grade Math, and High School English II.  Even more shocking, not a single reporting district met the state indicator in 8th Grade ELA and High School Geometry!  There were also a number of tested areas where only 2 districts met the state indicator, but listing all of them would result in a very lengthy list!

Let's look at these results at the state level.  For this year's results, the state predicted the percent of students that would score proficient on the assessments.  That prediction became the percent that each district needed to meet to earn its indicator on the report card.  Let's take a look at some of these, keeping in mind that the indicator was what Ohio predicted to be the percent of students scoring proficient.  In 5th Grade ELA, the state established the indicator at 73%.  The actual result: 60% of Ohio's students scored proficient on the assessment.  In 6th Grade ELA, the indicator was set at 74%, with the actual results coming in at 54%.  A similar gap between what the state predicted and what actually happened occurred in 6th Grade Math, with the state predicting 74% proficient and 56% of students scoring proficient.  The results do not get any better as we move up the grade levels, as the gap between what the state predicted and what actually happened sits in the range of 20 percentage points.  The worst gap is in High School Geometry, where the state predicted 80% of the state's students would score proficient and only 49% were proficient, for a gap of 31 percentage points!

Locally, I'd like to look at the data in an entirely different way.  Over the summer I looked at each grade level of students in grades 4 through 11.  For each grade, I went back to their 3rd grade year, the year that they began taking state assessments, and traced their level of success.  The results confirmed what I had suspected, but were no less shocking.  Start with the 11th graders' math scores.  These kids averaged a success rate between 66% and 74.6% over the years, yet last year only 42% of these students were proficient on the Geometry test.  Our 8th graders: in math they scored between 74% and 79% over the years, but last year only 42% were proficient.  ELA saw similar results.  Historically these kids scored between 72% and 91%, yet only 40% scored proficient last year.  Again, I could provide additional examples, but in the interest of time I'll bring this to a close.

New London Local Schools has a strong tradition of academic excellence.  This tradition is not based on State Assessments, but is illustrated by them.  During the 2015-2016 school year, our excellent team of teachers and staff didn't decide to stop educating our students.  During the 2015-2016 school year, our students did not stop attending school, nor did they magically decide to stop learning.  Regardless of whether you look at it from the state, regional, or local level, these scores point to one conclusion: the assessments were flawed.  There is no rational explanation for such a dramatic disparity from historical scores, both at the district level and the individual student level.

Since much of the data used in generating our Report Card comes from these assessments, it's not just the Achievement component that will be impacted.  The testing environment that Ohio currently finds itself in is not in the best interest of students.  Over-assessing students, especially with assessments that are flawed, does not promote education or educational reform.  Possibly the greatest negative from this flawed environment is the impact on public perception of public education.  As I discussed in a previous blog, our ACT scores were very strong last year.  Our students are becoming prepared for college and career at a greater rate now than they ever have.  Our extra- and co-curricular activities are the strongest they've been in over a decade.  The reality is, New London Local Schools is setting the example for how to educate the total student.  I'd hate to see one flawed state assessment change how people view our district.

Let me know what you think! Post a comment to this blog or contact me via email or phone. Thank you for taking the time to read such a lengthy post.
