SBAC: The Beginning Of The End

So what did we learn from the release of the SBAC scores? What did we learn after spending more than $2 million of state money and countless millions more at the district level to get these scores?

Not much.

We did learn that the achievement gap has not been in any way affected by implementation of the Common Core. I have been in a position to analyze CMT and CAPT scores over many years, and the SBAC scores tell the same story as the CMT and CAPT scores. That story is that students in affluent communities score significantly higher than students in poor communities do. No administration of a test will ever change that fact. No set of national standards or standardized test on those standards will ever “close the achievement gap”. First of all, high scores depend on the quality of the lives children have outside of school much more than on what happens in school. Secondly, if the national standards and aligned testing did raise scores, then all scores would go up, both those of students in affluent districts and those in poor cities. So the “gap” would be unchanged.

We did learn that charter schools, even with their cherry-picked student bodies, did not do better than many public school districts which do not limit their enrollment of special education students, English language learners, or students with behavioral issues. For example, SBAC 8th grade math scores for charter schools ranked 63, 67, 71, 74, 100, 103, 107, 119, 123, 130, and 133 out of 133 reporting districts and schools. Of course, many of those charter schools had better scores than the districts from which their students came, and they should be expected to, because they have siphoned off some of the students with drive and potential from those districts.

We did learn that the SBAC scores tell us nothing about the learning going on in Connecticut schools. We can’t tell which schools merely paid lip service to the Common Core Standards and which focused almost exclusively on the Common Core. Without a doubt, the schools with scores demonstrating under 20% proficiency on the SBAC spent more time on test prep than the schools in affluent districts with higher SBAC scores. Yet we are told that schools must limit their curriculum to the Common Core so that the school’s test scores will improve. It makes no sense. Some districts which had curriculum dedicated to the Common Core and teachers who taught to it diligently had low test scores, and some districts that just about ignored the Common Core in curriculum and practice had good scores. High test scores and teaching to the Common Core had zero correlation.

We also learned that SBAC scores tell us nothing about students’ real competencies. As anyone who understands how to teach students to be thoughtful readers, effective writers, and competent thinkers knows, the more a teacher teaches to the Common Core ELA standards, the farther away those students will be from being thoughtful readers, effective writers, and competent thinkers. So the actual achievement gap will widen between the students in the affluent communities and the students in the cities, as city schools increase test prep in response to the low 2015 SBAC scores.

The Common Core Standards for English Language Arts lack any research base whatsoever, and there is no evidence that they will produce “college and career readiness”, yet we restrict our neediest students to that Common Core regimen due to our misplaced reliance on the SBAC scores. Just because a PR firm was hired to promote the Common Core Standards and that PR firm, through focus groups, determined that “rigor” was the word that would sell the standards to the American public does not make the standards or the SBAC test rigorous. Neither of them is. The Common Core ELA standards teach a discredited way of reading and an inadequate way of writing, and the SBAC test is an exercise in “Gotcha”.

We did learn from the 2015 SBAC test that opting out is going to be an influential part of the narrative about assessing learning in the future. For example, in West Hartford, Conard High School had an opt-out rate of 5.5% and Hall High School had a 61.4% opt-out rate. What then can we tell about the two schools in the same town? Does Hall have more students who have applied to competitive colleges and do not want their excellent records of good grades and SAT scores hurt by a test designed to produce low scores? Does Hall High have parents who are more savvy than Conard parents and who are making a statement about their values and the kind of learning that they want for their children? Is learning richer and deeper at Hall than at Conard, so that students and their parents seek other kinds of demonstrations of student achievement?

Also, are Westbrook High School, North Haven High School, Hartford Public High School’s Law and Government Academy, Daniel Hand High School in Madison, and E.O. Smith High School in Storrs places where the emphasis is on real learning, given that more than 85% of the juniors in those schools opted out of the 2015 SBAC math test? School by school, parent by parent, district by district, those questions will be explored now that Connecticut has completed its first year of SBAC testing, and, if we can judge by what is happening in New York, where implementation of the Common Core and the taking of a Common Core aligned test is a year ahead of Connecticut, it seems reasonable to believe that opting out will increase.

Over this past year of SBAC testing, some told the story that we need SBAC to close the achievement gap. That story is wrong. Closing the achievement gap will never happen with standardized tests. Some told the story that we need SBAC to gather data in order to compare schools and districts. That story is wrong. SBAC data is same-old, same-old; we had it all along with our state tests. Some told the story that we need SBAC to gather data about individual students and the skills they need. That story is wrong. SBAC doesn’t address students’ learning needs; teachers do. Some told the story that SBAC measures what students need to learn, but that story is terribly wrong. Those telling it must not be educators. They must not know what real learning is or what students need to be prepared to do.

It is time to end SBAC. It is time for a new story. A true one.

6 thoughts on “SBAC: The Beginning Of The End”

  1. Amen. Past time to tell your legislators, both in the state and in Congress working on the nation’s education law right now, to end this annual travesty, which we are the only nation on the planet to impose, to no avail, since 2001 when it was enacted. It seems the achievement gap actually serves an important purpose, and hence it will always remain: it gives profit to those who sell stuff “to fix it” and gives policy makers something to talk about. Understand what is in the Every Child Achieves Act in the Senate and what is in the Student Success Act in the House. Understand how public schools, open to all, serving all, and democratically run through elected boards of education, are to be eliminated and replaced by investor-funded privatizers out to profit off of K-12 education and children, starting primarily in poor, urban areas. Understand that eventually every district, every school becomes the bottom 5%, and consider what happens to your district and school then. If you don’t know these details, find out. Opt out. Deny them the false data.


  2. The corporate education reform movement was never about raising test scores or finding better ways to teach children. It was always about profit, increased wealth for a few and wealth acquisition for a few more. The tests were just a tool to fool as many people as possible into supporting the corporate, profit agenda.


  3. Where to start? What we are seeing is a shift away from norm-referenced testing (NRT), where the focus was on “spreading students out” and effectively ranking performance from highest to lowest. The problem was that the NRT model was unresponsive to classroom learning because it was “g”- or IQ-loaded. The solution was to move to criterion-referenced testing such as the CMT, which focused on clear, narrowly defined content like “addition of two-digit numbers”. Hence content trumped item difficulty. This type of test does not typically produce a normal distribution as an NRT does, but rather one skewed to the high end. That is, the test is easier, and more student scores bunch up at the top. This in turn makes the identification of low performers more precise and accurate, because those scores lie in the thin tail at the low end of the distribution. A thinner tail means a lower density of scores and fewer misclassifications. As time went on, the CMT was also used to identify higher achievement, but this is less accurate because the scores are denser at the high end, and so, ipso facto, there are relatively more misclassifications. Now with SBAC we see the pendulum swing back toward a focus on high-end performance, and hence a more g-loaded test. With this we can expect the test to be less responsive to classroom instruction, because the content categories are more abstract and the items are more difficult and complex. Also, the test is adaptive, which means that items are delivered to the test taker based on performance on prior items. This type of test delivery is more efficient but not as diagnostic, because gone are the clearly defined objectives like “addition of two-digit numbers”; a particular student would only “see” such an item if their performance were low enough to warrant it.
A test like the SBAC is also more g-loaded because the entire focus is on item difficulty and on ranking students by difficulty, not on item content as was the case with the old CMT. How this benefits Connecticut students is a matter for debate, but it is likely to increase the performance gap, because the test content is harder to pin down and the test and its administration method are more focused on difficulty. Hence it will be harder to identify what is needed to improve performance on the test, and this will have a greater negative impact on students such as those in our inner cities who are dealing with greater social disadvantage. Sure hope this helps…
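     The commenter’s density argument — that misclassification is rarer where the score distribution is thin — can be sketched with a small simulation. Everything here is assumed for illustration (the Beta distribution, the noise level, and the two cutoffs are not drawn from any real CMT or SBAC data): when true scores bunch at the high end, few students sit near a low cutoff, so measurement error flips fewer classifications there than near a high cutoff where scores are dense.

     ```python
     # Illustrative sketch only: distributions, noise, and cutoffs are assumed,
     # not taken from any actual CMT/SBAC scoring data.
     import random

     random.seed(42)

     N = 100_000
     NOISE = 0.05  # assumed measurement error on a 0-1 score scale

     # Criterion-referenced-style results: scores bunch at the high end.
     true_scores = [random.betavariate(5, 2) for _ in range(N)]
     observed = [t + random.gauss(0, NOISE) for t in true_scores]

     def misclass_rate(cutoff):
         # A student is misclassified when the true and observed scores
         # fall on opposite sides of the cutoff.
         errors = sum((t < cutoff) != (o < cutoff)
                      for t, o in zip(true_scores, observed))
         return errors / N

     low = misclass_rate(0.30)   # thin low-score tail: few students near the line
     high = misclass_rate(0.80)  # dense high end: many students near the line

     print(f"misclassification near low cutoff:  {low:.3%}")
     print(f"misclassification near high cutoff: {high:.3%}")
     # The low cutoff produces markedly fewer misclassifications.
     ```

     The same mechanism running in reverse is the commenter’s worry about using a high-end-focused test: the more students who sit near the cutoff, the more classifications rest on measurement noise rather than on learning.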

