
Wednesday, October 8, 2014

DOE: Don't hold us to the same standards we hold the schools

As Lt. Gov. Matt Denn's IEP Task Force continues to wade through the morass that is Special Education in Delaware, lost in the shuffle is a particularly blatant and telling piece of DOE sidestepping on so-called "Standards-based" IEPs.

In 2012 DOE applied for a grant to improve Special Education outcomes in Delaware.

What DOE told the Feds was that not enough kids on IEPs were passing DCAS or graduating:

Goal 1: To increase the academic achievement of students with disabilities, through the implementation of sustainable, evidence-based instructional strategies to impact students with the greatest academic needs. 
Goal 2: To increase the graduation rates and academic achievement of students most at risk of dropping out of school, through the use of sustainable, evidence-based social and behavioral practices, as well as enhanced professional development to educators and related staff. 
"Academic Achievement" is clearly and unambiguously defined in the grant proposal as scoring a passing grade on state assessments (then DCAS).

What DOE proposed to do (not surprisingly) was double down on Common Core and high-stakes testing, and require--beginning in pilot districts and later extending to the whole state--that ALL IEPs be directly linked to Common Core standards.

Red Clay was selected as one of these pilot districts, and the special needs teachers there have been inundated with "training" in developing and implementing those new IEPs.

I'm going to leave aside for the moment the very real question of whether or not this is a good strategy for improving the education of these kids (it isn't).  Let's just assume, for the sake of argument, that Standards-Based IEPs are actually going to work as advertised.

That's what DOE believes, right?  Or else they wouldn't be putting themselves on the line here to raise test scores for special needs kids, would they?

Well, it turns out that DOE is NOT putting itself on the line, and is in fact holding itself to a FAR LOWER standard than it does, say, the six "Priority" schools, Moyer, or Reach.

You see, DOE has identified the Problem (low Spec Ed test scores) and then identified a strategy to improve those scores (Standards-based IEPs), which leaves only the ASSESSMENT of how effective the strategy has been once it is implemented.

That assessment would logically involve looking to see if test scores actually went up for the students in question, right?

Wrong.

Here's what DOE is assessing itself on:  Fidelity of Implementation
The project evaluator, working closely with the DE SPDG Management Team, will develop training, coaching, and intervention fidelity instruments during the first two quarters of Year 1. Each intervention fidelity instrument (IEP development, SIM, and communication interventions) will be developed in accordance with the evidence-base it is derived from. IEP training and coaching fidelity protocols will be developed in alignment with the research presented in the Holbrook/Courtade and Browder publications, and reviewed by the authors. SIM fidelity instruments are provided by the University of Kansas. Fidelity protocols established by researchers at U.K. will be used to assess the implementation of communication strategies. Pre/post training assessments will also be developed during this time. 
The DE SPDG Management Team will be responsible for overseeing fidelity measurement and reporting. Project evaluators will train and coach the state and LEA coaches on the use of implementation (training and coaching) and intervention (i.e., IEP development, SIM, and communication) fidelity instruments. An easy to use, web-based data management system (using tools such as SurveyMonkey and Microsoft Access) will be developed. 

"Fidelity of Implementation" means whether or not DOE trains teachers as it said it would do, and whether or not the teacher actually employ the new strategies like they are supposed to do.  In other words, DOE is evaluating its success in this grant NOT on whether student test scores actually go up, but just on whether or not they tried real hard according to the model they thought up.

Now, you will have to read the entire 103-page grant to assure yourself that what I'm saying next is correct, and I encourage you to do so:  DOE never uses data about improvements in student scores on state assessments to determine whether it has done its job.

Do you understand what this means?

DOE has repeatedly told Red Clay, Christina, and the rest of the schools in the State that all that matters in determining their effectiveness as educators is how well students do on the State assessments.  If a school does not do well on those assessments, it is failing, plain and simple.

Yet when DOE implements a plan to raise the test scores of a target group of students (in this case special education students), it does not evaluate its own success based on test scores.

Wow.  Just wow.

So while DOE is passing out MOUs on "failing" schools (based entirely on test scores) that empower the State to fire entire faculties and convert public schools involuntarily into charters--again, based entirely on test scores--DOE does not feel it is appropriate to measure the success of its own programs based on test scores.

Here's a serious suggestion to Red Clay and Christina:  how about when you finally develop those Priority Schools plans, you base success or failure on "fidelity of implementation," not test scores, and cite as your rationale that this is the standard that DOE applies to its own initiatives?

Here's the link to the grant application.  Check it out for yourself.

1 comment:

  1. I've been waiting for this article for weeks, and I'm glad to see it. Thank you Steve, for shedding light on this controversial topic.
